We use bower and npm for our project. Every couple of months GitHub is under a DDoS attack or otherwise not working correctly, leaving us with broken deploy scripts. What is the best way to fix this? We don't like the idea of committing the node_modules or bower_components folder. Is there a tool that will cache the npm and bower sources so they only have to be downloaded when something changes?
You could install something like Angry Caching Proxy (https://www.npmjs.com/package/angry-caching-proxy) to cache commonly downloaded packages in your local network. That should also please whoever's paying for the NPM repository.
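A minimal setup sketch for routing both package managers through such a proxy; the hostname and port below are placeholders for wherever you run the proxy on your network:

```shell
# Point npm at the caching proxy (hostname/port are assumptions):
npm config set proxy http://build-proxy.local:8080/
npm config set https-proxy http://build-proxy.local:8080/

# Bower reads its proxy settings from .bowerrc:
cat > .bowerrc <<'EOF'
{
  "proxy": "http://build-proxy.local:8080/",
  "https-proxy": "http://build-proxy.local:8080/"
}
EOF
```

Once both tools go through the proxy, repeated installs of the same versions are served from the local cache instead of hitting the network.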
Another thing I used to do was have a simple script that tried `npm install` and, if that failed, retried against the npm Europe mirror. I see now that the mirror has been deprecated, because npm's primary registry has become more reliable and is now hosted on a CDN with multiple locations around the world.
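The try-then-fall-back logic can be sketched as a tiny POSIX shell helper. The function name is made up, and the mirror URL in the comment is only illustrative (that mirror no longer exists):

```shell
#!/bin/sh
# try_then CMD1 CMD2: run CMD1; if it exits non-zero, run CMD2 instead.
try_then() {
  eval "$1" || eval "$2"
}

# In the original script the two commands were installs against
# different registries, roughly:
#   try_then "npm install" \
#            "npm install --registry https://registry.npmjs.eu/"
```

The same pattern works for bower or any other flaky network step.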
Another option is Sinopia (https://www.npmjs.com/package/sinopia), a private npm repository server. After installing it, change your npm registry URL to point at the new private server; npm requests (install, update, etc.) will then go through your private server, which falls back to the official registry when a package isn't available locally.
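A quick-start sketch; port 4873 is Sinopia's default listen address, and running it in the background like this is only for illustration (you'd normally run it under a process manager):

```shell
# Install and start sinopia:
npm install -g sinopia
sinopia &

# Point npm at the private registry (4873 is sinopia's default port):
npm config set registry http://localhost:4873/
```

From then on every `npm install` goes through Sinopia, which caches packages and proxies misses to the official registry.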
Sinopia is great. It is a good idea to run your own npm repository anyway; not only do you have some protection from tampered packages (in combination with shrinkwrap), but you can also publish your own minor bugfix updates and private packages, without having to rely on GitHub URLs.
You're also insulated from npm and GitHub downtime - win-win-win.
Committing downloaded packages is not a bad practice. Yes, the repository can get a bit big, but otherwise I don't see much of a problem with it. You will always be sure that the installed packages are compatible with each other.
To make sure everything we know works together stays together, we use npm-shrinkwrap files. We don't like committing the packages themselves because it makes the git history much larger and pull-request diffs nearly unreadable.
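For reference, pinning with shrinkwrap is just two steps; you commit the lockfile rather than node_modules:

```shell
# Pin the full dependency tree to exactly the versions currently
# installed in node_modules:
npm shrinkwrap

# Commit the lockfile, not the packages themselves:
git add npm-shrinkwrap.json
```

Subsequent `npm install` runs will reproduce that exact tree instead of resolving semver ranges again.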
Yes, shrinkwrap is the way to ensure the consistency of your dependencies. As for the actual files: if you depend on them, you should run your own npm repository that mirrors the public registry and caches the packages, and deploy from that.
But for small projects or quick deploys, absolutely just go ahead and commit the modules.
It's much more robust to have a dedicated build step that is separate from your deploy step.
The build step is where you install deps, do compilation, etc. Then it saves the output. The deploy step can just copy all the files from the last known good build to the new/updated servers.
This way you can always deploy a last-known-good release quickly and without external dependencies.
It also lets you take advantage of package caching on the build server so that as long as your deps don't change, you can deploy new releases to old or new servers without hitting any external services.
I recently rolled out a Sonatype Nexus server at work to handle both npm and Maven artifacts. It lets us publish internally, and a combined internal/proxy repository group serves both our own packages and cached copies of the public registry. It took roughly three minutes off some of our CI builds.