I run a small website, which I'll leave unnamed to avoid sending it unnecessary traffic. I'm pushing probably a few terabytes of data a month now, so to improve speed whilst keeping costs low, I plan to serve files from multiple machines. A mini-CDN, if you will.
The files will be identical on every distribution server, but I'm not sure of the best way to replicate uploaded files across the machines.
Example: someone uploads xyz.tar.gz to Machine #1. I need it replicated, as fast as possible, to Machines #2 and #3, so that a visitor who gets directed to http://cdn-3.mysite.com/ finds the file there.
Does HN have any suggestions as to the best way to go about this?
Edit: My files aren't very large, maybe 5 GB in total, and they don't grow quickly; they're just accessed a lot.
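For concreteness, the naive approach I keep coming back to is a post-upload hook on Machine #1 that pushes the new file to the other machines with rsync over SSH. A rough Python sketch of what I mean (the hostnames and paths are made up, and it assumes key-based SSH auth is already set up):

    #!/usr/bin/env python3
    # Rough post-upload hook: push a freshly uploaded file to the
    # mirror machines with rsync over SSH. MIRRORS and DOCROOT are
    # made-up examples, not real infrastructure.
    import subprocess
    import sys

    MIRRORS = ["cdn-2.mysite.com", "cdn-3.mysite.com"]
    DOCROOT = "/var/www/files/"

    def replicate(path):
        failed = []
        for host in MIRRORS:
            # -a preserves permissions/timestamps, -z compresses in
            # transit, --partial lets an interrupted copy resume.
            result = subprocess.run(
                ["rsync", "-az", "--partial", path, host + ":" + DOCROOT])
            if result.returncode != 0:
                failed.append(host)
        return failed

    if __name__ == "__main__":
        bad = replicate(sys.argv[1])
        if bad:
            sys.exit("replication failed for: " + ", ".join(bad))

At 5 GB total, a dumb cron job that rsyncs the whole docroot to each mirror every minute would probably work too; the hook only buys lower propagation latency. I'm hoping HN knows something better than either.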
If you already have a reliable central filestore, Varnish or Squid might get you faster distribution without having to replicate all your files.
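The win is that nothing has to be pushed anywhere: each edge box fetches a file from the origin the first time it's requested and serves its local copy after that. Roughly this, as a Python sketch of the idea only; Varnish and Squid do this for you and also handle expiry, concurrent misses, and in-memory caching. ORIGIN and CACHE_DIR are invented names:

    # Pull-through caching, the idea behind putting Varnish/Squid in
    # front of a central filestore: serve the local copy if present,
    # otherwise fetch it once from the origin. Names are examples.
    import os
    import urllib.request

    ORIGIN = "http://files.mysite.com"   # hypothetical central filestore
    CACHE_DIR = "/var/cache/minicdn"

    def get(path):
        local = os.path.join(CACHE_DIR, path.lstrip("/"))
        if not os.path.exists(local):    # first request: pull from origin
            os.makedirs(os.path.dirname(local), exist_ok=True)
            urllib.request.urlretrieve(ORIGIN + path, local)
        return local                     # every later request is local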
Otherwise, I'm curious to see everyone's suggestions. I've looked at more *sync programs than I can count for this use case and come up empty-handed.