Tell HN: Have a look at Varnish
26 points by eisokant on Aug 16, 2010 | 23 comments
Hey Everyone!

I just wanted to share with you the link to an incredible HTTP accelerator: http://varnish-cache.org/

We've been testing a lot of different stacks for our startup (Tyba), and running NGINX behind with Varnish up front has taken us from 9 reqs/sec (loading a WordPress site) to 2500 reqs/sec.
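
For anyone curious what the Varnish side of that looks like, here is a minimal sketch of fronting an NGINX/WordPress backend. It is not our actual config: the port, cookie names and TTL are assumptions, and the syntax is roughly Varnish 2.1-era VCL (details vary by version):

    backend nginx {
        .host = "127.0.0.1";
        .port = "8080";
    }

    sub vcl_recv {
        # Never cache admin/login traffic or logged-in users
        # (standard WordPress cookie names assumed)
        if (req.url ~ "^/wp-(admin|login)" ||
            req.http.Cookie ~ "wordpress_logged_in") {
            return (pass);
        }
        # Drop remaining cookies so anonymous page views share one cache entry
        unset req.http.Cookie;
        return (lookup);
    }

    sub vcl_fetch {
        # Cache anonymous pages briefly, even if WordPress says otherwise
        set beresp.ttl = 60s;
    }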

I know it's not a novelty but I wanted to share it in the hope it helps the few people who don't know about it.

Kind regards,

Eiso




Why would Varnish be required in that situation if you're already running nginx, which is perfectly capable of caching a simple WordPress site in the same manner?


I don't think it's required, but, really, why wouldn't you be running varnish? It's just amazing.


Because it doesn't add anything useful to the mix - nginx's cache works just fine.


Sure it does, it adds ESI (which is fantastic). It also keeps backend connections open rather than creating/destroying them for each request. Varnish does restarts nicely too; we have requests that get passed on to another group of servers if the backend returns errors. It's really a very nice way of brokering HTTP requests.
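
A rough sketch of what that restart behaviour can look like in VCL (the backend addresses are made up, and the syntax is roughly Varnish 2.1-era; the exact keywords vary between versions):

    backend primary  { .host = "10.0.0.10"; .port = "80"; }
    backend fallback { .host = "10.0.0.20"; .port = "80"; }

    sub vcl_recv {
        # After a restart, send the request to the fallback group instead
        if (req.restarts > 0) {
            set req.backend = fallback;
        } else {
            set req.backend = primary;
        }
    }

    sub vcl_fetch {
        # Process <esi:include> tags in the response
        esi;
        # If the backend errored, restart the request (picks up the fallback above)
        if (beresp.status >= 500 && req.restarts == 0) {
            return (restart);
        }
    }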


I think the point is 'in this situation'. Of course there are places where Varnish's features are useful, just not here.


More context on why Varnish is superior to Squid: http://varnish-cache.org/wiki/ArchitectNotes


This article should be required reading for anyone building programs with massive memory consumption!

To give some context, this describes how Varnish combines mmapping with a nice memory partitioning scheme to get free caching from the OS, instead of fighting the OS' tendency to cache and managing it yourself.


Varnish = reverse proxy
Squid = forwarding proxy


Squid can also be used as a reverse proxy, so it is a reasonable competitor to Varnish. (As is Apache mod_proxy.)


Agreed, Varnish is awesome; we use it on a home-brew CDN and it has been a fantastic experience so far.

Also, the guy that wrote it (Poul-Henning Kamp) is very responsive when it comes to answering questions or fixing issues; they have an IRC channel at #varnish on irc.linpro.no.

Definitely a piece of software worth looking into when your website gets bigger.


PHK has a history of smash hits; he is responsible for much of the work on FreeBSD's container-based virtualisation (jails), modular disk device framework (GEOM), and disk encryption (GBDE).


For people who want to use NGINX caching on top of Apache/WordPress, I found this tutorial: http://www.myatus.co.uk/2010/06/28/a-simplified-nginx-apache...


Thanks for sharing! We've built a Drupal-powered site using Varnish as well. With a few tweaks, our initial tests showed ~3000 req/sec! Good stuff!


Agreed, nginx+varnish (optionally on top of an existing apache setup) seems to be the best practice right now for heavy-traffic sites.


This might be the most popular setup right now, but I wouldn't consider it the best practice. Why in the world would you use both nginx and varnish?


Because until 0.7 nginx lacked a caching system, and so it made sense to use both. Now it doesn't.


And even now there's no way of manually expiring a cached page without using a third-party, unsupported plugin (IIRC).

I still love Nginx though :-)


I'm the author of ngx_cache_purge [1]. Would you mind explaining why you think it's "unsupported"? This module is fully maintained; it works with every recent nginx release and supports purging content from the cache of every upstream supported by nginx (FastCGI, proxy, SCGI, uWSGI).

[1] http://labs.frickle.com/nginx_ngx_cache_purge/


Because varnish is more flexible (VCL) and more time tested.

It's also likely[1] faster and better behaved under high loads, simply because it has been around for much longer and has seen more tuning in large deployments.

[1] Disclaimer: I haven't benchmarked them against one another, that's merely my common-sense assessment.


How is VCL more flexible?


This is a question that interests me as a Varnish user with a several-hundred-line VCL. We exclude certain URLs from caching altogether, drop certain cookies but not others, drop certain querystring params but not others, cache separate versions of the same page for mobile and non-mobile browsers, do limited caching behind basic auth, purge certain pages from cache when a post gets published, etc. VCL isn't the most elegant thing there is, but there's usually a way to do whatever I need. I don't know that Nginx isn't capable of just as much flexibility, but all the HOWTOs I've found are pretty simplistic and I haven't found the docs to be that helpful. If anybody has resources or experience to share regarding complex Nginx caching setups, I'd love to know about it.
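
For anyone wondering what that looks like in practice, here is a heavily trimmed sketch of a few of those rules. The paths, cookie pattern and purge policy are made up for illustration, and the syntax is roughly 2.x-era VCL (it changes between versions):

    sub vcl_recv {
        # Let the publishing hook expire a page (an ACL check would normally guard this)
        if (req.request == "PURGE") {
            purge_url(req.url);
            error 200 "Purged.";
        }
        # Exclude certain URLs from caching altogether
        if (req.url ~ "^/(admin|preview)") {
            return (pass);
        }
        # Drop analytics cookies but keep the rest
        set req.http.Cookie =
            regsuball(req.http.Cookie, "__utm[a-z]+=[^;]+(; )?", "");
    }

    sub vcl_hash {
        # Cache separate variants for mobile and non-mobile browsers
        if (req.http.User-Agent ~ "(?i)mobile|iphone|android") {
            set req.hash += "mobile";
        }
    }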


I haven't used nginx caching in depth, but I can make a more general point about nginx scripting: It sucks. Hard.

You can get fancy[1] by writing your own modules or by building on top of helpers like ngx_devel_kit, but it gets messy very quickly.

Implementing complex rules inside the config file is possible to a degree - but it's an utter nightmare. The syntax is extremely limited and the semantics (order of execution) are hard to predict.

So, what nginx is lacking is a proper scripting interface with full control over the request pipeline. ngx_lua and ngx_v8 are in the works but not yet production-ready.

Personally, I'd go as far as to say the entire config file should simply be ripped out and replaced with a scripting language.

[1] http://agentzh.org/misc/slides/nginx-conf-scripting/nginx-co... (press cursor right to walk through slides)




