Personally, I put our main Node.js-based app behind nginx, which in turn sits behind Varnish. Varnish caches any GET requests; nginx load-balances across multiple instances of our app (currently 5) and handles X-Sendfile requests.
Varnish runs on port 80 - the only real rule here is to make sure you always return(pass) on anything other than GET and HEAD requests, so nothing that can mutate state ever gets served from cache.
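A minimal sketch of that rule, assuming Varnish 4+ VCL syntax (Varnish 3 and earlier used req.request instead of req.method):

```vcl
sub vcl_recv {
    # Only GET and HEAD are safe to cache; pass everything else
    # straight through to the backend.
    if (req.method != "GET" && req.method != "HEAD") {
        return (pass);
    }
}
```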
Nginx runs on port 8080, and Varnish passes all requests through to it. I set up an upstream cluster pointing to the 5 instances running on the same server (ports 8081 through 8085 in this case). By proxy_pass'ing to the upstream cluster, nginx round-robins through my instances (no need for ip_hash or weighting currently).
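A hypothetical sketch of that nginx config (the upstream name and header lines are my own additions, not from the original setup); round-robin is nginx's default balancing method, so nothing extra is needed:

```nginx
upstream node_app {
    server 127.0.0.1:8081;
    server 127.0.0.1:8082;
    server 127.0.0.1:8083;
    server 127.0.0.1:8084;
    server 127.0.0.1:8085;
}

server {
    listen 8080;

    location / {
        # Round-robins across the five instances above by default.
        proxy_pass http://node_app;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```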
My Node.js app allows me to pass it its port as a command line option. By using monit I can ensure that each of the 5 instances is running.
One last thing - I was having too many performance issues with node.js sending large files (the application in question is a document management system) back to the client. It can do it, but the RPS was too low for our requirements. By implementing X-Sendfile support in nginx, we can have node.js pass along the appropriate header, and nginx then sends the file itself. At that point the response can also be cached by Varnish. RPS increased to more than acceptable levels, and the system runs a lot smoother now.
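For reference, nginx's native version of this mechanism is the X-Accel-Redirect header (a third-party module provides X-Sendfile compatibility). A hypothetical sketch of the nginx side, with made-up paths:

```nginx
# Internal-only location: clients can't request /protected/ directly;
# it is only reachable when the app sets X-Accel-Redirect.
location /protected/ {
    internal;
    alias /var/docs/;   # hypothetical directory holding the documents
}
```

The Node handler then just sets something like `res.setHeader('X-Accel-Redirect', '/protected/' + filename)` and ends the response without reading the file itself; nginx picks up the header and streams the file.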
If a better writeup is needed, I would be more than happy to try to get something together today or tomorrow.