
You wouldn't. Redis doesn't support being used as an LRU cache. Redis does function well as a cache, and is far better than memcached for any data you don't want randomly disappearing, but it just doesn't do LRU eviction.

Redis will only evict data when it's explicitly removed or expired, and it will store as much data as you put into it.

memcached will evict the least-recently used data based upon memory pressure, and will only use as much memory as you configure it to.

They overlap in functionality, but I find they work better in complement to each other than having one replace the other. Redis for data you want to persist, possibly with a timeout -- user sessions, for example. memcached for data you want to cache, as long as you have the free memory for it.
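
Roughly what that split looks like in code -- just a sketch, assuming the redis-py and python-memcached clients, with made-up key names:

  import memcache   # python-memcached
  import redis      # redis-py

  r = redis.Redis(host='localhost', port=6379)
  mc = memcache.Client(['127.0.0.1:11211'])

  # Session data goes to redis: it only disappears when explicitly
  # deleted or when the timeout we set expires.
  r.set('session:abc123', 'user=42')
  r.expire('session:abc123', 3600)   # keep the session for an hour

  def render_home_page():
      # stand-in for an expensive render or query
      return '<html>...</html>'

  # Cached data goes to memcached: it may be evicted at any moment under
  # memory pressure, so the application has to be able to rebuild it.
  page = mc.get('rendered:/home')
  if page is None:
      page = render_home_page()
      mc.set('rendered:/home', page, time=300)   # cache for five minutes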




You can configure a redis DB using the maxmemory flag instead of setting expiries, so yes it can be used as an LRU cache.

Probably not as fast as memcached, but one situation where it would be appropriate is if you're already using redis in your stack for something else (fast writes). Rather than adding another piece of complexity to your stack, you can get double use out of redis as a cache as well.

See: http://antirez.com/post/redis-as-LRU-cache.html
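
For a concrete picture of the maxmemory approach -- a sketch only, assuming a local instance and the redis-py client; the directive name is from the linked post and the values are illustrative:

  import redis

  r = redis.Redis(host='localhost', port=6379)

  # Cap memory use; once the limit is hit, redis evicts keys according
  # to its configured maxmemory policy instead of growing further.
  # (Depending on the version, this may need to go in redis.conf
  # rather than being set at runtime.)
  r.config_set('maxmemory', '100mb')

  # Cached keys no longer need individual expiries -- just write and read.
  r.set('cache:user:42', 'serialized profile')
  print(r.get('cache:user:42'))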


My mistake, I was not aware of those configuration options. Thanks for the correction.


Well, in your defence, I don't think they're in the current stable release, but rather the 2.2 branch (RC).


Except that algorithm isn't LRU.


It is in 2.2. Check out:

maxmemory-policy allkeys-lru

How do I know? We ( http://bu.mp ) push 100s of GB through redis every day in LRU mode.
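
If anyone wants to watch the eviction happen, here's a rough way to poke at it with the redis-py client -- assumes a throwaway local instance, a deliberately small maxmemory, and that your build reports the evicted_keys stat in INFO:

  import redis

  r = redis.Redis(host='localhost', port=6379)

  # Small limit so eviction kicks in quickly -- use a throwaway DB.
  r.config_set('maxmemory', '10mb')
  r.config_set('maxmemory-policy', 'allkeys-lru')

  # Write more data than fits; redis should drop the least recently
  # used keys rather than refuse the writes.
  for i in range(100000):
      r.set('key:%d' % i, 'x' * 100)

  print('keys evicted so far:', r.info().get('evicted_keys'))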



