> It may make sense to run an hourly or daily job to collect data from the API and then implement the filters exclusively within your back-end.

Absolutely! I thought I could get away with just in-memory caching for the MVP but it looks like I can't.
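For reference, here's a minimal sketch of the hourly-collection approach I'm now leaning towards (fetch_all_listings is a placeholder for the real eBay API calls, the hourly interval is a guess, and the in-process list stands in for a proper database):

    import threading
    import time

    _listings: list[dict] = []
    _lock = threading.Lock()

    def fetch_all_listings() -> list[dict]:
        # Placeholder: pull the full dataset from the eBay API here.
        raise NotImplementedError

    def refresh_loop(interval_s: int = 3600) -> None:
        """Background job: re-pull the dataset on a fixed interval."""
        global _listings
        while True:
            fresh = fetch_all_listings()
            with _lock:
                _listings = fresh
            time.sleep(interval_s)

    def query(min_ghz: float = 0.0, min_cores: int = 0) -> list[dict]:
        """Filters run entirely against the local copy; no per-request API calls."""
        with _lock:
            snapshot = list(_listings)
        return [x for x in snapshot
                if x.get("cpu_ghz", 0) >= min_ghz and x.get("cores", 0) >= min_cores]

    # threading.Thread(target=refresh_loop, daemon=True).start()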

When it comes to filtering, there are enough unique selections a user can make that letting the eBay API handle the filtering for you causes far too many cache misses.

At a previous job I considered a system that would make synchronous calls to the backend API until it went down (or we got rate limited). When the backend was unavailable, we'd switch to filtering in our own service using data we'd previously cached.

E.g. if a cached query asked for (cpu >= 3.0 GHz, cores >= 2), we can also answer (cores >= 4) by filtering the previous result. This wouldn't find any CPUs below 3 GHz unless other cached responses happened to contain them. It works well when a "best effort" response is desirable, even if it's incomplete.
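Roughly what that looked like, as a Python sketch (the Cpu fields and the >=-only query shape are simplifying assumptions, not the real schema):

    from dataclasses import dataclass

    # Hypothetical item shape; field names are assumptions.
    @dataclass(frozen=True)
    class Cpu:
        model: str
        cpu_ghz: float
        cores: int

    # A query is a dict of field -> minimum value. Restricting queries
    # to ">=" bounds keeps the containment check trivial.
    def covers(cached_q: dict, new_q: dict) -> bool:
        """True if every row matching new_q also matched cached_q,
        i.e. the cached result is a superset of the new one."""
        return all(new_q.get(field, 0) >= bound
                   for field, bound in cached_q.items())

    def matches(item: Cpu, q: dict) -> bool:
        return all(getattr(item, field) >= bound for field, bound in q.items())

    def answer_from_cache(cache: list, new_q: dict):
        """Returns (rows, complete). complete is True only when some cached
        query provably covers new_q; otherwise the rows are best effort."""
        best_effort = []
        for cached_q, rows in cache:
            filtered = [r for r in rows if matches(r, new_q)]
            if covers(cached_q, new_q):
                return filtered, True
            best_effort.extend(r for r in filtered if r not in best_effort)
        return best_effort, False

    # The cached (cpu_ghz >= 3.0, cores >= 2) result can answer (cores >= 4),
    # but only best-effort: a 2.4 GHz 8-core part would be missing.
    cache = [({"cpu_ghz": 3.0, "cores": 2},
              [Cpu("i5-9400", 3.2, 6), Cpu("i7-9700", 3.6, 8)])]
    rows, complete = answer_from_cache(cache, {"cores": 4})
    # rows -> both CPUs; complete -> False

The completeness flag is what makes the "best effort" mode honest: callers can tell whether the answer is provably complete or just whatever the cache happened to hold.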


That's a very good idea, thanks! I think I'll have to do exactly that. Maybe in the fallback scenario, I can display a warning that data might be incomplete.
