
The founder here. We adapted the scraper to our needs and figured others have probably wanted to do what you're suggesting, so we decided to share it. In fact, we pick up the scrape ourselves and ingest it into a distributed column store, where we use it alongside logs and run anomaly detection on it. I think Prometheus is pretty good at what it does, but as you say, depending on what you want to do with the data, a different backend is sometimes useful.
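
Roughly, the ingest side looks like this: scrape a /metrics endpoint, flatten the samples into wide rows, and batch them into the column store. The sketch below is a simplified illustration, not our actual pipeline; the endpoint URL, the metrics table name, and the insert_rows helper are placeholders.

    # Sketch: scrape a Prometheus /metrics endpoint and flatten the samples
    # into (timestamp, metric, labels, value) rows for a column store.
    import time
    import requests
    from prometheus_client.parser import text_string_to_metric_families

    def scrape_to_rows(endpoint="http://localhost:9100/metrics"):
        text = requests.get(endpoint, timeout=5).text
        now_ms = int(time.time() * 1000)
        rows = []
        for family in text_string_to_metric_families(text):
            for sample in family.samples:
                # one wide row per sample; labels kept as a map column
                rows.append((now_ms, sample.name, dict(sample.labels), sample.value))
        return rows

    # insert_rows("metrics", scrape_to_rows())  # placeholder for the store's
    #                                           # batched-insert client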



The big innovation here is the data compression. The lack of metric types in the standard remote storage interface is a good point.

Wouldn't adding that let you switch back to the main project and shrink the local storage buffer to a minimum?
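
For reference, that setup would look roughly like the sketch below: point remote_write at the external store and keep only a short local retention window. The URL is a placeholder, and the right retention value depends on how long you want to buffer through a remote outage.

    # prometheus.yml (sketch; the URL is a placeholder)
    remote_write:
      - url: "http://columnstore.example:8080/api/v1/write"

    # then start Prometheus with a short local retention, e.g.
    #   prometheus --config.file=prometheus.yml --storage.tsdb.retention.time=2h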


Hi.

I'd be interested in a demo of the logs AI stuff on real data. Something like https://www.honeycomb.io/play/ would do.

Do you have one?



