
Grafana Loki stores every data chunk in a separate file [1]. A data chunk is created every 2 hours for every stream [2] that receives at least one log entry during those 2 hours. This produces 12 * 30 = 360 chunk files per month for every active stream. If Grafana Loki is used for collecting logs from a thousand services, and each service generates 10 different log streams, then the number of chunk files created by Loki during a month will be 1000 * 10 * 360 = 3.6 million [3]. This sounds like a very strange design decision.
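The arithmetic above can be sketched as follows (assuming the default 2-hour chunk interval and a 30-day month; the numbers for services and streams are the hypothetical ones from the comment, not measured values):

```python
# Back-of-the-envelope chunk-file count under Loki's default
# 2-hour chunk interval (assumptions, not Loki internals).
HOURS_PER_DAY = 24
CHUNK_INTERVAL_HOURS = 2
DAYS_PER_MONTH = 30

chunks_per_stream_per_month = (HOURS_PER_DAY // CHUNK_INTERVAL_HOURS) * DAYS_PER_MONTH
print(chunks_per_stream_per_month)  # 360

# Hypothetical deployment: 1000 services, 10 streams each.
services = 1000
streams_per_service = 10
total_chunk_files = services * streams_per_service * chunks_per_stream_per_month
print(total_chunk_files)  # 3600000
```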

This was one of the reasons why we at VictoriaMetrics decided to start working on a better solution for logs - VictoriaLogs [4].

[1] https://utcc.utoronto.ca/~cks/space/blog/sysadmin/GrafanaLok...

[2] https://grafana.com/docs/loki/latest/fundamentals/labels/

[3] https://utcc.utoronto.ca/~cks/space/blog/sysadmin/GrafanaLok...

[4] https://www.youtube.com/watch?v=Gu96Fj2l7ls&t=1950s

[5] https://www.slideshare.net/VictoriaMetrics/victorialogs-prev...



As noted [1], these chunk settings are configurable:

  ingester:
    chunk_retain_period: 30s
    chunk_idle_period: 5m0s
    chunk_block_size: 262144
    chunk_target_size: 1572864
    chunk_encoding: gzip
    max_chunk_age: 2h0m0s

[1] https://grafana.com/docs/loki/latest/operations/storage/file...
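For reference, the two byte-valued settings in the snippet above decode to round binary sizes (a quick sanity check on the numbers, not something taken from the Loki docs):

```python
# Decode the byte-valued settings from the config snippet above.
chunk_block_size = 262144    # bytes per compressed block
chunk_target_size = 1572864  # target bytes per chunk

print(chunk_block_size // 1024)       # 256  -> 256 KiB blocks
print(chunk_target_size / (1024**2))  # 1.5  -> 1.5 MiB target chunks
```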



