Grafana Loki stores every data chunk in a separate file [1]. A data chunk is created every 2 hours for every stream [2] that receives at least one log entry during that 2-hour window. This creates 12 * 30 = 360 chunk files per month per active stream. If Grafana Loki is used for collecting logs from a thousand services, and each service generates 10 distinct log streams, then the number of chunk files created by Loki per month will be 1000 * 10 * 360 = 3.6 million [3]. This seems like a very strange design decision.
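To make the arithmetic explicit, here is a quick sketch; the workload numbers (1000 services, 10 streams each) are the hypothetical ones from above, not measured figures:

```python
# Back-of-the-envelope estimate of Loki chunk-file counts,
# assuming one chunk per stream every 2 hours and a 30-day month.
CHUNK_INTERVAL_HOURS = 2
HOURS_PER_DAY = 24
DAYS_PER_MONTH = 30

chunks_per_stream_per_month = (HOURS_PER_DAY // CHUNK_INTERVAL_HOURS) * DAYS_PER_MONTH
# 12 chunks/day * 30 days = 360 chunks per active stream per month

services = 1000            # hypothetical number of services
streams_per_service = 10   # hypothetical streams per service

total_chunk_files = services * streams_per_service * chunks_per_stream_per_month
print(total_chunk_files)   # 3600000
```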
This was one of the reasons why we at VictoriaMetrics decided to start working on a better solution for logs - VictoriaLogs [4].
[1] https://utcc.utoronto.ca/~cks/space/blog/sysadmin/GrafanaLok...
[2] https://grafana.com/docs/loki/latest/fundamentals/labels/
[3] https://utcc.utoronto.ca/~cks/space/blog/sysadmin/GrafanaLok...
[4] https://www.youtube.com/watch?v=Gu96Fj2l7ls&t=1950s
[5] https://www.slideshare.net/VictoriaMetrics/victorialogs-prev...