
> This makes sense and I don't understand why self-hosted sites are doing it. The logs are there. Analyzing them is all that is needed.

1) Laziness. A third-party JS tracker usually comes with a complete dashboard, full of pretty (and sometimes even useful) graphs.

2) Data. Client-side trackers can spy on users more, giving you more information you can e.g. misapply in an A/B test trying to drive "engagement".

RE 1, there exist tools aimed at analyzing server logs. I played with GoAccess a bit, it's quite OK. https://goaccess.io/
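To make the quoted point concrete: "analyzing them is all that is needed" can be a dozen lines of stdlib Python. A minimal sketch, assuming the common nginx/Apache "combined" log format (the access.log path is whatever your server writes):

  import re
  from collections import Counter

  # combined log format: ip ident user [timestamp] "METHOD /path PROTO" status size ...
  LINE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "(\S+) (\S+)[^"]*" (\d{3}) \S+')

  hits = Counter()
  with open("access.log") as f:
      for line in f:
          m = LINE.match(line)
          if m is None:
              continue                  # skip lines that don't parse
          ip, method, path, status = m.groups()
          if status.startswith("2"):    # only count successful responses
              hits[path] += 1

  for path, n in hits.most_common(10):
      print(f"{n:8d}  {path}")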




You don't need a third-party tracker to get fancy graphs. Matomo is free, open source, self-hosted, works on logs, and has fancy graphs. An ELK stack would also work fine.
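For the Matomo route, it ships a log importer under misc/log-analytics, so no JS snippet is needed. Something along these lines (the Matomo URL, site id, and log path are assumptions for your install):

  python misc/log-analytics/import_logs.py \
    --url=https://matomo.example.com --idsite=1 \
    /var/log/nginx/access.log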

Lazy-loaded 1px images can be used to track how far users have read, and A/B testing can be done by compiling multiple versions of a page and redirecting users to the variant you want them to try.
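The pixel trick needs no JS at all: browsers only fetch an <img loading="lazy"> once it scrolls near the viewport, so a request for /pixels/section-3.png (or however you name them) lands in the access log when that part of the page is actually read. For the redirect half, here's a minimal stdlib sketch; the variant paths, cookie name, and port are all made up:

  import hashlib
  import uuid
  from http.cookies import SimpleCookie
  from http.server import BaseHTTPRequestHandler, HTTPServer

  VARIANTS = ["/a/index.html", "/b/index.html"]  # two pre-compiled versions

  class ABRedirect(BaseHTTPRequestHandler):
      def do_GET(self):
          if self.path != "/":
              # a real setup would serve the compiled variants here (or let nginx do it)
              self.send_error(404)
              return
          cookie = SimpleCookie(self.headers.get("Cookie", ""))
          visitor = cookie["v"].value if "v" in cookie else uuid.uuid4().hex
          # stable hash of the visitor id, so a returning visitor keeps the same variant
          bucket = int(hashlib.sha256(visitor.encode()).hexdigest(), 16) % len(VARIANTS)
          self.send_response(302)
          self.send_header("Set-Cookie", f"v={visitor}; Path=/")
          self.send_header("Location", VARIANTS[bucket])
          self.end_headers()

  HTTPServer(("", 8000), ABRedirect).serve_forever()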


If I could toot my own horn for a moment, Gravwell (gravwell.io) has a 2GB/day free license, which should be plenty to ingest web logs. We've got a GeoIP module to resolve IPs to locations, and we can display geographic heatmaps (see https://dev.gravwell.io/docs/#!search/map/map.md), a variety of charts, tables, etc.

Matomo looks really polished, and if it provides the features you need, it seems like an obvious choice. If OTOH you're looking at rolling your own with ELK, Gravwell might make sense.



