Ask HN: Best way to approach user-facing analytics
3 points by superarch 7 months ago | 4 comments

I'm looking for advice on technical strategies for user-facing analytics features (think real-time customizable dashboards). I've ended up in a situation where I want to design analytics dashboards for end users that display aggregates/historical data spanning a number of years. The queries I want to run take far too long to execute on the fly, so I've resorted to pre-computing the data I care about on a schedule. This approach works fine, but it's not real time, and it adds friction to introducing new features/views (every new feature requires backfilling years' worth of data). I'm curious whether others have better strategies for developing analytics software like this that's real-time, responsive, and quick to iterate on. OLAP DBs like ClickHouse? Reverse ETL? Better OLTP indexing?
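The pre-computation approach described above can be sketched as a scheduled rollup job: a small aggregate table is refreshed periodically, and dashboard queries read from it instead of the raw event data. A minimal illustration, with in-memory SQLite standing in for the production OLTP database (all table and column names here are hypothetical):

```python
import sqlite3

# In-memory SQLite stands in for the production OLTP database.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE events (user_id INT, amount REAL, day TEXT);
    -- Rollup table refreshed on a schedule (e.g. a nightly cron job).
    CREATE TABLE daily_rollup (user_id INT, day TEXT, total REAL,
                               PRIMARY KEY (user_id, day));
    INSERT INTO events VALUES (1, 10.0, '2024-01-01'),
                              (1, 5.0,  '2024-01-01'),
                              (2, 7.5,  '2024-01-02');
""")

def refresh_rollup(conn):
    # Recompute all aggregates; a real job would process only new partitions,
    # which is exactly why adding a new view forces a full backfill.
    conn.execute("DELETE FROM daily_rollup")
    conn.execute("""
        INSERT INTO daily_rollup
        SELECT user_id, day, SUM(amount) FROM events GROUP BY user_id, day
    """)

refresh_rollup(conn)
# Dashboard queries hit the small rollup table instead of the raw events.
rows = conn.execute(
    "SELECT user_id, day, total FROM daily_rollup ORDER BY user_id, day"
).fetchall()
print(rows)  # [(1, '2024-01-01', 15.0), (2, '2024-01-02', 7.5)]
```

The trade-off the poster describes falls out of this shape: queries are fast because they scan the rollup, but freshness is bounded by the refresh schedule, and every new aggregate means re-running the job over the full history.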

If a replica server cannot execute the queries fast enough, a specialized database optimized for OLAP workloads should be used instead. This can be a cloud service (BigQuery, Redshift, Snowflake, MotherDuck) or a self-hosted solution (ClickHouse, PostgreSQL with the pg_analytics extension, or even in-process DuckDB). Data sync is performed either as a scheduled full copy (simple, but not suitable for near-real-time analytics) or via CDC (see Airbyte).
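The incremental alternative to the scheduled full copy mentioned above can be approximated with a high-water-mark sync: each run copies only rows newer than the last one it saw, which is close to what CDC tools like Airbyte or Debezium do from the database's change log. A minimal sketch with SQLite standing in for both the OLTP source and the OLAP destination (all names are hypothetical):

```python
import sqlite3

# Stand-ins: 'source' is the OLTP database, 'dest' the analytics store.
source = sqlite3.connect(":memory:")
dest = sqlite3.connect(":memory:")

source.executescript("""
    CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL);
    INSERT INTO orders (amount) VALUES (10.0), (20.0);
""")
dest.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL)")

last_synced_id = 0  # high-water mark, persisted between runs in practice

def incremental_sync():
    # Copy only rows newer than the high-water mark, not the whole table.
    global last_synced_id
    new_rows = source.execute(
        "SELECT id, amount FROM orders WHERE id > ?", (last_synced_id,)
    ).fetchall()
    dest.executemany("INSERT INTO orders VALUES (?, ?)", new_rows)
    if new_rows:
        last_synced_id = new_rows[-1][0]
    return len(new_rows)

copied_first = incremental_sync()   # initial run copies the 2 existing rows
source.execute("INSERT INTO orders (amount) VALUES (30.0)")
copied_second = incremental_sync()  # next run copies only the 1 new row
print(copied_first, copied_second)
```

Real CDC works from the write-ahead log rather than polling by key, so it also captures updates and deletes; the sketch only handles append-only inserts, but it shows why incremental sync keeps the analytics store near real time without rescanning years of history.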