
A short story from the data analysis space, a place where clever and underpaid people are discouraged from materializing logic into tables.

It took me a while to realize that the data consumers (those who need to look at and understand data) didn't have any orchestration tools, except a creaky cron-style UI scheduler.

For my work I refused to compromise and spend all my days clicking around in a UI, so I searched for a good tool and decided on Prefect. It served me well, but I didn't realize going in that Prefect/Airflow/Argo etc. are really orchestration engines, and you still need to write the client software around them to serve your own (or the team's) productivity.

For example: connections to sources, methods to extract from a source with a SQL file, where to put the data afterward, injecting it into publication channels, and so on. I gradually ended up writing functions to support all of this for my own personal IC work. I sank a ton of time into learning; I had to learn how to develop Python packages (I chose Poetry), and on and on. And yet, despite all that time spent grinding away on indirectly productive learning, I have still been more productive than my peers.
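
To give a flavor of the glue code I mean, here is a minimal sketch, assuming Prefect 2.x, SQLAlchemy, and pandas; the connection string, SQL file path, and output location are all hypothetical placeholders, not anything from my actual setup:

    # Sketch of orchestrator "client" glue: read a SQL file, run it against
    # a source, and land the result somewhere downstream as a table-like artifact.
    # Assumes Prefect 2.x, SQLAlchemy, and pandas; all names/paths are hypothetical.
    from pathlib import Path

    import pandas as pd
    from prefect import flow, task
    from sqlalchemy import create_engine


    @task
    def extract(sql_path: str, conn_str: str) -> pd.DataFrame:
        """Run the query stored in sql_path against the source connection."""
        query = Path(sql_path).read_text()
        engine = create_engine(conn_str)
        with engine.connect() as conn:
            return pd.read_sql(query, conn)


    @task
    def load(df: pd.DataFrame, out_path: str) -> str:
        """Materialize the result as Parquet so other tools can point at it."""
        df.to_parquet(out_path, index=False)
        return out_path


    @flow
    def sql_to_parquet(sql_path: str, conn_str: str, out_path: str) -> str:
        return load(extract(sql_path, conn_str), out_path)


    if __name__ == "__main__":
        # Hypothetical source and destination; swap in your own.
        sql_to_parquet(
            "queries/daily_orders.sql",
            "postgresql://user:pass@warehouse:5432/analytics",
            "s3://my-bucket/marts/daily_orders.parquet",
        )

None of this is hard, but someone has to write and maintain it, and that is the part the orchestration engine doesn't give you out of the box.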

It was a mystery, so I spent time learning about my peers' workflows, which people are so cagey about at my company. Anyway, everybody just crams hundreds of lines of business logic into Tableau custom SQL sources to avoid the scrutiny that data artifacts like tables on the data lake would attract. I guess these are Tableau-flavored views, but it's so hard to read and understand all that logic in the UI -- oh, and the calculated fields too.

I guess to sum up: if I can keep up with peers using the latest expensive enterprise 'data storytelling' service while I self-educate on everything from data engineering to stats and presentation plots, Spark, Julia, Python, etc., then I have to conclude that Tableau should be considered harmful.



