PostgreSQL is awesome, but I don't see any advantage when you need to add something like 40 aggregations plus joins to 10 tables (easy if you denormalize into Elasticsearch).
I can think of a bunch off the top of my head that mostly stem from a serious reduction in complexity:
- having one data store instead of two
- no data store synchronization and related consistency headaches
- the ability to easily join/restrict based on business logic (especially access control)
- easier deployments
- easier qa/local setups
- fewer failure scenarios
- although the FTS syntax is non-standard, you're still dealing with SQL and can EXPLAIN/ANALYZE your queries with tools you already know if you understand Postgres
- the ability to extend my existing (Spring Boot) integration tests to test new searching logic without having to worry about a separate test db.
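On the EXPLAIN/ANALYZE point: a full-text query is just another SQL query, so the usual plan-inspection workflow applies. A minimal sketch, assuming a hypothetical `posts` table with a precomputed `search_vector` tsvector column:

```sql
-- EXPLAIN ANALYZE shows whether the planner actually uses the GIN index
-- (look for "Bitmap Index Scan" in the output) and the real execution time.
EXPLAIN ANALYZE
SELECT id, title
FROM posts
WHERE search_vector @@ websearch_to_tsquery('english', 'travel tips')
ORDER BY ts_rank(search_vector, websearch_to_tsquery('english', 'travel tips')) DESC
LIMIT 20;
```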
I've been building a travel blogging platform as a side project that has ~250 users. Through a couple of choice materialized views to create search documents from a number of different tables, Postgres FTS, and some GIN indices, I was able to make a lightning-fast search with intelligent suggestions (i.e. using user-specific restrictions such as follower/following, places a user has posted, etc.).
Is my search perfect? No. I could do a better job normalizing diacritics, stemming could be improved, etc. But it was dead simple to implement well enough in a very short amount of time. I'm very happy with my current solution. There's a time and place for a dedicated search solution but it's certainly not in fledgling systems without massive load.
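The setup described above can be sketched roughly like this (table and column names are hypothetical, not the actual schema):

```sql
-- A materialized view that flattens several tables into one weighted
-- search document per post; the GIN index makes @@ lookups fast.
CREATE MATERIALIZED VIEW search_documents AS
SELECT p.id,
       p.author_id,
       setweight(to_tsvector('english', coalesce(p.title, '')), 'A') ||
       setweight(to_tsvector('english', coalesce(p.body,  '')), 'B') ||
       setweight(to_tsvector('simple',  coalesce(pl.name, '')), 'A') AS doc
FROM posts p
LEFT JOIN places pl ON pl.id = p.place_id;

CREATE UNIQUE INDEX search_documents_id_idx ON search_documents (id);
CREATE INDEX search_documents_doc_idx ON search_documents USING gin (doc);

-- Refresh without blocking readers (needs the unique index above):
-- REFRESH MATERIALIZED VIEW CONCURRENTLY search_documents;
```

User-specific restrictions (follower/following, etc.) are then just extra joins against the view in the search query itself.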
Missed the part about serving the response below 50ms :)
Not too much traffic: between 40 and 50 req/s, plus a lot of other, more important things being stored in Postgres (transactions, stocks, ...).