Consider a common use case of wanting to mix SQL structured data and JSON structured data:
You want a logging service that records events in an SQL table. Each event has base attributes common to all events, such as date, event name, component name, user id ... these become columns in the SQL table. Some events also include arbitrary custom data particular to that kind of event. You want that data in the database too, but you don't want to play whack-a-mole with the table schema to make sure every possible custom parameter has its own column. In that case, just stash the custom data as JSON, and hopefully the database server offers JSON operators that let you access fields within the JSON data via SQL.
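In PostgreSQL, this pattern might look something like the following sketch (the table and field names here are made up for illustration):

```sql
-- Hypothetical events table: fixed base columns plus a JSONB blob
-- for whatever custom data each event type carries
CREATE TABLE events (
    id         bigserial   PRIMARY KEY,
    event_date timestamptz NOT NULL DEFAULT now(),
    event_name text        NOT NULL,
    component  text        NOT NULL,
    user_id    bigint,
    custom     jsonb       -- arbitrary per-event-type data, no schema changes needed
);

-- The ->> operator extracts a JSON field as text, so the custom
-- data stays queryable from plain SQL
SELECT event_date, custom->>'order_id' AS order_id
FROM events
WHERE event_name = 'checkout_failed'
  AND (custom->>'retry_count')::int > 3;
```

A GIN index on the `custom` column (e.g. `CREATE INDEX ON events USING gin (custom)`) would let containment queries like `custom @> '{"region": "eu"}'` use the index instead of scanning every row.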
GIN index on all the text arrays. There are about 150k events saved per day, and PostgreSQL is running on the same kind of hardware as SQL Server was. The application is quite read-heavy.
There are actually times I find it helpful to break 1NF in PostgreSQL. This is certainly not free, but it really helps some sorts of queries, and it avoids having to do a sort of pseudo-EAV to get certain things working.
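A sketch of what breaking 1NF with a GIN-indexed array might look like (column and index names are hypothetical):

```sql
-- A text array column instead of a separate tags join table (breaks 1NF)
ALTER TABLE events ADD COLUMN tags text[] NOT NULL DEFAULT '{}';

-- A GIN index makes array-containment lookups fast
CREATE INDEX events_tags_gin ON events USING gin (tags);

-- @> tests containment: find events tagged both 'billing' and 'urgent'
-- without any joins against a child table
SELECT id, event_name
FROM events
WHERE tags @> ARRAY['billing', 'urgent'];
```

The non-1NF trade-off is real: you lose foreign-key enforcement on the individual array elements, but you avoid the join (or the pseudo-EAV table) for the common read path, which matters in a read-heavy workload.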