No it won't! In general it will have worse performance. Deliberate denormalization can be used to optimize for particular queries, but it will make other operations more expensive and/or error-prone. So it has to be done very carefully and with a full understanding of cost vs. benefit.
And many database systems support materialized queries, which give the same optimization without having to compromise the schema.
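A minimal sketch of the idea, in PostgreSQL syntax (table and column names are made up):

    -- Precompute an expensive aggregate once, without touching the base schema.
    -- "orders" and its columns are hypothetical.
    CREATE MATERIALIZED VIEW order_totals AS
    SELECT customer_id, SUM(amount) AS total_spent
    FROM orders
    GROUP BY customer_id;

    -- Reads hit the precomputed result instead of re-aggregating.
    SELECT total_spent FROM order_totals WHERE customer_id = 42;

The base tables stay normalized; only the derived, read-optimized copy is denormalized.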
Have you ever used a complex database in formal, by-the-book third normal form?
The number of tables explodes, the number of joins becomes a pain, and every single data change requires careful planning.
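A contrived example of what I mean (hypothetical schema): pulling one order's line items for a single screen already takes a chain of joins:

    -- Fully normalized: four tables for one screen of data.
    SELECT c.name, p.title, oi.quantity
    FROM orders o
    JOIN customers c ON c.id = o.customer_id
    JOIN order_items oi ON oi.order_id = o.id
    JOIN products p ON p.id = oi.product_id
    WHERE o.id = 42;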
As for the "many database systems support materialized queries", I only know of a few that truly do: Oracle and DB2.
PostgreSQL supports them, but you need to handle updating them yourself. SQL Server kind of supports them (indexed views), but they are limited.
In other RDBMSs you can work around that using triggers (or similar concepts), but that IMHO is definitely a bad practice, and the times I've used it (usually because there was no other option) have only brought me grief.
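For reference, the manual-refresh dance in PostgreSQL looks roughly like this, and the trigger below is the kind of workaround I mean (names are made up; note that CONCURRENTLY requires a unique index on the view):

    -- Manual refresh; CONCURRENTLY avoids locking readers.
    REFRESH MATERIALIZED VIEW CONCURRENTLY order_totals;

    -- Trigger-based workaround (the bad practice mentioned above):
    CREATE FUNCTION refresh_order_totals() RETURNS trigger AS $$
    BEGIN
      REFRESH MATERIALIZED VIEW CONCURRENTLY order_totals;
      RETURN NULL;
    END;
    $$ LANGUAGE plpgsql;

    CREATE TRIGGER trg_refresh_order_totals
    AFTER INSERT OR UPDATE OR DELETE ON orders
    FOR EACH STATEMENT
    EXECUTE FUNCTION refresh_order_totals();

With the trigger version, every single write now pays the full refresh cost, which is one of the ways this has bitten me.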
Denormalization means you (by definition) have duplicate and redundant data in the tables. This certainly requires careful planning around updates to avoid data corruption.
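A toy example of the failure mode (hypothetical tables): once customer_name is copied into orders, every rename has to hit both tables, and any code path that forgets one of them silently corrupts the data:

    -- Denormalized: customer_name duplicated into every order row.
    UPDATE customers SET name = 'Acme Corp' WHERE id = 42;
    UPDATE orders SET customer_name = 'Acme Corp' WHERE customer_id = 42;
    -- Any code path that runs the first UPDATE but not the second
    -- leaves orders disagreeing with customers.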
> The number of tables explodes
This is IMHO a backwards way to understand normalization. Normalization tells you how many "logical relations" you have in the data - whether you like it or not. If you decide to squash multiple logical relations into the same physical table, then you get redundant data (by definition). This might be a fine optimization, but it doesn't reduce the inherent complexity of the data model. It just saves some joins at the cost of more complex update logic.
I think there's a decent amount of hyperbole, but even taken at face value: Every single data change should require careful planning. Especially if it's a schema change, and not just a query.
You can do a lot with generated code, refactoring, etc. Data is much less agile, so I'm a lot more conservative on that end. Never mind that with decent naming, table comments, etc., the schema is good documentation in itself.
> No it won't! In general it will have worse performance.
Since the advice is to only denormalize to improve performance, this would be a contradiction. Yes, it makes everything more error-prone, but if you're profiling to check performance, then you only keep the denormalizations that improve the performance you care about.
Yep, exactly: materialised views are the ideal way to create query-specific schema, but there's always a cost; namely, refreshing them needs to be done thoughtfully.
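One way to do it thoughtfully is to take the refresh off the write path entirely, e.g. with the pg_cron extension (assuming it's installed; the schedule and names are illustrative):

    -- Refresh every 5 minutes instead of on every write (pg_cron extension).
    SELECT cron.schedule(
      'refresh-order-totals',
      '*/5 * * * *',
      'REFRESH MATERIALIZED VIEW CONCURRENTLY order_totals'
    );

The trade-off is staleness: readers can see data up to one refresh interval old, which has to be acceptable for the queries the view serves.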