For 99% of code, worrying about the number of cycles in an operation is an utter waste of time. It just doesn't matter. Correctness and readability trump machine efficiency.
Do you work on real-time systems, embedded code, or something similar?
You are repeating a point of view you do not understand.
You don't know WHY people started saying that, WHEN they started, or WHO started it.
It was started a while ago by people who, for the simple stuff they were writing for DESKTOP computers, could honestly say it didn't matter anymore.
Indeed, if you had a 486dx4 and all you wanted to do was word processing, it didn't matter much whether you ran a hand-optimized editor or Microsoft Word; the machine was already way too powerful for that kind of work.
Today, battery life is a concern, virtualization is a reality, scalability is a CORE issue, there are low power states, etc.
Today, making your application 100 times more efficient gives you 10x more battery life, 100x lower cloud hosting costs, 100x better scalability, etc.
Think that's unlikely? You've been stacking inefficient blocks for a lifetime, sometimes with the inefficiencies multiplying; where do you think that leaves you today?
Simple example: going from a pgsql>jdbc>jboss>j2ee>hibernate>java report factory to a dumb PHP script doing simple SQL already gives you factors above 20 in favor of the simple solution.
That's before you improve the data model, try a faster language, or pick a data store better suited to your needs.
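To make the "dumb script doing simple SQL" side concrete, here is a minimal sketch of the idea, using Python's stdlib sqlite3 as a stand-in (the post talks about PHP and PostgreSQL; the table, columns, and data here are invented for illustration):

```python
import sqlite3

# In-memory database standing in for the real report store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, region TEXT, total REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, "EU", 10.0), (2, "EU", 5.0), (3, "US", 7.5)],
)

# One direct aggregate query replaces an ORM session, entity mappings,
# and per-row object hydration: the database does the work once, and the
# script just reads the result.
rows = conn.execute(
    "SELECT region, SUM(total) FROM orders GROUP BY region ORDER BY region"
).fetchall()
print(rows)
```

The point is not the language but the shape: the whole report is one query and one loop over its results, with no framework layers between the request and the data.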
Besides, your argument is nonsense: the simplest, most correct way IS the most efficient. That's the power of programming; there is absolutely NO compromise between reliability and efficiency in code.
Readability is overrated: as long as you don't write crap, any COMPETENT coder will be able to read and understand it fast enough, even without comments.
Do you work on overweight UIs that drain phone batteries, or cloud-hosted applications, or anything that needs scaling?
Do you work on real-time systems, embedded code, or something similar?