I can agree with this sentiment. There's a big narrative on HN about corporate drones using C# and Java to implement specifications from on high versus elite hackers using cool languages such as Clojure, Ruby, Python, etc. The truth, of course, is that the narrative is not 100% accurate. I think it really boils down to the individual's interest in technology and willingness to engage in the workplace. Those who develop software as a profession, regardless of the workplace, will seek to be engaged and active in the technology being used, even if that just means debating with coworkers. Those who are in technology for a job, on the other hand, will simply do what is asked of them without much thought. Despite claims to the contrary, both paths are valid.
Some of the best experiences I've had developing have been with the uncool languages, where we had the ability to design the system end-to-end, get feedback from stakeholders, and adjust as necessary. Crunch was rare because we knew the system inside and out and could easily manage our technical debt as we went. These days, everyone [here] thinks programming is little more than throwing as many ready-made pieces as possible at a problem and duct-taping them together to call it a solution, because "productivity" and "don't reinvent the wheel" and "community".
The result is mediocre products with bog-standard interfaces (productivity), suboptimal architectures (so devs are easily replaced), and speed prized over quality. We cling to design fads, rather than creating original designs.
In short, it reflects a culture-wide lack of ambition. It's easy to create a cat-picture-sharing app. It's not easy to think beyond that, because we're mentally beholden to the idea that there isn't enough money, that we need to get ours now, and that only then can we go after our big ideas.
Tech was supposed to be transformative. Now it asks us to be impressed by business models that prey on addictive behavior, lean on advertising, and annoy users, because we lack the imagination and the will to do anything better.