Didn't see the original comment, but why wouldn't Apple do this? It could be as simple as more complex visual effects (blurring, stereoscopic views) and more features in general causing low-RAM devices to suffer.
It doesn't have to be evil or crazy that they do this. In fact, it would be strange if they worried excessively about preserving the performance of every legacy device.
I think there's malicious, and then there's new expectations. On the desktop side recently, Apple's legacy retention has been great: you have iMacs from 2007, I think, running El Capitan (albeit a mildly neutered version of it) and treating it like any other update. I've worked with a lot of now-legacy machines that, spec-wise, are fit for purpose (write a paper, read some stuff on Facebook) but were formerly unable to do so simply because of software restrictions: individual browsers were no longer supported on OS X 10.7 or lower. Now these older machines have a new life.
iOS is slightly different: even as Apple works to retain legacy devices, each major iOS iteration makes performance on older devices suffer a little. But with the most recent iOS release they started focusing on slicing down unnecessary parts of apps to save space.
Apple has really been working decently on the preservation leg of their lineup.
I'm not disputing that newer OSes run slower on older devices.
I'm disputing that this is done intentionally to degrade performance and make it more attractive to upgrade - which is what the original comment stated.
They actually disable most of the new effects on older hardware, which indicates that they want the software to perform acceptably.
And more to the point, supporting older hardware at all extends the useful life of that hardware by allowing it to run modern applications and have access to new features.