This change is partly because of web developers' push for a "richer" and more full-featured -- that is, closer to native code parity -- web platform. There's a spectrum in terms of how conscious web developers and web standards developers may have been of the privacy risks (and many people in W3C committees have worked hard to remove fingerprinting risks from their specs before deployment), but there's been a steady tension between privacy and new functionality for web developers in the web platform.
Of course, things don't have to have been designed as tracking mechanisms in order to end up as tracking mechanisms. One of my favorite (?) examples is HSTS, which can be used as a cookie: you tell a browser to load 00001.example.com, 00100.example.com, and 01000.example.com securely (while never mentioning 00010.example.com or 10000.example.com). Later, if you tell the browser to load one-pixel images from the plain-HTTP versions of all five subdomains, you can see which requests it silently upgrades to HTTPS and which it doesn't, and read the bit pattern back out. (This risk is mentioned in section 14.9 of RFC 6797.)
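To make the mechanism concrete, here's a rough TypeScript sketch of how a tracker might use it. The tracker.example domain, the bit-labelled subdomains, the pixel.gif path, and the writeId/readId names are all made up for illustration, and the server side (which sends the Strict-Transport-Security header and logs which probe requests arrive over HTTPS) is only described in comments:

    // Hypothetical domain and paths, for illustration only.
    const BITS = 5;
    const subdomain = (i: number): string =>
      // bit i of 5 -> e.g. "00100.tracker.example" for i = 2
      (1 << i).toString(2).padStart(BITS, "0") + ".tracker.example";

    // "Write" an ID: for each set bit, fetch an HTTPS resource from that bit's
    // subdomain. The response carries something like
    // "Strict-Transport-Security: max-age=31536000", so the browser remembers
    // to always upgrade that subdomain to HTTPS.
    async function writeId(id: number): Promise<void> {
      for (let i = 0; i < BITS; i++) {
        if (id & (1 << i)) {
          await fetch(`https://${subdomain(i)}/pixel.gif`, { mode: "no-cors" });
        }
      }
    }

    // "Read" the ID back: request the plain-HTTP URL of every bit subdomain.
    // Wherever HSTS state was stored, the browser silently rewrites the request
    // to HTTPS; the server logs which scheme each probe arrived on and
    // reconstructs the bit pattern.
    function readId(): void {
      for (let i = 0; i < BITS; i++) {
        const img = new Image();
        img.src = `http://${subdomain(i)}/pixel.gif?probe=1`;
      }
    }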
An important historical example that stuck around for a long time was the CSS history bug (JavaScript in a web page could query whether a link's style had changed to match the style of a visited link, revealing which URLs were in the user's history).
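For reference, the query looked roughly like this. This is a hedged sketch of the old attack; modern browsers deliberately lie to getComputedStyle about :visited styling, so it no longer works, and the "probe" class and colors here are arbitrary:

    // Inject a rule so visited probe links get a distinctive color.
    const style = document.createElement("style");
    style.textContent =
      "a.probe { color: rgb(0, 0, 255); } a.probe:visited { color: rgb(255, 0, 0); }";
    document.head.appendChild(style);

    // The old trick: create a link to a candidate URL and check whether its
    // computed color matches the :visited rule.
    function wasVisited(url: string): boolean {
      const a = document.createElement("a");
      a.className = "probe";
      a.href = url;
      document.body.appendChild(a);
      const visited = getComputedStyle(a).color === "rgb(255, 0, 0)";
      a.remove();
      return visited;
    }

    // Probe a list of URLs the page has no business knowing about.
    const visitedUrls = ["https://example.com/", "https://example.org/"].filter(wasVisited);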
An example showing that completely unintended tracking consequences were sneaking into web protocols long ago was Martin Pool's "meantime", described back in 2000. (He has a broken link suggesting that someone may have expressed concern about this as far back as 1995.)
http://sourcefrog.net/projects/meantime
And there are a lot of unintended browser fingerprinting effects from various UA properties that sites can query with JavaScript:
https://panopticlick.eff.org/
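As a rough illustration, a page can combine a handful of these properties into a fairly stable identifier in a few lines of JavaScript/TypeScript. The particular signals and the SHA-256 hashing below are just one plausible combination, not Panopticlick's method; real fingerprinters use many more signals (canvas, fonts, audio, and so on):

    // Combine a handful of JavaScript-visible UA properties into one string,
    // then hash it into a compact identifier.
    async function fingerprint(): Promise<string> {
      const signals = [
        navigator.userAgent,
        navigator.language,
        (navigator.languages || []).join(","),
        String(navigator.hardwareConcurrency),
        `${screen.width}x${screen.height}x${screen.colorDepth}`,
        Intl.DateTimeFormat().resolvedOptions().timeZone,
        String(new Date().getTimezoneOffset()),
      ].join("|");
      const bytes = new TextEncoder().encode(signals);
      const digest = await crypto.subtle.digest("SHA-256", bytes);
      return Array.from(new Uint8Array(digest))
        .map((b) => b.toString(16).padStart(2, "0"))
        .join("");
    }

    // Usage: fingerprint().then(id => console.log("fingerprint:", id));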
It's apparently possible to break many of these tracking methods, as the Tor Browser systematically tries to do, but you have to give up a lot of local history and a bit of web platform functionality.
https://www.torproject.org/projects/torbrowser/design/#Imple...
Given what I've heard about web developer pushback against fixing the CSS history query bug, I'm scared to imagine the response to trying to mainstream some of the Tor Browser's fixes in popular browsers!