These kinds of articles always make me think about how big a privacy risk the laws that make these possible are. We're worried about user privacy, so we mandate that companies build a way to connect PII to otherwise-pseudonymous browsing data in case the user asks for it. What if someone else asks for it? Huge security risk.
That's not to say these laws are bad, just that the giant additional privacy risk they pose is kind of funny to think about.
It's a somewhat common pattern that regulation intended to stop the encroachment of some bad thing also effectively causes a standard to form around the (now) worst thing you can do and still be legal.
Some people would be paid less: those who produce output below the min wage level but whom the employer has no way to avoid hiring, so it just eats the difference; and those who produce output above the min wage but compete with many people willing to do the same work for min wage, so they have no leverage to hold out for more. Some would be paid more: those who produce output below the min wage level and whom the potential employer therefore doesn't hire because it can't eat the difference, so right now they get no wage at all.
In the end, the free market would decide how much 1 hour of work is worth. If employees are sufficiently desperate, this could be below what it takes to keep the human doing the work alive.
IDK about that. I did say "somewhat common." It emerges in some circumstances.
GDPR is a decent example. The minimum level of privacy GDPR requires has become the de facto standard wherever a standard is needed: banks, regulators, and the like now expect you to have as little privacy as GDPR allows. Everything allowable under GDPR is now semi-mandatory for KYC.
Privacy/data-related regs really have this kind of tendency. If you must destroy records after X years, that often hardens into a mandate to keep records for X years.