Hacker News

> The new version of NO FAKES requires almost every internet gatekeeper to create a system that will a) take down speech upon receipt of a notice; b) keep down any recurring instance—meaning, adopt inevitably overbroad replica filters on top of the already deeply flawed copyright filters; c) take down and filter tools that might have been used to make the image; and d) unmask the user who uploaded the material based on nothing more than the say so of person who was allegedly “replicated.”

You already need point a) to be in place to comply with EU laws and directives (the DSA, anti-terrorism rules [1]) anyway; I think the UK has anti-terrorism laws with similar wording, and the US has similar requirements under its CSAM laws.

Point b) is already required if you operate in Germany: there have been a number of court rulings that platforms have to take down repeat uploads of banned content [2].
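The "keep down" duty in point b) implies some fingerprint-and-match pipeline on the platform side. A minimal Python sketch (all names hypothetical; real systems use far sturdier perceptual hashes) shows the basic mechanism, and hints at why such filters skew overbroad: an exact hash misses trivially re-encoded uploads, so platforms must match fuzzily, and the fuzzier the threshold, the more lawful near-matches get swept up.

```python
# Illustrative "keep down" filter: ban an image once, then block near-duplicate
# re-uploads via a tiny perceptual hash. Names and thresholds are made up for
# this sketch, not taken from any real moderation system.

def average_hash(pixels):
    """1 bit per pixel: set if the pixel is brighter than the image's mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return tuple(1 if p > mean else 0 for p in flat)

def hamming(h1, h2):
    """Number of differing bits between two hashes."""
    return sum(a != b for a, b in zip(h1, h2))

class KeepDownFilter:
    def __init__(self, threshold=2):
        self.banned = []           # hashes of content taken down on notice
        self.threshold = threshold  # fuzziness: higher catches more variants
                                    # (and more false positives)

    def ban(self, pixels):
        self.banned.append(average_hash(pixels))

    def blocks(self, pixels):
        h = average_hash(pixels)
        return any(hamming(h, b) <= self.threshold for b in self.banned)

# A banned 4x4 grayscale "image" and a slightly re-encoded copy of it.
original = [[10, 10, 200, 200]] * 4
variant  = [[12,  9, 198, 201]] * 4  # exact hashing would miss this copy
f = KeepDownFilter()
f.ban(original)
print(f.blocks(variant))  # True: the near-duplicate is kept down
```

The tension the court rulings create is visible in `threshold`: set it low and repeat uploaders win by re-encoding; set it high and transformative or fair-use content starts matching too.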

Point c) is something that makes sense, it's time to crack down hard on "nudifiers" and similar apps.

Point d) is the one I have the most issues with, although that's nothing new either: unmasking users via barely fleshed-out subpoenas or dragnet orders has been a thing for many, many years now.

This thing impacts gatekeepers, so not your small mom-and-pop startup but billion-dollar companies. They can afford to hire proper moderation staff to handle such complaints; they just don't want to because it impacts their bottom line - at the cost of everyone affected by AI slop.

[1] https://eucrim.eu/news/rules-on-removing-terrorist-content-o...

[2] https://www.lto.de/recht/nachrichten/n/vizr6424-bgh-renate-k...






This is one of those cases where the need to "do something" is strong, but that doesn't excuse terrible implementations.

Especially at a time when the US is becoming increasingly authoritarian.


The EU has a different approach to this kind of regulation than the USA [0]. EU regulations are more about principles and outcomes, while US regulation is more about strict rules and compliance with procedures. The EU tends to only impose fines if the regulations are deliberately being ignored, while the US imposes fines for any non-compliance with the regs.

So while you can compare the two, it's not an apples-to-apples comparison. You need to squint a bit.

The DMCA has proven to be way too broad, but there's no appetite to change that because it's very useful for copyright holders, and only hurts small content producers/owners. This looks like it's heading the same way.

> This thing impacts gatekeepers, so not your small mom-and-pop startup but billion dollar companies.

I don't see any exemptions for small businesses, so how do you conclude this?

[0] https://www.grcworldforums.com/risk/bridging-global-business... mentions this but I couldn't find a better article specifically addressing the differences in approach.


Yeah one thing the EU as well as most local European courts care about is showing good faith, not just to the court but also to the other party. (Due to how US law works, this isn't quite the case in the US: respect for the court is enforced, respect for the other party isn't required.)

One of the big reasons the CJEU is handing out massive fines to the big tech companies is their blatant non-compliance with orders from local courts and DPAs: they constantly appeal and refuse to even attempt compliance until the CJEU affirms the local courts (which usually ramps up the fines a bit more at that point). It's a deliberate perversion of the judicial process so that GAFAM can keep violating the law by delaying the final judgment. They'd not have gotten higher fines if they had just complied with the local courts while the appeal was ongoing; that would be a show of good faith, which European courts tend to value.


Ding ding ding. You are right on the money. Our entire legal system is based on first-time offenders getting off with a slap on the wrist (if even that: if it's a new law, or you were just incompetent or had bad luck, you'll most likely get away with merely a warning), and only if you keep offending will you eventually get screwed. Our aim is to have as much voluntary compliance with the laws as possible.

In contrast, American jurisprudence is "come down hard from the get-go if it's warranted, but the richer you are, the better your chances of just lawyering yourself out of trouble", and the cultural attitude is "it's better to ask for forgiveness than permission". That fundamentally clashes with our attitude, and not just the general public but especially the courts dislike it when companies blatantly ignore our democratic decisions just to make short-term money (e.g. Uber and AirBnB).


As I understand it, it proposes such broad filters that more content which should fall under "fair use" will now be taken down faster.

> not your small mom-and-pop startup

Not sure why you said this; it's the artists / content makers that suffer.


None of which is acceptable

[flagged]


The fallacy is in expecting corporations to play the role of the government.

Suppose someone posts a YouTube video that you claim is defamatory. How is Google supposed to know if it is or not? It could be entirely factual information that you're claiming is false because you don't want to be embarrassed by the truth. Google is not a reasonable forum for third parties to adjudicate legal disputes because they have no capacity to ascertain who is lying.

What the government is supposed to be doing in these cases is investigating crimes and bringing charges against the perpetrators. That way, the government has to incur the costs of investigating the things it wants to pass laws against, and take the blame when charges are brought against people who turn out to be innocent, etc.

So instead the politicians want to pass the buck and pretend that it's an outrage when corporations with neither the obligation nor the capacity to be the police predictably fail in the role that was never theirs.


[flagged]


You need to expound on why, as your replies are not only unacceptable but remarkably useless.

Try dialogue.


Is it? Why should the big tech giants be exempted from the laws and regulations that apply for everyone else?

We don’t punish telecoms, ISPs or the mail company for “facilitating terrorism”. Where do you draw the line?

These rules have serious consequences for privacy, potential for abuse, and also raise the barriers immensely for new companies to start up.

The problem is quite obvious when you consider that Trump supporters label anything they dislike as fake news, even when the facts are known and available to everyone. These rules would allow any opposition to be easily silenced. Restricting the measures to terrorism, illegal pornography, and other serious crimes would be more acceptable.

Your question is like asking “why don’t we have metal detectors and body scanners on every school and public building”. Just because you can, and it would absolutely increase safety, does not mean it’s a good idea.

IMO legislation should focus on how individuals can be made responsible, and prosecuted when they break the law – not mandating tech companies to become arms of a nanny state.


> We don’t punish telecoms, ISPs or the mail company for “facilitating terrorism”. Where do you draw the line?

None of those attempt to curate, moderate, or pass judgment on the content they carry. They are essentially pipes that pass content through.

Social Media (and forums like HN) act more as publishers: they decide what is and is not allowed, and through that moderation, they are more responsible for what user content they choose to make available.

I’m not in favor of this law, but we should not pretend that Social Media is blameless when naughty people use it to publish their stuff.


Slippery slope. See how far we've fallen.

And yet, why are we here? It would seem some bad actors have led us to the now familiar "that's why we can't have nice things". (It's as though permissiveness and anonymity have a slippery slope as well.)

You can hate the legislation, but it would be nice to hear some alternative ideas that go beyond "we just have to accept all the horrific stuff that the bad actors out there want to throw onto the internet."



