I don't think we're ready for the consequences of this technology. My parents are immigrants and my first thought was this might get someone killed or locked up in the old country.
These free generators need to include some kind of audio watermark or key to indicate they are AI imitations. At least raise the barrier for this kind of action to having to run your own LLM or something.
> These free generators need to include some kind of audio watermark or key to indicate they are AI imitations. At least raise the barrier for this kind of action to having to run your own LLM or something.
It might be worth trying, but I'd bet that it's less than 6 months before running it locally means "download the app off the front page of your app store of choice".
There could be a requirement that apps listed in app stores include some sort of audio watermark (a toy sketch of the idea is below). While that wouldn't be perfect, since there will always be ways around it, it would still raise the barrier significantly and cut down on much of the abuse.
Most criminals are lazy and/or not very tech-savvy. Raising barriers and prosecuting the worst offenders cuts down on all sorts of malicious behavior that is technically feasible.
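To make "watermark" concrete: one classic approach is spread-spectrum marking, where a keyed pseudo-random sequence is mixed into the audio at an inaudible level and detected later by correlation. Here's a minimal sketch in Python with numpy; the function names, key, and amplitude values are all hypothetical, and this is a toy illustration of the idea, not a production scheme:

    # Toy spread-spectrum audio watermark (illustration only, not robust):
    # embed() mixes a keyed pseudo-random +/-1 sequence into float samples at
    # low amplitude; detect() correlates the audio against the same sequence.
    import numpy as np

    def _key_sequence(key: int, n: int) -> np.ndarray:
        rng = np.random.default_rng(key)
        return rng.choice([-1.0, 1.0], size=n)

    def embed(samples: np.ndarray, key: int, strength: float = 0.01) -> np.ndarray:
        """Add a quiet keyed noise sequence to float audio in [-1.0, 1.0]."""
        mark = _key_sequence(key, samples.shape[0])
        return np.clip(samples + strength * mark, -1.0, 1.0)

    def detect(samples: np.ndarray, key: int, threshold: float = 0.005) -> bool:
        """High average correlation with the keyed sequence means 'marked'."""
        mark = _key_sequence(key, samples.shape[0])
        return float(np.mean(samples * mark)) > threshold

    audio = np.random.default_rng(0).uniform(-0.5, 0.5, 48_000)  # 1 s at 48 kHz
    print(detect(embed(audio, key=1234), key=1234))  # True
    print(detect(audio, key=1234))                   # False

A toy like this wouldn't survive lossy re-encoding or deliberate removal, which is the "ways around it" caveat above; real schemes spread the mark across frequency bands and repeat it over time. The point is only to raise the bar, not to make removal impossible.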
> At least raise the barrier for this kind of action to having to run your own LLM or something.
I think that would result in the average person being less aware that these capabilities exist, and therefore less prepared to defend against them. It isn't as though this would be a world law that was universally enforced anyway.
It's the same pattern over and over: they develop a technology, acknowledge the risk of it being abused and the need for safeguards, but then realize that building in those safeguards will get in the way of turning it into a product, and just YOLO-release it into the wild anyway. The same thing happened with LLMs, which were deemed "too dangerous to release" due to the risk of producing a massive tidal wave of spam and propaganda, and yet here we are under a massive tidal wave of LLM spam, about to head into the first US election of the unrestricted-LLM era. The very first paper on image-generation diffusion models called out the risk of malicious uses such as deepfake nudes, and yet here we are in the era of one-click, zero-effort deepfake-nude generation services built on that very technology.
What's the point of considering potential abuses if you're just going to facilitate them regardless? If anything, that's worse than not considering abuse at all, because it implies that you know what you've created will result in kids killing themselves after fake nudes of them spread around their school, or will enable rampant fraud and extortion through voice cloning, but that you believe that's just the price of progress.
What? Are you saying OpenAI, Google, Meta, etc. did nothing to combat abuse and provide safeguards?
If so, that is easily provably wrong, unless you think there's an enormous conspiracy involving thousands of people to pretend that they're working on it when they really aren't.
That's a good point; we might not even need new policies. I bet detectives or a court subpoena could get records of internet history, and people might just be able to sue whoever generated the deepfakes that caused damages.
I'm really not sure why you're getting downvoted. It's almost as though HN readers are fully on the AI bandwagon and can't let anything bad be said about it. I assume this is the same crowd that also scoffs at any regulations in tech.
You were right the first time: You really don't understand why HN readers are downvoting the OP comment. Yet that doesn't appear to have stopped you from spinning up this AI-bandwagon-riding, regulation-hating strawman to further entrench your hatred toward them.
I'm not a downvoter, and I fully agree philosophically with your entire comment, so this is me trying to be helpful by telling you what I would guess based on way too much time over the years on HN.
> These free generators need to include some kind of audio watermark or key to indicate they are AI imitations. At least raise the barrier for this kind of action to having to run your own LLM or something.
My guess is that one of the reasons for the downvotes is that your proposed solution is impossible, and even attempting it would require a massively invasive, proactive law-enforcement effort that looks at everything people buy, electricity usage patterns, everything people post online, and more. It would require a regulatory and licensing department that has to approve and monitor people who want to run LLMs, even on their mobile phones. Just trying to define what is and isn't an LLM would be difficult. And even then, plenty would slip through the cracks. So you've essentially turned all of computing into a regulatory and authoritarian nightmare (the government absolutely would iterate on the regulation toward suppressing stuff it doesn't like), and still haven't met the original goals.

You don't have to look much further than movie/music/etc. piracy to see what it would be like. There have been significant efforts to eliminate piracy, and yet it is still very achievable for nearly anyone who wants to do it. Running LLMs would be the same way.
I would love it if we could require watermarks without having to pay that price, but when you consider the cost, it doesn't seem remotely worth it to me.
Now that said, I don't think people should have downvoted you. Much better would have been to articulate why they disagree with you in a comment. But downvotes are much easier than intellectual engagement.
That's not what a straw man is (it's two words, btw).
This is a known and often-repeated trend in tech: make something under the guise of disruption, with no regard for safety or regulations. Complain that those things are a hindrance to progress, then spend billions and permanently destroy lives before eventually fixing the problems, once the regulators issue an ultimatum that professionals warned you about.