I'd say it is highly unethical. Imagine thousands of people listening to fake podcasts as if they were real. Not only are they being lied to, but if they find out, many are sure to become very angry at having wasted their time.

Fortunately, it's not there yet, since it's obviously fake.




I don’t think it’s worse than how people already believe headlines, and even screenshots of headlines (popular on Twitter and Reddit), reacting to them without reading, much less demanding, the actual article, if there even was one.

You can even just make up a caption next to an image, like “this is Todd from Finland who raised $100k for the war in Ukraine”, and the pic might spread virally as everyone pats Todd on the back: “omg faith in humanity restored”.

And these scenarios are even less work to produce and easier to consume than your doomsday scenario. It’s already here, and you don’t even have to listen to an audio clip.


True. But listening to an hour-long or half-hour podcast is a considerably bigger investment of the listener's time than a simple headline.


If you prohibit it then it will only be done secretly by governments and then used against naive populations. At least this way the population can learn how to identify and defend against it.

If we had transparent governance then maybe this wouldn't be a problem.


Why wouldn't people learn to spot it when the government does it? And I don't know if your argument follows: people clearly either don't know or don't care about the rise of false reporting on Facebook.


How do people learn better: lots and lots of practice with varying levels of quality and published examples, or against a single powerful adversary that's trying to conceal what they are doing?

I.e. your question is stupid.


> many are sure to become very angry on learning they wasted their time

That is the best-case outcome. Given ample real-world evidence, however, I think most people will not get to this stage. They will be angry, yes, but at you for trying to introduce facts into their reality.


> wasted their time

That’s the best case scenario. It gets a lot worse than that.

I saw a (scam) video on YouTube that had Elon Musk saying “if you want to make money in crypto, the best way is doing xxx, for example using the website yyy.com”.

I had to do a spit take because it was very convincing for a layman. Both the audio and video were deepfaked, but you couldn’t really tell, other than the fact that it was uploaded in 240p.

If my mother had seen that, I’m sure she would have believed it since it’s something Elon would plausibly say. We’re entering a scary time.


I remember that. I reported it to YouTube and never heard anything back. Days later it was still running.

I remember looking at the blockchain and seeing over six figures sent to the address. It was also timed to line up with one of Musk's "big announcements", which helped make it plausible.

As you said, it was very convincing to a layman - but YouTube aren't laymen, and they must have had dozens if not hundreds of reports about that scam.

Thanks for reminding me of that.


Well, he only promotes scams, so it was extra believable!


Eh, celebs promote scams all the time. This new way just saves the cost of the bribe, which is a net positive.

More of a concern is putting lies into the mouth of someone who actually is legitimately credible.


> Dennett told Motherboard that these sorts of ethical considerations will be important in the future, when natural language processing systems become more available. “There are very dangerous prospects lying in the near future of this technology,” he said. “Copyright doesn’t come close to dealing with all of them. GPT-3 is a sort of automatic plagiarist, and unless great care is taken in how it is used, it can do great damage!”

Dennett said this after none of the experts familiar with his work were able to pick the real reply out of the GPT-generated ones. Seems like we are there, or at least have started down that path.

I'd say it is highly unethical if this technology is used for deception, or if permission is not asked. It's not unthinkable that someone could soon publish a high-quality book by fine-tuning on the books another author has already written.



