
> Dennett told Motherboard that these sorts of ethical considerations will be important in the future, when natural language processing systems become more available. “There are very dangerous prospects lying in the near future of this technology,” he said. “Copyright doesn’t come close to dealing with all of them. GPT-3 is a sort of automatic plagiarist, and unless great care is taken in how it is used, it can do great damage!”

Dennett said this after none of the experts familiar with his work were able to pick out his real reply from the GPT-generated ones. It seems we are already there, or at least well on the way.

I'd say using this technology for deception, or without the original author's permission, is highly unethical. It is not unthinkable that someone could soon publish a high-quality book simply by fine-tuning on the books another author has already written.



