
Even if that's true, why would that be illegal or unethical? She can't possibly have a copyright on all voices that sound like "her".



There have been cases where it was decided that a person had rights[0] to their distinctive voice, as an extension of the Right of Publicity[1]. For example, Midler v. Ford Motor Co.[2], and one or two other cases I've seen mentioned but can't remember.

[0]: Though not necessarily "copyrights"?

[1]: https://higgslaw.com/celebrities-sue-over-unauthorized-use-o...

[2]: https://en.wikipedia.org/wiki/Midler_v._Ford_Motor_Co.


Midler v. Ford is a poor comparison for this case, specifically because it involved: (1) hiring an impersonator, (2) copying a song, and (3) intending to confuse.

If what OpenAI is saying is true, then none of those conditions apply. Taking their claims at face value, I'd say (1) is unlikely, (2) is very unlikely, and (3) is a maybe, at least to some degree.


I would suggest that (3) is a solid "yes", given the other communications around this and, honestly, the similarity of the final tone of voice.

Very little suggests intent to confuse more strongly than tweets from company leaders confirming that there was intent to confuse. What's left on the table is whether actual confusion occurred, which is a different question.


NIL (name, image, and likeness) rights are pretty broad, and they're more like trademark rights than patent or copyright. The main test isn't similarity; it's whether there is brand confusion. Karpathy's and Altman's tweets indicate brand confusion.

Still, this isn't recognized in every state or country, and there isn't much case law yet (although there are statutes on the books).





