Hacker News

I don't think the comparison with "employees must wash hands" signs is apt. Should we see legal action over code produced by large language models of questionable heritage, the consequences will be dire for companies.

Being able to say "I told you so" to their (probably by then ex-) employees will gain them nothing. There will not be nearly enough to recover from the breaching ex-employee to compensate for any damages.



Let's say I go to a restaurant, get food poisoning, and sue the restaurant. Turns out the employee didn't wash their hands (or at least, that's the legal theory).

If the restaurant has signs like that up, along with consistent messaging to employees, new-employee training, and other "best practices", then I may still be able to sue them and win. But if they don't have all that, I may be able to sue for treble damages on the grounds of negligence.

That is, such measures may not remove liability, but they limit it.

Note well: IANAL. Others who know more, feel free to offer corrections.


I agree with that. I think the fundamental difference in Google's case is that the suing party is different from the addressee of the warning. Moreover, the suing party is powerful and the addressee is completely negligible. Hopefully courts see that, but who knows?


> Should we see legal action over code produced by large language models of questionable heritage, the consequences will be dire for companies.

I think we're already past the point of no return. Almost every active codebase in the world (at least the JS part of it) has been tainted by LLM-generated code at this point, if dependencies count.

If courts decided that the output of LLMs trained on GPL code was itself subject to the GPL, nearly all active code in the world would need to be released - which seems impossible to enforce.



