You can get models that run offline. The other risk is copyright/licensing exposure; e.g. the AI regurgitates a recognisably large chunk of GPL code, and suddenly you have a legal landmine in your project waiting to be discovered. There's no sane way for a reviewer to spot this situation in general.

You can ask a human not to do that, and they face real personal risk if they do it anyway. I'd like to see AI providers take on similar risks, instead of disclaiming them in their EULAs, before I trust them the way I might a human.