
I've been extremely critical of "AI Safety" ever since "how do I hotwire a car?" became the de facto example of things we can't let an LLM say.

There are plenty of good reasons why hotwiring a car might be necessary, or might even save your life. Imagine dying because your helpful AI companion won't tell you how to save yourself, on the grounds that the answer might be dangerous or illegal.

At the end of the day, a person still has to choose to act on what the AI says, and a person has to query the AI in the first place.



"I can't do that, Dave."



