How can you sign a statement that AI presents an extinction risk on par with nuclear weapons and then even consider open-sourcing your research?

We don't give everyone a nuclear weapon to keep in their basement, so why would someone who believes AI is an existential risk give everyone their code?



