If you're interested in existential risks, check out effective altruism. It's a philanthropic movement built around maximizing the amount of good we can do with our resources, and much of its research focuses on x-risks. I've come to the conclusion that the non-profit sector is the only one that can tackle x-risks: the work isn't profitable and there's no political will, which pretty much leaves EA researchers as the only people looking at the area.



Note that some popular effective altruists have a little bit of tunnel vision when it comes to X-risks, which leads them to overfund X-risk-due-to-AI over something like, say, X-risk-due-to-asteroids. (X-risk-due-to-nanotech is being suitably supported by not funding nanotech.)

Plus, some people working on the AI X-risk problem are doing it unpaid, which makes the overall funding picture a bit odd.


> tunnel vision when it comes to X-risks, which leads them to overfund X-risk-due-to-AI over something like, say, X-risk-due-to-asteroids.

Risks from natural events like asteroids are actually quite well understood, and we have fairly tight bounds showing the per-century risk is low. The natural-risks cause area probably deserves more funding at a global level, but EAs are definitely thinking about it. It's literally the first section in 80k's intro article about X-risks: https://80000hours.org/articles/extinction-risk/



