Hacker News

I think common-sense filters are what prevent accurate predictions of black swan events, like an AI singularity or the COVID outbreak. People who are usually capable of reasoning reach a conclusion (e.g., AI doom) and then dismiss it out of hand on grounds like "it's more likely my logic is flawed, because the prior is so low." But if you're confident in your reasoning, sometimes you have to accept that extremely low-prior things can happen.
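The trade-off in that comment can be made concrete with a toy model (my own framing, not something from the thread): if you assign credence r to your argument's logic being sound, and the event has base-rate prior p, then accepting the argument should push your credence toward r, not leave it pinned at p.

```python
def credence_after_argument(r: float, p: float) -> float:
    """Toy model: you hold an argument concluding some event E.

    r -- your credence that the argument's logic is sound
    p -- the base-rate prior of E

    If the logic is sound, E follows; if it's flawed, you fall
    back on the prior. Your overall credence is the mixture.
    """
    return r * 1.0 + (1.0 - r) * p

# Even with a vanishingly small prior, high confidence in the
# reasoning dominates the result:
print(credence_after_argument(0.9, 1e-6))  # ~0.9, not ~1e-6
```

The point of the sketch: dismissing the conclusion "because the prior is so low" is only coherent if r itself is low, i.e., if you genuinely distrust the argument's logic.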



Look, judging things by how they feel has gotten me this far. Why change now?


The pursuit of optimality would be one possible reason.

Judging things by how they feel is a big part of what brought climate change to our door.


It's gone 100% perfectly, and you've never once had a shower thought where you second-guessed yourself?



