I think common sense filters are what prevent accurate predictions of black swan events like an AI singularity or the COVID outbreak. People who are usually capable of reasoning reach a conclusion, e.g. AI doom, and then dismiss it out of hand with moves like "it's more likely my logic is flawed, because the prior is so low." But if you're confident in your reasoning, sometimes you have to accept that extremely low-prior things can happen.
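
A rough sketch of the arithmetic behind that last point (not from the original post; the numbers are made up): even when the base rate is tiny, your updated probability is bounded below by how much you trust the argument times how likely the event is if the argument holds, and that product can swamp the prior.

```python
# Toy numbers, purely illustrative: a tiny base rate vs. moderate trust in an argument.
prior = 0.001              # base rate you'd assign the event before hearing any argument
p_argument_sound = 0.30    # how much you trust the chain of reasoning
p_event_if_sound = 0.90    # if the argument holds, the event very likely follows
p_event_if_flawed = prior  # if the argument is junk, fall back to the base rate

# Total probability: weight each branch by whether the argument is sound.
posterior = (p_argument_sound * p_event_if_sound
             + (1 - p_argument_sound) * p_event_if_flawed)

print(f"prior: {prior:.3%}, after taking the argument seriously: {posterior:.1%}")
# prior: 0.100%, after taking the argument seriously: 27.1%
```

With these toy numbers the answer is driven almost entirely by how much you trust the reasoning, not by the tiny prior, so "the prior is low" only defeats the conclusion if you also think the argument itself is very likely flawed.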