> That humans regard the expected value of (arbitrarily large impact) * (arbitrarily small probability) as zero?
There are many arguments that go something like this: we don't know the probability of <extinction-level event>, but because it is considered a maximally bad outcome, any means to prevent it are justified. You will see these types of arguments made to justify radical measures against climate change or AI research, but also in favor of space colonization.
These types of arguments are "not even wrong": they cannot be made mathematically rigorous, because every term in the implied expected-value calculation is undefined, even before you reach for infinities. The nod to mathematics is purely aesthetic.
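A toy illustration of why the calculation is vacuous (all numbers here are made up for the sake of the example): since neither the probability nor the magnitude of the outcome is constrained by any measurement, equally "plausible" inputs yield conclusions that differ by many orders of magnitude.

```python
# Made-up inputs: tiny probability p times huge impact V.
# The "expected value" p * V swings wildly with unmeasurable guesses.
scenarios = [(1e-6, 1e12), (1e-12, 1e12), (1e-6, 1e18)]
for p, v in scenarios:
    print(f"p={p:g}, V={v:g} -> E = {p * v:g}")
```

The three lines print expected values of 1e+06, 1, and 1e+12: the conclusion is driven entirely by which unverifiable numbers you happened to pick, which is the sense in which the argument only borrows the look of mathematics.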