The relative surface area AI represents in the existential threats pile is notable but only a slice of the pie.
But there's not much in that pile for mitigating all the other, human-driven threats, short of pulling a deus ex machina out of our rear ends. So while we should definitely discuss AI's inherent risks, we should also be discussing the opportunity cost of delaying the addition of greater scalable intelligence to the mix.
This letter reads like myopic fear-mongering by people more caught up in decades-old thought experiments about paperclips than in the realities facing us today.