Some of the problems you are mentioning are already solved by the airline industry. How can an airplane fly on autopilot in heavy fog? Radar and satellite navigation.
And most of the disasters you've mentioned seem to have happened due to errors in human judgement. A machine can take in a lot of information at a very high rate and make decisions that are as good as, if not better than, a human's.
Aircraft operational environments tend to include far fewer navigational hazards (though drone proliferation could change this).
Aircraft final-approach zones are very tightly controlled. Any idiot, or even a dumb lump of wood, can float through the water a vessel has to transit.
Not all that long ago, while rowing through a local harbor (a transport mode in which one faces sternwards and periodically checks over the shoulder for hazards), I spotted a large timber, likely broken from a pier: perhaps 0.3-0.6 meters across and 7-8 meters long, with an iron spike or bolt some 30 cm long protruding from one end. I passed only 3-4 meters from it. A week or so later another boat in our fleet struck a piece of driftwood and suffered a hull puncture (fortunately just above the waterline).
Timber, lumber, telephone poles, and similar debris are common in the water. They can be significant hazards not only for small craft but for larger vessels with fiberglass, wood, or even steel hulls. It's rare for an aircraft to encounter a tree at 10,000 m.
Yes, human error is a significant factor in many disasters, but human response is also frequently a factor in surviving them. This is particularly the case where automated systems or navigational inputs turn out to be erroneous.
The Air France 447 incident is an interesting case study in this: a design flaw (the pitot tubes) led to an operational obstruction (pitot-tube icing), resulting in loss of navigational data (airspeed), which caused the automated systems to disengage (autopilot disconnect) and the fly-by-wire system to reconfigure into alternate law. The crew then failed to supply appropriate inputs, to follow the loss-of-airspeed-indication procedure, to respond to the flight-path deviation, and to respond properly to the stall, while both co-pilots attempted to control the aircraft simultaneously. Compounding all of this was the question of which instruments and alerts to trust (airspeed, stall warning, altitude). Something was lying, but what wasn't clear.
One more point on the AF 447 incident: the sidestick control interface provided no feedback for inputs supplied by the other pilot. One co-pilot was attempting to correct the aircraft's attitude but was frustrated by the other's continued erroneous inputs, and because the sticks are not mechanically linked, neither could feel the conflict. The flight control system simply averaged the two inputs.
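A toy sketch of why averaging conflicting inputs is so pernicious (purely illustrative; the function name, scale, and logic here are made up and bear no relation to Airbus's actual control laws, which include dual-input warnings and a priority-takeover button):

    def combined_pitch_command(stick_a, stick_b):
        # Toy model: each stick input is in [-1.0, +1.0]
        # (-1 = full nose-down, +1 = full nose-up).
        # Real fly-by-wire logic adds dual-input warnings, a takeover
        # button, and filtering; none of that is modelled here.
        return (stick_a + stick_b) / 2.0

    # One pilot commands full nose-down to break the stall, the other
    # holds full nose-up. The averaged command is zero: the aircraft
    # does neither, and without force feedback neither pilot can feel
    # that the other is opposing them.
    print(combined_pitch_command(-1.0, +1.0))   # 0.0
    print(combined_pitch_command(-1.0,  0.0))   # -0.5, a half-strength correction

The averaged output tells neither pilot that the other is fighting them; only an out-of-band signal (linked controls, a loud dual-input warning, a takeover mechanism) can surface the conflict.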