Except it hasn't. Solar irradiance has steadily increased over billions of years, and fluctuates regularly with the solar cycle. I'm not saying that this couldn't have unexpected consequences, but you can't start from the premise that solar irradiance has always been constant.
From the perspective of users of solar energy on Earth, wouldn't an increase spread over billions of years appear constant? My intuition tells me that evolution is much faster than that, and that any fluctuations due to solar-cycle dynamics would already have been accommodated by the same process, if the cycle is indeed so regular.
What I was getting at is that you can't start with a false claim and expect to make a convincing argument. I didn't see what the expected reduction is in the MIT research, but if it falls within the known effects of the solar cycle, Milankovitch cycles, and recorded volcanic activity, it's less risky than the claim that it's a change the Earth has never seen makes it sound.
Not my area of expertise. Can you please share your reasoning as to why reflecting sunlight is disastrous? Also, please share evidence that attempting multiple strategies is guaranteed to fail, or is even probabilistically worse than relying only on current efforts to decrease CO2 emissions.
Simple: reflecting sunlight does nothing to remove CO2 from the atmosphere, or to reduce the amount going into it. The CO2 is the problem, not the temperature. The temperature is a measure of the CO2 problem. Force the temperature, and it ceases to be an accurate measure.
As CO2 continues to build up, ocean pH decreases, reflecting increasing acidification. As pH decreases, the base of the ocean food chain begins to collapse. When the ocean food chain collapses, the main protein source for much of humanity vanishes. Global war follows, and civilization collapses. A slightly lower temperature goes unnoticed.
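For a rough sense of scale (a back-of-the-envelope sketch in Python, not a claim from this thread): pH is a log scale, so the widely cited drop in mean surface-ocean pH from about 8.2 (preindustrial) to about 8.1 today already corresponds to roughly a 26% rise in hydrogen ion concentration.

    # pH = -log10([H+]), so a 0.1-unit drop in pH multiplies [H+] by 10**0.1.
    # The ~8.2 -> ~8.1 figures are commonly cited estimates, used here
    # only for illustration of the log scale.
    ratio = 10 ** (8.2 - 8.1)
    print(ratio)  # ~1.26, i.e. about a 26% increase in acidity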
So all the experts who talk about the 2 degree Celsius goal set it on the wrong measure? Please be more convincing than just restating your previous hypothesis.
As far as I understand the situation, the increased average temperature is the problem. Not because every day would be exactly n degrees hotter, but because it leads to far more variance in the atmosphere, e.g. storms and extreme hot and cold weather. The CO2 itself might also cause problems, but they do not dominate the situation.
Failing to control CO2 leads inexorably to global collapse of civilization, regardless of temperature.
Civilization would also collapse as a consequence of extreme temperature.
Temperature increase is easier to limit, but redirecting resources to limiting temperature accelerates the CO2 increase, and thus the collapse that follows from it.
Directing resources to reducing CO2 also limits temperature rise.
Each dollar directed to intervention A is a dollar not directed to intervention B.
Extreme fever can kill the patient. Plunging the patient into ice water cuts the fever, but fails to save the patient. Antibiotics may take longer to reduce the fever, but offer the possibility of saving the patient.
But betting everything on one horse, company, or intervention is rarely a good strategy. I think Nassim Nicholas Taleb makes compelling arguments in his books, starting with The Black Swan.
It also changes with snow cover. More snow means more reflection, meaning cooler temperatures. A reason the Earth spends such long periods in ice ages.
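To put rough numbers on that, here is a minimal sketch using the textbook zero-dimensional energy-balance model. It omits the greenhouse effect entirely, so the absolute temperatures come out unrealistically low; the point is only how sensitive the equilibrium temperature is to albedo.

    # Zero-dimensional energy balance: absorbed sunlight S*(1 - albedo)/4
    # equals emitted blackbody radiation sigma*T^4 at equilibrium.
    SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W m^-2 K^-4
    S = 1361.0       # solar constant, W m^-2

    def t_eq(albedo):
        return (S * (1 - albedo) / (4 * SIGMA)) ** 0.25

    print(t_eq(0.30))  # ~254.6 K, roughly Earth's present mean albedo
    print(t_eq(0.33))  # ~251.8 K: a few points more snow/ice, ~3 K cooler

A roughly 3 K swing from a modest albedo change is why snow and ice cover feed back so strongly on climate.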
We are still coming out of the last one.
The net incoming solar flux hasn't changed. What you are talking about is increased albedo in specific areas. This proposal would lower the absorbed solar flux globally. How will that affect agriculture? Plant life? Ocean life in twilight zones?
This is radically dangerous, and is only being proposed because we refuse to let extractive industries die or change our lifestyles even slightly.
> "This is radically dangerous, and is only being proposed because we refuse let extractive industries die or change our lifestyles even slightly."
So much exactly this… I hear people talk so much about how intelligent and adaptive humans are, and how we're sure to survive almost anything because of that adaptability and intelligence. But then the time comes to change something small to actually adapt to a big-deal situation (like climate change, for one example among many), and nearly all of humanity bands together to fight against even the tiniest change in how we do things, because apparently "the way it's always been done" is by far the best (even when it's provably wrong or bad). I truly hate humanity at this point because of this (among many other quite valid reasons I won't go into here). The Universe will be a better place when we're all gone.
That seems on par with claiming global warming is no big deal because the Earth has been warmer or cooler at some point in the past. Those billions of years are irrelevant.
Volcanic winters, the natural analogue of a nuclear winter, occur somewhat regularly in Earth's history, usually after a very large volcanic eruption but sometimes after a large asteroid strike. Unless I'm mistaken, the most recent such event followed the Pinatubo eruption in 1991, which caused a small but significant global cooling of about 0.5°C between 1991 and 1993.
Volcanoes of the same scale erupt around every 50 to 100 years, but larger ones with more severe global cooling effects happen every 1,000 years or so. Every 50,000 years or so we can expect a mega-colossal supervolcano. The Youngest Toba eruption, over 70 kya, caused a volcanic winter lasting over 5 years, with an accompanying cooling that possibly lasted another 1,000 years.
Even though these events are natural and happen regularly, they are usually devastating for life on Earth, often with several species going extinct as a result. There are theories that the Toba eruption nearly wiped out the human species and created a "bottleneck" in our evolution.
So the evidence suggests that a rapid dimming event can range from insignificant to catastrophic for life on Earth. There is certainly reason for caution here.