Before LLMs were mainstream, rationalists and EA types would come on Hacker News to convince people that worrying about how "weak" AI would be used was a waste of time, because the real problem was the risk of "strong" AI.
Those arguments looked incredibly weak and stupid when they were making them, and they look even stupider now.
And this isn't even their biggest error, which, in my opinion, was classifying AI as a bigger existential risk than climate change.
An entire generation of putatively intelligent people lost in their own nightmares, who, through their work, have given birth to chaos.
Weak AI is a problem, but it isn't going to lead to 100% human extinction.
Human extinction won't happen until a couple of years later, with stronger AI (if it happens at all, which I unfortunately think it will, if we remain on our current trajectory).
"This theoretical event that I just made up would lead to 100% human extinction"
Neat, go write science fiction.
Hundreds of billions of dollars are currently being lit on fire to deploy AI datacenters while there's an ecosystem destabilizing heat wave in the ocean. Climate change is a real, measurable, present threat to human civilization. "Strong AI" is something made up by a fan fiction author. Grow up.
It can't be true because it sounds like science fiction to you?
Everything about AI in 2025 sounds exactly like science fiction. We are essentially living in the world described in science fiction books at this very moment, even though I wish we weren't.
Have you ever used an AI chatbot? How is that not exactly like something you'd find in science fiction?
The idea of “Strong AI” as an “existential risk” is based entirely on thought experiments popularized by a small, insular, drug-soaked fan fiction community. I am begging you to touch grass.
From what I can see, Artificial General Intelligence is a drug-fueled millenarian cult, and attempts to define it that don't consider this angle will fail.
> We need to build and maintain vast AI infrastructure and the energy to power it. To do that, we will continue to reject radical climate dogma and bureaucratic red tape, as the Administration has done since Inauguration Day. Simply put, we need to “Build, Baby, Build!”
This is how you know these people are not serious:
> Prioritize the interconnection of reliable, dispatchable power sources as quickly as possible and embrace new energy generation sources at the technological frontier (e.g., enhanced geothermal, nuclear fission, and nuclear fusion). Reform power markets to align financial incentives with the goal of grid stability, ensuring that investment in power generation reflects the system’s needs.
None of these are "dispatchable power sources." Grid-scale batteries, for which technology and raw materials are abundant in the United States, are dispatchable power sources, and are, for some reason, not mentioned here.
What they will actually do is eviscerate regulations to allow for more construction of natural gas power plants, but they won't mention that here, because any sane person would immediately identify that as a terrible idea.
Additionally, the DOE has been pulling funds from interconnect projects that have been years in the works! Apparently there is a modest gas turbine shortage so even natural gas won’t get that far. I’d say it’s a great way to hit a hard wall fast but again, they are not serious. We’re gonna get nowhere fast, maybe even drift backwards a bit.
Nuclear fission is most often categorized as dispatchable, though it's typically among the slowest-responding sources in that group. Of everything out of this administration, a push for more fission power might be the thing I agree with most, so perhaps that's biasing my read.
Commercial nuclear fusion is just a dream at this point. We might as well debate whether my private island has enough room for an airplane runway or not instead. But hey, I'm not against continuing fusion research if that's all they mean.
EGS I'm far less familiar with, but it would be odd for the current admin to agree with the previous admin unless they had to (https://www.energy.gov/sites/default/files/2022-09/EERE-ES-E...), and it would, on the surface, make sense that one could design these systems to support flow-rate variability.
Grid scale batteries are power storage, not power sources. I do agree it's a damn shame they aren't brought up elsewhere in the report though. Same as anything else about renewables missing in tandem with that.
As you can figure out with math, climate change is solvable with tech advancement. Also, the US has pretty clean energy, and likely always will, because of fear of future administration changes.
One should be more worried about China or India polluting than the US.
Sorry to burst your bubble, but the US has higher greenhouse gas emissions per capita than both countries you mentioned. And in terms of total greenhouse gases, the US also emits more than India and is outpaced only by China.
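The per-capita point is easy to sanity-check. A minimal sketch, using approximate ~2022 CO2 figures from public datasets (exact numbers vary by source and year, and these cover CO2 only, not all greenhouse gases):

```python
# Back-of-envelope check of the per-capita claim. Emissions in
# gigatonnes of CO2 and populations in billions are rough ~2022
# figures; Gt divided by billions of people gives tonnes per person.
emissions_gt = {"China": 11.4, "USA": 5.0, "India": 2.8}
population_bn = {"China": 1.41, "USA": 0.33, "India": 1.42}

per_capita = {
    country: emissions_gt[country] / population_bn[country]  # t CO2/person
    for country in emissions_gt
}

for country, tonnes in sorted(per_capita.items(), key=lambda kv: -kv[1]):
    print(f"{country}: ~{tonnes:.1f} t CO2 per person")
```

With these rough inputs, the US lands around 15 tonnes per person, China around 8, and India around 2, matching the ordering claimed above.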
Billions will die from starvation and conflict in a world where we deploy trillions of dollars to increase electricity usage for AI data centers but nowhere near the same amount of capital to decarbonize electricity production, which we can already do with existing technology. This is the world we live in now.
I never really get the cryptocurrency comparison. AI has applications beyond grift. Even if development stopped now, an AI-powered "style hints" feature in the vein of spellcheck and grammar check would be a no-brainer addition to an office suite.
The valuations are totally and completely nuts. But LLMs have legitimate applications in a way that cryptocurrencies never will.
Also I’m going to go ahead and say that “it’s slightly better than classical NLP for grammar check but requires 10,000x as much compute resources” is not an improvement
Millions of people are using (and paying for) LLMs for their daily work. The number of people using crypto as an actual currency is a rounding error by comparison.
There's definitely similarities when it comes to the wave of hype and greed behind them both, but the fundamentals really are completely different.
I work at a company with hundreds of thousands of employees, and they're mandating the use of AI, monitoring it, and pushing it like crazy, like their lives depend on it. You get threatening emails if several days pass without you using AI.
Now tell me again what the usage numbers mean with respect to usefulness.
This is a huge red flag imo. Mandated usage and yet nothing to show for it.
To top it off, Sam Altman is a known dishonest actor (and has already shown his true colors at OpenAI). AI is here to stay and has some truly cool uses, but there are far too many snake oil salesmen involved currently.
I’ve tried to use AI for “real work” a handful of times and have mostly come away disappointed, unimpressed, or annoyed that I wasted my time.
Given the absolutely insane hard resource requirements for these systems that are kind of useful, sometimes, in very limited contexts, I don’t believe its adoption is inevitable.
Maybe one of the reasons for that is that I work in the energy industry and broadly in climate tech. I am painfully aware of how much we need to do with energy in the coming decades to avoid civilizational collapse, and how difficult all of that will be, without adding all of these AI data centers into the mix. Without several breakthroughs in one or more hard engineering disciplines, the mass adoption of AI is not currently physically possible.
That's how people probably felt about the first cars, the first laptops, the first <anything>.
People like you grumbled when their early car broke down in the middle of a dirt road in the boondocks and they had to eat grass and shoot rabbits until help arrived. "My horse wouldn't have broken down," they said.
We actually don’t know whether or not meaningful performance gains with LLMs are available using current approaches, and we do know that there are hard physical limits to electricity generation. Yes, technologies mature over time. The history of most AI approaches since the 60s is a big breakthrough followed by diminishing returns. I have not seen any credible argument that this time is different.
There is a weird combination of "this is literal magic and everybody should be using them for everything immediately and the bosses can fire half their workforce and replace them with LLMs" and "well obviously the early technology will be barely functional but in the future it'll be amazing" in this thread.
The first car and the first laptop were infinitely better than no car and no laptop. An LLM is like having a drunk junior developer; that's not an improvement at all.
We have been in the phase of diminishing returns for years with LLMs now. There is no more data to train them on. The hallucinations are baked in at a fundamental level and they have no ability to emulate "reasoning" past what's already in their training data. This is not a matter of opinion.
As long as energy production and consumption has severe downstream impacts, yes, we do need to wade into this territory.
All serious, viable plans for decarbonization include a massive increase in electricity consumption, due to electrification of transportation, industrial processes, etc, along with an increase in renewable energy production. This isn't new, but AI datacenters are a very large net new single user of electricity.
If the amount of money already poured into AI had gone into the rollout of clean energy infrastructure, we wouldn't even be having this conversation, but here we are.
It makes perfect sense from a policy perspective, given that there are a small number of players in this space with more resources than most governments, to piggyback on this wave of infrastructure buildout.
It also makes plenty of financial sense. Initial capex for adding clean energy generation is high, but given both the high electricity usage of AI datacenters, and the long-term impact on the grid that someone will eventually have to pay for, companies deploying AI infrastructure would be smart to use the huge amount of capital at their disposal to generate their own electricity.
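The self-generation argument can be put in rough numbers. A back-of-envelope levelized-cost sketch, with every input an illustrative assumption (roughly $1/W installed solar capex, ~25% capacity factor, 25-year life; financing, O&M, degradation, and storage all ignored):

```python
# Rough levelized cost of on-site solar serving a datacenter.
# All numbers are illustrative assumptions, not quotes.
capex_per_watt = 1.00      # USD per watt of installed capacity
capacity_factor = 0.25     # average output as a fraction of nameplate
lifetime_years = 25
hours_per_year = 8760

# Lifetime energy produced per installed watt, in kWh.
lifetime_kwh_per_watt = capacity_factor * hours_per_year * lifetime_years / 1000
lcoe = capex_per_watt / lifetime_kwh_per_watt  # USD per kWh

print(f"Lifetime output: ~{lifetime_kwh_per_watt:.0f} kWh per installed watt")
print(f"Levelized cost:  ~${lcoe:.3f}/kWh")
```

Even after layering real financing, O&M, and storage costs on top of the roughly two cents per kWh this yields, the gap versus typical industrial retail rates illustrates why self-generation can pencil out for loads this large.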
It's also, from a deployment standpoint, pretty straightforward — we're talking about massive, rectangular, warehouse-like buildings with flat roofs. We should have already mandated that all such buildings be covered in solar panels with on-site storage, at a minimum.
Sadly, we're already living with the long-term impact of the previous energy revolution, so we'd better get started now instead of waiting until we feel the impact of this next compute revolution.