This is very interesting to me because a plant this old might be cheaper to operate than a new plant, but might be like the space shuttle in that replacement parts aren’t readily available and thus expensive to custom manufacture.
If you were to step into the control room you’d see analog phones, tiny incandescent bulbs behind plastic covers… looks like a sci-fi set from the 60s.
The expensive part of a reactor isn’t really the reactor or tech itself, it’s the government regulation from the DOE and NRC.
I worked at Areva/Framatome/B&W and IIRC they still have the archival room where hundreds of 4 inch D ring binders held the original design docs that had to be submitted for approval.
Not to disagree with the bulk of the comment, but this sentence is not true. They're 90V neon indicator lamps, a technology that's really cool but also so inefficient that people rip it out and replace it.
> The debt facility is being made through the Department of Energy’s Loan Programs Office (LPO), which was formed under the Energy Policy Act of 2005 to foster the growth of clean energy technologies.
> The Inflation Reduction Act, which passed during the Biden administration, created another pot of money under the LPO known as the Energy Infrastructure Reinvestment program. That program was created to restore existing power plants to operation provided they avoid or reduce pollutants or greenhouse gas emissions. The Trump administration kept it largely intact, rebranding it the Energy Dominance Financing Program.
Congress passed the Energy Policy Act of 2005 and then the Inflation Reduction Act, allocating money to the DoE to make these loans.
> The debt facility is being made through the Department of Energy’s Loan Programs Office (LPO), which was formed under the Energy Policy Act of 2005 to foster the growth of clean energy technologies
and, more importantly:
> The Inflation Reduction Act, which passed during the Biden administration, created another pot of money under the LPO known as the Energy Infrastructure Reinvestment program. That program was created to restore existing power plants to operation provided they avoid or reduce pollutants or greenhouse gas emissions. The Trump administration kept it largely intact, rebranding it the Energy Dominance Financing Program.
>That’s cheaper than a brand-new nuclear power plant would cost, but it’s a hefty premium over wind, solar, and geothermal, according to a comparison of energy costs from Lazard.
Nuclear is more expensive because there are extensive regulations. "Green" energy not only does not face so many regulations but it benefits from incentives.
Also, when comparing nuclear with "green" energy, most studies don't take into account the costs of energy storage.
Studies also don't take into account that by subsidizing solar you're subsidizing an industry that's already subsidized by the Chinese government. It's cool in the short term, but we're making the same mistake we've been making for the last 50 years.
Home grown nuclear programs will always be better than solar propped up by foreign entities.
The smart thing to do is take advantage of China's subsidies and import every solar panel they have on offer. It's like they're handing out free money.
They stop selling? No problem, the ones you have will work for the next 2 decades. You'll be lucky to build one new nuclear plant in that much time.
75%+ of all batteries/panels/windmill blades are made in China. And if you decide to make them locally they'll be 5-10x more expensive and much less competitive because you don't have:
- super low wages and borderline slave labor
- easy and cheap access to rare earths
- the CCP boosting your industries to flood the world
Frankly, the reason China is the last man standing on solar was their aggressive subsidy in the 2010s. Killed all of the American and European manufacturers, then the subsidy ended and they were the last man standing.
The other things you said are also true, I just wanted to provide a little historical context.
Chinese subsidies were smaller than European and American subsidies on a relative basis. (but not absolute). The difference was that European and American subsidies also subsidized Chinese panels.
Depends what your goals are. We're sitting in a fabulous position now. Solar is by far the cheapest energy available, which is accelerating, and will continue to accelerate, our transition away from fossil fuels. China is essentially giving the panels away for no profit, and supporting very few jobs doing so. America and Europe are getting huge benefits for those subsidies.
But what proportion of the cost of a solar panel is actually made up of rare earth minerals or labour? My understanding was that the cost of installation dominated the cost of manufacture for both wind and solar.
And, sure, plenty of rare earths are needed both for the drivetrains in wind turbines, and for the power electronics used by solar farms. But they'll also be needed for the steam turbines and power electronics in nuclear plants. Seems like it's pretty much a wash to me.
I don't have a dog in this fight, but it seems deeply weird for America to be refusing to even try to meet the challenge.
Reading comprehension must not be your forte, or you're arguing in bad faith. American nuclear reactors aren't built with Chinese tech or by Chinese engineers as far as I can tell.
If you give up your sovereignty on topics like defence, energy or agriculture, don't come crying in 20 years when you're someone else's bitch. Ask Germans how it's going with the cheap Russian gas lmao
More critically: just because a company is worth $1 trillion doesn't mean it has anywhere close to $1 billion in cash at all, let alone able to be earmarked to a given project (at the presumed exclusion of other projects).
Granted, a project like this probably doesn't strictly need all $1 billion all at once, but I'd argue it's better to get whatever necessary funding upfront instead of risking having sunk a partial investment without being able to obtain the rest should the company's financial situation change.
Microsoft had over $94 billion in cash and cash equivalents as of June 2025.
My assumption is that the real reason this is a loan from the government and not paid directly by Microsoft has to do with other factors, like Microsoft not wanting to be on the hook for the billion dollars if the partner company folds, or the potential for loan forgiveness, or other incentives that make the loan effectively cheaper than cash.
Maybe someone can elaborate on this, since I know basically nothing about chemistry or nuclear physics; isn't Three Mile Island still completely irradiated and unsafe for humans to inhabit?
Unit 2 is the reactor that melted down and it has been shut down ever since (and partially decommissioned). Unit 1, a separate reactor at the same site, was operated normally until 2019 when it was shut down due to high costs. It was originally scheduled to be decommissioned by 2079 (sic) but is now being brought back online.
Microsoft committed to purchase the plant's capacity for 20 years. And US electricity demand grew very slowly from 2005 to 2020. It is growing rapidly now.
At around $110/MWh, according to the article. This is about 50% higher than what utility-scale PV or wind would cost. Guess they're using OpenAI accounting.
Are you comparing cost against what electricity currently costs or what it would cost to add capacity? I feel like Microsoft is not acting on hype here, they're going to pay a premium just because it's cool to refire a nuclear plant? Surely they've done the math to decide the feasibility of building out a few acres of solar panels.
There could also be incentives beyond the loan or political pressure we’re not privy to. Such pressure is part of the reason Boeing ended up acquiring McDonnell Douglas even though it wasn’t exactly the best financial move for Boeing. If the US government is serious about restarting its nuclear industry then this is a small first step to build up the skills for building new reactors or refurbishing old ones.
It’s not really that farfetched, either. If the government expects a conflict in the next few decades, solar build out might become much more expensive or impossible since our domestic production might not be enough to support NATO’s growth.
The electricity cost is actually very low compared to the capital cost of the stuff that the electricity runs. But not having access to the electricity means that all that capital is going to waste.
So Microsoft is less price sensitive than other electricity customers.
Plus they get the PR and hype boost from saying they are using nuclear, which is huge right now. It's big enough that the other hyperscalers thought they had to announce new nuclear projects, even though it will be a decade before those new nuclear projects could ever come online.
> Running a data center on unreliable energy would be shockingly stupid.
For the right kind of workloads and at sufficient scale, I wonder if this is actually true. (It probably is, but it's fun to hypothesize.) I'm assuming the workloads are mostly AI-related.
AI training presumably isn't super time-sensitive, so could you just pause it while it's cloudy?
AI inference, at least for language models, presumably isn't particularly network-intensive nor latency-sensitive (it's just text). So if one region is currently cloudy... spin it down and transfer load to a different region, where it's sunny? It's kind of like the "wide area grid" concept without actually needing to run power lines.
Yes, I know that in reality the capex of building and equipping a whole DC means you'll want to run it 24/7, but it is fun to think about ways you could take advantage of low cost energy. Maybe in a world where hardware somehow got way cheaper but energy usage remained high we'd see strategies like this get used.
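The "spin it down and move load where it's sunny" idea can be sketched as a toy greedy router; all region names, prices, and spare capacities below are invented for illustration:

```python
# Toy sketch of "follow the sun" load shifting: route deferrable inference
# traffic to whichever region currently has the cheapest power, capped by
# each region's spare capacity. All names and numbers are made up.
def route_load(total_load, regions):
    """regions: list of (name, price_per_mwh, spare_capacity_mw).
    Greedily fills the cheapest regions first; returns {name: mw}."""
    plan = {}
    remaining = total_load
    for name, price, spare in sorted(regions, key=lambda r: r[1]):
        take = min(remaining, spare)
        if take > 0:
            plan[name] = take
        remaining -= take
        if remaining <= 0:
            break
    return plan

regions = [
    ("us-west (sunny)", 25, 300),
    ("us-east (cloudy)", 90, 500),
    ("eu-north (windy)", 40, 200),
]
print(route_load(600, regions))
```

A real system would also weigh data-transfer cost and latency, but the greedy shape is the core of the "wide area grid without power lines" concept.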
> So if one region is currently cloudy... spin it down and transfer load to a different region, where it's sunny? It's kind of like the "wide area grid" concept without actually needing to run power lines.
> Yes, I know that in reality the capex of building and equipping a whole DC means you'll want to run it 24/7, but it is fun to think about ways you could take advantage of low cost energy.
There's some balance between maximizing capex utilization, business continuity planning, room for growth, and the natural peaks and troughs throughout the day.
You probably don't really want all your DCs maxed out at the daily peak. Then you have no spare capacity for when you've lost N DCs on your biggest day of the year. N might typically be one, but if you have many DCs, you probably want to plan for two or three down.
Anyway, so on a normal day, when all your DCs are running, you do likely have some flexibility on where tasks run/where traffic lands. It makes sense to move traffic where it costs less to serve, within some reasonable bounds of service degradation. Even if electricity prices are the same, you might move traffic where the ambient temperature is lower, as that would reduce energy used for cooling and with it the energy bill.
You might have some non-interactive, non-time sensitive background jobs that could fill up spare DC capacity... but maybe it's worth putting a dollar amount on those --- if it's sunny and windy and energy is cheap, go ahead ... when it's cloudy and still and energy is expensive, some jobs may need to be descheduled.
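The "put a dollar amount on those jobs" idea might look like this sketch, where each deferrable job carries the highest spot price at which it's still worth running; the jobs and thresholds are illustrative only:

```python
# Sketch of pricing deferrable background work: each job declares the
# highest energy price at which it is still worth running; when the spot
# price rises past that, the job is descheduled. Illustrative values.
def schedule(jobs, spot_price_per_mwh):
    """jobs: list of (name, max_price_worth_paying). Returns names to run."""
    return [name for name, max_price in jobs
            if spot_price_per_mwh <= max_price]

jobs = [
    ("user-facing serving", float("inf")),  # never descheduled
    ("index rebuild", 120),
    ("ML batch training", 60),
    ("log compaction", 30),
]

print(schedule(jobs, 25))    # sunny and windy: everything runs
print(schedule(jobs, 100))   # expensive hour: only high-value work runs
```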
> AI training presumably isn't super time-sensitive, so could you just pause it while it's cloudy?
Or pause it when "organic traffic" has peak demand, and resume in off-peak hours, so that the nuclear power plant can operate efficiently without too much change in its output.
A machine that operates continuously is a perfect machine, and no machine is perfect.
The greater the number and diversity of machines, as well as their geographical dispersion, the greater their availability.
In this respect, a mix of renewables (solar, wind, geothermal, biomass, etc.) deployed on a continental scale, along with storage (batteries and V2(G|H), hydro, green hydrogen...) is unbeatable (total cost, availability, risk, etc.).
I imagine data centers make the best economic sense when they can run full tilt 24/7. You’ll double your payoff time if you can only run work when the sun shines.
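The payoff-time arithmetic above can be sketched directly; the capex and revenue figures here are made up purely for illustration:

```python
def payback_years(capex, revenue_per_hour, utilization):
    """Years to recoup capex at a given fraction of 24/7 operation."""
    hours_per_year = 8760
    return capex / (revenue_per_hour * utilization * hours_per_year)

capex = 1_000_000_000   # hypothetical $1B data-center build-out
revenue = 20_000        # hypothetical $/hour when running full tilt

always_on = payback_years(capex, revenue, 1.0)   # run 24/7
solar_only = payback_years(capex, revenue, 0.5)  # run only ~half the hours

# Halving utilization doubles payback, since payback scales as 1/utilization.
print(round(always_on, 1), round(solar_only, 1))
```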
Do you have a source for that? When I googled it, the figure came up closer to $200/MWh for New York, but that was from older sources. The only thing I saw approaching this price point was if you were somewhere like Las Vegas.
I also think you would need more than 24 hours of battery. You have to prepare for freak weather events that reduce system capacity.
I also wonder what time horizon we are talking about. Solar and batteries presumably have to be replaced more often than nuclear.
The article cited a report which said new build solar and storage could cost from $50 to $131. And new build wind and storage could cost $44 to $123.[1]
Civilian nuclear reactors replace fuel gradually over 3 to 6 years typically. 20 year old solar panels work now. New solar panels are expected to work over 30 years. Utility scale lithium ion batteries are expected to last 10 to 15 years.
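Given the lifetimes cited above, a rough replacement count is easy to sketch; the 60-year horizon here is an assumption, roughly an extended nuclear operating license:

```python
import math

# Rough count of equipment replacements over a long horizon, using the
# lifetimes cited in the comment above. The 60-year horizon is assumed.
def replacements(horizon_years, lifetime_years):
    """How many times the original install must be replaced."""
    return math.ceil(horizon_years / lifetime_years) - 1

horizon = 60
print(replacements(horizon, 30))   # solar panels at ~30-year life
print(replacements(horizon, 12))   # grid batteries at ~10-15-year life
```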
> I also think you would need more than 24 hours battery. You have to prepare for freak weather events that reduce system capacity.
In general, yes. Not really in the context of utility generation for a DC, though. A DC should have onsite backup generation, at least to supply critical loads. If your contracted utility PV + storage runs out, and there's no spare grid capacity available (or it's too expensive) you can switch to onsite power for the duration. The capex for backup power is already required, so you're just looking at additional spending for fuel, maybe maintenance if the situation requires enough hours on backup.
The article said Jefferies analysts estimated Microsoft might pay $110 to $115.
The article cited Lazard analysts' estimates to say this was more expensive than solar or wind. But Lazard's report said new build solar and storage could cost from $50 to $131. And new build wind and storage could cost $44 to $123.
Costs will rise over 20 years almost certainly.
And Microsoft had already large solar and wind power purchase agreements.[1] These could be affected by China's rare earth export controls scheduled to start next year. Hedging this position would be sensible.
You ignore the fact that these datacenters also operate at night and in windless times.
PV did get spectacularly cheaper, but is not a panacea.
Nuclear is a great fit for constant load, for example a cloud datacenter where relatively constant utilization is also a business goal and multiple incentives are in place to promote it (e.g. spot pricing to move part of the load off of peaks).
Nuclear power is reliable 24/7 while wind and solar are not and handling this costs money. Microsoft has said that they have more GPUs than electricity to run them so even at $110/MWh it makes sense for them.
I don't know where this '24/7' stuff comes from; they have maintenance outages like anything else. Refueling takes months every couple years, so you're going to have to "handle this" even with nuclear.
"they have maintenance outages like anything else"
not often and most importantly they are PREDICTABLE. You do understand why being able to control when a power plant is operating is a very important thing, right?
I thought the conversation was regarding utilization of capital, in which case 80% is 80%; predictability doesn't change the fact you have to let GPUs sit idle 20% of the time.
I guess if I knew there would be two months with less power I might design my data center to fit into 40 foot containers so I could deploy wherever power and latency are cheapest
The point here being that every single datacenter that's running off nuclear also has a natural gas pipeline running to it or else a massive propane infrastructure because nuclear alone can't get the job done. If your 'clean energy' solution requires a gas pipe, you're misrepresenting its ability to drive the datacenter.
That is just plain incompetence. These are recent US nuclear capacity factors:
2023: 93.0%
2022: 92.1%
2021: 92.7%
2020: 92.5%
Nuclear has the highest capacity factor of any energy source, producing reliable and secure power more than 92% of the time in 2024. That's nearly twice as high as coal (42.36%) or natural gas (59.9%) plants, which are used more flexibly to meet changing grid demands, and almost 3 times higher than wind (34.3%) and solar (23.4%) plants.
Nuclear power plants had an 8% share of total U.S. generation capacity in 2023 but actually produced 18% of the country's electricity due to their high capacity factor.
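Those two figures are mutually consistent; a quick back-of-envelope check using the numbers from the comment above:

```python
# Consistency check: if nuclear is 8% of US capacity at ~92% capacity
# factor and supplies 18% of generation, the implied fleet-wide average
# capacity factor follows from:
#   generation_share = capacity_share * cf_nuclear / cf_fleet_average
nuclear_capacity_share = 0.08
nuclear_cf = 0.92
nuclear_generation_share = 0.18

implied_fleet_cf = nuclear_capacity_share * nuclear_cf / nuclear_generation_share
print(f"{implied_fleet_cf:.0%}")  # ~41%, plausible given the coal/gas/wind/solar factors cited
```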
As for France's capacity factor, that has a lot to do with the presence of intermittents on the continental grid, combined with the EU's Renewable Energy Directive making France liable to pay fines if they use nuclear power in preference to wind/solar.
The Wikipedia page makes it seem like it's been largely cleaned up for decades:
> In 1988, the NRC announced that, although it was possible to further decontaminate the Unit 2 site, the remaining radioactivity had been sufficiently contained as to pose no threat to public health and safety.
There's not much to go off of on this subject in the US - only two successful reactor projects have been started since the 70s: Units 3 and 4 at Vogtle in Georgia*. They cost $15 billion each and bankrupted the remnants of Westinghouse (in combination with a similar project in South Carolina which was never finished).
*Many reactors started construction in the 70s and were finished in the 80s or 90s, plus Watts Bar Unit 2 which was started in 1972 and finished in 2016 for a total of $5 billion. The US also of course builds many naval reactors.
> There's not much to go off of on this subject in the US
The main problem is that things cost more per unit if you do them less. The first new reactor in decades is going to be stupid expensive because you have new people doing it who are learning things for the first time, which often means doing them over again, which is expensive. And then we didn't even get to see if the second unit at Vogtle could improve on the first because then COVID hit and made everything cost even more.
Whereas the interesting question is, how much do they each cost if you build them at scale?
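One common way to reason about "cost at scale" is a learning-curve (Wright's law) sketch; the 10% learning rate and $15B first-unit cost below are assumptions for illustration, not measured reactor data:

```python
import math

def unit_cost(first_unit_cost, n, learning_rate=0.10):
    """Wright's-law sketch: each doubling of cumulative units built cuts
    unit cost by `learning_rate`. The 10% rate is an assumption, not a
    measured figure for reactors."""
    b = math.log2(1 - learning_rate)   # progress exponent, negative
    return first_unit_cost * n ** b

# If a first-of-a-kind reactor costs $15B (roughly a Vogtle unit),
# print the modeled cost (in $B) of the nth unit built:
for n in (1, 2, 10, 50):
    print(n, round(unit_cost(15e9, n) / 1e9, 1))
```

The interesting empirical question is what learning rate reactor construction could actually achieve under a stable regulatory regime.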
The obvious problem with studies like these is that they come from a period in history when the attitude towards regulatory costs was that the cost/benefit of any particular rule should be ignored and the total number of rules should only increase over time. That's especially so when the study is from France, where these attitudes are even more entrenched. On top of that, the program is centrally planned -- not the thing known for bringing efficiency -- and higher efficiency is regarded as a cost because you're losing union jobs or lucrative contracts, while the one paying is the taxpayer or the ratepayer, who has less influence over the government than the unions or construction companies.
Whereas what you want is multiple companies building these things on an assembly line, where you can plop the components out of a truck already approved and ready to be turned on.
> Whereas what you want is multiple companies building these things on an assembly line, where you can plop the components out of a truck already approved and ready to be turned on.
How many trillions in handouts of taxpayer money to get there?
Renewables and storage deliver that dream unsubsidized today.
> The study also shows the same pattern in the US.
The modern US where a negligible number of reactors are being built, or the US in the period of the regulatory changes that accompanied the Cold War when everyone was afraid of nuclear bombs and then The China Syndrome and Chernobyl?
If you want it to mean anything you need data from when the regulatory environment isn't increasing in hostility over time during the measurement period.
> How many trillions in handouts from tax payer money to get there?
It's a loan at an interest rate higher than the government itself is paying to issue the bonds, i.e. the taxpayer is making money from this.
Also, the main issue isn't funding it, it's making it cost less, e.g. by moving most of the approvals from being needed at each and every site to the factory where components are being mass produced.
> Renewables and storage deliver that dream unsubsidized today.
Then why does the price per kWh that consumers are paying keep going up instead of down?
I love how it's always someone else's fault that nuclear power never delivered on its promise, despite handout after handout trying to get the industry to work.
> It's a loan at an interest rate higher than the government itself is paying to issue the bonds, i.e. the taxpayer is making money from this.
This assumes the risk is zero. Which given e.g. Virgil C. Summer is not the case.
> Then why does the price per kWh that consumers are paying keep going up instead of down?
Depends on where you are in the world. In Europe most of the recent price increases are coming from fossil energy becoming expensive.
The ETS system is making coal power expensive and running a peaker on LNG is extremely expensive.
Cost varies with the site conditions. It's one of the many things that push nuclear construction costs up; every build needs to take into consideration the geographic nature of the site (bedrock levels, etc) and so every location requires customizations to the design.
With that said, while it doesn't provide numbers, the article does say the refurbishment (costing $1.6 billion, estimated) will be cheaper than a new build. It'll also likely be much faster, projected to open in 2028.
A quick google search puts construction costs of new nuclear at Unit 2's size in the $5-10 billion range. Three Mile Island itself was constructed for $2 billion in 2024 inflation-adjusted dollars. All in all, refurbishing sounds like a good bargain compared to a greenfield build.
Beat for beat, it would probably be cheaper to get the Chinese to do it. Cost-per-watt charts [1] show US builds still high while they are decreasing for the rest.
Apart from the obvious labour cost difference, there's also the skills at scale. The Chinese have been on a continuous buildout of new plants, so at this point they have designs and skilled teams for whom this is routine (I think 30+ are under construction concurrently). US builds are almost artisanal at this point.
And yeah, at $1B, given prior examples, I expect them to be late and costs to balloon. Unless they use this as a template to upskill/retrain a workforce that will lead a new buildout, so economies of scale take over and put downward pressure on the costs.
Most of the cost is not in building -- it is in planning, wrestling with capricious government agencies, and rebuilding when said agencies come up with "improvements".
100%. We have this thing where every 10 years or so we collectively admit that if we had started nuclear 10 years ago it would be fine, but we need it now, so it should remain illegal because it's too expensive. It's some of the most insane doublethink in human politics.
And we are swimming in Uranium and water for cooling, and we are tectonically stable.
And every single argument is a weird either or scenario. Like some people want Gas. JUST Gas. Some people want Solar and Wind. JUST Solar and Wind.
Nuclear power can also fill batteries. It can also fill pumped hydro storage. Ditto gas. Nuke and gas are good for restarting a grid when there's a catastrophe; see Spain.
Give engineers more tools, not less. It's infuriating.
Doubly so when you consider their enormous reserves of bauxite, iron ore, and met coal.
They could have set up green steel and aluminum industries supplying the world. Instead they ship millions of tons of unprocessed ore and thermal coal to East Asia, where it's processed with CO2-intensive energy, and then the metals are shipped back in the form of automobiles and construction materials.