
Correct me if I'm wrong, but if the Chinese can produce the same quality at a 99% discount, then the supposed $500B investment is actually worth $5B. Isn't that the kind of wrong investment that can break nations?

Edit: Just to clarify, I don't imply that this is public money to be spent. It will commission $500B worth of human and material resources for 5 years that could be much more productive if used for something else - e.g. a high speed rail network instead of a machine that the Chinese built for $5B.




The $500B is just an aspirational figure they hope to spend on data centers to run AI models, such as o1 and its successors, that have already been developed.

If you want to compare the DeepSeek-R1 development costs to anything, you should be comparing them to what it cost OpenAI to develop o1 (not what they plan to spend to run it), but both numbers are somewhat irrelevant since they both build upon prior research.

Perhaps what's more relevant is that DeepSeek are not only open sourcing DeepSeek-R1, but have described in a fair bit of detail how they trained it, and how it's possible to use data generated by such a model to fine-tune a much smaller model (without needing RL) to much improve its "reasoning" performance.

This is all raising the bar on the performance you can get for free, or run locally, which reduces what companies like OpenAI can charge for it.


Thinking of the $500B as only an aspirational number is wrong. It’s true that the specific Stargate investment isn’t fully invested yet, but that’s hardly the only money being spent on AI development.

The existing hyperscalers have already sunk ungodly amounts of money into literally hundreds of new data centers, millions of GPUs to fill them, chip manufacturing facilities, and even power plants, on the expectation that, due to the amount of compute required to train and run these models, there would be enough demand for these things to pay for that investment. Literally hundreds of billions of dollars spent already on hardware that's already half (or fully) built, and isn't easily repurposed.

If all of the expected demand for that stuff completely falls through because it turns out the same model training can be done on a fraction of the compute power, we could be looking at a massive bubble pop.


If the hardware can be used more efficiently to do even more work, the value of the hardware will hold since demand will not reduce but actually increase much faster than supply.

Efficiency going up tends to increase demand by much more than the efficiency-induced supply increase.
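
A toy illustration of when that holds, as a sketch only: the elasticity value below is an assumed number for the sake of the example, not a measured figure for AI compute.

    # Toy Jevons-style calculation: if work gets 10x cheaper per unit and demand is
    # price-elastic (elasticity > 1), total compute consumed goes up, not down.
    price_drop = 10.0   # each unit of work is now 10x cheaper
    elasticity = 1.5    # assumed price elasticity of demand for AI compute (illustrative)

    demand_multiplier = price_drop ** elasticity       # ~31.6x more units of work demanded
    spend_multiplier = demand_multiplier / price_drop  # ~3.2x more total spend on compute

    print(f"Units of work demanded: ~{demand_multiplier:.1f}x")
    print(f"Total compute spend:    ~{spend_multiplier:.1f}x")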

That assumes the world is hungry for as much AI as it can get, which I think is true; we're nowhere near the peak of leveraging AI. We've barely gotten started.


Perhaps, but this is not guaranteed. For example, demand might shift from datacenter to on-site inference when high-performing models can run locally on consumer hardware. Kind of like how demand for desktop PCs went down in the 2010s as mobile phones, laptops, and iPads became more capable, even though desktops also kept getting more capable. People found that running apps on their phone was good enough. Now perhaps everyone will want to run inference on-site for security and privacy, and so demand might shift away from big datacenters toward desktops and consumer-grade hardware, and those datacenters will be left bidding each other down looking for workloads.


Inference is not where the majority of this CAPEX is used. And even if it were, monetization will no doubt discourage developers from dispensing the secret sauce to user-controlled devices. So I posit that data-centre inference is safe for a good while.


> Inference is not where the majority of this CAPEX is used

That's what's baffling about DeepSeek's results: they spent very little on training (at least that's what they claim). If true, then it's a complete paradigm shift.

And even if it's false, the more widespread AI usage becomes, the bigger the share of inference will be, and inference cost will be the main cost driver at some point anyway.


You are looking at one model, and you do realize it isn't even multimodal. Also, it shifts training compute to inference compute. They are shifting the paradigm for this architecture for LLMs, but I don't think this is really new either.


> it shifts training compute to inference compute

No, this is the change introduced by o1; what's different with R1 is that its use of RL is fundamentally different (and cheaper) than what OpenAI did.


>Efficiency going up tends to increase demand by much more than the efficiency-induced supply increase.

https://en.wikipedia.org/wiki/Jevons_paradox


The mainframe market disagrees.


Like the cloud compute we all use right now to serve most of what you use online?


Which runs thanks to PC parts; that's the point. IBM is nowhere close to Amazon or Azure in terms of cloud, and I suspect most of their customers run on x86_64 anyway.


Microsoft and OpenAI seem to be going through a slow-motion divorce, so OpenAI may well end up using whatever data centers they are building for training as well as inference, but $500B (or even $100B) is so far beyond the cost of current training clusters that it seems this number is more a reflection of what they are hoping the demand will be - how much they will need to spend on inference capacity.


I agree except on the "isn't easily repurposed" part. Nvidia's chips have CUDA and can be repurposed for many HPC projects once the AI bubble is done. Meteorology, encoding, and especially any kind of high-compute research.


None of those things are going to result in a monetary return on investment, though, which is the problem. These big companies are betting a huge amount of their capital on the prospect of being able to make significant profit off of these investments, and meteorology etc. isn't going to do it.


Yes, it's going to benefit all the other areas of research like medical and meteorology, which I'm happy with.


> Literally hundreds of billions of dollars spent already on hardware that's already half (or fully) built, and isn't easily repurposed.

It's just data centers full of devices optimized for fast linear algebra, right? These are extremely repurposable.


For mining dogecoin, right?


Nobody else is doing arithmetic in fp16 though.


What is the rationale for "isn't easily repurposed"?

The hardware can train LLMs but also be used for vision, digital twins, signal detection, autonomous agents, etc.

Military uses seem important too.

Can the large GPU-based data centers not be repurposed for that?


> If you want to compare the DeepSeek-R1 development costs to anything, you should be comparing them to what it cost OpenAI to develop o1 (not what they plan to spend to run it)

They aren't comparing the $500B investment to the cost of DeepSeek-R1 (allegedly around $5 million); they are comparing the cost of R1 to that of o1 and extrapolating from there (we don't know exactly how much OpenAI spent to train o1, but estimates put it around $100M, in which case DeepSeek would have been only about 95% cheaper, not 99%).
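
A quick back-of-the-envelope version of that comparison, using the rough figures quoted in this thread (both are estimates, not confirmed numbers):

    # Rough cost comparison using the figures quoted in this thread; both are estimates.
    o1_training_cost = 100e6   # ~$100M, estimated cost for OpenAI to train o1
    r1_training_cost = 5e6     # ~$5M, DeepSeek's claimed cost to train R1

    savings = 1 - r1_training_cost / o1_training_cost
    print(f"R1 vs o1 training cost: ~{savings:.0%} cheaper")   # ~95%, not 99%

    stargate = 500e9           # the $500B Stargate figure (mostly inference infrastructure)
    print(f"Stargate vs o1 training: ~{stargate / o1_training_cost:,.0f}x larger")  # apples to oranges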


Actually it means we will potentially get 100x the economic value out of those datacenters. If we get a million digital PhD researchers for the investment, then that's a lot better than 10,000.


$500 billion is $500 billion.

If new technology means we can get more for a dollar spent, then $500 billion gets more, not less.


That's right, but the money is given to the people who do it for $500B, and there are much better ones who can do it for $5B instead; if they end up getting $6B, they will have a better model. What now?


I don't know how to answer this because these are arbitrary numbers.

The money has not been spent yet. DeepSeek published their methodology; incumbents can pivot and build on it. No one knows what the optimal path is, but we know it will cost more.

I can assure you that OpenAI won't continue to produce inferior models at 100x the cost.


What concerns me is that someone came out of the blue with just as good a result at orders of magnitude less cost.

What happens if that money is actually being spent, and some people keep catching up but don't reveal that they are doing it for cheap? You think it's a competition, but what's actually happening is that you bleed out your resources; at some point you can't continue, but they can.

Like the Star Wars project that bankrupted the Soviets.


> Like the Star Wars project that bankrupted the Soviets.

Wasn't that a G.W. Bush Jr. thing?


A timeline where the lesser Bush faced off against the Soviet Union would be interesting. But no, it was a Reagan thing.


Also, it apparently didn't actually bankrupt the Soviets, though it may have helped a little: https://www.reddit.com/r/AskHistorians/comments/8cnm73/did_r...


Ty. I had this vague memory of some "Star Wars laser" failing to shoot down a rocket during Bush Jr.'s term. I might be remembering it wrong. I can't find anything to support my notion either.


I think there was a brief revival in ballistic missile defense interest under the W presidency, but what people refer to as "Star Wars" was the Reagan-era initiative.


The $500B wasn't given to the founders, investors, and execs to do it better. It was given to them to enrich the tech exec and investor class. That's why it was that expensive - because of the middlemen who take enormous gobs of cash for themselves as profit and make everything more expensive. Precisely the same reason why everything in the US is more expensive.

Then the Open Source world came out of left field and b*tch-slapped all those head honchos, and now it's like this.


Are you under the impression it was some kind of fixed-scope contractor bid for a fixed price?


No, it's just that those people intend to commission a huge number of people to build an obscene number of GPUs and put them together in an attempt to create an unproven machine, when others appear to be able to do it at a fraction of the cost.


The software is abstracted from the hardware.


Which means?


The more you spend on arXiv, the more you save on the GPUs Jensen told you you would save more on if you were to spend more on GPUs.


Not sure where to start.

- The hardware purchased for this initiative can be used for multiple architectures and new models. If DeepSeek means models are 100x as powerful, they will benefit

- Abstraction means one layer is protected from direct dependency on implementation details of another layer

- It’s normal to raise an investment fund without knowing how the top layers will play out

Hope that helps? If you can be more specific about your confusion I can be more specific in answering.


If you say, "I wanna build 5 nuclear reactors and I need $200 billion," I would believe it, because you can ballpark it with some stats.

For tech like LLMs, it feels irresponsible to announce a $500 billion investment and then pour that into R&D. What if in 2026 we realize we can create it for $2 billion, and the other $498 billion is left sitting idle?


I bet the Chinese can build 5 nuclear reactors for a fraction of that price, too. DeepSeek says China builds them at $2.5-3.5B per 1,200 MW reactor.


Don’t think of it as “spend a fixed amount to get a fixed outcome”. Think of it as “spend a fixed amount and see how far you can get”

It may still be flawed or misguided or whatever, but it’s not THAT bad.


It seems to mostly be hardware.


> Isn't that the kind of wrong investment that can break nations?

It's such a weird question. You made it sound like 1) the $500B is already spent and wasted, and 2) the infrastructure can't be repurposed.


OpenAI will no doubt be copying DeepSeek's ideas also.

That compute can go to many things.


The $500B isn't to retrain a model with the same performance as R1, but something better, and don't forget inference. Those servers are not just serving/training LLMs; they're training next-gen video/voice/niche-subject models and their equivalents in bio/mil/mech/materials, and serving them to hundreds of millions of people too. Most people saying "lol they did all this for $5M when they are spending $500B" just don't see anything beyond the next 2 months.


When we move to continuously running agents, rather than query-response models, we're going to need a lot more compute.


> e.g. a high speed rail network instead

You want to invest $500B in a high speed rail network which the Chinese could build for $50B?


My understanding is that the problems with high speed rail in the US are more fundamental than money.

The problem is loose vs strong property rights.

We don't have the political will in the US to use eminent domain like we did to build the interstates. High speed rail ultimately needs a straight path, but if you can't make the property acquisitions to build that straight path, then this is all a non-starter in the US.


Just commission the Chinese and make it 10X bigger then. In the case of AI, they appear to be commissioning Sam Altman and Larry Ellison.


The US has tried to commission Japan for that before. Japan gave up because we wouldn't do anything they asked and went to Morocco.


It was France:

https://www.businessinsider.com/french-california-high-speed...

Doubly delicious since the French have a long and not very nice colonial history in North Africa, sowing long-lasting suspicion and grudges, and still found it easier to operate there.


It doesn't matter who you "commission" to do the actual work, most of the additional cost is in legal battles over rights of way and environmental impacts and other things that are independent of the construction work.


The Chinese government would be cutting spending on AI according to your logic, but they are doing the opposite, and I bet they'd love to get those B200s.


$500 billion could move the whole country to renewable energy.


Not even close. The US spends roughly $2 trillion/year on energy. If you assume a 10% return on solar, that's $20 trillion of solar needed to move the country to renewables. And that doesn't count the cost of batteries, which will probably be another $20 trillion.

Edit: asked DeepSeek about it. I was kinda spot on =)

Cost breakdown:

- Solar panels: $13.4–20.1 trillion (13,400 GW × $1–1.5B/GW)
- Battery storage: $16–24 trillion (80 TWh × $200–300/kWh)
- Grid/transmission: $1–2 trillion
- Land, installation, misc.: $1–3 trillion
- Total: $30–50 trillion
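
For anyone who wants to check the arithmetic behind that breakdown, a minimal sketch; all of the capacity and price inputs are the rough assumptions quoted above, not authoritative figures:

    # Back-of-the-envelope check of the solar + storage line items above; all inputs are rough assumptions.
    solar_gw = 13_400                  # assumed solar build-out, in GW
    solar_cost_per_gw = (1e9, 1.5e9)   # $1-1.5B per GW, i.e. $1-1.5 per watt
    storage_twh = 80                   # assumed battery storage, in TWh
    storage_cost_per_kwh = (200, 300)  # $/kWh

    solar = [solar_gw * c / 1e12 for c in solar_cost_per_gw]                # in $ trillions
    storage = [storage_twh * 1e9 * c / 1e12 for c in storage_cost_per_kwh]  # 1 TWh = 1e9 kWh

    print(f"Solar panels:    ${solar[0]:.1f}-{solar[1]:.1f} trillion")      # 13.4-20.1
    print(f"Battery storage: ${storage[0]:.0f}-{storage[1]:.0f} trillion")  # 16-24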


Targeted spending of $500 billion (per year, maybe?) should give enough automation to reduce panel cost to ~$100M/GW, which works out to about $1,340 billion for 13,400 GW. Skip the batteries and let other modes of energy generation/storage take care of the augmentation, since we are investing in the grid anyway. Possible with innovation.


The common estimates for a total switch to net-zero are 100-200% of GDP, which for the US is $27-54 trillion.

The most common idea is to spend 3-5% of GDP per year on the transition ($750-1,250 billion per year for the US) over the next 30 years. Certainly a significant sum, but also not too much to shoulder.
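
A quick check of how those figures hang together, assuming a US GDP of roughly $27 trillion as implied by the first sentence (the per-year range quoted above corresponds to a slightly lower GDP figure):

    # Sanity check of the net-zero spending figures; the GDP value is an assumption (~$27T).
    gdp = 27e12
    total = (1.0 * gdp, 2.0 * gdp)     # 100-200% of GDP
    annual = (0.03 * gdp, 0.05 * gdp)  # 3-5% of GDP per year
    years = 30

    print(f"Total estimate: ${total[0]/1e12:.0f}-{total[1]/1e12:.0f} trillion")   # 27-54
    print(f"Per year:       ${annual[0]/1e9:.0f}-{annual[1]/1e9:.0f} billion")    # 810-1350
    print(f"Over {years} years: ${annual[0]*years/1e12:.1f}-{annual[1]*years/1e12:.1f} trillion")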


It’s also cheaper than dealing with the exponentially increasing costs of climate adaptation.


Really? How? That's very interesting


Sigh, I don't understand why they had to do the $500 billion announcement with the president. So many people now wrongly think Trump just gave OpenAI $500 billion of the taxpayers' money.


It means he'll knock down regulatory barriers and mess with competitors because his brand is associated with it. It was a smart political move by OpenAI.


Until the regime is toppled, then it will look very short-sighted and stupid.


Nah, then OpenAI gets to play the “IDK why he took credit, there’s no public money and he did nothing” card.

It’s smart on their part.


That would be an obvious lie, since they set up in front of cameras in the actual White House to publicly discuss it.


I don't say that at all. Money spent on BS still sucks up resources, no matter who spends that money. They are not going to make the GPUs out of $500 billion in banknotes; they will pay people $500B to work on this stuff, which means those people won't be working on other stuff that could actually produce value worth more than the $500B.

I guess the power plants are salvageable.


By that logic all money is waste. The money isn't destroyed when it is spent; it is merely transferred into someone else's bank account. This process repeats recursively until taxation returns all the money back to the treasury to be spent again. And out of this process of money shuffling: entire nations full of power plants!


Money is just IOUs; it means that, for some reason not specified on the banknote, you are owed services. If in a society a small group of people are owed all the services, they can indeed commission all those people.

If your rich spend all their money on building pyramids, you end up with pyramids instead of something else. They could have chosen to build irrigation systems and have a productive output that makes the whole society more prosperous. Either way the workers get their money; on the pyramid option, though, their money ends up buying much less food.


Money can be destroyed with inflation.


DeepSeek didn't train the model on sheets of paper; there are still infrastructure costs.


Which are reportedly over 90% lower.


Trump just pulled a stunt with Saudi Arabia. He first tried to "convince" them to reduce the oil price to hurt Russia. In the following negotiations the oil price was no longer mentioned, but MBS promised to invest $600 billion in the U.S. over 4 years:

https://fortune.com/2025/01/23/saudi-crown-prince-mbs-trump-...

Since the Stargate Initiative is a private sector deal, this may have been a perfect shakedown of Saudi Arabia. SA has always been irrationally attracted to "AI", so perhaps it was easy. I mean that part of the $600 billion will go to "AI".


MBS does need to pay lip service to the US, but he's better off investing in Eurasia IMO, and/or in SA itself. US assets are incredibly overpriced right now. I'm sure he understands this, so lip service will be paid, dances with sabers will be conducted, US diplomats will be pacified, but in the end SA will act in its own interests.


One only needs to look as far back as the first Trump administration to see that Trump only cares about the announcement and doesn’t care about what’s actually done.

And if you don't want to look that far, just look up what his #1 donor Musk said… there is no actual $500B.


Yeah - Musk claims SoftBank "only" has $10B available for this atm.

There was an amusing interview with MSFT CEO Satya Nadella at Davos where he was asked about this, and his response was "I don't know, but I know I'm good for my $80B [that I'm investing to expand Azure]".


And with the $495B left you could probably end world hunger and cure cancer. But like the rest of the economy it's going straight to fueling tech bubbles so the ultra-wealthy can get wealthier.


Those are not just-throw-money problems. Usually these tropes are limited to Instagram comments. Surprised to see it here.


I know, it was simply to show the absurdity of committing $500B to marginally improving next token predictors.


True. I think there is some posturing involved in the $500B number as well.

Either that or it's an excuse for everyone involved to inflate the prices.

Hopefully the datacenters are useful for other stuff as well. But I also saw an FT report that it's going to be exclusive to OpenAI?

Also, as I understand it, these types of deals are usually all done with speculative assets. And many think the current AI investments are a bubble waiting to pop.

So it will still remain true that if Jack falls down and breaks his crown, Jill will be tumbling after.


I'm not disagreeing, but perhaps during the execution of that project, something far more valuable than next token predictors is discovered. The cost of not discovering that may be far greater, particularly if one's adversaries discover it first.


Maybe? But it still feels very wrong seeing this much money evaporating (literally, by Joule heating) in the name of a highly hypothetical outcome. Also, to be fair, I don't feel very aligned with tech billionaires anymore, and would rather someone else discovered AGI.


It's almost as if the people with the money and power know something about "next token predictors" that you don't.


Do you really still believe they have superior intellect? Did Zuckerberg know something you didn't when he poured $10B into the metaverse? What about Crypto, NFTs, Quantum?


They certainly have a more valid point of view than, "Meh, these things are just next-token predictors that regurgitate their training data. Nothing to see here."


Yes, their point is to inflate the AI bubble some more so they can extract more wealth before it's over.


Not as much as the Chinese, apparently.


They clearly missed out on the fact that they could've trained their $5M model for much less.


Think of it like a bet. Or even think of it as a bomb.


There are some theories from my side:

1. Stargate is just another strategic deception like Star Wars. It aims to mislead China into diverting vast resources into an unattainable, low-return arms race, thereby hindering its ability to focus on other critical areas.

2. We must keep producing more and more GPUs. We must eat GPUs at breakfast, lunch, and dinner — otherwise, the bubble will burst, and the consequences will be unbearable.

3. Maybe it's just a good time to let the bubble burst. That's why Wall Street media only noticed DeepSeek-R1 but not V3/V2, and why the media ignored the LLM price war that has been raging in China throughout 2024.

If you dig into the 10-Ks of MSFT and NVDA, it's very likely the AI industry already had overcapacity even before Stargate. So I think #3 is the most likely.

Just some nonsense — don't take my words seriously.


No nation state will actually divert money without feasibility studies. There are applications, but you are very likely misfiring. If every device everyone owns has continuously running agents, we will see the many applications as time passes.


> Stargate is just another strategic deception like Star Wars

Well, this is a private initiative, not a government one, so it seems not; and anyway, trying to bankrupt China, whose GDP is about the same as that of the USA, doesn't seem very achievable. The USSR was a much smaller economy, and less technologically advanced.

OpenAI appear to genuinely believe that there is going to be a massive market for what they have built, and with the Microsoft relationship cooling off are trying to line up new partners to bankroll the endeavor. It's really more "data center capacity expansion as has become usual" than some new strategic initiative. The hyperscalers are all investing heavily, and OpenAI are now having to do so themselves as well. The splashy Trump photo-op and announcement (for something they already started under Biden) is more about OpenAI manipulating the US government than manipulating China! They have got Trump to tear up Biden's AI safety order, and will no doubt have his help in removing all regulatory obstacles to building new data centers and the accompanying power station builds.


> Americans excel at 0-to-1 technical innovation, while Chinese excel at 1-to-10 application innovation.

I was thinking the same thing... how much of that investment is grift?

1: https://www.chinatalk.media/p/deepseek-ceo-interview-with-ch...



