Google reported an estimated 2025 AI CapEx of around $85 billion. I don't know how much of that goes to inference vs. training (or to shared infrastructure), and Google is quite proud of running a large fraction of its workloads on its own chips. Most of the data on where exactly the money goes is not public.
In any event, one can make some generalizations about the companies involved. Nvidia makes excellent hardware that everyone wants, and charges markups large enough that its margins on data-center parts are reportedly as high as 90%. AMD is chasing the big buyers to sell its products. Google spends a lot and is a mature company; it seems uninterested in selling chips that compete with Nvidia's, but it certainly cares about revenue and profit. OpenAI, Anthropic, and the like (and, perhaps oddly, Meta) don't seem to care much about profit, but they spend enough money that getting more bang for their buck would clearly help them. Alibaba and other Chinese companies buy whatever Nvidia gear they can get, but they have a strong incentive to find a domestic supplier, and Huawei seems quite interested in becoming that supplier. And there are plenty of US startups (Cerebras and others) going after the inference market.