Maybe cost & latency for both training and inference are getting too high. If costs doubled for every 5% better performance, would it be worth it? NVIDIA is making a small fortune from this.
> But Google could easily 10x the training cost of GPT-4 if they thought it would protect their search business
Google makes only a few cents per search query. If the inference cost of the model exceeds that, they go from making money to losing money. It is not clear whether the Bing integration is a money maker or a loss leader.
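As a back-of-envelope sketch of that per-query math (every number below is a made-up placeholder, not an actual Google or OpenAI figure), the margin is just ad revenue per query minus inference cost per query, multiplied out over query volume:

```python
# Hypothetical per-query economics; all figures are illustrative placeholders.
revenue_per_query = 0.03          # assumed ad revenue per search, a few cents
inference_cost_per_query = 0.01   # assumed LLM inference cost per query

margin = revenue_per_query - inference_cost_per_query
print(f"margin per query: ${margin:.3f}")  # positive = money maker, negative = loss leader

# Even a tiny per-query loss compounds at search scale.
queries_per_day = 8.5e9           # assumed daily query volume, order-of-magnitude guess
print(f"daily margin: ${margin * queries_per_day:,.0f}")
```

The point is only that the sign of that margin flips as soon as inference cost crosses per-query revenue, which is why the training/inference cost curve matters so much here.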