> So I wonder what changed?

Maybe cost & latency for both training and inference are getting too high. If costs doubled for every 5% of additional performance, would it be worth it? NVIDIA is making a small fortune from this.
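To make that concrete, here is a back-of-envelope sketch (all figures are made up for illustration, not actual training costs): if each 5% quality gain costs 2x, spend grows exponentially while the benchmark gain stays roughly linear.

```python
# Hypothetical numbers only: cost doubles per step, quality improves 5% per step.
base_cost = 100e6      # assumed baseline training cost in dollars
base_quality = 1.0     # arbitrary baseline benchmark score

for step in range(1, 6):
    cost = base_cost * 2 ** step              # doubling every step
    quality = base_quality * 1.05 ** step     # +5% every step
    print(f"step {step}: cost ${cost / 1e6:,.0f}M, quality {quality:.2f}x baseline")
```

After a handful of steps the cost is tens of billions for roughly a 25-30% quality gain, which is the scaling trade-off being questioned.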

> But Google could easily 10x the training cost of GPT-4 if they thought it would protect their search business

Google makes on the order of $0.0X per search query. If the inference cost of the model exceeds that, they go from making money to losing money. It is not clear whether the Bing integration is a money maker or a loss leader.
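A toy break-even check makes the point (every number below is an assumption, not Google's or Microsoft's actual figures): per-query ad revenue only covers LLM inference if the cost per generated answer stays below it.

```python
# Hypothetical per-query economics for an LLM-backed search result.
revenue_per_query = 0.03     # assumed ad revenue per search, in dollars
tokens_per_answer = 500      # assumed tokens generated per answer
cost_per_1k_tokens = 0.06    # assumed inference cost per 1,000 tokens, in dollars

inference_cost = tokens_per_answer / 1000 * cost_per_1k_tokens
margin = revenue_per_query - inference_cost
print(f"inference cost/query: ${inference_cost:.3f}, margin: ${margin:.3f}")
print("profitable" if margin > 0 else "losing money on every query")
```

With these particular assumptions the inference cost alone would wipe out the per-query revenue, which is why serving cost, not just model quality, decides whether the integration pays for itself.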



