
>There are many ways to monetize a chatbot; OpenAI, for example, is raking in billions in subscription fees.

Compared to Google, OpenAI's billions are peanuts, and they cost a fortune to generate. GPT-4 doesn't seem profitable (if it were, would they need to throttle it?)




> GPT-4 doesn't seem profitable (if it were, would they need to throttle it?)

Maybe? Hardware supply isn't perfectly elastic, so throttling could just reflect a GPU shortage rather than unprofitability.


Wouldn't Google be better able to integrate ads into a "ChatGoogle" service than OpenAI is into ChatGPT?


The cost of serving each ad is still astronomically different between search and an LLM.


There could be an opposite avenue: ad-free Google Premium subscription with AI chat as a crown jewel. An ultimate opportunity to diversify from ad revenue.


There's not enough money in it at Google's scale.

Especially because the people who'd pay for Premium tend to be the most prized audience from an advertiser's perspective.

And most people won't pay under any circumstances, but they will click on ads, which makes Google money.


The low operating margin of serving a GPT-4-scale model sounds like a compelling explanation for why Google stayed out of it.

But then why did Microsoft put its money behind it? Alphabet's revenue is around $300bn and Microsoft's is around $210bn, which is lower but the same order of magnitude.


YouTube does it, at Google scale. And these same people do pay $20/mo for ChatGPT anyway.


YouTube isn't comparable: YouTube revenue is roughly $30B/year, while Search revenue is roughly $175B/year.

Advertisers are willing to pay far more than $20/mo per user, and on top of that, serving a search query costs way less than LLM inference.
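
For a rough sense of the asymmetry, here's a back-of-envelope sketch. Every figure below is an assumed, illustrative number (not a reported one); it only shows the shape of the argument, not actual economics:

    # Back-of-envelope per-query economics. All numbers are assumptions
    # chosen for illustration: ads monetize a search query far above what
    # it costs to serve, while an LLM response costs orders of magnitude
    # more to produce.

    ad_revenue_per_search = 0.05    # assumption: USD earned per search from ads
    cost_per_search = 0.002         # assumption: USD to serve one search
    cost_per_llm_response = 0.03    # assumption: USD for one GPT-4-class response

    search_margin = ad_revenue_per_search - cost_per_search
    llm_margin = ad_revenue_per_search - cost_per_llm_response  # if ads paid the same per LLM query

    print(f"search: ~{search_margin / ad_revenue_per_search:.0%} margin per query")
    print(f"LLM at same ad revenue: ~{llm_margin / ad_revenue_per_search:.0%} margin per query")
    print(f"cost ratio (LLM / search): ~{cost_per_llm_response / cost_per_search:.0f}x")

Under those assumed numbers the per-query cost gap is roughly an order of magnitude, which is the point being made about inference vs. search.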



