Hacker News

Hard disagree. So far, every big, important model is closed-source. Grok is sort of the only exception, and even it isn't that big compared to the (already old) GPT-4.

I don't see open source being able to compete with cutting-edge proprietary models. There's just not enough money. GPT-5 will take an estimated $1.2 billion to train, and MS and OpenAI are already talking about building a $100 billion training data center.

How can you compete with that if your plan is to give away the training result for free?




Where is the $1.2b number from?


There are a few numbers floating around; $1.2B is the lowest estimate.

HSBC estimates the training cost for GPT-5 at between $1.7B and $2.5B.

Vlad Bastion Research estimates $1.25B–$2.25B.

Some people on HN estimate $10B:

https://news.ycombinator.com/item?id=39860293



