Hard disagree. So far, every big, important model has been closed-source. Grok is sort of the only exception, and it's not even that big compared to the (already old) GPT-4.
I don't see open source being able to compete with the cutting-edge proprietary models. There's just not enough money. GPT-5 is estimated to cost $1.2 billion to train, and MS and OpenAI are already talking about building a $100 billion training data center.
How can you compete with that if your plan is to give the resulting model away for free?