Hacker News

Releasing the model has paid off handsomely in name recognition, and it made a significant geopolitical and cultural statement.

But will they keep releasing the weights or do an OpenAI and come up with a reason they can't release them anymore?

At the end of the day, even if they release the weights, they probably want to make money and leverage the brand by hosting the model API and the consumer mobile app.



If they continue to release the weights plus detailed reports of what they did, I seriously don't understand why. I mean, it's cool. I just don't understand why. It's such a cutthroat environment where every little bit of moat counts. I don't think they're naive. I think I'm naive.


If you’re not appearing, you’re disappearing.

Now they are firmly on the map, which presumably helps with hiring, doing deals, and influence. If they stop publishing, they run the risk of being labelled a one-hit wonder that got lucky.

If they have a reason to believe they can do even better in the near future, releasing current tech might make sense.


I think those are valid points, but it's hard for me to see that this is worth it. With the might of the CCP at their back and the giant labor pool that is China, surely they can make hiring work either way. If they start offering a model that's cheaper and better than anyone else's, surely everyone will take notice, even if the weights are not open.


If moving faster is a moat, then open-source AI could move faster than closed AI by not needing to be paranoid about secrecy and by welcoming external contributions.


But if an open model ever pulls ahead then the closed vendors can immediately piggyback on that.


I don't think any of these companies are aiming at the long-term goal of making money from charging customers for inference.


> I don't think any of these companies are aiming at the long-term goal of making money from charging customers for inference.

What is DeepSeek aiming for if not that? It's currently the only thing they offer that costs money. They claim their own inference endpoints have a cost-profit margin of 545%, which may or may not be true, but the very fact that they mentioned it at all seems to indicate it is of some importance to them and others.
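For what it's worth, here is the arithmetic behind a "cost profit margin" of 545% (profit relative to cost, not to revenue). The numbers below are purely illustrative assumptions, not DeepSeek's actual figures:

```python
# Illustrative sketch: what a 545% cost-profit margin implies.
# "cost" is a made-up number for some billing period; only the ratio matters.
cost = 100.0       # hypothetical inference cost (arbitrary units)
margin = 5.45      # 545% margin, defined here as profit / cost

revenue = cost * (1 + margin)   # revenue needed to achieve that margin
profit = revenue - cost

# Sanity check: profit really is 545% of cost.
assert abs(profit / cost - margin) < 1e-9
```

So a 545% margin on cost means charging roughly 6.45x what inference costs to run; stated against revenue instead, the same business would show a margin of about 84.5%.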


Well, it's certainly helpful in the interim that they can recoup some money from inference. I'm just saying that more intelligent future systems could be used to make money in much better ways than charging customers to interact with them. For instance, such a system could conduct research on projects that generate massive revenue if successful.



