
Part of the reason their API is so cheap is that they explicitly state they are going to train on your API data. OpenAI and Anthropic say they won't if you use their API (if you use ChatGPT, that's a different story). There are no free lunches.



This comment is misleading. There is a "free lunch" here in the sense that serving this model at scale is far cheaper than serving worse open-source models.

Yes, they are probably more willing to go down in price because of the data terms, but the architecture is open, and they are charging similarly to a 30B-50B dense model, which is roughly the number of active params DeepSeek-V3 has.
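Back-of-the-envelope version of that claim, as a minimal sketch: per-token inference compute tracks *active* parameters, not total, using the standard ~2 FLOPs-per-parameter-per-token estimate. The 37B active-param figure is from the DeepSeek-V3 report; the 40B dense baseline is a hypothetical comparison point.

```python
def flops_per_token(active_params: float) -> float:
    # Standard rough estimate: ~2 FLOPs per active parameter per token
    # of autoregressive inference.
    return 2 * active_params

moe_active = 37e9    # DeepSeek-V3: ~37B params active per token (of 671B total)
dense_params = 40e9  # hypothetical 40B dense model for comparison

moe_cost = flops_per_token(moe_active)
dense_cost = flops_per_token(dense_params)

print(f"MoE per-token FLOPs:   {moe_cost:.2e}")
print(f"Dense per-token FLOPs: {dense_cost:.2e}")
print(f"Ratio (MoE/dense):     {moe_cost / dense_cost:.2f}")
```

On this crude measure the MoE model costs about what a similarly sized dense model would per token, despite having an order of magnitude more total parameters, which is why pricing it like a 30B-50B dense model is plausible on compute grounds alone.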


So then OP is correct? Your comment confirms the same sentiment about the tradeoff API users make: cheaper inference means you pay with your data.

Sure, DeepSeek may publish their weights so you don't have to use the API, but the point still stands for the API.


It's a matter of degree. If 90% of the cost savings come from a new, smarter architecture, it doesn't make sense to point to the API terms as the reason it's so cheap.



