
There are signs that China is no longer open-sourcing its SOTA models. Both Huawei and Qwen have launched flagship models (Qwen-Max, WAN 2.5) that have yet to be open-sourced.


Qwen's Max series has always been closed-weight; it's not a policy change like you're alluding to.

What exactly is Huawei's flagship series anyway? Their PanGu line is open-weight, but Huawei isn't really in the LLM business yet — their models exist mainly to signal that training and inference are possible on their hardware. No one actually uses those models.


Small counterpoint, but there are also two newer players putting out SOTA open-source models (Moonshot's Kimi and Zhipu's GLM), so we're still seeing roughly the same number of open models overall, just via newer entrants.



