
Seems like both:

- AI labs will eat some of the wrappers built on top of their APIs, even complex ones like this. There are whole startups trying to build computer use.

- AI is following _some_ scaling law: the best models keep getting better, and previously state-of-the-art models cost a fraction of what they did a couple of years ago. Though it remains to be seen whether it's like Moore's Law or whether each incremental improvement gets harder and harder to make.




It seems a little silly to pretend there’s a scaling “law” without plotting any points or doing a projection. Without the mathiness, we could instead say that new models keep getting better and we don’t know how long that trend will continue.
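As a minimal sketch of doing exactly that, with made-up (compute, loss) points purely for illustration: a power law is a straight line in log-log space, so you can fit one with numpy and extrapolate.

    import numpy as np

    # Hypothetical (training compute in FLOPs, eval loss) points -- illustrative only.
    compute = np.array([1e20, 1e21, 1e22, 1e23])
    loss = np.array([3.1, 2.6, 2.2, 1.9])

    # A power law loss = a * compute**b is linear in log-log space,
    # so an ordinary least-squares fit on the logs recovers a and b.
    b, log_a = np.polyfit(np.log(compute), np.log(loss), deg=1)

    # Project the trend one order of magnitude out.
    projected = np.exp(log_a) * 1e24**b
    print(f"exponent b = {b:.3f}, projected loss at 1e24 FLOPs = {projected:.2f}")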


"Law" might not be the right word - but there's no denying it's scaling with compute/data/model size. I suppose law happens after continued evidence over years.


> It seems a little silly to pretend there’s a scaling “law” without plotting any points or doing a projection.

Isn't this Kaplan 2020 or Hoffmann 2022?
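For reference, the parametric form Hoffmann et al. fit is simple enough to write down. A sketch; the constants are the commonly quoted Chinchilla fits, from memory, so treat them as approximate:

    def chinchilla_loss(N: float, D: float) -> float:
        """Hoffmann et al. (2022) parametric loss: E + A/N**alpha + B/D**beta.

        N = model parameters, D = training tokens. Constants are the fitted
        values reported in the paper (approximate).
        """
        E, A, B = 1.69, 406.4, 410.7
        alpha, beta = 0.34, 0.28
        return E + A / N**alpha + B / D**beta

    # e.g. Chinchilla itself: ~70B params trained on ~1.4T tokens.
    print(chinchilla_loss(70e9, 1.4e12))  # roughly 1.9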


Yes, those are scaling laws, but they don't apply when vendors improve their models without increasing model size or training on more data. There are apparently other ways to improve performance, and we don't know the laws for those.

(Sometimes people track the learning curve for an industry in other ways, though.)
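One such alternative, as a sketch: fit an exponential decline to price over time instead of loss over compute. The prices and dates below are made up for illustration.

    import numpy as np

    # Hypothetical price per million tokens at yearly intervals -- made-up numbers.
    years = np.array([0.0, 1.0, 2.0, 3.0])
    price = np.array([60.0, 20.0, 6.0, 2.0])

    # Exponential decay price = p0 * exp(-k * t) is linear in log(price).
    neg_k, log_p0 = np.polyfit(years, np.log(price), deg=1)
    print(f"price halves roughly every {np.log(2) / -neg_k:.2f} years")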



