
Creating AI models has proven to be simply easier than past innovations. The barrier to entry is much lower, and the knowledge spreads pervasively within months of each breakthrough.

People seem to take offense at this idea, but the proof is in the pudding. Every week there's a new company with a new model coming out. What good did Google's "AI Talent" do for them when OpenAI leapfrogged them with only a few hundred people?

It's difficult to achieve high margins when the barrier to entry is low. These AI companies are going to be deflationary for society rather than the high-margin cash cows the SaaS wave produced.



It's easier for large, rich companies with infrastructure and datasets. It's very hard for small startups to build useful real-world models from scratch, so you see most people building on top of Stable Diffusion and APIs. But that limits what you can build; for example, it's very hard to build realistic photo editing on top of Stable Diffusion.


Most of the cutting edge models are coming from companies with a few dozen to a few hundred people. Stability AI is one example.

Training an AI model, while expensive, is vastly cheaper than most large-scale products.

This wave will be nothing like the SaaS wave: hyper-competitive rather than weakly competitive and margin-preserving.


I wrote it from the perspective of a small startup (<10 people, bootstrapped or with small funding). I think it's far cheaper and easier to build a nice, competitive mobile app/SaaS than to build a really useful model.

But yes I agree, it will be very competitive with much smaller margins.


Someone was able to replicate GPT-3.5 for about $500. Training models is getting very cheap.

[1] https://newatlas.com/technology/stanford-alpaca-cheap-gpt/
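The gap between a fine-tuning run like this and pretraining from scratch is easy to see with back-of-envelope arithmetic. A minimal sketch; all figures below (GPU counts, hours, $/GPU-hour) are illustrative assumptions, not the actual Alpaca or GPT-3.5 budgets:

```python
# Back-of-envelope: fine-tuning vs. pretraining cloud cost.
# All numbers are illustrative assumptions, not measured figures.

def gpu_cost(num_gpus, hours, dollars_per_gpu_hour):
    """Total cloud cost for a training run."""
    return num_gpus * hours * dollars_per_gpu_hour

# Assumed fine-tuning run: 8 GPUs for ~3 hours at ~$2/GPU-hour.
finetune = gpu_cost(num_gpus=8, hours=3, dollars_per_gpu_hour=2.0)

# Assumed pretraining run: 2048 GPUs for ~21 days (504 hours).
pretrain = gpu_cost(num_gpus=2048, hours=504, dollars_per_gpu_hour=2.0)

print(f"fine-tune: ${finetune:,.0f}")   # fine-tune: $48
print(f"pretrain:  ${pretrain:,.0f}")   # pretrain:  $2,064,384
print(f"ratio: {pretrain / finetune:,.0f}x")
```

Even with generous error bars on every assumed number, the runs differ by orders of magnitude, which is the point being argued here: the expensive part happened once, upstream.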


I've tried it; sure, it's good, but not even close to the real thing. It is getting cheaper, though, through better hardware, better data, and better architectures. It also builds on Facebook's models, which were trained for months on thousands of A100 GPUs.


By fine-tuning a leaked model trained with a lot more money than that. If somebody leaks the GPT-3.5 model, can I say that I replicated it for $0?
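Part of why fine-tuning a leaked base model is so cheap is that parameter-efficient methods such as LoRA (used in community reproductions like alpaca-lora) freeze the pretrained weights and train only a low-rank update. A minimal NumPy sketch of the idea, with assumed sizes (hidden dimension 4096, rank 8):

```python
import numpy as np

# LoRA idea: instead of updating a full weight matrix W (d x d),
# learn a low-rank update B @ A with A (r x d) and B (d x r), r << d.
rng = np.random.default_rng(0)

d, r = 4096, 8                     # hidden size and LoRA rank (assumed values)
W = rng.standard_normal((d, d))    # frozen pretrained weight
A = rng.standard_normal((r, d))    # trainable
B = np.zeros((d, r))               # trainable, zero-init so W' == W at start

W_adapted = W + B @ A              # effective weight after fine-tuning

full_params = W.size
lora_params = A.size + B.size
print(f"trainable params: {lora_params:,} vs {full_params:,} "
      f"({100 * lora_params / full_params:.2f}%)")
# trainable params: 65,536 vs 16,777,216 (0.39%)
```

Training well under 1% of the parameters is what puts runs like this within a few-hundred-dollar budget, but the other 99%+ of the capability still came from the expensive pretraining run being fine-tuned.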


That's not really true though. 4 months on and no one else is close to matching the original ChatGPT.

It's too early to say how hard this is, for all we know no one but OpenAI will match it before 2024.




