
The drip-feeding seems crazy to me. OpenAI is undermining its reputation by forcing almost everybody to use the older, lower-quality models. Even if customers are willing to pay for GPT-4, they're being told to wait at the back of the line.

Wait for what!? Christmas? When we can open our presents and find GPT-4 inside?

It's like they took a leaf from Google's "how to guarantee the failure of a new product" marketing. That is: restrict access, ensuring that word-of-mouth marketing can't possibly work because none of your friends are allowed to try the product.

The announcement here is "general availability" of the GPT-4 model...

...but not the 32K context model. Not the multi-modal version with image input. No fine-tuning. Only one model (chat).

As of today, I can only access GPT-3.5 via the Azure OpenAI Service and my OpenAI API account.

What's the point of all these arbitrary restrictions on who can access what model!?

I can use GPT-4 via ChatGPT, but not the API. I can use an enhanced version of DALL-E via Bing Image Creator, but not the OpenAI API. Some vendors blessed by the Great and Benevolent Sam Altman have access to GPT-4 32K; the rest of us don't.

Sell the product, not the access to it.

Don't be like the Soviet Union, where you had to "know someone" to get access.




I think maybe you don't understand that they don't have enough GPUs to do this, and money can't buy enough GPUs to do it.


This is the bottleneck. EUV Photolithography is one of the hardest engineering challenges ever faced, it's like trying to drop a feather from space and guaranteeing it lands on a specific blade of grass. Manufacturing these GPUs at all requires us to stretch the limit of what is physically possible in multiple domains, much less producing them at scale.


Thanks for this explanation! :) (as someone without knowledge of the hardware process I appreciated it).

It is SO amazing that we have such a driving force (LLMs/consumer AI) for this (instead of stupid cryptocurrency mining or high-performance gaming). This should drive innovation pretty strongly, and I am sure the next "leap" in this regard (processing hardware) will put technology on a completely different level.


That's a cool last-minute detour from the techno gods. We can incentivize AI to work on crypto mining, and get our fully engaged primeval lives back ;)


Not disagreeing, just curious: why can't money buy enough GPUs? OpenAI's prices seem low enough that they could reasonably charge 2x or more to companies eager to get on the best models now.


They're giving people access to GPT-4 via Bing for free, but apparently can't accommodate paying API users!?

That makes no sense.

What makes much more sense -- especially if you listen to his interviews -- is that Sam Altman doesn't think you can be trusted with the power of GPT-4 via an API unless it has first been aligned to death.


Microsoft is giving that for free but I assume they're paying OpenAI for it.

And having such a big anchor tenant, it's reasonable that you would prioritize them if GPUs are in short supply.


> Microsoft is giving that for free but I assume they're paying OpenAI for it.

Yeah, but Microsoft already gets 75% of the profits OpenAI makes, it's not the same price for them as the rest of us.


It’s exactly the same. If they could make 75 cents selling the compute to someone else for $1 versus not making it by providing the Bing chat service, that is 75 cents they lose.


Why do you assume that the same amount of computing power would be used by someone else? There are only so many customers. You can't magically start selling more compute if you stop using it yourself.


At scale, GPUs are capacity-constrained right now, so if Microsoft stopped using them, their capacity would be absorbed by others.


$10 billion.


Bing GPT-4 is a much smaller and less capable model than regular GPT-4.


"Free". The worst four-letter F word in America.


I think GPUs are in short supply and Nvidia can't make enough to keep up with demand.


To a first approximation, the increased share price of NVIDIA is because AI developers including OpenAI bought as many as NVIDIA can make.


This may be true but isn’t their official stance that their models are too powerful and could destroy Western civilization as we know it?


They simply want control over the rollout of their product and how it is used. That, and perhaps opening the flood gates would produce scaling bottlenecks they’d rather stay ahead of than get behind.

So they open things carefully, pull back when necessary like when they limited use of the public GPT-4 version of ChatGPT. That doesn’t seem too unreasonable. And yes sure, some amount of it might be attempts to manufacture scarcity to increase the hype. It’s an old tactic and hardly comparable to Soviet Russia.


There are no scaling issues to speak of. These AIs are stateless, which makes them embarrassingly parallel. They can always just throw more GPUs at it. Microsoft even had some videos where they bragged about how these models can be run on any idle GPU around the world, dynamically finding resources wherever they are available!
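A minimal sketch of what "stateless, therefore embarrassingly parallel" means in practice. The `handle_request` function here is a hypothetical stand-in for a model call; the point is only that independent requests need no coordination, so throughput scales with worker (or GPU) count:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-in for a stateless model call. A real serving stack
# would dispatch each request to a GPU, but the scheduling shape is the same.
def handle_request(prompt: str) -> str:
    return f"completion for: {prompt}"

prompts = [f"prompt {i}" for i in range(8)]

# Requests share no state, so each can be fanned out to any idle worker
# with no inter-request communication -- "embarrassingly parallel".
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(handle_request, prompts))
```

Widening the pool (adding workers) is the whole scaling story here; whether that is cheap in reality is exactly what the replies below dispute.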

If there's not enough GPUs at a certain price point, raise prices. Then lower prices later when GPUs become available.

They did it with GPT-3.5, so why not GPT-4?


More GPUs currently don't exist. Nvidia is at capacity for production, and they have to compete with other companies who are also bidding on these GPUs. It's not an issue of raising the price point. The GPUs they want to buy have to be purchased months in advance.


> embarrassingly parallel

I don’t see why such a thing should be embarrassing. Or, at least no more so than being acute or obtuse. Just as long as nothing is askew.


"Embarrassingly parallel" is a term of art: https://en.wikipedia.org/wiki/Embarrassingly_parallel


Yes I was anthropomorphising it back into the realm of human emotion, wherein the angles at which one’s lines run need not be a source of emotional distress. Excepting perhaps the innate sadness of two parallel lines destined to ever be at each other’s sides but still never to meet across the infinite plane.


The problem is GPUs are hard to come by.

If we guesstimate that every 100 customers needs 1 NVIDIA GPU (completely random guess), then that means OpenAI needs to buy more GPUs for every 100 new customers using GPT-4. The problem is there's a GPU shortage so it's hard to add more GPUs by just throwing money at the problem.

https://www.fierceelectronics.com/electronics/ask-nvidia-ceo...
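Taking the (admittedly made-up) ratio above at face value, the capacity math is a one-liner. Both numbers here are illustrative, not real figures:

```python
# Back-of-the-envelope capacity math using the comment's guessed ratio
# of 1 GPU per 100 customers. Purely illustrative.
CUSTOMERS_PER_GPU = 100

def gpus_needed(customers: int) -> int:
    # Ceiling division: 101 customers still needs a 2nd GPU.
    return -(-customers // CUSTOMERS_PER_GPU)

print(gpus_needed(1_000_000))  # 10000 GPUs for a million customers
```

At that scale, every new million customers means another ten thousand GPUs that, per the linked article, simply aren't available to buy on short notice.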


> Wait for what!? Christmas?

Infrastructure.

> It's like they took a leaf from Google's "how to guarantee the failure of a new product" marketing.

Yeah, an infamous guaranteed failure: GPT-4. (canned laughter)


It’s an experiment. You are part of it.


It's one thing to vent frustration, but it's another to compare a capitalist startup to the Soviet Union... Get your facts right.


Sam Altman was giving people access to GPT-4 APIs because they attended a conference.

"Lick my boots, in person, and you can be one of the privileged few" is very much the behaviour of a Communist dictatorship, not a capitalist corporation.

I can spin up an Azure VM right now in almost any country I choose... except China. That's the only one where I have to beg the government for permission.


How the world of enterprise sales works may come as an unpleasant surprise to you, then.


You've clearly not experienced the reality of enterprise, then. Your opinions are based on a limited understanding and knowledge of real-life situations when it comes to this sort of stuff.


I’ve only worked in big enterprise and big government for over two decades.

I know exactly how this works.

When someone has power, they will use it. In small, petty ways, or big “do me favours for access” ways.


Then you should know this is literally what every capitalist does!


Seems correct to me. This is a good assessment of the personalities involved.


> "Lick my boots, in person, and you can be one of the privileged few" is very much the behaviour of a Communist dictatorship, not a capitalist corporation

It's both, and more besides. Veblen goods are absolutely a thing in capitalism.

Not that "giving people access to GPT 4 APIs because they attended a conference" should be controversial enough to even get worked up about, let alone to compare to a dictatorship, but Google did much the same at developer conferences: I/O 2012 swag list included a Galaxy Nexus, a Nexus 7, a Nexus Q, and a Chromebox.




