I think I worded this poorly. What he said was that a lot of people say they want open-source models but they underestimate how hard it is to serve them well. So he wondered how much real benefit would come from open-sourcing them.
I think this is reasonable. Giving researchers access is great, but most small companies are likely better off having a service provider manage inference for them rather than navigating the infra challenge themselves.
The beauty of open source is that the community will either figure out how to make it easier, or collectively decide it's not worth the effort. We saw this with Stable Diffusion, and we're seeing it with all the existing OSS LLMs.
"It's too hard, trust us" doesn't really make sense in that context. If it really is too hard for small orgs to self-host, then they won't. Hiding behind the guise of protecting these people by not open-sourcing it seems a bit disingenuous.
If he says he's inclined to open-source GPT-3, I don't see any good argument against giving startups the choice of how they run inference.