It was the high cost per API call that pulled all the fun out of GPT-3 for me. I hadn’t been following the company at all, but I believe you based on that single issue alone. It looks like GPT-J’s entire model has been open-sourced, so I could potentially run it myself for almost free? If so, that’s going to be much more fun. Thanks for sharing!
That thought was why I added “almost” right before posting! It’s free to run... except for the expensive parts. AWS still rents out GPU power, right? That might be a bit easier to get started with.
GNU/Linux requires basically the same hardware as the usual alternative (Windows). But if you switch from hosted GPT-3 to self-hosted GPT-J, you have to acquire and manage all the hardware yourself; it may sit underutilized most of the time depending on your demand, and using it to full effect takes significant software optimization. You can use hosted GPT-J instead, though; a few companies offer that.
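The utilization point above is easy to see with a back-of-envelope comparison. This is a minimal sketch with entirely made-up prices and throughput figures (the function names and all the numbers are illustrative assumptions, not real GPT-3 or AWS pricing): a pay-per-token API bills only for what you use, while a rented GPU instance bills for every hour it is up, idle or not.

```python
# Back-of-envelope: hosted pay-per-token API vs. an always-on rented GPU.
# Every number here is a placeholder assumption, not a real price.

def api_cost(tokens: int, price_per_1k: float) -> float:
    """Hosted API: you pay only for the tokens you actually generate."""
    return tokens / 1000 * price_per_1k

def self_host_cost(hours_rented: float, hourly_rate: float) -> float:
    """Rented GPU instance: you pay for every hour, including idle time."""
    return hours_rented * hourly_rate

# Hypothetical workload: 2M tokens/month, $0.06 per 1k tokens,
# vs. a $1.50/hr GPU instance kept up 24/7.
monthly_tokens = 2_000_000
api = api_cost(monthly_tokens, 0.06)
gpu = self_host_cost(24 * 30, 1.50)

print(f"API: ${api:.2f}/mo, always-on GPU: ${gpu:.2f}/mo")
```

With these made-up figures the API comes out far cheaper at low utilization; the self-hosted box only wins once it is busy enough to amortize the hours it would otherwise sit idle.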
The comment you replied to actually makes a useful point: self-hosting is not free in the sense that matters when comparing its cost against a third-party API.