Any AI product sold for a price that's affordable on a third-world salary is being heavily subsidized. These models are insanely expensive to train, guzzle electricity to the point that tech companies are investing in their own power plants to keep them running, and are developed by highly sought-after engineers being paid millions of dollars a year. $20/month was always bound to be an intro offer unless they figured out some way to reduce the cost of running the model by an order of magnitude.
We've been conditioned to pay $10/mo for an endless stream of glorified CRUD apps, but it is very common for specialized software to cost orders of magnitude more. Think Bloomberg Terminal, Cadence, Maya, lots of CAD software (like SOLIDWORKS), higher tiers of Adobe, etc., all running into the thousands of dollars per user. And companies happily pay for them because of the value they add. ChatGPT isn't any different.