I do think we can do much better than GPT-4. Size isn't everything: small models can outperform large ones when trained and fine-tuned in the right way. And transformers are hardly the be-all and end-all of language AI either - there's plenty of reason to believe they're both inefficient and architecturally unsuited for some of the tasks we expect of them. The field is brand new, and now that ChatGPT has turned the world's figurative Eye of Sauron toward developing human-level AI, we're going to see a lot of progress very quickly.