However, wouldn't a GPT trained solely on Wikipedia be quite useful? Wikipedia would be the largest user-editable corpus you could train a language model on and then ask questions of.
In addition to the obvious use of answering questions about the material, it could perhaps be a tool for checking whether, and how, a cited source actually relates to the article that cites it. Abuse detection could be another application.
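To make the citation-checking idea a bit more concrete, here's a minimal sketch; ask_model is a hypothetical stand-in for whatever model API you'd actually call, and the prompt wording and labels are just illustrative:

    # Rough sketch of the citation-checking idea above (Python).
    # `ask_model` is a hypothetical placeholder for a real LLM call.

    def ask_model(prompt: str) -> str:
        raise NotImplementedError("plug in your model call here")

    def check_citation(article_claim: str, source_excerpt: str) -> str:
        """Ask the model whether the cited source supports the claim."""
        prompt = (
            "Article claim:\n" + article_claim + "\n\n"
            "Excerpt from the cited source:\n" + source_excerpt + "\n\n"
            "Does the excerpt support the claim? Answer 'supports', "
            "'contradicts', or 'unrelated', then explain briefly."
        )
        return ask_model(prompt)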
I couldn't pin down exactly what the goal of Wikipedia is from https://en.wikipedia.org/wiki/Wikipedia but it doesn't seem like "better search" would run counter to those goals.
Claims that GPT produces "better search" are groundless until GPT demonstrably produces "better search" and the resulting product has been watched long enough to catch unintended consequences. Gonna be a wait.
Oh boy, the best place for subtle errors: a public wiki, proofread by people who are likely not subject matter experts! I don't see any way this can go poorly. And obviously search hasn't been solved for decades.