Before there were neural language models, there were n-gram models, skip n-gram models, Latent Dirichlet Allocation models, and a whole zoo of other non-neural machine learning models. Google used some of them to power the old, pre-neural Google Translate, which was still a very impressive piece of technology.
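To make the simplest member of that zoo concrete, here is a rough sketch of a bigram (2-gram) language model: count how often each word follows each other word in a small corpus, then normalize the counts into conditional probabilities. This is only an illustrative toy, not a description of any system mentioned above; the function name and example sentences are made up for the sketch.

```python
# Minimal bigram language model sketch: count adjacent word pairs and
# normalize the counts into conditional probabilities P(next | previous).
from collections import Counter, defaultdict

def train_bigram_model(sentences):
    """Count word-pair frequencies and normalize them per preceding word."""
    counts = defaultdict(Counter)
    for sentence in sentences:
        # Add start/end markers so boundaries are modeled too.
        tokens = ["<s>"] + sentence.lower().split() + ["</s>"]
        for prev, nxt in zip(tokens, tokens[1:]):
            counts[prev][nxt] += 1
    # Convert raw counts to probabilities P(next | prev).
    return {
        prev: {word: c / sum(nexts.values()) for word, c in nexts.items()}
        for prev, nexts in counts.items()
    }

model = train_bigram_model([
    "the cat sat on the mat",
    "the dog sat on the rug",
])
print(model["the"])  # {'cat': 0.25, 'mat': 0.25, 'dog': 0.25, 'rug': 0.25}
print(model["sat"])  # {'on': 1.0}
```

Real n-gram systems layered smoothing, backoff, and far larger context windows on top of this counting idea, but the core is exactly this kind of table of conditional probabilities.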