This is a great point as far as the definition of "bot" goes. That said, I think the author's overall point is that bots create an algorithmic determinism in how information gets surfaced.
I do agree with the overall trend the author is observing; what I was getting at is that this is an old problem the web has been grappling with for a long time.
There's a unique social stigma around "bots" that isn't applied the same way to power users of any other system (understandably so, given some are nefarious). I believe this stigma largely gave rise to AI-powered bots, since there's demand for bots to behave as humanly as possible: 1) to minimize obstructions like 403s, and 2) to maximize the information they can extract.
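To make the 403 point concrete, here's a minimal Go sketch of the naive gatekeeping a lot of servers do: reject anything whose User-Agent looks automated. The blocklist substrings and the hello-page handler are made up for illustration, not any real server's rules.

    package main

    import (
        "net/http"
        "strings"
    )

    // Naive gatekeeping: 403 anything whose User-Agent looks automated.
    // The substrings here are illustrative, not a real blocklist.
    func blockBots(next http.Handler) http.Handler {
        return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
            ua := strings.ToLower(r.UserAgent())
            for _, hint := range []string{"bot", "crawler", "curl", "python-requests"} {
                if strings.Contains(ua, hint) {
                    http.Error(w, "Forbidden", http.StatusForbidden) // the obstruction
                    return
                }
            }
            next.ServeHTTP(w, r)
        })
    }

    func main() {
        page := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
            w.Write([]byte("<html><body>hello, human</body></html>"))
        })
        http.ListenAndServe(":8080", blockBots(page))
    }

The obvious countermove is to send a browser User-Agent, and that arms race is exactly what ends with bots mimicking humans wholesale.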
Maybe if web servers were broadly designed with bots in mind as power users, the web would bifurcate into an "optimized web" for bots (APIs) and a "traditional web" for humans (HTML). Instead, we're getting this mess of bots and humans all trying to squeeze through the same doorway.
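For what it's worth, HTTP already has a mechanism for the two-doorways version: content negotiation. A minimal sketch, assuming a made-up /article/42 endpoint and payload, where clients that ask for JSON get structured data and everyone else gets HTML:

    package main

    import (
        "encoding/json"
        "net/http"
        "strings"
    )

    // One URL, two doorways: clients that ask for JSON get structured
    // data, everyone else gets HTML. Path and payload are made up.
    func article(w http.ResponseWriter, r *http.Request) {
        data := map[string]string{"title": "Example", "body": "..."}
        if strings.Contains(r.Header.Get("Accept"), "application/json") {
            w.Header().Set("Content-Type", "application/json")
            json.NewEncoder(w).Encode(data) // the "optimized web" doorway
            return
        }
        w.Header().Set("Content-Type", "text/html")
        w.Write([]byte("<h1>" + data["title"] + "</h1><p>" + data["body"] + "</p>"))
    }

    func main() {
        http.HandleFunc("/article/42", article)
        http.ListenAndServe(":8080", nil)
    }

Same URL, two representations, so the bifurcation could happen per request rather than per site. In practice almost nobody wires HTML and JSON to the same handler like this, which is why bots end up scraping the human doorway.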