I'm not saying this hasn't happened to you, but I'm curious: are you working with scientific Python codebases or similar? I've done Python development off and on for the last ~10 years, and I think I can count the number of times I've had transitive conflicts on one hand. But I almost never touch scientific/statistical/etc. Python codebases, so I'm curious whether this is a discipline/practice concern in different subsets of the ecosystem.
(One of the ways I have seen this happen in the past is people attempting to use multiple requirements sources without synchronizing them or resolving them simultaneously. That's indeed a highway to pain city, and it's why modern Python packaging emphasizes either a single standard metadata file like pyproject.toml or a fully locked environment specification like a frozen requirements file.)
I've encountered the same problem with Python codebases in the LLM / machine learning space. The requirements.txt files for those projects are full of unversioned dependencies, including Git repositories at some floating ref (such as master/HEAD).
In the easy cases, digging through the PyPI version history to identify the latest version as of some date is enough to get a working install (as far as I can tell -- maybe it's half-broken and I only use the working half?). In the hard cases, it may take an entire day to locate a CI log or contemporary bug report or something that lists out all the installed package versions.
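For those easy cases, the PyPI archaeology is at least scriptable via the public JSON API (https://pypi.org/pypi/<package>/json). A rough sketch; the package name and cutoff date below are just placeholders, and picking by upload time is only a heuristic, since backport releases can be uploaded later than semantically newer ones:

    import json
    import urllib.request
    from datetime import datetime, timezone

    def latest_version_as_of(package, cutoff):
        # Pull the full release history from the public PyPI JSON API.
        url = f"https://pypi.org/pypi/{package}/json"
        with urllib.request.urlopen(url) as resp:
            releases = json.load(resp)["releases"]
        candidates = []
        for version, files in releases.items():
            if not files:  # skip versions with no uploaded files
                continue
            uploaded = min(
                datetime.fromisoformat(f["upload_time_iso_8601"].replace("Z", "+00:00"))
                for f in files
            )
            if uploaded <= cutoff:
                candidates.append((uploaded, version))
        # Newest-by-upload-time is a heuristic, not a guarantee of compatibility.
        return max(candidates)[1] if candidates else None

    print(latest_version_as_of("numpy", datetime(2021, 6, 1, tzinfo=timezone.utc)))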
It doesn't help that every Python-based project seems to have its own bespoke packaging system. It's never just pip + requirements.txt; it'll have a Dockerfile with `apt update`, or some weird meta-packaging thing like Conda that adds its own layers of non-determinism. The overall feeling is that it was only barely holding together on the author's original machine, and that getting it to build anywhere else is pure luck.
That’s still the happy case. Once upon a time I spent four days chasing dependencies before reaching out to the original author, who admitted that it hadn't actually worked in several months, but that he had kept on editing the code anyway.
The program depended on sub-dependencies that were fundamentally incompatible, requiring different major versions.
I've had similar problems with Python packaging in both the web dev and embedded spaces. There are ways to largely solve these issues (use package managers with lock files and do regular dependency updates), but I rarely see that done on the projects I work on.
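Even without buying into a full lock-file tool, the basic check is cheap to script. A minimal sketch, assuming a pip-freeze-style "pkg==1.2.3" file whose name here is just a placeholder (no handling of extras, environment markers, or name normalization):

    from importlib.metadata import version, PackageNotFoundError

    def check_against_lock(lock_path="requirements.lock"):
        # Walk a pinned "pkg==1.2.3" file and compare against what's installed.
        with open(lock_path) as f:
            for line in f:
                line = line.strip()
                if not line or line.startswith("#") or "==" not in line:
                    continue
                name, _, pinned = line.partition("==")
                try:
                    installed = version(name)
                except PackageNotFoundError:
                    print(f"{name}: pinned to {pinned}, but not installed")
                    continue
                if installed != pinned:
                    print(f"{name}: pinned to {pinned}, installed {installed}")

    check_against_lock()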
If you use gRPC directly and some other library in your stack does as well, it's very likely you'll end up with conflicts, either on gRPC itself or on the protobuf library under the hood.
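When that bites, one quick diagnostic is to dump which installed packages declare a requirement on protobuf and what ranges they ask for. A small sketch with importlib.metadata; the string match is crude, and nothing about it is gRPC-specific:

    from importlib.metadata import distributions

    # List every installed distribution that declares a requirement on
    # protobuf, along with the version range it asks for.
    for dist in distributions():
        for req in dist.requires or []:
            if req.lower().startswith("protobuf"):
                print(f"{dist.metadata['Name']} -> {req}")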