The JupyterLab extension system is the biggest offender imo.
"Oh, I need to install the python package and the jupyter extension separately? And there are no breadcrumbs if I install one but not the other? And I have to figure out which versions are mutually compatible with each other and jupyterlab because 'everything latest' sure isn't? And I have to install extensions to get features that used to be built-in? And then I have to explain the install procedure to the person I'm trying to share with, twice, because he didn't believe me the first time? UGGGGGGGHHHHHHHH"
We maintain a custom internal Docker image that everyone uses for their JupyterLab needs, which makes this manageable: we only have to get everything working once, and everyone else can just docker pull the :latest tag and be good. It's become a rather large image, since it needs to include everything anyone uses, but for a small-ish team this has worked well for about two years now.
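For the curious, a minimal sketch of what the image boils down to (base image, package list, and versions here are illustrative placeholders, not our actual pinned set; the point is baking one tested combination into the image instead of chasing "everything latest"):

    FROM python:3.11-slim
    # Pin JupyterLab and its extensions as one tested set, so the
    # version-compatibility puzzle gets solved once, in the image.
    RUN pip install --no-cache-dir \
        jupyterlab==4.1.8 \
        jupyterlab-git==0.50.1 \
        ipywidgets==8.1.2
    EXPOSE 8888
    # --allow-root because the container runs as root by default
    CMD ["jupyter", "lab", "--ip=0.0.0.0", "--no-browser", "--allow-root"]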
Another upside: there's a gcloud one-liner to start a VM from a given Docker image, and the image is designed to work both locally and in a GCP VM, with notebooks in a git repository that gets checked out automatically on container creation. Switching from a laptop to a 16-core VM to churn through a large dataset from a bucket is pretty seamless.
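The one-liner in question is gcloud's create-with-container; roughly this (the instance name, image path, and machine type are placeholders):

    gcloud compute instances create-with-container jlab-vm \
        --machine-type=n2-standard-16 \
        --container-image=gcr.io/my-project/jupyterlab:latest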
"Oh, I need to install the python package and the jupyter extension separately? And there are no breadcrumbs if I install one but not the other? And I have to figure out which versions are mutually compatible with each other and jupyterlab because 'everything latest' sure isn't? And I have to install extensions to get features that used to be built-in? And then I have to explain the install procedure to the person I'm trying to share with, twice, because he didn't believe me the first time? UGGGGGGGHHHHHHHH"