You’re totally missing the point. The point is: you have some Python script, and you don’t want it to disappear into the sad world of a ticket attachment. If you add the 5 lines of code and have a nice onboarding experience, it doesn’t have to disappear into a ticket attachment; it can instead disappear into your browser bookmarks. You should never test the limits of this framework, because if you’re doing anything other than putting a simple script into a webpage, you shouldn’t be using it. It’s rot-proof.
The US Python foundational team was fewer than 10 people. If you think any 10-person team in a 200,000-employee corporation is that critical, well, I’d have to disagree.
When their jobs were moved to Munich there was a lot of discussion about how important they were, but something else stood out to me. One of the things they accomplished was taking over a year to make sure the monorepo could be upgraded to the latest version of Python.
As an outsider, this definitely smells like an org creating work just to justify its budget. You have a single team of a handful of engineers doing the Python upgrade work for tens of thousands of engineers. Doesn’t feel right, right off the bat.
I’m sure that soon enough, with the role moving, they’ll find out whether the team is needed at all.
you are right. it's made worse because there's an internal rule that you can't ever have two versions of a library in the monorepo at the same time. it has to be one giant CL for the entire migration, company-wide.
It was a nightmare, mostly because there were a lot of special cases (think of a Python 2 appengine-classic app with no upgrade path for some //third_party lib that the team needed to keep running anyway). And it's not just the 10^5-10^6 concurrent contributors to the codebase; it's 10+ years of contributions, much of which were maintenance-mode projects, many of which had been inherited two or three times over.
If every package in the monorepo were forced to use the exact same version, then there would be literally thousands of these teams, even tens of thousands, which I doubt.
I don’t really believe that if I go to 500 Google-owned websites I will find the exact same version of Angular everywhere.
every package in the monorepo did indeed need to use the same exact version of python. the python interpreter itself is part of the monorepo, and all other python code is built and run with that binary.
the python team did not personally upgrade every line of code to work with a new python version; we made sure that the code could be upgraded and got the people maintaining each package to do the upgrade. (a missing piece of the puzzle is that every package, even mirrored third-party code, has an official owner responsible for keeping it working within google).
what the python team needed to do was: upgrade all the tooling (build system, linter, type checker, etc.) to work with the new version; run large-scale tests to see what would break in an upgrade; analyse the dependency graph to figure out what order packages needed to be fixed in (e.g. if numpy needed to be upgraded, that would be super high priority because thousands of other packages depend on it, and we might even pitch in and help the team maintaining it with that if needed); and track the progress of the upgrades, communicating with teams whose code hadn't been upgraded yet so that they could prioritise the work.
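to make the "what order packages needed to be fixed in" part concrete: it's essentially a topological sort over the build dependency graph. a rough sketch using python's stdlib graphlib, with made-up package names standing in for the real build graph:

    from graphlib import TopologicalSorter  # stdlib, python 3.9+

    # hypothetical slice of the dependency graph: each key depends on
    # everything in its value set.
    deps = {
        "numpy": set(),
        "scipy": {"numpy"},
        "internal_ml_lib": {"numpy", "scipy"},
        "leaf_app": {"internal_ml_lib"},
    }

    # packages with no un-upgraded dependencies come first, so something
    # thousands of packages depend on (numpy) gets fixed before anything
    # built on top of it.
    print(list(TopologicalSorter(deps).static_order()))
    # -> ['numpy', 'scipy', 'internal_ml_lib', 'leaf_app']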
sometimes we also needed to write automated refactorings to fix some basic code pattern that changed, e.g. if some very widely used library decided it was going to stop accepting numbers where it needed strings, and that the caller should do the conversion first, we would try to mechanically fix that across the codebase rather than make everyone do it themselves.
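as a made-up illustration of what one of those mechanical fixes can look like: suppose a widely used function notify() stops accepting numbers and callers now have to pass strings. a stdlib-only sketch (notify is hypothetical, and real large-scale tooling preserves comments and formatting, which ast.unparse does not):

    import ast

    class WrapNumberArgs(ast.NodeTransformer):
        """rewrite notify(42) into notify(str(42)); notify is a made-up API."""

        def visit_Call(self, node):
            self.generic_visit(node)  # rewrite nested calls first
            if isinstance(node.func, ast.Name) and node.func.id == "notify":
                node.args = [
                    ast.Call(func=ast.Name(id="str", ctx=ast.Load()), args=[a], keywords=[])
                    if isinstance(a, ast.Constant) and isinstance(a.value, (int, float))
                    else a
                    for a in node.args
                ]
            return node

    src = "notify(42)\nnotify('already a string')\n"
    tree = ast.fix_missing_locations(WrapNumberArgs().visit(ast.parse(src)))
    print(ast.unparse(tree))
    # notify(str(42))
    # notify('already a string')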
anyway, this is all to say that it was indeed a large and time-consuming problem, and we were certainly not doing it because we had nothing better to do with our time.
https://news.ycombinator.com/item?id=40171125
Draw your own conclusions about the level of support for this library.