And I would have considered that to be large 20 years ago, because it would have filled an entire CD-ROM disc, and it would have taken over five hours to download.
However, I still have somewhere over 100 GB free on my seven-year-old laptop, and the download will take me ~40 seconds.
But even if the 600 MB were a problem, Bazel doesn’t need an external version of the JVM: it bundles its own and fits the whole thing in under 50 MB. Some people are under the mistaken impression that you have to install the JVM first and then install Bazel, which is simply not true.
Good to know that you live in a privileged place with access to a fast internet connection. Most places don't have the kind of bandwidth [1] to download a 600 MB file in 40 seconds. Also, Bazel needs a huge amount of RAM because of the JVM, and it runs a daemon process in the background to speed up builds. Running a background JVM daemon is a NO for me; Bazel wastes system resources.
> Most places don't have the kind of bandwidth…
Most people don’t have the kind of bandwidth it takes to download a 50 MB binary? Are we still talking about Bazel here? Because that’s the size of the download. What about tools like compilers? You have to download those, too.
> Running a background JVM daemon is a NO for me; Bazel wastes system resources.
What system resources does it actually use up? A half-gig of RAM? This kind of optimization is penny-wise and pound-foolish. You are spending your precious time and energy worrying about a resource whose marginal cost to you is about the same as a cup of coffee.
I run Bazel on a terribly obsolete, seven-year-old laptop which is my daily driver. Sometimes Chrome will choke on a website, sometimes I'm waiting ages for Homebrew to update or for NPM to download some packages, but Bazel is not a problem.
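For what it’s worth, the daemon is also configurable. If the resident JVM bothers you, you can cap its heap, make it exit after a short idle period, or kill it on demand. A sketch using Bazel’s startup options (the numbers are illustrative, not recommendations):

    # In .bazelrc: cap the server's heap and exit after 60s of idleness.
    startup --host_jvm_args=-Xmx512m
    startup --max_idle_secs=60

    # Or kill the resident server immediately when you're done:
    bazel shutdown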
> You are spending your precious time and energy worrying about a resource whose marginal cost to you is about the same as a cup of coffee.
I don't install tons of random garbage because I want to know and understand what is running on the hosts that I maintain.
I need to be able to debug the stuff I run, and complexity makes it difficult.
Tens of MB of stuff is complexity. Tens of build-time or run-time dependencies are complexity. A compiled tool when a script would have been sufficient is also complexity.
> I need to be able to debug the stuff I run, and complexity makes it difficult.
Bazel is way easier to debug than Make. Debugging a decent-sized build system made with Make is just an exercise in suffering.
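For example, Bazel can tell you exactly how targets relate and why an action re-ran, which plain Make mostly can’t. A sketch (the //foo:bar and //baz:qux labels are made up):

    # How does //foo:bar end up depending on //baz:qux?
    bazel query "somepath(//foo:bar, //baz:qux)"

    # Write out an explanation of why each action was re-executed.
    bazel build //foo:bar --explain=explain.log --verbose_explanations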
I am… honestly… no longer interested in understanding the entire software stack. I understand my time on this earth is limited and want to spend it doing other things. With Bazel, I am spending less time fucking around with build systems and more time doing the stuff I care about.
> Tens of MB of stuff is complexity. Tens of build-time or run-time dependencies are complexity. A compiled tool when a script would have been sufficient is also complexity.
As someone who still writes C, I can understand the joy that people feel when you make some cool program and it’s measured in KB, not MB. However, what you’re describing strikes me as fetishistic.
That 600 MB is mostly standard library, and reusing it makes every app smaller. Statically linking everything creates size and version skew problems that I hoped we had left behind in the 1980s.
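You can see the difference on any Linux box with a one-line C program (sizes are ballpark and vary by system and libc):

    $ gcc hello.c -o hello              # dynamic: shares libc at run time
    $ gcc -static hello.c -o hello-static
    $ ls -lh hello hello-static        # roughly ~16K dynamic vs ~800K static

Multiply that static overhead by every binary on the system and the shared library starts to look like a bargain.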
> In my experience there is a strong correlation between executable size and how hard it is to unravel any issue that arises in it.
Bazel bundles the JRE; that’s why it’s 45 MB.
The correlation is kinda beside the point, though: in my experience, it’s easier to unravel issues in Bazel than in Make, and Make is much smaller.