You're nitpicking one example. Putting base64 into std makes the higher level comment go from complaining about 100 dependencies to complaining about 99 dependencies.
I don't have an opinion about the broader dependency problem in Rust. The comment to which I responded was hosting an internal debate about Base64 and UTF-8. I think that debate is easy to resolve: use UTF-8, put Base64 in the stdlib. Do you disagree?
I guess I'd just say that the aggregation of these "keep base64 out of std" decisions is why every Rust thingy I put together ends up with a zillion dependencies. There might be an upside to that (I have not detected it), but there's an obvious downside.
Rust gets a bunch of stuff right that other languages missed. I don't think this is one of them. I think Go's strategy of having a pretty-ok version of most of the table-stakes things developers do is probably the right one, just like I think sum types and option/result are better than error types and `nil`.
I share your irritation at people who drop into the middle of discussions like these to nitpick random language war arguments, but this is a thread that is actually about Rust dependency hell, and the comment I responded to, I think, sort of diagnoses why Rust has that "problem" (if you think it's a problem --- I do, but whatever).
For whatever this is worth, by the way, languages that have Base64 in std appear to include Ruby, Javascript, Python, Clojure, Java, Scala, Go, Elixir, Nim, and Swift. OCaml, C++ and Haskell do not.
> I guess I'd just say that the aggregation of these "keep base64 out of std" decisions are why every Rust thingy I put together ends up with a zillion dependencies.
Right, I partially agree, and this is a more substantive critique. But it's a much harder position to put into a pithy HN comment that sounds obviously correct. ;-)
With that said, it's pretty common in my experience to wind up in exactly the same sort of situation in Go-land. Go's standard library might hold it off for a little bit, but I seem to accrue large dependency trees there too.
> There might be an upside to that (I have not detected it)
Aw c'mon. The comment you were originally nit-picking even called it out. I'm sure you've heard the quip, "the standard library is where things go to die." And Python's in particular is often brought up as an example. Go's standard library hasn't suffered as badly as Python's has, but there are parts that have died.
If you've used Python a lot, then the fact that you have to use 'requests' and not std's HTTP client is probably not a downside. You've already digested that ecosystem knowledge. But we, as stewards of the standard library, see this as a downside.
> For whatever this is worth, by the way, languages that have Base64 in std appear to include Ruby, Javascript, Python, Clojure, Java, Scala, Go, Elixir, Nim, and Swift. OCaml, C++ and Haskell do not.
Yes. Our design philosophy around std learns from some of those languages. Python in particular has influence in two respects. First is what I said above: when you have to make an HTTP request in Python, the standard advice is "use the third party requests library" despite the standard library having an HTTP client. This should make it clear that following your advice is merely a necessary criterion; it is not sufficient. Second is that "std should remain backward compatible for approximately forever." That means we have a very high bar for including things. (I am saying "we" here because I am on the Rust team that decides these things.) API evolution is more difficult in std than outside std, because we can't just release a 2.0. So if we get the API wrong, it sucks. And eventually a superior third party crate appears and everyone has to "learn" to use the third party crate instead of std.
And of course, adding things to std also means a higher maintenance burden. There have to be people willing to champion and maintain these sorts of huge additions. And many of us (now speaking as the regex crate maintainer) are actively opposed to throwing it into std precisely because it becomes more difficult to evolve.
Even if we did all this and put whatever pet thing it is you want to use without pulling in dependencies (I guess that's "HTTP client" here), that doesn't really solve the general problem. Eventually, you're going to pull in a dependency for _something_. And that crate's author may have decided that depending on something else instead of rolling their own thing was the better choice. Or also the opposite: perhaps they want to split apart the internals of their crate so that others can reuse those components. I did that with the regex crate. So on the one hand, I'm contributing to the high dependency count number, but on the other, I'm solving actual problems people have by providing access to internal APIs that are separately versioned. How many other regex libraries do that? Not many. Because dependencies are too annoying in the C ecosystem, which is where most regex engines live.
Getting more hand wavy... In the Go world---and I speak as someone who has used and enjoyed Go for over a decade now---performance is not a #1 concern. Not like it is in the Rust ecosystem. This has huge API implications. Culturally, people expect Rust programs to be fast, and they expect to be able to achieve performance comparable to what they could get in C. In this environment, API complexity isn't much of a reason not to do something. Everything gets sacrificed at the altar of performance. I know, because I've been carrying sacrificial lambs to the altar for years in the Rust ecosystem. But in Go? Nope. API simplicity is valued more highly relative to things like performance.
Here's the punch line: if you want to write a program that sends an HTTP request and has a dependency count of 0, then Rust isn't the right language for you. It probably never will be.
Also, to be clear, I am someone who works both sides of this debate. I can often be found advocating against the use of dependencies and for decreasing dependency count, precisely because I perceive and experience costs associated with them. I've done work to reduce the dependency tree size of core ecosystem libraries. But there are some fundamental trade offs that put floors on how low you can go. And you can also see me advocating against using many of my own crates if a "simple" solution works for you. Take aho-corasick for example. My implementation of it is several thousand lines because I sacrificed just about every lamb possible at the altar of performance. But the <10 line naive implementation of multiple substring search might be just fine for your use case. And if it is, just write that and don't bring out the big guns.
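To make that concrete, here's a sketch of what such a naive multiple substring search might look like. This is my own illustrative code, not anything from the aho-corasick crate; the function name is made up for the example:

```rust
/// Naive multiple substring search: returns the byte offset and pattern
/// index of the leftmost match, or None if no pattern matches. Worst
/// case is O(haystack len * total pattern len), which is often fine
/// for small inputs and short pattern lists.
fn naive_multi_search(haystack: &[u8], needles: &[&[u8]]) -> Option<(usize, usize)> {
    for start in 0..=haystack.len() {
        for (i, needle) in needles.iter().enumerate() {
            if haystack[start..].starts_with(needle) {
                return Some((start, i));
            }
        }
    }
    None
}

fn main() {
    let needles: &[&[u8]] = &[b"brown", b"quick"];
    // "quick" is the leftmost match, at byte offset 4, pattern index 1.
    assert_eq!(naive_multi_search(b"the quick brown fox", needles), Some((4, 1)));
    assert_eq!(naive_multi_search(b"no matches here", needles), None);
    println!("ok");
}
```

A dozen lines like this won't win benchmarks, but if your haystacks are small, it avoids a dependency entirely.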
Yeah, so let me be clear: I can see why, especially when the std lib was in its formative stages, the project might have decided to keep it, uh, lithe. You have Python as a very good counterexample to "kitchen sink stdlib can be good". Python's stdlib is a kitchen sink that badly needs to be cleaned out, and never will.
You're right that Go has some degrees of freedom that Rust doesn't have, especially on performance. But people do have perf problems with the Go stdlib, and they replace components (amusingly, one of the first things I built in Go did its own sockets and its own polling because Go's timers were too expensive for what I needed).
I just think it's better to have a good-enough stdlib --- with well-thought out interfaces where you're not going to have net/http and then net/http2 and then net/http3, and 16 different ways to run a subprocess and read the output --- that people can optionally outdo with 3rd party dependencies, than the situation I think Rust is in right now, where you basically can't do anything without a bunch of dependencies.
I'm definitely not dunking on Rust. We do a bunch of Rust here and I mostly enjoy working in it.
> Go's standard library hasn't suffered as badly as Python's has
In no small part because it was born 20 years later: lots of modules were added a bit haphazardly to the Python stdlib early on and the tech ended up mostly dying, but for the most part they're still there in case you care to use them. Like sunau, aiff, or dbm. Others were added because package managers were not really a thing back then so you'd try to get useful packages into the stdlib so they could be used easily, but then the maintenance burden increases a lot (and all the users assume the stdlib is maintained anyway).
Having to find, download, review, and build a bunch of extra dependencies is itself a cost (as is having to surveil all the dependencies you do explicitly want, to make sure they're not pulling in fucky code to do table-stakes things that the stdlib could just stdize).
Again: these are downsides, but they're not dispositive. On balance there could be good reasons to have a terse, svelte, balletic std. I'm just saying, there's clearly a downside to the decision.
Without getting into Rust’s standard library inclusion philosophy, I would note that UTF-8 and Base64 are quite different cases, and shouldn’t be combined in this way.
String encoding is pervasive, and if the standard library didn’t have strings or didn’t specify the encoding, there would be much grief throughout the ecosystem, because you can’t just patch it in at the library level, it’s a language-level feature. So string encoding is a choice that has to be made.
Base64, on the other hand, is just a bunch of functions, nothing special about it that means it has to be in the standard library.
It does need to be in the standard library, because everyone needs it, almost nobody has special needs for it, and if it's not in the stdlib then you have to find, download, review, and build it and its dependencies, and that is problematic.
When I say that something “has to be in the standard library”, I mean that it can’t be implemented outside the standard library. That’s certainly not the case here. You’re using an outright bad definition of “need” here—subjective opinion rather than objective requirement.
> because everyone needs it
This is factually wildly wrong. I wrote a fair bit more here but decided it wasn’t helpful. Précis: web stuff tends to load it indirectly (though amusingly most of the time actually not use it, so that Base64 code won’t actually end up in your binary), but it’s not terribly common outside of internet stuff to reach for Base64.
I’ll leave just one more remark about Base64: once things are in the standard library, breaking changes can no longer be made; the base64 crate is still experiencing breaking changes (<https://github.com/marshallpierce/rust-base64/blob/master/RE...>, 0.12 and 0.13 were last year and 0.20 is not released), largely for performance reasons.
Please don’t just call the thin-std approach “problematic” without acknowledging that the alternative is at least as problematic, just with a different set of caveats.
> This is factually wildly wrong. I wrote a fair bit more here but decided it wasn’t helpful. Précis: web stuff tends to load it indirectly (though amusingly most of the time actually not use it, so that Base64 code won’t actually end up in your binary), but it’s not terribly common outside of internet stuff to reach for Base64.
It also seems to be distinctly losing popularity as time goes on, as the alphabet is not restricted enough to be easy for a human to sight-read/copy, and a wider alphabet would have just as good ASCII channel compatibility.
(begin rant) My issue with base64 is that + and / are terrible characters for the alphabet, and that the final padding = can be omitted, so that base64 strings cannot be safely concatenated (as whitespace is ignored).
See also https://en.wikipedia.org/wiki/Base64#Variants_summary_table. base64url is common, with - and _ instead of + and /. As for concatenating, that’s not valid even in variants where = padding is mandatory: padding marks the end and is not allowed to occur mid-string. But I don’t think this is a problem, either: transparently concatenating Base64 strings is not generally a useful operation; I can’t think of any valid use case for it off the top of my head, I’m curious why you might think it useful. Base64 is a transfer encoding sort of a thing, and concatenating blobs while in their transfer encoding just… isn’t something that you do.
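To illustrate the padding point, here's a minimal sketch of a standard-alphabet Base64 encoder (illustrative only --- this is not the base64 crate's API), showing why naively concatenating two padded encodings produces something a strict decoder must reject:

```rust
// The standard Base64 alphabet from RFC 4648; base64url swaps + and /
// for - and _.
const ALPHABET: &[u8; 64] =
    b"ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";

/// Encode bytes as standard padded Base64: each 3-byte group becomes
/// four alphabet characters; a short final group is padded with '='.
fn base64_encode(input: &[u8]) -> String {
    let mut out = String::new();
    for chunk in input.chunks(3) {
        // Pack up to three bytes into a 24-bit number.
        let n = (u32::from(chunk[0]) << 16)
            | (u32::from(*chunk.get(1).unwrap_or(&0)) << 8)
            | u32::from(*chunk.get(2).unwrap_or(&0));
        out.push(ALPHABET[(n >> 18) as usize & 63] as char);
        out.push(ALPHABET[(n >> 12) as usize & 63] as char);
        out.push(if chunk.len() > 1 { ALPHABET[(n >> 6) as usize & 63] as char } else { '=' });
        out.push(if chunk.len() > 2 { ALPHABET[n as usize & 63] as char } else { '=' });
    }
    out
}

fn main() {
    // Padding marks the end of the data, so gluing two encoded strings
    // together puts '=' mid-string, which is not valid Base64.
    assert_eq!(base64_encode(b"a"), "YQ==");
    assert_eq!(base64_encode(b"b"), "Yg==");
    assert_eq!(base64_encode(b"ab"), "YWI="); // not "YQ==Yg=="
    println!("ok");
}
```

(It's also a nice demonstration of the "just a bunch of functions" point upthread: the whole encoder fits in ~20 lines.)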
It’s experiencing breaking changes in the name of improving performance and flexibility. This is precisely the reason why you don’t want it in the standard library, because that would have either prevented these improvements from being invented, or fractured the ecosystem between std::base64 and base64 in just your urllib/urllib2/urllib3/requests sort of way.
As it stands, by being external, people can depend on base64 0.11, 0.12, 0.13, &c. and it all works fine, and over time they’ll tend to migrate to the latest version (dealing with the generally minor breaking changes involved) but it doesn’t matter so much.
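For the record, this works because Cargo will resolve multiple semver-incompatible versions of the same crate within one build, and each dependent sees the version it asked for. A sketch (the `some-dependency` name below is hypothetical):

```toml
[dependencies]
# Your crate asks for the 0.13 line directly...
base64 = "0.13"
# ...while a transitive dependency may pin base64 0.12. Cargo compiles
# both versions into the dependency graph side by side; neither crate
# notices the other.
some-dependency = "1"  # internally depends on base64 = "0.12"
```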
The ecosystem is already fractured! Depending on when you added the base64 crate to your project, you have incompatible interfaces! This seems bananas. This isn't, like, serde. It's base64!
Perhaps I didn’t express it in the clearest way; what I meant was more irredeemably fractured with no intent or expectation of reconciliation. When it’s split between base64 0.11, 0.12 and 0.13, it’s reasonable to expect people to update to the latest version of base64 over time—and hopefully that will eventually just be "1", but if it takes time to get there that’s fine.
The incompatible interfaces thing can be a problem in Rust and does require a bit of care, but in this particular case isn’t a serious problem; the base64 crate is almost always going to be implementation detail, not exposed publicly. With serde, yes, reaching a stable 1.0 was definitely more important for ecosystem compatibility.
If I want to dunk on Rust going forward, I can just say "it is important to know the difference between 0.11, 0.12, and 0.13 of the base64 crate". I think this is a hard argument to rebut. My point is not that Rust sucks, but that the stdlib strategy they've pursued is wrong; not a strength of the language.
Further: this seems like an easy problem to fix, much easier than "Go lacks match statements and generics" or "Python lacks meaningful typing altogether". But my guess is that culturally, it's just as hard to fix.
It’s not important. The differences are tiny, the sort of thing that would probably pass unnoticed in dynamically-typed languages but which Rust’s greater rigour requires be marked as breaking changes. As for knowing, humans aren’t in the habit of memorising every detail of libraries that they only interact with occasionally, and this is such a case.
Besides that, you’ll normally be dealing with the latest version, and if you deal with an older version then you should normally consider upgrading it. I looked through a few projects of mine with base64 deep in their dependency trees, and it was all 0.13 except for one instance of 0.12 via a dependency that needs updating.
You’re stubbornly ignoring the problem of freezing base64 into the standard library, apparently obdurately refusing to even acknowledge that a thick std is every bit as problematic as a thin std, just with a different set of caveats.
If you freeze base64 into the standard library, it’s frozen. No more even slightly incompatible changes can occur, and so you’re stuck with something inferior for ever after. This is exactly the urllib/urllib2/urllib3/requests situation. This is the “standard library is where things go to die” problem.
Rust’s chosen standard library strategy is in no way wrong; it’s just a different set of trade-offs, and incidentally one that experience says is decidedly preferable for a language of Rust’s complexity (wherein there is routinely not one simple and obvious way of doing a thing, in contrast to languages like Go, Python and JavaScript which have distinctly lower hazards to freezing things into the standard library, yet even their standard libraries have wilted in places).
I think it's you who are ignoring the problem. You say that by freezing a particular Base64 interface into std, Rust programmers lose the opportunity to take advantage of more efficient interfaces. But that's obviously not the case. std can just host the best general-purpose Base64 interface, a "good enough" implementation, and people who are fussy about their Base64s can use any crate they like. That's what happens with the Go standard library, and, in fact, it's also what happens in Rust (for instance, with the nix crate).
I don't think you've thought this all the way through. This last response of yours seems more reflexive than contemplative; for instance, it suggests that breaking changes are no big deal, because they're small breaking changes. That's not how breaking changes work! Indeed, I feel like everyone who's ever worked in a totally undisciplined package ecosystem (I'm not saying Rust is one of those) knows what the end result of this is: hours of brain surgery figuring out how to reconcile incompatible dependencies-of-dependencies.
> You say that by freezing a particular Base64 interface into std, Rust programmers lose the opportunity to take advantage of more efficient interfaces. But that's obviously not the case. std can just host the best general-purpose Base64 interface, a "good enough" implementation, and people who are fussy about their Base64s can use any crate they like. That's what happens with the Go standard library, and, in fact, it's also what happens in Rust (for instance, with the nix crate).
nix is not a good example. Nix gives _additional_ APIs that std doesn't provide that are specific to Unix.
This isn't about std-having-base64 preventing the use of external third party implementations. This is an absurd interpretation of what chrismorgan is saying. They're saying that std's implementation is frozen forever. We can't change it. One of three things occurs:
1. We got the API right and everyone is happy in every use case.
2. We got the API really wrong and now we probably need to deprecate it.
3. The API is too simple to permit maximal performance, and now people need to know a decision procedure to choose between std and a third party library.
You seem to think (2)+(3) is okay, but I'm actually not a huge fan of either. (2) causes churn and ecosystem pain. (3) causes confusion and becomes the butt of a joke: "yeah you actually shouldn't use the base64 stuff in std since it's so slow, so just use FooBar's implementation from crates.io instead." It's not unlike the JSON situation in the Go world: I am routinely having to replace the standard library's JSON implementation with a third party one that's faster. The difference is that "good enough" in Go-land is much more culturally acceptable than it is in Rust-land. In Rust land, as I said in an adjacent thread, everything gets sacrificed at the altar of performance. When people come to Rust, they generally expect to get the fastest of everything. Otherwise, it's not good enough to supplant C or C++.
The result of this is that adding things to std is a risk. So when we add things, we need to do a risk analysis and evaluate its benefits against its costs. When you nitpick a specific example like base64, it's easy to lose sight of the bigger picture here. Bringing base64 into std wouldn't solve the problem you're complaining about. (If it did, I have no doubt we'd do it, because it's base64. Probably we can get that right.) What you're really advocating is a philosophical/cultural change that will dramatically increase the size of std, and thus increase the risk we take with respect to introducing things that people will ultimately wind up not using. We don't want to do that. We want the ecosystem to work that out so that they have room to evolve.
> knows what the end result of this is: hours of brain surgery figuring out how to reconcile incompatible dependencies-of-dependencies
This generally only occurs for public dependencies. base64 is very unlikely to be a public dependency. Folks who maintain foundational public dependencies are generally quite careful about pushing out breaking change releases. (`rand` is a good counterexample, but it looks like they've slowed things down as of late.)
I'm not a huge fan of taking forever to get to a stable 1.0 for core ecosystem libraries, so that's definitely a knock against our approach. I'd rather see folks commit to a stable interface for some years even if the API isn't perfect in order to minimize churn.
I can assure you that my position is not some reflexive HN drivel. I've been on the library team since the beginning. I've thought a lot about this and lived through the trade offs. And chrismorgan is right: this is a decision with trade offs. There is no obviously correct choice here. I probably hate bringing in dependencies as much or more than you do.
Just a note that I did read this, wanted to reply, but felt I needed to write a reply that did it justice, and then it fell off my radar. We will meet again on the fields of language battle, burntsushi! (Thanks for taking the time to write this, even though I disagree with a bunch of it.)
You could use Rust editions to allow evolving the standard library, perhaps - they changed the prelude (slightly) in Rust 2021. That might come with its own problems - you can't actually tell the major version of any given library from code (unlike with Go), so changing the definition of a standard library package upon an edition change could be confusing.
Are you referring to base64 here? If so, then you are very very wrong. I've written lots of Rust programs over the years and have needed base64 precisely once IIRC.
So maybe it's harder to work out than you think.