It’s experiencing breaking changes in the name of improving performance and flexibility. This is precisely why you don’t want it in the standard library: that would have either prevented these improvements from being made, or fractured the ecosystem between std::base64 and base64 in just the same urllib/urllib2/urllib3/requests sort of way.
As it stands, by being external, people can depend on base64 0.11, 0.12, 0.13, &c. and it all works fine; over time they’ll tend to migrate to the latest version (dealing with the generally minor breaking changes involved), but it doesn’t matter much if that takes a while.
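To give a concrete sense of what such a migration looks like: the sketch below is written against the crate’s later Engine-style API (the 0.21 rework, which postdates the versions under discussion here), so treat it as illustrative of the shape of the change rather than the exact 0.12→0.13 diff. The older free-function style is left in comments, since the two can’t compile against the same version at once.

```rust
use base64::{engine::general_purpose::STANDARD, Engine as _};

fn main() -> Result<(), base64::DecodeError> {
    // Older free-function style (roughly what 0.11–0.13 users wrote):
    //     let encoded = base64::encode(b"hello world");
    //     let decoded = base64::decode(&encoded)?;

    // Later Engine style: the engine (alphabet + padding) is named explicitly.
    let encoded = STANDARD.encode(b"hello world");
    let decoded = STANDARD.decode(&encoded)?;
    assert_eq!(decoded, b"hello world".to_vec());
    Ok(())
}
```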
The ecosystem is already fractured! Depending on when you added the base64 crate to your project, you have incompatible interfaces! This seems bananas. This isn't, like, serde. It's base64!
Perhaps I didn’t express it in the clearest way; what I meant was more like irredeemably fractured, with no intent or expectation of reconciliation. When it’s split between base64 0.11, 0.12 and 0.13, it’s reasonable to expect people to update to the latest version of base64 over time (hopefully that will eventually just be "1"), and if it takes time to get there, that’s fine.
The incompatible interfaces thing can be a problem in Rust and does require a bit of care, but in this particular case it isn’t a serious problem; the base64 crate is almost always going to be an implementation detail, not exposed publicly. With serde, yes, reaching a stable 1.0 was definitely more important for ecosystem compatibility.
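To sketch what "implementation detail" means in practice (hypothetical crate and function names, written against the newer Engine-style API for concreteness): base64 is used inside the functions, but no base64 type appears in the public signatures, so the version this library builds against never leaks into its interface, and downstream crates are free to depend on a different one.

```rust
// Hypothetical library crate `token-store`: base64 is a *private* dependency.
use base64::{engine::general_purpose::URL_SAFE_NO_PAD, Engine as _};

pub fn make_token(raw: &[u8]) -> String {
    // Implementation detail: the encoding (and the base64 version behind it)
    // can change without breaking any caller, because callers only see String.
    URL_SAFE_NO_PAD.encode(raw)
}

pub fn parse_token(token: &str) -> Option<Vec<u8>> {
    // Errors from the base64 crate are swallowed here rather than re-exported.
    URL_SAFE_NO_PAD.decode(token).ok()
}
```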
If I want to dunk on Rust going forward, I can just say "it is important to know the difference between 0.11, 0.12, and 0.13 of the base64 crate". I think this is a hard argument to rebut. My point is not that Rust sucks, but that the stdlib strategy they've pursued is wrong; not a strength of the language.
Further: this seems like an easy problem to fix, much easier than "Go lacks match statements and generics" or "Python lacks meaningful typing altogether". But my guess is that culturally, it's just as hard to fix.
It’s not important. The differences are tiny, the sort of thing that would probably pass unnoticed in dynamically-typed languages but which Rust’s greater rigour requires be marked as a breaking change. As for knowing, humans aren’t in the habit of memorising every detail of libraries that they only interact with occasionally, and this is such a case.
Besides that, you’ll normally be dealing with the latest version, and if you deal with an older version then you should normally consider upgrading it. I looked through a few projects of mine with base64 deep in their dependency trees, and it was all 0.13 except for one instance of 0.12 via a dependency that needs updating.
You’re stubbornly ignoring the problem of freezing base64 into the standard library, apparently obdurately refusing to even acknowledge that a thick std is every bit as problematic as a thin std, just with a different set of caveats.
If you freeze base64 into the standard library, it’s frozen. No more even slightly incompatible changes can occur, and so you’re stuck with something inferior for ever after. This is exactly the urllib/urllib2/urllib3/requests situation. This is the “standard library is where things go to die” problem.
Rust’s chosen standard library strategy is in no way wrong; it’s just a different set of trade-offs, and incidentally one that experience says is decidedly preferable for a language of Rust’s complexity, wherein there is routinely not one simple and obvious way of doing a thing. That’s in contrast to languages like Go, Python and JavaScript, which face distinctly lower hazards in freezing things into the standard library, and yet even their standard libraries have wilted in places.
I think it's you who are ignoring the problem. You say that by freezing a particular Base64 interface into std, Rust programmers lose the opportunity to take advantage of more efficient interfaces. But that's obviously not the case. std can just host the best general-purpose Base64 interface, a "good enough" implementation, and people who are fussy about their Base64s can use any crate they like. That's what happens with the Go standard library, and, in fact, it's also what happens in Rust (for instance, with the nix crate).
I don't think you've thought this all the way through. This last response of yours seems more reflexive than contemplative; for instance, it suggests that breaking changes are no big deal, because they're small breaking changes. That's not how breaking changes work! Indeed, I feel like everyone who's ever worked in a totally undisciplined package ecosystem (I'm not saying Rust is one of those) knows what the end result of this is: hours of brain surgery figuring out how to reconcile incompatible dependencies-of-dependencies.
> You say that by freezing a particular Base64 interface into std, Rust programmers lose the opportunity to take advantage of more efficient interfaces. But that's obviously not the case. std can just host the best general-purpose Base64 interface, a "good enough" implementation, and people who are fussy about their Base64s can use any crate they like. That's what happens with the Go standard library, and, in fact, it's also what happens in Rust (for instance, with the nix crate).
nix is not a good example. Nix gives _additional_ APIs that std doesn't provide that are specific to Unix.
This isn't about std-having-base64 preventing the use of external third party implementations. This is an absurd interpretation of what chrismorgan is saying. They're saying that std's implementation is frozen forever. We can't change it. One of three things occurs:
1. We got the API right and everyone is happy in every use case.
2. We got the API really wrong and now we probably need to deprecate it.
3. The API is too simple to permit maximal performance, and now people need to know a decision procedure to choose between std and a third party library.
You seem to think (2)+(3) is okay, but I'm actually not a huge fan of either. (2) causes churn and ecosystem pain. (3) causes confusion and becomes the butt of a joke: "yeah you actually shouldn't use the base64 stuff in std since it's so slow, so just use FooBar's implementation from crates.io instead." It's not unlike the JSON situation in the Go world: I am routinely having to replace the standard library's JSON implementation with a third party one that's faster. The difference is that "good enough" in Go-land is much more culturally acceptable than it is in Rust-land. In Rust-land, as I said in an adjacent thread, everything gets sacrificed at the altar of performance. When people come to Rust, they generally expect to get the fastest of everything. Otherwise, it's not good enough to supplant C or C++.
The result of this is that adding things to std is a risk. So when we add things, we need to do a risk analysis and evaluate its benefits against its costs. When you nitpick a specific example like base64, it's easy to lose sight of the bigger picture here. Bringing base64 into std wouldn't solve the problem you're complaining about. (If it did, I have no doubt we'd do it, because it's base64. Probably we can get that right.) What you're really advocating is a philosophical/cultural change that will dramatically increase the size of std, and thus increase the risk we take with respect to introducing things that people will ultimately wind up not using. We don't want to do that. We want the ecosystem to work that out so that they have room to evolve.
> knows what the end result of this is: hours of brain surgery figuring out how to reconcile incompatible dependencies-of-dependencies
This generally only occurs for public dependencies. base64 is very unlikely to be a public dependency. Folks who maintain foundational public dependencies are generally quite careful about pushing out breaking-change releases. (`rand` is a good counterexample, but it looks like they've slowed things down as of late.)
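For contrast, a hedged sketch (hypothetical function name, Engine-era API) of what turning base64 into a public dependency looks like: the moment its error type shows up in your signatures, every consumer has to build against the same base64 as you, and that's where the reconciliation pain lives.

```rust
// Hypothetical: re-exporting base64's error type makes base64 a *public*
// dependency. A consumer matching on DecodeError must now use the same
// major version of base64 as this crate, or the types simply won't line up.
use base64::{engine::general_purpose::STANDARD, DecodeError, Engine as _};

pub fn parse_blob(input: &str) -> Result<Vec<u8>, DecodeError> {
    STANDARD.decode(input)
}
```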
I'm not a huge fan of taking forever to get to a stable 1.0 for core ecosystem libraries, so that's definitely a knock against our approach. I'd rather see folks commit to a stable interface for some years even if the API isn't perfect in order to minimize churn.
I can assure you that my position is not some reflexive HN drivel. I've been on the library team since the beginning. I've thought a lot about this and lived through the trade-offs. And chrismorgan is right: this is a decision with trade-offs. There is no obviously correct choice here. I probably hate bringing in dependencies as much or more than you do.
Just a note that I did read this, wanted to reply, but felt I needed to write a reply that did it justice, and then it fell off my radar. We will meet again on the fields of language battle, burntsushi! (Thanks for taking the time to write this, even though I disagree with a bunch of it.)
You could use Rust editions to allow evolving the standard library, perhaps - they changed the prelude (slightly) in Rust 2021. That might come with its own problems - on the other hand, you can't actually tell the major version of any given library from code (unlike with Go), so changing the definition of a standard library package upon an edition change wouldn't necessarily be any more opaque than what already happens with crate versions.
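For reference, the Rust 2021 prelude change was of roughly this shape; a minimal sketch, relying only on TryInto having been added to the 2021 prelude:

```rust
// Edition 2018: this needs `use std::convert::TryInto;` to compile.
// Edition 2021: TryInto is in the prelude, so no explicit import is required.
fn narrow(n: u64) -> Option<u32> {
    n.try_into().ok()
}

fn main() {
    assert_eq!(narrow(7), Some(7));
    assert_eq!(narrow(u64::MAX), None);
}
```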