Disclaimer: my career has mainly been in compilers/runtimes and other infrastructure-related work, so I'm not much of a business-logic/web-dev kind of guy.
> The end of the day I think the way to get tech too scale, is for companies / academics / etc to really just put the work in.
This is true in the abstract, but my point is mainly about the comparative effectiveness of two strategies for building scalable infrastructure (or scaling up existing infra): keep the bulk of the existing code and invest in making the underlying compiler/interpreter/toolchain better, or progressively migrate the code to a toolchain with better scaling capabilities from the get-go.
From experience, the nature and semantics of a language severely constrain what is "reasonably" possible in the runtime in terms of safety (like class loaders in Java), performance (both CPU and memory), and tooling. Improvements are possible, but they tend to become exponentially more expensive as time goes on.
Now, obviously there is always a tension between developer productivity and infra concerns, but I do believe that newer languages and frameworks offer better compromise points on that line.
I totally agree that in the 2000s the experience of most "enterprise frameworks" sucked very hard, and the emergence of languages such as Ruby and Python was a godsend: a response to the overly rigid and ceremonial ways of the past.
But with time, we have come to understand better what makes a good programming experience and to distill that into better-designed languages and frameworks which offer "better" compromises. For example:
- Instead of dynamic vs static typing, we have gradual typing and type inference
- Instead of GC vs non-GC, we have Rust's borrow checker
- Instead of runtime metaprogramming, we have DSLs and macros
Even more importantly, I believe that a lot of the experience comes from the tooling around the language, and there again the "cargo/dotnet/go" CLI approach, with a single coherent entry point for both package management and project scaffolding, eases a lot of the pain of the old way.
With all of that, we now have languages which offer a better compromise on developer productivity vs infra/performance...
> JVM is blazingly fast
I would say blazingly faster... But compared to C++ (or even Rust), Java is still quite slow, especially for anything compute-intensive.
> Disclaimer: my career has mainly been in compilers/runtimes and other infrastructure-related work, so I'm not much of a business-logic/web-dev kind of guy.
That probably explains your surprise. This question comes up extremely often on HN; the answer is that even if you were to stop all feature development, re-platforming a large organization like Shopify would take years, and that's before considering all the re-training, lost knowledge, etc.
And you'd spend these years having to support multiple platforms so that would only pay off much later, and in the meantime your competitors continue iterating.
In general it's best to stick with the devil you know, unless the new platform interoperates very well with the old one.
> re-platforming a large organization like Shopify would take years
I am not sure what re-platforming means here, but slowly transitioning to a different stack is not the same as stopping and rewriting everything.
> That probably explains your surprise
While I don't write "business logic", I have been involved in a lot of projects to babysit, maintain, refactor, and/or improve business code bases. Sometimes the runtime can only do so much and you have to adjust the user-level code. And from experience it's never as bad as some make it out to be.
> And you'd spend these years having to support multiple platforms so that would only pay off much later, and in the meantime your competitors continue iterating.
Again, I see this very often as well. Software engineering seems to over-emphasize the cost of platform transition while downplaying the operating burden of not modernizing.
> over-emphasize the cost of platform transition while downplaying the operating burden of not modernizing.
Given that I work on the Ruby Infrastructure team at Shopify, I think I have a well informed view on both the cost of transitioning and the operating burden.
When you reach the level of scale where you need to move to distributed systems, you pretty much need to re-platform anyway. That's usually the ideal time to make a decision like this.
I've worked in orgs with COBOL code written in the 70s and 80s still around. The strategy of moving your business over to a more performant/scalable tech in essence means you now have two sets of technologies that just end up sitting on top of each other. I worked on a business line that literally had Java, JavaScript, COBOL, VBScript, C#, and Smalltalk all working somewhere in the business process, counting only in-house code! Businesses never actually modernize everything. They patch old code, write new stuff in new tech, and it all grows in complexity. So investing in the core technologies we already use is, I think, a great idea.
Hard to comment in the abstract, but I think this just shows that the organization did not have a defined strategy and things grew in an ad-hoc fashion. Seems like an orthogonal problem.
The JVM takes bytecode and generates assembly for execution. It also profiles that code and improves it over time.
Sometimes long-running Java bytecode will be faster than statically compiled C++. In many/most cases, long-running Java code on a good JVM will be reasonably close to C++, for some measure of "reasonable".
Garbage collection pauses, memory usage, and JVM startup/warmup are all more detrimental to Java in the speed comparison.
But "quite slow", to me, basically implies 10-100x slower than C++. Sure, you can come up with various benchmarks (lies, damned lies, and benchmarks) to make some corner case look bad.
In general, the JVM is probably about 2-3x slower than compiled C++ code. It certainly is not "quite slow".
Everything depends on the use case, and one can always find special cases as counter-examples to most engineering solutions/choices.
Generalizations are still useful and sometimes "true". C++ (and native languages in general) are faster than Java simply because they were designed that way. C++ chose speed at the price of complexity, safety, and build time. Java, on the other hand, focused on simplicity and safety.
I understand the desire to bring nuance, but "it depends" can also become an excuse for bad tech choices.
> There are plenty of cases where Java will be faster than c++ purely due to the available libraries.
Hard to believe, but in that case I would say that you are comparing libraries, not languages.
If that were true, C and Rust (which are typically faster) would be just as complex as C++ (but no language is).
It chose to jump into the OOP fad while staying low level ("what if we had C with classes?"). That's the cause of most of the complexity. The rest is age, backwards compatibility handcuffs, and kitchen sink stuff from other languages like move semantics and functional stuff.
> Java on the other hand focused on simplicity and safety.
Java chose to jump into the OOP fad while being high level. Ruby and Python did the same thing, and they're also simple as a result. Either that combo, or low level and not OOP, is the correct combination if you don't want to wind up with something as byzantine as C++.
> If that were true, C and Rust (which are typically faster) would be just as complex as C++ (but no language is).
I don't think that this assertion follows from my statement.
It's totally possible that there exist simpler languages which give exactly the same level of performance and expressiveness as C++. Nobody said that the C++ design was optimal with regard to the complexity/zero-cost-abstraction ratio. Bjarne himself thinks so (https://www.stroustrup.com/quotes.html), and both Carbon and cppfront are efforts in that direction.
The point was that when comparing (as of today) Java vs C++, we shouldn't be surprised that a language which has "zero-cost abstraction" as a core principle, and which is willing to be arbitrarily complex, ended up being faster.
With regard to the languages you mentioned: C is simpler because it does less... And Rust is much more recent and completely rethinks the native-language landscape. I am not sure we had the understanding (or even the technology) necessary to create Rust 20 years ago.
But more importantly, I don't really buy the premise of "are typically faster". Some concrete examples would be nice; otherwise, from experience, this statement is wildly incorrect.
> It chose to jump into the OOP fad while staying low level ("what if we had C with classes?"). That's the cause of most of the complexity
Disagree.
> The rest is age, backwards compatibility handcuffs
Very true; if we removed backwards compatibility, C++ would be simpler.
> like move semantics and functional
Paradoxically, adding features can simplify a language by providing a coherent/unified version of previously distinct usage patterns; uniform initialization is the canonical example.
I would say that move semantics also simplify the language by folding resource-reuse patterns into RAII.
Same for C++ lambdas: just simpler syntax for what used to be hand-written functors...
So you seem to be saying that forgoing either low-level control (Ruby, Java, Python) or general abstraction structures (which is what OOP really is) would produce a simpler language.
But C++ is exploring a different design question: trying to have, in the same language, low-level control and abstractions that are general enough to express complex designs, yet can be efficiently deconstructed so as not to impact performance. And the C++ community seems willing to pay some level of complexity for that.
I do not doubt that there are domains where the best libraries are in a particular language, and that language might not be C++.
I am just not convinced that Java is better than C++ in this regard (as there are more domains where the best C++ library is faster than the best Java one).
But I think that's beside the point; we are comparing languages here, so the existence/quality of libraries shouldn't be the main focus (besides the stdlib, of course).
Now, there is more to choosing a tech stack than just the programming language; that includes the libraries, the team's current knowledge, etc. I would even venture that those might be more important than the programming language per se. But I do believe we can still factor those concerns out and compare programming languages, keeping in mind it's only part of the story.
What value is there in that exercise? We never use a programming language outside of a given task/context. Comparing them absent that is navel-gazing.