SystemVerilog is the C++ of chip design. It sucks, but it’s everywhere and tied into everything. Like C++, there are so many ways to write dangerous code. We do need a new chip design language, but it’s going to take something special to move the needle.
30 years or so ago, there were a handful of programming languages in regular use: C/C++/Assembly/Ada/Fortran/Pascal. Now there are 100x more that people use professionally.
I always wondered what would happen if hardware engineers bothered to learn compiler theory. Maybe this could have been a solved problem decades ago.
This is a pretty heavy-handed statement -- there are plenty of "hardware engineers" who know plenty about compiler theory and/or have contributed significantly to it. A similarly flippant comment might be "if software engineers only understood hardware better we might have smartphones that last a month on a charge and never crash".
The challenge with hardware is that unlike "traditional" software which compiles to a fixed instruction set architecture, with hardware you might literally be defining the ISA as part of your design.
In hardware you can go from LEGO-style gluing of pre-existing building blocks to creating the building blocks and THEN gluing them together, with everything in between.
The real crux of the problem is likely our modern implementation of economics -- a CS graduate with base-level experience can command a crazy salary, while some guy with a BSEE, MSEE, and PhD in Electrical Engineering ("hardware") will be lucky to get a job offer that even covers the cost of his education.
Until the "industry" values hardware and those who want to improve it, you'll likely see slow progress.
P.S.
VHDL (a commonly-used hardware description language) is more or less Ada. Personally I think the choice of Ada syntax was NOT a positive for hardware design, even though its type safety and verbosity are a very apt fit for software.
Industries don't value things; markets value things. RTL work doesn't pay well because the products don't sell into markets that scale like software. With the exception of NVIDIA and whatever you call Apple's teams that actually do/tweak hardware, no hardware product has the margins and scale of software. That's not culture, that's physics.
> who know plenty about compiler theory and/or have contributed significantly to it
Also, maybe some RTL people know about compiler stuff (phi nodes, dataflow analysis, register allocation, etc.), but most know nothing except Verilog/VHDL and Perl/Python.
I won't get into a debate about semantics, but even your employer is a prime example -- Xilinx's profit margin per device is staggering by semiconductor standards, but they won't pay what they could. The market doesn't always dictate that; the company does.
And that's one example, but there are other "high-margin" hardware organizations (besides Apple) who still pay HW/EEs less than SW/SREs.
And if you've been in engineering long enough you'll meet some pretty brilliant people, but their names might not be household names or widely known. Lumping engineering into "RTL people" is pretty dismissive.
Claude Shannon, father of information theory, is less widely known for his MS work on 'Shannon decomposition', which FPGA companies used for a long time to break combinational logic functions into multiplexer (MUX)-based implementations. That's not "compiler theory", but it's a building block of a type of logic synthesis that compilers have used. Shannon isn't a "compiler guy", just a smart dude whose intelligence wasn't bounded by one domain or discipline.
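For anyone unfamiliar: Shannon's expansion splits a Boolean function on one variable, f = x*f(x=1) + x'*f(x=0), which maps directly onto a 2:1 MUX with x as the select input. A rough sketch in Python (the example function and names are mine, not from the thread):

    # Shannon expansion of f on its first input: the two cofactors feed a
    # 2:1 MUX whose select line is that input.
    from itertools import product

    def f(a, b, c):
        return (a and b) or c              # example function: f = a*b + c

    def cofactors(fn, *rest):
        # (fn with first input = 1, fn with first input = 0)
        return fn(True, *rest), fn(False, *rest)

    for a, b, c in product([False, True], repeat=3):
        f1, f0 = cofactors(f, b, c)
        mux_out = f1 if a else f0          # the MUX, selected by a
        assert mux_out == f(a, b, c)       # expansion matches f everywhere

Apply that split recursively on each variable and you end up with a tree of MUXes, which is roughly how MUX-based logic synthesis decomposes a function.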
I believe this to be a cultural difference as well as an economic one.
The marginal cost of open source software is $0. But if you license an ARM core, you're paying money for it (like Apple is doing for the M-series processors). ARM has to make money somehow to fund the development of its cores.
Open source software reduced risk -- though it took a while. And the reduction of risk was valued by most companies who wrote software, or relied upon it for their profit stream. Major software corporations at the time (IBM, Microsoft) only increased the risk in the 1980s/90s, because they were mostly rent-seeking.
Most problems were seen over and over and over again, hence they could be solved with software. And when Microsoft failed to solve them, the open source community did.
In hardware there are only a handful of companies, and open sourcing anything might lead to a competitive disadvantage. So whatever tooling AMD has, they're not going to share with Intel.
Also when you're paying ARM for a license, you're getting a good core, and a lot of good support.
> "if software engineers only understood hardware better we might have smartphones that last a month on a charge and never crash".
I mean... is that wrong, though? In 2024, C++ will still segfault -- 45 years after its creation in 1979. No one steering the language seems to be bothered by it. Even Bjarne Stroustrup works in the financial industry now, raking in large paychecks.
Turns out, I'm pretty cynical of software as well.
Look, I'm not saying "Software is the utmost gloriousness that will save humanity!" But what I am saying is that hardware does seem to have fallen well behind software in terms of tooling. VHDL and Verilog are still the tools used to describe hardware today, just as they were 30 years ago.
If you can fix the barrier to entry into hardware, you'll probably increase the demand for custom hardware.
Your example about Bjarne says it all -- it was more in his interest to leave academia / the standards committee / "hardcore work" for something more financially lucrative.
There's maybe another, bigger point here: a lot of "great tools" or "good ideas turned into reality" happened either because the people working on them were extremely motivated, or because they were in the right environment, one that supported that type of creativity.
Sadly, most hardware-oriented organizations are the exact opposite of creative or welcoming to "great ideas", because if something fails, the physical cost to fix it is staggering.
In software you can try something; if it doesn't work, call it a learning experience, refactor, potentially reuse some of it, and have another go at it.
In hardware, if you do that, few people will high-five you for a valiant effort.
Historically, academia was a place that could "birth" new ideas and help explore them with the pure intent of seeing how they worked out. These days I think people are dying to finish school and not live a pauper's life.
This is really a list of the survivors. Tcl was big in the early 1990s, and Objective-C was around on NeXT and other boxes. Perl was rapidly becoming the glue that held the web together. Lisp was embedded into a lot of programs, as it is now, for extensibility. It also overstates Fortran's role, which back then was already much like today's niche.
It's very hard to have an objective view of what programming language success and popularity looks like over that long a time, but I think that today there is a much narrower happy path. Either you're a dynamically typed multi-paradigm language that's mostly imperative and OO in practice (Ruby, Python, JavaScript), a statically typed object-imperative language with curly braces (C#, Go, Java), or Rust (where a lot of people don't realize how good GCs have gotten instead). Not a ton of new Haskell- or SML-inspired languages.
By contrast, in the 1980s and 1990s there were serious questions about which of Pascal, C, Objective-C, Smalltalk, and C++ was going to win out as the dominant language a system is built in. Stuff like Display PostScript depended in a deep way on exposing pretty alien programming languages to the engineers who had to work with it.
The ADA also helps anyone who has kids and a stroller. Also helps people when they are injured and immobile for a time. Dealing with these things in other countries makes you appreciate the ADA so much more.
Code can be indented in a way you didn't mean for it to be. But it cannot be incorrectly indented in the sense I just described, because the indentation changes the functionality of the code. The indentation can never mislead you about the structure of the code.
I've seen one case where it has: the outdent at the end of a nested loop/loop/if was wrong -- it was only 3 spaces instead of matching the 4-space indent, so it looked right at a glance but wasn't. I needed to use pdb to step through it and see it do the wrong thing before that stuck out.
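To make that failure mode concrete, here's a hypothetical reconstruction (invented names; the real code isn't shown, and a literal 3-space dedent under uniform 4-space indents won't parse, so this sketch uses a one-level misalignment). The append was meant to run once per row, after the inner loop, but the bad dedent leaves it inside the inner loop -- and it still parses and runs:

    def row_totals(grid):
        totals = []
        for row in grid:
            row_total = 0
            for cell in row:
                if cell is not None:
                    row_total += cell
                totals.append(row_total)   # BUG: should be dedented one more level
        return totals

    print(row_totals([[1, 2], [3, None]]))
    # prints   [1, 3, 3, 3]
    # intended [3, 3]

The misplaced line sits at a legal indentation level, so nothing flags it; only stepping through it (or a test) shows it running once per cell instead of once per row.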
https://en.m.wikipedia.org/wiki/%27No_Way_to_Prevent_This,%2...