Other countries don’t have the body count of dead children that the USA does. If anything, it proves that fewer guns make you safer.

https://en.m.wikipedia.org/wiki/%27No_Way_to_Prevent_This,%2...


Cheyenne, Wyoming, not Cheyenne Mountain.


Apple won’t let you install software you wrote onto your own computer. Culturally the same as Ma Bell.


I’m really struggling to see how you could possibly make that statement in good faith. I regularly install software I wrote on my own Apple computer.


I have had an Ember mug for a couple of years now and have never used the app. It works fine without it, unless you want to tweak the temp from the default.


I’ve had one for a couple years and never knew there was an app for it.


SystemVerilog is the C++ of chip design. It sucks, but it’s everywhere and tied into everything. Like C++, there are so many ways to write dangerous code. We do need a new chip design language, but it’s going to take something special to move the needle.


30 years or so ago, there were a handful of programming languages in regular use: C/C++/Assembly/Ada/Fortran/Pascal. Now there are 100x more that people use professionally.

I always wondered what would happen if hardware engineers bothered to learn compiler theory. Maybe this could have been a solved problem decades ago.


This is a pretty heavy-handed statement -- there are plenty of "hardware engineers" who know plenty about compiler theory and/or have contributed significantly to it. A similarly flippant comment might be "if software engineers only understood hardware better we might have smartphones that last a month on a charge and never crash".

The challenge with hardware is that unlike "traditional" software which compiles to a fixed instruction set architecture, with hardware you might literally be defining the ISA as part of your design.

In hardware you can go from LEGO-style gluing of pre-existing building blocks to creating the building blocks and THEN gluing them together, with everything in between.

The real crux of the problem is likely our modern implementation of economics -- a CS graduate with base-level experience can bankroll a crazy salary, while someone with a BSEE, MSEE, and PhD in Electrical Engineering ("hardware") will be lucky to get a job offer that even covers the cost of their education.

Until the "industry" values hardware and those who want to improve it, you'll likely see slow progress.

P.S.

VHDL (a commonly used hardware description language) is more or less Ada. Personally I think the choice of Ada syntax was NOT a positive for hardware design, even though the type safety and verbosity are a very apt fit for software.


> Until the "industry" values hardware

industries don't value things, markets value things. RTL work doesn't pay well because the products don't sell into markets that scale like software. with the exception of NVIDIA and whatever you call Apple's teams that actually do/tweak hardware, no hardware product has the margins and scale of software. that's not culture, that's physics.

> who know plenty about compiler theory and/or have contributed significantly to it

also maybe some RTL people know about compiler stuff (phi nodes/dataflow analysis/reg alloc/etc) but most know nothing except verilog/vhdl and perl/python.


I won't get into a debate about semantics, but even your employer is a prime example -- Xilinx's profit margin per device is staggering by semiconductor standards, but they won't pay what they could. The market doesn't always dictate that; the company does.

And that's one example, but there are other "high-margin" hardware organizations (besides Apple) that still pay HW/EEs less than SW/SREs.

And if you've been in engineering long enough you'll meet some pretty brilliant people, but they might not be household names or widely known. Lumping engineering into "RTL people" is pretty dismissive.

Claude Shannon, father of information theory, is less widely known for his MS work on 'Shannon decomposition', which FPGA companies used for a long time to break combinatorial logic functions down into multiplexer (MUX) based implementations. That's not "compiler theory", but it's a building block of the kind of logic synthesis that compilers use. Shannon isn't a "compiler guy", just a smart dude whose intelligence wasn't bounded by one domain or discipline.
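
Concretely, the expansion is f = x·f(x=1) + x̄·f(x=0), and each expansion step is exactly a 2:1 MUX with x on the select line. A minimal Python sketch of the idea (function names invented for illustration, not any tool's API):

    from itertools import product

    def shannon_mux(f, x, *rest):
        # One Shannon expansion step on the first input of f.
        high = f(True, *rest)      # cofactor f(x=1, ...)
        low = f(False, *rest)      # cofactor f(x=0, ...)
        return high if x else low  # the 2:1 MUX, with x as the select

    def sample(a, b, c):           # an arbitrary Boolean function to check
        return (a and b) or (not a and c)

    # The MUX form agrees with the original function on every input.
    assert all(shannon_mux(sample, a, b, c) == sample(a, b, c)
               for a, b, c in product([False, True], repeat=3))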


Despite how verbose it is, I honestly like working with VHDL more than Verilog.


And Ada is really an underappreciated language. Still not perfect, but SPARK makes it super interesting.


Big software companies create and open source stuff which makes them more productive.

Why doesn't this dynamic work in hardware?

Wouldn't "valuing hardware" improve their competitiveness?


Well... no.

I believe this to be a cultural difference as well as an economic one.

The marginal cost of open source software is $0. But if you license an ARM core, you're paying money for it (like Apple is doing for the M-series processors). ARM has to make money somehow for the development of its cores.

Open source software reduced risk -- though it took a while. And the reduction of risk was valued by most companies that wrote software, or relied upon it for their profit stream. Major software corporations at the time (IBM, Microsoft) only increased the risk in the 1980s/90s, because they were mostly rent-seeking.

Most problems were seen over and over and over again, hence they could be solved with software. And when Microsoft failed to solve them, the open source community did.

In hardware there's only a handful of companies, and open sourcing anything might lead to a competitive disadvantage. So whatever tooling AMD has, they're not going to share with Intel.

Also when you're paying ARM for a license, you're getting a good core, and a lot of good support.


> "if software engineers only understood hardware better we might have smartphones that last a month on a charge and never crash".

I mean... is that wrong, though? In 2024, C++ will still segfault -- 45 years after its creation in 1979. No one steering the language seems to be bothered by it. Even Bjarne Stroustrup works in the financial industry now, raking in large paychecks.

Turns out, I'm pretty cynical of software as well.

Look, I'm not saying "Software is the utmost gloriousness that will save humanity!" But what I am saying is that hardware does seem to have fallen well behind software in terms of tooling. VHDL and Verilog are still the tools used to describe hardware today as they were 30 years ago.

If you can fix the barrier to entry into hardware, you'll probably increase the demand for custom hardware.


Your example about Bjarne says it all -- it was financially more in his interest to leave academia / the standards committee / "hardcore work" for something more lucrative.

There's maybe another, bigger point here: a lot of "great tools" or "good ideas turned into reality" happened because the people working on them were deeply motivated and/or were in the right environment to support that type of creativity.

Sadly, most hardware-oriented organizations are the exact opposite of creative or receptive to "great ideas", because when something fails, the physical cost to fix it is staggering.

In software you can try something; if it doesn't work, call it a learning experience, refactor, potentially reuse, and have another go at it.

In hardware if you do that few people will high five you for a valiant effort.

Historically, academia was a place that could "birth" new ideas and help explore them with the pure intent of seeing how they worked out. These days I think people are dying to finish with school and not live a pauper's life.


This is really a list of the survivors. Tcl was big in the early 1990s, and Objective-C was around on NeXT and other boxes. Perl was rapidly becoming the glue that held the web together. Lisp was embedded into a lot of programs for extensibility, as it is now. It also overstates Fortran's role, which was much like today's: a niche.

It's very hard to have an objective view of what programming-language success and popularity look like over that long a time, but I think that today there is a much narrower happy path. Either you're a dynamically typed multiparadigm language that's mostly imperative and OO in practice (Ruby, Python, JavaScript), a statically typed object-imperative language with brackets (C#, Go, Java), or Rust (where a lot of people don't realize how good GCs got instead). Not a ton of Haskell or SML inspired new languages.

By contrast, in the 1980s-1990s there were serious questions about which of Pascal, C, Objective-C, Smalltalk, and C++ was going to win out as the dominant language a system is built in. Stuff like Display PostScript depended in a deep way on exposing pretty alien programming languages to the engineers who had to work with it.


> Not a ton of Haskell or SML inspired new languages.

"The Rust Reference: Influences" includes SML and Haskell.

https://doc.rust-lang.org/reference/influences.html


You mean if compiler theorists learned about parallel processing...


> It sucks, but it’s everywhere and tied into everything.

meh lots of places aren't actually programming system verilog - they're using python/perl to generate it. that's not the same thing.
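
for anyone who hasn't seen that workflow, a toy sketch (module name and helper invented for illustration) -- a python function stamps out the .sv text instead of anyone writing it by hand:

    # stamp out a parameterized SystemVerilog register from a template
    def make_register(name: str, width: int) -> str:
        return f"""\
    module {name} #(parameter WIDTH = {width}) (
      input  logic             clk,
      input  logic             rst_n,
      input  logic [WIDTH-1:0] d,
      output logic [WIDTH-1:0] q
    );
      always_ff @(posedge clk or negedge rst_n)
        if (!rst_n) q <= '0;
        else        q <= d;
    endmodule
    """

    print(make_register("reg32", 32))  # emit the text, feed it to the real flow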


Mangling the quote about regexes… http://regex.info/blog/2006-09-15/247

Some people when confronted with a problem, think “I know, I’ll use a different language to generate another language.” Now they have two problems.


Sideloading, otherwise known as installing software you want on hardware you own.

Love how the term tries to make it sound so suspicious.


The ADA also helps anyone who has kids and a stroller. Also helps people when they are injured and immobile for a time. Dealing with these things in other countries makes you appreciate the ADA so much more.



Being actively updated and maintained when Perl wasn’t helped Python dramatically.


Code can absolutely be incorrectly indented in Python, and to make matters worse, it changes the functionality of the code when it is.
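
A minimal illustration (toy code, nothing more):

    total = 0
    for i in range(3):
        total += i
        print(total)  # indented under the loop: prints 0, 1, 3

    total = 0
    for i in range(3):
        total += i
    print(total)      # dedented one level: prints only 3, once, after the loop

Both versions parse fine; only the indentation decides which behavior you get.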


Code can be indented in a way you didn't mean for it to be. But it cannot be incorrectly indented in the sense I just described, because the indentation changes the functionality of the code. The indentation can never mislead you about the structure of the code.


I've seen one case where it has: the outdent at the end of a nested loop/loop/if was wrong -- only 3 spaces instead of matching the 4-space indent, so it looked right at a glance but wasn't. I needed to use pdb to step through it and watch it do the wrong thing before that stuck out.

