A guide to getting started with embedded systems (yinka.dev)
180 points by memorable on May 13, 2022 | 138 comments



Something I never understood about embedded systems work: why is the pay so bad?

Maybe I was just looking at the wrong job listings, but the technical difficulty (relatively low-level programming, manual memory management, dealing with janky firmware, antiquated toolchains, and incomplete documentation) seemed much harder than the compensation being offered.

At least in comparison to other types of coding you could be paid to do.

The only answer I could come up with was that unit profit margin * sales volume imposed a much lower cap, relative to pure software products?


I suspect it's because much of the field is actually quite junior, relative to the expectations of a senior or principal contributor. It's not until recently that I started to grasp how much breadth you have to have to be functional in embedded systems.

To be functional and independent at it, you need:

- a very solid grasp of C/C++,

- a good understanding of the concept of a state machine,

- a decent grasp of digital electronics and communications protocols, and

- some comfort in debugging (or at least diagnosing) electronics issues.

These skills take years to accumulate.

At the same time, I keep hearing the claim online and IRL (particularly among SWEs who try to get into embedded) that "it's just software - you don't need a good understanding of electronics" to be a competent embedded systems engineer. These are also the same people who get hung up on pretty fundamental problems with simple solutions, like:

- I2C buses missing pullups

- Power supplies not delivering enough current (Hi, everyone who's ever tried to drive a huge NeoPixel panel from a RasPi alone)

- IO logic levels not meeting input specs (e.g. driving a 3.3V input with a 1.8V signal)

Maybe I'm needlessly cranky but I've seen a non-trivial number of people go from "It's just software" to "OMG what do I do help" in the face of very basic hangups like these. It highlights to me that folks who have the breadth to do all of these things comfortably are extremely valuable, and hiring companies know this.


Just interviewed for an embedded dev position. The interviewer told me it is rare to find a candidate who can explain why a linked list is a bad data structure choice for embedded systems. I guess it comes down to the knowledge gap between the CS and EE curricula.

There is a "Computer Engineering" major in my university. C/C++ and state machines are taught in a freshmen year course. However, digital electronics / communication protocols / debugging is never taught in class. Practically everyone who know what SPI or I2C means is either an electronics hobbyist or member of a student competition team.


Part of the problem that I sense in the embedded systems world is that there are often huge differences in philosophical outlook and development perspective, in that many in that world always want to be in the nuts and bolts of everything. For me, the perspective is to be as high-level as possible when you can get away with it, leaving FPGAs and real-time OSs for things that actually need to be on an FPGA or real-time. The problem with this is that there are not very many people who can bridge the gap between FPGAs, real-time programming, and high-level software.

I personally am very interested in the "software for hardware" realm, but it is a little difficult to find people doing interesting approaches. It's usually all Verilog or all VHDL on FPGAs with a smattering of HLS, all C/C++ embedded software, all high-level programming languages with soft real-time on single-board computers or something similar, or somewhere in between with something like LabVIEW. But there's usually almost no overlap between the people that do work in each of those categories.

There's a lot of constraints in these systems and almost all of the innovation has come at sort of the chip level in terms of hardware performance. The design and development process, tools, and experience have seen little innovation, as far as I can tell. Everything remains hard in embedded systems. I don't see any reason why it needs to be that way though.


You're right that it doesn't have to be that way, but there are good reasons it's remained so. Code re-use in ES is fiendishly difficult because every system is a special little snowflake. Moreover, ES teams are fantastically understaffed relative to the complexity of the systems they own.

Because nothing works by default on embedded systems, you have to implement it yourself most of the time. If you can clear that hurdle, your company is unlikely to be open source friendly. If you can clear that hurdle, people on other teams won't have time to adopt it for their systems, so it won't spread. If you can clear all of that, all you've won is a single tool. In the meantime, the rest of the industry has written half a dozen.

There are things you can do here, but... yeah.


I'll bite: why shouldn't you use a linked list for an embedded system? As opposed to a contiguous, dynamically allocated data structure, I mean.


A couple of reasons, both to do with the fact that data memory is scarce on embedded systems. One is that a linked list has high memory overhead (every entry includes a pointer); the other is that making many small allocations can cause memory fragmentation, especially with the kinds of allocators commonly used in embedded systems.

Also if time is scarce, following pointers is slower than calculating offsets, but that's usually not an issue worth thinking about.
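
For example, here's a minimal sketch (sizes and names are made up) of the two approaches for buffering 16-bit sensor samples; the per-entry pointer in the list version can cost more than the payload itself, while the fixed ring buffer needs no allocation at all:

    /* Linked-list version: each 2-byte sample drags along a 2-4 byte
       pointer plus allocator bookkeeping per node. */
    #include <stdint.h>

    struct sample_node {
        uint16_t value;
        struct sample_node *next;
    };

    /* Contiguous version: a statically allocated ring buffer with zero
       per-entry overhead and no runtime allocation. */
    #define SAMPLE_CAPACITY 64
    static uint16_t samples[SAMPLE_CAPACITY];
    static uint8_t head, count;

    static void sample_push(uint16_t v)
    {
        samples[(head + count) % SAMPLE_CAPACITY] = v;
        if (count < SAMPLE_CAPACITY)
            count++;                              /* still filling up */
        else
            head = (head + 1) % SAMPLE_CAPACITY;  /* overwrite oldest */
    }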


Oh ok. Counterpoint: allocating a single node if memory is already fragmented will succeed where growing a vector would fail. But I suppose the point of the question is being able to discuss the issues involved.


You don't have to worry about dozens of other applications fragmenting your memory. You keep your threads neat and your memory neat, and you don't have to worry about failed allocations.

Another reason to keep memory contiguous is that fragmentation can be much more punishing (sometimes), especially when latency is important. On large devices it doesn't matter much if you're under-utilizing the CPU while waiting for RAM, because the OS will do other things and then let you do a lot of work all at once. Embedded processors will have nothing else to do.


I feel like in many embedded systems you'd use a fixed-size vector/buffer rather than a growable one.


I think it's the dynamic allocation that's considered bad? Other than at initialization.

It seems like allocating nodes out of a fixed size array wouldn't be too bad, though.
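
Something like this, as a rough sketch (names are illustrative): the nodes come from a static pool threaded onto a free list, so there's no heap and no fragmentation, and a failed "allocation" just means the pool is exhausted.

    #include <stddef.h>

    struct node {
        int value;
        struct node *next;
    };

    #define POOL_SIZE 32
    static struct node pool[POOL_SIZE];
    static struct node *free_list;

    void pool_init(void)
    {
        for (size_t i = 0; i + 1 < POOL_SIZE; i++)
            pool[i].next = &pool[i + 1];
        pool[POOL_SIZE - 1].next = NULL;
        free_list = &pool[0];
    }

    struct node *pool_alloc(void)
    {
        struct node *n = free_list;
        if (n)
            free_list = n->next;
        return n;                 /* NULL when the pool is exhausted */
    }

    void pool_free(struct node *n)
    {
        n->next = free_list;
        free_list = n;
    }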


I used to interview embedded engineers, and I was one for a very short time. I think the issue is that many embedded engineers are not good at articulating when a linked list is preferable to a contiguous data structure. The times when a linked list is better are getting increasingly rare as "embedded" increasingly means "includes a Raspberry Pi."

On devices with 1 GB of memory, the contiguous structures are usually better, but embedded developers often still reach for linked lists. On devices with 1 kB of memory, trained software engineers will scoff at linked lists and instead waste hundreds of bytes on large buffers sized for peak usage.


I'd say a linked list is preferable when you're writing a `malloc` implementation. It makes the code simple to understand (and small), you don't have any dynamic memory allocation available (because you're writing your system's allocator), and the size overhead is lower than alternatives. Linked lists aren't that bad in processors that lack cache, since then walking the list won't incur cache misses (every memory access is equally slow).

There are alternatives, of course. Your particular target might perform poorly with linked lists, so you might make your malloc use a different internal data structure. Or you might have plenty of flash but not much RAM, so you might trade code size for runtime overhead. Or... There are lots of alternatives, but a linked list is the simple default.
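
To make that concrete, here's a toy sketch of the idea (first fit, no block splitting or coalescing, and alignment is hand-waved; all names are illustrative): the free blocks form a singly linked list threaded through the heap itself.

    #include <stddef.h>

    #define HEAP_SIZE 4096

    struct block {
        size_t size;            /* usable bytes after the header */
        struct block *next;     /* next free block, NULL at end  */
    };

    static unsigned char heap[HEAP_SIZE] __attribute__((aligned(8)));
    static struct block *free_head;

    void heap_init(void)
    {
        free_head = (struct block *)heap;
        free_head->size = HEAP_SIZE - sizeof(struct block);
        free_head->next = NULL;
    }

    void *toy_malloc(size_t n)
    {
        struct block **prev = &free_head;
        for (struct block *b = free_head; b; prev = &b->next, b = b->next) {
            if (b->size >= n) {      /* first fit: hand out the whole block */
                *prev = b->next;
                return (unsigned char *)b + sizeof(struct block);
            }
        }
        return NULL;                 /* out of memory */
    }

    void toy_free(void *p)
    {
        struct block *b = (struct block *)((unsigned char *)p - sizeof(struct block));
        b->next = free_head;         /* push back onto the free list */
        free_head = b;
    }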


Arena/buffer-based allocation is also used to avoid an OOM at runtime not due to the lack of memory per se but due to fragmentation.
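
A minimal sketch of that approach (sizes and names are illustrative): everything comes out of one static buffer via a bump pointer, and it is all released at once by resetting the offset, so the heap can never fragment.

    #include <stddef.h>
    #include <stdint.h>

    #define ARENA_SIZE 2048
    static uint8_t arena[ARENA_SIZE] __attribute__((aligned(4)));
    static size_t arena_used;

    void *arena_alloc(size_t n)
    {
        n = (n + 3u) & ~(size_t)3;        /* keep 4-byte alignment */
        if (n > ARENA_SIZE - arena_used)
            return NULL;                  /* arena exhausted */
        void *p = &arena[arena_used];
        arena_used += n;
        return p;
    }

    void arena_reset(void)                /* free everything at once */
    {
        arena_used = 0;
    }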


I recently interviewed for an embedded position, and had a set of similar questions. The right answer is, of course, "It depends".

If you've got 6 bytes of free code space and/or you're scrambling for RAM, you look at everything for space savings.

If the hardware team comes to you apologetically and says, "We're sorry, but for cost reasons we have to /double/ the amount of DRAM and flash and give you a faster, more power efficient SOC" then you may not have to worry about this level of detail.

Both of these have happened to me on prior projects. There are no fixed answers.

(Whenever I've gotten one of these types of questions in an interview, I try to have a broader conversation on the subject rather than supply the answer they are looking for. If that goes badly, it's a big red flag).


That's not a particularly good interviewing question, in my opinion.

College curricula vary wildly in quality and content. I don't know of one that quite teaches you everything you need to know to be a functional embedded developer.


Seems like a linked list (and set dressing) is circumstantial.

Is it because linked lists can be slow and can sometimes pocket arbitrary data?

I feel like strong typing and limiting pointers in favor of runners combats this.


> I suspect it's because much of the field is actually quite junior

Most of the people I know in embedded development are greybeards. There aren't a lot of younger people going into the field. The schools don't seem to be prepping a lot of people for these kinds of jobs. (not sure what we're going to do because a lot of these guys are in their 60s and about ready to retire)

> These skills take years to accumulate.

Indeed. Which is why the low pay is all the more puzzling, but I have theories: As mentioned above, most embedded devs are greybeards. They don't move around much. They probably aren't as familiar with the much higher pay people are getting at the FAANGs, etc. They aren't as likely to have a LinkedIn profile or if they do they don't spend much time there. As an older dev myself, it's not like I'm clamoring for higher pay: my mortgage payment is quite low compared to rent (and will be paid off completely next year), my cars are old but paid off, don't have any debt other than the soon-to-be-paid off mortgage. All this to say that older devs maybe don't make as much money because they're not as aware of market rates and they don't really feel like they need a lot more money if they're happy where they're at.


I know at least five people doing embedded systems work in the greater Boston area who are comfortably pulling in north of $200k a year.

The pay is there. I'm not sure why there's this persistent notion that it's not.


That's well above the 95th percentile for Boston, and a bit above the 90th for the South Bay. As someone who is in embedded above that level as well, I can assure you that it's a highly abnormal situation for the vast majority of embedded devs. Most companies pay far below what they compensate other SWEs at. $100k-200k total compensation is about the norm for the majority of the market.


We can't even find anyone who knows what a cache line is...


That's what a CPU snorts before it goes to catch the bus.


Maybe you shouldn't be interviewing people on specific terminology. People can learn. Experienced, competent people can definitely learn. Maybe they know the concept, but not the term. Maybe they need a little guidance about what you want out of the question. Do you want them to know "cache line" the phrase, or do you want them to understand memory structure? Etc.


> Maybe you shouldn't be interviewing people on specific terminology. People can learn. Experienced, competent people can definitely learn.

I generally agree, but "cache line" is a pretty low bar, no? I'm a web developer with no formal CS background and I've known what a cache line is since the first year I started to learn to code. Every textbook on my shelf that even tangentially relates to the kind of work embedded devs do makes reference to cache lines, and god knows how many articles I've read over the years.

Training people from what is probably nothing is a pretty big bet on the part of a team and employer...


We all look up super simple stuff we learned in year one of engineering school. I'm not sure I would call reading a little "training". I just think interviewing should be a little more forgiving. Interviewers like to gatekeep a little with little gotchas in an environment that is completely different from day-to-day work.


> We all look up super simple stuff we learned in year one of engineering school. I'm not sure I would call reading a little "training". I just think interviewing should be a little more forgiving. Interviewers like to gatekeep a little with little gotchas in an environment that is completely different from day-to-day work.

I don't disagree with any of this, I just think there's a certain "base" of knowledge for certain jobs that needs to be so well understood that it's automatic. For an embedded engineer I think "cache line" is part of that base, in much the same way I would consider it a hard requirement for any incoming frontend web developer to know what the DOM is.


Somewhat disagree, depending on the embedded subset. Lots of MCUs don't have any cache, so they don't have any cache lines. If you're programming some sort of embedded Linux system you'll have a more powerful processor and almost certainly have cache.

So if you're hiring for MCU work (bare metal or RTOS) the concept of a cache line isn't important. If you're hiring for application processor work (often Linux), it's very important.


Engineers typically get paid less than developers. I used to joke back in the 90's and 00's (and still kind of do) that I'd quit and become a DBA (typically 2-3x the salary).

I look around at the six-figure web dev jobs and it honestly breaks my heart. I look at embedded and engineering roles (ones that interest me anyway) and they're rarely six figures. I just don't have the mental fortitude to deal with web dev monotony.

The only other time I got bad pay was game development.

So, my experience is: the more technical the job, the less you get paid (but the more interesting/demanding it is). Look at middle management - why do they get paid better? (Btw, I've been a CTO too.)


"I just don't have the mental fortitude to deal with web dev monotony."

If you move to be the guy that interfaces between developers and business people, it's not monotonous - each group is retarded in its own unique way.

You either stress over it, or you don't take it too seriously and have fun :D


>If you move to be the guy that interfaces between developers and business people

Any idea how you can do that? I've noticed companies are very picky about who they put in those spots, like needing you to have the same experience in the industry or connections in the company.


So, in my experience there isn't a single job with a defined remit that does this; instead, a range of positions do bits of it.

Business analysts, solution architects, presales engineers, product owners, and sometimes even project delivery managers and developer team leads have to deal with this.

You might have better luck getting into one of these positions at a consultancy, because there each customer has a separate set of requirements and they often don't have enough of these people.


>> web dev monotony

This is why some web devs have 3+ full-time jobs.


I have thought about this myself. I do not think it is directly related to China; I think it is just not trendy to be an embedded developer. The reason is that it is extremely difficult to start a hardware company (even more so with the chip shortages of the last 2 years), because one needs to take care of sourcing components that are rare or unavailable for months, creating the firmware for the board, manufacturing it, and delivering it to the customer.

Working from a home office is often not possible, because soldering stations, oscilloscopes, power supplies, etc. are needed. The tools are also often not cheap: a debugger can cost $10-15, but for bigger projects the debuggers are often 100x more expensive, and the compiler can also cost a couple of thousand dollars. Then, if there are bugs in the software (there are always bugs), it is again difficult to update it (this has improved in recent years with all the connected devices/cars etc., but it is still difficult). And as far as I know there are not that many hardware companies that have big exits.

On the other side, being a web developer does not require being in an office or buying expensive hardware and tools, nothing is physically delivered, and in the end there is a chance to be part of a big tech exit. Big exits -> big salaries.


> I do not think it is directly related to China. I think it is just not trendy to be an embedded developer

Not true at all. Pay has nothing to do with trendiness; it's the other way around: being a webdev is trendy because of the high pay, great benefits like WFH, and the low bar to entry. Meanwhile, embedded dev has a higher bar to entry due to the difficulty of the work, low relative pay, and low flexibility in terms of WFH, making the jobs untrendy.

And no, the outsourcing of most HW-related dev work to China has had a big impact in lowering embedded dev wages in the west, coupled with the fall of the great electronics giants in the west. When I started uni to become an EE, there were Nortel, Blackberry, Siemens, Ericsson, Nokia, Motorola, Sagem, Philips, etc. developing HW and mobile devices in Europe, or the west generally, and they were hiring like crazy. Fast forward 6 years to when I finished my Master's, and most of those companies had either gone bust, become just brand names for Chinese OEMs, or become sweatshops for a far-east workforce, keeping only some of their sales and senior management in the west.

The EE market in the west has gone way down in the last 10-15 years, in comparison to the web dev market, which went way up. The only western HW companies making insane profits are Apple and semi titans like ASML, Intel, Nvidia, AMD and Qualcomm, while the Japanese, Korean and Chinese companies fight for the rest of the scraps, and most of the European ones have thrown in the towel completely. The commoditization of HW and FW dev has meant the commoditization of dev wages as well.

After graduation, some of my colleagues went into mobile app dev (the iPhone had been out for a few years but was far from becoming the norm) and are now making several times what I do as an embedded dev at the same level of YoE. Talk about betting on the wrong horse. I still can't stop kicking myself for choosing such a poor career path, and wonder if I can still switch, as most companies seem reluctant to hire a thirty-something embedded senior to do junior web dev work usually done by a teen or twenty-something out of a bootcamp.


'most companies seem reluctant to hire a thirty-something embedded senior to do junior web dev work usually done by a teen or twenty-something out of a bootcamp.'

If you are based in UK, drop me a line.

If you are an embedded developer with experience, you probably understand the difference between a linked list and an array; I am interviewing 'senior' web devs and half don't.

Many 'boring' companies, like corporate accounting or whatever, need developers badly and pay maybe 70-80% of what 'fashionable' ones do. It could be a good place to break in.


In my opinion age is not an issue, but lack of experience in the field is. If you want to get a good compensation in a different field, you need to gain experience there. This means either developing something in your spare time, or finding a job that is willing to let you learn on the spot (probably the pay will not be that great at first).


I work at a big tech company doing web-dev. One of my coworkers is 36 and was a marketer till he joined a boot camp three years ago. I don't think age should be an issue.


As someone who works in embedded systems, the answer is "it's not." Just like every other sort of software development, it depends what you're doing, who you're doing it for, and how good you are.


> the answer is "it's not."

From your answer I take it you're well compensated. From another embedded dev who is not well compensated, may I ask what market you are in?


Not the parent, but in my experience, working in a regulated field the pay tends to be better since the stakes are higher and there is a larger push for quality rather than "ship it". I work in the Medical Device space doing embedded work and find the compensation to be very good.


I recently made a change from a small embedded company that focused on high security applications, such as automotive and aerospace, to doing embedded development at one of the large tech companies. I was making about 200k at the first company and almost double that now.

It was difficult to find a company that compensated well. A lot of the non-tech companies that rely on embedded developers like automotive and consumer electronics companies pay shit.

The large tech companies all have need for embedded developers, for example Apple, Google, Nvidia and Amazon.


>I was making about 200k at the first company and almost double that now.

That's absolutely insane to me. In the EU I make 55K (considered a great mid-senior wage in my area) and could make up to 75K as a senior if I move to an expensive city in Germany and work at a top company, but that's the top end of the market. There's no amount of interviewing or job hopping that could get you further as an embedded dev IC here. Higher pay is reserved for management positions, which are tough to get. Even making six figures as an embedded IC here is impossible, let alone 200k+.


I consider myself a pretty solid, experienced embedded developer (20+ years continuously, starting with Motorola, then mostly embedded Linux), but I just could not pass the leetcode (or similar) tests from FAANG. I actually passed once at Amazon but was told I lacked 'leadership'; strangely, some of my old team members (a team I was the tech lead for, and I also founded a startup) got offers at the same time. I was rejected by Amazon (I was later told that one of their interviewers just disliked me no matter what, even though the rest all wanted me). Since then I've lost interest in preparing leetcode to try at the other big ones. At smaller companies the compensation is indeed not as good as I would like.


That's pretty atypical, is it not? The vast majority of software engineers are not paid even close to that, and software engineers almost always get paid more than EEs, computer engineers, control engineers, embedded developers, etc.


The small company I worked at was definitely an anomaly. They treated all their employees well. I got a very good raise every year.

When I was looking for a new job last year I had a hard time finding any company that would even match my compensation. Luckily for me, large tech companies have an absurd amount of money these days and are paying absurd salaries.


I'm not the person you're asking, but I'm well compensated and I work in Cambridge, UK. Also known as "Silicon Fen". There are a lot of embedded jobs around here and they are pretty well paid. Yes you'll still do a little better if you go to London and work for Amazon (etc.) but it's competitive.


From my experience in the past doing embedded, then web (and now mostly nothing), I noticed how much more accessible and open web devs were. Information travelled fast, so it wouldn't surprise me if in the embedded world today many still don't know where the demand/money is.


>Just like every other sort of software development, it depends what you're doing, who you're doing it for, and how good you are.

Are you comparing apples to apples?

Does a mediocre Java developer make as much as a mediocre embedded systems developer?

Does an elite Java developer make as much as an elite embedded systems developer?


In my experience, the old-school electronics companies believe "if you can't stick a label on it, you can't sell it". In their eyes software is a necessary evil, but it only costs money and doesn't generate any revenue. This is obviously wrong, as the products are worthless without the software and would probably sell better with better software. If they recognized the value of the software then they would probably pay their software engineers better.


Pay has never been correlated to the difficulty of the job. It's simply how much money you can generate for the business. Software is but one part of an embedded system, whereas it is the whole of a pure software product.


>It's simply how much money you can generate for the business

I find your excessive faith in businesses disturbing.

Markets aren't rational [1] and the wage-labour market isn't either. Sometimes there's no good reason for something, only a chain of consequences that doesn't necessarily make logical sense from a utility perspective.

[1] https://www.cnbc.com/2017/12/21/long-island-iced-tea-micro-c...


You need to look at the business as a whole. Software-only businesses are profitable because the per-unit cost approaches zero as your volume increases past the development amortization cost. Say profits on the order of 80%+.

Embedded systems are part of tangible products. They have an order of magnitude greater non-recurring engineering costs, then the cost of components, manufacturing, packaging, inventory, distribution, and after-sales service and support. Due to the huge up-front costs, volumes are generally limited by capital funds. So the net profits are likely to be in the 15%-20% range.

When you look at the differences, it becomes clear why tangible product companies are unable / unwilling to pay like software-only companies.


One simple reason: China. Also, it's not as fancy as some other software dev domains like web. No one wants to pay for pricey hardware, and most money is in software/cloud/SaaS. Also, to get started you need seed capital of >$1mn, unlike software, which a single person can fund with a Patreon.


Nearly half a century after The Mythical Man Month, we still don't know how to manage software development except by throwing money at it. And the scale of software is limitless in the amount of code, complexity, and technical debt that can be added. Management can't guess which parts of software are adding value, rather than collecting rent. The principal-agent problem looms large.

In contrast, the scale, and thus the complexity and cost, of embedded tends to be limited by physical constraints.


Even with those limitations, doesn't bare-metal programming optimization compound in later abstractions?

Seems worth the investment from a rational standpoint.


I think the main reason is that untrained EEs end up doing their best to ship product. Since the rise of outsourced hardware manufacturing, domestic demand is simply low now. Also, most PHBs have zero understanding of low-level systems, and it's easier to quantify value in visible areas. Basically, the kids with 37000 dependencies in their nodejs steaming pile of awesome… are more impressive than some graybeard's 4 C library solution that needs updating once every 5 years.


> I think the main reason is that untrained EEs end up doing their best to ship product.

I seriously doubt it. In all my EE gigs, untrained EEs weren't shipping anything; they were there learning and helping the graybeards ship. In fact you need a lot of industry experience and learning from the graybeards to ship products that will actually sell for profit and not flop due to some weird edge case that didn't cross the junior's mind to look for before shipping, since the answers to the most difficult issues come from years of blood, sweat and tears burning the midnight oil with an oscilloscope in hand, not from a Google search on Stack Overflow.

If you're into actual product development and not consulting work, getting untrained EEs to ship products is the shortest path to your company going bust in no time.


Anecdotally, I think it depends on where you work, as the "churn and burn" culture in large firms tends to push out workers every 1.5 to 3 years on average, i.e. any long-term planning is wishful thinking, and managers rarely allocate resources for general DFM training.


offtopic, but what's a PHB?


Pointy-Haired Boss, Dilbert reference.

https://www.wikiwand.com/en/Pointy-haired_Boss


Positions requiring specialist, uncommon knowledge or long training are often "penalized" in many sectors. It pays more to be a typical "CRUD writer".

And this is not specific to IT - many academic researchers have to make ends meet with almost minimum wage.


I suspect the answer stems from the commoditization of the hardware, in which the meaningful differences for a particular good are so few and far between that no one cares much about which brand it is except that it fulfills their needs. And as you said, the reduced profit margins spread down the chain.

I've also wondered how employers see them from the business perspective. It's hard to determine a 'business profit contribution' value attached to the 'quality of work' they produce except that things don't break down, and stuff gets done on time.

Furthermore, if the embedded code is the sort where once it's done, it's done, and little to no maintenance is required, the business people might see no need for the guy to be around. Maybe someone could clarify if this is true.


> Something I never understood about embedded systems work: why is the pay so bad?

It's about in line with what electrical engineers make on average, slightly higher, at least in my location. The market places more value on people who can write code to give you a nice shopping cart interface over the ability to design digital control systems for engine control units to make sure your car can drive properly. It's just what we as contributors in a market value as truly important to us. To give some more insight: Look at the value of Twitter. A control system for a weapons-grade nuclear centrifuge - a weapon of mass destruction - is valued less than a 280 character limit.


I think it's just that fewer are needed, and it competes with 'other types of coding you could be paid to do' as you said - sure domain experience helps a tonne, but basically the 'core competencies' are the same as web/desktop/mobile stuff. Many of us doing the latter trained as EEs even.

So, from that huge pool (big supply) there's plenty for the few that are needed (little demand); enough even that will choose that domain over a better-paying one for interest's sake, so it ends up not having to pay as much.


Indeed, the pay for embedded development is curiously low especially given the wide range of skills you need - both software and hardware knowledge as well as debugging skills that take years to develop.

I think there's been a lot of market distortion over the last 15 years or so due to the way money has flowed. VCs haven't been very interested in funding startups making hardware so they tend to be self-funded as in funded on a shoestring. There's not the money sloshing around for these companies.


What is the pay like for the "hardware" side of these products, that is, PCB design and the like? I mean, as far as I know there are not nearly as many people in that industry now compared to software development, so I would assume prices go up... but with tons of international competition, the price will go down. (Competition as in... people will happily take Chinese-designed circuits or whatever, but not so much software.)


It's no better. Part of the low pay, relative to pure software, is the much greater head count for all the tasks involved in designing and manufacturing physical goods.


I'm an EE, and yeah, I don't get why the pay in the field is so bad. The stakes are higher too. If I mess something up I blow a very very costly PCB or even worse - an ASIC. If a software dev messes up it can cause millions of dollars of damage, but for the most part you can ship half-baked projects and patch them up as you go. You just can't do that in EE.


>I'm an EE, and yeah, I don't get why the pay in the field is so bad.

Also EE here. Because in EE you need to invest a lot of funds into equipment, prototyping, manufacturing, RMA, shipping, logistics, customer support, etc., only to make pennies in profit when you actually sell your finished products. It's the opposite for SW dev work: developing it and selling it costs you next to nothing, you just need a laptop and an internet connection, and once you've sunk the cost of developing it you can scale it virtually infinitely and sell it across the globe with low costs and high margins.

Long story short, it's a lot more profitable to make apps where you get consumers to click on ads, buy stuff, or invest their savings into meme stocks or crypto, than to build and sell physical products to them (unless you're Apple).


R&D is more expensive, fixing errors is more expensive. It doesn't scale like software does. You want to produce 100x more product? You'll need a big organisation compared to a software company.


Software eng. unsatisfied with $ -> leaves company -> makes dumb clone, steals a subset of customers from former co -> iterates, grows, might end up becoming a threat

Hardware eng. unsatisfied with $ -> ask their friends in the other 2 cos in the same niche how much they are being paid -> cry

Bootstrapping a software business is trivial compared to any other eng. business. That, plus the internet's existence, is what makes dev so well paid.


> Make dumb clone

In the hardware world that translates to contracting or hiring a Chinese team to clone your competitor and eat the lower end of the market. This is especially true for development boards, hobbyist and supply chain stuff. Digikey vs LCSC, Sparkfun/Adafruit vs AliExpress/Taobao, Arduino vs clone boards.

Sometimes the dumb clone company gets big enough and becomes a legit player in the Anglosphere. Say Seeedstudio or ALINX (FPGA dev boards).


> Something I never understood about embedded systems work: why is the pay so bad?

I think it’s because the software is generally simply considered part of the BOM and thus a cost to be reduced. Also, the hardware side typically rule the roost in such projects, and hardware engineers tend to look down on software developers because, after all, how hard is it to just type code in (if you’ve ever looked at code written by EEs…ugh).

As supporting evidence: the pay is better at companies that do take software seriously. Examples: at one class, Cisco; at another, FB, Google, Apple…


I think FB, Google and Apple simply being more profitable than Cisco has more to do with it


This has been well answered by others: embedded software developers are adjacent to engineering and are seen as engineers like any other (usually their pay scale is higher, since software is in demand).

Additionally, I'd add that many embedded developers work for hardware companies that don't understand or care about software; why pay so much more for something you don't get, when they already think they're paying "way more" (it's more than we pay EEs!)?

Finally, embedded software engineers often do quite well. We don't pay people based on how challenging their jobs are.


Is it because hardware doesn't scale as well as software? For embedded development, you need some kind of hardware that needs to be sold. A computer or a phone is something everyone already has. An embedded microcontroller? Not so much.


I am going to break with the consensus here and present an alternative: embedded systems are easier than "big software".

With much of embedded you don't need to worry about updates, maintenance, logs, security, multiple cores.


I feel very confident saying that the reason is supply and demand, despite the fact that I know zero about the embedded systems labour market.


Could it be due to competition from China?


Having worked with outsourced Chinese firmware developers: no. Absolutely not.

The quality of the code submitted to me was atrocious.

Any halfway decent Chinese firmware/embedded developer is likely going to work for a bigco where they get paid well, like Tencent or Huawei.


Tencent and Huawei are still Chinese companies though, just like parent said. They just replaced the likes of Nortel through IP theft and undercutting them on cost.

So parent's point still stands that Chinese competitors along with customers in the west who prioritized low cost over everything else, destroyed the embedded dev market in the west.


Mm, don't agree. You can earn a very handsome amount of money working for Tencent and Huawei.

I'd accept "undercutting", but that's a corporate/macro pricing trend, not a trend for salaries among embedded devs in China. I suspect many companies who are undercutting pricing are getting funded by the Chinese government. They're still paying people quite well, just because they need talented engineers just as much as their competitors do.


>You can earn a very handsome amount of money working for Tencent and Huawei.

Maybe you can, but they don't exist where I live in the EU, and that's the only market I can speak of.


> Something I never understood about embedded systems work: why is the pay so bad?

Because there is no demand. Because today, you get all the software along with the hardware. And that software is already written somewhere in China, by the same people who make the hardware.

Google, Facebook, Snapchat, Microsoft, and Amazon all run their hardware R&D in China for the obvious reason that the hardware is already being manufactured there. They all copy Apple, and surely think that they can become Apple if they do as Apple does.


Embedded is not any different from virtually any other area of software development, outside of certain hot pockets of Web app development.


I'm an embedded developer, and I've recently started teaching a coworker about microcontrollers. I think it's most helpful to first understand the basics of digital electronics. Transistors can be wired together to create logic gates, and logic gates can be wired together to create flip flops. A flip flop can be thought of as a single bit of RAM. Logic gates and flip flops can be used to create all sorts of useful circuits like counters, shift registers, adders, multipliers, multiplexers, etc.

With this foundation, one can understand a microcontroller as a big digital circuit. A CPU executes instructions by using a digital circuit. The instruction's opcode selects the appropriate circuit to operate on the instruction's operands. Microcontrollers also have peripherals like timers, serial communications (SPI, I2C, UART), and analog-to-digital converters. All of these are digital circuits that are configured by writing bytes to registers. Those registers can be thought of as the inputs and outputs of those circuits, just as if they were built with discrete flip flops and logic gates. In fact, many microcontroller datasheets provide diagrams of the logic circuitry for these peripherals.

Once you can truly understand how the hardware operates, writing the code to configure and operate that hardware is pretty straightforward.
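
To illustrate (the addresses and bit positions below are invented for a hypothetical MCU; on a real part they come from the datasheet or the vendor header), "configuring a peripheral" really is just writing bits into memory-mapped registers that feed those circuits:

    #include <stdint.h>

    /* Hypothetical GPIO peripheral at a made-up base address. */
    #define GPIO_DIR  (*(volatile uint32_t *)0x40001000u)  /* direction register */
    #define GPIO_OUT  (*(volatile uint32_t *)0x40001004u)  /* output data        */
    #define LED_PIN   (1u << 5)

    int main(void)
    {
        GPIO_DIR |= LED_PIN;          /* make the pin an output */
        for (;;) {
            GPIO_OUT ^= LED_PIN;      /* toggle the LED          */
            for (volatile uint32_t i = 0; i < 100000; i++)
                ;                     /* crude busy-wait delay   */
        }
    }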


For an excellent course that covers that full journey (and more): https://www.nand2tetris.org/.


That site links to the 2005 edition of the book The Elements of Computing Systems, but an updated edition was published in 2021. (I found the original edition excellent years ago. I'm not aware of how the new edition differs, only that it exists.)

https://mitpress.mit.edu/books/elements-computing-systems-se...


> an updated edition was published in 2021

Good stuff. I wonder if the author has recorded new lectures. Probably not, since the website still links to 1ed...


https://nandgame.com/ is also great

edit: oh, and there's new levels. there goes my afternoon...


Approximately how long does a curriculum like that take?


I got a pretty solid foundation in embedded from ~1 year worth of uni EE courses. Learn a bit of Arduino, learn a bit of VHDL, fiddle with some registers and you're 90% of the way there. The last 10% comes from finding a job and just spending a lot of time learning all the other little pieces you need.


This is a very cool and playful introduction to the basics of the topic:

https://nandgame.com/


You would cover all of that material in an intro to digital logic course (no prereqs). I found it a very enjoyable topic to learn. I don't think it would be too difficult to learn on your own. I would try to find a course that includes a lab component; testing things out on a breadboard is very fun. Sole caveat: debugging circuits can be a challenge if you don't have anyone to help while you're learning.


You say "no prereqs" but I can assure you, it takes more than a single intro course to go from "I don't know what programming is, or the difference between voltage and current" to understanding a microcontroller at a logic gate level.


Re-read the list of topics that person provides. They are all authentically covered in a single no-prereqs course. I took an 'Intro to Digital Logic' course a couple of years ago, hence my confidence about this.


I learned most of this in a first pass through Crash Course's Computer Science playlist. Episodes 2 – 10 discuss the electronics [1].

I normally dislike videos and prefer books, but these were well done. They did an excellent job of pointing out the specific abstractions of each component as they are introduced.

[1] https://www.youtube.com/watch?v=LN0ucKNX0hc&list=PL8dPuuaLjX...


As a recovering embedded engineer (though I am a CS grad), I strongly suggest no one enter the field: the pay is miserable, and the job market is saturated with EE grads who wouldn't be able to do regular development without a major time investment.

I'm switching to full-time C++ backend and GUI development (so far it was 70% embedded and 30% this). I find that there is much more complexity in developing server-client systems, which optionally integrate with other systems, than there is in writing low-level code (maybe I'm just good at writing low-level code). It's more intellectually rewarding, and you get to use higher-level concepts. There's always room for advancement, and computer programming is deep in a way that embedded engineering never will be. I've worked on embedded systems for 6 years and I've tried pretty much everything there is to try, from the lowest level to graphics from scratch, RTOS development, and networking.


I agree. I've done decades of embedded and thick-client scientific computing, going back and forth between the two.

Embedded pay isn't worth the pain of fighting hardware and working with people that don't know how to write code, writing code.


What would you suggest they study in order to learn how to write code properly? I've been working in chip design for a few years (writing verilog code instead of software code) but I'm thinking of trying to make a move into the software world


> the job market is saturated with EE grads

Can confirm, I'm one of the only two CS grads at our robotics company and I'm vaguely in charge of all the web and GUI related stuff that EE grads don't know much about.

Wages in other fields aren't that much higher here in Europe though.


I think the issue is that there is a lack of companies competing for talent.

Basically the job market consists of Apple, Lab126 and perhaps Logitech in the US. The rest of the job market is either in Asia or in niche industrial markets or early stage robotics companies.

That said - I do think having an embedded background - particularly embedded Linux - can actually improve your systems level knowledge - since you have to get into Linux drivers, the networking stack, package managers, etc.


I'd read a blog about your experience


Do you have some info on C++ backend development? What is there to look for? I’ve been looking to get into a more low level development job.


IIRC C++ backend dev is mostly used in HFT (think of the stock market) so it's a relatively niche, if well paying, market (NY, London, Chicago, Amsterdam)


I do embedded stuff from time to time in the amateur rocketry hobby. The hardest part for me was getting the initial toolchain set up from scratch. I wanted to do it the way the "pros" did it, with C, make, a compiler, and a loader - no training wheels. Going from zero tools/code to a blinking LED (the hello world of embedded) was pretty tough.

It never dawned on me that different buses run at different speeds and you have to literally set up the clocks. Heh, and before that you have to load the assembly to jump to the address where main() is located (basically your own boot loader). I wouldn't say it's harder than web dev per se, but it's just very different.
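
For the curious, the core of that "your own boot loader" step is only a handful of lines. This is just a sketch assuming a Cortex-M style part; the _estack/_sidata/_sdata/_edata/_sbss/_ebss symbols are placeholder names that would come from your linker script:

    #include <stdint.h>

    extern uint32_t _sidata, _sdata, _edata, _sbss, _ebss, _estack;
    extern int main(void);

    void Reset_Handler(void)
    {
        uint32_t *src = &_sidata, *dst = &_sdata;
        while (dst < &_edata)              /* copy initialized data from flash */
            *dst++ = *src++;
        for (dst = &_sbss; dst < &_ebss; )
            *dst++ = 0;                    /* zero the BSS section */
        main();
        for (;;) ;                         /* trap if main ever returns */
    }

    /* Minimal vector table: initial stack pointer, then the reset vector. */
    __attribute__((section(".isr_vector")))
    const void * const vector_table[] = { &_estack, (const void *)Reset_Handler };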

Then I just bought a feather from Adafruit, started using micropython, and never looked back. Much better, although a buddy of mine holds his nose whenever I talk about it.


Yeah, there's two parts to it.

The first is "Can you build your code, for the correct target with the right cross compiler. Is your linker script set up correctly etc.". This isn't too bad once you've done it once, as long as your toolchain isn't proprietary and the target is sane.

But the second part is painful every time, which is "what code do I need to run at startup so that my hardware is correctly configured?" This is before you even write a driver. It is so specific to the particular chip, and even to what you're doing with it, that it always takes time to get right.

I'll tell you that as a "pro" in this field, any time I can skip the second part I will do. STM32 chips for instance need a lot of clock config and selecting the right muxes because their chips are very flexible as to which peripherals can come out of which pins. They have a graphical tool called "STM Cube" where you can select your chip, configure the function for each pin and then it'll generate you a full project with startup code. We threw away the project and just used the startup code, regenerating it occasionally when we changed something. It saved so much time and suffering and is a big reason why I'll always choose an STM32 if I can.


Pros often use commercially licensed IDEs like Keil or IAR Workbench and a bunch of other tools that only run on Windows (no matter how much they dislike Windows personally). Setting up a toolchain from scratch sounds like what hackers (i.e. those who are not professional embedded devs getting paid for such work) would do.


Exactly. Setting up your GNU based embedded toolchains and make-files from scratch is something that hackers/hobbyists do.

In production, where you have deadlines to build firmware that is set to ship on millions of units, and your company's future is on the line, you just buy however many Windows Keil/IAR licenses you need and get on with your work, regardless of how you feel about proprietary software.


Curious if you work in the field. Because in my experience, it's not like that at all. YMMV, I guess.


Right :-)

Remembering the lengths a former employer would go to fit code into the 32kB limit of a free dev tool because they didn't want to shell out for the unlimited paid version.


No offense, but that sounds like a poorly funded company.

We had the opposite experience: we paid for IAR since that was the only compiler that generated a binary compact enough to fit in the 256KB version of the chip and not have us spend extra to upgrade to the more expensive 512KB SKU.

If you're laughing at this, do note that at scale, costs add up significantly even if we're talking about $.50, and a license for a good compiler and debugger can have an amazing return on investment.


> No offense, but that sounds like a poorly funded company.

None taken. Not just poorly funded, but often penny-wise and pound-foolish. That's one of many reasons they're a former employer.


>Curious if you work in the field

I do work in the field, yes; I've worked in automotive, semiconductors and IoT.

>Because in my experience, it's not like that at all.

In what way is it "not like that"?


The IDEs I've seen are mostly used for the debugging magic, and the toolchain used for building is some customized thing. I work in automotive.


>the toolchain used for building is some customized thing. I work in automotive.

That's because in automotive the microcontrollers are more complex, with multiple cores of different types doing different things, so the build process is more complex: the final binary goes through several steps like patching in various calibration data, FPGA bitstreams, cryptographic keys, fuses, debug data, etc. But even so, those custom toolchains still call a commercial compiler via the command line using some Perl/Python scripts at the end of the day.

You can script the same steps with IAR or Keil if you wish; their compilers support the same command-line functionality. In IoT we mostly used their solution 'as-is', since it worked well with their debuggers out of the box, meaning quicker time to market.

Automotive development has a lot more cruft in general due to legacy, safety and legal requirements, plus BS like the entire Autosar stack needing to be added to each build whether you need it all or not, so a lot of inefficiencies build up over time due to this inertia, leading to some insanely complex and lengthy build processes.


Honestly, those IDEs suck. Most of the time we would only use them for debugging, and the build system was usually some home-grown thing. You get better control. The startup code really depends: you can use the stuff from the vendor, but a lot of the time the needs are specific, so you might roll that yourself as well.


I fsking hate IAR Workbench and can't believe that it exists in this day and age. Especially when PlatformIO with STM32 plugins is free.


If you ever want more performance, I suggest trying PlatformIO and C++. It will set up the toolchain for you and do board initialization, and then you can use the Arduino API. The package manager is more reasonable than Arduino.


This guide should point people at a better board than an Arduino Uno. At least recommend something with a 32 bit processor.

This board looks good for a beginner since it has interesting sensors built in: https://www.adafruit.com/product/3333

Shopping for electronic parts is kind of involved actually. It seems like a kit with a bunch of basic electronic components would be a good idea.


I think the critical part is not the 32-bit thing, but the lack of a proper JTAG debugger on the Arduinos. A debugger gives much better insight into what is happening on the board, and the moment I saw a proper debugger and was able to step from the first instruction to my code was the moment I "got" microcontrollers.

I know a lot of people hate debuggers, but the information they provide is extremely useful.


I think Arduino hardware is a great place to start with embedded, specifically AVRs and avr-gcc. For one, there is a huge number of daughterboards/shields because of the size of the userbase and the common pinouts. Most importantly, devices like the ATmega328 (probably the most common Arduino chip) are much simpler/smaller than just about any ARM device. There are fewer than 100 peripheral registers in an ATmega328. The peripherals have far fewer options and are simpler to configure.
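
To give a sense of scale, a complete register-level blink on an ATmega328 touches only two I/O registers (this assumes avr-gcc with F_CPU defined for the delay helper):

    #include <avr/io.h>
    #include <util/delay.h>

    int main(void)
    {
        DDRB |= (1 << DDB5);              /* PB5 (Arduino pin 13) as output */
        for (;;) {
            PORTB ^= (1 << PORTB5);       /* toggle the LED */
            _delay_ms(500);
        }
    }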

You can also do real debugging on AVRs. I believe it's not actually JTAG, but you can still use breakpoints, watch variables, inspect memory, and step through code.


It's called the Program and Debug interface and uses two wires (PDI/O). That's for xMega though. There's also debugWire (one wire). But many chips also support JTAG as well (for both programming and debugging as opposed to using the ISP).

https://microchipsupport.force.com/s/article/On-chip-debuggi...

The official 328 breakout board uses debugWire I think, but you get a nice USB interface. I recommend the dev boards, they're very cheap and quite capable (all pins broken out, prototyping areas, usually a few LEDs and you can use the debugger). They also have a footprint for arduino Shields!

https://www.microchip.com/en-us/development-tool/ATMEGA328P-...

Watching variables is flaky in my experience, but hardware breakpoints and memory inspection work fine.


As a relative beginner with the Arduino API, I never have a reason to poke at registers directly and use print-based debugging over the serial port.

It seems like getting to "blink an LED" early is important for a reasonable "getting started" experience? I'd like to have a real debugger, but it looks awkward to set up compared to just plugging a microcontroller into a USB port.


The "curse" of the Arduino software is that the libraries abstract away the registers. So you get the quick satisfaction of blinking an LED, but you're never forced to dig in to what those libraries are actually doing to make the LED blink. In my opinion, this model leaves you dependent on other people providing libraries. If you eventually want to move on to other microcontrollers, you'll have to be able to read and understand the data sheets, and understand how the registers configure and operate the peripherals.

While a debugger is helpful, you can still work at the register level without one. You can simply print the register values to the serial port. In fact, if you wanted to, you could create a serial console that would allow you to set the registers over the serial port.
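
On a classic AVR-based board the idea is as simple as it sounds; this sketch (illustrative, picking a few Timer0 and port registers that <avr/io.h> exposes) dumps their values without leaving the Arduino API:

    void setup() {
      Serial.begin(115200);
      Serial.print("TCCR0A = 0x"); Serial.println(TCCR0A, HEX);  /* Timer0 config  */
      Serial.print("TCCR0B = 0x"); Serial.println(TCCR0B, HEX);
      Serial.print("DDRB   = 0x"); Serial.println(DDRB, HEX);    /* port direction */
    }

    void loop() {}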


STM32 dev boards. They have on-board debuggers. Just plug 'em in and debug. Bring-up is painless with CubeMX. Very powerful and fast. My go-to for fast prototyping.


You can do debugging on AVRs, it's just not generally exposed in arduino products (neither the hardware nor the software).


That is one sweet board, and it might be a good starting point. It is, in my opinion, rather intense/complex, which might scare people off.

For something leaner and more "clean", while staying with Adafruit, their take on the RP2040 might be good [1]. The official Pico [2] is still just $5, which might make it an easier buy for some.

For something even smaller, I rolled my own ATtiny214 [3] board but it's not presentable at the moment (and it's slightly more "hardcore" since it's not programmable over straight USB). Also, it's of course dropping down back to 8-bit land again.

[1]: https://www.adafruit.com/category/875

[2]: https://www.adafruit.com/product/4883

[3]: https://www.microchip.com/en-us/product/ATTINY214


Software world founder from the west in China doing robotics. Spent half the day churning out boards for a new version of a nontrivial mechanical assembly, half the day cutting metal and doing mechanical, half the day dealing with supply chain and half the day on business crap. Currently in a bar at 10:30PM and just knocked off outstanding supply chain work. Those parts ETA Sunday/Monday. Thinking I will work tomorrow because I'm excited to see some really cool stuff coming together. Seven years and counting! Hahaha. I am become rabbit, digger of holes. Seriously though, there's more to work than immediate payment.


It's a bold move to put Learn C the Hard Way as the only book recommendation for learning C, haha. This article seems like a good jumping-off point. I would be interested to see some examples of well-written embedded software which are sufficiently limited in scope that a beginner can feasibly grok the whole codebase.


As for learning C, I like to recommend Modern C:

https://gustedt.wordpress.com/2019/09/18/modern-c-second-edi...

There are also printed versions.


For well-written embedded software (processor peripheral drivers and OS concepts) I can recommend ChibiOS [1] and libopencm3 [2].

[1] https://github.com/ChibiOS/ChibiOS

[2] https://github.com/libopencm3


I started doing a Master's in embedded software design a couple of years ago, and I could condense this guide into a more succinct explanation.

Getting started with embedded systems is being repeatedly told "you're wrong" or "that won't work" until you give up and do something else.


You could say that about almost anything difficult that requires a lot of feedback and/or practice.


Certainly the real embedded world, which is beyond embedded "toys" [1], can be a culture shock when you come from general CS.

And indeed, just like "making games" can be attractive for young people but far less so as an actual job (from what I've heard), embedded can be fun when playing with IoT-like toys, but it actually needs a lot of patience and tenacity when you do it as a job (from experience).

[1] No disdain here. Those MCUs and SoCs are used in actual products, but here they are not used "seriously".


Can you elaborate? I don't understand what you mean (as somebody with 10+ years in embedded software + electronics space).


> Getting started with embedded systems is being repeatedly told "you're wrong" or "that won't work" until you give up and do something else.

I had no idea I was doing embedded systems for so long.


The hard part about getting started with embedded systems is that the intersection of what you have, what you can do with it, what you know you can do with it and what you actually want to do is most likely empty.



