I don't understand the need to editorialize by changing the title. If you read the title posted here, you may get the impression that there's no need for radiation-hardened chips at all. It would be better to use the actual title, "Common misconceptions about space-grade integrated circuits".
To me the actual title more properly reflects a good summation of the article:
"The ultimate answer to a question if COTS chips can be used in space is “Yes, but”. There are many opportunities, but also many constraints."
> Just like anyone else, they normally write code filled with crutches to be ready for yesterday's deadline and want more powerful hardware to mask their sloppy job; some would’ve used Arduino if it was properly certified.
The (very interesting) main thrust of the article aside, I see this kind of sentiment a lot and I don't really understand it. Arduino's just a common dev board, running a standard microcontroller, plus some hand-holding libraries (which you can dispense with if you want.) Is there any actual legitimate reason that they're not suitable for production use? Or is it really just "real-embedded-devs-don't-use-Arduino" ego flexing?
After spending time in the Arduino community, it becomes apparent very quickly that hardware and embedded people have a real issue with the Arduino platform.
Most of it is ego flexing. Coming from the field of software engineering, we are trained these days to recognize and squash this behavior, but it seems to be alive and well among those working in the embedded field.
Honestly, after years of working with all types of software on all types of platforms, the Arduino platform is pretty nice. It gets you up and running very quickly and anyone with basic programming knowledge can start creating electromechanical devices, while it also allows you to write more advanced software (with essentially C++).
All of that being said: there are a ton of garbage 3rd-party libraries released for the Arduino, with garbage code, no documentation, no tests, no support, and gnarly tutorials. So I see where some of the resentment comes from, but once again, it's unacceptable ego flexing.
IMO, the problem with Arduino in production is not necessarily one of quality, but the fact that it is geared towards getting the first prototype up and running as fast as possible. It is difficult to use in production, where e.g. real-time and safety concerns require mechanisms that either aren't present or are impossible to implement because the software libraries can't support them.
Scheduling, real-time characteristics, safety and watchdog mechanisms, and resource management pose hard engineering problems, and are often a unique mix for each application; the Arduino libraries typically don't account for such concerns.
Add to that the usual pressures of keeping per-unit hardware costs low, to the tune of shaving a few cents off the BOM by choosing extremely constrained microcontrollers, and bending Arduino to fit those constraints becomes an unfavorable proposition.
The biggest benefit of Arduino as I see it is in reducing the barrier to entry for starting a hardware project because you can just wire software and hardware modules together for a proof-of-concept, but it doesn't do much for production.
I wouldn't say it's always ego flexing per se. For instance there are still tons of embedded engineers who won't touch C++ because they think it has some sort of overhead over C, even when C++ wins out in terms of binary size, execution speed, etc. So I'd say there's plenty of ignorance going on as well.
To be fair, as other commenters have pointed out, Arduino isn't always a good solution. This is the same in all of engineering though.
At work we tend to use Arduinos for production tooling a lot. Our main product is decidedly not an Arduino, though: we have lots of constraints, from form factor to battery life.
There's a lot of elitism and gatekeeping in those communities. People are just starting out differently, and different is bad to many. I saw it a lot on Arduino StackExchange, but even more so when we got migrations from other StackExchange sites; some of those had mods and high-rep users who actively and irrationally loathed the platform and its users.
Yeah, there are weird currents of "we had to learn that properly, so everyone else should". Totally ignoring that the nuance just doesn't always matter, and not everybody doing things wants to get into it deep enough for it to matter, and tools enabling them to do so are not automatically bad.
Yeah, I've seen similar attitudes, particularly in niche corners of software. In fact, I read a comment just the other day about how bad contemporary programmers are because they don't learn languages the "academic" way (with no expansion on what that means) and instead by... writing code.
Yeah, I'm a chemist who learnt how to code in order to analyse my data more quickly. I later learnt how to do serial communication to log data from my instruments, and now I'm looking into things like Arduino for "better" data logging of other variables, like temperature at different points in the reactor.
I'm happy they learnt the "right" way. But I've got a different set of requirements, and I've come from the top down rather than the bottom up.
I see three sides of the 'learn from the fundamentals up' argument. On the one hand, 'Real Programmers Use Butterflies'. On the other, an understanding of core algorithms and basic software design principles is really, really useful. On the gripping hand, completely self-taught rubes build incredibly useful and commercially valuable software in Excel.
The main part of Arduino is the hardware abstraction layer. Especially for the standard stuff, like I2C, SPI, ..., this HAL makes your code portable between different microcontrollers!
The alternative to the Arduino HAL is to use the vendor libraries. By doing that, you lock yourself in to the specific controller you are using right now. Also, the quality of the official SDKs is often enough a joke.
The next thing is the library ecosystem. A device driver library, e.g. to use a certain sensor, from the Arduino ecosystem will work on every microcontroller with Arduino support, which is a very, very long list.
The arguments against Arduino?
It's not _professional_.
Yeah, try the vendor library, SDK and IDE and tell me that's better ...
You are relying on code that _some guy on the internet_ wrote.
Yup. That's how the whole industry works. If you are using Windows, you are using code some guy on the internet wrote. Did you ever get recourse from MS after you got a BSOD?
Relying on code from other people is how we are able to get anything done. If you wrote everything from scratch, you couldn't ever ship anything.
And hey, you got that code from that guy on the internet for free. So do the world and yourself a favour, review the code and upstream any improvements you find!
The IDE sucks.
So does the vendor IDE, and nobody is using that stuff anyway. Use platformio.org.
There are many situations where Arduino is a valid option.
Honestly, I think HALs are a problem in the embedded world, given how diverse the microcontroller landscape is getting, especially with heterogeneous multi-core chips. HALs end up targeting the least common denominator, but there may be a really good reason why you would want a particular microcontroller in the first place.
I think Andrei Alexandrescu has a good idea that he presented in his talk about compile-time introspection^[1]. That, combined with something like ARM's SVD but vastly expanded to include practically all hardware resources, might be the future. That way you could use exactly what you need.
Of course, the embedded world changes very slowly, so if this ever does happen I'll be long retired...
Having just discovered Arduino and messing about with ESP32 and ESP8266 chips on it, it’s so impressive.
My abilities are pretty basic, but the amount that can be achieved and the low barrier to entry (a computer, a USB cable and a $5 ESP) make for a great combination.
Thanks for confirming my own impression. The Arduino dev environment is pretty meh; I used it once when I was building a test rig and found it painful, but it IS easy for a beginner. If you don't like it, just use VSCode / PlatformIO.
Also, as for the Arduino HAL, what a REAL embedded dev would do is implement their own platform-agnostic library and then implement it on every microcontroller they use and... oh wait they've just reimplemented the HAL.
But basically there's nothing about Arduino that makes it magically always bad. Start with your requirements spec. If a piece of hardware meets those requirements, and it's cheaper than the other options, use it. Even if it's Arduino.
Typically, flight hardware that goes up there is characterized completely. It is not unusual to use decade-old GCC versions because their behaviour is completely understood in-house, and all the required functionality is coded from scratch rather than imported as a library from GitHub.
The restrictions (power/computational power) are very tight and reliability is the major concern for these kinds of systems, making development patterns like that absolutely necessary.
Arduino is fine for production if you are doing something (relatively) simple terrestrially, but space is a totally different beast.
Arduino is a great board for production; it's just too expensive for large runs. My colleague uses Arduinos for his very special machines. He delivers five of them a year, at about $200. If he designed a board himself and took care of production, he'd spend a few days on it and it would cost the company closer to $2000, but the five boards a year would then be very cheap.
I see this a lot, but don't understand it. Arduino is just a dev board with a socketed Microchip AVR ATmega328P microcontroller, which costs about $2 (Digi-Key or Mouser in the USA).
You can get clones in volume (the whole dev board, etc.) from China for $2 each, though recent restrictions on Chinese suppliers are a headache.
I haven't tried creating my own large run of a product based on Arduino, but I wouldn't go into production paying for a retail-boxed Arduino.cc dev system.
Never mind the clones; if your design needs any kind of self-designed board, it's cheaper to tack the ATmega (or another processor; you might be able to make it work with something simpler) onto your design.
Plus the reliability of boards plugging into other boards, reliability of clones, etc.
It's a reference to the legendary comment under Dropbox's announcement that asked what was special about Dropbox when it could be implemented using finicky, arcane, and difficult-to-use software.
My comment was to highlight that there's the ease-of-use and quality-of-life aspect as well that needs to be taken into account.
That was for sure "ego flexing", or, I'd say, an attempt to be sarcastic about guys who are uncomfortable with Arduino.
Arduino itself is completely fine, but hardcore hardware guys aren't always happy with amateurs who tend to use Arduino everywhere without taking the actual needs of the task into consideration.
Like, Arduino shouldn't normally be used in space, as its components are not radhard and are not certified for such uses.
I've heard college instructors use words like "Shiturdino" when teaching an intro course into microcontrollers. The dumb elitism exists even in cases where Arduino would be absolutely okay.
One aspect is licensing: IIRC the core libraries are under LGPL.
Another aspect is: there are some poorly documented idiosyncrasies in the standard library. I was personally bitten by the fact that delay disables interrupts (it makes sense from one point of view, since it makes for more accurate delay timing), which means you can miss receiving some bytes from the serial port.
Also, the almost universal lack of low-power mode support rules out use in some contexts (very low-power battery-powered devices).
Then there's the default behaviour of restarting when the USB serial port is opened (it can be worked around by disabling DTR toggling when opening the serial port, but that's not supported everywhere).
Production won't like that there is no guarantee that the Arduino board you'll be able to buy in 5 years will be exactly the same as now (oh, we changed the USB-serial chip from FTDI to some other one, and now the PC software doesn't auto-detect the correct serial port).
“Production” is imagined differently by different people. Across millions of devices the unit cost premium of an Arduino scales to a non-trivial amount of money and adds an additional manufacturing pipeline into scheduling and availability.
It's absolutely that. So people can achieve results with less effort and hardship. This is a good thing for the users but bad for the gatekeepers.
Yes, the arduino user will have achieved results without understanding as much, or having as much formal training. For some reason that upsets those who have.
"Yes, the arduino user will have achieved results"
The problem is that a good engineer normally spends 20% of the time designing how the circuit works and 80% designing what happens when something goes wrong. Many guys who started with Arduino tend to ignore the second part; that's why "proper engineers" dislike them.
Arduino guys tend to devalue many aspects of engineering. And the worst part is that they lead many others (like employers or investors) to do so too.
I'm not 100% sure (and I don't actually hate Arduino), but I see it this way.
Whenever the entrance barrier becomes much lower, quality and reliability standards also fall, and that's what causes the disgust of the "real hardcore guys".
The first radhard ATmega device arrived just five years ago, so maybe someone is in the process of certification now)
But your question sounds a bit like "why are there no Ferrari semi trucks?"
Just because they are intended for different things)
This is a good article. It covers all the facets of radiation hardening pretty well.
I worked on a radiation hardened processor chipset that had to be very rad hard - total dose, prompt dose, SEU all had to be addressed. That required measures be taken at the process, physical layout and architecture levels while making it performant and low power. Conclusion: very difficult, but possible, and some very interesting problems to solve (like how to mitigate the effects of an SEU in the instruction register at the moment of execution).
I also was peripherally involved with testing COTS memory chips for radiation hardness. They only had a modest total dose requirement. That was messy for the reasons mentioned in the article, like the fact that the manufacturers are always tweaking process and design parameters to optimize for the commercial DRAM market. So from one lot to the next the radiation hardness was unpredictable. Conclusion: costly, time consuming and not sustainable.
Some wrinkles in the article: it looks like it was well translated from the original, but figure 7 is explained "on the right" twice (one instance should say "on the left"), and figure 8 is missing entirely.
I wonder if the FDSOI back-charge technique could be cycled to provide periods of immunity, then "dump" the stored charge (while holding the chip in reset), then become immune again..
Fascinating stuff! Mostly over my head! Literally! :)
I'll fix it, thanks. The problem occurred because it was not translated, but more "rewritten", and this particular figure was changed.
Regarding the actual question: that wouldn't work, unfortunately. "Dumping" isn't possible, as the effects are complex and nonlinear. Maximizing total dose hardness via back-gate voltage would require complex voltage control.
But there could be something, for example, if one tries to use accelerated high-temperature annealing.
Fun fact: the still-persistent use of "rad" instead of the gray (used in the rest of the world) seems to stem from US NIST confusing radians with rads and saying the rad is an SI-conformant unit (which it never was).
Bear in mind that it's a Russian site and they have a huge problem with producing or buying radiation-hardened chips. So it's kinda 'sour grapes' attitude.
The problems aren't "huge". Russian state-of-the-art radhard design is mostly on par with what exists in Europe and just a couple of generations behind the USA. Most of the production obviously happens abroad, and nobody actually cares, as it involves much less money than the Huawei case)
Also, this site was chosen for publishing because it was easy and because they actually give you some audience. Getting the same attention at Medium would be much harder, and a copy of this text, which I posted at LinkedIn, got something like 60 views. LinkedIn has a great publication editor, by the way.
If you have advice on a better place where I could publish popular-science texts about microchips, it would be really appreciated.
This "you" doesn't apply to me as I don't live in Russia.
The problems with satellites are mostly due to completely different factors, not the state of radhard IC design.
For example, transition from imported to domestic EEE components is painful, as the demand is much higher than industry's capacity.
If I remember right, the North Americans made a lot of noise when SpaceX managed to take NASA people to space, because the USA had been relying on the Russian space program, lol. So I think they are on par.
The knowledge required to understand the vocabulary used in this article seems to disqualify anyone who might have such misconceptions from actually reading it to the end.
Oh, I've seen examples of these from the people who managed to work in space industry for decades.
Also, I hoped that I gave enough terms and definitions in the beginning :( I would appreciate any feedback on what could make the text better and more understandable for a wider audience.
> all the fuss with lead-free solder was partially caused by the fact that lead and some other materials used in IC fabrication contain impurities of heavy elements like uranium.
This is complete news to me.
Lead-free solder was mandated by the European Union's "Restriction of Hazardous Substances" (RoHS) legislation, which was drawn up by a bunch of ninnies with no understanding of biochemistry, nor mathematics, nor epidemiology.
Electronic devices are more reliable with tin-lead solder rather than Aluminum based solders.
> Electronic devices are more reliable with tin-lead solder rather than Aluminum based solders.
What are you talking about? None of the common lead-free solders contain any Aluminium.
Furthermore the primary concern in regards to lead is during wave soldering during manufacturing (where you do get lead vapor; hand soldering and tinning is not really a problem) and during recycling later on.
I don't really get this. Surely the problem is you're handling lead in a home context? That's never going to be safe, since unless you're very careful about cleanup, you're going to get lead in your food.
To me, all this talk about lead being so much better feels a bit silly. As a craftsman, I can see that lead is nicer to work with, and makes for more pretty results. As a dad, I'm pretty glad that every electronic object isn't laced with lead - in fact, I'm glad about every measure full stop that reduces the industrial usage of lead. One of these things is way more important to me than the other.
I don't understand what you're saying. Using hand-soldering tools, the tin-lead alloy becomes a liquid; sure, there might be traces of lead going into the air that you can detect (ppt? ppq?), but that doesn't make it a health hazard. You might abrade traces of lead onto your hands handling solder wire, but surely washing your hands after soldering gets rid of that. If you make a few mistakes you might get small lead balls in your work area, but even then I don't see how it gets into my food. I don't understand why solid or alloyed lead is supposed to be a safety problem by itself. It's not radioactive, you don't get sick by standing next to it; you have to abrade it and eat the shavings.
Heating soldering wire isn't creating lead vapors like leaded gasoline does.
Lead is soft - you're definitely leaving traces of lead on your hands when you handle it. That then goes onto your clothes, your skin, and everything you touch.
But anyway, it's not really the point. The point is, I can confidently say that some idiot burning a trash bin isn't going to scatter lead all over the local environment; ditto a house fire, or a car crash, or somebody dumping a computer in a stream.
Upvoting, but I want to add a caveat: you can get lead poisoning from handling pure lead too, if you do it enough and are not rigorous about hand washing. (Licking lead-laden fingers can't be good.)
There are many different radiation environments in space, which a quick skim of this article seems to ignore entirely. Low Earth orbit isn't that high-rad because of the Earth's magnetic field. Go outside it and you need some rad hardening for sure. Orbit in the Van Allen belts or go somewhere like Jupiter and you need extensive rad hardening.