No idea about mruby/c or picoruby, but MicroPython is very popular, enough that I'd just settle on it for devices that can handle it. It runs nicely in 256K of flash and 32K of RAM, on the small Adafruit Trinket devices. It was shoehorned into the v1 BBC micro:bit, which had 256K of flash and just 16K of RAM, but that was apparently painful.
Smaller options include Lua and maybe Hedgehog Lisp. Below that there is uLisp, but that is sort of a toy. There are also Java Card and J2ME, depending on device size.
Below that, you're probably best off forgoing garbage collection and using Forth or a cross-compiled language.
Will have to look closer at picoruby. I do love MicroPython development. One strong point is how easy it is to combine with C, because to get the most out of microcontroller devices one really needs to drop down to low-level languages from time to time.
I'm the developer of Juniper, a functional reactive programming language for the Arduino. It's very possible to run high level code on small devices. This even includes things like closures, which Juniper allocates on the stack!
picoruby is a lot of fun. It's a reimplementation of mruby/c which is even smaller. You might be interested in the presentation by its creator, my friend HASUMI Hitoshi. He shows off his Ruby REPL running on the Raspberry Pi Pico.
Recently he ported picoruby to other microcontrollers.
That is larger than the average computer through about 1990 ...
(I know people will say I'm being pedantic. However, RAM and Flash define most of the price of a microcontroller. So, a factor of two--especially in RAM--means a significantly smaller and cheaper chip.)
Depends on use case for sure, though I'm not sure why you'd want an interpreted language on a microcontroller unless you had plans to be able to execute dynamically loaded code in the field, which probably means it has some kind of network connection, and 256KB ROM/128KB RAM isn't too big for your Espressif chips and such.
> I'm not sure why you'd want an interpreted language on a microcontroller
Using a microcontroller to interrogate itself is a really valuable debugging technique.
Generally, you attach to a UART port with a terminal emulator. This is especially important when considering the plethora of sleep modes that now exist on modern SoCs. Debug probes generally put the chip in maximum performance mode, which often wipes out the state you're trying to debug (especially important for sleep bugs).
Yeah, but 1990 was 35 years ago and Ruby didn't even exist yet. And the average computer was quite a bit bigger than an RP2040, and consumed much more power :)
I mean, it's very difficult to compare those things. It's a high-level language runtime weighing a very small fraction of the JavaScript we're loading on any silly webpage, and we can run it on a stamp-sized microcontroller that costs a couple of dollars and that you'd typically need to program in a relatively low-level language.
I think it's neat.
Edit: I was trying to remember our computer then. I think we already had an Amstrad PC1512 with 512KB of RAM by '87 or '88, and by 1990 the 286s with 1 or 2MB of RAM were already common.
And by the time of the Amstrad PC1512, BASIC compilers had become an option as well. Dartmouth BASIC was originally compiled, and CP/M systems also had compilers available; they just didn't fit into 8-bit home computers, which had to wait for the 16-bit wave of home computers.
There were other high-level dynamic languages with compilers, like Lisp subsets and xBase/Clipper.
So if we managed back then, there is no reason not to have a tiny Ruby version nowadays for similar environments.
Yeah. Many of the 8-bit machines shipped with less than 128K stock, but those were uncommon by the '90s. Even then, RAM expanders were common. The XT and AT had more than 128K, and the 286 was released in 1982 (volume production in 1983). The 386 was out and had way more. The 486 came in 1989, and it was commonly found with far more. So, really, 128K was only a good amount of RAM in the early '80s.
Yeah, I'd need to ask my dad, but we got the Amstrad around '86 or '87 maybe, then a Nokia-branded (weird, I know) 386sx/16 with 4MB, maybe around '90/'92, then a generic clone with a Pentium 120 and a magnificent Sound Blaster AWE32 around '96.
I share the same sentiment as you. Why suffer writing C when one can enjoy the fruits of language features? It's certainly not optimal, but neither is 99+% of software nowadays. There's the feeling of waste and bloat, but the trade-off is language features!
On the other hand, nowadays we can just generate C code using AI... as long as the project doesn't get too big to grasp without abstractions. ;)
And we only had assembly for microcontrollers in 1990! (Or mostly; I think C compilers for PICs didn't arrive till later.) And the tooling was terrible!
I don't hate those languages or think they shouldn't be used, but I don't know how to use them and I already know Python, so if I can easily do MicroPython on a microcontroller, I welcome it :D
JOVIAL and CORAL 66 had been compiling for various microcontrollers and embedded systems since the 1970s. In the '80s, Ada took over that space.
By 1990, the 8051 (probably the most common uC at the time) had several C compilers, PL/M, at least one BASIC compiler, a Pascal, and several more. Some of those were years old by 1990. There was at least one Modula-2 compiler for the 8051, though that might have come a few years after 1990.
Keil, IAR and Tasking went on from 8051 C compilers to write compilers for dozens of microcontroller targets by 1990.
There were C compilers from others for 8096 (multiple vendors), TMS320xx and TMS340x0 (TI), 68xx (Hitachi), ADSP-2100 (Analog Devices), and more.
Intel had PL/M compilers for the 8048, 8051 and 8096. Gary Kildall ported a PL/M subset compiler to the Signetics 2650, if you want something really obscure. Green Hills had Pascal compilers for a couple of embedded procs (i960, anyone?).
Some people hate to use C. Some people embrace it, even love it (including many game devs!). You have full control! But I wonder, what do HN folks think about these pico implementations of Ruby, a complex language, made for the tiniest devices, such as sensors? I mean... is C really that bad??
Also, it's very cool they're still being maintained!
It is a return to the age when we had BASIC and Z80/6502 assembly, and yes, doing memcpy/strcpy all over the place instead of having proper arrays and strings is that bad.
> doing memcpy/strcpy all over the place instead of having proper arrays and strings is that bad
This presupposes that you're doing string manipulation and raw memory moves. I don't know about other people's code, but in my own experience neither of those things is likely to be happening in embedded code. You're much more likely to pre-allocate all your memory, and use zero-copy patterns wherever possible. I never used a C string in relation to talking to peripherals (i2c, spi, register mapped) or when doing DSP. Moreover, if you do need to use memcpy, it's probably because you are doing something very low level for which memcpy is the best choice.
What is embedded nowadays was a home computer in the 1980s, so maybe you aren't doing those things, but as someone who started coding on a Timex 2068, I very much doubt that in general.
I don't think C is bad at all. One good thing about C is that the language is quite minimal, the same can not be said for the current batch of contenders (Zig, Rust). Another good thing about C is that it has strong separation of language standard from implementations, with many implementations provided by diverse suppliers. I also prefer the "batteries not included" philosophy. Obviously C is not without warts (e.g. logical operators precedence).
I will leave it to others to fill out the cons. One obvious one is that C does not reflect the parallel compute nature of modern targets (e.g. SIMD) but neither do most serious alternatives.
I do think the time is coming (if not already here) where it would be judged legally negligent to professionally employ C in new systems for certain use-cases (network facing, high assurance, ...). At least not without complete formal verification. I'll leave it to the actuaries to work out where the tipping point is.
Make a few small mistakes, even typos, and C becomes really that bad, because you have full control, no restraints, few guard rails.
C trades very fast execution for rather slow development, especially of larger code bases, if you want correctness. It's fine for tiny programs though.
(Ruby, of course, has its own set of jolly footguns, as does any dynamically typed language.)