This isn't really comparing C and Rust for ATTiny85, it's comparing Arduino flavoured C++ and Rust for ATTiny85 running a micronucleus bootloader.
The ATTiny85 is incredibly trivial to get code running on in C (and less trivial in Rust, but probably still easier than what this blog post makes it out to be) if you eschew the complexity of Arduino and USB (micronucleus).
Here is all the source code for a blinker in AVR-C:
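A minimal sketch, assuming the LED sits on PB0 and the chip runs at the stock 1 MHz internal clock (F_CPU must be defined before including util/delay.h for _delay_ms to be calibrated correctly):

```c
#define F_CPU 1000000UL        /* factory default: 8 MHz internal osc / 8 prescaler */
#include <avr/io.h>
#include <util/delay.h>

int main(void) {
    DDRB |= _BV(PB0);          /* PB0 as output */
    for (;;) {
        PINB = _BV(PB0);       /* writing 1 to a PINx bit toggles that pin on AVR */
        _delay_ms(500);
    }
}
```

Built and flashed with something like `avr-gcc -mmcu=attiny85 -Os blink.c -o blink.elf`, `avr-objcopy -O ihex blink.elf blink.hex`, then `avrdude -c usbasp -p attiny85 -U flash:w:blink.hex:i` (swap `usbasp` for whatever programmer you own).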
Not quite as convenient as using micronucleus and USB, but far simpler to debug/understand and generally more flexible.
If you insist on using micronucleus, then as I understand it the only thing which changes is you need to build the command line tool micronucleus counterpart and do something like:
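With the stock micronucleus command-line tool, the upload step is roughly:

```shell
# Build blink.hex as usual with avr-gcc/avr-objcopy, then upload over USB.
# The tool waits for the device to be plugged in, so no ISP programmer is needed.
micronucleus --run blink.hex
```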
Really the problem here is that Arduino tried to present embedded development as a simple idea, and when this pristine abstraction inevitably breaks down, things get really confusing really fast.
I think part of the reason for arduino's success among amateurs is better variable and function names. I have quite a lot of (mostly higher-level) programming experience, but I just have to take your word for it that that's a blink program. The 500 millisecond delay and the endless loop are straightforward, but I have no clue what DDRB or BV(0) refer to and I'm only vaguely aware that PORTB probably refers to GPIO pins.
Yeah, avr-gcc is pretty thin, and it would be nice if the memory-mapped registers had clearer names. That being said, there's a balance between staying close enough to the hardware to get predictable timings and the necessary flexibility, and being abstract enough that the code is easy to understand by people without domain-specific knowledge. You can get closer to the latter by providing increasingly thick layers of abstraction.
The problem is that making the abstractions perfect (and arduino's are not, probably for this reason) can take a LOT of code.
These Atmel chips are pretty tiny and memory constrained compared to what you might be used to. And that is not even to mention the performance overhead these abstractions can incur.
Sometimes you also run into fundamental impedance mismatch issues. Situations where you can't present a high level interface to something without having to make unacceptable compromises.
All in all, while arduino is convenient for a beginner to get started, the second the beginner starts to get serious about hardware development is the second the issues begin appearing.
You can get quite far with arduino but if you want to use a device like the ATTiny85 to its full potential, you're going to struggle.
Arduino has done an amazing job of bringing these chips to the general public, in many directions: software, hardware, education, marketing, etc. Arduino programs are definitely easier to read, but they do add overhead. If you run the above program without the delays and put an oscilloscope across the LED, you will actually see the clock frequency of the chip! If you do the same with Arduino's digitalWrite, the waveform will be at least 10x slower than the clock. I find it amazing that you can see the "software bloat" on an oscilloscope.
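The direct-port version of that experiment might look like the following (AVR-only sketch, assuming the LED on PB0; the loop jump adds a few cycles per toggle, so in practice you see a large fraction of the clock rather than the clock itself):

```c
#include <avr/io.h>

int main(void) {
    DDRB |= _BV(PB0);      /* PB0 as output */
    for (;;) {
        PINB = _BV(PB0);   /* hardware toggle: writing 1 to PINB flips PB0 */
    }
}
```

Replace the toggle with `digitalWrite(0, HIGH); digitalWrite(0, LOW);` in an Arduino sketch and compare the two traces to see the overhead directly.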
While it is true that Arduino functions like digitalWrite are easier to read, I think with a bit of effort and the datasheet by your side (you are programming hardware!) you can get the hang of it pretty quickly. Without the scary bit operators, the code above is not that difficult:
#include <avr/io.h>
#include <util/delay.h>

int main() {
    DDRB = 0b00000001; // PB0 is an output
    for (;;) {
        PORTB = 0b00000001; // on
        _delay_ms(500);
        PORTB = 0b00000000; // off
        _delay_ms(500);
    }
}
You see, this is how I would imagine programming for an ATTiny85 would be like.
I've taken an interest in them myself, but I can't imagine taking a modern, DRY-worshipping approach where the code itself follows some abstracted style guide. I'd expect a lot more hexadecimal, repeated sections, and generally a direct picture of what ends up in the chip's memory, which is mighty tiny. It seems like there wouldn't be ROOM to make modern corporate programmers happy.
I find the article fundamentally weird, like criticizing the rhyme scheme of an instructional manual. That said, startling to learn that you get no sort of warning that 'int' for the ATTiny85 is so different from what you might think it is. I'd be looking for some kind of special int definition that corresponds with what's in the ATTiny85, and just sticking to that.
The ATtiny85 is an 8 bit MCU. It has no int. Moreover, the C standard doesn't impose restrictions on the width of an int other than that it must be able to hold values between -(2^15 - 1) and (2^15 - 1) (yes, you read that right, this is because before C2x int can be one's complement, two's complement or sign and magnitude).
Now this is mainly C being an unforgiving language but realising this, it may be easier to piece together what's happening here.
Since the ATtiny85 is an 8 bit MCU, anything wider than 8 bits has to be implemented in software. To keep the performance cost of heavy use of int as low as possible, the narrowest possible definition was used. This is in contrast to the author's system which probably uses 32 bit ints.
I imagine the code worked fine when #included on the author's system because of its 32 bit ints, and broke when compiled for the ATtiny85 because of its 16 bit ints. Warnings _could_ help here (and I believe the author when he says that the Arduino IDE had no obvious way of enabling them), but realistically the solution would be to rewrite the code to use fixed width types. That way you could at least rely on the widths being the same.
Although at least until C2x, you can't assume two's complement.
That being said, it seems to me that trying to "simulate" the chip like this is always going to result in issues, especially when you try to emulate hardware. Probably the best solution here is to bite the bullet and learn how to use simavr which would pay off regardless of using C or Rust.
That's exactly what I do. I like programming for the ATTiny85 a lot. I basically have my usbtiny programmer always connected to the computer, and on a separate breadboard the project I'm actually working on. Then a simple Makefile takes care of the rest. It's a bit tedious to have to mount and unmount the chip every time, but it's OK. FWIW, this is my Makefile:
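The Makefile itself isn't shown; a minimal sketch of what such a Makefile typically looks like (assuming avr-gcc, avrdude, a usbtiny programmer, and a single blink.c source file; names and fuse-free defaults are illustrative):

```make
MCU    = attiny85
F_CPU  = 1000000UL
CC     = avr-gcc
CFLAGS = -mmcu=$(MCU) -DF_CPU=$(F_CPU) -Os -Wall

all: blink.hex

blink.elf: blink.c
	$(CC) $(CFLAGS) -o $@ $<

blink.hex: blink.elf
	avr-objcopy -O ihex $< $@

flash: blink.hex
	avrdude -c usbtiny -p $(MCU) -U flash:w:$<:i

clean:
	rm -f blink.elf blink.hex
```

With this, `make flash` rebuilds whatever changed and programs the chip in one step.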
> It's a bit tedious to have to mount and unmount the chip every time
after ZIF sockets started getting expensive, i started designing my programmable boards to include a header for https://www.tag-connect.com/ and have by now saved the cost of all those bent pins ;)
To be pedantic, _delay_ms() takes a double, not an int.
In avr-gcc versions before __builtin_avr_delay_cycles was available in the compiler, if you compiled with size optimization (-Os), passing an int instead of a float (i.e. _delay_ms(500) rather than _delay_ms(500.0)) would produce unexpected delays. This was a frequent source of bug reports in the early days of avr-libc. Those older versions relied on odd compile-time coercions and got confused if not passed a float.
I just wanted to mention that in case anyone has old versions of GCC laying around.