DigiSpark (ATTiny85) – Arduino, C, Rust, build systems (diziet.dreamwidth.org)
49 points by qsantos on Oct 30, 2023 | 37 comments



This isn't really comparing C and Rust for the ATTiny85, it's comparing Arduino-flavoured C++ and Rust for an ATTiny85 running the micronucleus bootloader.

The ATTiny85 is incredibly trivial to get code running on in C (and less trivial in Rust, but probably still easier than what this blog post makes it out to be) if you eschew the complexity of Arduino and USB (micronucleus).

Here is all the source code for a blinker in AVR-C:

    #include <avr/io.h>
    #include <util/delay.h>
    
    int main(void)
    {
        DDRB |= _BV(0);
    
        while (1) {
            PORTB |= _BV(0);
            _delay_ms(500);
            PORTB &= ~_BV(0);
            _delay_ms(500);
        }
    }
Here is how you build and flash it (with a USBASP programmer, but feel free to buy anything else):

    avr-gcc -std=c11 -DF_CPU=16000000UL -mmcu=attiny85 -o blink blink.c
    avr-objcopy -O ihex blink blink.hex
    avrdude -c usbasp -p attiny85 -U flash:w:blink.hex:i
Not quite as convenient as using micronucleus and USB, but far simpler to debug/understand and generally more flexible.

If you insist on using micronucleus, then as I understand it the only thing that changes is that you need to build the micronucleus command-line tool and do something like:

    micronucleus --timeout 60 --run --no-ansi blink.hex
Really the problem here is that Arduino tried to present embedded development as a simple idea, and when that pristine abstraction inevitably breaks down, things get really confusing really fast.


I think part of the reason for arduino's success among amateurs is better variable and function names. I have quite a lot of (mostly higher-level) programming experience, but I just have to take your word for it that that's a blink program. The 500 millisecond delay and the endless loop are straightforward, but I have no clue what DDRB or BV(0) refer to and I'm only vaguely aware that PORTB probably refers to GPIO pins.


Yeah, avr-gcc is pretty thin and it would be nice if the memory-mapped registers had clearer names. That said, there's a balance between staying close enough to the hardware to get predictable timings and the necessary flexibility, and being abstract enough that the code is easy to understand for people without domain-specific knowledge. You can get closer to the latter by adding increasingly thick layers of abstraction.

The problem is that making the abstractions perfect (and arduino's are not, probably for this reason) can take a LOT of code.

These Atmel chips are pretty tiny and memory constrained compared to what you might be used to. And that is not even to mention the performance overhead these abstractions can incur.

Sometimes you also run into fundamental impedance mismatch issues. Situations where you can't present a high level interface to something without having to make unacceptable compromises.

All in all, while arduino is convenient for a beginner to get started, the second the beginner starts to get serious about hardware development is the second the issues begin appearing.

You can get quite far with arduino but if you want to use a device like the ATTiny85 to its full potential, you're going to struggle.


Arduino has done an amazing job of bringing those chips to the general public, and in many directions: software, hardware, education, marketing, etc. Arduino programs are definitely easier to read, but they do add overhead. If you run the above program without the delays and put an oscilloscope across the LED, you will actually see the clock frequency of the chip! If you do that with Arduino code you will see a signal at least ten times slower. I find it amazing that we can see the "software bloat" on the oscilloscope.

While it is true that Arduino functions like digitalWrite are easier to read, I think that with a bit of effort and the datasheet by your side (you are programming hardware!) you can get the hang of it pretty quickly. Without the scary bit operators, the code above is not that difficult:

    #include <avr/io.h>
    #include <util/delay.h>

    int main() {
        DDRB = 0b00000001; // PB0 is an output
        for(;;) {
            PORTB = 0b00000001; // on
            _delay_ms(500);
            PORTB = 0b00000000; // off
            _delay_ms(500);
        }
    }


DDR* is the register for setting up what the pins on PORT* should be doing (input or output). _BV is just a bit-twiddling macro: _BV(n) expands to (1 << n), a value with only bit n set.


You see, this is how I would imagine programming for an ATTiny85 would be like.

I've taken interest in them myself, but I can't imagine taking a modern-programming, DRY-worshipping approach where the code itself is a particular abstracted style guide. I'd expect a lot more hexadecimal, repeated sections, and just generally a picture of what ends up in the chip's memory, which is mighty tiny. It seems like there wouldn't be ROOM to make modern corporate programmers happy.

I find the article fundamentally weird, like criticizing the rhyme scheme of an instructional manual. That said, startling to learn that you get no sort of warning that 'int' for the ATTiny85 is so different from what you might think it is. I'd be looking for some kind of special int definition that corresponds with what's in the ATTiny85, and just sticking to that.


The ATtiny85 is an 8-bit MCU; it has no hardware int. Moreover, the C standard doesn't impose restrictions on the width of an int other than that it must be able to hold values between -(2^15 - 1) and (2^15 - 1) (yes, you read that right: this is because before C2x, int can be one's complement, two's complement or sign-and-magnitude).

Now, this is mainly C being an unforgiving language, but realising this may make it easier to piece together what's happening here.

Since the ATtiny85 is an 8-bit MCU, anything wider than 8 bits has to be implemented in software. To keep the cost of heavy int use as low as possible, the narrowest permitted definition was chosen: 16 bits. This is in contrast to the author's desktop system, which almost certainly uses 32-bit ints.

I imagine the code, when #included on the desktop, worked fine because of 32-bit ints and broke when compiled for the ATtiny85 because of its 16-bit ints. Warnings _could_ help here (and I believe the author when he says the Arduino IDE had no obvious way of enabling them), but realistically the solution would be to rewrite the code to use fixed-width types. That way you could at least rely on the widths being the same.

Although at least until C2x, you can't assume two's complement.

That being said, it seems to me that trying to "simulate" the chip like this is always going to result in issues, especially when you try to emulate hardware. Probably the best solution here is to bite the bullet and learn how to use simavr which would pay off regardless of using C or Rust.


That's exactly what I do. I like programming for the ATTiny85 a lot. I basically have my usbtiny programmer always connected to the computer and in a different breadboard the project I am actually working with. Then a simple Makefile takes care of the rest. It's a bit tedious to have to mount and unmount the chip every time, but it's ok. FWIW, this is my Makefile:

    CC=avr-gcc
    FLAGS=-w -Os -flto -fuse-linker-plugin -Wl,--gc-sections -mmcu=attiny85
    OBJCOPY=avr-objcopy


    compile:
        $(CC) $(FLAGS) main.c -o main.elf

    main.hex: compile
        $(OBJCOPY) -j .text -j .data -O ihex main.elf main.hex

    upload: main.hex
        avrdude -v -pattiny85 -cusbtiny -Uflash:w:main.hex

    clean:
        rm -f main.elf main.hex


> It's a bit tedious to have to mount and unmount the chip every time

after ZIF sockets started getting expensive, I started designing my programmable boards to include a header for https://www.tag-connect.com/ and have by now saved the cost of all those bent pins ;)


To be pedantic, _delay_ms() takes a double, not an int.

In avr-gcc versions before __builtin_avr_delay_cycles was available in the compiler, building with size optimisation (-Os) and passing an int instead of a floating-point constant (i.e. _delay_ms(500) rather than _delay_ms(500.0)) would produce unexpected delays. This was a frequent source of bug reports in the early days of avr-libc. Those older versions relied on odd compile-time coercions, and got confused if not passed a float.

I just wanted to mention that in case anyone has old versions of GCC lying around.


This was highly informative, but the overall tone comes across (to me) as overly condescending, with lots of complaints about having to download "random" software. It's never made clear what the author would expect instead, or where software could come from in order not to count as random.

Also, as has been pointed out in comments on the OP site too, the programming language on the Arduino is not C but Wiring/C++. I see people have pointed that out here, too already.

I'm also curious how large the OP's program was, since manually inspecting it for the overflow apparently was never considered once getting proper warnings was deemed impossible. Evidently rewriting it in Rust and getting it to compile was enough to fix the overflow, since the Rust version ran correctly. That suggests the code was not massive (it's an 8-bit chip, after all, so it's probably not doing anything super-complicated), and it should have been possible to review it and find the problem.


These are boards with 8K of flash total, meaning the program in question would very likely fit on a single monitor screen. Dodging the debugging step and reaching for your one true hammer (Rust, in this case) says more to me about the author's "principled" engineering than anything else in this post.


In the meantime, I'd highly recommend PlatformIO over the Arduino IDE for writing embedded code: besides having a much nicer UX (they recommend their VSCode extension, which is great, but they also have extensions for a dozen other IDEs), it natively supports way more boards and frameworks.


Seconded on platform.io. Unsurprisingly, they support the Digispark: https://docs.platformio.org/en/stable/boards/atmelavr/digisp... .


I recently switched to PlatformIO for an embedded project. The IDE plugins are not necessary; you can easily get by with the CLI tools alone (at least I do).


>Also, disturbingly, its “board manager” seemed to be offering to “install” board support, suggesting it would download “stuff” from the internet and run it. That wouldn’t be acceptable for my main laptop.

I have some bad news about what apt does...


From the article: The programming language offered via the Arduino IDE is C.

This is not true, Arduino uses the C++ language.


Indeed, anyone claiming Arduino uses C has not used the language in a very looong time.


Despite the overall negative tone of the article, the author seems to be aware of their blind spots:

> As will be obvious from this posting, I’m not an expert in dev tools for embedded systems. Far from it. This area seems like quite a deep swamp, and I’m probably not the person to help drain it. (Frankly, much of the improvement work ought to be done, and paid for, by hardware vendors.)

The main issue here is that the entire Arduino ecosystem is, to put it bluntly, a huge mess due to its scrappy nature. It often gets pushed as the default "recommended" way of doing anything with microcontrollers, but it is not and was never meant to be used by experienced developers. Idiosyncrasies such as having to install a third-party package just to be able to target the ATtiny85 - a microcontroller based on the exact same AVR architecture as the boards the IDE ships with built-in support for - are far too common, and very little progress has been made so far to tackle them.

If you are the kind of person who wants or needs proper debugging capabilities, a standard build system or the ability to use your own IDE, you would be better off leaving Arduino in favor of a more traditional SDK package (either the official one provided by the vendor or an open source alternative such as avr-libc or libopencm3). Which, incidentally, is what the author did here: the complaints about Rust vs. C(++) have nothing to do with the actual language, they are about the surrounding libraries and build systems. The overflow could have been caught by compiler warnings if it weren't for Arduino disabling them by default in the name of beginner friendliness, as already pointed out by others.

The complaint about vendor support is similarly unfounded. While it is true that the quality of manufacturer-provided IDEs and SDKs often leaves something to be desired, they exist and are generally well supported. Vendors have no obligation to support Arduino or Rust options, neither of which are (yet) commonly used in the industry outside of hobbyists.


> the entire Arduino ecosystem is, to put it bluntly, a huge mess due to its scrappy nature

It's less scrappy in my experience than a lot of the tools "aimed at professionals". (I guess "aimed at professionals" in the embedded space approximately means "no documentation".) I think the fact that you don't have to spend 50% of your time fighting the tools is a big reason for the popularity of the Arduino ecosystem. It's just nice to have an IDE that's not so proprietary and stuck in the 90s as to crash every ten minutes (which vendors try to make you use for lock-in or similar reasons).


If you're choosing AVR, using Linux and programming in Rust, you're in the top 0.1% of technical computer users, and it's a bit strange to write an article about software that is targeted at beginners.

Why not use the (seemingly) more suitable avrdude?


I'd say it was really a newbie post.


I'd say it was damning documentation, that such a technically capable person (of whatever stripe, not of embedded obviously, but in general) had this experience.

Picking apart any particular decision along the way does not invalidate the points made.

They pointedly did not start with the least likely, most obscure possible approach, exactly to avoid "well, you ventured into the wilderness, so of course you got bit by a snake and fell off a cliff".


What I mean is new in embedded, not proficient in C, and green enough to jump to conclusions about the whole "embedded ecosystem" from a product targeting casual amateur market.

Things on the systems end facing the hardware are not really harder but different enough to command their own learning curve.


Isn't Arduino a sort of simple C++, not C? Also, for most beginners to embedded development I would recommend starting with ARM devices and a JTAG programmer for anything beyond simple blinky-type programs with Arduino. Being able to step through code and see the various hardware registers is helpful.

And as great as Rust is, it is hard to set up with embedded right now. Maybe someday, but for now I'd stick to C unless you really want to play around with Rust specifically!


I'm curious, so I'll bite: what do you find hard to setup about embedded Rust? Personally haven't found it too bad (this is STM32/Cortex-M though, which is sort of a happy path)


Arduino is "just" C++, it uses gcc to compile after all. There's minimal amounts of text-level meddling in the toolchain that e.g. includes some headers, declares your methods and defines an entry point for you.


I don't know why he's making it harder than it actually is. It's almost as easy as the MSP430s. Using the arduino-ide complicates things IMO so that's probably where he's gone down the wrong path.

A simple makefile, avr-gcc, avrdude and the attiny85 datasheet is all you need. He has the programmer I use judging by his description.

Makefile i use for my projects https://pastebin.com/pntGaWk7

edit: I think he isn't using the same programmer. He's using that annoying one where you have to either unplug and replug before each flash with arduino, or ground the permanent reset pin. Chuck it and get a proper AVR programmer.


I would love to see into the future and know whether Rust adoption will meet or exceed C adoption in the microcontroller space. Right now, the trend seems to be going more towards greater adoption of high-level/scripting languages (MicroPython especially), rather than replacing C with another "low level"* language.

Now, replacing a MicroPython written in C, with a MicroPython written in Rust? I can easily see that.

* I know some consider C high level, especially if you are writing Assembly.


Really high level (as in visual programming) is actually the optimal way to do some kinds of embedded programming, in particular the setup/configuration part.

Setting up the clock tree, configuring pin/port functions, enabling peripheral clocks and interrupts (which ones and their priority levels), working around errata, etc. All of that should, at the end of the day, come down to a couple of loads/stores to memory-mapped registers, and these registers are usually "wide", containing N bits to set N things. So if you design a C (or Rust) API something like this:

  enable_clock(int clock_id);
You wont get optimal code for:

  enable_clock(FOO);
  enable_clock(BAR);
Because if FOO and BAR are both enabled by setting bits 3 and 7 in register X, you get two loads, two bitwise ORs and two stores, when a single store of a single constant would have sufficed.

So fire up a GUI program that lets you click on "enable FOO, enable BAR", "set main clock to crystal oscillator at N MHz, enable PLL x4, etc", have that generate optimal code and go from there.


Or you could write an API like set(FOO | BAR) and get the same benefit for comparably basically no cost (+ now you get nice diffs in git).


The problem is that some registers have non-zero default values (the SWD debug pin functions, for example), so the set() function would still need to do a read-modify-write, which isn't the theoretically optimal code gen.


Then it’s a question of annotating this function with ALWAYS_INLINE and letting the compiler optimise it.


good point on nice diffs. collaborative work (using git or other source control) is really why text is king.

there's no good way to collaborate, compare, etc with a low-code format.


makes sense. did you have in mind something like makecode[0], something else, or wishing for something that hadn't been invented yet?

[0] https://makecode.adafruit.com/


One reason is that many of those microcontrollers happen to be more powerful than the 8- and 16-bit home computers we were coding on in BASIC, AMOS, Modula-2, Pascal, C, C++, Clipper, Lisp, Forth, ...

While Python has taken over BASIC's role from those days.


One of the compiler arguments to gcc is -fpermissive, so the warnings he wanted for the narrowing conversions wouldn't happen regardless.



