Writing Linux Modules in Ada (2016) (nihamkin.com)
134 points by slondr on Aug 28, 2023 | hide | past | favorite | 84 comments



Back in university I took a course in legacy programming. We did a project in each of Fortran, COBOL and Ada. I enjoyed Ada very much. The module system made a lot of sense to me and the compiler found a lot of mistakes (compared to C). Thanks for sharing this article. It brought me back and makes me want to give Ada another go now that I have 10 years of real world experience.


I had a digital design course. The instructor hated Verilog and preferred VHDL, for reasons that didn't make much sense to me in those days. There was only one VHDL compiler available at the time, called GHDL, which worked fine for the course. There was ModelSim as well, but you had to go to the lab to use it.

Later I learnt that VHDL is related to Ada. I've been itching to try it for a long while. Might give it a try this week. Currently I am in my Rust phase and loving it (thanks, cargo). My Haskell fever is gone though.


I haven't tried Ada yet, but I do have experience with VHDL. It's the most modular language I've encountered. It's easy to work with big teams in VHDL. Divide up everyone's responsibilities, design the interfaces (entities) and have a brief discussion about them. Then go your separate ways and implement your parts (architectures). They all fit together like magic in the end, without needing much coordination or interaction between the team members. I'm pretty sure that's what's great about Ada too. However, I still don't understand what it is about the language's design that makes this possible.


I've been meaning to explore Ada as well, since learning that GHDL is itself written in Ada.


One place where you can easily use Ada syntax is any database that implements ANSI SQL/PSM, and there are many.

"SQL/PSM is derived, seemingly directly, from Oracle's PL/SQL. Oracle developed PL/SQL and released it in 1991, basing the language on the US Department of Defense's Ada programming language."

https://en.m.wikipedia.org/wiki/SQL/PSM


Ada's module system somehow made Java's OO make sense to me.

I'd love to get a gig in Ada.


Hah, my perfect niche!

I think that an Ada-like language could make a real resurgence in embedded programming. Ada gets all the things about bare-metal right that C got wrong. However, it's held back by legacy tooling, clunky syntax, and obtuse compiler errors. AdaCore has gone a long way towards alleviating those issues over the last few years, with Alire and the ada_language_server. Time will tell where this language takes us.


I completely agree. 'Design-by-committee' gets a bad rap, but Ada's designers got a lot of things right when it came to bare-metal programming. The ability to specify the in-memory representation of a type is one of my personal favourites. I don't think Ada's syntax has aged well. I wish it would get a 21st-century overhaul. I don't think that's likely though, but we can all dream. AdaCore do great work, and contribute a lot back to the open-source community.


For what it is worth, Ada was not "designed by committee" more than most languages. It was designed by a design team, and in each revision, there was a strong technical leader of the design team, with the whole design team sharing a strong design aesthetic. I doubt you could say even that about many mainstream languages these days.


Thank you for clearing this up. My use of the phrase 'designed by committee' was more of a reference to a common criticism of Ada, than to any actual historical fact. Not that I was actually any the wiser, I admit. I'm a big fan of your work. Thanks for taking the time to reply.


Many usually forget that their beloved C, JavaScript, WebAssembly, OpenGL, Vulkan, the Web, POSIX, ... are equally designed by committee.


The benevolent dictator cachet comes mostly from Python microcosm.


And Perl before that.

It's not a bad way to go, in practice, as long as it lasts.


It's unclear exactly what you mean by your remark that Ada's syntax hasn't aged well, let alone what a 21st-century overhaul would look like.

Is there any chance you'd expand on that? I'm curious to know your thoughts.


They probably mean that they can’t get past the fact that it looks more like Algol or Pascal than C. Which is, frankly, a pretty silly argument.

I’ve heard this exact argument about VHDL versus Verilog, with the former being explicitly based on Ada’s syntax and the latter being explicitly based on C’s. (Turns out though that VHDL is also strictly better than at least traditional Verilog, as it requires separate interface specifications which lead to improved modularity.)

If anyone were to actually try to create “Ada: The Next Generation” I’d encourage you to just go all the way to S-expressions. Focus on the completeness and correctness of _the system_ and stop worrying about superficial complaints—or brush them off with a suggestion that they can use any syntax they want and just translate it to the standard one via tree-walking.


I like your use of "Ada: The Next Generation". I replied to your comment in GP. I'm not really too hung up on the Pascal-ish syntax. I don't think these things matter too much. If I had to design the syntax myself, I'd probably go in a different direction. That's just my own opinion though.


Modern Ada 2012 is full of parentheses now: if- and case-expressions, expression functions, and recently raise-expressions, quantifiers everywhere, delta aggregates. The thing looks more and more like my old OCaml code. In a good way.
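Something like this, as a purely illustrative sketch (made-up names, Ada 2012 syntax):

    type Int_Array is array (Positive range <>) of Integer;

    --  Expression function using an if-expression:
    function Clamp (X : Integer) return Integer is
      (if X < 0 then 0 elsif X > 100 then 100 else X);

    --  Expression function using a quantified expression:
    function All_Positive (A : Int_Array) return Boolean is
      (for all E of A => E > 0);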


It always makes me grin when newbies complain that VHDL is too verbose. The pain of hooking up the Xilinx AXI interconnect in about three feet of (pre-system)verilog is something I will not forget in a hurry. Having wrapped it in VHDL with nice neat record types, I can now hook it up in just a few lines of VHDL.

I think of it a bit like rat's nest wiring vs. nice neat labelled cable looms.


Sure. I write a lot of Ada, and I'm a big fan of the language overall. My criticisms are minor, and they don't stop me enjoying the language: I feel like the 'begin' and 'end' tokens can be a bit verbose, and their verbosity doesn't add much to the language's clarity; I also dislike the fact that I need to place the same subprogram specification in both the specification and body files; I'm also on the fence about some of Ada's pointer semantics; and for what it's worth, I'm not a big fan of the particular style that the Ada language server auto-formats code into.

I'll use this chance to say that I'm a fan of Ada's declarative blocks, which you could say are part of its syntax. There's a lot to like about Ada, and I'd encourage anyone interested in bare-metal programming to give it a try. Even if you don't intend to use it long-term, there's a lot of good language design ideas in Ada that can be learned from.

To address a sibling comment alleging that I'd rather it look like C, that's not necessarily true. I know this is much more controversial, but I'm actually more of a fan of Python's syntax.


Hey, thanks for following up.

I always felt that repeating myself to the compiler (down to `procedure subprogram is begin`…`end subprogram`) or otherwise being verbose was something of a feature, in the vein of defense in depth, rather than a sign of the syntax not aging well. It never seemed to me like it was meant to impose clarity on the code, just that you were meant to file your subprograms and types in triplicate. It's useful as a backstop against trying to do things quickly instead of doing them deliberately.
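As a concrete (made-up) sketch of the filing-in-triplicate feel: the same profile appears in the spec, again in the body, and the unit name is repeated at every end:

    --  counters.ads, the package spec (illustrative only)
    package Counters is
       procedure Increment (Value : in out Natural);
    end Counters;

    --  counters.adb, the package body repeating the same profile
    package body Counters is
       procedure Increment (Value : in out Natural) is
       begin
          Value := Value + 1;
       end Increment;
    end Counters;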

At least, Ada's use of verbosity stands in stark contrast to COBOL, which does have the explicit aim to make things clear.

I'm mostly of the same mind as you in re Python's (and Haskell's and F#'s) whitespace-oriented ways of doing things, at least until the real world creeps in. I've had problems with early block termination because of mixed tabs and spaces that weren't readily apparent, and the language's toolchain wasn't particularly helpful in diagnosing them, leading me to lean on external tools. (COBOL at least has the heritage of being column-oriented, so indentation being significant is less problematic there.)


I think it's evident to most people who write Ada that the language is too English-heavy. This gets in the way of actually representing the computations. It was a design decision when the language was created, since the emphasis was put on maintainable code.


Is this meant as a joke, too English-heavy?


I think it's a reference to it being keyword-heavy in an effort to mimic natural language, similar to what COBOL tried to do.


They likely mean the Pascal-like syntax.


Standard Pascal has 35 reserved words compared to ANSI C 89 which has 32. Not a big difference in terms of English-heaviness.

https://wiki.freepascal.org/Standard_Pascal

https://en.cppreference.com/w/c/keyword


Pascal uses words in a lot of places where C uses symbols.


It was designed by committee, but as long as the original author was involved he had a veto right and he used it very often, which alleviated the "committee" effect.


> Ada gets all the things about bare-metal right that C got wrong

I'm curious, never had a look at Ada, can you elaborate?


I think it mostly refers to Ada's built-in facilities for bit fiddling. You can represent registers and bitmaps as Ada data structures and use relatively simple-to-understand operations on them instead. https://learn.adacore.com/courses/intro-to-embedded-sys-prog...


There are a lot of aspects where Ada is better than C. Just a few things that came to mind:

A general lack of dumb C stuff: no switch fallthrough, no null-terminated strings (arrays in Ada are passed with fat pointers), far less undefined behavior, no need for memcpy, no preprocessor bullshit, etc.

Ada is more like C++ in functionality, so it has generics, tasks (threads), exceptions, packages, strong types, design by contract, etc. (all much, much saner than the C++ equivalents).

Despite all of the features, it's very embedded-friendly. The language allows you to disable features you don't want with the Restrictions pragma (there's a very long list of restrictions you can apply: https://docs.adacore.com/gnat_rm-docs/html/gnat_rm/gnat_rm/s...). You can also fairly easily change runtimes: https://docs.adacore.com/gnat_ugx-docs/html/gnat_ugx/gnat_ug....
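For instance (illustrative only; these restriction names are standard, but which ones make sense depends entirely on the target):

    pragma Restrictions (No_Exceptions);   --  no raise statements or handlers
    pragma Restrictions (No_Allocators);   --  no "new", i.e. no heap allocation
    pragma Restrictions (No_Recursion);    --  subprograms may not call themselves
    pragma Restrictions (Max_Tasks => 0);  --  no tasking at all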


I know berating C is trendy, but it feels a bit gratuitous and uncalled for in your comment...

> Ada is better than C

> lack of dumb C stuff

Back to your comment, strong types and generics look super nice for embedded. Not sure I would like fat pointers, exceptions and threads in my embedded code though.


Tasks in Ada aren't threads, per se, unlike how many people describe them. They can be implemented with threads, but they don't have to be. For embedded and real time systems you likely have a task system suitable for that sort of environment (I'd hope) and Ada compilers targeting such systems will use an appropriate task system.


>I know berating C is trendy, but it feels a bit gratuitous and uncalled for in your comment...

Sorry about that, English is not my first language so I might sound rude sometimes. I actually don't hate C, but if you used it you know its flaws. I think the stuff I mentioned about C is objectively bad, hence why I called it dumb.

>Not sure I would like fat pointers, exceptions and threads in my embedded code though

Fat pointers are just pointer + size of an object. You have to pass the array size anyway, so it's just convenient. Obviously, if you just need to pass a plain pointer, there are ways to do that.
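A small sketch of what that looks like in practice (illustrative code, not from the article): an unconstrained array parameter carries its own bounds, so there's no separate length argument to get wrong.

    type Byte is mod 2**8;
    type Byte_Buffer is array (Positive range <>) of Byte;

    --  The bounds travel with the parameter; no separate length argument.
    function Checksum (Data : Byte_Buffer) return Byte is
       Sum : Byte := 0;
    begin
       for B of Data loop
          Sum := Sum + B;   --  modular type, wraps on overflow by definition
       end loop;
       return Sum;
    end Checksum;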

As for exceptions and threads: as I said, you can disable or modify them just by using the Restrictions pragma. It's a language-defined thing. For example, Ada 2022 defines two profiles for safety-critical hard real-time computing:

https://en.wikipedia.org/wiki/Ravenscar_profile http://www.ada-auth.org/standards/22rm/html/rm-d-13.html


> Sorry about that, English is not my first language so I might sound rude sometimes

No offense taken, not a native here either ;-)

> if you used it you know its flaws

Indeed, I've been using C for almost 20 years now. I won't say it's without flaw for sure, it has its quirks, but overall I do think it's quite okay for the job.

> Fat pointers are just pointer + size of an object

Yeah, I know what fat pointers are; I've even resorted to handcrafting some form of them in C for some neat performance hackery on x64.

But the thing is, we're talking embedded here. Most microcontrollers I use have an 8-bit address space, no MMU or any form of virtual memory, and a separate instruction/data bus (MVHA). That kind of thing doesn't play well with funky fat pointers.

Sure, if your definition of embedded is "64-bit ARM" all is good, but I guess we're on a spectrum.

Exceptions are pretty much nonexistent as well, since they would require some form of runtime, which you often just cannot afford on a small chip (if only from the sheer size of it).

Threading is a no-go as well. To get threads, or any form of multitasking really, you have to rely on an operating system, which by definition is a bit weird to have on an embedded IC.


>But the thing is, we're talking embedded here. Most microcontrollers I use have an 8-bit address space, no MMU or any form of virtual memory, and a separate instruction/data bus (MVHA). That kind of thing doesn't play well with funky fat pointers.

You can definitely use Ada on 8-bit microcontrollers, for example on 8-bit AVR: https://docs.adacore.com/gnat_ugx-docs/html/gnat_ugx/gnat_ug...

You can make the runtime as small as you want.

As for exceptions and threads: I mentioned them because that's what came to mind first; there are more benefits, obviously.


I personally think that its 'representation clauses' are a really awesome feature for bare-metal programming. It's a shame other languages haven't borrowed this idea.


Representation clauses are by far the biggest feature for embedded programming:

https://learn.adacore.com/courses/advanced-ada/parts/data_ty...

http://www.ada-auth.org/standards/22rm/html/RM-13-1.html

Wouldn't it be nice in C to be able to define how a struct is laid out in the machine representation? In Ada, you can and it is part of the standard, so it is portable:

https://learn.adacore.com/courses/advanced-ada/parts/data_ty...
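As a (made-up) illustration of what a record representation clause buys you, here the compiler is told exactly which bits of a one-byte register each component occupies:

    --  Hypothetical control register, purely illustrative.
    type Baud_Rate is (Slow, Fast);

    type Control_Register is record
       Enable  : Boolean;
       Rate    : Baud_Rate;
       Channel : Natural range 0 .. 15;
    end record;

    --  Pin every component to specific bits of byte 0:
    for Control_Register use record
       Enable  at 0 range 0 .. 0;
       Rate    at 0 range 1 .. 1;
       Channel at 0 range 4 .. 7;
    end record;
    for Control_Register'Size use 8;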


It's difficult for someone with 0 knowledge of the language to really understand representation clauses. It does seem to be similar to enum values in C++?

    -- Ada
    for Day use (Mon => 2#00000001#,
                 Tue => 2#00000010#,
                 Wed => 2#00000100#,
                 Thu => 2#00001000#,
                 Fri => 2#00010000#,
                 Sat => 2#00100000#,
                 Sun => 2#01000000#);
    // C++
    enum Day {
        Mon = 0b00000001,
        Tue = 0b00000010,
        Wed = 0b00000100,
        Thu = 0b00001000,
        Fri = 0b00010000,
        Sat = 0b00100000,
        Sun = 0b01000000,
    };
As for the record representation, my understanding is that it is equivalent to having a normalized __attribute__((__packed__)), where smart compiler padding is disabled and you can arbitrarily decide the memory layout of your struct?


A representation clause is just a language-defined way of specifying how a struct (record in Ada), enum, or array is actually laid out in memory. So, for example, you can define a record that represents a register, overlay it on that register's address, and use it like this:

  procedure Enable_USB_Clock is
  begin
     Registers.PMC_Periph.PMC_SCER.USBCLK := 1;
  end Enable_USB_Clock;
Example taken from here: https://learn.adacore.com/courses/Ada_For_The_Embedded_C_Dev...
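The overlay part isn't shown in that snippet; it's done by placing an object of the record type at the register's address, roughly like this (made-up address, hypothetical PMC_SCER_Register type, and the enclosing unit would need "with System.Storage_Elements;"):

  PMC_SCER : PMC_SCER_Register
    with Volatile,
         Address => System.Storage_Elements.To_Address (16#400E_0410#);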


Ahh, Lovely Ada. When I was a wee lad, toiling as an Email consultant, having my mind slowly destroyed by Sendmail configurations, the constant fear induced by anything related to Microsoft and Email, and, well, let's not mention Lotus Notes, some wounds never heal, I had the grand idea of writing a secure collection of Email tools.

This included SMTP, IMAP, POP3 daemons, and various other tools. I was going to write them all in Ada. I had a basic SMTP daemon that would accept mail and deliver it. But then, one of the various Outlook worms generated a ton of revenue at the expense of sleep, hygiene, and dignity, and I decided to get out of the Email business ASAP.

Still wish I would have kept working with Ada. I really liked it, and one could write tight code with it.


Look up Ada Web Server :-) no IMAP though. And no QUIC yet, sadly. I hope someone is going to write a whole HTTPS stack using RecordFlux. One can dream.


I have actually looked at it, in my endless quest to stop having Naviserver/AOLserver be my favorite web server. ;->

I'm going to be giving it another go here in a bit; I've got a small personal project where I'm going to make a serious attempt to curse at it for a bit.


I'm actually curious whether someone has formalized message formats/grammar, protocol specs (state management, timing specs,...) for QUIC or IMAP, in some parseable form other than free-form text.


I've never used Ada, and I also don't really do systems stuff, but it does seem like it's a pretty neat language, at least compared to C. From what I have seen, it looks like it has better memory guarantees while still being fast and low-level.

With the popularity of Rust, it makes me kind of wonder why Ada isn't more popular. I should give the language a go.


There wasn't an open source compiler from the beginning, and thus a lot of the compilers were expensive. You could only use the language in an awful legacy setting behind a wall of NDAs and security clearances. Not my experience, just a bunch of comments on HN from actual retired Ada devs on why the language didn't take off, from when I was researching the language.

AdaCore in the last few years has been investing heavily in modernizing the toolchain, but now it seems they are also investing in Rust.


This is true. I've worked on an actively maintained Ada codebase in the aerospace industry, and we had to use a proprietary Ada Xd compiler that was being sold for hundreds of thousands of dollars per installation by a company not interested in doing any updates, and that came with a phonebook-sized paper errata of known bugs.


>> AdaCore in the last few years has been investing heavily in modernizing the toolchain, but now it seems they are also investing in Rust.

AdaCore is working with Ferrous Systems on Ferrocene (https://ferrous-systems.com/ferrocene/), a Rust toolchain for use in safety-critical applications:

https://blog.adacore.com/announcing-publication-of-the-draft...

https://ferrous-systems.com/blog/ferrous-systems-adacore-joi...

I am hoping that Ferrocene's work will help drive the standardization of Rust over the next few years:

https://github.com/rust-lang/rust/issues/113527


> With the popularity of Rust, it makes me kind of wonder why Ada isn't more popular.

It looks like Ada is having a resurgence due to Rust. I see Ada being pushed alongside Rust in a lot of places that emphasize safety.


Here is a really good Ada tutorial for low-level and systems programming.

https://learn.adacore.com/courses/intro-to-embedded-sys-prog...

Here the bit fiddling in Ada data structures is explained: https://learn.adacore.com/courses/intro-to-embedded-sys-prog... Quite cool.


I want to like Ada, but the lack of support for Mac OS on anything Apple Silicon related is a huge reason to skip it and do something else (for me).

One thing I still haven't wrapped my head around is how "dynamic" memory allocation and cleanup work in Ada. It doesn't seem to be mentioned early on in any documentation anywhere. And, maybe it's the C/C++ programmer in me, but that strikes me as a bit odd. Or perhaps I just can't see past the tip of my nose and it's there.

I kind of need to know how dynamic memory works in any programming language before I plan to invest deeply in learning it. And it needs to work on my hardware.


There’s a GNAT release for M1 now. The FSF Ada compiler is based on GCC so it has worked for RISC-V and other ARM CPUs for a little while now as well.

Ada's dynamic memory principles are definitely unique. For heap allocation it's based around memory pools, at least in GNAT. For the most part it's RAII, but you can do manual new/free style too (though it's discouraged).

Ada uses a secondary stack as well for variable-length function returns, so in practice you don’t need to do heap allocation very much.

There are also equivalents of some STL containers like vector that can handle heap allocations for you safely.


Thanks for this... I'll take a look at how to get started on Mac OS with GNAT.

I went down a bit of a rabbit hole recently looking to see if there was a LLVM way to do this. It looked like it was being worked, but I'm not sure it's the best way to get started with Mac OS and Ada on Apple Silicon.

I'll check out homebrew and macports too... again, just in case!

Thanks again!


Not sure how well it works (I don't have a Mac), but there is this repository that describes how to compile the whole GCC/GNAT Ada system on macOS.

https://github.com/simonjwright/building-gcc-macos-arm-eabi

Bonus point: you also get the compiler for ARM bare metal.


Happy to help! I can also recommend alire (alire.ada.dev), the open source package manager for Ada. It can manage your toolchains for you and makes it very easy to get started and create new projects. It works well on M1s.


Memory management in Ada is such that you need pointers (and the heap) far, far less than in other languages: the language allows a returned value to provide the constraints for an object of an unconstrained subtype; example:

    Text : Constant String := Read_Chapter( Book );
Additionally, nesting DECLARE blocks and subprograms allows fairly fine-tuned memory usage and cleanup using the stack. The above example could, for instance, be part of an outer DECLARE block which has an inner DECLARE, perhaps with "Paragraphs : Constant String_Vector := Get_Paragraphs( Text );" in its declarative region and "For Paragraph of Paragraphs loop" in its body... as soon as a block is exited, the stack is popped, reclaiming the used memory. This, in turn, means that the need for heap allocation is greatly reduced.
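A self-contained sketch of that pattern (Read_Chapter here is just a stand-in returning a string whose length the caller doesn't know in advance):

    with Ada.Text_IO; use Ada.Text_IO;

    procedure Declare_Demo is
       --  Stand-in for a function returning a value of an unconstrained subtype.
       function Read_Chapter return String is ("chapter text of some unknown length");
    begin
       declare
          Text : Constant String := Read_Chapter;   --  bounds come from the returned value
       begin
          declare
             Excerpt : Constant String := Text (Text'First .. Text'First + 6);
          begin
             Put_Line (Excerpt);
          end;   --  Excerpt's stack storage is reclaimed here
       end;      --  Text's stack storage is reclaimed here
    end Declare_Demo;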

Here's an excellent presentation on Ada's memory management: https://archive.fosdem.org/2016/schedule/event/ada_memory/


It's interesting to see the positive talk about Ada in here. I think that's good, but also indicative of perhaps a change in perspective. I feel like if the topic of Ada had come up here 10 years ago (probably did), the responses would have been quite different and more along the lines of "that annoying stuffy/overly-verbose/old/obsolete/design-by-committee language" .

I think there's a growing consciousness of the fairly terrifying unsafety of C/C++, and the relative success of Rust is some evidence of that, at least.

Many moons ago I bought an Ada 95 manual, and learned a bit of the language with intent to fiddle with it but never finished. I like the idea but not sure I'd be wanting to give up various... modern conveniences... I get from Rust in order to work in that world.


I always recommend the book "Building High Integrity Applications with SPARK" as an introduction to how SPARK can be used for high-integrity, safe programs for mission-critical applications. SPARK has a legacy of large, high-integrity applications over the past two decades, which puts it ahead of Rust in real-world usage for such applications. SPARK 2014, the latest version, is a language designed for formal verification, along with the verification tools/ecosystem made for these types of uses.

I am trying to write show control software in SPARK 2014 at the moment. Show control is safety-critical, since it is used to power lifts and stage machinery as well as performer flying systems, where safety and high-integrity software are essential. I like Rust, but I feel it is not quite there yet, especially in terms of the number of real-world systems in this niche. I also find SPARK 2014 easier to write and read. I have been programming since 1978, and although I gravitate towards terse, functional languages like Haskell and APL/BQN/J, I experience a lot of friction whenever I dive back into Rust. SPARK 2014 is very verbose and Pascal-like, but that is tedium, versus confusion or a lack of confidence in what I am writing. I know AdaCore is working with Ferrous Systems to bring Rust up to the features of Ada/SPARK 2014, but for now I needed to make a pragmatic choice based on real-world usage and ease of use and understanding.


Just want to add that I prefer Ada over C++ despite it having less mind share, tools, and libraries because its productivity is so high. (Not saying that I dislike C++.)

And, speculating here, with the encroachment of AI into programming/software engineering, I assume that it's convenient to use languages that are declarative (e.g. Haskell) and/or designed for verification/formal methods (e.g. Ada/SPARK) to integrate AIs of various kinds.


I’ve always been attracted to the idea of Ada (particularly SPARK) as a “really safe C-like”. I guess my main concern versus C is portability and ease of integrating libraries or exporting a usable C API, and secondarily the quality of the optimizer.


>> I guess my main concern versus C is portability and ease of integrating libraries or exporting a usable C API, and secondarily the quality of the optimizer.

Interfacing with C APIs / libraries is really easy and portable across Ada implementations:

https://learn.adacore.com/courses/intro-to-ada/chapters/inte...

http://www.ada-auth.org/standards/22rm/html/RM-B-3.html
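As a hedged sketch of what that interfacing looks like (binding to the C library's puts here, purely as an example):

    with Interfaces.C; use Interfaces.C;

    procedure Hello_Via_C is
       --  The Ada side declares the profile; the aspects supply the C linkage.
       function puts (S : char_array) return int
         with Import, Convention => C, External_Name => "puts";

       Result : int;
    begin
       Result := puts (To_C ("Hello from Ada"));
    end Hello_Via_C;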

The quality of the optimizer depends on the Ada implementation.

GNAT, the free software Ada implementation, uses the GCC backend so it is pretty good:

https://www.getadanow.com/

https://www.adacore.com/gnatpro


>GNAT, the free software Ada implementation, uses the GCC backend so it is pretty good:

GNAT also has an LLVM backend: https://github.com/AdaCore/gnat-llvm It's stable and AdaCore plans to ship it in GNAT Pro 24, i.e. the next release. Note that it's the same front end, so you basically get the best of both worlds.


Thanks for showing me that, I agree it looks pretty convenient.


A very long time ago The University of York (UK) secured the contract from the UK Science and Engineering Council (SERC) to write a unix Ada compiler.

It was a multi-pass, 5-10 stage process (or more; I want to say 13, but time plays tricks). A very costly language to compile, in those days. (VAX 11/780 running Unix 32V, a precursor to BSD and Ultrix by some years.)

The story was that it emitted an error/warning along the lines of "Congratulations, you have used the most abstruse feature of the Ada language", which the approval people made them take out before it got certified.

Wirth had a sabbatical residency in York around the time of the Ada language selection process; his choices didn't make it through the strawman/steelman process, and I think they resurfaced in Modula-2. It was a Pascal-teaching department like many others in the UK at the time, so it made sense for him to spend time there. Modula-2 is said to be a systems programming language too.

Ada was very hard to teach. The ideas of asynchrony and exception handling didn't sit very well with young minds. Maybe now they're well enough understood to teach in Rust. At the time, the absence of a rationale around "why" was very strong. York had a miniature two-lift engine model which it used as a proving ground for Ada programs and undergraduate projects. Lift sequencing is a bit of a black art in itself, but if you put the optimality of "which lift, which direction, which floor" to one side, the mixture of real-time controls and sensors was probably a good fit. (Lift == elevator for the other side of the Atlantic.)

I remember some concern in the department the only logical endpoint for Ada was to code military flight control/weapons/radar systems, and people felt uncomfortable about the implicit participation in the UK War economy. This was during the time of the Greenham common protests against US nuclear forces on UK soil.

During the Alvey 5th Generation funding debacle ("Catch up with Japan at all costs") there was another round of this using GEC400 computers, again very directly related to UK MOD needs for weapons control systems and what I think became the Nimrod airborne radar. Signal processing is probably a very good fit for Ada. (I didn't work on that project, or the compiler.)

People said that the consistency of mapping data structures to devices, chip signal lines, real things, and the abstractions around that in types worked well in Ada. I found it horrendously complicated to understand. People might say C is a hack, but the literal directness of C structs on a PDP-11 or VAX to the underlying architecture worked pretty well for me. I guess the problem is that C was always too close to assembler for some people. BLISS-32 was the systems programming language of choice at Digital, and I think it continued to be used to write VMS, although I read now it was almost entirely written in DEC Macro assembler.


10 or so years ago they had a collection of marble runs with mixed steel/glass marbles with a Hall effect sensor, a proximity sensor and some track switches. Students would use Ada to run the marble tracks and have them sort the marbles.

It’s interesting to hear that they had a longer history of teaching Ada with real-time sensors and controls.


Memory safety might not be a bad idea for Linux in general.


Ada’s take on memory safety is pretty limited. Heap allocation is explicit; there’s a procedure literally named Unchecked_Deallocation to free a pointer. It does have thread-scoped locals and arenas, but nothing like declared lifetimes or borrowing. The spec allows for GC but I believe it’s rarely offered.

It’s safer than C, but I’m not quite sure where recent specs line up against C++.


Only true for those stuck in Ada 83.

People keep repeating this nonsense without updating themselves beforehand.

EDIT: To simply educate on Ada:

Yes, there was an optional GC; no one ever implemented it, so in Ada 2012 it got removed from the standard.

Almost everything can be allocated on the stack, so one strategy is to catch the exception for insufficient stack space and retry the same function with a smaller size for the data structure.

Ada 95 introduced controlled types, which are basically Ada's version of RAII; there's no need to call Unchecked_Deallocation outside implementation details. Hardly any different from Rust code that uses unsafe underneath.

Ada/SPARK, now part of the regular Ada specification, provides theorem-proving capabilities and contracts, allowing another safety level still not available in Rust.

Additionally, AdaCore is contributing improved lifetime rules for access types, to have a kind of borrow checker light when needed.

Finally, there are still 7 Ada vendors in business, with 40 years of experience deploying Ada into safety critical scenarios.
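To make the controlled-types point above concrete, here's a minimal sketch (hypothetical names; the idea is that Finalize runs automatically when the object leaves scope, so the Unchecked_Deallocation call stays buried in one place):

    with Ada.Finalization;
    with Ada.Unchecked_Deallocation;

    package Scoped_Buffers is
       type Int_Array is array (Positive range <>) of Integer;
       type Int_Array_Access is access Int_Array;

       --  Finalize runs automatically when an object of this type goes
       --  out of scope, so deallocation stays an implementation detail.
       type Scoped_Buffer is new Ada.Finalization.Limited_Controlled with record
          Data : Int_Array_Access;
       end record;

       procedure Allocate (Buffer : in out Scoped_Buffer; Length : Positive);
       overriding procedure Finalize (Object : in out Scoped_Buffer);
    end Scoped_Buffers;

    package body Scoped_Buffers is
       procedure Free is
         new Ada.Unchecked_Deallocation (Int_Array, Int_Array_Access);

       procedure Allocate (Buffer : in out Scoped_Buffer; Length : Positive) is
       begin
          Buffer.Data := new Int_Array (1 .. Length);
       end Allocate;

       overriding procedure Finalize (Object : in out Scoped_Buffer) is
       begin
          Free (Object.Data);   --  a no-op if Data is already null
       end Finalize;
    end Scoped_Buffers;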


>Ada 95 introduced controlled types, which are basically Ada's version of RAII; there's no need to call Unchecked_Deallocation outside implementation details. Hardly any different from Rust code that uses unsafe underneath.

This is basically like C++ destructors, but the problem is there are no move semantics in Ada, so you can't implement something like unique_ptr.

It's hardly comparable with Rust.


Yeah, that is what happens when one doesn't understand how contracts and formal proofs are used, my dear newly created account advocating for Rust.


I've noticed a pattern that many Ada advocates on HN don't really seem to know the language. So they often conflate SPARK with Ada (as you just did) and make unfounded claims about Ada's memory safety, portraying it as being on par with Rust.

It isn't on par, Ada has no lifetime management whatsoever. It doesn't even provide C++ style smart pointers out of the box. It is possible to implement something like shared_ptr, AdaCore even has a tutorial[1], but as the other commenter pointed out, the language doesn't provide the primitives necessary for unique_ptr.

SPARK does have something like this. In fact, in SPARK, every pointer assignment transfers the ownership. But SPARK and Ada aren't synonymous. SPARK is a formal verifier built on top of Ada. Like most such tools, it's very constraining and time-consuming. It's not something that every (or even most) Ada projects use.

Nevertheless, Ada is a perfectly good language, and it's probably safer than C++. It has some really cool features. I'm really fond of the in/in out/out parameter modes (cppfront stole this), named function parameters, the secondary stack, fine-grained control over record (struct) layout, etc.
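For readers who haven't seen them, parameter modes and named association look roughly like this (hypothetical procedure, just to show the shape):

   --  Illustrative only: parameter modes make the data flow explicit.
   procedure Update_Reading
     (Sensor  : in     Natural;   --  read-only inside the procedure
      Value   :    out Float;     --  written by the callee
      Average : in out Float)     --  both read and written
   is
   begin
      Value   := Float (Sensor);
      Average := (Average + Value) / 2.0;
   end Update_Reading;

   --  Call site with named association:
   --    Update_Reading (Sensor => Raw, Value => V, Average => Running_Avg);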

However, I'm not so fond of the extreme verbosity. It isn't just because of the Algol-like keywords; Pascal is less verbose than Ada. Even basic stuff like instantiating generics is very noisy:

   with Ada.Containers.Vectors;

   procedure do_something is
      package Integer_Vectors is new Ada.Containers.Vectors
        (Index_Type   => Natural,
         Element_Type => Integer);

      vec : Integer_Vectors.Vector;
   begin
      return;
   end;
In C++ that would be:

    #include <vector>

    void
    do_something(int arg) {
        std::vector<int> vec;
    }
[1] - https://www.adacore.com/gems/gem-97-reference-counting-in-ad...


SPARK is part of Ada; time to update your ISO Ada knowledge.


No, it isn't. I think you're confusing contracts (added in Ada 2012) with the entirety of SPARK.


What’s your sense of the size of the Ada job market? High hundreds? Close to 10,000? Is it static or growing or shrinking? Is there a preferred online community for Ada devs?


The Ada job market is still large in the aerospace and defense industries, where lives are on the line.


When you say large, what do you guess in actual numbers? Just a rough impression; I'm not going to hold you to anything.


Probably a thousand max in France, extrapolating from the ones I know, if you're talking defense/aeronautics (but I don't know or see everyone). Someone from AdaCore would give a better estimate.

We (an embedded company) don't hire specifically for Ada experience (it's welcome but not mandatory); we just train people and then focus on shipping stuff.


Recent work has introduced lifetimes and an ownership model into SPARK (the reduced, easier-to-prove Ada subset): https://blog.adacore.com/using-pointers-in-spark and hopefully it'll trickle down to Ada soon.

Edit: there's also reference counting and controlled types of course. And the secondary stack makes many uses of heap allocation go away.


That secondary stack is cool! I want it for my C++ programs, particularly std::string, which always heap-allocates.

    function Get_Answer return String is
    begin
       return "Forty Two";
    end Get_Answer;


Yes, the secondary stack is the thing that makes many (most?) heap allocations nonexistent in Ada. So when you talk about memory safety to an Ada dev as if it were the biggest pain, they might look strangely at you.

Yes indeed, heap-allocated pointers are a PITA, but we don't use that many of them. In 16 years as a kinda-embedded dev on multi-MLOC codebases, I can count on my fingers the times I had to reach for pointers (and Unchecked_Deallocation) and didn't have a safer alternative (usually Controlled scoped pointer types, but also containers, indefinite types, and, if really stuck, reference-counted things if you really must muddy the scope - but I'd flag the last one as 'need to show why you can't just copy the data around' at code review...).

Not saying it isn't missing, but I wish we spent more energy on automated proof of absence of runtime errors, large-scale whole-system static analysis, and even more stringent runtime or compilation checks. And better tooling/language support around networking, threading, and distributed processing, plus improved test and coverage tools.


You're returning a string whose size might be known at compile time through interprocedural analysis. Too easy.

The secondary stack is better used for functions returning objects whose size you don't know at call time.

   function Any_Number (A : Natural) return String is
   begin
      if A = 42 then
         return "A = Quarante Deux";
      end if;
      return "A =" & A'Image;
   end Any_Number;
Or for those allergic to begin/end

   function Any_Number (A : Natural) return String is (if A = 42 then "A = Quarante Deux" else "A =" & A'Image);


It does have Rust-like borrowing in the SPARK subset of the language.

There was an attempt to have it in the Ada 2022 standard, but it came late. The committee decided it was best not to rush, to let compiler vendors implement it how they think best, and to go from there.

>It’s safer than C, but I’m not quite sure where recent specs line up against C++.

Ada allows you to return objects of variable size, so it's not that common that you actually need to explicitly allocate stuff. The general guideline for dynamic memory allocation is:

1) Use the secondary stack (this is how Ada lets you return objects of variable size).
2) If you can't, use containers.
3) If you can't, use controlled types (RAII).
4) Only then use new.


Cool project for a proof of concept. That said, the kernel has so many unsafe features that making real-world modules that are safe requires a lot of boilerplate to get around those features. Sometimes it's impossible. Rust is only barely usable now, after tons of work to make it viable.


(2016)


Added. Thanks!



