Levels of code in Forth programming (2002) (ultratechnology.com)
163 points by pointfree on July 29, 2019 | 51 comments



Intriguing article. I just updated some Forth code for interfacing with an ADC on a sensor I'm building. Dealing with SPI/I2C and sensors/ADCs in Forth really is fantastic. It results in much more succinct hardware code, IMHO, than C or even higher-level languages. Chuck Moore really seems spot on when it comes to dealing with specific hardware.

One example: a simple Forth word (macro) to convert 3 bytes into one 32-bit number integrates well into code for dealing with an ADC chip.
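
A minimal sketch of what such a word might look like (the word name, byte order, and usage are my own assumptions, not the commenter's actual code):

  \ hypothetical: combine three bytes (hi mid lo on the stack) into one cell
  : 3bytes>n ( hi mid lo -- n )
    rot 16 lshift        \ high byte into bits 16..23
    rot 8 lshift or      \ middle byte into bits 8..15, merged in
    or ;                 \ merge in the low byte
  \ usage: hex 12 34 56 3bytes>n . decimal   \ prints 123456 (while the base is hex)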

However, I wouldn't want to write whole applications in Forth, as dealing with stack swaps becomes annoying. Still, writing your own Forth is pretty fun too. I did mine by basing it on C-compiler X-macros, which made porting to an Itsy M4 trivial (about 3-4 of work) [1].

Still, there are a few Forths for Arduinos, Itsys, and ESPs [2], which are really fun as they enable REPL-style interactive programming with hardware while still being blindingly fast!

1: https://github.com/elcritch/forthwith 2: https://github.com/zeroflag/punyforth


Forth is pretty amazing. It really delivers on that old promise of “a language that will make you a better programmer in other languages”.

However, from my experience, since there is basically no syntax, all Forth programs tend to be a DSL for the problem at hand. It's almost like having to learn a new language on each new project. It's like the complete opposite of what makes Go great.


That is exactly what Chuck Moore wanted when inventing Forth.


I know. It’s a feature, but it makes it harder to share code (and Chuck Moore believes that sharing code is rarely worth it). Maybe he is right. His productivity is unbelievable :)


The more time I spend dealing with blowback from excess complexity being imported in the form of 3rd-party libraries that offer complicated solutions to simple problems, the more I think that Chuck Moore was very, very right on that point.


I agree with you, but I wonder what his answer to stuff like GUIs would be. There’s a tremendous amount of complexity and domain knowledge in stuff like drawing fonts, and in cryptography, and so forth — and very very few of us have the time to become competent in even one of those, let alone all of them. Then consider the amount of work necessary to have a modern browser: text parsing of not one but three languages, language interpretation, more graphics, more cryptography.

It would be awesome to get back to first principles, but modern systems try to do so much that I wonder how practical it would be to reinvent them — and I wonder how practical it is to say, 'well, don't do that then.'


I don't know what Moore would say. Personally, I've retreated to the back end - used to be full stack, but I'm just sick to death of how overcomplicated front-end work has become.

I'm inclined to say that, e.g., the modern browser is a cautionary tale that complements the Chuck Moore approach to things: By forever piling thing on top of thing in an evolutionary way, you end up with a system that ultimately feels more and more cobbled together, and less and less like it ever had any sort of an intelligent designer. Perhaps the lesson is that it can be worthwhile to occasionally stop, take a real look at what things you really do need, aggressively discard the ones you don't, and properly re-engineer and re-build the system.

Obviously there are issues of interoperating with the rest of the world to consider there, and Moore has made a career of scrupulously avoiding such encumbrances. But a nerd can dream.


Also consider that all we know is a world that has become more global, open, and relatively peaceful post-1970s. If collaboration were to slow or decline, open source would be harmed. And/or, if Google and Facebook lose their dynamism to politics, regulation, and maturity, corporate-sponsored open source could be shaken. Google could become like AT&T and Facebook like Ericsson, or something in some way.

Once-unstoppable sectors, like aerospace (to mix comparisons), began to reverse and decline in the early 70s. No one really saw it coming. I can't think of one publicly known or credible person who called it in 1969 shortly after the moon landing, at least on record. An oversupply of engineers in the US and the West became a thing, and engineering still suffers here because of aerospace's decline. Forth began to lose steam around then, right? Forth, hardware, and Cold War (barriers) politics are inextricably linked, perhaps. And then GNU/Linux and BSD saw their high-collaboration paradigm birthed around that time. The Nixon/Kissinger talks with a closed China began around then too, and now relations are breaking down with a more open China today.

Look how Lua scripting came about not so terribly long ago. Some parallels. Brazilian trade barriers. Now half believe Huawei is evil. The cross-hardware story may be cracking. Many believe Google is evil. Open software may be cracking. And there are rifts between the US, the EU, and China on how to regulate the internet. A new Cold War may be brewing. It's a nerd's nightmare.

If anyone can tie in distributed ledger and specialized AI coder productivity tools, or something to counter this argument or round it out, that would be awesome.

EDIT: I was mistaken. Forth caught on with personal computer hobbyists in the 1980s, per Wikipedia. However, as a career or industry, slowdowns in NASA and Cold War spending seemed to take some wind out of Forth's sails. I've noted that a lot of that type of work was what paid people to write Forth. And the open-source paradigm with C/C++ and GNU/Linux was even more limiting, I believe.


“I agree with you, but I wonder what his answer to stuff like GUIs would be.”

Couldn’t say exactly, but it’d probably look something like this:

https://en.wikipedia.org/wiki/Display_PostScript

:)


As far as I recall, Display PostScript was display only - what you really want is NeWS which used PostScript for display and for building applications:

https://en.wikipedia.org/wiki/NeWS


Potayto, pohtato… it’s all Polish to me. ;)


Ehh... the reductio of this argument is writing everything in assembler (libc? giant hunk of third party code right there). I surmise that, by comparison, the blowback you encountered was relatively minor.


No, not writing everything in assembler; this isn't about high or low level. It's about writing things yourself, for what you actually need.

Because most of the complexity comes from code (esp. libraries and drivers) trying to solve a larger problem than you actually have.

That's the same reason why, when you follow that logic, you eventually write your own Forth. Not because it's fun. Not because you want to learn about Forth or compilers. But because my Forth solves my problems the way I see fit, her Forth solves her problems the way she wants, and your Forth is going to solve yours the way you want.


It is entirely and completely about high level vs low level.

"High level" means details abstracted away and solved so you don't have to think about them. Our CPUs understand only the most primitive of instructions; the purpose of all software is to climb the ladder of abstraction, from a multiplication routine abstracting over repeated addition, to "Alexa, set an alarm for 8 AM." To write things yourself is the very essence of descending to a lower level.

Abstraction comes at the price of loss of fidelity, yes - Alexa might not ask you to specify exactly what form your alarm will take - but the benefits are a vastly increased power/effort ratio. It's worth it, because most of the time you don't care exactly how a task is done - you just care that it IS done. And - mostly - your needs are not that special.

Frankly, sharing information on how to do things so that others can build upon them is the only reason we have technology at all. Perhaps you've read "I, Pencil"? With a lifetime of effort and study, you would struggle to create a single pencil drawing from "scratch". Chuck Moore's supposedly astonishing productivity notwithstanding, I notice that all of the software I actually use is a heavily layered tower of abstraction (and, curiously, none of it is written by Chuck Moore). It appears that by and large the choice is between layered, multi-author code - and no code at all.

https://fee.org/resources/i-pencil/


> Chuck Moore's supposedly astonishing productivity notwithstanding, I notice that all of the software I actually use is a heavily layered tower of abstraction (and, curiously, none of it is written by Chuck Moore)

Perhaps you never saw the images from the Philae space probe? Because that's an RTX2010 that powers it, one of Chuck Moore's designs.

Maybe you don't use Moore's software directly, but you never know when it has been used for you [1].

[1] https://wiki.forth-ev.de/doku.php/events:ef2018:forth-in-tha...


There's a significant practical difference between importing the complexity at build time versus as part of the running application. Building on top of a compiler is not the same thing as importing external code.


Software today is developed by teams, not individuals. Systems custom-fit to an individual programmer are next to useless. You need libraries of common code in order to collaborate effectively without duplicating effort.

See also: Emacs, the ultimate customizer's editor, easily shapeable to your particular needs -- and currently losing badly to Visual Studio Code which is only readily customized with configuration options and third-party packages. When you need to pair or mob, having a common toolset and vocabulary beats having a special-snowflake environment.


At least we can get rid of the bloated web apps once everyone begins to do so... but would it be possible if libraries were written in a way that makes it easy to integrate just a portion into existing code?


The problem there is that most "libraries" are actually frameworks.

The difference I'm drawing being: libraries just provide a mess of utility functions. Theoretically, even if your compiler won't strip the library stuff you don't need, you'd be able to take just the bits you need by copy/pasting a relatively small volume of code. And dropping the library would be a small change that just requires finding replacements for the functions and classes you were using.

Frameworks tend to involve some Grand Unifying Abstraction that you need to inherit from, and that gets imposed on your own code. Things tend to be so tangled together at a conceptual level that it's not really possible to use them in an a la carte manner. Migrating off of a framework tends to require more-or-less a rewrite of all the code that interacts with it.

To take some Web examples: jQuery's more on the library side of things. D3 is more of a framework. React is very much a framework.


Wow, that got me thinking. What if specialized AI code recommenders could sniff out solutions? Get away from libraries with objects or structs with methods that mutate. As more people realize that composing functions (Forth has a concept of composing words, correct?) with fewer side effects is a good thing, I wonder if it's possible. There is some amount of my workflow where I'm looking at StackOverflow, my git project history or others', examples on blogs (at least when I was new), or my little code-snippet journal for stuff already solved. Automate getting idiomatic solutions from a StackOverflow or GitHub commits of sorts, or something. I know we are nowhere near that, but FB's Aroma and others have first-gen AI recommenders in the pipeline that do this at a high level. That way we are just dealing with code snippets. I've only read Forth code and introductions to it, but it seems to be all about composition. However, this is hard to conceive with today's coding forums and repos, because most are gluing mutating library APIs (turtles all the way down) together. So a code-recommender paradigm of this sort is chicken vs. egg.


In a sense, any API that has its own functions and data structures becomes like a DSL. For example, OpenGL feels like its own language, even when you write it in C, C++, or C#.


Moreover, in any nontrivial application, you have modules and layers of abstractions - and those boundaries are DSLs on their own. It's actually good to think of them as languages the client code will use to write its solutions in (SICP makes this point early in the book too).

People are getting too hung up on the word "language", like it was something only the most experienced and smartest of programmers were allowed to build. Nope, programmers build new languages daily in their code; it's how you abstract things.
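
A minimal Forth sketch of that 'building a language daily' idea (all word names here are invented for illustration):

  \ a tiny problem-domain vocabulary built from primitives
  : seconds ( n -- ms )  1000 * ;
  : minutes ( n -- ms )  60 * seconds ;
  : later   ( ms -- )    ." wait " . ." ms" cr ;   \ stub: just prints
  \ usage: 5 minutes later   \ reads like the problem domain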


I don't see the semantic difference between calling a subroutine or using a Forth word.

65 emit

vs

putchar(65);

vs

(print 'A')

It's all the same to me.


Are there any good but not too large codebases where someone new to Forth could get a feel of how it's done? I'm intrigued by building DSLs, but a university course a long time ago left a totally different impression.


"Over the Shoulder 1 - Text Preprocessing in Forth" https://www.youtube.com/watch?v=mvrE2ZGe-rs - Sam Falvo demonstrates writing a blog engine in Forth.


Before I learned Forth, I was very comfortable in Common Lisp. Now I miss the convenience when writing Forth and the raw simplicity when writing Lisp.

I realize the problem isn't Forth; some people (such as the writer, for example) are capable of pulling off amazing feats with such a primitive tool.

But as a result, the programming languages [0] I've designed since have all been part Forth and part Common Lisp.

[0] https://github.com/codr7/cidk


> Portability is not possible. Real applications are closely coupled to hardware. Change the platform and all the code changes

Some lessons simply did not age well... Even in the world of today's microcontrollers, which often have kilobytes of RAM and very different CPU styles, people still write mostly hardware-abstracted code.


It's the usual game of MPUs becoming cheaper, or more stuffed for the same price, because demand allows manufacturing bigger batches; and the demand is for more portability, by means of more stuffed MPUs that allow more abstractions...

But it's a miniature of the same tragedy as Node.js applications that consume ten times the resources needed, just because it allows people to do more with more... in a logarithmic way.


> Portability is not important. Portability is not possible. Real applications are closely coupled to hardware. Change the platform and all the code changes. If it didn't, you wouldn't have changed the platform.


Most code should be abstract, because the things that are specified are abstract. Some code needs to be closely coupled to hardware, but most code should only need to be closely coupled with other code.


This is a really interesting read, and as someone who’s almost exclusively programmed in high level languages, this approach seems alien to me.

A couple of questions:

1. Is it possible to write complex, modern applications (things like browsers, photo editors, etc. — things that would take millions of lines of Java or JS) using this style of programming?

2. What is “sourceless programming”? Where is a good place to learn more about it?


1. "This style" is kind of hard to pin down. If you mean Chuck Moore's dramatic minimalism, then, yes, but it won't resemble what most people in computing expect from a browser or photo editor. If you mean expressing the abstractions you want directly in the primitives you have without bothering about layers or even accepting the idea of higher vs lower levels, then, yes, certainly. It requires a lot of unlearning, though.

2. Sourceless programming was something Chuck Moore tried for a while where he designed a machine that was the virtual machine he wanted to program, implemented it in hardware, and then edited byte code directly for it. Later he stepped back and went to colorForth, which has the source/binary separation we are all accustomed to.


Thanks for the reply!

> but it won't resemble what most people in computing expect from a browser or photo editor.

In the sense that an end user would interact with some low level API primitives, rather than a full GUI? I’d love more examples or metaphors, or maybe a link where I could learn more.


Chuck Moore’s software is somewhat famous for reducing the complexity of both the software itself as well as the requirements. In developing a web browser, he would probably eliminate all of JavaScript, the user interface, and most of CSS, and leave it to run only on a chip he designed for the purpose. His ideology is super cool, but isn’t what is expected of software like this.

For instance, OKAD (a chip design and simulation CAD package) is 500 lines of colorForth. Although it includes all sorts of fancy tools, he also applied his ruthless minimalism to the requirements.


I would welcome all of these except maybe a separate chip. E.g., having 3 separate languages built into a browser does look like an overcomplication.


> 1. Is it possible to write complex, modern applications (things like browsers, photo editors, etc. — things that would take millions of lines of Java or JS) using this style of programming?

If it were a complex application, it wouldn't be very Forthy, would it? Factor the problem instead of trying to factor a preconceived solution.

> 2. What is “sourceless programming”? Where is a good place to learn more about it?

Sourceless programming was used in OKAD, a VLSI design tool written by Chuck Moore.

http://www.ultratechnology.com/mofe16.htm

Brad Nelson also experimented with sourceless forth:

https://docs.google.com/presentation/d/1wL2eqf7eHGEybsK0C4MU...

https://github.com/flagxor/bicol


>I was also researching AI in Forth, implementing ideas from LISP examples and doing expert systems and neural nets and mixing them and building robots. In the software I added a layer for an inference engine for English language descriptions of rule sets and a layer for the rules. I wrote a learning email report and conversation engine AI program and had it running for a few months. My boss could not distinguish it from me. That was my idea of AI, smart enough to do my job for me and get paid at my salary while I took a vacation.

Is the author exaggerating here, or did they actually succeed at writing something that could pass whatever Turing test level his boss could offer?

If it's the latter, what then-current knowledge would they likely have sourced?


He’s throwing shade on his former boss


I can't know what the author means, but what is known as chatbots now is very old tech, in fact:

https://en.m.wikipedia.org/wiki/ELIZA (1966)


Eons ago, as an embedded programmer, I came to respect Forth. I encountered numerous situations where using Forth led to a much smaller footprint (size in particular) -- why? For exactly the reasons that Chuck Moore espouses here: you are writing a purpose-built VM from the hardware up.

Even then I don't agree that portability/abstraction isn't important - it's got the potential to be an extremely reductionist position. Instead I'd argue it's incredibly expensive and should be treated as such.


"Instead I'd argue it's incredibly expensive and should be treated as such."

Using a cross-platform framework isn't incredibly expensive in time or performance cost. It's done by one-person projects and large businesses alike. There are issues, but that's way overstating it.

Then there were 4GLs like Lansa and Windev that made it easier to do than creating non-portable, native applications. Those weren't used for performance-sensitive code, though. Mostly business apps.


I wasn't being specific enough - I meant expensive in the most general sense, e.g. an abstraction is a cost not only in terms of (potential) performance, but also in developer headspace, etc.

The right number of abstractions is very powerful. Too many and you'll sink under the weight of them.


This is really captivating. I think I'm inspired to finally dust off my TI-Forth for the 99/4a.


Chuck Moore has always struck me as some kind of alien, not unlike the way stories about Von Neumann do - that this is a person who is, in his specialized field, capable of thinking in ways that I just can't, and achieving things that seem practically magical with it.


You might be right... but I have the impression that if you talked to him, he'd tell you that you'd get similar mileage by following his philosophy about program design. It's just that most people can't stomach it.


Does anybody know what Chuck Moore is doing right now? I was following his posts on the patent lawsuit for a while, but then he went quiet.


To sum up Chuck Moore's quotations: you write code that takes all of the machine, and you have to write all of the code. If you have this, you can squash out all abstraction and build the ideal solution directly.

This may hold true for small-scale hardware like controllers. They have a well-defined set of tasks, small enough to fit in your head.

This means that you have some trouble sharing the code with your colleagues, which pushes the bus factor of your project closer to 1 and lowers the usefulness of code reviews.

This means that you have trouble sharing code with yourself in your next project.

You become tightly coupled to the machine. This is, on one hand, liberating: you can do anything easily. But it is also limiting, because you spend your mental resources on optimizing for this particular machine.

I personally think that deep optimization is something the machine should do; machines are better than humans at this most of the time. And humans should do what machines currently can't.


My takeaway was that you're always writing against a machine. The difference is whether you're writing against a physical hardware machine or an abstract virtual machine. OSes and VMs abstract over existing machines and present a virtual interface for the sake of portability, but they're still themselves machines.

You're still coding against a machine, and you still have to spend time learning the vicissitudes of that machine. But you also have the impedance mismatch between the virtual machine you're developing against and the physical machines you intend the software to run on. All software abstractions are, to some extent, leaky.


> This may hold true for small-scale hardware like controllers. They have a well-defined set of tasks, small enough to fit in your head.

People usually insist that software should be modular, so that you don't have to have millions of lines of code in your head when making a local change. That's what drove the procedural evolution and later the OOP evolution.

So if you're a good boy/girl/etc., you write your million-LOC PC application as modules that are manageable for (ideally) a single person. Then you need an extra programmer to glue the modules together.

> This means that you have a certain trouble sharing the code with your colleagues, making the bus factor of your project closer to 1, and lowering the usefulness of code reviews.

Where does Forth prohibit peer reviews and pair programming? If you have a bus factor of 1, it is because you don't want to pay the price of increasing it. It has nothing to do with Forth; plenty of projects in super-high-level, hyper-readable languages have a bus factor of 1.

> This means that you have trouble sharing code with yourself in your next project.

Not really. It's easier to copy/paste/hack Forth code. The code is more compact for various reasons: point-free style makes it less verbose, you tend to factor more intensively, and you code exactly what you need.
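
A rough illustration of that point-free style (a toy word of my own, not from the thread):

  \ no named variables anywhere; data just flows on the stack
  : c>f ( celsius -- fahrenheit )  9 5 */ 32 + ;
  \ usage: 100 c>f .   \ prints 212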

> You become tightly coupled do the machine. This is, on one hand, liberating, you can do anything easily. But this is also limiting, because you spend your mental resources on optimizing for this particular machine.

No, it's the other way around. When you code, e.g., for a little-endian, two's-complement CPU, you don't have to worry about big-endian and sign-magnitude. You are actually optimizing the programmer's cycles too.

Being tightly coupled to the machine is what embedded programming really is about. Embedded programming is often about writing esoteric values at occult addresses in order to bang out bits on an SPI bus. Running a Python program on Debian on a rPI is not really embedded programming.
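
In Forth, that 'esoteric values at occult addresses' style looks roughly like this (a minimal sketch; the register addresses, bit positions, and word names are invented for illustration, not from any real chip's datasheet):

  hex
  4001300C constant SPI-TXD      \ pretend transmit-data register
  40013010 constant SPI-STATUS   \ pretend status register
  decimal
  : spi-ready? ( -- f )  SPI-STATUS @ 1 and ;                 \ poll a ready bit
  : spi-send   ( b -- )  begin spi-ready? until  SPI-TXD ! ;  \ busy-wait, then write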


It kinda sounds to me like Forth wants to be used in the context of embedded programming, then.


Not only. Check out Forth, Inc.'s projects from last year [1]. In one of them, Forth is used everywhere from microcontrollers to the monitoring PCs.

[1] https://wiki.forth-ev.de/doku.php/events:ef2018:forth-in-tha...



