Enlightenmentware (mmapped.blog)
444 points by zaik 8 months ago | 238 comments



I would say the compiler explorer[0] fits the definition perfectly. It may seem like a straightforward piece of software, but it has immensely changed the way people discuss and share knowledge around compilers and performance optimization.

I regularly feel the impact on the quality of forum discussions. There's a lot less speculation about whether "call X gets inlined" or "Y gets vectorized". Bold claims can be supported or disproven quickly by sharing a link. And then you have tools like llvm-mca[1] or uiCA[2], if you don't mind going into the weeds.

[0] https://godbolt.org/

[1] https://llvm.org/docs/CommandGuide/llvm-mca.html

[2] https://uica.uops.info/


Along those lines, the entire notion of a web playground (a sandbox where users can just write and execute or otherwise process code) has vastly lowered the barrier to checking out a project or experimenting with its behaviour.


> web playground

This _so_ much. Where in the past I've used Jupyter Notebooks for short, one-off stuff or to test something, I now do that online for almost any language.

Notebooks are still useful to write documentation though.


I was thinking about including Mathematica as enlightenmentware. Mathematica 6 (https://www.wolfram.com/mathematica/newin6/) was the first truly interactive system I used (we happened to have a box with a license at the university). It impressed me so much that I still have a lot of warm fuzzy feelings toward Stephen Wolfram and his work.

Unfortunately, my relationship with Mathematica didn’t go anywhere: It was too expensive back then, and I never found a good use for it except for double-checking my homework.

I tried other computer algebra systems, but they didn’t impress me as much.

If you own a Mathematica license and found a good application for it, please let me know!


Well, notebooks' main use case is a different purpose: not trying one-off stuff or checking whether some syntax is valid, but doing things step by step, annotating the steps and/or explaining each result.

Web playgrounds are ok for testing some syntax (if you don't have a local REPL/easy way to test), but not for one-off stuff that involves file input or that you want to check against real environment assets.


> Well, notebooks' main use case is a different purpose: not trying one-off stuff or checking whether some syntax is valid.

I know, but that's what I had used it for too. Like posting some code, for example in a HN post ;) As I've said, I still use it mainly for documentation.


I would dare say that for me, https://pythontutor.com/ fits even more so than Compiler Explorer. (hint: it's not just for python)


> (hint: it's not just for python)

I would definitely have overlooked it, and it really needs a better domain name.


Pahole, which reports the padding holes in struct layouts, is a related utility for high-performance programmers. I've been able to attain orders-of-magnitude performance improvements in trading applications using it.


> I couldn’t fathom why anyone would use Windows²

I saw this sentence, was about to type something in response, and then I expanded the footnote (side note: is it really a 'foot'note if it's not in the footer of the page?):

> I became significantly more tolerant since my early university years. Windows (specifically the NT family) is a great operating system. I even have it installed on my gaming pc so that I can buy games I never play.

It's pretty rare to see such a balanced perspective with respect to Windows when someone starts off with 'UNIX'.


Windows 2000 was great IMO, and peak Windows!

XP was peak if hardware compatibility is important.

Then it went downhill slowly: questionable UI decisions, telemetry, now needing an internet connection and an MS account to install, and now Win11 refusing to install on perfectly good but older hardware.

Microsoft cloudifying Office ironically makes going to Linux as a normie fairly easy, as Office is the only thing I would miss. And mainly due to its dominance rather than it being great.

Windows' dark side is a shame, as MS as a developer's company is really good. VS, VSCode, TypeScript, C#, and F# are awesome. And some changes to Windows are good, too.


Occasionally I have to help my father with his windows computer and each time it actually gets worse. Edge. Suggestions.

Like I am sure the actual fundamentals of the OS are fairly high quality but holy shit it sucks to use.


I had a similar, frustrating experience last weekend trying to copy data from a phone to a windows machine.

Windows is trying to hide all "technical" stuff so hard that it becomes impossible to do anything.


I put the peak at Windows 7.

Windows 7 UI is like Windows XP but prettier thanks to GPU acceleration. Compared to the XP generation, it had better security, 64-bit support out of the box, and it was an "internet age" version of Windows, but it could still run offline and wasn't too obnoxious with ads, telemetry, etc...

It definitely went downhill after that.


I mostly agree. But Windows 10 added virtual desktops[0]. Took them long enough!

Its dialog when copying files is nicer as well, IMO.

[0] Windows 7 supported up to 4 virtual desktops, but not out-of-the-box: https://learn.microsoft.com/en-us/sysinternals/downloads/des...


> Windows 7 UI is like Windows XP but prettier

Isn't 7 UI basically Vista?


Sorta, but Vista was slow and stuttery, so they don’t feel the same.


Windows 7 effectively cleaned up some details, and by the time of Windows 7, the vendors shipping broken drivers had already had to correct for the NT6 changes or deal with their hardware not working.


Is Windows 7 fast on Vista-era hardware?


> Windows 2000 was great IMO, and peak Windows!

> XP was peak if hardware compatibility is important.

It all started with Windows NT 4. It had its own kernel and was enterprise (and network) focused, compared to Windows 98 (which was Windows 95 (which was Windows 3.11 minus the DOS host dependency)) with bolted-on network capabilities.

Windows 2000 is literally "Windows NT 5.0", and Windows XP is "Windows NT 5.1". The win95 lineage was decisively killed by Windows Me, after which the NT kernel took over the entire product line.


> From that moment on, unix followed me through all stages of my life: the toddler phase of keeping up with the cutting-edge Ubuntu releases, the rebellious teens of compiling custom kernels for my Thinkpad T61p and emerging the @world on Gentoo, the maturity of returning to Ubuntu lts and delaying upgrades until the first dot one release, and to the overwhelmed parent stage of becoming a happy macOS user.

I started off as a Linux zealot and followed a very similar trajectory. I think it’s a sign of maturity to realize there is no absolute “best” in engineering, just a best solution in a particular problem space, and Windows is the best for a large number of users for a reason.


I started as a Linux enthusiast a long time ago; these days, in my own time, I use macOS and I don't miss Linux that much. As long as I'm in the terminal, I don't feel a difference.

At my daily job I'm forced to use Windows, and the only thing that's keeping me from changing jobs is WSL2. I'm just not productive with mouse-based tools; I need a terminal, and PowerShell doesn't do it for me. Everything feels alien and less usable to me even after years: fonts, window decorations, the file manager, UI inconsistencies between different tools. Everything seems slightly hostile and out of place.


Pedantry: it feels like false balance.

Windows is fine, that's it. I use it every day, but it's got so many weird quirks (that I can't do anything about, which is the difference with Linux) that it seems ridiculous to call it "great".

I'm in the other room from my Windows laptop. It's late, there's almost nothing running on it, the lid is closed, and surprise surprise, I can still hear the fans.


> I can still hear the fans.

I could not deal with that, I'm sorry. I'd have to turn it off.

The only way I could make Windows usable day after day in a previous job was to shut it down every night, and then I had the BIOS configured to start it up again and iterate through my extensive startup list before I got into work. Was quite effective like that.


My Ubuntu machine is also noisy. I’m fairly certain it has nothing to do with system activity.


Fans are probably either a rogue service you can find in the task manager or a hardware problem (maybe it needs some new thermal paste or a good air blow to remove dust).

That usually has nothing to do with the OS itself.


Counterpoint: I bought a System76 laptop last year. As it came, the CPU fan was never off for longer than a few seconds, even when there was no load at all. The fans are not very loud, but the coil whine just before re-enabling the fan was disturbing.

The motherboard's firmware is open, however, so I rebuilt it with a slightly adjusted "CPU fan curve." After flashing it, the fans now go online when there's an actual need, which is to say - rarely. The coil whine still happens, but hearing it once or twice a day is much less irritating than hearing it every 10 seconds.

So it's possible the problem is on the OS side (I think we can agree the firmware is part of the OS) and it's sometimes possible to fix the problem in software... as long as you have control over it.


And what about rogue services that come with the OS itself? Things like Windows Update, Windows Defender, the Phone app thingy, diagnostics policy service, the Xbox game bar or whatever it's called, the .NET optimization thingy and a dozen other things that like to wake up randomly and start consuming resources whenever they feel like it. Most of these things you can only disable temporarily, if at all, without resorting to dubious 3rd party tools.


> That usually has nothing to do with the OS itself.

More often than not, this mystical rogue service ends up being something internal to Windows.


Recent Windows releases are notable for not running software without the consent or wishes of the user :)


After 20 years of using Linux on the desktop (and FreeBSD, and NetBSD) in parallel with Windows, I gave up. I don't like to always configure things, I don't need 20 different ways to accomplish a task, and some of the software I use is not available on Linux. So I've been Windows-only on the desktop for the past four years. Of course, when I had to do something server-side, it was Linux only.

Recently I bought a MacBook Pro and the experience is very Windows like. I don't have to mess with the OS and it just works.


Stick to one Linux distribution and you can have the "one size fits all" experience you want. Who's forcing you to unixhop and constantly fiddle with your stuff? I'm on Debian and never have to change anything and my setup just works.


For me at least, it's personal discipline. If I can fiddle and change stuff, I will.


> It's pretty rare to see such a balanced perspective with respect to Windows when someone starts off with 'UNIX'.

I dunno. I've posted some pretty balanced opinions on OSes: I've frequently criticised Windows, Macs, Gnome, Plasma and more.

They each suck in their own specific ways. Most people acknowledge this.[1] Many people are just like me: we put up with the crap on each system in order to get work done.

[1] The exceptions are almost always Mac and Gnome users: try pointing out that UI can be objectively bad, and that the default Mac/Gnome experience fails on more than a few of the objective UI metrics, and you almost always get multiple Mac/Gnome users saying that UI is all subjective.


To some extent it is? After all, those users like that objectively bad UI that you are talking about. The fact that they're idiots doesn't make it any less subjective.


On the contrary, footnotes on a webpage belong in the margins. Putting them at the foot of the article in current year is like banging rocks together.


I'm not the Linux zealot I was as a kid, but I can never see myself going back to Windows. The particular niceties of a Unix environment are ones that I've come to rely on, and I could never go back to managing all my files and data through rat wrestling, the way Windows seems to want you to do.

That said, I can see the merits of Windows, especially for normies or video game players. It's just absolute friction town for me to use it.


The main thing keeping me on Windows is touch screen support. I know I'm a freak for wanting it, but we all have our ergonomic preferences.

Beyond that, I treat the OS as an appliance. Most of the software that I use is platform independent.


> (side note: is it really a 'foot'note if it's not in the footer of the page?)

These are rather sidenotes (or margin notes).


I think the peak of windows was when they introduced WSL, making windows the ultimate crossplatform dev OS.


Docker is one of these tools for me.

The unspeakable amount of time and headache it saved me during my consulting career puts a smile on my face.

Docker allows me to quickly iterate over the steps required to run ancient projects.

Not having to install 5 different relational database servers on my host OS alone is worth the learning curve.

Also crucial for running concurrent, reproducible Python environments on my laptop.


While it borrows heavily from Docker's UI, podman with user namespaces enabled is the completion of this idea for me.

No more sudo requirements, greatly reduced attack surface, isolation as a "user" concern. It sits happily in my UI doing exactly what I need it to do for all these use cases.


Imo, making a Dockerfile for an ancient project isn't always easy


But usually easier than making it build locally.


And more easily repeatable. Plus, it avoids polluting the full environment.


In comparison to what? Making a chroot env for such a project is way harder than dockerizing it. VirtualBox and Vagrant might not be much harder, but are slower to the point of being irritating. I might be missing some alternatives, but among the approaches I tested, Docker is still the easiest way to build and run unfamiliar projects.


Not to mention, Vagrant projects just... seem to stop working, eventually?

I've had this happen to me a few times, always for different reasons, always a huge time sink to debug.

Did anybody else have the same experience?


I can't really think of a good example that other people haven't mentioned, but I have an anti-enlightenment piece of software: the Spring Framework.

Spring actively hindered my ability to understand the simple concept that is dependency injection, and I know I'm not alone. Many a Java developer thinks that you need a big complicated framework to pass dependencies into your modules, instead of having your modules fetch them for themselves.

This isn't a criticism of spring per se, it's fine, it provides value for people, but I think it can lead people to build software that is more complicated and less portable than it needs to be.


I've had my short stint at using Spring. Often I was dropped into a project where it's already set up and working. When something breaks or I want to extend/modify what was working, I hit a wall in terms of discoverability: how it's been working all this time, and how to find a suitable level of documentation to help me. There are reams of documentation for Spring, but nothing of the kind that'll help me if I'm lost. So, from my perspective, it's a write-only framework; it's hard to reason back from it.


I spent a lot of time with Spring and became a bit of an expert on it; I'm able to debug and understand most issues that come up with it.

Still, I agree wholeheartedly: there is too much magic. It's very difficult for someone to come into a Spring codebase without a lot of background experience and understand how things tie together, and I don't think it's really time well spent acquiring that experience.


> Many a Java developer thinks that you need a big complicated framework to pass dependencies into your modules, instead of having your modules fetch them for themselves.

Not sure what you mean by "module" here, but DI means that the dependencies are defined externally to the actual business logic, which kinda contradicts this "fetch them for themselves".

I think the problem with Spring is that it has too many features, too many different ways to do the same thing, too much configurability. Historically, Spring jumped on most hypes, tried to basically support everything. We ended up with this complexity monster, but this also made it popular. Spring is the Oracle DB of the backend development in the sense that you won't ever get fired for choosing it, it's the safe choice which will support in some way everything you might need to do.


> Not sure what you mean by "module" here

In the context of Java, by "module", I really mean "class" or "object".

> but DI means that the dependencies are defined externally to the actual business logic, which kinda contradicts this "fetch them for themselves".

Yes, that is what I am saying, DI is an alternative to patterns like the singleton class.

For example, say you have a singleton for a database connection and a bunch of services that use it. The services have a direct reference to the singleton; they "fetch the db connection for themselves".

Compared to DI, a singleton makes it more difficult to mock the connection for tests, and running tests in parallel is also more difficult. If you get a requirement to make the app multi-tenant, refactoring the application to talk to multiple databases is more difficult.
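
To make the contrast concrete, here is a minimal, framework-free Java sketch; every name in it (Database, GlobalDatabase, UserService) is hypothetical and invented for illustration. The first service reaches for a global singleton itself, while the second has the connection handed in through its constructor, which is what makes swapping in a fake for a test trivial.

    import java.util.List;

    // All names below are hypothetical, for illustration only.
    interface Database {
        List<String> query(String sql);
    }

    // "Fetch it yourself" style: a global singleton the services reach for.
    final class GlobalDatabase {
        static Database INSTANCE; // global mutable state, awkward to swap out in tests
    }

    class UserServiceSingletonStyle {
        List<String> userNames() {
            return GlobalDatabase.INSTANCE.query("SELECT name FROM users");
        }
    }

    // Dependency-injection style: the connection is handed in from outside.
    class UserService {
        private final Database db;

        UserService(Database db) { // constructor injection, no framework required
            this.db = db;
        }

        List<String> userNames() {
            return db.query("SELECT name FROM users");
        }
    }

    // A test passes a fake without touching global state, so tests can run in parallel.
    class UserServiceTest {
        public static void main(String[] args) {
            Database fake = sql -> List.of("ada", "grace");
            UserService service = new UserService(fake);
            System.out.println(service.userNames()); // prints [ada, grace]
        }
    }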


I see what you mean now, and I agree.

I thought that you were hinting that DI can be done (esp. on a smaller scale) often manually without any complex container magic.

Like in a smaller app, construct all your service instances in the main method while passing the dependencies into constructors and voilà - you have a DI based application without a container / any framework.


> I thought that you were hinting that DI can be done (esp. on a smaller scale) often manually without any complex container magic.

I also agree with this, and at any scale.

Actually, I think the larger the codebase, the more it would benefit from manual DI; obviously at that point the wiring code grows and you would probably break it down into multiple methods.
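
A hedged sketch of what that hand-wired "main as the container" style can look like in Java; the class names (Notifier, ReminderService, etc.) are made up for illustration, not taken from any real project:

    import java.time.Clock;

    // Hypothetical names, for illustration only.
    interface Notifier {
        void send(String message);
    }

    class ConsoleNotifier implements Notifier {
        public void send(String message) { System.out.println(message); }
    }

    class ReminderService {
        private final Clock clock;
        private final Notifier notifier;

        // Dependencies arrive through the constructor; the class never looks them up itself.
        ReminderService(Clock clock, Notifier notifier) {
            this.clock = clock;
            this.notifier = notifier;
        }

        void remind(String what) {
            notifier.send("[" + clock.instant() + "] reminder: " + what);
        }
    }

    public class Main {
        public static void main(String[] args) {
            // The whole "container" is this method: build the object graph by hand.
            ReminderService reminders = new ReminderService(Clock.systemUTC(), new ConsoleNotifier());
            reminders.remind("water the plants");
            // A test would instead pass Clock.fixed(...) and an in-memory Notifier.
        }
    }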


I would dare to say Dependency Injection as a concept is unnecessary and creates more problems than it solves.


Dependency injection is a basic tool of writing robust, testable code. The alternative is strict hard-wiring of the dependencies, which deprives you of places your code can be tested.

But do not confuse "dependency injection" with "massive heavyweight opaque framework with a billion bells and whistles that breaks constantly". Dependency injection includes things like passing in a handle to the SQL database instead of it being a global variable, which your test suite uses to switch between various test instances of the database instead of the target code being hard coded to some variable, or even hard coded with its own connection credentials.

If you're not using dependency injection you are almost by definition using a lot of global variables. I'm as happy or happier than the next programmer to be a contrarian, but, no, the collective wisdom on those is dead on. They're deadly and should be avoided. Dependency injection is the biggest tool for that.

Not having dependency injection creates more problems than it solves.

However, I'm not sure that most "frameworks" for it aren't creating more problems than they solve. Probably one of the classic examples of lopsided accounting, in this case looking at the benefits but neglecting the costs. Anything looks good if you do that. But a lot of "frameworks" seem to bring along a suite of middling, sort of convenient benefits at the cost of massive opacity, unreliability, and the bad kind of magic. Not a good trade for a lot of programs.


Dependency injection adds a lot of complexity even without a framework, and it has nothing to do with global variables: you can use global variables with dependency injection, and you don't have to use dependency injection to avoid global variables.

Dependency injection has to do with coupling. It is a weak coupling technique, and the point is to make components interchangeable. That's also the reason why it is considered good for testability, because it makes mocking easier: just replace the functional component with a test component. It sounds nice, but of course it has downsides, lots of them.

First is that it is simply more code. Instead of calling B from A, you have to wrap B in an interface, instantiate it, pass it to A, and make the call. On small components, it can easily double the amount of code, or more. Of course, mechanically, more code means more development time, more things that can go wrong, more maintenance costs, etc...

You also lose track of what happens. When you call B from A, to see what is happening, it is obvious just by reading the code of A that you should look at B. Using dependency injection, there is no way to know what is called just by looking at A; you need to figure out what is instantiated first, which is not always an easy task.

Of course, there is performance too. Dispatching is expensive, typically requiring two or more memory reads to call a function. And all these little objects can take up valuable space in L1 cache.

Now, there is the argument of testability. OK, testability is good, but in many cases there are ways of testing things even with a strongly coupled architecture. For instance, if you have a configuration file reader, you don't need to abstract it to write tests; just have test configuration files that are read by the reader. It saves you an abstraction and a bunch of dedicated test code.

Dependency injection has a place. Unix file descriptors are a kind of dependency injection and very few complain about them. It is just that it is an expensive pattern that should be used wisely and not all over the place.


> Instead of calling B from A, you have to wrap B in an interface, instantiate it, pass it to A, and make the call.

There is a place for interfaces, but not all dependencies need to be interfaces. The database handle mentioned in the previous comment is a good example of where dependency injection can be useful without the need for interfaces. In your configuration example, a string path to the configuration file might be the useful dependency.
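
A small, hedged illustration of that last point (the names are hypothetical): the injected dependency is just a java.nio.file.Path, no interface in sight, and the test injects a temporary file rather than mocking anything.

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.util.Properties;

    // Hypothetical config reader: the Path is the injected dependency, no interface needed.
    class ConfigReader {
        private final Path file;

        ConfigReader(Path file) {
            this.file = file;
        }

        Properties read() throws IOException {
            Properties props = new Properties();
            try (var reader = Files.newBufferedReader(file)) {
                props.load(reader);
            }
            return props;
        }
    }

    class ConfigReaderTest {
        public static void main(String[] args) throws IOException {
            // The "test double" is just a real temporary file with known contents.
            Path tmp = Files.createTempFile("test-config", ".properties");
            Files.writeString(tmp, "greeting=hello\n");
            Properties props = new ConfigReader(tmp).read();
            System.out.println(props.getProperty("greeting")); // prints hello
        }
    }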


One might about as sensibly say the same of functions being able to take arguments. If this is meant to illustrate the damage working with Spring does to understanding the concept, then it's an excellent, if mildly horrifying, illustration.


One elementary need DI (or perhaps IoC more generally) provides is the ability to mock certain parts of your application for automated tests. While I'm not a fan of mocking too much and prefer higher-level tests (integration/component), mocking is still quite often needed. Is there some alternative to IoC/DI?


Effect systems spring to mind but they're rather esoteric (in the Java world).


I'm still very much pro DI as a concept.

I don't think DI itself really causes any problems; the solutions designed to save you from a little bit of boilerplate code cause the problems.


Are you pro DI as in "Dependency Injection", or pro DI as in "Dependency Inversion Principle"?

DIP is a good way to build software. When injecting dependencies becomes so complex that you need a framework, or need a separate concept of DI (sans P), then I think something has gone wrong; incidental complexity has won.


I mean the original statement from the perspective that only certain languages/environments (Java, etc.) propose DI as a solution. E.g. in my current language of choice, C++, DI is nowhere to be found.


> in my current language of choice, C++, DI is nowhere to be found

STL, for example when passing explicit allocators. You can even call any higher order function using dependency injection.

And of course there are C++ codebases that look like Java - the pattern book works with C++ too.


C++ has constructors doesn't it?


And higher order functions.


Totally agree. Spring and actually everything in that realm is plain horrible. DI is awesome on its own (especially if you do hexagonal architecture), but Spring hides behaviour and brings in undocumented changes on updates and things like that (same for any Spring-related project, and Hibernate). That's my biggest problem with it.


Magit, the git client for emacs, fits the bill perfectly. It is a masterclass in simplicity, effectiveness, and discoverability. It is one of those rare tools that makes you better at the underlying tool it abstracts over; instead of introducing its own jargon and workflow it exposes git's capabilities better than git does.


I'm sure Magit is lovely. But I can't resist sharing the story of my bad experience with it over a decade ago (which has left me scared away).

It was maybe early 2012, and I was excited to try Magit. I got it set up, and called 'M-x magit-init' from a source file I was editing. My understanding was that this would create a new git repo in that source file's directory, ending up with something like "/home/beautron/myproject/.git".

But something else happened. The git repo was put here instead: "/home/beautron/myproject/~/myproject/.git". Note the peculiar "~" directory, created inside the project directory.

Huh. Weird. Well, let's get rid of this mess and try again. I went to the project directory in my bash shell, typed "rm -r ~", and hit enter. Somewhere between my mind firing the signal to hit enter, and enter actually being hit, I realized with horror what this command would do. But it was too late to cancel the brain signal.

I didn't lose everything, because I had not typed something worse like "rm -rf ~", and somewhere in my home directory tree was a read-only file. So the command only deleted so far as that file, and then paused to ask for confirmation.

I estimated I lost about half of everything (the first half of the alphabet was gone from my home directory). The most frustrating thing was not even being sure what all I had lost. On the plus side, this experience improved my regimen around backups.

As I was trying to salvage the wreck of my system, I had a separate laptop out on the side, where I was trying to get some help, or maybe just some sympathy, from the #archlinux irc channel on freenode. But the two people who responded on the channel were very snarky to me. I felt they thought I was clearly an idiot for having run that command.

The irc people refused to believe that Magit created the "~" directory. They were convinced I had done that myself, with some other series of stupid commands. (If you had to guess the source of the weird "~", who would you choose: the established Magit project, or the guy who just deleted half his home directory?)

But a short time later I was vindicated! From Magit's github issue 383, Feb 29, 2012:

> So if you're editing "~/Temp/foobar/boo.txt" and call "M-x magit-init" which defaults to "~/Temp/foobar", instead of creating a git repo in "/Users/jimeh/Temp/foobar" it creates it in "/Users/jimeh/Temp/foobar/~/Temp/foobar".

Source: https://github.com/magit/magit/issues/383

It was a long night (and I had to leave on a trip the next morning). Now it's a fun memory, perhaps with a number of lessons in it.


Sorry to hear that. Code has bugs that come and go over time. I hope you try Magit again some day. It is sometimes hard to remember to apply correct quoting in a shell, so if you're in Emacs, you can also use dired for dealing with such errors in less risky ways.


Do you have backups now?


>On the plus side, this experience improved my regimen around backups.


Doubt anyone will be surprised by this from me, but, Nix, 1000x. The amount of crazy stuff you can make work with Nix and Nixpkgs is nuts. This weekend someone pinged me wanting a static build of a Rust binary that had some gnarly bindings to C++ libraries. In under 100 lines of Nix, we have everything: static musl-based build, dynamic glibc build. Want an AppImage? `nix bundle` with an AppImage bundler. Want an OCI image? dockerTools.buildImage on top of your derivation. Throw it in GitHub actions using the Determinate Nix Installer action and you get automatic caching of the Nix store using GitHub actions cache; pretty useful since neither musl Rust nor static LLVM are cached in Hydra. Want to share your cache? Pipe a list of Nix store paths to Cachix, then they can pull it down, or add the Cachix GitHub action to automatically pull from and/or push to it for the CI build. So if anyone wanted to re-use your cache from GitHub Actions CI runs, they could, provided they trust you. You can even cross-compile with MinGW, or run it on macOS.

It's a hugely complex time sink, but my God, it's great. While I don't generally recommend people go down the NixOS rabbit hole unless they're convinced it's right for them, I definitely think Nix is worth having in your toolbelt; it's ridiculously versatile.


I had only a dim awareness that Nix was even a thing before a few weeks ago. Then somehow I decided that I needed to commit to it 100% and make NixOS my daily driver and make myself learn this. It's been a lot of fun. Like, Dwarf Fortress kinds of fun. The kind of fun where when I first looked at it I thought it was insane inscrutable nonsense, and now I kind of wonder what happened to me that it's kind of making sense now. The kind of fun where I keep telling myself I just want to make some tiny little thing work, but actually I find excuses to rabbit hole down a bunch of different pathways and find amazingness under every stone. The kind of fun where I know better than to try to count how many hours I've spent on this now.

Except unlike Dwarf Fortress, I feel like things are actually improving over time instead of shambling ever-onwards towards an inevitable downfall. So I guess maybe it's more like the kind of fun the very first time I installed Linux and didn't know my way around anything.

I'm surprised how much I enjoy customizing things now. I always thought of my desktops sort of like betta fish before -- like, I've taken care of them, but also known better than to get too attached. Eventually the reformat is gonna come, and I'm never gonna set things up QUITE like I had it before. That's definitely not true now. I could start from scratch and be up and running with my entire suite of applications, themes, add-ons and configurations in no time, because it's all just a git repo of nixfiles.


> I had only a dim awareness that Nix was even a thing before a few weeks ago. Then somehow I decided that I needed to commit to it 100% and make NixOS my daily driver and make myself learn this. It's been a lot of fun. Like, Dwarf Fortress kinds of fun. The kind of fun where when I first looked at it I thought it was insane inscrutable nonsense, and now I kind of wonder what happened to me that it's kind of making sense now. The kind of fun where I keep telling myself I just want to make some tiny little thing work, but actually I find excuses to rabbit hole down a bunch of different pathways and find amazingness under every stone. The kind of fun where I know better than to try to count how many hours I've spent on this now.

That's absolutely what diving in with NixOS was like for me. Wonderful description. :)

> I'm surprised how much I enjoy customizing things now. I always thought of my desktops sort of like betta fish before -- like, I've taken care of them, but also known better than to get too attached. Eventually the reformat is gonna come, and I'm never gonna set things up QUITE like I had it before. That's definitely not true now.

Yes. NixOS makes customization feel more worth it than other operating systems can! I first heard it expressed best in this video: https://www.youtube.com/watch?v=17-TRCpDizA


I see tons of people very happily using Nix, but also from what I've seen, Nix is one of the most opaque, hard-to-understand, inscrutable systems on the planet. The language is impossible to learn, and my limited experience has been that there are extremely few good guides to peering under the covers.

Nix is one of the most powerful & well-used bits of modern computing, hugely adopted and loved, but my limited understanding (and what's kept me broadly uninterested) is that it is on the wrong side of the war in which the Age of Reason is trying to overthrow the Age of Magic.


The language is not the hard thing, it's actually straightforward once you get it. It's the crazy amount of convention and the number of layers in the stack that make up the Nix ecosystem that takes so long to grok.


Honestly, I suspect a lot of what makes it opaque is the fact that it hinges a lot on functional paradigms; this definitely made it a lot harder for me to understand. The trouble is that now that I understand it, I'm not sure how they could architect this in a more scrutable way. This is the problem.

I think most of us who use and love Nix understand that it is too complicated, but it's too complicated because none of us can figure out how to make it simpler.


I'm also nix-curious and intimidated by the wall. I feel like things would be a lot easier to grok if Nix actually had a static type system or even something opt-in instead of "the object has whatever fields I happen to put in it, good luck lol"-typed.


> The language is impossible to learn, and my limited experience has been that there are extremely few good guides to peering under the covers.

Maybe my "Nix from the bottom up" page would help? It peers so far "under the covers" that I only introduce the Nix language very briefly near the end (as a saner alternative to everything else on that page!) http://www.chriswarbo.net/projects/nixos/bottom_up.html


Nix is one of the reasons open source wins in the long run. I would have never imagined that package dependencies could be installed and managed across programming language boundaries, kernel boundaries, shell boundaries, dotfile boundaries etc.

Given that computer complexity grows exponentially all the time, at some point everyone will be forced to use something like Nix.

I agree with the article about Emacs and Unix/Linux.


Functional Programming approaches usually win in the long run. This is a common thread in the article too, although not explicit.


I think you mean declarative. Whether something is FP or OOP usually doesn't matter too much (unless you go to the extremes of each end of the spectrum, a la AbstractBeanFactory/IO Functor State). Declarative solutions last longer by virtue of their declarativeness; the underlying implementation can evolve as long as the declarative layer is untouched.


Isn’t a class a declaration (of a class type)? Isn’t a function a declaration (of an algorithm)?

I’d assert that every programming language is declarative. Especially once you get enough written to constitute a DSL.


I think they mean declarative in the sense that you write down what you _want_ in a solution (prolog/datalog/terraform/etc) without specifying exactly how to compute it (day-to-day programming languages).


The definition of computation depends on the abstraction level. Languages span multiple abstraction levels. The point is that there’s no meaningful definition of “declarative language” as proven by the examples I’ve already given.

For example you can “compute” a roguelike in TypeScript types, but a type system is declarative!


If you are willing to go to extraordinary lengths, then yes, any programming language can be written in a declarative style, at least for some parts of a program. Some parts of a Java or PHP program can be written declaratively.

In the Nix language context, however, they mean declarative and functional, like Prolog.

There is also Guix which uses Scheme to declare dependencies, and Scheme is not a declarative language. Scheme is a functional language, but it can be easily written in a declarative style.


Everyone knows this, friend. It's still a useful distinction.


I'd say the distinction is more at the semantic level, i.e. denotational vs operational.


What's the right way to learn how to use NixOS?

There are so many different approaches to doing everyday development (oh, use Flakes, or just make it a default installed package) that I've found it tedious to do lots of things.


The No Boilerplate video is what got me started: https://youtu.be/CwfKlX3rA6E?si=hJZ_mm9vaeKI0w8V

It's just super nice having everything in a git-backed config, for multiple systems. Working with NixOS feels like I'm simultaneously working 10 years in the future and 10 years in the past.


> What's the right way to learn how to use NixOS?

Just commit. Make it your daily driver, engage the community for help, and learn whatever you need to learn to do things the idiomatic way. Then settle in and get comfy.

That said, I hear you about the paralyzing number of choices. Here are some okay choices you can fall back on when you get dizzy:

  - use flakes with direnv for all your projects
  - when in doubt, just write the damn package
    - need to use some Python library that depends on another Python library that's not in Nixpkgs? Package 'em. Package all the recursive dependencies, then package the library you wanted to use.
    - want some program that's not in Nixpkgs? package it
    - want to reuse a binary built for a traditional distro? package it. Look at examples in Nixpkgs of similar things (e.g., Discord, Steam)
  - seek community engagement on Discourse and Matrix, not elsewhere (not Reddit, not Discord, not Slack, not IRC)
This is the best way to learn NixOS, imo. But if it seems like too much, it's not the only way.


I found using Nix package manager on my current daily-driver OS was a great way to break the ice. After translating my dotfiles to Nix and figuring out my project-specific development workflow I had given myself a strong foundation for NixOS.

Jumping into the deep end and going straight to daily-driving NixOS, is certainly also a good option.


I don't think there's a right way to do it, you are correct in that learning NixOS is pretty tedious.

Re: flakes, my personal opinion is to use flakes. While Flakes are imperfect, they still provide a lot of functionality that Nix doesn't otherwise have. In my mind, it's like Nix's equivalent of "Go modules" or something like that. I do feel like people who do not like flakes make many valid points (the boilerplate, the fact that the top-level flake expression is a subset of Nix for some reason, etc.) but the argument isn't that those problems shouldn't be solved, it's that flakes are a sub-optimal design. Since they're so proliferated throughout the ecosystem though, it is quite unlikely that Nix or any prominent fork will outright drop flakes support any time in the near future. For better or worse, Flakes are part of the Nix ecosystem for the foreseeable future. In my opinion, one may as well take advantage of that.

If you haven't already, I'd get your feet wet with installing Nix on a non-NixOS machine first, and please feel free to ask questions about Nix in the NixOS Discourse "Help" section.

I have some recommendations:

1. https://github.com/nix-community/nix-direnv - Since Nix derivations usually wrap around other build systems, the entire derivation is recomputed when any file in it changes; using direnv, you can just get your normal dev tools upon cd'ing into your project directories. This gives you a lot of the benefits of Nix during local development, but with your normal stack, and without needing to globally install anything. Importantly, this works around the problem of needing to rebuild your project every time a file changes; Nix caching isn't granular enough (at least when wrapping around other build systems as it normally does.)

2. If you are trying to build something, chances are you can find inspiration in Nixpkgs. Are you curious how you might package a Bevy game? No problem: literally search "bevy" on the Nixpkgs GitHub repo and see what comes up. I found a derivation that does: https://github.com/NixOS/nixpkgs/blob/master/pkgs/games/jump...

3. If you use flakes, you should keep the flake "schema" handy. There are a lot of different kinds of flake outputs and there are different ways to specify the same thing, which is somewhat needlessly confusing; keeping the flake schema handy will make it easier to understand what Nix is looking for in a flake, which might make it easier to see what's going on (especially if it's obfuscated.) The most important takeaway here: A command like `nix run flake#attr` will try multiple different attributes. https://nixos.wiki/wiki/flakes#Flake_schema

4. Likewise, I really recommend reading up on what NixOS modules are. NixOS modules are the basis for configurations on NixOS, and having a clear understanding of what is even going on with them is a good idea. For example, you should understand the difference between the Nix language's `import` directive, and using the NixOS modules `imports` attribute to import other NixOS modules. Understanding how the configuration merge works saves a lot of headache, makes it easier to understand how people's configurations works, and also makes it easier to modularize your own NixOS configurations, too. https://nixos.wiki/wiki/NixOS_modules

Unfortunately though, there's just no way to make it "click", and I can't guarantee that it's worth all of the effort. For me, I felt it was, but yes, there's no one correct way to do it.

But please feel free to ask questions if anything seems confusing.


Thanks for the inspiration. I actually already have nixOS installed on another laptop, but lapsed back to my Ubuntu machine out of a bit of frustration. I'll try it again and see how far I get with these tips.


Point 4 is incredibly under-marketed. Almost all of the Nix/NixOS documentation focuses on the language-y and build-system-y parts of Nix, and the NixOS modules are usually not talked about. The terrible state of the docs doesn't help.


Knowing the 3 official manuals for the 3 most important projects in the core Nix ecosystem (the manuals for Nix, Nixpkgs, and NixOS, respectively), which together make up the core of the official docs, will save newbies a lot of trouble.

> the language-y and build-system-y parts of Nix

Language-y manual: https://nixos.org/manual/nix/stable/language/index.html

Build system-y stuff manual: https://nixos.org/manual/nixpkgs/stable/

> the NixOS modules

Using the NixOS module system in the sense of leveraging existing modules is the main topic of the NixOS manual: https://nixos.org/manual/nixos/stable/

Details about the module system mostly live in this section of it: https://nixos.org/manual/nixos/stable/#sec-writing-modules

The piece they don't tell you is that as a NixOS user, you generally want to look for a NixOS module that supports your application first. Only if you don't see one should you then directly/manually install a package into, e.g., environment.systemPackages.

In other words, search here first: https://search.nixos.org/options

And search here second: https://search.nixos.org/packages

The landing page that ties these reference docs together and also contains a lot more example-centric and tutorial content is nix dot dev, here: https://nix.dev/

Imo nix.dev is a great entrypoint and quickly getting better. In addition to providing a bit of a map to help you navigate the official docs, it includes the best official starting docs on flakes, and links to the highest-quality external tutorial and expository material about Nix.

Make a mental note about the 3 big reference manuals, and bookmark nix.dev, and you have everything you need to learn your way around the official docs.


Wait, this is a pretty good sell. I'm going to give this Nix thing a shot. All the other times it's posted, people talk about things I don't care about, like replicable builds and stuff.


My guess is that if you use it long enough for it to start being useful, you'll find that "replicable builds" solves a wider variety of problems than you initially thought it did.

At that point, the hard part becomes getting your co-workers to recognize that all of these little problems that they perceive as separate are actually just facets of the same huge nondeterminism problem.


I'm sure. It's just not a selling point for me right now.


Perhaps that's exactly the problem: 'reproducibility', 'determinism', 'functional'-- these are unfortunately too much 'insider language' to be great sells to most people who aren't already FP or Nix people... even when they're closely related to benefits those prospective newcomers might enjoy!

I'm glad you've identified a use case that appeals to you. Have fun!


This 100%. It's the gift that keeps on giving.


You need to give the first gift, however: time. And a lot of it.


Smalltalk, and as a particular case Pharo (https://pharo.org/), is an example of this for me. When I was in uni, a paper that I always came back to was Licklider's 1960 paper on human-computer symbiosis.

"[...] to enable men and computers to cooperate in making decisions and controlling complex situations without inflexible dependence on predetermined programs."

Experimenting with Smalltalk (and also with Clojure and Emacs) was one of the things that, to me, genuinely felt like that vision of programs as living, interactive, organic things, rather than the formulaic, static, and low-level programming I was used to learning. I think it's still such a shame that in daily jobs it's so difficult to convince people to try these technologies out, because it requires such a big shift in how people think about software.

https://worrydream.com/refs/Licklider_1960_-_Man-Computer_Sy...


> They are "round": they pack the most volume in the smallest surface area. unix surface area is tiny, but it unlocks much power. Emacs and Git are all over the place, but their core is small, sweet, and easy to appreciate.

I really like this 'round' concept, seems very precise. Maximum interface area / use cases (surface area) with minimum core volume.


Unfortunately, a sphere is characterized by exactly the opposite property. The article switches the intended meaning of surface and volume to get away with the metaphor, but I’m less than thrilled about the metaphor.


I am not defending this specific metaphor, as I am not sure it is really good in this case. But they are right about this specific property of spheres: they have the highest volume/surface ratio (i.e. that’s the way of minimising the surface area of the envelope for a given volume).


"encapsulate the most complexity with the smallest API" seems to fit the metaphor better.


In Ousterhout's "A philosophy of software design", this aspect is described similarly via "deep classes", as opposed to "shallow classes".


> I also must confess to a strong bias against the fashion for reusable code. To me, "re-editable code" is much, much better than an untouchable black box or toolkit.

I had never come across this particular thought by Knuth, but this hits home so hard.

It feels like most of our productivity is a function of how re-editable the code is in the codebase we operate in.

It’s intriguing to think about what makes code re-editable vs simply reusable.


Reusable = interface (UX or API) that is intuitive, simple, and powerful; implementation that is fast and reliable.

Re-editable = Implementation building blocks are also intuitive, simple, and powerful; and coupling is minimal (implies intuitive/simple/powerful) and not error-prone (i.e. missing links raise compile errors or fast, descriptive runtime errors).

You can have software that is useful but whose implementation is an obfuscated spaghetti mess. This is common for old but popular software, e.g. C++ compilers, Microsoft Word, and Firefox, which have gone through maintainers and bitrot but are continually improved and have bugs continually fixed. And you can have an application with great code quality but a terrible CLI or GUI, such as Git and ffmpeg (for those who disagree on Git, read https://jvns.ca/blog/2024/04/10/notes-on-git-error-messages/... my argument is that you’ve adapted to some design choices, such as the error messages, which could objectively be made more intuitive without sacrificing usability). Although I bet it’s more common that surface and implementation are of similar quality.

You can’t really have a bad library with a good implementation, because if the structures/functions/… in your implementation are reusable, that extends to the outermost ones that make up your API. But the larger the implementation, the harder it is to make a good API, so it might just be the case that the low-level design is good and it (gradually or not gradually) gets worse at the higher level.


> These tools capture our imagination, open new possibilities, and affect how we design our own systems.

Puppeteer/Playwright fits the bill for me. Learning how to scrape websites with those tools subtly changed how I approach lots of other programming tasks and opened many new possibilities in both personal and work projects.

The event model of the web platform has had a deep effect on how I generally think about systems.

Some people may throw tomatoes at me for this last one: Google Apps Script. The docs and API aren't great, but it has opened so many new possibilities for automation of things related to my personal and work Google accounts.

Re: capturing my imagination, physical computing with RPi/Arduino. I've also gotta admit that GenAI APIs have been an explosion of new possibilities for my line of work (technical writing)

Thank you to OP for creating a productive and inspiring topic (enlightenmentware) for us to discuss and share ideas around


Can you give some examples of what you've created with Google Apps Script?

I've used it a little, but it's one of those things I always spend an hour building up courage to attempt (because of the slow feedback loop, browser-based testing, confusion about different deployment types, permission models and extension types). In such ways it's the opposite of writing Emacs extensions, though I can see how it promises to give something similar in the end.


In a previous job I rigged up the submission of a Google Form to send out an FYI email to relevant stakeholders and to create a GitHub issue (glossing over the details of why this was useful)

Apps Script also has an API that essentially allows me to expose my spreadsheet data as a web service that returns the data as JSON

For a while I was doing daily journals in Google Docs. I used Apps Script to auto-populate the heading for each day, e.g. an H1 with the text "21 May 2024". I've done lots of little auto-population things like this


We used SVN at my first job in 2006 and I had the exact opposite experience with it. I never fully understood what I was doing, nothing made rational sense, merges were an absolute nightmare, and somehow I would always end up corrupting the repo and have to pull a nightly backup to get out of the broken state.

Git was a literal breath of fresh air in comparison. I fell in love hard and fast. Everything just made sense, even if our workflow using git am patches seems downright ancient these days. Friends at the time tried to sell me on hg, but I was in love.


I thought SVN was great, easy to use, and very intuitive; that is, until you had any merge conflicts.

At the time, it worked very well for our small team; I imagine it would work less well for large teams on a single codebase.

I miss having a commit serial number.


I used kdiff3 and could never understand people complaining so much about SVN merges. Now I use kdiff3 with Git and it's fine, too. What isn't fine (though occasionally improving) is Git's UI and mess of terminology and concepts.


It's been a long time since I used it, I don't remember what I was using for diffs, but I suspect it was just whatever the default built in diff support was.

It occurs to me that many people these days use git with GitHub exclusively, and have it configured to only allow commits via PR and only allow either squash or rebase merges; it's kinda SVN with extra steps.


You have to sacrifice the serial number to get a distributed system. Well worth it IMO. But if you really wanted to, you could tag every commit on master with the next number (should be easy to do with a hook).


I would add

Haskell - the rabbit hole to category theory and all the mad stuff Haskell people get into (compilers, proofs etc.)

Bitcoin - love or loathe it, technically it is a marvel. At the time Bitcoin came out I was musing on the same problem but never could figure out how to avoid double spends and would never have come up with blockchain!


Note that the blockchain itself is actually not novel; it was already a part of Git at the time of the Bitcoin white paper. The novel part, I believe, was the proof-of-work concept.
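
For anyone who hasn't seen it spelled out, here is a toy, hashcash-style proof-of-work sketch in Java (purely illustrative; real Bitcoin mining encodes the block data and difficulty differently): finding the nonce takes many hash attempts, while verifying it takes a single hash.

    import java.nio.charset.StandardCharsets;
    import java.security.MessageDigest;
    import java.util.HexFormat;

    // Toy proof of work: find a nonce so that SHA-256(data + nonce)
    // starts with a given number of zero hex digits.
    public class ToyProofOfWork {
        public static void main(String[] args) throws Exception {
            String data = "block payload";
            int difficulty = 5; // leading zero hex digits required
            String target = "0".repeat(difficulty);
            MessageDigest sha256 = MessageDigest.getInstance("SHA-256");

            for (long nonce = 0; ; nonce++) {
                byte[] hash = sha256.digest((data + nonce).getBytes(StandardCharsets.UTF_8));
                String hex = HexFormat.of().formatHex(hash);
                if (hex.startsWith(target)) {
                    // Producing this took roughly 16^5 attempts on average; checking it takes one hash.
                    System.out.println("nonce=" + nonce + " hash=" + hex);
                    break;
                }
            }
        }
    }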


Even proof of work was not novel; there were proposals for fighting email spam with similar techniques. Bitcoin's fortune is combining the right pieces at the right time and getting sufficient buy-in to become relevant and more difficult to ignore.


Yes, I take blockchain to mean proof of work (or proof in general) validating blocks such that you can track time to some extent, in a system that cannot be sure of time because it is distributed and the nodes are not trusted.


Haskell was one of my most important discoveries; it affected my thinking and approach to software engineering the most.

However, as I mentioned in the opening section, I deliberately removed programming languages from candidates for several reasons: 1. They received enough praise already. 2. My presentation would be very biased. 3. The article would be way too long.


Maybe one day the buck2 ecosystem will evolve to be that bazel replacement. It has a much smaller core. Right now it's lacking in ecosystem support, tooling, examples, and local build sandboxing (which could be fine if there were an easy-to-use local implementation of remote build that felt natural). Also, no Go support is sort of painful, and it's a bunch of work, as I understand it, to get something like rules_go to work for buck2.


> examples

In case people don't find them: the official ones are in `examples` (https://github.com/facebook/buck2/tree/main/examples), with language examples in `examples/with_prelude` (https://github.com/facebook/buck2/tree/main/examples/with_pr...).

There is a minimal Go example too: https://github.com/facebook/buck2/tree/main/examples/with_pr...

More "real world" examples are dtolnay's CXX for Rust and C++ interop: https://github.com/dtolnay/cxx

and my own ones (sorry to link these again, but I do not know of any others):

C++ with Vcpkg: https://github.com/Release-Candidate/Cxx-Buck2-vcpkg-Example...

C++ with Conan: https://github.com/Release-Candidate/Cxx-Buck2-Conan-Example...

OCaml: https://github.com/Release-Candidate/OCaml-Buck-2-Examples


Buck2 is really enticing. If I had too much time on my hands, I'd love to try to make a full bootable Linux distro as a monorepo built with buck2. I do see some sort of convergence between package managers and build tools; nix and buck2 do seem to approach a similar problem from different angles.


It looks like Go is starting to get better support in buck2 - see https://github.com/facebook/buck2/issues/455


And a remark: the "real" problem when using Buck 2 is the interface to the LSP, as most LSPs only work with the "native" project configuration. For C++ generating a `compile_commands.json` is quite easy (see my C++ examples in the other post), not the least because there is no single standard for a project's configuration.


> But occasionally, we discover a piece of software that transcends mere utility. These tools capture our imagination, open new possibilities, and affect how we design our own systems.

For me, it was DEBUG.EXE on MS-DOS.

This humble debugger allowed me to peek into interrupt vector tables, inspect the content of ROM, learn how MS-DOS boots from scratch, etc.

I fondly remember the days when armed with an assembler, some knowledge of the CPU and the computer architecture, we could plunge into the depths of the system, unravelling its intricacies to our heart's content.


My first programming forays were with the assembler feature of DEBUG.EXE, on an 8088. Extraordinarily cumbersome in retrospect, but it was what existed at the time. Definitely a gateway experience.


Coming from C++ and Python, Go's packaging + deployment tooling really enlightened me. It's SO EASY to depend on things and deploy my apps, I love it!

I've also heard Rust gives folks similar warm fuzzy feelings in regards to building and deploying, one day I'll try that too


I felt that way about node, and yet node led to an explosion of poorly written and designed packages and constant notifications about needing to upgrade project X because it depended on Y which depends on Z and Z has some DoS issue if you pass the wrong regex to it.

I don't feel confident that Rust won't go the same way. When I tried to build the rust docs (https://github.com/rust-lang/docs.rs):

    cargo build
    Downloaded 541 crates (55.2 MB)
Seriously? 541 crates for a static site generator for docs?

Rust is clearly off to copy npm in all of its faults. I have no idea if Go is similar in the explosion of dependencies and supply-chain attack surface.


> for a static site generator for docs?

docs.rs has a lot more to do than just that. But also, actually building those static pages takes a lot. To do so, it has to actually build every crate, sandboxed, of course. This makes it closer to "universal CI for the entire ecosystem" than "generate a few html pages."

If you look at the dependencies, they're pretty normal for a website that does this kind of thing. It's roughly 80 dependencies, then 11 used for development only, and a couple more that are build-time only. The larger number is the sum of all transitive dependencies.
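
To make the direct-versus-transitive distinction concrete, here is a toy illustration in Python (the crate names are invented); the headline number is the size of the transitive closure, not the count of direct dependencies:

    # Toy dependency graph: each crate maps to its direct dependencies.
    deps = {
        "docs-site": ["web-server", "templates"],
        "web-server": ["async-runtime", "http"],
        "templates": ["http"],
        "async-runtime": [],
        "http": [],
    }

    def transitive(crate):
        seen, stack = set(), list(deps[crate])
        while stack:
            d = stack.pop()
            if d not in seen:
                seen.add(d)
                stack.extend(deps[d])
        return seen

    print(len(deps["docs-site"]))        # 2 direct dependencies
    print(len(transitive("docs-site")))  # 4 once transitive ones are counted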


> Seriously? 541 crates for a static site generator for docs? Rust is clearly off to copy npm in all of its faults. I have no idea if Go is similar in the explosion of dependencies and supply-chain attack surface.

In Rust, it is a design choice. They try to keep the standard library small and let the community create competing packages. You see the result in those numbers.

It is hard to judge based on those numbers alone.


The philosophy does not really matter, though. Any one of these dependencies could be a vector for a supply-chain attack, and all these libraries being updated independently and asynchronously is just asking for two dependencies requiring incompatible versions of something else. We've seen this happen already, and it usually ends up one of two ways:

- the node approach: who cares? YOLO!

- the particular circle of hell that is Python and its 25 ways of doing virtual environments. Wait, 26, yet another one just dropped.

For all its faults (and there are some), a centralised approach like with Boost has some merit.


Rust, the language itself depends on 220 packages: https://github.com/rust-lang/rust/blob/e8753914580fb42554a79...

If you trust nobody, it is hard to use anything.

But about your second note, (environment, mismatched dependencies), I would argue that Rust provides the best tooling to solve or identify issues on that area.


Rust depending on 220 packages is somewhat understandable.

After all, it runs on many platforms. I counted 16 Windows packages just by glancing over the list, and there are many macOS-related ones too.

But 541 for docs?

Surely there's a gradient between trusting no one and trusting 541 packages to generate static files.


Are you confusing `docs.rs` with `cargo doc`?

It is indeed many packages, but if you look into the dependencies and code, docs.rs is a full-blown standalone async HTTP server which uses Tokio, AWS S3, and PostgreSQL. It is used to host docs.rs, where the documentation of every published crate lives.

Maybe they should feature-gate some functionality and also split the dependencies

The static site generation itself mostly lives in the Rust repository, in rustdoc: https://github.com/rust-lang/rust/tree/master/src/librustdoc


That's the same argument node people make. See how well it's worked out.


NPM's got bigger problems than just having lots of small libraries, like for instance allowing circular dependency relations in packages


Emacs, Squeak, and Genera (Lisp machine OS) all qualify for me. It's no surprise that all of these examples are what I call "pervasively programmable": you can not only extend them with code, but examine and modify the running system by typing code into it, shaping the system to your needs as it runs.

Another pervasively programmable piece of enlightenmentware is Ashton-Tate Framework. It wasn't just a spreadsheet but a piece of "integrated software" (the 80s term for office suite), sporting spreadsheet, word processing, database, graphing, and serial communications capabilities, all under a unified desktop-GUI-like interface and programmable with the Lisp-like FRED language. If there were ever an "Emacs for business", it'd be Framework. It's pretty amazing that a program that powerful existed on 1980s PC hardware.


One day, I'm working on a Linux machine with my Emacs open, using Bazel to clean today's to-do-list project. I open the browser and find a person who wrote a blog post about Boost.Graph, which I had never heard of but am really interested to look at. I finish this writing, save the buffer, and hit =C-c g= to launch Magit, write the commit message "good day", and push to my git repo.


Docker would be on the list for me - for reproducible environments. Probably JUnit as it was my first real testing framework - for being able to use test driven development for hard problems.

With programming, I had so many "aha"-moments, it's hard to remember them. It's not all about software, but more about understanding the concepts and being able to transfer this knowledge. Being able to pass functions or function pointers. Streaming / piping data instead of a fixed data structure. Interpreted vs compiled languages. How everything we do here only happens through a long list of 0s and 1s and how this clever setup makes us even see graphics on the screen. Or hear audio through a screen reader....
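
Two of those moments in miniature, sketched in Python purely for illustration: passing a function as a value, and streaming data through a pipeline of generators instead of materializing a fixed data structure.

    # Passing a function as a value: the "what to do" travels separately from the data.
    def apply_twice(f, x):
        return f(f(x))

    print(apply_twice(lambda n: n + 3, 10))  # 16

    # Streaming/piping: each stage is a generator, so items flow through one at a time
    # instead of living in one big fixed data structure.
    def only_even(xs):
        return (x for x in xs if x % 2 == 0)

    def squared(xs):
        return (x * x for x in xs)

    pipeline = squared(only_even(range(1_000_000)))
    print(next(pipeline), next(pipeline))  # 0 4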


Happy to see emacs on the list, changed my life.

Enlightenment hardware: the Kinesis Advantage 2. It has many non-obvious, non-ergonomic benefits, like adapting the hardware to _my way_ of working/thinking.

Certain games also feel like enlightenmentware.


What games would you classify as enlightenmentware?


dwarf fortress

ultima-likes

minecraft

I’m not much of a gamer, but those definitely made a pretty big impact on me as far as what one or a small group of people can accomplish


The Witness


> ClearCase is confusing like a Russian novel: All the characters have strange names, the plot is complex, and it doesn’t end well.

Well done.

> Git removed the friction from using version control; there was no excuse not to version anything of value anymore. Merging branches with Git didn’t cause anxiety disorders. The staging area—confusingly named index—became essential to my workflows. But my favorite feature was the breathtaking beauty of Git’s design, the elegant mix of distributed systems, acyclic graphs, and content-addressed storage.

I love seeing glowing reviews of Git.


Many moons ago, before git was released, I put SVN in front of ClearCase and had people commit locally to that; SVN hooks trapped each commit, checked the file out of ClearCase, committed it, and checked it back in. The whole engineering group I worked in switched to using SVN and sped up, although per-user attribution was lost in the ClearCase log.


I have a couple of these to add as well:

VCVRack - simply one of the most mind-expanding things a synthesizer-nerd can play with. (https://vcvrack.com/)

ZynthianOS - another example of a simple software solution to a problem nobody realized existed, opening the door to an absolutely astonishing array of Audio processing tools (https://zynthian.org/)


I’ll mention Typescript.

It elegantly improves almost unimprovable mess of JavaScript. It makes JavaScript development much more productive and pleasant. And it’s surprisingly powerful.


I agree but it's a shame we need to put a bandaid over JS instead of having a properly typed language option for the web.

Now we start seeing some wasm being used but I still wouldn't use it for the whole project, so TS is the way for now.


If you think about it, the entire web infrastructure is bizarre and inefficient.

A sane design would be to have some kind of bytecode for web code and some compact format instead of HTML. The waste of delivering this data and then executing JS is enormous. And we need to pre-process and build websites and web apps before serving them anyway!


Losing decades of backwards compatibility and portability is probably not worth it


I heard that already over 10 years ago. Are we just going to keep at it because of sunk costs?

Also having a parallel system on the side would be fine. Just something more efficient and with types please!


I'll nominate another text editor for this subject: Sam. It's a graphical text editor, but it doesn't resemble any other graphical editor I've ever seen. I've been told it's "like ed on steroids" but I can't verify that statement as I've never used ed.

Sam made me reconsider the way I think about the mouse. In Sam, the mouse is integral to your workflow, but the way you use the mouse is unlike any other editor I've ever used. Brief strokes blended with keyboard stuff. Sam translates remarkably well to the trackpoint, even though it was designed a decade before laptops existed. Its command language is somehow as simple as sed, but (almost) as powerful as vim. It's really old software now, so it lacked some features I wanted (auto-indent, mainly). The source code is simple enough that I just added auto-indent myself!

I used Sam for the last two years of my undergrad. I wrote Python, C++, Nim, and Verilog using Sam. Sam shaped the way I think about computers in so many ways. At a glance it would seem idiosyncratic and weird. Having used it, I consider it to be wonderfully ergonomic and creative.


I believe Sam is a predecessor of Acme[1][2], which still attracts new users today. One of its notable users is Russ Cox, the tech lead of the Golang team. I haven’t used Acme yet, but I want to try it out one day.

If you’re interested in learning more about ed, I highly recommend “Ed Mastery” by Michael W Lucas [3]

[1] https://doc.cat-v.org/plan_9/4th_edition/papers/acme/ [2] https://www.youtube.com/watch?v=dP1xVpMPn8M [3] https://www.amazon.de/gp/product/B07BVBSDNZ


Speaking of Bazel, I wanted to try it out for a Java project, but it felt a bit more complex than expected.

Would you recommend using it even for single-language projects?


Bazel is probably at its simplest in a monolingual codebase. Toolchains have a lot of complexity.

It's like that Churchill quote about democracy: Bazel is the worst build system except for all those others.


Used bazel for years, now using pants[0] and really enjoying it as a tool that is good in the same ways, but better in some smaller ways.

0: https://www.pantsbuild.org/


While maven and gradle may lack the architectural purity of bazel, they work much in the same way.

Plan and execute steps, building a graph of the required build tasks, only executing the tasks that have changed inputs.

With that said, it's quite common to see maven/gradle builds that are misconfigured and execute tasks unnecessarily.


I would choose it for a C++-only project, but that's because the alternatives are so horrible.


Why not Gradle or Maven?


If you haven't already committed, consider Nix instead.


I'm still trying to understand why people recommend Nix in place of a build system. Nixpkgs stdlib by default expects an autotools project. It will happily integrate with other build systems, as long as you've spelled out your dependencies in both. I've yet to see it generate a Makefile or make any decisions about compilation that weren't spelled out in a "traditional" build system. Could you shed some light on what I've missed?


So.. it's sort of a battle over territory between build system and package manager.

Bazel is there becoming ever more complex and unwieldy in an attempt to provide supposed reproducibility - taking control of the provision of ever more of a project's dependencies (in often very janky ways). But to Nix people it's clear that what people are actually doing here is slowly building a linux/software distribution around their project, but in a very ad-hoc and unmaintainable way. And bazel projects will continue to grow in that direction because until you have control of the whole dependency stack (down to the kernel), you're going to struggle to get robust reproducibility.

I don't think many Nix people would suggest actually using Nix as the build system, but probably to use a comparatively simple cmake/meson/whatever build-system and use Nix to provide dependencies for it in a reproducible and manageable way.


You call the Bazel side janky and ad hoc, but to me (as a complete outsider) using a monorepo plus a build tool seems more principled and closer to the fundamentals, while Nix feels more ad hoc, trying to fix things after the fact.

> And bazel projects will continue to grow in that direction because until you have control of the whole dependency stack (down to the kernel), you're going to struggle to get robust reproducibility.

This is a bit of a weird statement, considering that it's not where Bazel is growing to, but where Bazel is growing from. The whole starting point for Bazel is having full control (via the monorepo) of the dependency stack.


> You call the Bazel side janky and ad hoc, but to me (as a complete outsider) using a monorepo plus a build tool seems more principled and closer to the fundamentals, while Nix feels more ad hoc, trying to fix things after the fact.

The Nix side is a maintained software distribution, which is a lot more than a bunch of random versions of tarballs pulled down from random urls, wrapped in minimal build scripts and forgotten about for years on end. It's also work that is shared across packages in the distribution and it produces consistent results that don't have dependency conflicts - if you have two bazel projects that each build against their own cpython, I can guarantee that they will have chosen different versions of cpython. Which one wins when they're used together? Who knows...

Every project building-out their own separate pseudo-linux-distribution cannot produce good results.

> The whole starting point for bazel is having full control (via monorepo) of the dependency stack

I'm not aware of a bazel project that builds its own glibc (I imagine there are some which people could point out...). But then.. do they ship that glibc with the end result? Or just shrug and hope it works fine on whatever glibc the target system happens to have?


I haven't worked at Google, but my understanding is that their monorepo does contain everything, including the kernel, libc, etc. So it's not a bunch of random tarballs; it's a complete, in-house-maintained source tree.

> But then.. do they ship that glibc with the end result? Or just shrug and hope it works fine on whatever glibc the target system happens to have?

That's the whole point of the monorepo: you don't have some random target system; it's all included in the same repository.


Thanks for the summary. I've been using Meson + Nix, so the comments about using Nix as a build system have been confusing. I think what I've been seeing though are "use Nix instead of Bazel", not "use Nix as your build system".


What I mean is use a relatively simple build system instead of Bazel, and deal with dependencies and reproducibility through a Nix development environment.


You lose out on some of the incremental compilation speed that Bazel offers doing this. I think many in the Bazel space suggest using Bazel inside of a Nix environment.


I'm not sure why you'd want to generate a Makefile if you're using Nix. Unlike make, Nix understands the inputs to a build step and won't bother rerunning it unless those inputs have changed. You would lose that if you generated a Makefile instead of having Nix build whatever it is that the Makefile builds.

Otherwise it does the same things as make: this bunch of commands depends on this other bunch of commands... It just makes you express that as a function so it can be smarter about memoization.

I've not used it for large complex builds, so maybe there's some itch it fails to scratch at finer granularity which I'm overlooking. I liked this article about where it shines and where it fails to be a build system: https://www.tweag.io/blog/2018-03-15-bazel-nix/. I've been waiting for the problem to arise that encourages me to learn Bazel so I can use it alongside Nix, and it just hasn't yet.


> I'm still trying to understand why people recommend Nix in place of a build system.

Probably because Nix is a build system. After using it for a decade, I dislike that it describes itself as a "purely functional package manager"; that causes all sorts of confusion, since it has far more in common with something like Make (e.g. see my "Nix from the bottom up" page http://www.chriswarbo.net/projects/nixos/bottom_up.html )

> Nixpkgs stdlib by default expects an autotools project

Ah, I see the confusion. Nixpkgs is not Nix; they are different things!

Nix is a build tool, similar to Make. It has some differences, like caching results by hash instead of by timestamp, but the main advantage is that its build recipes are composable (thanks to the FP nature of their definitions).

For example, say I run `make` in some project repo, like Firefox. Make will read that project's Makefile, which contains elaborate rules for how the various build products depend on each other. Yet despite all that care and attention, I get an error: `cc: command not found`. Oops, I don't have a C compiler! So I grab a copy of the GCC source, and what do I find inside? Another Makefile! The `cc` command required by the Firefox Make rules is itself defined with Make rules; but the Firefox Makefile can't refer to them, since Make is not composable.

In contrast, Nix is composable: Nix definitions can `import` other files, including from build outputs! For example, we can write a build recipe which imports its definition from a build output; where that build fetches a git commit; and the definitions inside import things from some other builds; and those download and extract a bunch of .tar.gz files; and so on.

Nixpkgs is the most obvious example of this composability, with mountains of build recipes, built up from a relatively small "bootstrap" (pre-built binaries for a few basic tools, like parts of GNU). It's also a testament to backwards compatibility, since it features build recipes (and helper functions) which act as wrappers around all sorts of legacy tools like Make, PIP, NPM, Cargo, Cabal, etc. (if you're working on a project that's stuck on such things).

Whilst Nixpkgs provides support for all of these things; Nix itself is only capable of invoking a single `exec` syscall (see "Nix from the bottom up"). Everything else is built up on that foundation, and isn't tied to any particular tool, language, framework, etc.

Hence it's not so much that Nix is a "package manager", or "orchestration" tool, or "configuration manager", etc. It's more like: those categories of tools are workarounds for crappy, non-composable build tools like Make. Nix is a composable build tool, so all of those other things turn out to be unnecessary.
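
To make the "cache by input hash, not by timestamp" point concrete, here is a deliberately tiny toy in Python (this is not how Nix is implemented, just the shape of the idea): a build step keyed by the hash of its full recipe runs at most once per distinct set of inputs.

    import hashlib, json

    store = {}  # content-addressed cache: recipe hash -> build output

    def build(recipe, run):
        """Run `run(recipe)` only if this exact recipe has never been built before."""
        key = hashlib.sha256(json.dumps(recipe, sort_keys=True).encode()).hexdigest()
        if key not in store:
            store[key] = run(recipe)  # the expensive step happens at most once per input set
        return store[key]             # same inputs -> same cached output; timestamps never matter

    out = build({"src": "hello.c", "cc": "gcc-13", "flags": ["-O2"]},
                run=lambda r: f"compiled {r['src']} with {r['cc']}")
    print(out)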


Shells, and how by scripting them you have programmatic access to the entire operating system.


In my case, Cycle.js (https://cycle.js.org) was very enlightening and pedagogic. It made me realize that software is always and only a matter of data transformation, and that those pure data transformations can be kept separate and decoupled from "side effects".


I want to thank wmii for being a vanguard force in illuminating the path before me, by being both a fine tiling window manager and one that shows its guts, exposed as a 9P filesystem.

Being able to craft super crappy shell scripts to monitor and manage my windows was amazing. It was a huge turn-on to feel like someone really communing with the computer at its deeper level, rather than just surfing above the application's crust. This feels like the real enlightenment goal: bringing forward the (ab-)natural philosophy that underlies each bit of software, rather than crafting facades of interface atop the core.

I have high hopes that general systems research can one day again spark an age of revelation & understanding, that we can form better more earnest symbiosis with machines. wmii was a good example of one way to let bonds grow close & strong.


lazygit (https://github.com/jesseduffield/lazygit) is enlightenmentware for me. It helps me navigate Git commands I forget all the time, like using the reflog to undo things, custom patches, or rebase --onto.

It makes working with Git a lot more fun, and I giggle like a little child whenever one of the weirder things works out again.


I find VSCode/Codium to be even better at that! And worst case, open up the terminal and do the job there.

Even merging, which was often an annoying endeavour, is quite smooth there.


I found Quicken to be enlightening. It took me two months to master my family's finances and budgeting, and I've never looked back. Learning its ins and outs, and why it is what it is and provides some features but not others, was a wonderful learning experience.


I worked with John Wiegley for a couple of years and discovered one of his projects, Ledger (https://ledger-cli.org/), during our conversations. This tool taught me double-entry accounting and helped me understand finance and blockchains on a deeper level.

Unfortunately, I’m too lazy to use the tool on a daily basis, so it’s the second most insightful piece of software I’ve never used :D
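
The invariant Ledger drills into you is small enough to sketch in a few lines of Python (toy account names, not Ledger's actual file format): every transaction is a set of postings that must sum to zero, so money only ever moves; it never appears or vanishes.

    from decimal import Decimal

    def check_transaction(postings):
        """Double-entry invariant: the postings of a single transaction must balance to zero."""
        total = sum(amount for _, amount in postings)
        if total != 0:
            raise ValueError(f"unbalanced transaction, off by {total}")

    check_transaction([
        ("Expenses:Groceries", Decimal("42.17")),
        ("Assets:Checking",    Decimal("-42.17")),  # the same money, leaving the other account
    ])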


> Git was nothing like Subversion. It had a steep learning curve and confused everyone to no end

I've actually found git much easier to teach to people who don't know subversion than to people who do. It's still a confusing mess, though. Why do you create a branch with `git checkout -b` rather than something like `git branch -c` (`-c` to checkout new branch)?

It looks like the `git switch` command helps a lot, but I never remember to try to use it as I'm used to the old ways, so I never teach it to new people either. I wish I could alias `git branch` and `git checkout` to remind me to use `git switch`, but you can't alias over a built-in command.


I started on mercurial, then git, and I think that was the happy path / easy on-ramp. Actually I still mostly prefer hg, but it has some downsides and the overall ecosystem really prefers git.


For teaching to newbies, please use the new commands! Much better to distinguish `git switch` from `git restore` than to use `git checkout` for all possible tasks.


My road to Emacs was similar to the author's. When I was learning programming everyone was using IDEs. It seemed like they were inseparable from the programming language. I remember thinking I couldn't learn C++ because I didn't have a C++ IDE. After my second or third language I started to think this was ridiculous. It's all just text! And these built in text editors are terrible! It took the best part of a decade for everyone else to realise and move to VSCode, a vastly inferior editor.


I hope the author will consider trying Doom Emacs. He mentioned knowing Vim and using VS Code, so with Evil and LSP support he will feel at home, and the configuration has great defaults!


I tried God Mode and Evil several times, but it never ended well. Mostly because of the muscle memory I accumulated over a decade.

Evil is fantastic, but it doesn’t play well with all the packages in the ecosystem. There are adapters for Magit, Compile, etc., but most packages define key bindings that require Ctrl-Meta chords, which causes cognitive dissonance.

God mode is a neat idea, but I never committed to it, and I found it confusing at times.

Overall, I decided to accept Emacs as it is, with all its quirks, and embrace its way of doing things. I’m not married to it, so I can occasionally cheat by switching to Vim or VS Code without feeling guilty.


I was never able to use Evil per se, but it works really well in Doom Emacs (and before that, in Spacemacs). Oh, and use Caps Lock as Ctrl to minimize pinky usage!


Evil offends my simple sensibilities. I prefer boon for modal editing due to its simplicity and how it fits with emacs so well.


> It turned a mundane job of fixing bugs into an exercise in skill.

Emacs does this for me. It's like a toy always there to play with when you're bored with mundane job tasks.


LaTeX.

I LaTeX, Git, and emacs every day.


Honestly, TeX/LaTeX the engine is a marvel of technology.

But every time I see a \makeatletter or get a runaway argument, it reinforces my belief that LaTeX the language was a mistake.


(La)TeX is an example of a very enlightened _idea_ that offed itself \footnotemark{} with a spectacularly cursed user interface. It is simply gross to write, and it's difficult for frontends, converters and GUIs to make it much better.

Yes yes, I can already hear the cultists chant "YoU dOn'T wRiTe In LaTeX" but this mentality is precisely the problem. If I can't write directly in your typesetting system nowadays, then I'm sorry, your system probably sucks.

You could unfortunately write an article or thesis quite comfortably in Word or even InDesign, while formatting as you go. (I say "unfortunately" because from a business-model and hacker's perspective, these tools suck.)

\footnotetext{not implying that LaTeX is dead, but referring to how it sentenced itself to the academic niche, in which case it might as well be dead…}


From what I’ve seen of LaTeX GUI applications, there’s no way we can avoid complexity. Most users will do OK with a basic word processor. We do not need a silver bullet for every use case. You select the best one and move on.


Boost::graph feels like one of the dustier corners of the Boost libraries. I have used it, it worked, but it took a long time to wrap my head around the design and actually adapt it to my project. It is not great for getting simple things done, but it will get them done, with the power and flexibility as stated in the article. You will likely never see the Boost interfaces poking through whatever facade you end up erecting around it.


I don’t know whether this is a provocative or tedious thing to say, but the quintessential ‘enlightenmentware’ to have come out of the past several years is ChatGPT. Name anything that brings as much functionality with so simple an interface and so elegant a core!

(It’s simply a pity that we can’t install it locally or tinker with the internals. :) )


> Name anything that brings as much functionality with so simple an interface and so elegant a core!

The most important thing of the last years has been LSP support! I can live perfectly fine without LLM autocompletion (although I did use Tabnine long before ChatGPT came out), but not without a LSP.


For me, Guix. The Guix system and package manager let you really work with your OS in a way that you just can't otherwise. It turns the great cluttered mess of files and packages lurking in the typical Linux system into a few clean version-controlled files. It is truly beautiful software on every level.


Common Lisp. Not just the language, but the entire runtime: repl, debugger, system loading, quicklisp. It just works.


blaze is one of mine too, specifically for the realisation that your build setup really feels rock solid when your entire dependency list is spelt out as an explicit DAG in a build file. inferring or otherwise auto discovering dependencies on the fly is seductive, but it always ends up letting me down when things get complex.


Great article. For me it's creative use of tools by others - sometimes myself - in a non-standard way (sometimes that becomes standard!) that brings enlightenment. Be it language, source control, networking, you name it.

Aside: assuming the author is reading, minor typo in the first para: But once in a w̶h̶i̶t̶e̶ while


> But once in a w̶h̶i̶t̶e̶ while

Thanks, fixed! It's “occasionally” now. Even Grammarly didn’t find it :(


NixOS fits the mold. The confidence it instills feels similar to when I moved from Windows to (Ubuntu) Linux.


Laravel. I had been developing web apps since I was 12 years old, but Laravel completely changed the speed at which I could deploy and maintain a production-ready app. And the ecosystem of plugins, add-ons and developer tooling is incredible.


> Although I’m writing these words in Visual Studio Code, I always have my Emacs open

He is writing the blog's text in Code while praising Emacs. Makes me a bit sceptical, or am I missing something?


You can learn a lot from Emacs about software development philosophy, but then move to something with more batteries included and less upkeep, like VSCode, as you move forward in life, especially once you lose the ability to tinker (mostly for lack of time, due to things like children).


Emacs is an operating system, VS Code is a code editor.


Streaming and transforming structured documents at scale used to require some awfully complex machinery such as Apache Camel, Kafka Connect, Flink, etc. I was so happy when I bumped into Benthos https://benthos.dev which can be used as a lightweight replacement in most cases. Bonus: It’s written in Golang, so I don’t have to bother with heavy dependencies and slow start times.


Yeah, one of the best programmers I've ever worked with would launch Epsilon (a commercial emacs style editor for various OSs) each morning and then do _all_ of his work from it.

The closest I come to that is missing emacs keyboard shortcuts when I'm not using a Mac.

I really wish that there were more programs which completely re-examined all aspects of various tasks _and_ incorporated scripting in a fashion which allows folks to take advantage of it.

Some of the apps I would consider if putting together such a list:

- LyX --- billed as a "What You See is What You Mean" Document Processor, v2.4 is looking to be quite promising...

- TeXshop/TeXstudio --- the former in particular is _very_ nice for folks who aren't able to devote the effort to learning emacs

- pyspread --- having a spreadsheet where every cell can contain a Python program or SVG graphic is _way_ cool --- I just wish it were as flexible as Lotus Improv/Quantrix Financial Modeler

- Solvespace --- I wish I could do better with 3D --- usually I fall back to OpenSCAD, esp. now that there's a Python-enabled version: https://pythonscad.org/ though I often use: https://github.com/derkork/openscad-graph-editor

- TikzEdt/IPE --- I really wish there was a nice graphical front-end for METAPOST/METAFONT (or that the graphical front-end for Asymptote was more flexible)

On the gripping hand, one has to give props to the Krita folks for making scripting a first-class citizen: https://scripting.krita.org/lessons/introduction


> LyX

During college, I time-tracked how long I spent on each homework for each class. I can confidently say that using LyX instead of LaTeX for my math assignments resulted in me finishing them 50% faster.

I think that most of the improvement was that the WYSIWYM reduced the cognitive load enough that I could write equation reductions inside the editor without having to write them out on paper first.

I highly, highly recommend LyX to anyone who needs to typeset math equations.


That also helps folks downstream --- when I did book composition, the cleanest LaTeX manuscript I ever worked on was done by an author who used LyX.


Have you seen TeXmacs (https://www.texmacs.org/)?


I haven’t used it before, but based on the website it also looks promising.


I’m barely a programmer, but I have been using computers for nearly four decades. Among the various tools that have, over the years, captured my imagination, opened new possibilities, and affected how I create things with computers, the current leading enlightenmentware by far is LLMs. Nearly every day I discover something surprising and useful that they can do for me.


Agda. Not as in something i'd use everyday, but using it definitely shaped the way i reason about programming and type systems


React is this for me. Before it, I was fumbling around with libraries like ExtJS for my first job, but after I started using it the concept of components that produce a view as a functional output of state really made so much sense to me.

It has given me so many powerful primitives to use while coding for the web
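
A framework-free toy of that idea in Python (React is obviously not Python; this only shows the "view is a pure function of state" part):

    def counter_view(state):
        # A component as a pure function: same state in, same markup out.
        return f"<button>Clicked {state['count']} times</button>"

    state = {"count": 0}
    print(counter_view(state))                      # <button>Clicked 0 times</button>
    state = {**state, "count": state["count"] + 1}  # state changes by producing a new value
    print(counter_view(state))                      # <button>Clicked 1 times</button>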


I would also say Redux. Even though I grew to dislike Redux, understanding the power of reducers was quite mind blowing.

And further I would throw in Tanstack/Redux query.


I really like the buku terminal bookmark manager. https://github.com/jarun/buku I like that I can just `man buku` when I don't understand something and I can actually find the answer I'm looking for.


Nice!

I used https://wiki.systemcrafters.net/emacs/org-roam/ for a while but switched to LogSeq (https://logseq.com/) because org-roam was buggy.

I like working with LogSeq, but even after a couple of years of using it, I’m not convinced by the Zettelkasten method. Maybe I’m doing it wrong!


Once in a while, we discover a piece of software that transcends mere utility. These tools capture our imagination, open new possibilities, and affect how we design our own systems. I call such software enlightenmentware.

In this article, I praise the software that contributed the most to my enlightenment.

What’s your enlightenmentware?


JAGS - allows you to specify a probabilistic model and sample from the posterior distribution


JAX


The only good tools I've ever used are typescript and tqdm (a python progress bar).

Wandb and React can get honorable mentions.


Vim, tmux and fzf

These three tools completely changed how I think about development


You have a typo. "earn to" when you probably meant "yearn to".


Fixed, thanks for reporting!


Closed tab at Bazel.


I quit my last job in no small part because of Bazel. I hated it so much. It tortured me.

I think Bazel is the kind of really complicated language that invites clever engineers to build incomprehensible balls of spaghetti. And the tooling and docs are really underinvested in.

But I got a new job, and to my surprise I've been doing Bazel all day. And I love it. I don't really know why.

All this to say, don't make a final judgement yet, there's something brilliant buried underneath that pile of rules and aspects.


Bazel confused the hell out of me at first, and I think the two-phase execution model (the “plan-execute pattern” as I called it) is to blame.
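
A toy illustration of that two-phase shape in Python (nothing like Bazel's actual API, just the pattern): the first phase only records a graph of actions, and only the second phase executes them.

    actions = {}  # target name -> {"deps": [...], "cmd": ...}

    def rule(name, deps, cmd):
        # Phase 1 (plan): declare what would need to happen; never execute anything here.
        actions[name] = {"deps": deps, "cmd": cmd}

    def execute(name, done=None):
        # Phase 2 (execute): walk the recorded graph, running each action exactly once.
        done = set() if done is None else done
        if name in done:
            return
        for dep in actions[name]["deps"]:
            execute(dep, done)
        print("running:", actions[name]["cmd"])
        done.add(name)

    rule("lib", deps=[], cmd="compile lib.c")
    rule("app", deps=["lib"], cmd="link app against lib")
    execute("app")  # prints "compile lib.c", then "link app against lib"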

My favorite thing about Bazel is how easy it is to get stuff done if somebody sets up the rules for you. Copy-paste a code snippet and fiddle with the dependency list until it works.

But as soon as you go deeper, you get overwhelmed with new concepts, and the documentation doesn’t explain them well enough. I think this huge spike in complexity makes people hate Bazel, especially if their colleagues force it on them, breaking the usual workflows.

I don’t love Bazel, but it’s the build system I hate the least. And it taught me a lot.


I work on a couple of sets of rules at Google. I enjoy it, though I agree people really can make balls of spaghetti with it. Everyone's macros are terrible except for mine. The configuration system is where things can get really out of hand: transitions, selects, flags, etc.


agree - what's more, the author is really talking about blaze, where everything "just works" because there are massive dedicated resources maintaining it. He literally admits that he likes the copy-pasta-no-think-about-it:

> Surprisingly, I didn’t need to fiddle with blaze, nor did I have to understand how it worked. I could copy some build targets and edit the dependency list, and the build worked as expected.

Sure, systems that just work are great, until they break.

bazel on the other hand, not so much. Heaven help you if it doesn't support your use-cases - you will wish you had Google to maintain your build.


Care to elaborate?


It is VS Code and ES6 for me.



