A lot of the cultural phenomena around Rust can best be explained in religious terms. It's no accident that we are sometimes referred to as the "Rust Evangelism Strike Force," and the word "zealot" is frequently used, of course largely by critics.
There's a deeper level as well. The concept of "memory safety," a core principle of Rust, has, I believe, a lot in common with religious concepts of purity and cleanliness. Undefined behavior, by contrast, is an undesirable form of uncleanliness, emotionally similar to trayf in Jewish culture or haram in Muslim culture. Rituals include the `#![deny(unsafe_code)]` incantation, or using cargo-deny or similar tools to make sure uncleanliness hasn't crept in through dependencies.
It's not just Rust, although that's a nexus for a lot of tension and conflict around this issue. Tools for avoiding and mitigating unsafety are often given names evoking cleanliness: "Purify" and "sanitizers."
That'd explain some of the pushback around https://news.ycombinator.com/item?id=34557840 ; the people who think we all ought to be using the same languages and tools are akin to believers in proselytizing religions?
(don't mind me; you might find me upon occasion with the animists, worshipping at the altar of '*' and '&')
Once I saw the light of Rust, I understood that it is the one true programming language.
There are pieces of wisdom in low-level languages, scripting languages, and functional languages. But after seeing Rust, one knows that these are only fragments of something greater.
I am a sinner, though. Even after I saw Rust, I still write mostly in Python and TypeScript. Out of convenience, out of habit, by social pressure.
When you say that the advocacy for Rustlang can be "best explained in religious terms", how did you come to the conclusion that, when quantified, the other reasons for this behavior are less than the religious one?
Are you certain you are not assuming that Rustlang advocates are arguing in bad faith?
I recall that in the 90s, a man named Guy Kawasaki was Chief Evangelist at Apple, and he really connected with the community back when things were pretty rough for the company. I actually didn’t realize that "evangelist" had a religious connotation until many years later - I thought it just meant someone who is enthusiastic and pushes the product.
Do you ever feel that the rude, forceful, and automated bot-spam Rust evangelism has harmed the project? Lots of people view your activities as harmful, and you're EVERYWHERE commenting on EVERYTHING that doesn't use Rust. You're not mere evangelists - it's a toxic, corporate astroturf campaign that's taken over this site, Reddit, Slashdot; even 4chan is filled to the brim with your evangelism.
I would say, turn off the bots but I'm sure you won't.
I've only got a second, but feel the need to highlight the Technical Interview series[1] and the many fan-works that pay homage to it. I enjoyed the anthropomorphising of a Package Manager Murder Mystery[2], and the creative writing of Common Tech Jobs Described as Cabals of Mesoamerican Wizards[3].
The article mentions SICP aka the wizard book. In it, there are a lot of metaphors pertaining to magic and spirits, which IMO makes the book much better than if they weren't there. One of the authors of the book, Hal Abelson, famously said:
> There's a good part of Computer Science that's like magic. Unfortunately there's a bad part of Computer Science that's like religion.
This points to perhaps a useful distinction that is to be made here between religion (as in tribalism which leads to holy wars) and mysticism/spirituality/magic (as in deep fascination in search of ultimate truth).
There was quite a long period where he was believed to be fictional though, probably including when the OP was written. Which, if anything, makes listing Mel within the pantheon more appropriate.
One weird thing I noticed over time is the enshrinement of old and crappy technologies, in the face of all reason. Some examples:
For some reason I can't understand, some people have warm and fuzzy feelings about FTP, despite it being a horrible protocol in multiple ways. Yet for a long time it was often thought of as the "proper" way to download stuff, even though HTTP is technically better in many respects.
You still find plenty of people harping on about the Unix Philosophy, even though in modern times it's nigh irrelevant and significant parts of it are technically obsolete. Yeah, parsing text streams was sorta okay in the 80s, but it's very troublesome and brittle in the modern age.
And there's an odd number of people who are still mentally stuck on the Linux of the 90s, and who have trouble understanding that things have gotten more complicated since then, and that software has had to adapt to things like laptops and wifi.
I feel the need to be "that guy" because this is Hacker News, but the Unix philosophy isn't really about parsing text streams; it's about the decomposability of a problem. It's way easier to test and debug 4 simple programs feeding into one another like a flow chart than to build a mega program with insane internal state (assuming your problem can be decomposed into 4 subproblems).
The basic core revelation that the Unix folks had was "gee whiz, I seem to need to sort data/open a socket/make a directory super often, and every time I personally try to code one of these operations up, it's a huge headache that adds hours to dev time"
Everything has good and bad cases where it can be applied of course.
But I see this as a good example of a religious issue, because you can often find people who think it's a principle worth rigidly adhering to, regardless of whether it actually produces better results in a given case.
For instance, my experience is that such a model tends to break down when things get sufficiently complex. Tools like GPG and cdrecord are painful to interact with, and would arguably work much better as libraries, because a text stream is actually a pretty bad communication channel.
And sure, I agree that back when Unix was created the model made sense, but it's increasingly less and less relevant.
As the previous poster said, it’s not about text streaming, it’s about composability. One “modern” paradigm that follows the Unix philosophy and has gained a lot of traction is microservices, which are mostly viewed positively. You need to extract the idea from its original implementation to apply it today.
I, for one, would very much prefer GPG to be decomposed to follow the unix philosophy - I think I would have a much easier time understanding and composing commands like `gpg-sign --keyfile ~/mykeyfile somefile | gpg-encrypt --to ~/someperson.pub | gpg-asciiarmor > somefile.gpg` than whatever the monolithic incantation is. This would make it much less painful to interact with, especially because GPG is doing a series of transformations on either streams of text or raw binary.
And that'd be awful for many use cases, because maximizing that would make it incredibly unwieldy and brittle.
For instance, if you want to deal with keyrings and be fully philosophy-compliant, then that might be a gpg-keyring tool, which means now we have a problem: a signature operates on both a key and a file, and a pipe only carries one stream. Then there's the issue of error handling, since any of those bits can go wrong, and somebody has to figure out a good way of dealing with all of that.
Or, we could have libgpg instead, in which case our mail client could avoid screwing around with text streams, signals and process management, and just have a nice API with features like callbacks and well defined datatypes that would be a lot more comfortable to use.
EDIT: Even with just the 3 primitives you've outlined, we can already create a huge mess. Is it sign-then-encrypt, or encrypt-then-sign? We can combine signing, armoring, and encrypting in any order and any amount, requiring the recipient of this insanity to somehow figure out what we did, and then somebody else will do it differently.
On the 4 simple programs vs a complex one... I feel this is a microcosm of the microservices vs monoliths discussion, which can also devolve into quasi-religious battles.
> You still find plenty of people harping on about the Unix Philosophy, even though in modern times it's nigh irrelevant and significant parts of it are technically obsolete. Yeah, parsing text streams was sorta okay in the 80s, but it's very troublesome and brittle in the modern age.
Have you confused the Unix Philosophy with Unix itself? The Unix Philosophy doesn't have any technical components which might become technically obsolete.
It's a fuzzy matter because it doesn't have a single agreed-upon definition. But yes, some parts are definitely showing their age:
> Write programs that do one thing and do it well.
Putting things together from a kit of parts isn't necessarily the optimal way to do many things. In many cases tight integration actually produces superior results, and debugging a system made from many cooperating bits isn't necessarily any easier.
> Write programs to handle text streams, because that is a universal interface.
Or don't, because it's a bad interface, and a cause of many bugs and security issues.
>Putting things together from a kit of parts isn't necessarily the optimal way to do many things. In many cases tight integration actually produces superior results, and debugging a system made from many cooperating bits isn't necessarily any easier.
The reason this is part of the Unix philosophy is not merely the productivity it provides, but largely the power it affords end users. Most of the programs on my computer are GUIs that don't interact with each other. How much more powerful and flexible would my PC be if they were designed to do one thing well and be slotted together?
I think back to my days teaching at a college whose CS labs ran on NetWare 3. For “reasons” the user accounts were things like PASCAL; all the students in a course that used Pascal used this account (and kept their files on floppies). I wanted to move to an environment where students had individual accounts. The IT folks said that was way too much work to implement, and showed me the GUI (or TUI?) app they used to manage accounts; I agreed that this made individual accounts very difficult to manage. I sketched out a solution that used awk to identify adds and drops from the class lists the Registrar's office produced, along with scripts for creating and deactivating accounts as appropriate. This elicited much mystification from the IT folks.
A year or so later, we got individual student accounts. I resolutely refused to discover how they implemented it.
The Unix Philosophy is about building small components that compose well together. The fact that some applications benefit from tight integration in no way derogates from its value.
So absolutely true. I work in an “operational facing” engineering group. We are engineering but process data (daily/monthly data processing for factory and customer requests).
Having a composable process using the Unix philosophy makes it trivial to stop processing at any point in the workflow for that one customer request that comes up once a year. There is no ROI in making the workflow support these edge cases; it’s cheaper, even after a decade or more, just to one-off these special cases.
But this would be completely impossible if we didn’t architect the system in smaller components. As you say, it’s not text processing, it is units of processing that can be composed, stopped, reordered or have a custom step in the middle.
>> Write programs that do one thing and do it well.
> Putting things together from a kit of parts isn't necessarily the optimal way to do many things. In many cases tight integration actually produces superior results, and debugging a system made from many cooperating bits isn't necessarily any easier.
Define "optimal" here; without a definition this is a meaningless statement. But even if your definition of "optimal" is "easiest to debug", I'd have to disagree. Tightly coupled systems are nigh impossible to debug because you have to have the whole system composed together to debug anything. Loosely coupled systems allow you to debug each part separately, and small components mean there's little to debug in each. The next part of debugging is the interaction points, but those are easily constrained with a moment's thought when designing and developing a system.
If "optimal" is about performance, that's a debatable thing. The tightly coupled shit some people put together at my office is terrible with respect to performance. It's bordering on negligence the way they designed it and trying to fix the performance issues is almost impossible again because of the tight coupling of different components.
Regarding text streams: Ok, one weak point in some definitions/descriptions of it. The idea of utilizing a more universal data format though still stands. 1 million or so different binary formats isn't much better.
That is kind of the point: you have to define the problem and the tradeoffs involved. Then sometimes one approach is better than another. That determination should be made on a case-by-case basis, rather than as a religious matter.
So for instance stringing too many things together also leads to confusion due to the many interactions between disparate bits.
At some point you can end up with too much parsing/serialization in the mix, too much accumulated startup cost, or a system that has components written in shell script, Perl, Python, C, C++, and Rust, making the whole thing difficult for any random person to comprehend fully.
Do you need to transfer a file? Use the protocol designed to transfer files! Why would you use something silly like a Hypertext protocol for binary!?
I see similar conflation of the name-vs-reality all the time.
My favourite is the “secure” network that is differentiated not by firewall access policy, but by name only. Literally just a label.
> Do you need to transfer a file? Use the protocol designed to transfer files! Why would you use something silly like a Hypertext protocol for binary!?
Because it’s faster with http, doesn’t require firewall/nat trickery at either side, is secure out of box via https, can be easily gui-ed via index plugin, cached via a proxy/cdn, supports “moved elsewhere” and other issue reporting.
And if you ignore that “Hypertext” word that is there historically and look at http as just a metadata+data transfer protocol, it is clearly superior to ftp for any sort of data transfer.
Don't know if that fits the bill, but for a lot of people (me included) FTP is a reminder of simpler times, where you'd just put your username/password and address in an explorer window and drag and drop files.
It wasn't secure, nor reliable, but it felt really simple and easy to grasp, especially for inexperienced users. We have way better choices now, but I've spawned many more quick and dirty FTP servers at home than I'd be willing to publicly admit. Hell, my printer pushes scanned files to an FTP server.
Containerized microservices for a todo list app is an example. No one seriously claims that an example like that needs to be implemented that way.
People were using the “Unix philosophy” in the 2000s to try to satisfy the demands of the consumer internet. It failed miserably at that, and was replaced.
In terms of the fetishization of ancient relics: In the photo of RMS, his "halo" is a disk platter (looks like an RP06 platter but likely was from some "winchester" drive of the 80s). The diameter is about 360 mm (14 metric inches), because the entire drive was 19 inches wide to go into a rack.
Those large drives could store an enormous amount of data: sometimes as much as a megabyte per platter!
I use Deities & Demigods for hostnames, and create John Bunyan-esque Pilgrim's Progress maps for VPC config. No downsides to being a medievalist in '23 ;)
Same here, except Lovecraft deities. My domain is rlyeh, and my hostnames at the moment are yog-sothoth, azathoth, and nyarlathotep. cthulhu, dygra, and yidhra are retired hosts.
Related academic publication: "Worship, Faith, and Evangelism: Religion as an Ideological Lens for Engineering Worlds", Ames, Rosner, Erickson, 2015
"...a common ideological framework that appears across four engineering endeavors: the OLPC Project, the National Day of Civic Hacking, the Fixit Clinic, and the Stanford d.school."
Laurent Bossavit's "The Leprechauns of Software Engineering"[1] is fun and accessible. It claims that much of what we consider fact in software engineering is actually folklore.
The Tao of Programming is not a website. It’s a book. If you saw the whole thing on a web site, that was an illegal copy. Of course, it was also most certainly missing the illustrations, foreword, etc.
Your use of the ASCII character sequences `` and '' as substitutes for “ and ” is antiquated at best. It only looked good in old X11 fonts, and that was a misfeature of those fonts to begin with.
There are also many mistakes in your text: where, in 1.1, your web page has “The user is pleased and there exists harmony in the world.”, my book (7th edition) has “The user is pleased and there is harmony in the world.” In 1.2, my book has “Each language expresses the yin and yang of software.”, but your text capitalizes: “Yin and Yang”. In the following paragraph, your text writes “COBOL”, but my book has “Cobol”. There are many more (including missing words, wrong words, etc.), but I don’t want to list them all here.
Is simply reporting errors considered “haranguing” and “having the audacity to insist they change it”? Do you have the same attitude towards people who report bugs in your code?
it varies; sometimes people are being helpful by reporting bugs, for example because they don't know how to fix the bugs themselves, or because they're paying me to solve their problems so knowing what those problems are is the first step in getting paid, but sometimes they're just griefers looking for someone to bully
i suggest you fix the errors in your copy of the page and post the url here
I was not criticizing the book, I was criticizing the transcription on the web page. And a web page is continually published, so it is perfectly reasonable to expect it to be kept up to date enough to at least not look terrible on modern systems.
yeah, and i think it's really unfortunate that we ended up with unix aping the misguided font choices of microsoft windows on this point instead of microsoft windows adopting the superior choice previously made by unix
Zork, and learning what game "spells" do on objects (especially with Spiritwrak), is obviously related to the process of learning arcane/obscure commands and their arguments.
Pre-Infocom Zork I-II-III's setting (Dungeon) was obviously modeled on MIT and its rooms. In Adventure, you explored Mammoth Cave. With Zork, you learned about the "mystical place" of programmers, with weird spells and magical-techy places (heck, Zork is anachronistic), figuring out the mechanics by yourself as hackers do.
What sort of religious/spiritual folklore could we come up with?
Turing meeting his end with the apple is an obvious one, but I'd rather imagine that, for having brought computing to the mortals, he's chained to a rock in perpetuity, when he attempts to port DOOM to the ACE, only to be foiled each evening when the moths show up and immolate themselves, destroying various valves, requiring him to start afresh each morning...
Shannon came down from the mountain and gave us sequences* of symbols; various later prophets (McCarthy, Crockford, etc.) have added new structures on top but it's difficult to go very wrong if one hews to the mitzvot of the old ways: after all, string homomorphisms stream.
(Rabin & Scott introduced monotapism in https://news.ycombinator.com/item?id=34561797 ; polytapists suffer [both in this world and any possible successor] by losing the grace of Boolean closure, for ever and ever. Amen)
Very near the Western Wall is a mosque whose wall contains the parallel port where Engelbart (peace be upon him) plugged in the divine Mouse.
Every generation rediscovers Confucian Rectification of Names for themselves; that's why we have so many names for the same things in Informatics. (cue https://xkcd.com/927/ )
When Siddhārtha Gautama left the palace for the first time in his life, he encountered legacy code, bit rot, abandonware, and an Agda programmer. Shocked at the rampant suffering in software, he started his noble quest to free software...
* God made the integers; all else is the work of man. — LK
I did a stint at an old monastery in Estonia that was converted into a sort of hostel for international programmers. I was assigned a dorm where I spent much of my time wearing simple muted clothes, quietly writing code on an old computer with an IBM mechanical keyboard, just like all the other programmers.
When we were not writing code, we did chores around the monastery; usually I was sweeping the cobblestones or fetching supplies from the small town nearby.
Some took to more technical chores such as managing the local area network of the monastery; there was no internet access but computers at the monastery could still communicate with each other locally.
Most days we would start off by listening to sermons from wise old programmers in the grand hall. The past Christian iconography had been taken down. The simple stone architecture that remained with its thick pillars holding up soaring ceilings far above our heads encouraged pure creative thought.
We did not sully our thoughts with pointless conversations, a brief nod while passing another programmer in the halls was enough social interaction for the day. Anything else could be taken online to the monastery’s IRC channels.
The best part was the libraries. Books upon books about programming that you’ve never heard of or even seen, that were never sold commercially. Binders filled with notes from ancient programmers, printed on pages now yellowing with the passage of time. All manner of topics and languages were available here. Floppy disks with ancient programs, old punch cards, vintage machines and peripherals. Secrets that you could not learn anywhere else.
Maybe some day I’ll return and spend my days pontificating over a new generation.
Something like this would appeal to me, actually. It reminds me of Nesin Mathematics Village in Turkey. There are also some similarities to Recurse Center, where I did do a 3-month batch in 2017. It also has a library with some rare items, and a good collection of retrocomputing artifacts.
For me the "mythological" aspect comes in when trying to understand some recursive processes. SICP compared them to incantations. Sometimes it's hard to keep in one's head all that a tree-recursive process is doing. It's not technically magic, of course, but has some of the character of an incantation.
My closest thing to a religious experience was a C program crashing after I removed an unused variable. I reproduced this and tested multiple times by adding and removing the variable and running the program.
Now I obviously have a believable rational hypothesis for that behaviour but then it was mystical. I left the variable and got a passing grade (it was a student program).
Computers are extremely haunted. It's often remarkable that they were working at all before you made the change; then you prod, and all the goblins come out.
Similar experiences I've had:
* Putting a big array on the stack (early in my C programming career) - compiles fine, instant crash at runtime.
* Colleague discovers that a function needs padding with NOPs to achieve precise timing. Number of NOPs needed varies depending on code changes (even NOPs added or removed) in functions higher up the same source file.
* Occasional crashes in low level routines on ARM64 after changing completely unrelated code. The stack appears to contain a struct from elsewhere in memory instead of ... a stack.
(First one was probably just the array was too big for sensible stack management/ growth and a friendlier compiler would have just told me. Second one was to do with the size of memory pages in the flash - there was a delay if you ran over a boundary. Third one was, IIRC, the variable containing the base of a temporary stack occasionally getting splattered to point into other data structures! The actual stack was fine but you couldn't see it anymore)
That one confounded us for a while. There was no true parallelism going on in the system but the code that needed nop padding was the interrupt handler.
That it needed padding wasn't unreasonable - we had tight timing constraints. It would have absolutely made sense for it to race with another part of the system but that didn't seem to be the nature of the interaction. The interrupt handler could always run when it wanted to.
The eventual deduction: the flash memory had 256 byte pages at hardware level. I inferred that there's some initial access cost to open a page (and probably cache it into SRAM or something).
The build was putting later functions in the file at later addresses, so what you put in other functions changed the alignment of the interrupt handler. If you requested 256 byte alignment for that function then you still needed nop padding but it was completely consistent.
This is usually the result of undefined behavior somewhere else. Undefined behavior is so insidious because it creates exactly these unexplainable situations. Probably the compiler was applying some kind of reordering or optimization which is proven safe for defined behavior, but when run on UB code causes the program to segfault on changes that would normally be no-ops.
It sounds like an array size problem: `int a[50]; int unused; a[50] = 55;` - the write is UB, but the compiler probably placed `unused` right after `a[49]`, which prevented the crash.
Somewhat off-topic question: if I wanted to try to re-interpret assembly as a religion (i.e. knowledge left for us by our creator as opposed to a programming language for a technology we understand); which book about assembly would be most appropriate to choose as the core holy text?
That's a tricky one because there is no "assembly language", so much as N different ways of mapping text onto numbers that a processor can interpret.
First the target instruction set changes. You may have a `mov` for copying between registers but you may not. For that matter, you may not have registers. You probably can write raw bytes in the middle of an instruction stream for instructions that the assembler doesn't know about, and if you wrote the entire program like that, you'd (almost) have the binary itself.
Then there are common conventions like `label:` and being able to write basic arithmetic in a place that an instruction expects an immediate operand, but they're at the whim of whoever wrote the assembler. There's probably a macro layer or two.
There is a strong analogy remaining though. Some creator gave us the processor, with documentation of some quality about how to make the piece of sand do anything, and on top of that someone (who may have been the same creator but may not) usually writes mnemonics for the instruction set and from that, an assembler.
The core holy text for a given processor is then something like https://developer.amd.com/wp-content/resources/Vega_Shader_I... - an incomplete description of what the magic sand does, from which one who is sufficiently determined can construct arbitrary computation.
I quite like the idea of Assembly being a vague network of cultic religions without shared canons. If you had to choose texts though, the Story of Mel and Hacker's Delight would definitely be in there.
I love how, in the link to Linus Torvalds' flaming rebuttal about Minix, one of the arguments they were having was whether a micro-kernel or monolithic kernel was better... This was in 1992! It's funny how some things never change :)
Terry Davis and TempleOS. God communicated to him through the random number generator.
This article is meant to be humorous, but I think it unintentionally hit on some important truths.
Terry Davis Was Right
Religion Is Computing, Computing Is Religion
Terry’s big idea was that it is not possible to separate religion from computing.
Since the beginning, every complex society has been based on a shared socio-cognitive operating system, a religion. Your religion says you must live a certain way. It tells you what is valuable and what is disgusting. It gives you your fundamental paradigm and ontology of belief through which you interpret the world. It tells you to read certain texts, listen to certain traditions of wise men, and meet with like-minded others at certain places and times. It tells you to avoid certain influences, and seek out others. It gives you techniques for structuring thought and mind and memory. When you have doubts, it gives you procedures to ask for and receive insight. It tells you who you are, and what your life is for.
I'm researching these ideas more since ChatGPT has shown me how to make certain religious and spiritual ideas more accessible.
Levandowski shut down his AI church after he got in trouble with the law for something unrelated. He was too soon, so it's only a matter of time one starts up again.
I hate to be a killjoy, but none of these are true examples of religion or mysticism. They are at best playful examples of hacker culture, and at worst examples of arguments with opinions "strongly held"; i.e., flamewars. "Religious wars" in tech are not to be taken literally; nobody thinks these are spiritual in nature. It's an analogy to the fury of real-world religious wars.
The Tao of Programming takes the trappings of Taoism but is not to be taken seriously as a religious or spiritual text.
When someone says a programming book is "the Bible of X" this is no more religious than when videogame creators refer to their documents collecting their fantasy world data as "their bible", as in "the Fallout bible". This is just borrowing the popular image of the Bible as the "authority" on something. It's common usage, not religious.
TempleOS is an outlier, and not a good example. Its author is arguably mentally ill. If I remember correctly he is also a bigot, and nobody would claim bigotry is an essential part of working with computers.
It's not true that most hackers are Buddhists; where's the evidence for that?
Folklore.org is a collection of anecdotes about early Apple (I wouldn't discount that some have religion in the background, because I haven't read them all, but I can attest the vast majority are non-spiritual in nature, and instead about nerds, tech, and infighting). They are fascinating because Apple is a major player and its early history is full of quirky anecdotes. It's not folklore in the sense of gnomes and ghosts.
Stallman is an atheist; his "saintly" photo is a joke.
And so on, and so on. I know the author of TFA knows all this; I bet most readers of HN also do. So why am I such a bore?
I would say there's hacker culture, humor and tradition, yes. Even ideology (what is Free/Libre Sofware if not an ideology?). If that's what you mean.
But religiosity or mysticism? Not in programming, though of course individual hackers probably run the gamut from religious to atheist, with all beliefs in between.
TFA is a joke, by the way. The author obviously understands this. But I'm worried it will get interpreted literally. For example, by the person who submitted it to HN.
I think it gets interesting when you get to mythology, which is not necessarily meant to be taken literally. The founding myths of programming? Babbage and Lovelace. The Tech Model Railroad Club at MIT. von Neumann and Turing.
That may be the falsifier somebody is looking for: How much software lasts? How many programmers last?
There may not be "illegal" people but there are people who gain entry by illegal means. There may not be "suicidal" software but there is software which is so recklessly constructed that it is bound to crash and burn regardless how brightly shines the gold it showers on its creators and acolytes. There is "original sin" in programming. There is a hell of a lot of venality.
Most programmers don't last either; their lords and masters recognize this, just as lords and masters throughout history have tolerated religiosity in their serfs... up to a point.
No, my argument is that none of the examples given by the author are of people acting religious.
The most obvious one is the photo of Stallman that opens the article, posing as "Saint IGNUcius": that's a joke, and Stallman is a self-declared atheist. Likewise, "Folklore.org" is a website about the early history of Apple, anecdotes told by the people who were there; there's nothing spiritual or mystic about it. I also explained many of the rest of TFA's examples.
The only exception is TempleOS, but that was the work of a schizophrenic and maybe not the best example of the hacker community at large. I wouldn't draw any conclusions from that particular example.
I honestly don't understand what you mean. Can you challenge one of my assertions more directly, so that I can better understand?
For example, do you believe Folklore.org is about what people commonly understand as "folklore"; or that Stallman believes himself a saint; or that programming "religious wars" (e.g. vim vs emacs, or Linux vs Windows) are truly religious in nature? Do you believe when people talk about the "Dragon Book" they revere it as a mystical book, or that someone believes the Tao of Programming is truly a religious text?
Religion has permeated and shaped common language, so a lot of the expressions we use originally had a religious meaning but don't anymore. Others, like "crusade" or "religious wars", or even "evangelize", are not meant to be taken literally; they just evoke religious imagery but nowadays often mean something else.
I think people can be canonized by others without a church, yeah. An Irish Catholic might fight me if I insult St. Patrick. A wokester might fight me if I insult George Floyd. A techie might fight me if I insult Elon or Stallman. The behavior is there and very real, and whether it's fueled by a religious or abstract idea is completely irrelevant to me if I don't share those beliefs.
If someone does canonize Stallman, then Stallman's beliefs on the matter are similarly irrelevant. The "religious" behavior is exactly the same. There's just no supernatural element. But you already don't believe there's a supernatural element. So I guess I just don't get the distinction you're making here.
Just the idea that you think something can be "truly" religious exposes this for what it is: Your opinion on a thing. You're an atheist and you don't think anything is truly religious, right? All pictures of Jesus are a guy dressed as Jesus. I just don't understand why the picture of Stallman dressed as Jesus is different. To you I mean. I get why I think that but I'm a Christian.
Do you have any actual religious behaviors to point to that aren't already human behaviors? Because right now it's like, two guys are doing cartwheels. But one guy thinks cartwheels are very important so it's fundamentally different when he does them. The other guy was just doing cartwheels for fun so it's not the same.
No, I mean there's barely any relation between your comments and mine, that's what's stopping me from engaging. It's like trying to converse with a wall.
This may shock you but I don't hate religious people. I guess good for you that you stopped hating them?
You assume way too much. I notice you failed to respond to any of my questions, so I'm uninterested in continuing this conversation.
No, they aren't. There's no separation between the religious and the non-religious. What atheists describe as religiousness is actually just how humans work.