The Story of Mel (1983) (utah.edu)
445 points by thunderbong on Aug 9, 2022 | 167 comments



So, the RPC-4000 version of blackjack seems lost, but the LGP-30 version exists (and can be run on simh). I've disassembled and partly annotated it, and found that it too has a sort of cheat switch.

The LGP-30 has no source of randomness to use as a seed. From the moment the program is loaded, if the player plays optimally, the games will all be the same. Over the first few dozen games, the player ends up in the hole (IIRC, noticeably more than the long-run house edge).

The LGP-30 has one conditional branch instruction, which tests the sign bit of the accumulator. But if the sign bit on the instruction is set, and the TRANSFER CONTROL switch on the front panel is set, then the branch is always taken. This appears once in the program. On startup, if the switch is set, it marks two of the aces as already dealt. This perturbs the sequence enough that, over the first few dozen games, the player has the advantage.


Too late to edit, but I misremembered a little: from an earlier comment that I'd forgotten I'd made¹, the program image on paper tape had the aces marked dealt, and the TRANSFER CONTROL test switch merely skipped the part of initialization that cleared it. This means that (with sufficient time and dedication) one could in principle prepare multiple tapes with different starting configurations, analogous to editing a binary to change a hardcoded seed.

¹ https://news.ycombinator.com/item?id=20489774
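
For concreteness, here's a rough Python sketch of that corrected mechanism (my own illustration; the names and deck representation are mine, not the actual LGP-30 code):

  # Rough sketch (mine, not Mel's code): the tape image ships with two aces
  # pre-marked as dealt, and the TRANSFER CONTROL switch only decides whether
  # initialization clears those marks.

  def init_deck(transfer_control_switch_set):
      # Deck state as loaded from the paper tape.
      dealt = {card: False for card in range(52)}
      dealt[0] = dealt[1] = True           # two aces pre-marked on the tape

      # Normal startup clears the marks; with the switch set, the forced
      # conditional transfer skips this clearing step entirely.
      if not transfer_control_switch_set:
          for card in dealt:
              dealt[card] = False
      return dealt

With the switch off you get the fixed, house-favoring sequence; with it on, the perturbed one.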


This page [0] includes an analysis of the manual for the possible machine Mel programmed in the story along with a probable photograph of him.

[1] is an instruction manual for the blackjack program itself, written by Mel.

Classic story solidly in the “should be reposted every six months or so” cohort.

0: https://www.freecodecamp.org/news/macho-programmers-drum-mem...

1: http://bitsavers.trailing-edge.com/pdf/royalPrecision/RPC-40...


> I did eventually manage to get in contact with Mel, but I scared him away, unfortunately. That's a story for another day... :-/

I'm... very curious what this entailed


The answer can be found in this very thread:

https://news.ycombinator.com/item?id=32400786


It’s interesting to me that the caption to the 1956 newspaper picture in the first linked article identifies someone from the NSA. I thought the mere existence of that agency was basically classified information well into the ‘70s.


That’s puzzling! Thanks to the author’s sourcing, a quick search turned up a scan of the full article [0]. My initial thought was, maybe it’s a name collision and this National Security Agency in 1956 is an insurance or accounting firm, but the source article doesn’t elaborate.

0: https://www.librascopememories.com/Librascope_Memories/1950_...


I've always loved this story as a tribute to the early hacker ethos. So easy to forget how high up the stack we live these days.

There are a few other gems at this site, in the "hacker folklore" appendix. http://catb.org/esr/jargon/html/index.html

I like "How to Become a Hacker", and the AI Koans too. Lots of good stuff there. These old pages have a certain biblical magic to them.

I love our field! Long live the hacker.


Thanks, I'll read your link but I must dispute your enthusiasm:

"It took me two weeks to figure it out"

Leaving a no-documentation time bomb for your successor is not to be celebrated, nor IMO is it a hackerish thing to do.


It’s not to be celebrated, but it’s an extremely hackerish thing to do.


I wrote a Twitter bot in Python that tweets one of the Hacker Dictionary Words of the Day once a day.

https://twitter.com/hashtag/hackdict?src=hashtag_click


Don't miss this: https://news.ycombinator.com/item?id=20489273 (edit: from kps via https://news.ycombinator.com/item?id=32400756 - thanks!)

The OP has been posted so many times that even 10+ years ago people would mention how often it had already appeared. Yet there have been surprisingly few interesting discussions, either of the original story or in related threads. These are all I could find. If there are others, please let me know!

The Story Of Mel - https://news.ycombinator.com/item?id=7869771 - June 2014 (77 comments)

The story of Mel (1983) - https://news.ycombinator.com/item?id=678999 - June 2009 (22 comments)

The story of Mel, a Real Programmer - https://news.ycombinator.com/item?id=181144 - May 2008 (9 comments)

Related threads:

Mel's Loop – A Comprehensive Guide to The Story of Mel - https://news.ycombinator.com/item?id=31458048 - May 2022 (2 comments)

LGP-30 – A Drum Computer of Significance - https://news.ycombinator.com/item?id=20484330 - July 2019 (39 comments)

The Story of Mel Explained - https://news.ycombinator.com/item?id=9913835 - July 2015 (25 comments)


The Story of Mel, explained: https://jamesseibel.com/the-story-of-mel/

Great addition to the original story, as there's lots of background required for modern readers.


Discussed once here:

The Story of Mel Explained - https://news.ycombinator.com/item?id=9913835 - July 2015 (25 comments)


Worthwhile checking out some of the other stories in the parent directory (https://www.cs.utah.edu/~elb/folklore/).

Thanks for posting this, it made for a fun read and some of the other ones (especially the Robin Hood and Friar Tuck story) made me chuckle.


I think I found Melvin Kaye's obituary a few years ago but I couldn't find it anymore. A pity that both he and Ed Nather have passed away :(


There's this comment long ago about getting in contact with him that has always made me curious, but no information was proffered: https://news.ycombinator.com/item?id=7871260

And, of course, there's this picture that was found of him:

https://zappa.brainiac.com/MelKaye.png


> but no information was proffered

What more would you like to know? I'm bad at telling stories, but I can do Q&A.


:D I'm just guessing at questions, since I don't have much context.

You found him-- did you get to really talk to him? Find out how he looks back on that era? Did he keep programming?

What scared him away?


> You found him-- did you get to really talk to him?

Yes. Here's the e-mail exchange: http://acuozzo.sdf.org/Mel.pdf

(I moved all of my e-mails to Gmail via Thunderbird years ago which is why the sender address is different.)

I found him after Bill Bryner mentioned in an e-mail that Mel "was trying to become an expert (Master?) at Bridge and also played flute with the UCLA pep band for basketball games".

I figured he hadn't moved far from where he was working in LA at the time and I was right! His e-mail address was listed among the members of the Thousand Oaks, CA ACBL Unit 532 Bridge club/league.

> Find out how he looks back on that era? Did he keep programming?

I couldn't find any of this out because I scared him away. My guess is that he just wasn't the nostalgic type.

> What scared him away?

My guess is that he was spooked because I contacted his son a few days prior.

… and the webmaster responsible for running his son's business' website.

… and several other people. I must have seemed like a genuine creeper.


I guess Mel could have been in the region of 80 years old at the time, and the few 80 year olds I’ve known have been very private individuals.


Oh. Thank you for sharing the e-mail exchange.

It's too bad he wasn't interested in talking about it all. I can imagine that if he was aware of it, having random people pop up wanting to discuss it could become annoying.

Thank you for the info!


Bonus trivia: those "Raw, unadorned, inscrutable hexadecimal numbers"? Those would not have been the 0-9a-f we're familiar with today, but 0-9fgjkqw[1]!

[1]: http://ed-thelen.org/comp-hist/lgp-30-man.html#R4.13


The reason for FGJKQW is simple: Look at the list of single-letter mnemonics for the 16 opcodes (called “orders” in the manual you cite). Sort them alphabetically, and look for the first six letters that are unused. Voilà! (Of course, the problematic letter O is skipped, and for some reason, V as well.)

That’s how it was explained to me in 1973 by Mr. Willoughby, a math teacher who also taught Computer Programming in my high school. We didn’t actually have an LGP30, but he had learned on one, and graded our programs handed in written on paper.

Fortunately, this was only for the first half of the class; after we learned this machine language, we graduated to a higher-level language, Neat3, for the NCR (yes, the cash register company) Century 100, one of which the school district did own. Subsequent assignments were handed in on punchcards and actually compiled and run. By the teacher. After school. So you’d get back your compile error the next day, or if you were more fortunate, your output. If your program compiled and ran and gave the right answer the first time, you’d get a grade of 100. Then 97, 94, 91, etc. It was quite the motivation for carefully planning things out ahead of time.

And his reason for starting with machine language? “Well, you can’t expect anyone to understand what’s really going on in higher-level languages if you don’t know what’s happening underneath, right?” Worked for me.


> And his reason for starting with machine language? “Well, you can’t expect anyone to understand what’s really going on in higher-level languages if you don’t know what’s happening underneath, right?” Worked for me.

I teach a middle school class called "Computer Organization and Design". It's basically from gates and truth tables, up to implementing ALU functions, to understanding bits of sequential logic... then some handwavy computer architecture stuff to save time, and finally on to handing out a simplified, reduced-instruction ARMv7 THUMB machine language reference, and students writing their own programs on paper and assembling them.

There's a couple digital logic labs in there, and finally they get their own little computers with a simple monitor program that lets them enter programs, single step, and view registers.

I wasn't sure how reaction would be. Many of the students love it. Middle school students seem to do pretty good at this stuff, too-- their memory of learning arithmetic is recent enough that learning a bunch of new similar rules (combinatorial operations, multiplexing, hexadecimal, instruction encodings, etc) seems simple. And, well, no one told them this stuff is often considered "hard."


Amazing!

And it parallels the early learning my peer group and I experienced in the 80's.

We had Apple 2 type computers to work with and a small group of us were split off to do a deeper dive education. And really it was a sort of student guided education.

Basically, the teacher asked us to declare what we were going to try and do and that was more about making sure we did something besides play games than it was anything else.

So we did that and came to the same conclusion!

And that kicked off a love of the lower level computing that persists to this day. 6502 was not too difficult and what the teacher did was of high value:

They found us the info we needed. Data books, magazines, whatever contained material we could use. And we attacked it together.

One thing I picked up on super early was the powers of two and how it really all boils down to address lines! Was a great insight for a young person and I remember teaching others about hexadecimal, the first 16 powers of two and lots of number related things.

Others had something they grokked and together we learned a ton, each person teaching what they could to the others and doing projects together.

And we are talking stuff like:

Count numbers 0 to 9999999 on screen

Draw a Sprite and move it around

Play music on the speaker

Do maths of various kinds.

These were all slow or impractical using the Applesoft BASIC.

In assembly language, they made sense and were performant.

As machine code, they could be loaded from disk and called by BASIC.

The logic parts took me a while, but the moment AND, OR, XOR started to make sense was the moment I really started to do computing. Those things and the numbers and how they are used to represent stuff was the core of all that was to come.

We all sort of came to that understanding and it was all a beautiful experience. One of the best bits of it was our teacher being curious and as playful as we were! The tech was literally intoxicating.

Bet your experiences are much the same.

For what it is worth, the other core piece was I/O. That moment when one realizes they can POKE a number into a register, and for that matter knowing what a register vs. a RAM memory were, and then seeing an LED or hearing a speaker click were the absolute BEST!

In my view, making sure we have this kind of education happening is super important and ultra high value.

It is no different from the other basics:

Money

Wood

Metal

Computing

Cars and farm machinery

Electronics

Etc...

Many of my class ended High School with good, all around basic competency. I grew up in sometimes profound poverty. For me, it was actually a benefit because I was lucky to be among people who did not judge and gave me opportunity to put all those learned skills to use.

Made a huge difference in my own life.

But the same can be said of just about all the students I know being exposed to what I will just call fundamental type education. Everyone was capable and ready and able to learn just about anything.

Looking back at my class, a fair number of us went to college. Another big slice went into the various trades, and some into business.

The ones who were not involved in the fundamental type education generally struggled more.

Now this is all anecdotal, but I do find common themes when these discussions happen. Generally speaking, it can really help people and rarely hurts them to be exposed to potent basics early in life.

Well done.

What have they done with their little computers? Anything notable? If you can share, please do.


Thank you for your detailed answer! I enjoy hearing about other programs and experiences. I didn't have the benefit of something like this growing up, so I have to guess a fair bit about what is helpful and will only know for sure 15 years from now when my former students might tell me.

> What have they done with their little computers? Anything notable? If you can share, please do.

We only really picked up the computers themselves in the last 2-3 weeks of class; we were much more into theory and logic. I provided some system calls to move sprites around on the top of the screen, so there were some simple games written using that... and one student implemented a game inviting you to guess the computer's number.

> Basically, the teacher asked us to declare what we were going to try and do and that was more about making sure we did something besides play games than it was anything else.

This sounds really awesome. This sounds like what I try to do with after-school robotics.

My (older) brothers had a really awesome educational experience where the local district had set up something they called "Independent Learning Module" and later "Kleine Schule" that was very much self directed.

Education is much more regimented now-- whether in public schools or private schools. All the emphasis on standards, unified curriculum, alignment and articulation of courses to colleges, etc, have certainly improved the depth of education and likely improved the average quality of education. But these changes also make it harder to offer an experience like this.

Perhaps the pendulum is slowly starting to swing the other way, with a lot of discussion of "student agency".

From the after-school program-- it's interesting how students struggle at first with setting the agenda. I have high achieving students who breeze through all kinds of things that I think of as difficult. But then other things, like organizing a set of to-do items onto a calendar to make a coarse schedule, often seem daunting to the students and cause paralysis.


It was awesome.

Us: We want to move a little ship on the screen

Teacher: Ok, let me get some info.

A few days later, we take that stuff and work out what we think needs doing and then we do it.

Of course it never works, or is slow. I remember the first attempt being slow. We computed all the crazy screen addresses. On an Apple this is far more convoluted than one might think.

An iteration or few later we have lookup tables, an understanding of why those were in the Apple documentation, and a ship moving.

Wash, rinse repeat for all sorts of stuff.

If I had it to do over again, I would collect the bits of code, info and package it up with a lesson in each one.

Students work through those and end up with a library of things they can combine to do fun things.

We would get stuck on big picture stuff. That is where the teacher was super high value.

We would lose interest, or wonder about something else. They would remind us what we said we wanted to do and help us do it minimally.

That itself was an important lesson.


Your teacher is great. People may not appreciate it now, but just getting info is hard.


Yes it is. I took too many words to convey that in my comment above.

Two very high value things:

Getting us kids relevant, useful info, and

Time / Focus management.

When done with basic human consideration in mind, those two amplify the good thinking and knowledge building.

Great teachers get that and apply it to the course, project at hand and the students benefit greatly.


This was partly because letters also served as instruction codes with some mnemonic aspects to them. (This is also found on early Univacs. Coincidentally, Stan Frankel, the designer of the LGP-30, came from Eckert’s and Mauchly’s team.)

As in:

  b - bring from memory (load into AC)
  h - hold and store (deposit AC in memory)
  c - clear and store (deposit AC and set it to zero)
  y - store address (store AC as operand)
  u - unconditional transfer (jump)
  r - return address (stores PC+2 as operand at given address)
  t - test (conditional transfer on AC negative)
  z - stop (break point in operand)
  p - print
  i - input
  a - add
  s - subtract
  m - multiply (most significant bits of result in AC)
  n - multiply (least significant bits of result in AC)
  d - divide
  e - extract (mask, logical AND)
This left only a limited set for hexadecimal encoding, namely f, g, j, k, q, w. (And yes, "l" is 1, since Flexowriters weren't invented as computer terminals; rather, they are electric typewriters with a potential for automation.)
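
A quick way to see it, as a Python sketch (my own; the excluded letters follow the explanation elsewhere in this thread, with "l" serving as the digit 1 and "o" and "v" skipped):

  # Sketch: derive the six "spare" hex letters from the 16 order mnemonics above.
  mnemonics = set("bhcyurtzpiasmnde")   # b h c y u r t z p i a s m n d e
  excluded  = set("lov")                # l is the digit 1; o and v are skipped
  spare = [c for c in "abcdefghijklmnopqrstuvwxyz"
           if c not in mnemonics and c not in excluded]
  print(spare[:6])                      # ['f', 'g', 'j', 'k', 'q', 'w']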

Since hexadecimal was used for operands, which were actually addresses on the magnetic drum given by track and sector numbers in binary, which in turn resulted in a rather interesting single-word instruction format, this further complication may not have mattered much.

Compare https://www.masswerk.at/nowgobang/2019/lgp-30


It's even worse. The keyboard (as normal for typewriters of the day) had no numeral '1'. Instead you would type a lower-case 'L'. So the actual hex encoding is 0L23456789FGJKQW.


I've never heard of this. Was this just a quirk of the LGP-30?


Probably not. Look at your keyboard. Those are the keys on the right side of the home row. They probably used them because it was easier and quicker to type than 'abcdef'.


> Those are the keys on the right side of the home row.

Why would you want that alongside numbers which are either two rows up or to the side?

Unless the keyboard was something like USPS’s which has numbers as an alternate mode on the home row, and thus made this layout sensible? But that would still make it a quirk of the system.

Edit: yeah it was absolutely a quirk of the LGP-30, per a sibling comment: http://laboratorium.net/archive/2008/04/28/a_few_facts_about...


What keyboard layout do you use? Q and L sure aren't in the home row on a QWERTY keyboard unless you're one of those sales guys that only ever types TYPEWRITER. :P


I'm disgusted... What was the motivation for that?


The convention of using "ABCDEF" didn't get established until well after the LGP-30 was designed in the 1950's. The Wikipedia entry on "Hexadecimal" says "The now-current notation using the letters A to F establishes itself as the de facto standard beginning in 1966, in the wake of the publication of the Fortran IV manual for IBM System/360, which (unlike earlier variants of Fortran) recognizes a standard for entering hexadecimal constants."

The comments elsewhere that note the layout of the Flexowriter character set risk conflating cause with effect. The single-character mnemonics for the sixteen hardware instructions were chosen by the LGP-30 designers to be, well, mnemonic; and then the remaining characters FGJKQW were left to represent the values ten through fifteen. This then forced the assignment of character positions for the Flexowriter device, so that no table lookup would be required when reading opcode mnemonics and also when reading hex numbers; all bits could just be shifted into place in both cases.

Trigger warning: The phrase we were taught was standard in the LGP community to help remember FGJKQW was, I'm afraid, "For Good Jokes, Kill Quiet Women." This seems to have been excised from the written lore fairly early on, as search engines have no record of it.


For resistors, I was taught:

Bad boys rape our young girls but Violet gives willingly

Not yet excised I see.


I found this [0] which says that the layout of the LGP-30 Flexowriter (linked in the article) conformed to fgjkqw, so it may be related.

[0] http://laboratorium.net/archive/2008/04/28/a_few_facts_about...


From https://ub.fnwi.uva.nl/computermuseum//DWcodes.html#A077 the Flexowriter character code was:

    +----------------------------------------------+
    |  00   20   40   60         00   20   40   60 | (1)  triangle
    +--------------------+    +--------------------+ (2)  pi
    |  RE   NL   ST      | 00 |  RE   NL   ST      | (3)  sigma
    |  z    i    p    h  | 01 |  Z    I    P    H  |
    |  0    4    8    j  | 02 |  )   (1)  (3)   J  | BS   Back Space
    |  SP   /    o       | 03 |  SP   ?    O       | COL  COLor
    |  LC   BS           | 04 |  LC   BS           | DEL  DELete
    |  b    d    e    c  | 05 |  B    D    E    C  | HT   Horizontal Tab
    |  l    5    9    k  | 06 |  L    %    (    K  | LC   Lower Case
    |  -    .    x       | 07 |  _    ]    X       | NL   New Line
    |  UC   HT           | 10 |  UC   HT           | RE   REad
    |  y    n    u    a  | 11 |  Y    N    U    A  | SP   SPace
    |  2    6    f    q  | 12 |  *    $    F    Q  | ST   STop
    |  +    ,            | 13 |  =    [            | UC   Upper Case
    | COL                | 14 | COL                |
    |  r    m    t    s  | 15 |  R    M    T    S  |
    |  3    7    g    w  | 16 |  "   (2)   G    W  |
    |  ,    v        DEL | 17 |  :    V        DEL |
    +--------------------+    +--------------------+
    |     lower case                upper case     |
    +----------------------------------------------+
Taking bits 2–5 gives you: 0 l 2 3 4 5 6 7 8 9 f g j k q w
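
To make that bit extraction concrete, here's a small Python check (my own sketch; the codes are column + row from the table above, written in octal):

  # Sketch: the middle four bits of each Flexowriter code give the hex value,
  # so the sixteen digit characters come out in order 0..15.
  digits = {
      '0': 0o02, 'l': 0o06, '2': 0o12, '3': 0o16,
      '4': 0o22, '5': 0o26, '6': 0o32, '7': 0o36,
      '8': 0o42, '9': 0o46, 'f': 0o52, 'g': 0o56,
      'j': 0o62, 'k': 0o66, 'q': 0o72, 'w': 0o76,
  }
  for ch, code in digits.items():
      value = (code >> 2) & 0xF         # keep the middle four bits of the 6-bit code
      print(ch, value)                  # 0->0, l->1, 2->2, ... w->15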


This was assigned reading in one of my university CS courses, and although it was great fun then, as we were mostly all novice programmers learning Java of all things, the madness of the story didn't hit me until a few years later when I had done much more work with C and pointer math.

This part in particular, taken from Wikipedia, still reads to me like Necronomicon level black magic:

> But when x was already the highest possible address, not only did the address wrap around to 0, but a 1 was carried into the bits from which the opcode would be read


I bought the Hackers Dictionary by Eric S. Raymond as a 90s kid and it had this story, as well as a few others. Das Blinkenlights and some AI Koans like the Broken Lisp Machine come to mind.

I used to read them over and over, and it really left an imprint on me. The early hacker ethos was such a strong flavor. Sort of a "one part gnostic, one part mechanic, one part counterculture" vibe.

Sometimes I wonder if that flavor always exists, but shifts from community of practice to community of practice, or if there was something specific about the early days of the net that caused it to arise uniquely.


I've wondered this too. When my dad was learning to program, this was the culture. When I was learning to program, the culture was no longer fresh and alive, but its spirit was still strongly felt; the torch was being passed to us. Have we preserved that light? If my son learns to program, what will "hacker culture" mean to him?


Maybe being able to figure out strangers' discord passwords. More generally, phishing powers


I think in this context, 'hacker' means more 'one that twiddles bits' than 'one that twiddles others' bits'.


The definition of 'hacker' has changed and been a moving target over the years.


It's still alive, but it's not the culture IMO. Which is sad.


> I bought the Hackers Dictionary by Eric S. Raymond

That hurts a bit to read. Raymond is/was a huckster who took the original Hackers Dictionary, a communal MIT project, and made some edits. He went on to annoy the Linux community with his CML2 antics around the build system. Lisp/MIT, BSD, and Linux all have their own histories, a mixture of forgettable drama and fundamental difference.


> a communal MIT project, and made some edits

The Jargon File had been dead and dated for nearly a decade when he picked it up. His maintenance and publication was an important part of making this content relevant and accessible to future generations.

Eric Raymond's work is a mix of good and bad-- like you could say about anyone.


Oh interesting -- I didn't know any of the above Raymond controversy (or even really who he was; the name just is etched in my brain from that book)

As a guy who found the Jargon File via that book, I'm grateful he did it. But I can see how publishing an edit of an online forum would ruffle a lot of feathers: 30 years later you get guys like me attributing it all to him.


The introduction makes the sourcing pretty clear: http://catb.org/jargon/html/revision-history.html (This appears in the printed book, too).

Most of the criticism of his stewardship of the jargon file and the publication as N.H.D. by MIT Press comes after other controversies.


A lot of that ethos I think is just Generation X plus computers. A rebellious generation that came of age during a time when digital technology had just granted kids the power to wardial norad. I think tinkerers have always been timeless, but that kind of cyberpunk culture is something we're unlikely to see again.


I'm solidly GenX, but the hacking culture described by "The Hacker's Dictionary" is a generation older than that -- the era of the MIT AI Lab or Stanford's SAIL in the mid 1970s. Lisp Machines and custom in-house operating systems. The GenX hacking culture was the 1980s and was more about getting the most out of our own microcomputers.


Oh believe me I know it has origins in ivory towers like MIT. Stallman has written all about that for instance. The context with GP was once it had trickled down into more popular culture with the ESR / Jargon File days. We honestly don't know a lot of the true origins of hacker culture, because so much of it started in corporate and government enclaves with confidentiality rules. What we know are echoes of practices and socialization that crystallized into usenet posts.


It was also about bullshitting on Alliance.


I’m surprised I haven’t seen this linked yet, but Bryan Cantrill of dtrace, Sun, lawnmower, Joyent, etc, fame gave an amazing talk for Monktoberfest 2016, titled “Oral Tradition in Software Engineering”, which features The Story of Mel [1]. Highly recommend checking it out — there are loads of little gems and stories like this throughout.

All of his other presentations are great too and definitely worth a listen if you like this sort of thing [2]. A couple of my favorites are “Fork Yeah! The Rise and Development of Illumos” [3] and “Debugging Under Fire: Keep your Head when Systems have Lost their Mind” [4].

[1]: https://youtu.be/4PaWFYm0kEw?t=644

[2]: http://dtrace.org/blogs/bmc/2018/02/03/talks/

[3]: https://youtu.be/-zRN7XLCRhc

[4]: https://youtu.be/30jNsCVLpAE


Bryan Cantrill's talks are some of the best I've ever seen. I've always tried sharing them around with colleagues (with limited success, but still worth it in my view...)


"Don't fall into the trap of anthropomorphizing Larry Ellison" is one of the funniest things I've ever heard: https://youtu.be/-zRN7XLCRhc?t=2302


My favorite line

  I have often felt that programming is an art form, 
  whose real value can only be appreciated 
  by another versed in the same arcane art; 
  there are lovely gems and brilliant coups 
  hidden from human view and admiration, sometimes forever, 
  by the very nature of the process.


More info about the “drum computer” this was written on: https://www.masswerk.at/nowgobang/2019/lgp-30

Previous discussion where another person tried to replicate this: https://news.ycombinator.com/item?id=20484330


Wow this is a blast from the past! First time I’ve read this was in a (paper) magazine some 25 years ago, one of the journalists was kind enough to translate it. The story was like a magnet to me as a high-school freshman, I must have reread it 20-30 times in the following weeks.


I've always found that this story is also compulsory reading for wannabe Real Programmers: https://www.ecb.torontomu.ca/~elf/hack/realmen.html

Strangely enough, it's more modern, but many of the concepts mentioned may be more alien to young programmers. How many will know what's meant by keypunch? Timesharing? RATFOR? Listings? The email address in the header?

BTW, the title (and text) refers to the book "Real men don't eat quiche", a pastiche of the genre that would later produce the weirdly unironic classic "Men are from Mars, Women are from Venus".



Found the instruction on how to play the game in the further reading of the Wiki - https://www.mirrorservice.org/sites/www.bitsavers.org/pdf/ro...

It was nice to see the original write-up and to read about how it worked for the user.


To be complete on old programmer's lore, don't miss "Real programmers don't use PASCAL": https://www.ecb.torontomu.ca/~elf/hack/realmen.html

And of course, "The rise of 'Worse is Better'": https://web.stanford.edu/class/cs240/old/sp2014/readings/wor...


The Story of Mel was likely in direct response to "Real programmers don't use PASCAL", or potentially another very similar "Real Programmers" writing.


A recent article devoted to the macho side of programming made the bald and unvarnished statement:

Real Programmers write in FORTRAN.

- the first lines of The Story Of Mel


That line was included in more than one post. There are also multiple posts from that time period which said exactly the opposite. It was an ongoing joke, raised to somewhat of an art form.

https://www.multicians.org/thvv/realprogs.html

http://www.bernstein-plus-sons.com/RPDEQ.html

https://www.ecb.torontomu.ca/~elf/hack/realmen.html

https://web.archive.org/web/20080419225755/http://www.suslik...

Old magazines, BBSes, Usenet, and mailing lists are full of this sort of humor.

https://www.mipmip.org/tidbits/project.html


The original version of the post [1] cited the article "Real programmers Don't Use PASCAL" by name. For some reason it disappeared in the later versions.

[1] https://cboh.org/mel.txt


Thank you. I thought I remembered that from elsewhere, and was a little surprised to see the oblique reference to one of the many similar writings from the time.


     The new computer had a one-plus-one
     addressing scheme,
     in which each machine instruction,
     in addition to the operation code
     and the address of the needed operand,
     had a second address that indicated where, on the revolving drum,
     the next instruction was located.

     In modern parlance,
     every single instruction was followed by a GO TO!
     Put *that* in Pascal's pipe and smoke it.
The Apple II floppy disk drive controller had a piece of "logic state sequencer" which was programmed this way: each instruction contained the address of the next one.

Or rather, something like this: the ROM is 256 bytes arranged as 16x16. Half of each byte is an opcode, and the other half is the row of the next instruction, 0 to 15. The column is determined by some external inputs to the sequencer, representing the state of the hardware.

See https://archive.org/details/understanding_the_apple_ii (sec 9-14).
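
If it helps, here's a toy Python model of that structure (my own sketch; the ROM contents are dummies, and which nibble holds which field is my assumption, not a transcription of the real controller):

  # Toy model of a "next address inside the instruction" sequencer:
  # a 16x16 ROM of bytes, each packing an opcode nibble plus the row of
  # the next instruction; the column is picked by hardware inputs.
  ROM = [[0x00] * 16 for _ in range(16)]   # 256 dummy bytes

  def step(row, hardware_inputs):
      column = hardware_inputs & 0xF       # external state selects the column
      byte = ROM[row][column]
      opcode   = byte & 0x0F               # what to do this cycle (assumed low nibble)
      next_row = byte >> 4                 # where to go next: the built-in "GO TO"
      return opcode, next_row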


Reminds me a bit of the poem, "The Last Bug", possibly best read here with links and all: https://everything2.com/title/the+last+bug


"The program used an elegant (optimized) random number generator to shuffle the "cards" and deal from the "deck", and some of the salesmen felt it was too fair, since sometimes the customers lost. They [the Sale Department] wanted Mel to modify the program so, at the setting of a sense switch on the console, they could change the odds and let the customer win. Mel balked. He felt this was patently dishonest, which it was, and that it impinged on his personal integrity as a programmer, which it did, so he refused to do it."

IMO, this story can be a sort of Rorschach test that can reveal something about its readers.

For example, when I read this what stands out to me is that Mel brought a sense of ethics to the use of computers.^1

Salespeople at Mel's employer wanted to use computers to manipulate customers. Mel refused to help them.

Others read this story and only focus on Mel's tactics for programming a computer.

"You can learn a lot about an individual just by reading through his code, even in hexadecimal. Mel was, I think, an unsung genius."

In the same way that the story's author, who was a Professor of Astronomy at UT Austin, believed that reading Mel's source code could reveal something about Mel, to me, the comments of people who read this story can reveal something about those readers.

"The RPC-4000 computer had a really modern facility called an index register. It allowed the programmer to write a program loop that used an indexed instruction inside; each time through, the number in the index register was added to the address of that instruction, so it would refer to the next datum in a series. He had only to increment the index register each time through. Mel never used it."

To me, "The Story of Mel" is about a person who preferred to think for himself instead of letting others do it for him.

If memory serves me correctly, I recall seeing comments online about this story that "advise" readers, "Do not be like Mel." The question is what do they mean by that. Are they referring to programming tactics, ethics, both, or maybe something else entirely.

1. Not only that, but the author was so impressed by Mel's ability that he adopted similar ethics himself.


My great-aunt Gloria told me at this year's family reunion that she had used machine language (not assembler, she was quick to add) when she started programming.


In 1997 I was doing an apprenticeship. It had a very well organized curriculum and well-thought-out courses. One course was based on the 8080/8085 CPU. You know what they gave us: a development board (roughly the size of a modern mainboard) with an onboard hex keyboard 0-9a-f and 3-4 function keys (halt, run, write at ... something like that). And a book and templated paper. So the exercises, 3-4 hours every week, were: implement that algorithm on paper in assembler, use the book to translate it to hex code on your memory layout, hack the program into your board and press execute. And PRAY that you did not make a mistake in the assembler or the typing.
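
For anyone who hasn't done it, the bookkeeping looked roughly like this (a Python sketch of my own; the 8080 opcodes are the standard ones, and the three-instruction program is just an example):

  # Sketch of hand-assembly: look up each mnemonic's opcode in the book and
  # write the resulting hex bytes into your memory layout by hand.
  opcodes = {"MVI A": 0x3E, "ADD B": 0x80, "HLT": 0x76}

  program = [("MVI A", 0x05),   # A = 5
             ("ADD B", None),   # A = A + B
             ("HLT",   None)]   # stop

  image = []
  for mnemonic, operand in program:
      image.append(opcodes[mnemonic])
      if operand is not None:
          image.append(operand)
  print([f"{b:02X}" for b in image])   # ['3E', '05', '80', '76']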

I still think that this is the most efficient way to teach kids CPU/memory and assembler.


I can still "sight-read" much of the x86 instructions from a hexdump or even (E-)ASCII --- the ISA is much easier to memorise and more regular in octal. This is a habit I acquired from the days when I did a lot of binary patching, for various reasons, and good disassemblers weren't yet easy to come by.


A couple years ago, a junior dev was absolutely astonished at my ability to interpret a hex dump as ASCII. I don’t think I remember any numeric opcodes for anything.


Oh wait, I did correctly remember that $20 was JSR in 6502 assembly.


True... and most importantly $A9 = LDA.


My first 6502 bringup on a breadboard was just a CPU and ROM (no RAM) with A9'H'A9'e'A9'l'A9'l'A9'o' while it was attached to a logic analyser so I could see if it was running. Shortly after I got ram working and added a UART, but it was the easiest instruction to see something I recognised and keep the PC moving forward through memory (Of course NOP is good too, but then I don't know if reading is working right or if I'm just reading the same thing over-and-over).

Also my assembler wasn't working, so I had to enter that directly in the TL866 ROM programmer's hex editor & didn't remember any other opcodes.


Neat! I'm curious if you remember anything about this or the materials. I have developed something very similar that I teach to middle schoolers:

https://github.com/mlyle/armtrainer


That looks very identical ;) just so much more modern. A lot smaller and the screen is so much more fancy.

Think that with old style boards, old-style chips, old-style number screen.


Yah! I was familiar with old "microprocessor trainers" and I tried to recreate that. There's nice on-chip debug that makes writing a monitor to do this pretty easy.

I was more thinking-- curriculum? How old were you? Anything particular stick with you that I should be sure to include?


I was 17 to 19 at the time. We had everything, from math, physics, electronics, chip design (board layout simulation etc), micro processors (and after a year assembler on a DOS PC), networking, operating systems (mainframe + Unix + NT), C, Pascal, VBA. Add generic programming, English, project management and some other courses you need for soft skills. 18 months, 40 hours a week, two tests a week all with a full stipend for everyone. After that, you are hardened.

So the curriculum mentioned was more contextual than specific to microcontrollers. They created generalists which are capable of deep diving where they are then used within the company. For microprocessor in detail, I think they focused on translating algorithmic problems and higher level constructs into op codes (something like tail calls).


Wow-- that's really awesome-- a very deep generalist curriculum.

This is along the lines of what I'd like to ultimately create. It's hard, though, because I am offering this in the form of electives, and students have limited slots to fill and as they get older their willingness to take any risk in selecting a class decreases. I teach in both high school and middle school, and I've found I can be a lot "bolder" in what I teach in the middle school so far.


IMHO middle and high school students do not have the life focus and specialization yet. Hacking on a microprocessor trainer is no fun and definitely does not bring you anything if you are not in CS, and even then you can survive without it.


> Hacking on a microprocessor trainer is no fun

I did give them system calls to draw sprites and things on the LCD.

My students had fun competing over all the challenges. Scored very well in the anonymous post-class surveys, too, though I did have one critical score. Also the majority of the students from that class are electing at least one of my classes next year.

Each year I teach one "crazy" class with undergraduate level material to MS students. This year it's going to be circuits. They're going to learn KCL/KVL, transistor biasing, oscillators, amplifiers and gates, how to use decoders and logic gates to spell things on 7seg displays, etc.

> definitely does not bring you anything if you are not in CS

I think knowing what a computer actually is is valuable.

> IMHO middle and high school do not have the life focus and specialization yet.

That's one of the nice things about having an elective-heavy school. They do a deep class on computer architecture, then they go participate in a musical, then they take a 3d art class. Immerse yourself deeply in lots of things to see what's interesting and to get exposure to many ideas.


When I was faced with that task, I rapidly got fed up with hand-assembling and relocating my code and went off to write an assembler to run on a computer with somewhat less anaemic IO. I doubt I actually used it more than once or twice, but it was fun to write.


Relocating was fun. Worse was having your delete-array routine wipe everything you typed for an hour because you made a boundary-check mistake.


Hey, exactly the same I did in my apprenticeship! Only a few years earlier (1988 in my case).


My class batch had a number in the 40s. It is very much possible that we attended the same school.


My CS (or EE?) class did this in 1996 with a Z80 board


And the value of a compiler


And the value of linkers, storage, keyboards and the wonders of mice and 4k resolution (okay then SVGA).


A classic for our new readers!



Can someone way more experienced than me ELI5 why the inner loop part was so clever? I'm the most novice of programmers and have never touched anything remotely like pointers, so I feel I am missing the lightbulb moment the author clearly got. Many thanks.


From the wikipedia article:

Eventually he realized that Kaye was using self-modifying code to process elements of an array, and had coded the loop in such a way as to take advantage of an overflow. Adding 1 to the address field of an instruction that referred to address x normally just changed the address to x+1. But when x was already the highest possible address, not only did the address wrap around to 0, but a 1 was carried into the bits from which the opcode would be read—in this case changing the opcode to "jump to" so that the full instruction became "jump to address 0".

https://en.wikipedia.org/wiki/The_Story_of_Mel
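
As a toy model (a Python sketch with made-up field widths, not the real RPC-4000 word layout):

  # Toy model of the trick: opcode in the high bits, operand address in the
  # low bits. Field widths are invented for illustration.
  ADDR_BITS = 12
  ADDR_MASK = (1 << ADDR_BITS) - 1
  JUMP      = 0b0001                          # pretend "jump to" opcode

  instr = (0b0000 << ADDR_BITS) | ADDR_MASK   # refers to the highest address
  instr = (instr + 1) & 0xFFFF                # the loop "just adds 1" to it

  assert instr >> ADDR_BITS == JUMP           # the carry spilled into the opcode...
  assert instr & ADDR_MASK == 0               # ...making it "jump to address 0"

So the loop needs no explicit exit test: the act of stepping the data address eventually rewrites the instruction itself into the jump that leaves the loop.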


>not only did the address wrap around to 0, ...

Very reminiscent of the 6502 "issue" of jumping with an address on a page boundary.

Per Wikipedia:

the processor will not jump to the address stored in xxFF and xxFF+1 as expected, but rather the one defined by xxFF and xx00 (for example, JMP ($10FF) would jump to the address stored in 10FF and 1000, instead of the one stored in 10FF and 1100). This defect continued through the entire NMOS line, but was corrected in the CMOS derivatives.
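
Roughly, the NMOS part increments only the low byte of the pointer when it fetches the high byte of the target. A Python sketch of the fetch (my own illustration):

  # Sketch of the NMOS 6502 indirect-JMP bug vs. the CMOS fix.
  def jmp_indirect_nmos(mem, ptr):
      lo = mem[ptr]
      hi = mem[(ptr & 0xFF00) | ((ptr + 1) & 0x00FF)]   # wraps within the page
      return (hi << 8) | lo

  def jmp_indirect_cmos(mem, ptr):
      lo = mem[ptr]
      hi = mem[(ptr + 1) & 0xFFFF]                      # carries into the high byte
      return (hi << 8) | lo

  mem = {0x10FF: 0x34, 0x1000: 0x12, 0x1100: 0x56}
  print(hex(jmp_indirect_nmos(mem, 0x10FF)))   # 0x1234, uses $10FF and $1000
  print(hex(jmp_indirect_cmos(mem, 0x10FF)))   # 0x5634, uses $10FF and $1100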


Goodbye Mel. Hello LLVM.


I remember typing in hex programs from Compute! magazine to my C64. It would take a long time and, of course, I made at least one error along the way and had to start over.


This is a story of what not to be.

Much better eating quiche!

I have known some modern versions of Mel. Horrible, arrogant, blissfully ignorant, and (usually) bullies.


Sadly we have strayed to the other extreme. From "every instruction is precious" right over to, "who cares how many instructions I use?"

Nowadays, desktop apps which started in an instant 20 years ago seem to take several seconds before they are even doing anything useful, like loading a project or whatever. Even things like Visual Studio looking through the MRU project list can take 10 seconds (probably a threading issue), and VSCode, Teams and others are equally unimpressive - in fact the browser, the one thing that should be slow due to the network, is usually fastest when using a well-written site.

I'm not sure whether the multi-platform desktop app was ever a great idea. How much time have we actually saved by "not writing multiple apps"? Instead we have ended up with slow, bloated software that doesn't usually run multi-platform, at least not all the time in the same way.


Agreed.

My #1 annoyance these days, because it is so egregious, is Electron apps.

I guess because the only language some programmers know is Javascript, of which I know little but what little I know places it marginally above PHP in intrinsic horror.

So people write standalone apps in a language intended for tweaking web pages, meaning that to deploy those apps requires embedding an entire web browser into every app.

And entire popular businesses, for example Slack, do not as far as I can tell have an actual native client. The only way to access the service is via a glorified web page, running inside an embedded browser. Despite which, it can't actually authenticate on its own and needs ANOTHER web browser to be available to do that.

Electron apps make Java ones look lean and mean and efficient.

Apparently, expecting a language that can compile to native machine code that executes directly on a CPU, and which makes API calls to the host OS in order to display a UI, is quaint and retro now.

And it's perfectly acceptable to have a multi-billion-dollar business that requires a local client, but which does not in fact offer native clients of any form for any OS on the market.

It's enough to make me want to go back to DOS, it really is. Never mind "nobody will ever need more than 640kB"... if you can't do it in 640kB and still have enough room for the user's data, maybe you should reconsider what you are doing and how you are doing it.


> meaning that to deploy those apps requires embedding an entire web browser into every app

It doesn't require it, that's just what they choose, and it has little to do with the language. (Besides, if you actually observe them—and ignore what they tell you about liking JS—then it's clear that most of them hate their preferred language.) Languages and the bindings that a particular runtime exposes are orthogonal. You can have GTK apps written in JS, for example[1], or you can write a program in JS that compiles into a binary that runs on a microcontroller[2].

This is much more of a problem with the culture of Electron and the adjacent NPM ecosystem than it is anything else. Conflating the source of these problems is a great way to tank any would-be activism meant to solve them.

1. <https://en.wikipedia.org/wiki/GNOME_Shell>

2. <https://github.com/Moddable-OpenSource/moddable/blob/public/...>


That's an excellent point, and thank you for elucidating it for me.

I will do a bit more reading up on this.

Cheers!


Actually, JavaScript, even though weird, is a totally different beast as a computing language.

It is readable unlike PHP, dynamic with roots in Lisp, and widely used and hence accessible.

But given its original history and neglect, it has taken decades to fix all the issues it presented in other areas.

Still, I would not dismiss it. Yes, it is slow. But it is not the Java type.

"It can be better" is the motto.


> dynamic with roots in Lisp

That is a good thing, yes.

But TBH I wish Eich had just embedded Lisp in Mozilla instead.

> widely used and hence accessible.

"Everyone uses it" is not a good argument for anything at all, really.

> it has taken decades to fix all the issues it presented in other areas.

Well, yes, it has had a lot of R&D done upon it to make it quicker, but again, that is not an endorsement.

> Still, I would not dismiss it. Yes, it is slow.

It seems to me that in the 2nd sentence there you contradict your own previous sentence.

> But it is not the Java type.

What is so wrong with Java?

More OSes support JVMs than support Javascript. JVMs have few requirements and can even run on DOS. JVMs too have a ton of R&D into making them faster and better. There is rich tooling and support to make Java apps scale, such as Enterprise Service Buses, e.g. JBoss FUSE, which allows apps with clashing namespaces to run on the same host at the same time, and even communicate.

There are tools for writing native UIs for Java apps.

I would rather have Java apps than a Javascript one, frankly. At least with Java apps, one JVM in your OS supports all your apps, whereas JS needs one per app, or even one per window or tab in some apps.

> "It can be better" is the motto.

That is not a good motto. That is in fact a really bad one.


The way I heard it, Eich was hired to put Scheme in the browser, but then Sun planned a big applet event and he had a week and a half to throw together a scripting language that looked like Java, because plan B would have been something more like VBScript.


That's muddled: there was no Sun big applet event, I needed to do a demo inside Netscape to get everyone on the "it's possible to do Mocha" page.

And the VBScript threat was later, from Microsoft doing it in IE3. If I'd missed the Netscape 2 boat, it's likely VBScript would prevail (Netscape 3 was originally 2.1, and 4 was originally 3 but delayed a year; find "Collabra-driven" in https://www.jwz.org/doc/groupware.html).


Thanks, what I should have done was go find https://brendaneich.com/2008/04/popularity/


Interesting... that would make a bit more sense...


The developer time is saved by not writing multiple apps, fuck the end user, who cares about their time?

Seriously, I have to really think about when was the last time there wasn't some degree of unnecessary friction in an application. It really feels like being an end user of modern tech is like being in an abusive relationship.


> It really feels like being an end user of modern tech is like being in an abusive relationship.

I've come to realize that a lot of modern developers consider users of their tech to be little more than cattle. The tech is cattle feed, meant to fatten and ensnare the user, so they can be sold off and slaughtered.

There's really only one party in that kind of relationship that benefits.


+1000. But in my experience the trend started not with developers, but with the other people around them: Product Managers, Designers, Engineering Managers, Steve Jobs wannabes. There was an obvious disdain for users, who were seen as complete dunces that should be shepherded to whatever new functionality happened to pop into their heads. There was also a complete disdain for the medium: designers used to print design chose too-rigid designs that didn't really work that well on a screen, and only adapted when the market started punishing them.

At first programmers were able to resist all that and have a voice, but lately it seems that the only prestige we retained was the salary, so we must play the same tune as the rest of the band. Agile was an attempt at being "self managed" and have a bit more independence, but that was also corrupted and lots of devs hate it with a passion too, so we're mostly back to practicing non-iterative, Steve-Jobsian-gut-feeling-centric development. Programmers have bought into that toxic mentality too.

And even in better situations, such as my current job, the tasks that cause the most issues, take the most developer time, and annoy the user the most are always the same: non-idiomatic features (for the web or for desktop apps), often concocted by designers totally disconnected from the audience, who at most did two or three "interviews" where the user said "yeah, I could see myself using that".


non-iterative, Steve-Jobsian-gut-feeling-centric development

This is a misunderstanding of Jobs. It’s true that he had a disdain for what users would _say_ they wanted, but he was very focused on providing them with something intuitive and easy to use. He wanted to make their lives better, and to ‘surprise and delight’.

He was also very iterative. He regularly saw demos of in-production software (and hardware), and would ask for anything from small tweaks to complete rewrites. He was completely unafraid of throwing away work, and would change his opinions on a dime if they didn’t work out.


Sorry, let me rephrase: I don't think Steve Jobs was like that at all.

But the copycats that don't believe in iterative development or in user research love to pretend they have it all figured out before it goes out for development.


> There was an obvious disdain for users

The disdain for "lusers" came from BOFH sysadmin types, well before it was adopted by the non-"tech", business-focused folks.


But it became industrialized by business-types. The BOFH thing was personal. They considered (still do, sometimes) users of their systems to be "the great unwashed."

Basically, pests.

Business types look at users as a resource to be exploited to make money.

Basically, livestock.

Different outlook. We try to discourage pests, but we breed and incubate livestock. In neither case, are we particularly interested in the long-term benefit to our users. If anything, the BOFH types are actually working towards the benefit of their "lusers," because that's their job.

I write software that is targeted at a demographic that I actually respect, and sincerely want to benefit with my work (so, naturally, I don't get paid for it).

I'm constantly fighting with "modern software types" that want to treat users of the software that I write as livestock. They -quite literally- can't understand my PoV.

It's fairly discouraging, really. I'm treated like an idiot, because I actually want to help the users of my software.


The only way I see that happening is if it becomes easier to crowdsource donations. When your users are the ones putting bread on your table, they're the boss. Whatever they want they get. But sadly it's hard to crowdsource from programmers because there's so few of us. I love building and sharing software that delights my peers. Not because it's a smart thing to do. If money was the thing I cared about, then it'd be more rational to play video games on Twitch and blog about culture conflict on Substack. Rather coding is something I feel compelled to do and I won't stop even if it destroys me.


It predates the BOFH a bit as well. I am restoring a PDP-10 to operation and the operating system refers to users as "lusers", non-sanctioned users of the system are "turists" who were just there to gawk at things. It's not so much out of disdain for the people themselves as what they were doing with the computer - when computer resources were limited, it was grating to have to wait while unskilled and uncaring people occupied those resources for frivolous or unnecessary reasons.

Edit: Consider being told something along the lines of "Your DNA sequence has to wait, the CEO has important Facebook posts to read..."


> The disdain for "lusers" came from BOFH sysadmin types, well before it was adopted by the non-"tech", business-focused folks.

Based on the definitions in the thread, I'd say the BOFH attitude is more the inverse: it is contemptuous towards users, whereas the modern practice is more condescending towards users.

The latter still has a notional ethos of catering to the user, but the Monkey's Paw corruption caters towards the user's most superficial desires, particularly at a first impression, while de-optimizing for the acclimated or "power" user.


Exactly, the modern practice is condescending. The prevalent thinking is that "users don't really know what they want", so there is zero research, zero iteration, zero respect and a lot of corralling in the application to force users into a (lucrative) workflow.

But the treatment itself is first class, unlike with sysadmins of yore.


I think those are totally different kinds of disdain.

The former is generalized misanthropy plus specific hostility to the individuals who bother them.

The latter is more akin to the feudal lord or the cattle farmer: a lack of empathy plus an eagerness to stuff one's own pockets such that they build exploitative systems.

Sysadmins ultimately just wanted to be left alone to pursue their techie interests. But the MBA types are the opposite. You can't have an upper class without a set of lower classes to provide you with income and feelings of power.


There's that old joke that only two industries call their customers "users."


I don't remember the norm being "desktop apps which started in an instant 20 years ago". Or 30, for that matter. Do you have any data for that? E.g., when I think back over launching Photoshop over the decades, I remember loading screens all the way through.

My take is that launch times are on average better but still not great. And I think that's because something else that's constant is that programs first get written for the convenience of the programmer, and then optimized until people stop complaining about the speed.

The waste of resources bothers me too, of course. But on the other hand, computing resources have become radically cheaper while programmer time has gotten much more expensive. At the same time, writing software has gotten radically more complex. So I suspect me getting bothered here is sort of like when my grandmother, who grew up in the Great Depression, got bothered when I didn't wash and reuse aluminum foil.


> I don't remember the norm being "desktop apps which started in an instant 20 years ago".

What keeps happening is this: we get an upgrade, our old applications suddenly feel "instant" in comparison to both the newer stuff and to our past experience of those same applications, and then we watch that win get clawed away from us.

Of course, it was never instant, even with the old or newer hardware.

My brother spoke a few weeks ago of being awestruck at programs starting "instantly" when he got a hard disk for DOS on the XT. But now I have a better-than-XT-equivalent (9.54MHz V20) with a very fast IO subsystem, and I can tell you that it is very much not actually instant.


Who cares about instruction counts? Computers are there to get stuff done; they're a means to an end.

Good algorithms are far more important than instruction-level optimization, which is to a large degree pointless anyway, since new CPUs come out all the time and change the tradeoffs involved.
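
To make that concrete, here's a toy C sketch (entirely my own illustration, nothing from the story or the machines discussed): no amount of instruction-level tweaking rescues the linear scan once n gets large, while switching algorithms does.

    #include <stdio.h>
    #include <stdlib.h>

    /* Toy comparison: find a key in a sorted array of n ints.
     * The linear scan is O(n) no matter how cleverly its inner
     * loop is tuned; binary search is O(log n), and that swamps
     * any micro-optimization as n grows. */

    static int linear_find(const int *a, int n, int key) {
        for (int i = 0; i < n; i++)        /* ~n comparisons worst case */
            if (a[i] == key)
                return i;
        return -1;
    }

    static int binary_find(const int *a, int n, int key) {
        int lo = 0, hi = n - 1;
        while (lo <= hi) {                 /* ~log2(n) comparisons */
            int mid = lo + (hi - lo) / 2;
            if (a[mid] == key)      return mid;
            else if (a[mid] < key)  lo = mid + 1;
            else                    hi = mid - 1;
        }
        return -1;
    }

    int main(void) {
        enum { N = 1000000 };
        int *a = malloc(N * sizeof *a);
        if (!a) return 1;
        for (int i = 0; i < N; i++)
            a[i] = 2 * i;                  /* sorted even numbers */

        /* Key not present: worst case for the scan. */
        printf("linear: %d\n", linear_find(a, N, 1));  /* ~1,000,000 steps */
        printf("binary: %d\n", binary_find(a, N, 1));  /* ~20 steps */
        free(a);
        return 0;
    }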


I like this new word "esthetics"! It's like a cross between ethics and aesthetics. How to do the right thing and make it look good too, I guess!


I think it's just a spelling variation. "Ae" or "æ" does show up in English words sometimes, but a lot of times it gets simplified to "e" (probably closer to the pronunciation).

Examples: "aeon"/"eon", "caesium"/"cesium", "paediatrician"/"pediatrician", "anaesthesia"/"anesthesia", "haemoglobin"/"hemoglobin", and "chimaera"/"chimera". And more debatably, "daemon"/"demon" and "aeroplane"/"airplane".

Generally, British spellings are more likely to have "ae" and American spellings are more likely to have "e".

But it only gets dropped in some words. You never see "erobic exercise", "erospace engineering", "erosol spray", or a conductor being called "mestro".


You forgot my favorite one: "vaginæ"/"vaginae" is an archaic plural of "vagina".

But "vagine" is not to be seen in any dictionaries afaik.

Cracks me up because Æ is a letter in the Norwegian alphabet with a completely different sound than the "ae" diphthong in English. So to me it looks like you started saying "vagina" but ended up screaming at the end.


Hm. Aesthetics really doesn't make sense in the context. Ethics does.


To me, aesthetics does fit. It has two different meanings. One refers to the overall subject area of beauty, like, "This warehouse isn't pretty, but the people who made it weren't thinking of aesthetics." The other meaning refers to a particular artistic style or concept of beauty. For example, minimalism is an aesthetic, because to some people a clean look is beautiful. You might say the main page at google.com has a minimalist aesthetic.

If a programmer has a certain sense of what they feel is beautiful, they can incorporate that into their code and the way they go about creating it. So you could say Mel's sense of what makes code beautiful is deep knowledge of the computer and clever code that ties in with that.


It's just a variant spelling of aesthetics.


I find this new-fangled blank-verse rendition grating, despite what the author thinks.


Seems like yet another exercise in past-huggery and a penchant for doing things the hard way. Real programmers don't write write-only code.


>> Real programmers don't write write-only code.

He did figure out how the program exited the loop, it just took him 2 weeks.


hehehe.


I had to strip out the pretend poetry formatting before reading.

https://pastebin.com/raw/TyqZXPVe


But the formatting

is part, I think

of the charm.


I agree with the sibling comment, the verse formatting makes it even better!


I have to disagree - it was a nice read without the grafted-on pseudo-poetic pretentiousness.


Maybe not really appropriate for a site called Hacker _News_. The site name suggests something new should be posted.


I showed "The Story of Mel" to my Dad a couple of years before he died. He started as a software developer some time in the 60s. He'd never seen it before, very much enjoyed it, and noted that it reminded him of a few people he'd worked with back then.

"New" is a matter of perspective.

I'm sure there are plenty of HN readers who haven't seen it before, and I'm happy to see dupes of high quality stuff pop up from time to time. I was just thinking earlier that it was time for someone to re-post "No Silver Bullet" soon. Perhaps I'll do it myself if nobody else takes the hint.


Older articles are posted fairly often; I don't see why this one is inappropriate.


https://news.ycombinator.com/newsguidelines.html

> On-Topic: Anything that good hackers would find interesting.


HN is for anything that gratifies intellectual curiosity (https://news.ycombinator.com/newsguidelines.html). Sometimes that's news in the usual sense of the word, but often it's more obscure things, and those are welcome here.

I think of the "new" in the "news" in "hacker news" as being like the used clothing store in my home town that used to be called "New to You".

In the case of classics/perennials like the OP, there are of course a lot of readers for whom the story is anything but new, but it's important that we also take care of the newer cohorts of users and make sure they get some exposure to the classics too. That's one reason why reposts are allowed after a year or so.


I hope I never have to work with someone like him. He sounds awful.


There is a saying: "autres temps, autres moeurs" (other times, other customs).

In a way I agree - someone programming like this right now without very good cause would be a bit of a nightmare if you ever had to tangle with their code. But it really was another world back then.

They were so resource-constrained [1]. That "drum" wasn't the hard disk, it was the memory! Think about waiting for the rotation of a drum for each instruction read. And its total capacity was only 4096 words. The laptop I am typing this on has about 32 million times more memory, and I don't like to think how much faster it is.

You either used clever tricks, or you wrote very limited programs. There was no room for any overhead. The author of that story wasn't astonished by clever tricks, just the degree of cleverness.

[1] https://en.wikipedia.org/wiki/LGP-30#Specifications
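
To get a feel for the kind of trick the story celebrates, here's a rough sketch of "optimum coding" on a drum. This is my own illustration; the sector count and per-instruction costs below are made-up numbers for the example, not the LGP-30's actual figures. The idea is to place each next instruction at the sector that will be passing under the read head just as the current instruction finishes, so the program never has to wait out a full revolution.

    #include <stdio.h>

    /* Illustrative "optimum coding" placement on drum memory.
     * Assumed model: 64 word-sectors per track, and execution cost
     * measured in word-times, i.e. how many sectors pass under the
     * head while the instruction runs.  The best home for the next
     * instruction is the sector arriving under the head the moment
     * the current one completes. */

    #define SECTORS_PER_TRACK 64

    static int optimum_next_sector(int current_sector, int exec_word_times) {
        return (current_sector + exec_word_times) % SECTORS_PER_TRACK;
    }

    int main(void) {
        int pc = 0;                        /* sector of current instruction */
        int cost[] = { 3, 5, 2, 8 };       /* made-up per-instruction costs */

        for (int i = 0; i < 4; i++) {
            int next = optimum_next_sector(pc, cost[i]);
            printf("instr at sector %2d (cost %d word-times) -> next at %2d\n",
                   pc, cost[i], next);
            pc = next;
        }
        return 0;
    }

Place the next instruction badly and every step eats most of a revolution; place it well and the drum's rotation is effectively free.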


When adding 256 kilowords of memory to your PDP-10 cost $230,000 (2022 dollars), you were either clever or unemployed.

If your programs couldn't run on the machine you had, they were no good. Nobody was going to spend a quarter million dollars on equipment for the sake of source code beautification.


In 1998 I was bringing up a new MIPS-based board that had been designed in-house.

The CPU booted from an EEPROM and started running code. There was an FPGA on the board that controlled the memory. The FPGA needed to be loaded with a bit-stream that was also on the EEPROM. The trick was that I had to write a program to load the FPGA without referencing any memory -- I only had ROM and the CPU registers. Fortunately the MIPS had quite a few registers, but I had to abuse all the register-use conventions and the code jumped through some hoops in order to be able to get the FPGA loaded so we could start running from RAM.

There were all kinds of weird things about that hardware that we had to fix in software... I didn't realize until switching jobs exactly how weird it all was; I just thought it was normal.
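
For anyone curious what that kind of bring-up code looks like in spirit, here's a rough C sketch. The register addresses, bit layout, and names are hypothetical placeholders, not the real board's map, and the actual code was hand-written MIPS assembly that kept every value in CPU registers (a leaf routine like this, compiled with optimization, needs no stack either).

    #include <stdint.h>

    /* Conceptual sketch: clock an FPGA bitstream out of boot ROM with
     * no usable RAM.  Every address and bit below is a hypothetical
     * placeholder.  "No RAM" here means: no globals, no buffers, just
     * locals the compiler keeps in registers while reading from ROM
     * and poking memory-mapped FPGA configuration registers. */

    #define FPGA_CTRL   (*(volatile uint8_t *)0xBFC40000u)  /* hypothetical */
    #define FPGA_STATUS (*(volatile uint8_t *)0xBFC40004u)  /* hypothetical */

    #define CTRL_DATA   0x01u   /* serial data bit      (hypothetical) */
    #define CTRL_CLK    0x02u   /* config clock bit     (hypothetical) */
    #define STAT_DONE   0x01u   /* "config done" flag   (hypothetical) */

    static void load_fpga(const uint8_t *rom_image, uint32_t len)
    {
        for (uint32_t i = 0; i < len; i++) {
            uint8_t byte = rom_image[i];        /* fetch from ROM only  */
            for (int bit = 7; bit >= 0; bit--) {
                uint8_t d = (byte >> bit) & CTRL_DATA;
                FPGA_CTRL = d;                  /* present the data bit */
                FPGA_CTRL = d | CTRL_CLK;       /* rising edge clocks it in */
            }
        }
        while (!(FPGA_STATUS & STAT_DONE))      /* wait for configuration */
            ;
    }

Once the FPGA reports done, the memory controller finally exists and the rest of the boot code can start using RAM.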


Some things never change. One of my first HN comments got me four downvotes for saying much the same thing.

https://news.ycombinator.com/item?id=679208

Back then, I was so new that it bumped me down to a negative score. I almost made a new account but decided to keep it and try to rebuild my damaged rep.

Nowadays my score is in the thousands, but Mel still sounds awful.


If you're going to post an unpopular opinion, I think the only way to avoid downvotes is to at least make your comment interesting or novel in some way. Merely saying "I don't like X" is neither.


I'd be interested to know exactly why: is it because he doesn't obey his bosses when told to make the program cheat, or just the way it's framed as some kind of testosterone caveman thing?


    In the story,    
    The storyteller is tasked with maintaining Mel's code.    
    He can't do it. Mel hadn't left any documentation.    
    All sorts of tricks.    
    No explanation of what's going on.


Upvoted for the effort of responding in verse, very nice.

That rubbed me the wrong way too. Even the hackers at MIT documented their hacks; they wrote up a memo explaining a bunch of them so others could understand and build on them - see HAKMEM.

Documentation was much more extensive and available back then. Systems frequently came with full schematics, and you could call the design team on the phone if you wanted. DEC would get phone calls about the PDP-10's RIM10B bootloader right up until the retirement of the 36-bit processor line, and they did their best to explain its tricks.

The loader was made to fit entirely in the processor registers so it didn't touch the memory it was loading. To do this it made use of a specific and documented aspect of the processor: the first thing the processor does when executing an instruction is determine its effective address, and nothing the instruction itself does can affect its own effective address calculation.

They had two bold-print warnings about this in the processor manual, both before and after the RIM10B source code, but some people still required more explanation. For those people, the explanation was given.

In Mel's story, he grinds out a very well optimized program, and while I can appreciate the skill it takes to do that, he documented none of it. This was customer-facing code. That's unacceptable even by their standards, and even his own co-workers of the era would have thought he was an asshole. A skilled asshole, with skill worthy of respect, but an asshole nonetheless.


The story is known to be inaccurate in technical detail¹, so it's not impossible that it's inaccurate in social detail as well. There's some LGP-30 code on Bitsavers including scanned coding forms, and the ones initialled ‘MK’ are not notably different in level of commentary from the others.

¹ https://news.ycombinator.com/item?id=20489273


Ah, I didn't know that. As for the code on Bitsavers, I'll have to check that out. I should have expected there would be something there - Al is a gem (albeit a cranky old gem), and I wish he'd get more love.


Obviously difficult to work with and unprofessional. Utilized programming side effects which obscured the intent of the code to such an extent that he unintentionally reversed the logic of the test he was supposed to implement.


Abuse of side effects to save code is just a reality of working in limited or embedded systems once you have run into their limits. On one of my current projects, the target has only 8K of program memory. I had to do some things I'm not proud of to get the required functionality to fit, because the alternative was to say "It can't be done, scrap everything." The target is an embedded hard-realtime processor core, so there's no way to add more memory to it. It's a component integrated into a larger system, so I can't demand a larger or more capable device without incurring significant redesign costs. Quite frankly, I won't be considered worth those costs. Nobody's going to take on the expense and effort simply because I can't hack it. I'd have a hard time blaming someone for doing painful things to keep a project alive, considering I'm doing exactly that. We can always agitate for better conditions next time around, but this time we must go to war with the army we have, and sometimes that necessitates fighting dirty.

The rest of the stuff is entirely valid though. A programmer that cannot be managed is worthless, and someone who is unwilling to accept correction is a liability. I wouldn't refuse to work with him because unless I'm his superior I don't get to make that call, but I would probably go out of my way to avoid interaction. (And if I were his superior, unless he's got political juice or something, I'd have him shown to the door if he won't work with the team instead of against it.)


Obviously, not documenting things and writing the most clever code possible makes it difficult for other people to read your code, there’s no doubt about that. And the story acknowledges that. It’s clear that Mel is a very clever programmer but probably a pain to work with.

But your comment is just so dismissive of the point of the story, Mel’s cleverness. It comes across like you saying “This person sounds awful because they’re smarter than me.” I can assume that’s not what you’re trying to say, but that’s what it sounds like.

If you wrote a more nuanced comment acknowledging that, e.g. ‘while I would love the opportunity to learn from someone like that, I’m glad I don’t have to work with them or maintain their code,’ I would be more inclined to see your comment as contributing to the discussion.


“This person sounds awful because they’re smarter than me.”

I didn't read the comment that way, but then I share the opinion that Mel seems like a difficult person to work with or to manage. He wrote obscure code and apparently failed to document any of it. He took glee when said code worked the opposite of what was requested. The guy who took over seems to have had to waste countless hours de-obfuscating the code in order to reverse the logic of the test.

I appreciate the cleverness, but there are some red flags here. Also, I got the impression that some of the cleverness was for its own sake, rather than out of necessity.


EDIT:

OP made a response indicating that his objection to Mel was that he left zero documentation.

To this, I agree 100% !!

Leaving no documentation is doing your future self a huge disservice (even a few weeks from now it'll be helpful if you've left yourself some good breadcrumbs to follow), and is pretty much a hostile act towards the team.

----- initial comment -----

Sure, if he's doing that kind of highly idiosyncratic stuff in the modern software & hardware environment, he'd be a hindrance to any team.

But this was not that situation, and sadly, this comment reveals a deep cluelessness about the technology underlying the computing industry.

The article shows a real genius at work, fully understanding that what he is doing is programming a computing machine, and using every available advantage to get it to yield a program that performs well.

Sadly, software now is optimized entirely for the convenience of the developer, and with literally billions to trillions of multiples of the computing power available to Mel, most software today is utter crap, taking tens of seconds to even load because it barely floats in an ocean of bloated abstraction and 'frameworks'.

The fact that you cannot even recognize the obvious genius in that story indicates that you should really learn a lot more and seriously rethink your approach to computing. Learn how the hardware and software actually work. Work hard to strip out unnecessary dependencies, middleware, frameworks, etc., and make your applications snappy. With today's hardware, there is literally no excuse for software that does not respond faster than human perception. But sadly, today, if you can make it work that way, you'll be the exception — so be that exception.


Please read his response of about 30 minutes ago, it's a few replies up. It's not what you think.


Oh, thanks for the alert!

Indeed not at all what it first seemed.

edited (the original had sat in draft for a while and was then submitted without reading the newer context comments)



