ZX81: Small black box of computing desire (bbc.co.uk)
84 points by Peroni on March 11, 2011 | 33 comments



Terry Pratchett told us a tale of making a crude vision system out of one of these. If I recall correctly, he figured out the memory modules were light sensitive, and if exposed to a focused image, the image could be 'read' straight from memory. I THINK he mentioned part of this in his speech at Penguicon 2.0 (Novi, MI -- 2003)... Actually, he DID!!! I just found a recording of his speech:

http://www.archive.org/details/Penguicon_2.0_Terry_Pratchett...

He starts talking about playing with the ZX-81 about two and a half minutes in, and at a little after five minutes, starts talking about making one "that can see things." This so impressed fellow guest Eric S. Raymond that he came charging out of the banquet hall after the speeches to demand to hear more!

I had something of a front row seat to this conversation, as I'd already approached Mr. Pratchett to ask about how he'd wound up too radioactive to enter a nuclear facility (also mentioned in the speech, but kind of glossed over). So there I am on the inside of a crowd of onlookers as Terry's elaborating on the radiation story, with a flabbergasted ESR staring in disbelief, all of us standing barely a meter apart.

It was... an interesting experience.

I do wholeheartedly recommend that speech. Terry Pratchett is every bit as funny at the podium as he is on paper. He had us rolling!


Boast: after writing a character scroll (trivial, just a block memory move - one Z80 instruction, LDIR), I worked out how to do one using the quarter-character "pixels", the heart of which was three bitwise boolean instructions. I was later delighted to see, in the source code for the ZX81 BASIC (by Steve Vickers), that the line-drawing routines using those pixels used identical instructions.
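For anyone curious what the quarter-character "pixel" trick looks like, here is a minimal Python sketch of the idea. The ZX81 had no bitmap: each character cell on the 32x24 text screen could display one of a set of 2x2 block-graphic characters, giving an effective 64x48 "pixel" grid, and plotting came down to a couple of bitwise operations on the cell's value. The bit numbering and cell layout below are assumptions for illustration, not the actual ZX81 character codes.

```python
# Simplified model of ZX81 quarter-character "pixel" plotting.
# Each character cell holds a 2x2 block of sub-pixels; here a cell is
# a 4-bit integer with one bit per quadrant (an assumed encoding, not
# the real ZX81 character set).

COLS, ROWS = 32, 24  # the ZX81 text screen is 32x24 character cells


def make_screen():
    """A blank screen: every cell is the 'all quadrants off' value."""
    return [[0] * COLS for _ in range(ROWS)]


def plot(screen, x, y):
    """Set the quarter-cell pixel at (x, y) in the 64x48 pixel space."""
    cell_x, cell_y = x // 2, y // 2
    quadrant = (y % 2) * 2 + (x % 2)          # 0..3 within the cell
    screen[cell_y][cell_x] |= 1 << quadrant   # OR turns the quadrant on


def unplot(screen, x, y):
    """Clear the quarter-cell pixel at (x, y)."""
    cell_x, cell_y = x // 2, y // 2
    quadrant = (y % 2) * 2 + (x % 2)
    screen[cell_y][cell_x] &= ~(1 << quadrant)  # AND with mask turns it off
```

The character scroll mentioned above is simpler still: since the display file is just a block of memory, scrolling by a whole character row is a single block copy (which the Z80's LDIR instruction does in one go).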

Boast: at 15, I wrote a game for the ZX81 (in machine code) that had many simultaneous sprites, all animated at once; a tiled, scrolled background; a massive ship explosion; and even different keyboard control selections. I showed it to a game retailer, and they agreed to stock it! All I needed to do was supply them with cassettes + covers...

Shame: I didn't do this.

One good effect was that when I was next running a business, I resolved not to cop out on the difficult (read: boring) things. It's important to recognize mistakes as choices (and not things beyond your control, such as intrinsic inability, "I can't"), because then you have the power to choose otherwise.


"Micro Men" is a new (I think) comedy from the BBC about Clive Sinclair: http://www.bbc.co.uk/programmes/b00n5b92 The clips (needs proxy) look hilarious - in a very weird, slow, dark, English-humour kind of way!


It's more dramatic comedy than factual. On the other hand, it is very entertaining and well worth watching if you can get your hands on a copy. (It also has Martin Freeman of The Office fame, and Alexander Armstrong, a well-known British comedian.)


While they compressed some events and characters, nothing was made up. The 'Battle of the Baron of Beef' (where Clive Sinclair started a brawl with Chris Curry) is a real event, and one well known to people in the Cambridge IT industry.

Look out for a guest appearance by Sophie Wilson (previously Roger Wilson) as the barmaid.


I think it showed originally last summer, but on BBC 4 (which has relatively few viewers). Caught a couple and it was kind of fun. Makes Sinclair look like a bit of a tyrant!


I think the whole series is here: http://youtube.com/watch?v=2y8IkcUGV9w


It was just a one-off program, not a series, but it's been broken up into sub-10-minute chunks for Youtube.

Well worth watching. It's probably got relevance for startups as well as historical interest for geeks generally, and it touches on the beginnings of ARM, which of course is a big deal these days.


1K of RAM if you were a sucker :) My dad brought his home with the massive 16k expansion brick. Playing with this (my first computer) and waiting for a game called Rings Around Saturn to load was literally the most futuristic-feeling experience of my life.

I remember nothing about the game itself, other than the kind of wistful enchantment that so few things in life can inspire.

Where did it all go wrong?!


"Rings Around Saturn"

Interesting, I hadn't heard of it. http://www.zx81stuff.org.uk/zx81/generated/tapeinfo/s/SuperP... -- see the listing and play it online.


Didn't even occur to me to look for it. Thanks for that!

Looks even simpler than I expected. I was looking for an idea for a simple iOS game to get started. This may be it :)


   16k expansion brick
I think brick is definitely the best description!

For those of you too young to remember:

http://farm4.static.flickr.com/3074/2671527140_fd269593b5.jp...


Nah, bricks stay put.


I also got a futuristic feeling from the science-fiction covers of the manuals. http://james.istop.com/zx81/cover.html


Ahh yes, the 16k expansion brick. If you sneezed too hard or a fly landed on the table it would reset. Didn't it also turn really hot after a few hours?


In most Eastern European countries (and in parts of South America, I heard) whole industries grew out of building clones of these machines, usually with a bit more memory but otherwise similar specs. I started programming on one of these when I was 9: http://www.old-computers.com/museum/computer.asp?c=632&s...


"... The Sinclair ZX81 was small, black with only 1K of memory ..."

It was also small and white with 1K of memory if you bought the 8K ROM upgrade chip (which I still have somewhere) for the ZX80 ~ http://www.flickr.com/photos/bootload/sets/72157607718005837...


It's articles like these that make me wish I were old enough to have experienced computers before they became "fast". These machines formed the thinking of many of our greatest coders today, and I can't relate to that as I'm too young. Oh well


You can still write code that stretches a modern machine to its limits. For example, can you make a realtime raytracer? Can you crunch through 13GB of StackOverflow data and extract useful statistics? Can you write an interpreter for a subset of your favorite language? Can you make it fast enough? Can you add a JIT? Can you take a genetic algorithm that takes hours to run and get the time down to a few minutes? Can you write a multi-agent simulation that will scale from ten to ten thousand agents on a MacBook Pro?

Nope. Computers aren't fast enough yet :p


You can do all that, but programming one of these old computers or a modern microcontroller lets you almost see the bits and bytes flowing (if you program in asm).

Nowadays it's even hard to know how a pixel gets from your RAM to the monitor in the first place, given the massive amount of knowledge needed and the countless layers in between.

Just the datasheet of a memory controller is many times longer than the few pages needed for the asm instruction set and simple schematics of those times. Not to mention the millions or billions of transistors on modern chips :)


I can't argue with that. I've personally tried to write a toy OS on the x86 architecture. The amount of processor documentation you have to get through is mind-boggling, after which you're still left with documentation for the plethora of peripheral devices. I view this as a failure of the PC architecture. There's no reason for things to be so complex.

OTOH, I know a few folks who work with all kinds of micros. Once these folks were trying to interface an SD card reader with a tiny LCD screen that had an onboard processor. They had (had to have, in fact) low-level access to the card reader. You had to know the ins and outs of whatever filesystem the SD card was formatted with, because you only had raw access to the card. As in, you could go to an address on the card and do something to the bytes stored there, and that's it. After diddling the card reader interface for a while, they eventually figured out how to get data onto the card, but there was a problem: sometimes the card reader wouldn't write the data to the card at all. After trying to pinpoint and fix the issue in code, an entire night of hacking later, the problem turned out to be a faulty power adapter.

It was a fun night. One that a "modern" computer couldn't afford you.


Start spending time programming microcontrollers and relive the '80s at 1/10th of the price.


Do you know of any microcontrollers that you can program ON? (Without needing to flash them from a separate computer?)

I've played around with machine code monitors on C64 emulators, and I'd love to have something like this on a microcontroller today. (Perhaps a machine code monitor on a calculator?) I don't know of anything like this.


There's a company called Briel Computers that sells replicas of the Apple I, Altair 8800 and more.

The Maker Shed sells a 4-bit microcomputer trainer from Japan. It has a hexadecimal keypad, a single 7-segment LED display, some other lights and a speaker.

You could also go to eBay or a swap meet and buy an old microcomputer.

http://www.brielcomputers.com/

http://www.makershed.com/ProductDetails.asp?ProductCode=MKGK...


I don't actually but the 4-bit micro mentioned below sounds cool.

I am currently working on an ATmega project and hope to bring such a "self-hosted" programming environment to the device, which I'll post to HN when I get it working.


Probably the best way to get something like that would be to build it yourself. It may not be the kind of project you're looking for though.


I'm guessing the experience is similar to hacking graphing calculators, if you've ever played around with those in school. That's where I got started, on a TI-84+ :)

And no, it hasn't damaged my brain, I think.


It still applies. The things that are hard to do today - and those that require ingenious hacks - are the ones that future computers will find trivial.


Well, there are emulators out there.

On the other hand, if you don't mind things being fast as well as simple, there are dead-simple hobby OSes out there for exactly this sort of tinkering: http://news.ycombinator.com/item?id=1088617

I do remember the fun of PEEKing and POKEing with my TRS-80...


This was back in the day when programmers were real men and pixels were as big as your fist.


Don't know about real men, I was just hitting 8 when I finally got my hands on a hand-me-down zx81.


Me neither; real man or not, I was around 9 when I got my hands on a VC20.


Interestingly enough, I did this a few days ago for a project. http://www.vga.hr/pr/intro.mov (22 MB - and this was an early WIP test; fonts aren't properly aligned, missing scanlines and stuff)



