Updating “101 Basic Computer Games” for 2021 (codinghorror.com)
147 points by mariuz on Feb 28, 2021 | 65 comments



I've been thinking about BASIC a lot, as I've been looking for "the easiest programming language for a non-programmer to learn". I'm not exactly sure that rewriting these in Python/Ruby/JS/VB.NET using "better coding patterns" is a good idea, because a lot of the things programmers take for granted are really tough for complete newcomers. Examples:

- I remember when I started, using BlitzBasic, I had a really hard time understanding the concept of arrays.

- A friend of mine, when I showed him the statement 'x = x + 1', was confused; surely, x cannot also equal x + 1.

- When teaching Python, the concept of function parameters and variable scope is always a struggle.

To some degree, having spaghetti code and all globals is useful for teaching because it 1) gets people excited to program and see something working and 2) demonstrates why you don't want to do those things. I remember when trying to write my first text based CYOA game, I was so excited to add content to it and see it on screen, but also learned very quickly that having to scroll through 20 pages of if statements was not fun, and why functions would help. I'm not sure if someone explaining "encapsulation" to me would have really taken root.

I think there should be a rewrite of the book for modern languages and development environments, but one that keeps the complexity extremely low, the way '70s BASIC was intended.

EDIT: Just looked at https://github.com/coding-horror/basic-computer-games/blob/m... , it actually looks like a pretty good blend of original but with mild improvements.


> To some degree, having spaghetti code and all globals is useful for teaching because it 1) gets people excited to program and see something working and 2) demonstrates why you don't want to do those things. I remember when trying to write my first text based CYOA game, I was so excited to add content to it and see it on screen, but also learned very quickly that having to scroll through 20 pages of if statements was not fun, and why functions would help. I'm not sure if someone explaining "encapsulation" to me would have really taken root.

While I'm certainly no fan of spaghetti code, I think this is an important kind of experience for novices. I, for example, let mentees at work go down a wrong/weird path with their code for a couple weeks when I see bad code patterns or poor choices in algorithms/data structures in order to give us something to discuss and correct later when they start to hit pain points that I, usually correctly, predict they'll hit. I let them know that they're making their life harder, but I don't direct them to change course (some do on their own, others don't). Telling someone who's just learning how to do things the "correct" way (to the extent that there is a singular correct or a small set of correct options) doesn't give them sufficient motivation and experience to actually learn and internalize it. Often you get a kind of cargo cult mentality around the things you teach them since they lack comprehension of why those are good ideas or better ways of organizing code or whatever, or a later rejection of what they're taught because they don't understand the motivation behind it.

Also, letting them go down the path of spaghetti code or confused structures gives an opportunity to teach them refactoring techniques, which even expert programmers still have to apply to the systems they develop (most people don't write perfect code on the first pass, and most systems have changing requirements that would render the perfect code imperfect even if they did).


One of my favorite authors, Robertson Davies, put this idea nicely in "The Rebel Angels":

> To instruct calls for energy, and to remain almost silent, but watchful and helpful, while students instruct themselves calls for even greater energy. To see someone fall (which will teach him not to fall again) when a word from you would keep him on his feet but ignorant of an important danger, is one of the tasks of the teacher that calls for special energy, because holding in is more demanding than crying out.

It's a quotation that I often keep in mind while helping my kids learn.


Totally agree. In addition, it can just get plain overwhelming. Don't use a global, use a function -> what's a function? -> a subprogram with parameters and return values -> what's a parameter? a variable with scope existing only for the subprogram's duration -> what's scope? -> ... -> what's an abstract syntax tree?

When really they just wanted to increment the score variable by 10.
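To make it concrete, here's a toy sketch in Python (my own illustration, not from any particular course or book) of the gap between what the beginner wants and what the "proper" advice drags in:

  # What the beginner actually wants: bump the score.
  score = 0
  score = score + 10  # one global, done

  # What "don't use a global, use a function" turns it into, which now
  # requires explaining parameters, return values, and scope all at once:
  def add_points(current_score, points):
      return current_score + points

  score = add_points(score, 10)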


>- A friend of mine, when I showed him the statement 'x = x + 1', was confused; surely, x cannot also equal x + 1.

I think the best thing that my CS 101 prof did was always refer to the assignment operator as "gets" so you'd say "eks gets eks plus one" out loud. It really helped divorce assignment and equality in my mind.


There is an implicit and optional 'Let' statement preceding variable assignment in BASIC, which was originally derived from FORTRAN (along with the practice of not using it).

It is less jarring to see:

Let x = x + 1

because it could be read: we have a given value for x, now let us replace that value with x + 1.


Another helpful thing is that many programming languages have distinct "assignment" and "comparison" operators.

I remember TI-BASIC has "=" for comparison and "<-" for assignment, like 'A<-5'. That one, to me, made the most sense: "Put the value of 5 in something called A."


IIRC in TI-BASIC, assignment goes the other way, so you could have "B + 5 -> A", which also helps with seeing that the expression is evaluated before storing (the button to input -> on the calc is STO, short for "store").


Similarly, I had a prof that referred to it as "becomes" which made great intuitive sense to me. "x becomes x plus one" makes me think of the variable transforming to the new value.


I’ve read that Dijkstra liked to say “becomes” for the same reason.


You can also say "x is now equal to (the previous x)+1"


x:=x+1 (Pascal)


> A friend of mine, when I showed him the statement 'x = x + 1', was confused; surely, x cannot also equal x + 1.

Some people say a language like Haskell would be no harder, and maybe even easier, for someone with no previous programming or CS experience, than something like C or Python. Comments like that make me wonder if they're right!

(Haskell is a pure functional language, so it shares your friend's intuition: variables are immutable. Anything else would be madness.)


I guess it depends. Maybe for an adult. I learned the rudiments of Atari BASIC when I was seven and the book explained assignment with cartoon pictures of numbers going into boxes. I don’t remember ever being confused about what it meant. You could just try it (oh, X was 7 and now it’s 8, ok). Similarly, control flow wasn’t hard... just follow whichever line number to GOTO next. You could just play with it until it worked. Would seven-year-old me get very far with recursive functions and IO monads? I really don’t think so. So I’ll always have a soft spot for BASIC and how much fun it was. Maybe python in a Jupyter notebook would be a kind of similar experience for today’s kids? But not Haskell.


> Would seven-year-old me get very far with recursive functions and IO monads?

A few years back, someone here shared their experience teaching young kids programming with a simplified/visual Haskell, and it was pretty cool. It seems kids without programming preconceptions don't find it all that difficult.


It is interesting to try to think all the way back to the struggles of initially learning and wonder if things that seem difficult now (Haskell, which I know barely anything about) might have made more sense then. I often wonder this about TCL, which looks very weird compared to most languages but is really easy to learn if you remember the "12 rules": https://tcl.tk/man/tcl8.6/TclCmd/Tcl.htm


I learned programming, in part from my first book, before I had algebra in school.

When the teacher explained equations for the first time, I thought, "this is easy, it's just like programming".


I was ten when I first started learning BASIC, so it wasn't until a few years later that I first encountered algebra as a late-year lesson in math class. By that time x=x+1 was very much ingrained into my way of thinking, so I had to unlearn some of that before I could understand algebraic equations. I wonder how many other kids of that era ended up going through the same process.


Every programming class I ever had used Logo on an Apple II, where mostly you just sat around making the little puck fly off the side of the screen at warp 9.9. Most of the personal computers in the 80s came with a BASIC interpreter built in, and they were largely compatible with each other aside from some syntax differences and the extended screen characters being vastly different. Typing in programs from magazines and writing my own did far more for me than any course I ever took. I dabbled with Turbo Pascal when I got my first IBM knock-off, but settled for a little $20 C compiler - Power C - that worked in real mode and spit out DOS executables really quickly. Porting code from Unix to the Power C runtime taught me a lot. Today I write in Go almost exclusively and marvel at how far things have come.


This is tricky, because on one hand, you want to ignore structure and a lot of syntax to just get started, but on the other, you want to see the results and see something real, not just print the sum of two inputs, and you quickly need a real language for that.


Interesting post. This reminded me of what it was like to be a true beginner a very long time ago.

I’d forgotten just how confusing those initial concepts like arrays could be.


Can we have new illustrations by George Beker too? Those books just wouldn't be the same without Beker's whimsical pictures!

http://www.bekerbots.com/


Thanks for reminding me of these wonderful illustrations. Your comment made my day!

They were an essential part of the book, for engaging your imagination. They were also a gateway to books like Heiserman's "How to build your own self-programming robot":

https://archive.org/details/howtobuildyourow01heis/page/n5/m...


I appreciate his art now. At the time it was a prime example of the art over-promising and the game under-delivering. ;-)


It's funny that this got posted right now. I'm working on exactly this project. You can find a collection of text-based Python programs written in the style of the games in Basic Computer Games here: https://github.com/asweigart/gamesbyexample, installable with `pip install gamesbyexample`. More info here: https://pypi.org/project/gamesbyexample/

Hi, I'm Al Sweigart. I wrote Automate the Boring Stuff with Python. My next book (working title The Big Book of Small Python Projects) is exactly what Jeff talks about in the article. I started on this three years ago, and the style of the programs is: 1) short (under 256 lines, as an arbitrary limit), 2) text-only (so that readers can link cause and effect between the print() calls and the text that appears on the screen), 3) no additional libraries outside the standard library, and 4) fits in one source code file for easy copy-pasting, along with some other guidelines.

My main fear is that I'm just old, and that I'm mistaking the nostalgia of how I learned to program for good pedagogy in modern times (this is a mistake the One Laptop Per Child project made). But I figure this might be a good start for beginners who want to see what programs "look like". I also have JavaScript, Java, C#, Kotlin, and Go versions planned. The books, like all of mine, will be released under Creative Commons licenses. The first book should be out in a few months.
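To give a rough sense of the format, here's an illustrative sketch that follows those guidelines (to be clear, this is just an example thrown together for this comment, not one of the actual programs from the book or the repo):

  """Guess the Number: short, text-only, standard library only, one file."""
  import random

  secret = random.randint(1, 100)
  print('I am thinking of a number between 1 and 100.')
  guesses = 0
  while True:
      guess = int(input('Your guess? '))
      guesses += 1
      if guess < secret:
          print('Too low.')
      elif guess > secret:
          print('Too high.')
      else:
          print('You got it in', guesses, 'guesses!')
          break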


I love the methodology bikeshedding. If it were 1999, they'd be diagramming Super Star Trek in Rational Rose.


"Rational Rose" now here are two words I haven't seen or heard since 2001!!


You reminded me of something I used about 30 years ago. https://en.wikipedia.org/wiki/Rational_R1000


Wow, that sent me on a goose chase. That thing is cool.

Wikipedia linked to here: http://datamuseum.dk/wiki/Rational/R1000s400/Logbook#2019-10...

and there it says:

---

The Ada language was originally designed for embedded systems, in particular for embedded systems in, around, and in control of military weapons, and it dates back to when the state of the art in radiation-hardened microcomputers was the RCA1802 and a few kilobytes of ROM and RAM were the norm, so being parsimonious with memory space is deeply embedded in the genes of Ada.

For instance, when you define:

   subtype MISSILE_NUMBER is INTEGER range 1 .. 8;
The Ada compiler will know that only three bits will be needed to store that type.

And five bits saved here and four bits saved there, it adds up.

The R1000 is a true-blooded Ada computer, so the fundamental unit of addressing is a bit, and just because your field may happen to be 16 bits wide, does not mean that it is going to be aligned at a 16-bit boundary. (There is a notable exception: Instructions are 16 bit wide and must be aligned to 16 bit boundaries.)

---

Now, my friend told me that it'll usually align at 8 or 32 bits nowadays, but still, this is fascinating. I don't know whether any other languages do that, because I don't work on embedded stuff, which might be the only place it's done now, if at all. Fascinating enough that I'm reading about it when I should be working!
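A quick back-of-the-envelope check of the "three bits" claim (in Python rather than Ada, just because it's what I have open):

  import math

  def bits_needed(lo, hi):
      """Minimum bits to distinguish the values lo..hi inclusive."""
      return max(1, math.ceil(math.log2(hi - lo + 1)))

  print(bits_needed(1, 8))    # 3, matching the MISSILE_NUMBER subtype above
  print(bits_needed(1, 100))  # 7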


Oddly, that was the only year I ever touched Rational Rose, and I don't put it on my resume. I guess I could have been somewhat more accurate by saying "If this were 2006, they'd be diagramming Super Star Trek in UML" but I thought the other phrasing was funnier.


I thought that one of those book covers looked familiar so I went over to my vintage computer books shelf and found the "more" edition. I think it's a 10th printing and in nearly perfect condition. Not sure I could handle typing in all of that code from the book like I did back in the 80's due to aging eyes but the book is very nostalgic for me.


I typed in a lot of these in the late '70s; it was "learning Basic the hard way" to get them to run. My access to computers was limited, so I read them more than used them at a computer. I later bought:

http://www.trs-80.org/trs-80-programs-32-basic-programs-for-...

I really liked "The Flying Walloons", simple as it was.

Here's one I never saw but would have been fun, I think:

http://www.trs-80.org/trs80-graphics/

I remember I had to hide them in my other books because kids would pick on me if they saw "nerdy" books like this, and teachers didn't like you reading anything interesting during their boring lectures.


IMHO, re-implement these using modern ES6 JavaScript targeting browsers above IE11 (so ES module support is good) and a simple canvas library (or even P5.js) to really capture the original spirit of BASIC. It's supposed to be a modern language that's available everywhere, and web browsers fit that mold nicely. The dev tools window in your browser is the new QBASIC, just sitting there waiting for a kid to open it up and blow their mind with what's possible.


Interesting idea! I had to think about what you were describing, but that actually is very similar in spirit to the experience back then.

It's kind of like discovering the assembly monitor on the Apple ][, which you could get to by poking a value in Basic. I remember typing in an assembly program I found in a magazine that sampled audio from the cassette recorder, and played it back on the speaker. It was gritty sounding, but an amazing discovery for me at the time (about 1982).

(I'm always doing Ctrl-Shift-C to copy text from the browser when I forget I'm not in the terminal anymore--which brings up the element inspector and the console. I've always wanted to explore it myself, but I'm usually deep into another project.)

I wonder what I could have done if I had Three.js?


It's also interesting that QBasic (which I didn't use) is related to IBM Cassette Basic, the first language I used on the IBM PC. If the computer couldn't find a bootable floppy, it booted Basic from ROM:

https://en.wikipedia.org/wiki/IBM_BASIC#IBM_Cassette_BASIC

no OS needed, like the Apple ][, TRS-80, etc.


> IMHO re-implement these using modern ES6 javascript targeting browsers above IE11 (so ES module support is good) and a simple canvas library (or even P5.js) to really capture the original spirit of BASIC

IIRC, nothing in either volume of the original (not sure if this changed in the later update that targeted SmallBasic) used graphics; they were all text-mode, so there's no real need to target a canvas library. Maybe something that provides a simpler abstraction than the DOM for just dropping text onto the page, though, if you don't want all the display in dev tools.


Oh nice, heck, you could just say "Step 1: Open your browser's dev console. Step 2: Write this code..." console.log() and prompt() for all your text I/O needs.


I've really, really wanted to learn BASIC, just for fun. Any idea how I might do it? Which BASIC should I use? What software do I need to run BASIC?

Any help is really, really appreciated.


You could get an emulator for whichever micro you wanted to try out. BASIC is also still alive and well: https://www.qb64.org/portal/


Seems cool, thank you!


BlitzBasic and BlitzMax were the best! Sadly, no longer supported... or are they? https://blitzmax.org/

There is a dialect called CerberusX that's still alive (and I helped make a VSBasic extension for it): https://www.cerberus-x.com/community/index.php


Whatever comes with FreeDOS should give you some of the flavor of the era without the quirks of emulators.


Just checked it out, thank you!


If you want a modern IDE, you could try gambas: http://gambas.sourceforge.net/en/main.html


Seems nice, thank you!


I've always wanted to try out FreeBasic: https://www.freebasic.net/


Thank you!


B4X is a free Visual Basic-like environment that lets you develop at no cost for Android, Windows, Mac, Raspberry Pi, and Arduino. The purchase of a license is required to develop for iPhone. There is a lot of free learning material, including videos.

https://www.b4x.com/


As someone who has done a lot of work in MS Access since the 90s, this looks really interesting. I may be able to move forward on some things I abandoned. Thanks for sharing!


Can we have a sub-$100 computer with built-in screen and keyboard to go with it?


Pick up any cheap $30 pre-paid Android smartphone at Walmart. Install Termux (while you still can). Use a nice keyboard for coding like Codeboard: https://play.google.com/store/apps/details?id=com.gazlaws.co... Or add a simple bluetooth keyboard to pair with it.


I wonder what the minimum cost for a phone with HDMI out is? Possibly better to get an Android TV box, although that has disadvantages as well.


The Raspberry Pi 400 is almost there. You wouldn't always get a VDU with a micro back in the day, either, so it's pretty close to what many of us had. Plugs into a TV, so it's a very similar experience.


Many people have huge TVs these days, which aren't very suitable for hobby computing. People also don't want a mess of cables in their homes. Giving a kid a console without a screen is not a good idea, because their parents will not like the cables, and the kid would just grab the iPad because of its superior UX. So I really think a home computer should have a built-in screen.


> Many people have huge TVs these days, not very suitable for hobby computing.

With either a wireless keyboard or a long HDMI cable, a huge TV is great for hobby computing. Why would a huge TV be a bad thing for this use?


I think smaller TVs with HDMI are relatively easy to come by. People get a >50" set for under $300 and get rid of the old one.


For some people it'll be fine: they either have an old TV or a spare screen, or can get one inexpensively. For others it would be a pain, and a laptop or whatever would be better. There are laptop kits for the Pi as well.


When Amazon has their Fire tablets on sale, you can make a $60 tiny laptop. Or "laptop" if you prefer. ~$27 for the Fire 7. ~$25 for a Zagg hinged keyboard case. Spend 30 minutes removing the Amazon annoyances from their version of Android. Install Termux. Install Brave (this is the one Android browser I've found with full KB support). Not quite a full-featured computer, but maybe close enough depending on your purpose.


Fire Toolbox is an effective tool for removing Amazon crapware.

https://forum.xda-developers.com/t/windows-tool-fire-toolbox...


You can get ~$200 Pi 400 + touchscreen deals. That's a lower nominal price than anything but fire-sale deals on '80s PCs without displays, and a lot cheaper in real (inflation-adjusted) terms.

Sub-$100 would be nice, sure, but it's not like what we have available today is bad.


Aren't keyboards like $5-15? I'd rather have a full/real keyboard. That plus a Raspberry Pi plus a cheap small monitor/TV should be sub-$100.

I think there are keyboard cases with space for a Raspberry Pi inside, to replicate the C64, Trash-80, Apple IIe, TI-99/4A experience.


With some of the discussion recently about Fabrice Bellard, I was thinking a nice retro, minimalist PC OS could be made with TinyC, QEmacs, FFMpeg, TinyGL, TinyCore Linux, etc. I'm sure a lightweight Basic implementation could be found to complement those.

Fabrix?


Everyone's probably way past reading this by now, but for the record I ported the first example to Common Lisp:

https://github.com/koalahedron/lisp-computer-games

Man that brought back the memories, and I had a blast doing it. Many thanks to Jeff Atwood.


I think I even discovered "intentional O(random)" complexity with this gem from Acey Ducey:

  270 A=INT(14*RND(1))+2
  280 IF A<2 THEN 270
  290 IF A>14 THEN 270
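If I'm reading it right, INT(14*RND(1))+2 can only produce 2 through 15, so the A<2 check never fires and only A=15 ever trips the A>14 check and sends it back to 270. A rough Python equivalent of the retry loop (my own translation, not the book's code):

  import random

  def deal_card():
      # Draw, then retry until the value is in the accepted range (2..14),
      # mirroring the three BASIC lines above.
      while True:
          a = int(14 * random.random()) + 2   # yields 2..15
          if 2 <= a <= 14:
              return a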


I highly recommend messing around with yabasic[0] a bit; it runs on Windows and (probably) most Unix-likes, and even offers graphics capabilities right out of the box!

I feel like this would've been a great batteries-included environment to start in, back when I started programming.

[0] http://www.yabasic.de/



