Hacker News

Because programmers no longer grow up on 8-bit systems programming in assembly language, they jump straight into Java or whatever - they've never gotten closer than 10 layers away from the actual CPU, it's all a magic black box to them.


Not to totally change the subject, but that's why I'm kind of more excited about the 0x10c game than Raspberry Pi.

RPi claims it will get kids interested in programming again, but in the end it's just a supercheap Linux box. I think you'll see more kids want to make their spaceship do something really cool and really efficiently on the virtual in-game CPU. The instruction set is small and tight and very easy to learn. I know everyone wants to build compilers for every language under the sun for DCPU, but just like in game programming the tightest code will win.
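To give a feel for how small that kind of instruction set is: a toy register machine in the spirit of the DCPU-16 fits in a couple dozen lines. This is a simplified sketch for illustration, not Notch's actual DCPU-16 encoding (real DCPU instructions are packed 16-bit words), and the opcode subset here is arbitrary.

```python
# Toy DCPU-16-flavored machine: a few 16-bit registers and
# a tiny instruction set, interpreted one tuple at a time.
def run(program):
    regs = {"A": 0, "B": 0, "C": 0}
    pc = 0
    while pc < len(program):
        op, dst, src = program[pc]
        # operands are either a register name or an immediate value
        val = regs[src] if src in regs else src
        if op == "SET":
            regs[dst] = val & 0xFFFF
        elif op == "ADD":
            regs[dst] = (regs[dst] + val) & 0xFFFF   # 16-bit wraparound
        elif op == "SUB":
            regs[dst] = (regs[dst] - val) & 0xFFFF
        pc += 1
    return regs

final = run([
    ("SET", "A", 0),
    ("ADD", "A", 10),
    ("SET", "B", "A"),   # copy register A into B
])
print(final)  # {'A': 10, 'B': 10, 'C': 0}
```

Once the whole machine fits in your head like this, "tight code wins" stops being an abstract slogan.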

Just throwing that idea out there.


On the same note... Maybe RPi got it wrong (i.e. building a super-cheap Linux box). Maybe, just maybe, they should have built a super-cheap six-legged bot, with a DCPU-like real CPU and a set of touch-vision sensors? It might be interesting to program the virtual CPU of a virtual spaceship, but making a real, tangible thing move and obey your commands is way, way cooler, IMHO...

Something simpler and cheaper than Mindstorms, yet more open and easier to experiment with.


I dunno how the modern stuff works, but you don't need to get much simpler or open or easy to work with than the original LEGO Mindstorms. My introduction to programming was BASIC -> Perl -> JavaScript, all on the family PC, and then C on the Hitachi 8-bit processor in the LEGO RCX. LEGO was very good about providing resources and documentation for working with the Mindstorms hardware, and enough simple tools were built around it that a 10-year-old could get set up to, e.g., run a FORTH environment on the RCX using a PC as a terminal in half an afternoon for free. And then you get to build your own six-legged robot...

I credit this exposure to 8-bit low-level programming at a young age for my easy experience in computer architecture and operating systems classes later on.


They should have resurrected the BBC Micro, that's what they should have done. Kids these days...


Aha, but there's the catch! RPi was supposed to be in the spirit of the BBC Micro. Cheap, accessible, in every school. But the Micro had a ready-to-go built-in programming language, didn't it? It was pretty simple to teach a kid to write 10 PRINT "HELLO" / 20 GOTO 10 and see their eyes light up within a few seconds of turning the thing on.

10 PRINT HELLO is slightly more obscure on a Linux desktop. This is where I see RPi falling down in the educational market and it turning into another hobbyist breadboard.
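For comparison, the closest rough equivalent on a stock Linux desktop is a Python loop - assuming the kid has already found a terminal, an editor, and the interpreter. A bounded stand-in for the GOTO loop (so it actually terminates):

```python
# BBC BASIC:
#   10 PRINT "HELLO"
#   20 GOTO 10
# Rough Python equivalent of the endless version:
#   while True:
#       print("HELLO")
# Bounded stand-in for demonstration:
def hello(times):
    return ["HELLO"] * times

for line in hello(3):
    print(line)
```

The program is no harder; it's everything around it (shell, files, interpreter) that raises the barrier.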


Not only that, but the serial port, user port, Econet etc were incredibly simple to use, both as hardware, and how you addressed them from BASIC. The RPi has USB and Ethernet...


You can dig up complaints to this effect from the 70s and 80s too, that programmers no longer grow up patching wires and bootstrapping boards, but instead jump straight into writing "software" in asm or C or whatever, as if "hardware" is some magic black box that just gives you an instruction set out of nowhere.


Not quite: there's still a 1:1 mapping between ASM and toggling in binary on the front panel. But even C abstracts the stack away.


Yes, quite. Yes, there's a 1:1 mapping between ASM and toggling in binary. There is not, however, a 1:1 mapping between toggling in binary and rewiring the actual hardware.

Can you toggle in a new A/D converter? Can you toggle in a connection from your screen refresh to a CPU interrupt? Can you toggle in a switched memory bank so your hardware can handle more memory than the CPU was designed for? Obviously not.

These are the things you lost if you 'jump straight into writing "software" in asm or C or whatever.'
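To make the bank-switching example concrete, here's what a banked window looks like in software terms (Python, since we can't toggle real wires here). The 0xC000 window, 16 KB bank size, and four-bank layout are hypothetical, loosely modeled on 8-bit-era cartridge mappers:

```python
# Bank-switched memory: the CPU sees a fixed 64 KB address space,
# but the top 16 KB window can be mapped to any one of several banks,
# so total storage exceeds what the address bus can reach at once.
WINDOW = 0xC000       # banked window covers 0xC000-0xFFFF
BANK_SIZE = 0x4000    # 16 KB per bank

class BankedMemory:
    def __init__(self, n_banks=4):
        self.fixed = bytearray(WINDOW)                        # always visible
        self.banks = [bytearray(BANK_SIZE) for _ in range(n_banks)]
        self.current = 0

    def select(self, bank):
        # on real hardware this would be a write to a latch register
        self.current = bank

    def read(self, addr):
        if addr < WINDOW:
            return self.fixed[addr]
        return self.banks[self.current][addr - WINDOW]

    def write(self, addr, value):
        if addr < WINDOW:
            self.fixed[addr] = value
        else:
            self.banks[self.current][addr - WINDOW] = value

mem = BankedMemory()
mem.select(0); mem.write(0xC000, 1)    # same CPU address...
mem.select(3); mem.write(0xC000, 99)   # ...different physical storage
mem.select(0); print(mem.read(0xC000))  # 1
```

The point of the thread stands: in software this is twenty lines; with a soldering iron it was a new latch chip and some address-decode logic.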


The son of one of my cow-orkers is having a devil of a time getting a job in the game industry because the school he went to taught the programming curriculum in Java.

He's screwed, near as I can tell, unless he can find at least half a year to sit down and learn C / C++.


Not to sound like an ass, but getting a good programming job in the game industry is hard enough as it is. After all, the supply-demand ratio for these jobs is much lower than it is elsewhere in the IT industry. I would advise him to get another programming job first, learn some C/C++ in his free time, and then start looking for jobs in the game industry.


As a counterpoint, there are a lot of game programming positions open everywhere, and studios are having a real hard time filling them. Yes, there is more competition than for non-games roles, but programmers are still programmers - valuable and rare.

If he's a good programmer he shouldn't give up. If he's job hunting, tell him to spend a good two weeks learning C++ and build a couple of dummy programs, then start applying for roles and continue to work on his C++ in the background. It'll come.


or write games for Android


Any mobile platform will do. J2ME is still bundled in most dumbphones and it isn't too hard to move from Java to C# and address WP7. I gather Microsoft will do anything to get people started.


> it's all a magic black box to them.

Sounds like a big success, then!

Isn't this what every high level programming language designer aimed for?


Well, it's like a GUI. If there's a menu item for what you want to do, it's brilliantly simple. If not, then you're stuck unless you can write a program, which means, understanding how the GUI works under the covers so you can write a GUI app of your own.

I remember the days of writing GEM apps in 68k assembly...


Really, could anyone handle more than a few layers at the same time? It is sometimes useful to know how the hardware (I'm looking at you, L2 cache!) works, but even then a somewhat simplified version can do the job.

And from time to time there comes another layer on top of everything else, and you wish you could forget the lowest layer you know, just to not start going insane. Or at least to slow it a bit :)


Unfortunately you can't forget about the L2 cache. I love radix sorting: it's so fast and elegant, except it's not very cache friendly, so it can be rather slow when you least expect it.
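For anyone who hasn't run into this: an LSD radix sort looks like the sketch below (non-negative 32-bit integers only, one byte per pass). The scatter into 256 buckets is the cache-hostile part - on large inputs each append lands in a different bucket, the working set blows past L2, and the "linear time" sort starts missing cache on nearly every write.

```python
# LSD radix sort: four passes over the data, one per byte.
def radix_sort(nums):
    for shift in range(0, 32, 8):
        buckets = [[] for _ in range(256)]
        for n in nums:
            # scattered writes across 256 buckets: the cache-unfriendly step
            buckets[(n >> shift) & 0xFF].append(n)
        nums = [n for b in buckets for n in b]  # stable concatenation
    return nums

print(radix_sort([512, 3, 70000, 1]))  # [1, 3, 512, 70000]
```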


I was about to jump in and chastise you for generalizing about young programmers before I realized it's been 21 years since I wrote my first line of code, and I probably don't count as young anymore...

Though let me say that this is where I think good universities play a crucial role. A lot of my classmates had never written a line of assembler or C before university, and by the third year we had all built CPUs, compilers, operating systems and a few robots. Some of that is bound to stick and help you grow an intuition about the towering stack of complexity below the browser window.


The mind can only take in a window of information of a certain maximum width. Abstracting away the lower levels is crucial in order to free the attention for the higher level questions.

Software is solving much more difficult problems than ever before thanks to this evolution to higher abstractions.


Yes, and everything's fine until something blows up. Then you need to know what's going on at a lower level to be able to fix it.

Case in point: I once had a co-worker show me a Java crash report from a crash that nobody could figure out. Every now and again, the server, Tomcat, JVM and all, would just die, spitting out a stack trace that went into native land.

One really nice thing about the crash report was that it included a full register dump, stack address dump, and a dump of the memory around the PC. So I disassembled the memory, figured out that it was calling memcpy when it died, and lo and behold, one of the register operands was 0! A comparison of the stack addresses showed that it was dying in a library used for biometric fingerprint-based authentication.

The vendor kept insisting that the problem was on our side (we had a JNI library of our own that called their library, but I desk-checked it twice and concluded that it was fine), so I disassembled the whole library and traced what it was doing. It turns out that certain kinds of incomplete fingerprint scans wouldn't trigger a rejection at the high level, but would cause a rejection at the lower levels (almost... it would clear the pointer to the data but would then return TRUE instead of FALSE). The library would then carry on its merry way passing a null pointer lower and lower until it died in memcpy, taking the JVM with it.
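A reconstruction of that failure pattern, with None standing in for the null pointer (all names here are made up for illustration, not the vendor's API):

```python
def low_level_scan(scan):
    # The bug: a partial scan clears the data pointer
    # but still reports success (TRUE instead of FALSE).
    if scan.get("complete"):
        return True, scan["data"]
    return True, None          # should be: return False, None

def copy_template(data):
    # stand-in for the memcpy that finally dereferences the null
    return bytes(data)

def authenticate(scan):
    ok, data = low_level_scan(scan)
    if not ok:
        raise ValueError("scan rejected")
    # "success" path happily carries the null payload downward...
    return copy_template(data)  # ...and dies here on a partial scan

print(authenticate({"complete": True, "data": b"ridge-map"}))
```

The crash site (memcpy / copy_template) is several layers below the actual mistake, which is exactly why the stack trace alone convinced nobody.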


I think it's valuable for most teams to have one member who can understand raw crash dumps, but this is an age of specialization and I'd rather have the rest of my team each learning some other niche.


Agreed; however, I do believe that each person should have a basic understanding of at least one level below the one they're working at.


> Software is solving much more difficult problems than ever before thanks to this evolution to higher abstractions.

Please prove that. :-)

Most of what I see is reskinned GUIs from the early 80s... first on Mac OS, then on Windows, then on OSX, then on Linux, then on the Web, then on handsets...

more pixels per inch != more essentially complex.

big data was being handled by bank mainframes in the 1970s (the original cloud computing. :o).


Natural language processing, image and speech recognition, vastly more powerful media creation tools, much much more complicated games, increasingly useful recommendation systems, data processing on previously unimaginable scales.

The list goes on and on and on. I can't imagine how anyone could seriously argue that we're not leagues ahead of where we were in the 70s.


Is a modern game really more engrossing than Elite, which I still play 25 years later? No, modern computing power is squandered on bells and whistles. The industry has stagnated since the 80s.


I often play the grumpy old technology curmudgeon role but you've handily outdone me here. Well played, sir!

I used to play Elite too and, while it had some unique qualities, it wouldn't compete with any of the better modern titles for my attention now.


It also takes a 2 GHz quad-core with 8 GB of RAM to edit a simple document. Cheap powerful computers have enormous potential, but that is just being squandered on ever-deeper layers of abstraction. Most of what I do, in terms of applications like word processing, spreadsheets, etc., I could do on a 30-year-old, 2 MHz 8-bit machine with 32 KB of RAM (i.e. a BBC Micro, and I do!).



