Hacker News

The mind can only take in a window of information of a certain maximum width. Abstracting away the lower levels is crucial in order to free the attention for the higher level questions.

Software is solving much more difficult problems than ever before thanks to this evolution to higher abstractions.



Yes, and everything's fine until something blows up. Then you need to know what's going on at a lower level to be able to fix it.

Case in point: I once had a co-worker show me a Java crash report from a crash that nobody could figure out. Every now and again, the server, Tomcat, JVM and all, would just die, spitting out a stack trace that went into native land.

One really nice thing about the crash report was that it included a full register dump, a stack address dump, and a dump of the memory around the PC. So I disassembled the memory, figured out that it was calling memcpy when it died, and lo and behold, one of the register operands was 0! A comparison of the stack addresses showed that it was dying in a library used for biometric fingerprint-based authentication.

The vendor kept insisting that the problem was on our side (we had a JNI library of our own that called theirs, but I desk-checked it twice and concluded it was fine), so I disassembled the whole library and traced what it was doing. It turned out that certain kinds of incomplete fingerprint scans wouldn't trigger a rejection at the high level, but would cause a rejection at the lower levels (almost: the code would clear the pointer to the data but then return TRUE instead of FALSE). The library would then carry on its merry way, passing a null pointer lower and lower until it died in memcpy, taking the JVM with it.
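The failure mode described above, where the low level clears the pointer but still reports success, looks roughly like this in C. This is a minimal sketch with invented names (`scan_t`, `validate_scan`, `process_scan`); the vendor's actual API is unknown:

```c
#include <stddef.h>
#include <string.h>

/* Hypothetical reconstruction of the bug pattern, not the vendor's code. */

typedef struct {
    unsigned char *data;   /* scan payload */
    size_t         len;    /* payload length */
} scan_t;

/* Low-level check: an incomplete scan is rejected internally by
   clearing the data pointer -- but the function still returns 1
   ("success") instead of 0. That mismatch is the bug. */
static int validate_scan(scan_t *scan)
{
    if (scan->len < 64) {      /* treat short scans as incomplete */
        scan->data = NULL;     /* rejected: drop the payload...   */
        return 1;              /* ...but report success anyway    */
    }
    return 1;
}

/* The higher level trusts the return value, so the NULL pointer sails
   downward until memcpy dereferences it and the whole process (JVM
   included, when called via JNI) dies. */
static int process_scan(const scan_t *scan, unsigned char *out)
{
    scan_t copy = *scan;
    if (!validate_scan(&copy))
        return 0;                     /* never taken for short scans */
    memcpy(out, copy.data, copy.len); /* crashes here when data == NULL */
    return 1;
}
```

Because the crash happens far from the inconsistent return value, nothing short of tracing the pointer back through the call chain (or disassembling, as above) points at the real culprit.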


I think it's valuable for most teams to have one member that can understand raw crash dumps but this is an age of specialization and I'd rather have the rest of my team each learning some other niche.


Agreed; however, I do believe that each person should have a basic understanding of at least one level below the one they're working at.


> Software is solving much more difficult problems than ever before thanks to this evolution to higher abstractions.

Please prove that. :-)

Most of what I see is reskinned GUIs from the early 80s... first on Mac OS, then on Windows, then on OSX, then on Linux, then on the Web, then on handsets...

more pixels per inch != more essentially complex.

big data was being handled by bank mainframes in the 1970s (the original cloud computing. :o).


Natural language processing, image and speech recognition, vastly more powerful media creation tools, much much more complicated games, increasingly useful recommendation systems, data processing on previously unimaginable scales.

The list goes on and on and on. I can't imagine how anyone could seriously argue that we're not leagues ahead of where we were in the 70s.


Is a modern game really more engrossing than Elite, which I still play 25 years later? No; modern computing power is squandered on bells and whistles. The industry has stagnated since the 80s.


I often play the grumpy old technology curmudgeon role but you've handily outdone me here. Well played, sir!

I used to play Elite too and, while it had some unique qualities, it wouldn't compete with any of the better modern titles for my attention now.


It also requires a 2 GHz quad-core with 8 GB of RAM to edit a simple document. Cheap, powerful computers have enormous potential, but that potential is squandered on ever-deeper layers of abstraction. Most of what I do with applications like word processing, spreadsheets, etc., I could do on a 30-year-old 2 MHz 8-bit machine with 32 KB of RAM (i.e. a BBC Micro, and I do!).



