
Part of me looks at this and thinks, "This is preaching to the choir"...because while the engineer in me appreciates all the layers and explorations, it must be incredibly bewildering to anyone who is not a coder, which is the ostensible audience given that the story starts off with, 'We are here because the editor of this magazine asked me, “Can you tell me what code is?”'

But then I see the interactive circuit simulation and think "Fuck it, who cares, this is awesome!". Designing circuits is one of those things that, if I were a self-taught coder instead of a comp. eng major, I would never have delved into...yet learning how to build an adder circuit and getting an appreciation of the most basic building block of computation (and how surprisingly complex it is to just add 1s and 0s; there's a sketch below) is a profound lesson that I think was essential for me, personally, to really grok programming. All the sections about culture and conferences and so on are a little bit off-field for me...it's not that I don't think that code and life and human thought and behavior are intertwined... * I just think the discussion about conferences reads as if the author doesn't realize that all disciplines spawn conferences and conference culture. There's nothing particularly unique about code conferences. Not the sexism, not even the nerdiness.
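(To make that concrete, here's a minimal sketch in Python, mine rather than anything from the essay: a 1-bit full adder built from bare AND/OR/XOR gates, plus a ripple-carry chain of them. Even "just adding" takes this much machinery.)

    # A 1-bit full adder from bare gates: sum bit and carry-out.
    def full_adder(a, b, carry_in):
        partial = a ^ b                           # XOR gives sum-without-carry
        total = partial ^ carry_in                # fold in the incoming carry
        carry_out = (a & b) | (partial & carry_in)
        return total, carry_out

    # Chain eight of them to add two small integers, bit by bit.
    def ripple_add(x, y, width=8):
        carry, result = 0, 0
        for i in range(width):
            bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
            result |= bit << i
        return result

    print(ripple_add(13, 29))  # 42, computed one carry at a time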

I would love to see the OP's editor respond in a not-quite-as-lengthy essay. What did they learn about code after reading the piece that they didn't understand before?

edit: * I'm emphatically not arguing "Oh but everyone does conferences shittily so tech conferences shouldn't be shamed". Just that having it in this "What is Code" essay makes it seem as if it's a notable "feature" of programming...but that understates the problem by an order of magnitude. Sadly, it's a feature of nearly every discipline, and the inherent problem is the gender imbalance, not the topic of the conference.

edit: Also, I wish the section on Debugging were much higher than it is...Robert Read's "How to be a Programmer" [1] makes it the first skill, and that's about the right spot for it in the hierarchy of things. Maybe it gets overlooked because it has the connotation of something you do after you've fucked up. But, besides the fact that programming is almost inherently about fucking up, the skill of debugging really underscores the deterministic, logical nature of programming: the idea that, if we have to, we can trace things down to the bit and know exactly what has been fucked up in even the most complex of programs (a tiny illustration after the link below). And that's an incredibly powerful feature of programming...and one that isn't well conveyed to most non-coders.

[1] http://samizdat.mines.edu/howto/HowToBeAProgrammer.html
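(To illustrate "down to the bit", a minimal Python sketch, my example rather than anything from the essay or from Read: the classic floating-point surprise is completely deterministic, and you can dump the exact IEEE 754 bits to see precisely where the two values diverge.)

    import struct

    def bits(x):
        # The raw 64-bit IEEE 754 pattern behind a Python float.
        return format(struct.unpack('<Q', struct.pack('<d', x))[0], '064b')

    print(0.1 + 0.2 == 0.3)  # False, every single time
    print(bits(0.3))
    print(bits(0.1 + 0.2))   # off by exactly one unit in the last place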




Not to your main point, but the circuit simulation reminds me of Silon by SLaks: http://silon.slaks.net/

Edit: Also, as a late-bloomer and self-taught (self-teaching) programmer, I am on the other side of the paradigm you're talking about. Petzold's Code is one of the first books a self-taught programmer should pick up. It is an awesome introduction.


One of the few worthy things I felt I got out of school was the moment I grokked the whole stack, from sequential logic up through the program counter and control logic of a CPU: how each clock tick effectively formed a new circuit. That was really mentally expanding. I got it from reading a book prescribed for a class I wasn't taking, by a professor who was a tool, so it is possible to learn these things outside of class. In fact, that's where the real learning, IMO, happens.
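(A minimal sketch of that "each tick computes a new state" idea, in Python, using an invented three-instruction toy machine of my own rather than anything from a textbook: combinational logic reads the program counter and accumulator, and each clock edge latches the next pair.)

    # One clock cycle of a made-up three-instruction machine.
    def tick(state, memory):
        pc, acc = state
        op, arg = memory[pc]              # fetch whatever the PC points at
        if op == 'LOAD':
            return (pc + 1, arg)
        if op == 'ADD':
            return (pc + 1, acc + arg)
        if op == 'JMP':
            return (arg, acc)             # control logic rewrites the PC
        raise ValueError(op)

    program = [('LOAD', 1), ('ADD', 1), ('JMP', 1)]  # count upward forever
    state = (0, 0)                        # (program counter, accumulator)
    for _ in range(6):
        state = tick(state, program)
        print(state)                      # watch the PC and accumulator evolve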


What book? For those of us not there yet :)


I dug around and can't find it; I graduated a while ago. But the more I thought about it, it was actually 2 books. One on how to design a CPU on an FPGA, similar to this one: http://www.amazon.com/VHDL-Implimentation-16-bit-microproces... And another book on digital design, specifically digital sequential circuits; if you google that term you will find a few links to PDFs to study. Finally, "Computer Organization and Design" by Patterson and Hennessy comes highly recommended.


One of the most memorable weeks in my Engineering degree was using Cadence to build a CPU from the ground up. Every transistor, every connection, the ALU, etc. was laid down by someone in our little group of students, and then wired together to make a thing with a few thousand transistors. And it friggin worked.

It also showed how the chip itself would be laid out, where the dopants would be and such.


> Part of me looks at this and thinks, "This is preaching to the choir"...because while the engineer in me appreciates all the layers and explorations, it must be incredibly bewildering to anyone who is not a coder, which is the ostensible audience given that the story starts off with, 'We are here because the editor of this magazine asked me, “Can you tell me what code is?”'

I completely agree. I got a third of the way through it before I just couldn't stand the obfuscation and decoration any further.

What's sad (as I [tweeted][1]) was that there's a 1972 article by Stewart Brand, published in Rolling Stone of all places, that does a better job of actually explaining what computers can do, without resorting to jargon and jive: http://stuartpb.github.io/spacewar-article/spacewar.html

[1]: https://twitter.com/stuartpb/status/609035295002984448





