You just went to the Google homepage. What actually happened? (plus.google.com)
808 points by jmonegro on March 20, 2013 | 144 comments



You just pressed a key on your keyboard.

Simple, isn't it?

What just actually happened?

You engaged a cascade of motor neurons to coordinate the contraction of thousands of muscle cells, which pull a lever attached to your calcium crystalline framework, grinding it across a glucosamine joint. This forces your calcium crystalline frame-member to depress, compressing your saline-filled lipid-polymer foam skin against the keyboard. As you do this, you constantly measure the pressure against the lipid-polymer walls to ensure you are not deforming your muscle cells too much or too little.

---

Reality has inordinate complexity. When humans build roads or build narratives or build websites, we are simplifying reality for ourselves and others, including other animals.


You just pressed a key on your keyboard. Simple, isn't it? What just actually happened?

The electromagnetic forces comprising the atomic structure of your fingertip repelled against those of the plastic polymer forming the key with which you made 'contact'. The two atomic matrices combine kinetically to move in the direction of overall force...

To add to your penultimate sentence: "Reality has fractal inordinate complexity".


Indeed!


I was expecting to see this kind of comment. Not that I entirely disagree, but isn't there something to be said for all the layers the author mentions having been directly designed by humans rather than being pre-existing facts of nature (minus a few of the electrical properties he goes into)?


I agree. I think OP is navel-gazing.

"You just baked an apple pie from scratch. What just happened?"

"Laws" of physics... Big bang... cooling... particles... hydrogen... stars... fusion... heavier elements...

Oh, no cosmology? Okay, the entire history of evolution, then civilization, then industry leading to humans with electricity and ovens and supermarkets.


Yeah. After all, as Sagan stated, "If you wish to make an apple pie from scratch, you must first invent the universe."


At least with technology, someone somewhere understood what is going on. With biology/biochem/physics all the way down... there is no expert who designed the system originally!

Can you imagine computer science if we had no prior knowledge of computers and had to research the entire process starting at the end point? What a task.


> someone somewhere understood what is going on.

Not quite. A huge group of someones in aggregate understood what is going on. There is still no single expert who designed the system originally. Eventually, or perhaps already in many cases, some portion of the system has no living expert who designed it.

I can imagine a future where expertise on some deep components of the system has been lost to the sands of time, and people have to study them blindly to determine their function in just the same way that we currently study nature.

Neat.

edit: I think raelshark and Carl Sagan said it better than I did.


In Vernor Vinge's A Fire Upon the Deep he describes the role of the programmer-archaeologist, whose job is to sift through mountains of abstraction layers and black-box code modules written in dead languages thousands of years prior, determine their function, and stitch them together to perform a desired function.

In the prequel, A Deepness in the Sky, one of these modules, buried beneath said layers, contains a booby-trap set by an ancient civilization that only the protagonist knows about. I believe the Unix epoch is mentioned briefly.


In the book, all time is measured in seconds since the epoch, because days and years have no meaning for an interstellar civilization. But the programmer-archaeologists believe the Unix epoch is when humans first landed on Earth's moon. :)


"The first day of the first full year after Man landed on Earth's moon" is probably a fair way to describe the UNIX Epoch in the far future.

I guess the other interesting events are first Fission (weapon or test), and when we first land on Mars.


Seconds don't have (consistent) meaning either, in a universe with a finite speed of light.


Why don't they just code new software from scratch?


It's a logical continuance of the current state of affairs. Even in the early 90s when these books were written, few people wrote entire applications in assembly language, or punched machine code into cards; they wrote in higher-level languages that were eventually compiled into machine code, or were interpreted by other tools whose internal workings they knew nothing about.

Hell, we are inventing entire languages today that compile to Javascript, which in turn is interpreted or JIT-compiled by other tools. Or we write advanced HTML5 frontends that talk to advanced Node.js backends that in turn talk to legacy systems written in COBOL running on IBM mainframes. Programmer-archaeologists already exist.


Because GitHub.


Or the co-founders/original engineers no longer work for the once-startup-now-corporate company, and 3rd-tier engineers are left to figure the rat's nest out, because of course, there are no docs.


It never really occurred to me that natural science is an effort to reverse-engineer the universe.


"We are a way for the cosmos to know itself."

-Carl Sagan


You just missed the whole point of the article.

No one anywhere fully understands how technology like a computer works. That's because it's possible to design systems on top of systems without fully understanding the nuances of all the dependencies.


Actually, you missed the point of my post.

I never meant to imply that one person understood the whole technology system (that's truly absurd), rather that each individual component was designed and implemented intentionally and it's likely that each niche in the industry had at least one person who truly understood it.

Now, take that paradigm and compare it to natural science: No one anywhere ever had any understanding of the system, and now we look at the end result of a constantly changing system and try to ascertain everything from how it works today to how it all began.

I apologize, I could have been more clear in my original post, sorry!


I think it's likely that what we now call computer science will end up being essential to understanding the systems of the universe...it just so happens that we are using computers to discover it, like previous investigators used weighted balls, inclined planes, or chemicals.


[deleted]


Pretty sure it's "to my religious views"

If you're gonna make a bad joke then at least you should learn something from it ;-)


When I got to the last paragraph or two, I realized this was not just a "computers are complicated" post. He points to a specific social problem that the complexity of the technology has created.

This is the best way to frame the "tech laws problem" I've seen in a long time. I'm curious: what is the best way to approach the bikeshedding issue?

On the one hand, the people who recognize the issue tend to be technical. On the other, the solution will inevitably be a social one, unless something comes along that makes patents and technological laws moot.

Here are three social avenues I could see being helpful, but none of them seems to solve the problem. I'd love to know what people are doing in this area.

a) Improve technical education for the general public so that they can call BS, or make reasonable decisions.

b) Improve technical education of public servants that make crucial decisions regarding technology. I'm not competent to rule in a legal case about pollution, so why should we assume judges are competent to rule in a legal case about code? (How do you measure that? Certifications? - egh).

c) Improve social outreach for technical people. Most technical people probably want to build cool things instead of sit in Congress, knock on doors, or otherwise get involved. I've talked with engineers who despise legal proceedings so much they started trolling the lawyers in depositions. Honestly I'd rather build something cool than think for five hours about how to get people to care about patent law. Maybe that should change.

I'd love feedback on this, because the bikeshedding issue is the scariest social problem I can't think of a solution to. It doesn't just affect a specific patent, it affects the way we rule on them in general.

If you are both a lawyer and technical, I would really love your feedback, here, or via email.


To me, bikeshedding is a signal from politicians, judges, and other public servants that they don't feel the problem is important enough for them to understand. Like us, they have limited cognitive resources (processing time, calories, actions-per-day, etc), so they have to be cautious about how they budget them.

How do we force technical information into the brains of our government employees?

OR

What incentives do we need to provide tech workers to be more politically active?


Eventually, it appears most jobs will be replaced either by computers or people who program them-- in the same way that most work from before the Industrial Revolution has been replaced by mechanical machines or literate-and-numerate-people.

"Non-technical people" will be replaced by ones who don't think of computers and programming as technology, any more than we do books and math. Programming magnifies individual task completion potential, allowing one person to accomplish through scripted automation what would otherwise require manual delegation. It's not software that's eating the world, it's programming. Software is computational state, but programming is a state of mind.

The "Computer Revolution" has happened, but programming illiteracy is still very high, programming fluency very low, and programming languages very primitive.

Around the French Revolution, over 200 years ago, when the modern printing press had already been around for almost 350 years, the literacy rate in France was just crossing 50%. [1] And world illiteracy has more than halved since the Unix epoch, 43 years ago. [2]

The only solution I see is to help spread "programmacy".

1. http://en.wikipedia.org/wiki/File:Illiteracy_france.png

2. http://en.wikipedia.org/wiki/File:World_illiteracy_1970-2010...


And as machine learning grows more popular, perhaps many programmers will be replaced as well.


As far as incentives for tech workers getting active, the recent spate of badly-written legislation (SOPA, CISPA, etc) has been acting as a great motivator. I literally do not personally know a single technology professional who isn't aware of these laws and their ramifications.

The great majority of them have informed their congresscritters and helped get the word out in some way as well.

Now if only there were a way to parlay that short-term political interest into something more long-term and substantial...


So before I was a lawyer I was an engineer, and most of my friends are still engineers. One of the things that I noticed then and still notice now is that tech people think they're special. They think that their problems are somehow qualitatively different from all the other problems that society has ever faced. They are simultaneously optimists (we can build anything!) and pessimists (we have no hope of influencing the system!). They are both self-centered (our things are the most important things!) and convinced of their own impotence (nobody cares about the things we do!). Indeed, this article exudes "we're special!"

If tech people want a world with sensible tech laws, the first thing they have to do is internalize one simple fact: computer tech isn't special. It's no different, in the grand scheme of things, than petrochemical refining or agriculture. It's just one specialized problem domain within a larger society.

That realization is simultaneously humbling and empowering. If tech is the same as everything else, then that means the same social tools that work for everything else can be leveraged to work for tech! And that means (c), lots and lots of (c).

But not just (c), even though (c) is the starting step. Ultimately, through (c) you can do (b). For example, a judge isn't a domain expert in petro-chemical refining either, but they make rulings on petro-chemical refining all the time and it works more or less well. That's because the system is structured so as to not require judges to be experts in everything. It is structured so people versed in a specific problem domain, be it petro-chemical refining or code, can explain in plain terms the moving parts of his case, and the judge, generally a highly intelligent person, can make decisions based on those explanations. And ultimately, through (c) you get to (a). Ultimately, the burden is on tech people to convince the public at large to care about the things that they care about.

I've used this example elsewhere, but I think it's a really important one. The tech industry complains up and down about its inability to fight the "big money" of the media companies. Yet, the entire U.S. movies and music industry put together are about $50 billion in domestic revenue per year, or equivalent to just Apple's revenues in just one quarter. You're telling me that the tech industry can't fight the "big money" of an industry that's a fraction of its size? Please! Another example: Apple's revenues and profits are about the same as Goldman Sachs, Morgan Stanley, and JP Morgan Chase combined. Tech isn't the skinny schoolboy getting picked on by the big kids--it's the behemoth. The only sector that can compare is the petro-chemical sector.

We live in a democracy. In a democracy, you can't just sit around waiting for everyone else to realize how wonderful and special you are and legislate to further your interests. You have to integrate. You have to participate in the political process. You have to explain to policymakers the moving parts of your industry, and you have to convince the public to care about the things you care about. And you have to accept that the policy makers sometimes will not agree with you (because they're balancing a broader array of interests than just your own), and you'll have to accept that the public won't necessarily buy into your worldview. But when that happens it's not an excuse to take your marbles and go home.

For a contra-example, look at environmental legislation. Environmentalists have been incredibly successful considering there is very little money behind the movement, and that the people on the opposing side of the table are petro-chemical giants, each of which are 2-10x as large as the entire domestic media industry that tech people think are too monied to be overcome. Yet they have been remarkably successful given those odds! Why? Because they don't hole themselves up in ivory towers. They participate in the political process. They translate their value systems into things that perk up the ears of politicians (this environmental bill might cost a few jobs, but it will be more than made up for by the avoided health costs from the reduction in pollution!) Jobs, costs, etc. Those are things politicians care about, and indeed those are the things they're elected to care about! Sometimes, they even fight dirty. They participate in the war that is living in a democratic society with competing factions.


I'd argue that software and digital technology are different from everything that the legal system has legislated on in the past because this is the first time we've been able to make 100% accurate copies for zero cost. (Ok, maybe not zero with the cost of electricity and storage, but essentially zero.) I also feel that the legal protections of the patent system, while perhaps not completely broken, are certainly not tuned to the realities of software development. Software is different. It's sui generis and I believe our laws should be adjusted to reflect that.

That said, I agree completely with your point about integration. The worst thing we can do as a community is step aside and allow others to create legislation that is not in the interest of technology or the good of the people at large.

Edit: minor grammar.


We have had, for hundreds of years, technology that makes the cost of each marginal copy of a creative work some tiny percentage of what people are willing to pay for that work. Digital technology making that percentage even smaller doesn't fundamentally change anything. There is nothing magic about "essentially zero cost" copies versus "negligible cost" copies.

This is largely because our whole system of property is structured so that marginal cost is broadly irrelevant. We have a pervasive notion in our system that people are entitled to the "benefit of the bargain." That is to say, people are entitled to profit from the difference in price people are willing to pay for something, based on supply and demand, and the marginal cost of producing that something. That's why Apple can sell for $600 iPhones that cost only $207 to produce, or why Louis Vuitton can sell for thousands of dollars handbags that cost maybe $100 to produce. The marginal cost of production is irrelevant, from the buyer's viewpoint, in anything we buy. So why should it be different for digital goods?
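To put rough numbers on the "benefit of the bargain" point, a small worked sketch in Python (the iPhone and handbag figures are the ones quoted above, used purely for illustration; the $10 digital-copy line is a made-up example to show the same arithmetic):

    def gross_margin(price, marginal_cost):
        """Fraction of the selling price kept after the marginal cost of one more unit."""
        return (price - marginal_cost) / price

    # Figures quoted above -- illustrative, not audited.
    print(f"iPhone:       {gross_margin(600, 207):.1%}")    # 65.5%
    print(f"Handbag:      {gross_margin(3000, 100):.1%}")   # 96.7%
    print(f"Digital copy: {gross_margin(10, 0.001):.1%}")   # 100.0% -- same logic, smaller cost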


I have issues with even the concept of software and design patents, far more than with copyright. With copyright, I only feel the terms have gotten out of control.

With software patents, I firmly believe that if an implementation isn't either difficult or novel, it shouldn't be patented... For example, the Apple page-flip animation: the effect is a simulation of a real-world behavior (non-novel), and the implementation details are very simple (given that the hardware interfaces are mostly solved problems, as are the computational logistics). The hardware involved could certainly be patentable, as could the original implementations (now older than patents). For the most part, anything that simulates a real-world activity on generally available computing hardware should not be patentable; it's usually very obvious, and often trivial to implement.

I also feel that even if software patents were to be protected, it should be much more limited, perhaps 5 years. If you can't gain an advantage in software with a 5 year head start, you don't deserve to win. That's just how I feel about it.


You're right, the marginal cost argument is wrong, and yet there is something different going on here and I think I know what it is: The fundamental change here is that customers own the means of reproducing the product and reproduction costs are equal to the marginal cost. How much could Apple sell an iPhone for if the same was true? What would Apple do to remain in business?


I think that's worth repeating: "the ownership of the means of production has changed". Intentionally rephrased to allude to a certain economist.


It wasn't my intention to focus on economics or digital products. What I am really arguing is the more fundamental aspect of information and the ability to copy and transmit it with 100% accuracy at great speed. This, I think, separates digital technology from all others. It's what makes it revolutionary and desirable. I think the impacts on society are obvious.

Edit: typos


I would agree that software is different in this way, but the difference is quantitative, not qualitative. Other industries have had low costs of reproduction, but this was coupled to a high cost of initial investment. Software is unique because you can credibly design and build a globally successful product on your own, and not have to involve anyone in the production and distribution of it.

This quantitative difference has revealed that the IP emperor has no clothes. Before there was an illusion that copyright and patents were a benefit to society in their current form, but this illusion is now stripped away. With 3d printing and autonomous delivery vehicles we're at the dawn of the softwarization of the material world. If we can figure out the best IP laws for software, those will eventually be the best for most if not all industries.

I personally believe that means that patents have to go. They cause more harm than they benefit. This is obvious for software, and eventually it will become obvious in all industries.


Software and digital technology are different from everything that the legal system has legislated on in the past because reproduction and distribution (basically) are free, instantaneous, limitless, and available to everyone.


> computer tech isn't special. It's no different, in the grand scheme of things, than petrochemical refining or agriculture.

True, however I wonder if people think computer tech is simpler because they own some. Ask someone if they understand petro-chemical processing and they'll probably say no, but ask if they understand computers and they might think they know a lot because they use one all the time. They've even had to change some options in a password-protected preferences panel, or use a keyboard shortcut! They can get their phone to sync with their two computers, and it all just works.

Tech has (particularly recently) become common and very simple, and we live now in a time when you can get your granny an iPad with Siri and she can use it. While driving the usability of everything up, we've also been pushing the idea that "it's simple, really! Don't be scared", and that's worked wonders. I get frustrated when a confirmation email takes more than a few seconds to show up in Gmail; how ludicrous is that? I got annoyed when Skype went blocky and the sound kinda crackled while talking to someone on the other side of the world for free, while on wifi. I caught myself thinking "But you just send the thing from here to there, it's so simple!" and thought about it more.

I think that was the point of this article, it's phenomenal complexity hidden behind a fantastically simple interface. A lot of people have poured a huge amount of money into making it feel simpler, Apple are a great example of that. You can talk to your phone and it'll sass you back.

I wonder which other fields have this same problem? I know there will be some, because I'll be one of the people thinking it's really simple when it isn't. Maybe medicine? People might think it's complicated for some things but there is a culture of 'Just make the right type of pill, duh'.


>The tech industry complains up and down about its inability to fight the "big money" of the media companies.

The case of the media industry is special because what they lack in dollars they more than make up for in airtime. Major media organizations can very easily make or break a candidate or an issue just in selecting which stories to cover.

This is starting to change with the internet, but there are still millions of voters whose primary source of information is cable news. Over time this is likely to change, the trouble is how to mitigate it in the meantime.


"Media companies" is the wrong word to use--I was talking about movies and music. I just don't see the music and movie industry really leveraging the airtime they have. What they do have is excellent lobbying. My wife (former lobbyist) explained it to me thus: Hollywood and the record labels have convinced politicians that they stand for three things: 1) America; 2) jobs; 3) American jobs. They portray music and movies as the uniquely American cultural export, one of the few industries where the U.S. is still globally dominant, and an industry that creates hundreds of thousands of jobs. They invested in this lobbying campaign early and have stuck to it adherently. And what's the opposition to these American Job Creators? A bunch of internet nerds who want to be able to play DVD's on Linux? There is no compelling counter-narrative from the tech side here, just a bunch of handwaving and bellyaching about how much Chris Dodd makes from the RIAA.


>"Media companies" is the wrong word to use--I was talking about movies and music.

They're mostly the same companies. NBCUniversal owns NBC/MSNBC and Universal Studios, News Corp. owns Fox News and 20th Century Fox, CBS owns Columbia records, TimeWarner owns CNN and HBO and New Line Cinema, etc.

>There is no compelling counter-narrative from the tech side here, just a bunch of handwaving and bellyaching about how much Chris Dodd makes from the RIAA.

Well, that's the irony, right? Hollywood puts together these commercials about how some union carpenter is going to starve to death if you don't pay $20/head to see American Pie 5; meanwhile, the people who are actually working unsustainably long hours for weeks or months at a time to put together applications that help dissidents not get executed in oppressive countries are getting shut down and harassed because of the same laws. But the latter are actually instances of The Little Guy, so they don't have a huge organization that can pay to put ads on the television or lobby Congress. So even though the sum total of jobs and economic growth by small startups is greater than it is from Hollywood, Hollywood is more organized.

What we get is lobbying efforts from the likes of Google, because they have the resources and the cohesiveness to make things happen. But that only works for the times when they're on our side. Things like DMCA 1201 cause the greatest harm to the smallest companies. They may harm the big guys too, but not as much, and often not enough to get them to push back hard enough to stop it.

It's basically a collective action problem. How do you get a million "internet nerds" to work together to make Congress understand the harm in the things they're doing to us?


Do we just need more lobbyists?


Define "lobbyist." The term carries a very negative connotation in the tech world, but fundamentally a lobbyist is just an advocate. As a student, I worked at our school's environmental advocacy clinic. We called up politicians, bureaucrats, the media, etc, to drum up support for our cause. That's lobbying! Chris Dodd aside, that's the bread and butter of what lobbyists do--explain to intelligent but not technically-specialized public officials the mechanics of how specific issues work and paint a narrative for them that convinces them to care about the issue.

So yes, the tech industry needs more advocates. People who can engage in the political process and translate from the value systems of the tech industry to the language politicians understand: jobs, growth, votes. That doesn't have to mean huge campaign donations and a senator or two on the payroll (though those things help). It does mean developing a real base that cares about tech issues (and keeps caring about them!) and having dedicated, politically savvy people willing to champion them.

Again, a comparison to the environmental movement is relevant. There are probably 100x as many (number I pulled out of a hat) people championing environmental causes as there are people championing tech causes. I know a number of very qualified people who opted out of private practice to work at environmental non-profits. Nearly every law school has an environmental law center where students participate in addressing local issues at the grass-roots level (things as mundane as nagging the City of Chicago to do better lead testing for its municipal water). Meanwhile out of all the lawyers I know with technical backgrounds (which is actually quite a few), approximately zero went to a non-profit to champion tech issues. There are some organizations that do great work on tech issues, like the EFF, but I'm not even joking when I say there are probably more environmental issues organizations in Chicago than there are tech issues organizations in the whole country.


I think the solution's simple: where traditional government oversight and legislation are ill-equipped to understand the problems about which they're legislating, they need to create a technocratic branch to further handle legislation where it seems a fit. A sort of technocratic release valve.

As far as how to appoint leaders or czars for the new technocratic branch, I'm stuck, but I'm sure someone could figure out a good way to incentivize nonpartisan experts to get involved. Maybe simply wikitize legislation, allowing a very decentralized passing of legislation.


>I think the solution's simple: where traditional government oversight and legislation are ill-equipped to understand the problems about which they're legislating, they need to create a technocratic branch to further handle legislation where it seems a fit.

You have to be careful with things like this because of regulatory capture. And also because setting up a group of people whose stated purpose is to regulate something causes them to try to slowly regulate every part of it, even in cases where private ordering would lead to better outcomes. A huge part of the existing problem is instances where Congress doesn't need to act but does anyway.


If you had to have your patent reviewed by a team of engineers before it was approved, then we would have fewer patents (we don't need a patent for a slick button). And to file a lawsuit, you would also have to have a team of engineers review it.

This could be a very good thing.


Patents are generally reviewed by degree holders in the same field. Submit a machine, it'll be reviewed by someone trained as a mechanical engineer. Submit a new formula for flubber, a chemist will be assigned to look at it.


I don't know how it is now, but when I graduated my understanding was that as long as your degree was in the general field, you were qualified to review the patent application. So a machine could be reviewed by someone with any engineering degree - industrial, chemical, civil, etc. And my brother's friend - a civil engineer - used to periodically call me to ask me about some coding-related patent. And I - definitely a non-hacker - would always respond "that's just a best practice. anyone who's had a few undergrad programming classes knows that. how can that be patentable". Hopefully, it's changed, but I wouldn't count on it.


I dunno. What I know about patents is what my lawyer has been telling me.

There's basically about a zillion ways for a patent to fail, examination is only one. For example, when I lodge in national offices, my application will become public. Anybody will be able to lodge objections and I will be on the hook for the lot, basically.


Most patents are reviewed by people with corresponding qualifications within the broad categories: mechanical, chemical, biological, electrical. Software tends to fall into the electrical camp.


a), b), and c) are all education problems, and I think for every other instance of unwashed masses catching up on technical knowledge, the only "solution" has been to wait decades for a generational shift, with knowledgeable people becoming teachers and parents.

People didn't understand cars, or electricity, or electronics either, but now everyone has a basic knowledge about the working of these, and they can get what the repair guy is saying.

IMHO for any new technology there is a minimum time, measured in years or decades depending on the complexity, to have it properly assimilated. Mainstream internet is 20 years old? Congresspeople should all be knowledgeable about it in 10 or 20 years.


> what is the best way to approach the bikeshedding issue?

Required background reading for anyone who hasn't seen it: bikeshed.com

Regarding (a): Give people lots of choices, and they'll learn to pick the right ones. There is a reason the two main smartphone platforms have completely won out over the older stuff. It's just really sad that there aren't more to choose from, since iOS and Android both have major areas of suckage.

Regarding (b): Public servants should not be making crucial decisions about technology. The government could never build the massively complex system described by OP, and regulation would have killed it quickly (probably by cementing an AT&T/IBM duopoly in technology).


Wow, it's easy to forget how much complexity there is, even for a technologist. This post reminds me to step back and appreciate everything that's going on under the hood. For a similar effect on non-technologists, see Louis C.K.'s bit about how even the shittiest technologies are a miracle[1].

I disagree with the conclusion though. I think the reason Steve Jobs' death impacted people more than Dennis Ritchie's is that Jobs was taken in his prime. Who knows what the world lost by his premature death.

1: http://www.youtube.com/watch?v=KpUNA2nutbk


"Any sufficiently advanced technology is indistinguishable from magic." -Arthur C Clarke[1]

This article has been posted at least once before on HN, and I liked reading it then, too. One of the first major insights I remember from my CS curriculum was the concept of abstraction -- in CS it's applied to code-as-data, the OSI model, etc, but it exists everywhere, including all engineering, large bureaucracies, etc. Thank you Prof Harvey!

1: http://en.wikipedia.org/wiki/Clarkes_three_laws
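As a toy illustration of that layering idea (a sketch only, not any real protocol stack): each layer knows how to add and remove its own header, and trusts the layer below for everything else.

    # Toy "protocol stack": each layer wraps the payload in its own header and
    # only knows how to undo its own wrapping -- abstraction in miniature.
    LAYERS = ["application", "transport", "network", "link"]

    def send(payload):
        for layer in LAYERS:                  # wrap, top of the stack first
            payload = f"[{layer}]{payload}"
        return payload

    def receive(frame):
        for layer in reversed(LAYERS):        # unwrap in the opposite order
            prefix = f"[{layer}]"
            assert frame.startswith(prefix), f"{layer}: malformed frame"
            frame = frame[len(prefix):]
        return frame

    wire = send("GET / HTTP/1.1")
    print(wire)           # [link][network][transport][application]GET / HTTP/1.1
    print(receive(wire))  # GET / HTTP/1.1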


I prefer the corollary, as motivation:

"Any technology distinguishable from magic is insufficiently advanced"


"Wow, it's easy to forget how much complexity there is, even for a technologist."

Complexity is there every day, we just abstract it away in order to focus on the task at hand.

If you are widely read or have worked in multiple fields, and/or multiple levels of the "stack", both hardware and software, then it's a bit more apparent.


Embedded systems engineers work on all these levels, simultaneously, day in and day out. But apparently our craft is kind of a dying one.

I've seen discussions on LinkedIn about how one gets into embedded systems engineering. Nobody seems to have a clear answer.

I've seen people get all hyped up on using small eval/dev boards like Raspberry Pi, but don't get much farther than loading a desktop or XBMC on it. There's hope in the Arduino crowd, but blinking LEDs isn't even putting the training wheels on the bike.


The sad thing is I was talking last week to a former coworker with 20 years of hardcore embedded programming experience and he is thinking of going into web programming because of better opportunities there.

I spent the first few years of my career doing hard (as in deadlines not effort) real time embedded software and I think it made me a better programmer. However being able to work from a laptop anywhere I want instead of being stuck in a test lab with bus analyzers, scopes, data analyzers and bond out emulators booping all around me is a nicer lifestyle I'll have to admit.


I am unsure that most good embedded programmers would have a personality type compatible with web programming.

Embedded software requires attention to detail, and a continual focus on reliability.

It seems to me that web development often involves shoving shoddy solutions out the door, and often encourages a certain cowboy mentality.

Alternatively, trying to make an elegant and correct solution for browsers (my goal) is frustrating and disappointing due to the thousands of meaningless browser bugs and standards faults, and the ugly compromises that are forced to be made... Embedded vendor toolchains and hardware bugs are a dream in comparison!

At least that is my experience coming from an embedded programming background and now five years working on deep DOM foo... Slowly poisoning my brain with browser crud. Unfortunately business reasons often lead to choosing the browser as a delivery platform regardless of its numerous downsides :'(


Totally agree. Being embedded usually means long hours in labs next to machinery that's half-working and half-smoking. You also don't get bean bags and foosball tables. I'm going to spend all Spring arguing with a customer to spend the extra $5 on their bill of materials to use a better processor that I won't have to band-aid 3 years from now when the current one goes obsolete or can't handle what I know their roadmap involves. This a $10,000 product with 500 EAU. That's $2500 retail. Google will spend $2500 on employee soft drinks before I finish typing this sentence.

I've been in 20+ years as well and I kick myself a lot over not going the way of the web developer. After all, what's web programming other than finding new and interesting ways to concatenate strings? =) Sorry, embedded dev joke there.

But as I get older and the greenhorns out of college can't tell me the difference between ASLA and ROLA, it kind of gives me some reassurance that my skills will be needed and valuable for the time to come.


Has anyone ever composed a chart of how the pay compares between various disciplines in EECS? Say, Java developer vs PHP developer vs embedded engineer vs appsec engineer?

Sounds like a fun HN project.


EETimes used to do an annual salary survey of various types of embedded/EE/system engineers, but I can't find a recent one. It's behind a paywall perhaps.


Agreed. If you're building something like an embedded NAS server, you have to have a deep understanding of literally every piece of technology he talked about.

I think Arduino has done wonders for getting people started, but it would be great if there were a similar ecosystem for people who are further along than blinking LEDs and want to build something of significance.


I don't think we'll ever see something like the Arduino IDE for more advanced systems. Some chipmakers have IDEs like Cypress and their PSoC Designer/Creator, but that still focuses on the chip alone and not the outside world.

The thing about embedded is that you're crafting a very specific hardware and firmware design for the task at hand. The project may begin with a generic setup like Arduino or Beagle or Rpi, but at the end it's a totally unique animal. There's no way any graphic or assisted design environment is going to be able to handle it all.


What about FPGAs? I feel like they could solve so much of this problem if the tools were better. Though I haven't used them since college, so I have no idea what the current status is.


FPGA are not what I would recommend for a "what-do-we-do-after-Arduino" type of device.

Hell, FPGAs are Jedi-Master class devices as far as I see it. This is the kind of firepower you call in when you have something that no off-the-shelf micro can do, like emulate an obsolete processor, or put 128 of them on one chip, or hash Bitcoins, image processing, etc etc etc.


Ah, no I didn't mean FPGAs as the next step for implementation, I meant more as a development tool.

For example, it would be cool if there was some sort of software that would let you design multiple-chip setups that would usually require breadboarding, but then run the whole circuit on the FPGA. That design could be transformed straight into a PCB that's ready for fab with the discrete components.


Ah, I see. But, of course, the practical-engineer part of my brain reads that and says "well, if it's working on the FPGA, ship the FPGA!" =)

I actually see a lot of potential in devices like the aforementioned Cypress PSoC. If you're not familiar, the PSoC family involves chips that have a core microcontroller with ram and Flash but also have a number of analog blocks inside as well. ADCs, capsense, op-amps, logic gates, muxes, comparators, timers, etc. You can use the Cypress IDE to wire the components up and connect them to I/O pins. The IDE can generate your configuration, or you can optionally reconfigure the chip while running via internal registers.

So it's kind of like an FPGA-type solution as you suggested. When you start developing with these chips, you're really creating hardware from software. It's interesting stuff.


Part of the problem may be that we've done too good a job of hiding embedded computers in places people don't expect, so people getting their hands on an Arduino or RPi don't even realize they could do those things with their new kit.


> I've seen discussions on LinkedIn about how one gets into embedded systems engineering. Nobody seems to have a clear answer.

I don't think there is a particular shortage of jobs in embedded systems relative to demand. A degree in electrical engineering along with programming skills is highly valued in the field. The players to target are also different. Not a huge demand for embedded engineers at, say, Facebook, but lots of demand for them at Raytheon or Juniper. Telecoms, wireless, defense, and aerospace hire up a lot of embedded software developers.


It frustrates me too. We have the technology to automate SO MANY of our processes, but it's almost like people enjoy doing them by hand.

I'm teaching myself to build a self-driving trash can, among other things.


Beautiful. One thing though, I believe the OP is mistaken when he implies that Steve Jobs didn't affect how computers work on the inside:

"On the one hand, I can imagine where the computing world would be without the work that Jobs did and the people he inspired: probably a bit less shiny, a bit more beige, a bit more square. Deep inside, though, our devices would still work the same way and do the same things."

Ultimately, computer architectures serve real world use cases. Innovation in use cases results in innovation in architectures. There are countless new technologies that exist because of the products that Apple invented.


Example?


The entire ultrabook category can be credited to the MacBook Air.

The iPhone drove the creation of the modern smartphone market. Before 2007, only businessmen carried Blackberries. Now pre-teens have smartphones. There's a really powerful image NBC posted comparing the Papal Conclave in 2005 and 2013[1] that tells the whole story. Similarly, the iPad is driving the tablet market.

And that's just 3 recent consumer-facing categories. The list goes on as you dig deeper or farther back. If you're using Chrome, guess who started Webkit?

1: https://twitter.com/neilgupta/status/312317589432971264

Edit: I should have said, guess who popularized Webkit. We can debate semantics on whether forking KHTML to Webkit makes Apple the author of Webkit or not, but I think we can agree Apple popularized it.


> If you're using Chrome, guess who started Webkit?

The developers of KDE. WebKit comes from a fork of KHTML, the engine used for Konqueror. Apple came into the picture 3 years later.


The macbook air is just a small laptop. In what way is making things smaller an Apple technology? Laptops have been getting steadily smaller for decades now.

The iPhone may have created a market, but that was because of how it worked. The question was whether anything Jobs did influenced the technology of the device. The iPhone was an early (but not the first) user of capacitive touchscreens, but virtually everything else in the device was an off-the-shelf part.

And FWIW: I know who started WebKit. It wasn't Apple. :)


Apple pressured Intel to design the smaller chipsets that made the Macbook Air, and subsequently other ultrabooks, possible. Without Apple's influence, we might not have ultrabooks.

Similarly, Apple pressured Corning to make Gorilla Glass. I would also argue the iPhone's UI has heavily influenced modern smartphone OS's to be more user friendly.

I'll cede the WebKit point, though I do think we can credit Apple for popularizing it.


Apple pressured Intel to design the smaller chipsets that made the Macbook Air, and subsequently other ultrabooks, possible. Without Apple's influence, we might not have ultrabooks.

I remember taking a tour of an Iomega production facility when I was in high school. The tour guide explained how Toshiba (or someone) was demanding a Zip drive that was 2mm thinner so they could include it in their latest slim laptops. Sony has been making tiny, powerful, expensive laptops for longer than the Air has existed.

All manufacturers have been pushing for smaller, better, faster, stronger components. If it wasn't Apple, it would have been someone else. Apple was just the best at taking credit.


Sure, Intel had to offer something to Apple to make them switch from PowerPC to x86. But you could also argue that Intel's ultra-efficient CPUs and chipsets are actually a result of the competition from companies like Transmeta. It's not at all obvious that without Apple there would be no ultra-efficient CPUs. (But without Apple we might well live in a world full of ugly netbooks.)


> It's not at all obvious that without Apple there would be no ultra-efficient CPUs.

That's a very slippery argument. It's pretty obvious that without Darwin, there still would have eventually been a theory of evolution (as seen in Alfred Wallace's work), but Darwin gets credit for being first. Similarly, it's not at all obvious that without Dennis Ritchie, there would be no C. I'm sure eventually somebody would have created a similar language, but Ritchie made it happen first, so he deserves credit.

Sure, Intel may have eventually come up with ultra-efficient CPUs without Apple, but the point is that Apple helped make it happen.

To be clear, I'm not comparing a CPU design to the works of Darwin or Ritchie. I'm simply saying that arguing that an invention would have occurred even without the inventor could be said for anything.


I think you got that one wrong. Transmeta happened way before ultrabooks, at a time when Apple was still shipping notebooks with a very power hungry G4. Because of Transmeta (and because of a funny thing called the power wall) Intel had to reconsider their entire processor lineup and start to focus on low power. IBM never produced a low power CPU for Apple, so Apple eventually switched to x86. To claim that Apple coerced Intel to focus on low power is not correct. Just look at the timeline.


How does Gorilla Glass even come close to enhancing the state of computing technology like Dennis Ritchie did?


Unlike C it was a net benefit even if a small one. Pascal was often the direct competitor to C and for low level tasks it's a much better language that lost out due to UNIX / tool chain support.


> The macbook air is just a small laptop.

Let's simplify:

The laptop is just a small computer. It has a form factor which conveniently houses the components of a full-sized desktop plus a mobile power supply, the engineering of which is largely due to CAD.


Exactly. The distinction you missed is that I wasn't trying to credit the inventor of the laptop with being a technological innovator along the lines of DMR.


Sure, Apple popularized many disruptive technologies like the GUI (courtesy of XEROX PARC) and the laser printer + PostScript (courtesy of Adobe/XEROX). They also pioneered desktop computing, smartphones, etc., building on technology made by Intel, IBM, etc.

But where are the "countless new technologies that exist because of the products that Apple invented"? I don't follow that argument. I think it's rather the other way around: the great products that Apple invented exist, because of the countless new technologies that already existed.

Apple and Jobs stood on the soulders of giants. And one of these giants was Dennis Ritchie. Stating that fact does not diminish Apple.


I am not the GP so I can't know for sure what he meant, but nobody is denying that Apple stood on the shoulders of giants. The argument is that people have also stood on the work Apple has done.

Apple created great products thanks to the countless technologies that existed. Likewise, other companies have created great products thanks to the countless technologies that exist, including those created by Apple. To claim that Apple has had 0 influence on the industry is just silly.


Sure, I don't claim they have zero influence. I just agree with the original post on google+ that non-technical people tend to overestimate Apple's influence on core technology because they lack technical understanding...


Isn't that the biggest sign of the amount of influence Apple has had? Technology is important, but without practical applications, it's useless. I could have a revolutionary nuclear reactor inside my laptop capable of powering a small country, but if I can't do anything with it, there's no point to all that tech. Apple has heavily influenced how technology is used, which in turn, has influenced how the core technology itself evolves.


I recognize that it was a typo, but solders of giants is so much more appropriate, somehow. Even if it doesn't really rhyme. :D


> The iPhone drove the creation of the modern smartphone market. Before 2007, only businessmen carried Blackberries. Now pre-teens have smartphones.

That's a very US-centric view of it: European "lambda consumers" were already heavily drawn to smartphones by the time the iPhone arrived, mostly via Nokia's "N" series (the N95, for instance, was a rather popular phone at the time).


Speaking as someone who owned both back in 2007, the original iPhone was still a quantum leap forward from the Nokia N95.

You could bullet-point the capabilities of each device and end up with surprisingly similar lists, but the functional superiority of the iPhone was still remarkably unambiguous. It was like holding an artifact from the near future.


> Speaking as someone who owned both back in 2007, the original iPhone was still a quantum leap forward from the Nokia N95.

Absolutely, I wasn't trying to say the smartphones of the time were fantastic, and there's clearly a pre-iPhone and a post-iPhone. My point was simply that the consumer world was not a smartphone-less vacuum before the iPhone; the US was.


The Nokia N95, a series-60 device with a numeric keypad, can't possibly be seriously taken as a smartphone. Such a claim wouldn't have even passed the sniff test back when it was released -- it was just a riotously expensive feature-laden feature-phone. Compared to the Windows Mobile and Blackberry devices of the time, it was pretty weak.


I'll take you way back and go with desktop publishing, which was pretty much kicked off with Apple's LaserWriter, which integrated PostScript and the laser printer (http://en.wikipedia.org/wiki/LaserWriter).

You could dispute any given example, because there's always a precursor (even if only visible in hindsight, http://booksofnote.blogspot.com/2012/10/kafka-and-his-precur...).


Still missing the point I think. The argument was that Jobs didn't influence any of the fundamental technology. PostScript was a rendering language from Adobe, and really not the first path-based rasterizer, but could broadly be called "technology" I guess. Laser printing was a technology. Integrating them by putting a computer running PostScript into a box containing a laser printer is not a technological development in any real sense.

That's not saying it doesn't have value. But it's not the kind of thing JBQ was talking about.


Disagree, but not arguing, because as I said, there's always a precursor.


Not the OP, but I assume he meant something like: HTML was created using a NeXT machine http://inventors.about.com/gi/o.htm?zi=1/XJ&zTi=1&sd...



Thank you - I'm constantly surprised how frequently the same information is churned on this site.


To add some cynical flavor: when you went to the Google homepage, you expended electricity, CPU cycles, network traffic, gave up some privacy and waited some time just to see something that could have been shown to you by a static HTML file(²) on your disk (or even something built in the browser), because some corporation sees some benefit in all this extra work. We're not exactly trying to do things efficiently these days, since it's no longer a priority.

(²) not valid for search results of course, but even they could be sent to you in a more efficient, less privacy-destroying way if only some corporation's interests weren't more important than your own
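For what it's worth, here is a minimal sketch of the static alternative being described: a few lines of Python that write a local page whose form submits straight to Google's /search endpoint (q is the standard query parameter). No scripts, no tracking pixels, nothing fetched until you actually search.

    # Write a bare-bones local "homepage": a single form that submits the query
    # to Google's results page. Open search.html in any browser.
    from pathlib import Path

    PAGE = """<!doctype html>
    <title>search</title>
    <form action="https://www.google.com/search" method="get">
      <input name="q" autofocus>
      <button>Search</button>
    </form>
    """

    Path("search.html").write_text(PAGE)
    print("Wrote search.html")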


The best bottom up approach to this that I have seen is Charles Petzold's "Code: The Hidden Language of Computer Hardware and Software" [1] which starts with using a flashlight to send messages and walks up the abstraction chain (switch, relay, alu, memory, cpu...) to most of the components of a modern computer. It's very accessible.

[1] http://www.amazon.com/Code-Language-Computer-Hardware-Softwa...
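In the same spirit, a tiny Python sketch of the first couple of rungs on that ladder: every gate below is built from a single NAND primitive, and a half adder falls out of the gates (a toy illustration, not code from the book).

    # One primitive, NAND; everything else is built on top of it.
    def nand(a, b):
        return 0 if (a and b) else 1

    def inv(a):     return nand(a, a)
    def and_(a, b): return inv(nand(a, b))
    def or_(a, b):  return nand(inv(a), inv(b))
    def xor_(a, b): return and_(or_(a, b), nand(a, b))

    def half_adder(a, b):
        """Add two bits, returning (sum, carry). The next rung up is a full adder."""
        return xor_(a, b), and_(a, b)

    for a in (0, 1):
        for b in (0, 1):
            print(a, b, "->", half_adder(a, b))
    # 0 0 -> (0, 0)   0 1 -> (1, 0)   1 0 -> (1, 0)   1 1 -> (0, 1)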


What's even more cool?

Before long computers will be able to conceptualize the whole complex mess.

Consider how we acquire knowledge (perhaps like http://matt.might.net/articles/phd-school-in-pictures/). The more we've learned, the more we need to know about how to learn. At each level of knowledge we gain knowledge by accessing new knowledge and combining it with what we know. Eventually the supply of new knowledge dwindles and the only tool we can rely on is learning; the only tool we can rely on is that ability to combine knowledge.

This is much harder than taking in new knowledge; especially for computers! However, computers are getting better and better at it. Whereas many of us are out of college, computers are still in middle school, but they're getting better and better at both large scale and high complexity learning, so they'll move on to high school and college soon. Moreover, they're moving at an exponential pace! (see Ray Kurzweil and his ideas on the exponential growth of technology) Eventually, no... soon, computers will be able to conceptualize and intuit the scale and complexity of something like google.com. No person can come close to this, so we have no idea what that ability will bring.


When it comes to complexity, I think it's a miracle that spinning hard disk drives actually work.


I have told people before, "every time you make a phone call on your cell phone, a miracle happens". The little bit I know about the complexity behind a mobile network is mind boggling -- even just at the physical layer, dealing with reflections, power levels, etc.

Though I guess I can sort of understand how that works. What I can't understand is that I can understand anything... I simply can't fathom how the human brain works...


I simply can't fathom how the human brain works...

Look for some popularized science articles (e.g. from arstechnica.com/science) and books (e.g. from Ray Kurzweil) that mention neural networks, fMRI, connectome, cellular automata, etc. There are competing theories, but there's enough accessible information out there to dispel some of the mystery.


In terms of epistemology (e.g., how can we form valid concepts and principles and be sure they're right), or in terms of neuro-biology/neuro-chemistry?


More the epistemology than the biology. The biology makes some sense on some level -- it's all "just chemistry" at the bottom.


I'm with you. "It's all just chemistry" is sufficient for me for most things, but I worry a lot more about epistemology, which is why I have spent some time studying it.

Until recently, there were two basic camps on "how we know stuff." The Rationalists and the Empiricists.

The Rationalists thought that knowledge came from mental abstractions, not from reality. For example, Plato believed that we could all access the "ideal" world of "forms" by just introspecting. The medieval scholastic philosophers tended to be very rationalistic - they made arguments about abstractions not connected to real experience, like how many angels could fit on the head of a pin. Descartes was a rationalist - "I think, therefore I am" is rooted in the idea that we look inward to perceive reality.

The Empiricists, which are somewhat more modern, were a reaction to Rationalism, and rejected all of that. They said that all we can really "know" or "trust" is sensory experience. We can't build up complex mental models, and we can't have principles - we just have to be pragmatic all the time and do what seems best, because we can't really "know."

Kant tried to improve on both of these approaches by saying that we can't really know anything about "true" reality, because reality has to be filtered through the senses. There is only subjective "reality."

Ayn Rand, of whom I am a big fan, went in the other direction from Kant, but also rejected Rationalism and Empiricism. She said that you can build complex models of reality, but they have to be based on actual observations of reality (i.e., sense data). To do this, she proposed a new theory of concept formation (i.e., how to form abstractions based on observation of reality). Her approach is very Aristotelian, which is interesting because Aristotle was the main guy who disagreed with Plato at the very beginning. I'd highly recommend "Introduction to Objectivist Epistemology" if you're interested in this approach (although you may be better served by starting at a less advanced level... still, the book is short and very accessible, albeit kind of a mind trip).


I actually love how simple they seem compared to solid state memory. I mean the storage part is a disk coated in magnetic material, and you read/write data using a magnetic head.


I disagree with this assessment: on first examination, both are quite simple. Magnetic memory is as you described; the other stores charge on the plate of a capacitor. We have two effects, one quasi-magnetostatic, the other quasi-electrostatic.

Now if we dig down into the details, both look seemingly impossible. For hard drives we have read heads that must sweep so close to the spinning platter that they use the wing ground effect to float just over the surface. We can no longer read data with the write coil (not enough gain), so instead we use spin-polarization sensors that pass currents of electrons with only one intrinsic spin and then measure the effect the magnetic field of the platter had on the spin of the electrons in the current.

In the case of solid state memory, we are able to pattern silicon wafers with a minimum feature size of 22nm and falling. A single chip is a seemingly miraculous network of chemically deposited thin films and optically patterned cutouts, all at a size that beats the diffraction limit of light. The mask (imagine an overhead projector's transparency) cannot be shaped the same as the intended end pattern of the silicon due to diffraction at those very small scales, but never fear! We have figured out how to calculate what our mask must look like to beat the diffraction limit. It's all amazing stuff.
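
To get a feel for how far past the "naive" optical limit that is, the usual back-of-the-envelope is the Rayleigh-style resolution estimate, half-pitch ~ k1 * wavelength / NA. A quick check in Python (the wavelength, NA and k1 values are typical published figures for 193nm immersion lithography, not tied to any particular fab):

    def min_half_pitch_nm(wavelength_nm: float, numerical_aperture: float, k1: float) -> float:
        """Rayleigh-style resolution estimate: half-pitch ~ k1 * lambda / NA."""
        return k1 * wavelength_nm / numerical_aperture

    # 193 nm ArF light through a water-immersion lens (NA around 1.35).
    print(min_half_pitch_nm(193, 1.35, 0.61))  # ~87 nm: the textbook single-exposure limit
    print(min_half_pitch_nm(193, 1.35, 0.25))  # ~36 nm: near the practical floor, even with
                                               # OPC, phase-shift masks and friends
    # Getting to 22 nm and below on 193 nm light also takes multiple patterning,
    # which is exactly the "mask doesn't look like the final shape" trick above.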


I agree that it's amazing and interesting stuff, but both (storage) principles actually still seem quite plausible when scaled down.

My point really was that one is to this day made of a disk coated in magnetic material (with the read/write head and electronics getting more sophisticated over the years) while the other uses billions upon billions of transistors that are "etched" out of silicon wafers using an ever more complicated process.


I work in a goddamned hard-disk factory, and I think it's a miracle they work!


So I like how he ends with that bit on patents. When I did my presentation at the USPTO Silicon Valley roundtable event a month ago, a couple of the presenters made the case that absolutely nothing needs to change with software patents, because computers should be treated the same as any other kind of machine, and so software should be treated the same as every other type of patent. The fact of the matter is that this is simply not true: computer software cannot be equated to physical items, and barely equates to business flows and methods, given all the complexity involved and the fact that trying out a new version of software happens in seconds, where making a prototype of some machine or object takes days, weeks or months. It seemed they either didn't fully understand the difference, or they understood and do not want the system to change, since it works greatly in their favor.

Anyone who has an opinion on patents, especially software patents, should be keeping up with the roundtable events. And I'm not saying that just because I went: things are being discussed at these events that will either be ignored or will shape the patent system one way or another. In either case, it's in our best interest to stay involved in the process.

Edit: Spelling


>trying out a new version of software happens in seconds, where making a prototype of some machine or object takes days, weeks or months //

Aren't most prototypes "made" in software nowadays, with only the final products being fully physically produced?

In general I think I agree with what you're saying. In Europe software patents, as such [!], are not allowed, but patents on software that makes a technical effect, i.e. performs a real physical change to a system, have always been allowed. It's very hard to pin down the boundary, but I think this is something the board got right.

That said, I personally think that all manufacturing rates have increased greatly since patent terms were set, and that the terms should be decreased to compensate for this change in the rate of development.


That's a valid observation I hadn't made yet, concerning manufacturing in general. You're right about prototypes being made with software as well; I mostly meant that physically producing a prototype is different from producing a prototype of software, given the click-and-compile nature of the latter. All in all, though, I agree that the terms really need to be reconsidered, but I don't see our legislative branch making any real changes to the system in general; it works out pretty well for them right now.


From another perspective this is a modern take on the classic "I, Pencil" - which Friedman gave a great overview of:

http://www.youtube.com/watch?v=OlTRau_XgGs


Except this one isn't well written, and smacks of postmodern impossibility instead of structuralist awe.


It's a scary thought: if all the computers in the world were destroyed in an instant, how long would it take us to build a Core i7 processor? Decades?


I was just talking to another developer about this, except we took it one step further: if all man-made objects disappeared in an instant, how long would it take us to build a modern computer? The best we could come up with was: a really long time. Anyone have any input or recommended reads that pertain to this?


I think, in the best case (the right people are all together, and the right raw materials are readily available), you could probably get a modern computer built in 10 years, and a relatively basic microprocessor built in a few years less. If we don't have to reinvent a global communications system to get experts talking to each other, build mines and refineries to produce chip-grade silicon or rebuild airplanes to get materials from around the world, this task is a lot simpler.

The biggest problem isn't that modern computers depend on some piece of technology or process that is lost to us. It's that all the people who understand every step in that chain, from survivalist blacksmiths who can smelt ore to make tools to industrial chemists who can fashion substances that nature doesn't readily provide to electrical engineers and developers who will actually program the damn computer in year 9, are currently widely dispersed. Put them together and give them what they need and I think they could get the job done more quickly than you think.

Of course, it's more likely that everything and everyone you need will be scattered around the world when this instant occurs, and if that's the case, I'd honestly bet it would take hundreds of years to independently reinvent a modern computer because you'd need to reinvent most of modern society to gather the people and the materials to do that.


From observing reenactments of early American life, I think 18th-century agricultural technology is self-sustaining and directly bootstrappable, except for the problem of mining and working iron. So my guess is not much longer than 300 years.


There exists a viewpoint that in case of a cataclysm (which would involve man-made objects disappearing) we would never, ever progress past 18th century tech again.

The argument is that getting from animal-powered devices to solar/nuclear/whatever powered devices while at the same time switching from 90%-agricultural workforce to anything more progressive can happen only if there is a cheap source of energy available - and we already have mined and spent all of easily available fossil fuels.

Even if all kinds of fancy devices are available and constructed by rich enthusiasts, the lack of cheap steam power means no cheap steel and the like, so the technologies never get the mass adoption required to improve them. There's almost no advantage to industrialization, and the world gets stuck in feudal agriculture as the local optimum.


I wonder how far you'd get with, say, bamboo and bamboo-charcoal as an abundant and renewable fuel?

http://opinionator.blogs.nytimes.com/2012/03/13/in-africas-v...

It wouldn't have quite the pop of surface-minable coal and oil, but if you were pushing for high technology it would boil water, drive steam engines - but maybe it wouldn't happen in England. Where you'd get the metal from is another question.


Brilliant stuff that I have been looking for, for quite some time. I work in support and sometimes I really want to show this to our users, yet I would love to see something more human. Like: Jennie from Sales Support just used mail merge in Word to print a letter. Simple, isn't it? What just actually happened? The letter will go to the post box, where the postman will take it to a sorting center, where it will be distributed and sent to the customer, who will read the letter, discuss it with the family, and take a number of steps to decide how to proceed further.

Or a guy from sales just received a call, but what just happened? The client was recommended by a friend after a splendid experience with the product, spent a bunch of time reading a number of sites and reviews, and was just about to purchase the product when the sales guy's computer crashed and the client was lost. We always try to make people understand IT better, and the other folks try to make us understand their processes better, yet it is going to be a never-ending dialogue, process, fight...


I've previously used this as an interview pre-screen question for job candidates. It's a great question to ask; there are so many levels of details involved.

As a web company, you generally want potential employees to at least mention "HTTP" in the response. DNS is great. TCP/IP too. You'll definitely weed out some people who don't have a clue.
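
For candidates who want to make those layers concrete, the whole thing can even be walked through in a few lines of Python: name resolution, then a TCP connection, then a bare HTTP request. This is only a sketch over plain port-80 HTTP; real browsers add TLS, caching, connection reuse and much more:

    import socket

    host = "www.google.com"

    # DNS: turn a name into addresses.
    addresses = {info[4][0] for info in socket.getaddrinfo(host, 80, proto=socket.IPPROTO_TCP)}
    print("DNS says:", addresses)

    # TCP: three-way handshake to one of those addresses.
    conn = socket.create_connection((host, 80), timeout=5)

    # HTTP: a minimal GET request over that byte stream.
    conn.sendall(b"GET / HTTP/1.1\r\nHost: www.google.com\r\nConnection: close\r\n\r\n")
    reply = b""
    while chunk := conn.recv(4096):
        reply += chunk
    conn.close()

    print(reply.decode("latin-1").split("\r\n")[0])  # e.g. "HTTP/1.1 200 OK"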


This question is one of my favourites when interviewing front/back-end developers and sysadmins. It gives candidates the chance to talk about the part of the stack they find interesting, so I can see what they are most knowledgeable/passionate about as well as judging their overall level of background knowledge.


What if I don't use docsis? :P

Also, we understand quite well how chips are automatically organized by other chips. It's not because we don't understand it that computers are "building" computers; it's because they're way faster at those repetitive tasks (and yes, I'm talking about automatic chip layouts, for example; a toy version of that kind of pass is sketched at the end of this comment).

Even though it's not exactly TFA's point (TFA spends pages and pages on complexity to point out that people care about what they see, not what it really does), I thought that was worth mentioning.
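
For a taste of what those repetitive layout tasks look like, here's a toy placer: scatter some cells on a grid, score the placement by total wire length, and let simulated annealing grind through thousands of swaps no human would have the patience for. Everything in it (the netlist, grid size and cooling schedule) is made up purely for illustration:

    import math
    import random

    random.seed(1)

    # A made-up netlist: 20 cells, each net connecting a few random cells.
    CELLS = list(range(20))
    NETS = [random.sample(CELLS, 3) for _ in range(30)]
    GRID = 6  # place cells on a 6x6 grid

    def wirelength(placement):
        """Sum of half-perimeter bounding boxes, the usual placement cost."""
        total = 0
        for net in NETS:
            xs = [placement[c][0] for c in net]
            ys = [placement[c][1] for c in net]
            total += (max(xs) - min(xs)) + (max(ys) - min(ys))
        return total

    # Start from a random placement (one cell per grid slot).
    slots = random.sample([(x, y) for x in range(GRID) for y in range(GRID)], len(CELLS))
    placement = dict(zip(CELLS, slots))

    cost, temp = wirelength(placement), 10.0
    for step in range(20000):
        a, b = random.sample(CELLS, 2)
        placement[a], placement[b] = placement[b], placement[a]  # try swapping two cells
        new_cost = wirelength(placement)
        if new_cost <= cost or random.random() < math.exp((cost - new_cost) / temp):
            cost = new_cost                                      # accept the swap
        else:
            placement[a], placement[b] = placement[b], placement[a]  # undo it
        temp *= 0.9995                                           # cool down slowly

    print("final total wirelength:", cost)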


My favorite one-sentence version of this, said by a comedian (whose name I forget): If you're left in the woods with nothing, how long before you can send an email?


I'm just glad I understand technology at least to some extent. I mean technology is often so complex I could never put together most of it myself or even understand it down to intricate detail, but at least I have a grasp of how it all fits together to make a pc, cpu, memory, television, radio, phone, remote control, ...

I can't imagine going through this life without having the faintest idea of how a lot of the stuff I use every day actually works.


The problem is not only with communication between end-users/management, it's also a problem with communication between teams -- for the same reason.

Because of the huge amount of complexity described, it becomes impossible for one developer -- or one group of developers working on the same project -- to understand at one time much more than their current specialization. This makes it hard to talk to peers working on other projects.


Great post. I wrote something similar a while back - http://techcrunch.com/2012/06/16/the-way-things-work/

But he takes it to another level. There's a lot to be said on this, and education is super important, but ultimately one has to sort of ... surrender, at least to some degree.


I've long contemplated the depths of technological stacks (and of knowledge as a whole; it pertains to epistemology as well), and my opinion is that knowing their intricacies isn't that important, as long as we manage to archive their fundamental principles somehow (be it in our heads through study and transmission, or in data).


Hypothetical question: if all the knowledge in the world was available to you, how long would it take to build a modern electronic device from raw materials and without using any existing machine in the build process? I mean the actual time spent building something, excluding any mental work.


Forget modern electronics, consider how long it would take to build a simple electric toaster from scratch:

http://www.thetoasterproject.org/


Another Abelson/Sussman gem comes to mind: "The secret to engineering is knowing what NOT to think about"


I don't have access to Google Plus in office. Can someone post it here? I am very curious.


[https://plus.google.com/112218872649456413744/posts/dfydM2Cn...]

Dizzying but invisible depth

You just went to the Google home page.

Simple, isn't it?

What just actually happened?

Well, when you know a bit about how browsers work, it's not quite that simple. You've just put into play HTTP, HTML, CSS, ECMAScript, and more. Those are actually such incredibly complex technologies that they'll make any engineer dizzy if they think about them too much, and such that no single company can deal with that entire complexity.

Let's simplify.

You just connected your computer to www.google.com.

Simple, isn't it?

What just actually happened?

Well, when you know a bit about how networks work, it's not quite that simple. You've just put into play DNS, TCP, UDP, IP, Wifi, Ethernet, DOCSIS, OC, SONET, and more. Those are actually such incredibly complex technologies that they'll make any engineer dizzy if they think about them too much, and such that no single company can deal with that entire complexity.

Let's simplify.

You just typed www.google.com in the location bar of your browser.

Simple, isn't it?

What just actually happened?

Well, when you know a bit about how operating systems work, it's not quite that simple. You've just put into play a kernel, a USB host stack, an input dispatcher, an event handler, a font hinter, a sub-pixel rasterizer, a windowing system, a graphics driver, and more, all of those written in high-level languages that get processed by compilers, linkers, optimizers, interpreters, and more. Those are actually such incredibly complex technologies that they'll make any engineer dizzy if they think about them too much, and such that no single company can deal with that entire complexity.

Let's simplify.

You just pressed a key on your keyboard.

Simple, isn't it?

What just actually happened?

Well, when you know a bit about how input peripherals work, it's not quite that simple. You've just put into play a power regulator, a debouncer, an input multiplexer, a USB device stack, a USB hub stack, all of that implemented in a single chip. That chip is built around thinly sliced wafers of highly purified single-crystal silicon ingot, doped with minute quantities of other atoms that are blasted into the crystal structure, interconnected with multiple layers of aluminum or copper, that are deposited according to patterns of high-energy ultraviolet light that are focused to a precision of a fraction of a micron, connected to the outside world via thin gold wires, all inside a packaging made of a dimensionally and thermally stable resin. The doping patterns and the interconnects implement transistors, which are grouped together to create logic gates. In some parts of the chip, logic gates are combined to create arithmetic and bitwise functions, which are combined to create an ALU. In another part of the chip, logic gates are combined into bistable loops, which are lined up into rows, which are combined with selectors to create a register bank. In another part of the chip, logic gates are combined into bus controllers and instruction decoders and microcode to create an execution scheduler. In another part of the chip, they're combined into address and data multiplexers and timing circuitry to create a memory controller. There's even more. Those are actually such incredibly complex technologies that they'll make any engineer dizzy if they think about them too much, and such that no single company can deal with that entire complexity.

Can we simplify further?

In fact, very scarily, no, we can't. We can barely comprehend the complexity of a single chip in a computer keyboard, and yet there's no simpler level. The next step takes us to the software that is used to design the chip's logic, and that software itself has a level of complexity that requires going back to the top of the loop.

Today's computers are so complex that they can only be designed and manufactured with slightly less complex computers. In turn the computers used for the design and manufacture are so complex that they themselves can only be designed and manufactured with slightly less complex computers. You'd have to go through many such loops to get back to a level that could possibly be re-built from scratch.

Once you start to understand how our modern devices work and how they're created, it's impossible to not be dizzy about the depth of everything that's involved, and to not be in awe about the fact that they work at all, when Murphy's law says that they simply shouldn't possibly work.

For non-technologists, this is all a black box. That is a great success of technology: all those layers of complexity are entirely hidden and people can use them without even knowing that they exist at all. That is the reason why many people can find computers so frustrating to use: there are so many things that can possibly go wrong that some of them inevitably will, but the complexity goes so deep that it's impossible for most users to be able to do anything about any error.

That is also why it's so hard for technologists and non-technologists to communicate together: technologists know too much about too many layers and non-technologists know too little about too few layers to be able to establish effective direct communication. The gap is so large that it's not even possible any more to have a single person be an intermediate between those two groups, and that's why e.g. we end up with those convoluted technical support call centers and their multiple tiers. Without such deep support structures, you end up with the frustrating situation that we see when end users have access to a bug database that is directly used by engineers: neither the end users nor the engineers get the information that they need to accomplish their goals.

That is why the mainstream press and the general population have talked so much about Steve Jobs' death and comparatively so little about Dennis Ritchie's: Steve's influence was at a layer that most people could see, while Dennis' was much deeper. On the one hand, I can imagine where the computing world would be without the work that Jobs did and the people he inspired: probably a bit less shiny, a bit more beige, a bit more square. Deep inside, though, our devices would still work the same way and do the same things. On the other hand, I literally can't imagine where the computing world would be without the work that Ritchie did and the people he inspired. By the mid 80s, Ritchie's influence had taken over, and even back then very little remained of the pre-Ritchie world.

Finally, last but not least, that is why our patent system is broken: technology has done such an amazing job at hiding its complexity that the people regulating and running the patent system are barely even aware of the complexity of what they're regulating and running. That's the ultimate bikeshedding: just like the proverbial discussions in the town hall about a nuclear power plant end up being about the paint color for the plant's bike shed, the patent discussions about modern computing systems end up being about screen sizes and icon ordering, because in both cases those are the only aspect that the people involved in the discussion are capable of discussing, even though they are irrelevant to the actual function of the overall system being discussed.

CC:BY 3.0
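
A side note on the keyboard paragraph: the "debouncer" it mentions is one of the few pieces of that whole stack you can sketch in a handful of lines. A minimal illustration in Python (the sample count and the fake noisy key signal below are invented; the real thing lives in the controller's firmware or in dedicated logic):

    class Debouncer:
        """Report a key state change only after N identical raw samples in a row."""
        def __init__(self, samples_required: int = 5):
            self.samples_required = samples_required
            self.stable_state = False   # debounced output: key up
            self._candidate = False
            self._count = 0

        def sample(self, raw: bool) -> bool:
            if raw == self.stable_state:
                self._count = 0            # nothing changing, stay put
            elif raw == self._candidate:
                self._count += 1           # change persisting, keep counting
                if self._count >= self.samples_required:
                    self.stable_state = raw
                    self._count = 0
            else:
                self._candidate = raw      # new tentative state, restart the count
                self._count = 1
            return self.stable_state

    # A fake, bouncy key press as a ~1 kHz scan loop might see it:
    raw_samples = [0, 0, 1, 0, 1, 1, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1]
    deb = Debouncer()
    print([int(deb.sample(bool(s))) for s in raw_samples])
    # The output stays 0 through all the contact chatter and only flips to 1
    # once the key has been solidly closed for several samples in a row.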


Thanks!!


This will definitely be my interview question, and I expect the exact detailed answer!


uh, "The proxy server is refusing connections" to plus.google.com (corporate desktop) so, I don't know.


He didn't explain how DOCSIS works.


Technologist. What a stupid word.


Just another (linux) fanboi.


JBQ BEOS! WOOOOOOOO!

Sorry, couldn't resist



