
I don't think what pg means by "hackerly culture" is any different from what you said. It sounds like Apple is doing exactly what pg tells hackers to do: make something people want. "Hackerly culture" to me implies that you have people ("hackers") who are able to explore the space of possibilities and invent new things rather than just building to some businessman's specification. It's not essential to release a bunch of experimental Google Labs-style projects onto the world to have a hackerly culture--Apple just aborts more of its false starts in the womb. (Steve Jobs said in an interview once that one of the things Apple does best is deciding which of their projects go forward--they had a promising PDA a few years back but stopped development prior to release because they didn't think the PDA market was worthwhile at the time compared to portable music.)



Down that road, 'hackerly' eventually gets redefined as 'everything we consider a good thing'. It's OK for Apple not to be 'hackerly' while still being able to develop great products and software.

My girlfriend studies industrial engineering and knows quite a bit about innovation and product development. If I combine everything I've heard from her with everything I've heard about Apple, I conclude that Apple simply excels at leveraging and executing all the well-known 'best practices'. Nothing 'hackerly' is required to do innovation and product development properly, unless you expand 'hackerly' to include 'doing business the way it should be done'.


Well, here's what I mean--pg's essay criticized Yahoo for putting product managers in charge of dictating specs to programmers, instead of the programmers being able to think for themselves. I don't get the impression that's the case at Apple.


Agree 100%. Just adding:

The "dictating specs to programmers" practice seems to stem from the silly notion of applying manufacturing processes to software development. In the view of companies with this notion, programmers are little more than unskilled translators - the parallel of line workers in manufacturing jargon.

In my experience, some of the worst abusers of this "square peg/round hole" paradigm are internal IT departments.

I read a study in IEEE Computer sometime in the summer of 2001 or 2002 that showed how poorly Waterfall worked for software development. Does anyone know of a study on the application of manufacturing processes to software development?


I may be wrong, but I was under the impression that the practice stemmed largely from older days, when "programmers" were people who translated specifications into machine code or assembly, and programs were actually written and specified by higher-ups--essentially, the days when programmers were more like human compilers than what we call programmers today.


I can't speak to early computing, but starting around when business applications emerged (the '60s/early '70s?), developers were very involved in gathering requirements, designing, coding, testing, etc. Shortly after I entered the field, circa 1992, I started hearing about software development as an assembly-line process. The developer role started being split up: BA, architect, coder, sometimes dedicated QA staff, etc. Whenever I've worked on projects with teams split out this way, they've become a nightmare of communication overhead and miscommunicated requirements.


The term "hacker," if this discussion provides the definition, seems more and more nebulous to me. Next we'll start hearing about "rockstars" (god forbid).



