
When I started working for Apple (about five years ago now) it was my dream job (I had a serious case of fanboyitis). New hires got a security orientation, which was your basic "We spend lots of money protecting things; don't do anything lame."

But the reality of working there makes the magic unicorn dust wear off. I saw a bunch of prototypes and I knew a bunch of information that nobody else 'knew' but lots of outsiders suspected. I became more interested in my work than in selling other people on the idea that I had secrets I could pass on to them. (We were also told to 'eat our own dog food,' which makes me highly suspicious of every Software Update to this day. I always wait a couple of weeks before installing anything new.)

That's an important part of the equation. If your employees are more interested in leaking information than in working, maybe you're doing something wrong. (Just to clarify: I still consider Apple a great place to work; I left because I wanted to do something different, not because I felt it was a poor place to work.)

[Edited for grammar]



Surely if you were encouraged to 'eat your own dog food,' that would mean people inside Apple were testing stuff themselves, which would mean eventual public releases would be better?


You might think so. It would be hard to test this empirically, since I would need an alternate universe where people inside Apple did not test unreleased software on themselves before releasing it. But seeing the process that goes into deciding which fixes go in and which don't has taught me NOT to update unless I have a real need to. I saw lots of examples where a fix for one thing broke something else that was seemingly unrelated, or where a feature I liked depended on a bug that was going to be fixed. That's why I wait on Software Updates. And if I don't need the Update, I don't install it.

The problem with 'eating your own dog food' is that it is dog food, so it's really hard to eat. To comply with this edict, I would set up a separate machine, install the latest build, play around with it for about 15 minutes, and then spend half an hour (or more) filing bugs. Thus, I wasn't really eating my own dog food...

Installing new builds turned out to be a big time sink. It took time for the build to install, and then you had to go through the paces of setting the machine up. You also had to do a clean install because of incompatibilities with previous releases.

After all of this, you started testing. Then you started finding bugs, and since I was in engineering, I tried to make sure each bug was reproducible and even tried to track down which component was responsible (which is a mess if you're dealing with overlapping stacks of software). THEN you had to crawl through the maze of reported bugs to see if something close had already been filed... Hopefully they've made the bug-reporting system better since I was there (but I doubt it… the system I used had been in place for a long time).

On top of that, I needed to get actual work done. There was no way I was truly going to use an ever-changing system of dubious quality to do that work. I tried, and it was too difficult (although I knew some folks who were faithful and did so).

'Eating your own dog food' is an OK idea; it's just not clear to me that it was the best idea for this particular situation. As a general practice, it seems that if your product's scope is small enough, it works well. If it's larger, hiring a set of dedicated test engineers would be awesome. (The ironic thing is that Apple had those too!)



