I did one of the code reviews for the Pac-Man doodle (and actually, the part I was responsible for reviewing focused specifically on sound, so this was kinda my bad). We had an early sneak peek: Marcin was a UX designer for what was my 80% project at the time, so he'd passed it around the office looking for feedback.
One of the factors we'd identified in the post-mortem was actually the difference between Google culture and the rest of the world's culture. When we heard strange whirring noises coming from nearby computers in the office, our first impulse was "Woah, that's really cool! We're seriously going to launch this on the home page? Can I help?" Meanwhile, at many other offices, if you hear strange whirring noises from an employee's computer, the reaction is "You're fired!"
Similar cultural mismatches have been responsible for a few other gaffes, eg. Google Buzz was a huge hit internally because pretty much everything inside Google is (was?) public anyway and people are very tolerant of different opinions, and it never occurred to us that people could be seriously harmed by others' knowing details of their lives.
> it never occurred to us [Google] that people could be seriously harmed by others' knowing details of their lives.
This explains so much.
EDIT: A year ago I wrote this, and it seems I was right:
> Google is self-selecting to employ those people who are fine with the idea of Google recording everything they do. This does not bode well for the rest of us.
It actually was far worse in the past than it is now, both because Google management does learn from their mistakes and because Google is much bigger now and so they employ more people from marginalized backgrounds.
I'd say that peak "all your data are belong to us" was back in 2010, before the launch of Google Buzz and various location/wifi privacy flaps in Europe.
I would be more worried now about the same dynamic existing in every other startup that takes off. When a startup is on a break-out trajectory, it has its pick of applicants from a wide number of elite institutions. Most of them will never have experienced powerlessness or marginalization, because that's what it takes to get into a hot startup. And so they generally won't be able to empathize with what it's like to have arbitrary fact X in your background taken and used against you in devastating ways, even if they do genuinely sympathize.
> I would be more worried now about the same dynamic existing in every other startup that takes off. When a startup is on a break-out trajectory, it has its pick of applicants from a wide number of elite institutions. Most of them will never have experienced powerlessness or marginalization, because that's what it takes to get into a hot startup. And so they generally won't be able to empathize with what it's like to have arbitrary fact X in your background taken and used against you in devastating ways, even if they do genuinely sympathize.
That's not "what it takes to get into a hot startup"; it's a self-selecting property that people reinforce, consciously or otherwise. You can and should make a conscious effort to counter that bias, and leaving aside all the other reasons you might want to do so, you'll counter the monoculture that can lead to "let's build things people like us want" (which fails if it becomes "and almost nobody else does").
I don't disagree, but at the same time: who gets hired by hot startups is not something under my control, and they have legitimate reasons (mixed in with illegitimate ones) for preferring people with a solid track record and name-brand affiliations.
IMO, this is a huge part of why so many startups fail so badly.
Google Glass might have caught on as a $50 toy, but at $1,500 it's firmly in the realm of things people working at Google would buy, and that's about it. What's really interesting IMO is that they could have made a $50 version of the basic concept, but it was simply not on anyone's radar.
In theory, these blinders are why startups can show up and eat incumbents' lunches. But when startups are drawing from the same talent pool, they're making the same kinds of mistakes.
Google Glass was ridiculously priced in an attempt to segment the market to only people who wanted to buy it to build a hobby/business with. They knew it was not ready for mass consumption and tried to use price and availability to signal accordingly.
It's been 2 years and nothing, so clearly, whatever the intent, it failed.
In a few years a cheap version of the idea may start making the rounds, like the mini Segways. But again, when it was obviously going to fail at that price point, why continue?
It didn't actually completely fail - Google Glass saw some genuine excitement among certain professions (eg. surgeons, pilots, or firefighters) that need up-to-the-minute information in a heads-up display, and it's been re-launched quietly to focus on those niches. Ironically, the fix will likely be to increase the price: it's not a mass market product, it's an enterprise product that increases the effectiveness and safety of certain highly-trained specialists.
I think the broader issue is that too many in the SV tech scene are utopians at heart. They just want to improve lives, but forget or ignore potential downsides to what they are building. Not sure if it's an age thing or a background thing, though.
> Similar cultural mismatches have been responsible for a few other gaffes, eg. Google Buzz was a huge hit internally because pretty much everything inside Google is (was?) public anyway and people are very tolerant of different opinions, and it never occurred to us that people could be seriously harmed by others' knowing details of their lives.
This is really disturbing. Imagine a major Google product launch failing because no one on the dev team had thought of testing it on hardware, OSes, or browsers other than their own laptops. I can't imagine that would ever happen; no one would allow such a comprehensive gathering of idiots to putter along unsupervised on a major product.
So how is it possible that Google existed for over a decade before it even occurred to anyone to consider the users' safety and privacy? It doesn't make sense to me; I don't understand.
This is a general trend, not just at Google. People are eager to make something technologically failure-proof; they can come up with all the technological problems that could happen.
People are horrible at coming up with social failure cases because we're often blind to such issues that others face.
Is it really that bad? I'm not personally familiar with US office culture, but it seems like an exaggeration. Why would a boss throw away an otherwise functioning employee over something so minor? Why would someone choose to work under such a punitive regime?
It depends where you work, but in some places, it can be. Employees in some industries have zero leverage over their employers; they're complete commodities, so if you do something that the boss doesn't like (whether it be getting in late, questioning his judgment, or spending time on Facebook during the workday), you're fired.
I've been lucky enough to never have such a boss, but then, I work in tech, where the supply & demand equation has been on the worker's side since I started my career.
Aren't there any employment laws surrounding this? I find it hard to believe someone can be sacked just like that, unless they do something really bad, like assault someone or commit a white-collar crime or something.
Never mind rights or leverage, do they care nothing about the working relationship with their employees? The work environment I'm used to is much more collective. If there is a screw-up, it's the team that screwed up, and we fix it together. If it's the same person who causes the team to screw up again and again, it would be a different matter, but when does firing someone over a mistake solve a problem? And if a colleague was let go because his PC started making Pac-Man noises, most of the team would probably quit the same day. I can't even imagine it happening. More likely the boss would joke about it: "Hey, wobbleblob, it sounds like you're not very busy at the moment. In that case I have a nice chore for you."
I'm certain the "You're fired!" is an exaggeration. You would rarely get fired over something so minor here.
Firing is difficult here and is only done for the most severe offenses. You could be "laid off," but you'll never be told exactly why (as you could then sue for improper termination), and even then, that involves a lot of paperwork, and the boss won't sign up for that instantly after hearing a noise.
I have had my supervisor give me talks about proper workplace conduct for these kinds of things though.
I am of the opinion that web sites should never make noise automatically. Never, no matter what site. I often have my sound on without even realizing it. Nothing is more disturbing to me or those near me than me going to a site and having stuff come over the speaker. I almost always close the tab immediately. Please never autoplay anything.
I routinely open a bunch of tabs with interesting things, then read them. YouTube included - I'd like to be able to open a bunch of videos, then watch them one at a time in the tabs.
I don't want all of them to start playing at once.
Interesting, thanks. I find that very surprising, as YouTube now behaves differently across browsers. I'm surprised Google would make this the policy for Chrome, and not make the change in YouTube too; it's much better.
Not just infuriating, but potentially dangerous. If a website uses my audio output to play a sound at 100% volume without my consent while I have my headphones turned up high to hear a subtle recording, I might get permanent hearing damage.
Every time the concept of a "Web Bill of Rights" comes up, I want Article I to be: no use of audio without express user consent. That consent might be stored for a time on a per-domain basis, but only for 1st party content.
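As a rough sketch of what that per-domain consent could look like (purely hypothetical: real enforcement would have to live in the browser, not in page script, and the storage key and helper names below are invented for illustration), a page could refuse to start audio unless the user has explicitly opted in for that origin:

    // Hypothetical per-origin audio consent gate (TypeScript, browser context).
    // Key name and helpers are illustrative only; the browser would need to
    // enforce this for it to mean anything.
    const AUDIO_CONSENT_KEY = `audio-consent:${location.origin}`;

    function hasAudioConsent(): boolean {
      return localStorage.getItem(AUDIO_CONSENT_KEY) === "granted";
    }

    function grantAudioConsent(): void {
      // Only call this from a direct user action, e.g. clicking an
      // "enable sound" prompt.
      localStorage.setItem(AUDIO_CONSENT_KEY, "granted");
    }

    function playIfAllowed(audio: HTMLAudioElement): void {
      // Silently skip playback if the user never opted in for this origin.
      if (hasAudioConsent()) {
        void audio.play();
      }
    }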
This seems at odds with the conventional wisdom around here RE: adblock. You're simply requesting bytes from a server, and what your computer does with them is up to you (and therefore your responsibility). You should probably configure your system so websites have no ability to autoplay.
YouTube shouldn't autoplay videos. I especially hate it when you click on a channel profile to check out what they're about and some promo video starts up at max volume (my computer happens to be sensitive to sound differences, too). At any rate, YouTube should only play a video when the user presses the play button.
The real problem was autoplaying the sound -- they should not have auto-enabled sound... if they were worried that users wouldn't know there was sound, they could have put in a prominent button to turn it on, but auto-sound sucks.
There are plenty of ways to trigger it accidentally, like loading Google, then walking away from your desk, or doing a Google search in a presentation and pausing to talk about something.
I might expect an autoplaying video with sound at a news or social media site, but I certainly wouldn't expect it from Google.com
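To make the "prominent button to turn sound on" idea from above concrete, here's a minimal sketch of gating playback behind an explicit click. The element id and general setup are assumptions for illustration, not anything from the actual doodle; conveniently, starting an AudioContext from a click handler also satisfies modern browsers' user-gesture requirement:

    // Minimal sketch: the page stays silent until the user clicks an explicit
    // "enable sound" button. The element id is made up for illustration.
    let audioCtx: AudioContext | null = null;

    async function enableSound(): Promise<void> {
      if (!audioCtx) {
        audioCtx = new AudioContext();
      }
      // Resuming inside a click handler satisfies browsers' user-gesture rules.
      await audioCtx.resume();
    }

    document.getElementById("enable-sound-button")?.addEventListener("click", () => {
      void enableSound();
    });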
I remember this day, which was fairly close to when I was hired at Mozilla. What this looked like from the Mozilla side: several bugs filed [1]. Support forums had hundreds of questions about this [2][3]. The data from input.mozilla.org is gone or archived offline, but I recall that blowing up as well. I vaguely recall this doodle causing a graphics crash on a specific set of hardware as well (though maybe that was a different doodle).
I love these kinds of stories. They're like the programmer's version of a detective story, with the twist that you always somehow learn some weird obscure fact about people or the Internet:
- A web browser plug-in loads Google's homepage in the background.
- There's a weird standard to writing Polish characters with diacritics.
- Some Windows systems carry around this decades-old bitmapped font.
I think it's similar to why I love those lists titled "X myths that programmers believe about Y". It's insane how varied the world is and how that comes out when we deal with as many domains as we do when programming.
I really don't like the last step of the post-mortem, "Who is to blame?" I much prefer "How did the system fail?" We want engineers trying crazy things and hitting bumpers when they, for example, autoplay sound. The alternative is being hesitant about trying crazy things.
I mean, yeah, personal responsibility, but Google already has a preflight checklist with code reviews and tests and such. Treating it as a systemic feature protects a lot of stuff you want.
1.) post-mortems at Google don't blame people, they blame processes, and then they suggest ways those processes could be fixed.
2.) I know Marcin personally and I've never known him to blame anyone either. And indeed, he didn't: the takeaway from the article was that the complexity of the web was at fault.
My best guess is that he included that because many other organizations who do post-mortems, who he might want to reach with his writing, do think in terms of "Who is to blame?", and addressing it explicitly may be better than leaving it unanswered.
Where did this title come from? Not only is it not the title of the post in question, but it's quite inaccurate -- the article doesn't even come close to claiming that anything "almost killed" the doodle.
With that said, I clicked the click-bait title, and I enjoyed the article. (shrug.)
Indeed. I'm not really sure why the author went with "the complexity of the web" as the real culprit. It was Google, and specifically this guy's team, that was at fault. They put auto-playing sound on one of the most popular webpages in the world, and as he noted, this page had never made sound before.
The rest of the article was an interesting read, but not fully taking the blame seems wrong to me.
An unrelated aside concerning the blog's author, Marcin Wichary: he's also a photographer, with a number of wonderful photos of historical computer equipment.
I met Marcin at SXSW back in 2011 where he talked about this project (and other doodles). Some more info is here http://searchengineland.com/behind-the-scenes-with-googles-d.... Also invited him to come speak at Microsoft to share more about his work (some interesting internal debates about a Googler coming to talk at MSFT ;)
IIRC it was the bug that allows you to pass through ghosts. I vaguely recall learning about it because of something the author wrote when this was released. Apparently speedrunners and the like have memorized patterns that exploit this if you can play with perfect timing.
Just sayin' that we see this as the first of a series of amazing stories about bugs -- ones that are so vexing, with fixes so illuminating, that telling the story casts light beyond the specific problem... If you got 'em, respond to this piece on the page!