Hacker News

> Most of the time the truth itself is not super critical

You heard it here first, folks.

edit: (added because it's now hard to read above)

> I work for Google.




Okay, you got me. I accept the downvotes. You've missed the point and taken it out of context, but I made it way too easy for you.

So I'll rephrase: In the context of accomplishing a certain task, the accuracy of some information is crucial and the accuracy of other information is not. For instance, if I'm looking to buy a book about Andrea's experience, I care that she's a woman, that she lives in a western country similar to my own. It doesn't particularly matter to me, in this context, what she looks like, so if I see the wrong photo - it's no big deal.

Obviously, the truth matters. And Google can't know what context you care about. But the only way to never be wrong is to never show anything at all. In other words, if you try to be useful and aggregate and show any information at all, you're going to be wrong here and there.

Yes, Google should continue working and improving and getting to as high an accuracy as possible, but it's never going to be 100%.


> You've missed the point and taken it out of context

1. I find the original quote, at least as presented without additional context, pretty difficult to see as anything but nonsensical. You following it up with "Honestly, I think he's right" and "Most of the time the truth itself is not super critical" is leaning into the same nonsensical position, even if you try to cover it with the additional context that, well, actually there's still useful information.

2. This post is specifically about a woman (or in fact several) who have been harmed by this notion of truth not being critical. And you're doubling down on this with a not-so-relevant view centered around how it's benefiting you, and not about the collateral damage. No one is saying perfect information is possible, but at the very least Google needs to be responsible for negatively affecting people's lives, and it needs to provide an easy and reliable way to address situations like this.


> So I'll rephrase: In the context of accomplishing a certain task, the accuracy of some information is crucial and the accuracy of other information is not. For instance, if I'm looking to buy a book about Andrea's experience, I care that she's a woman, that she lives in a western country similar to my own. It doesn't particularly matter to me, in this context, what she looks like, so if I see the wrong photo - it's no big deal.

This is a cherry-picked task. If we switch the task from "I'm looking to buy a book about Andrea's experience" to "I'm looking to know more about Andrea, including what she looks like" then the Google response is worse than an irrelevant one: it is a definitely wrong answer.

And our new task isn't something unlikely or fanciful: there are surely people who have read her book who want to know more about her and who might google her.

Google will surely remove this as they should.


In that case, the information itself isn't important either. If it's wrong and it doesn't matter, why is it even being displayed? By definition then, the only people wrong information matters to are the ones it affects. "It's no big deal" to you, but she has been working for years to get this fixed.

> the only way to never be wrong is to never show anything at all.

How much error does this excuse? Any amount? How's this even helpful if you don't draw that line?

And it absolutely doesn't justify a lack of recourse when it does go wrong.


> In the context of accomplishing a certain task, the accuracy of some information is crucial and the accuracy of other information is not. For instance, if I'm looking to buy a book about Andrea's experience, I care that...

This totally disregards that many people search for stuff without any specific "task" in mind at all, and, even more so in controversial contexts like this, that quite a lot of people probably search with the explicit intent not to contribute to the author's happiness or cash box.

> Yes, Google should continue working and improving and getting to as high an accuracy as possible, but it's never going to be 100%.

What Google desperately needs to work and improve on, far more than mere accuracy, is some goddarn freaking humility: When you know you're not 100% sure to be presenting accurate results, stop pompously calling them "knowledge".

Try, for a change, not to be evil.


It's no big deal to you. It's a big deal to the person whose life was just ruined!


> For instance, if I'm looking to buy a book about Andrea's experience, I care that she's a woman, that she lives in a western country similar to my own. It doesn't particularly matter to me, in this context, what she looks like, so if I see the wrong photo - it's no big deal.

And yet here we are in a thread about a woman whose life is being very negatively impacted by the misinformation.


Well if even the average Google employee has that opinion then we truly are doomed.

It's no wonder we've got ourselves into this mess when we allow people with those kinds of opinions to run a search and marketing company.


Ok, I'll repeat.

The truth matters. It matters ten times more when you're Google, where many, many people go to get answers for questions, and it's a deep abuse of trust when Google shows lies.

I phrased my opinion badly, and I was wrong to do so. Mea culpa.

Also, I'm a lowly engineer. I don't manage anyone, and I don't "run" anything, except, you know, unit tests on my machine and stuff.


You write code that runs in that ecosystem. If it makes you feel better then convince yourself you're not responsible.

Doesn't make it any less bullshit.



