This is the result of not focusing enough on trust, and Google is going to have more of this problem. Google really struggles to understand the compounding effects of trust and perception, and to deal with them.
By now they should have taken some concrete privacy-enhancing steps, short-term moves perhaps, but good for long-term trust, since adoption of GCP, G Suite, and AR will depend a lot on how much they're trusted. Sundar is really coming off as weak in this regard.
I think they have taken privacy-enhancing steps, though public perception isn't easy to change. A lot of people (including HN users) still mistakenly think that Google sells user data, or that medical history is being used for ads, etc.
Disclosure: I work at Google, but my opinion is my own.
Yeah, I know they have: improving permissions in Android, making data deletion and the like more visible. And I don't think they sell data. Personally, as a student, I think they've provided incredible positive value to my life. But I wonder how GCP and Google Health will win big customers, because Google has clearly abused its power in certain areas.
That might be true now, but who knows where your current employer will be in 10 years, and whether by then they even remotely match your current values. We are already seeing cracks forming, and that makes it difficult to trust them with our most sensitive data. It was easier 10-15 years ago, not so much these days.
Personally, I do think that Google handles user data well. Some people here might get triggered, but I don't have a problem with it. I am talking about the perception problem Google has among big companies, which will impact GCP and Google Health.
Though frankly, I hope these companies now realize that Amazon isn't some trustworthy alternative either (as one corporation in the article seemed to think). Amazon was in a honeymoon period for the past 3-4 years; now, with facial recognition, Ring, etc., the attention has shifted to them.
> That might be true now, but who knows where your current employer will be in 10 years and if they by then even remotely match your current values.
You’re just cutting the cord of falsifiability here. Because none of us can see 10 years into the future, you’ve made it impossible for anyone to disagree with you or even have a reasonable discussion. It’s an intellectually dishonest position to take.
Not really. Google has a track record of collecting data on people, in addition to signs of tremendous changes in its corporate culture; it's not unreasonable to speculate where they may go with that in the future.
Collecting data is not bad; not being transparent or trustworthy is bad. Now, while these health deals are going on, the companies worry that Google will claim the data or do something nefarious. And even though I am a Google admirer (my enthusiasm has waned over the past 2 years), I also fear that they'll keep making BS moves. My disappointment with Google is mainly due to the cultural changes and the mismanagement, because the culture was what attracted me to them back in grades 9-12. Now, in college, I keep getting frustrated by the stupid product decisions in many of their apps (forget messaging, the reminders in the Google app/Assistant are a mess). It seems like some teams are really incompetent.
But somehow I am really unimpressed by Pichai. I know Pichai, Cook, and Satya are the executive types, but of the three, Pichai seems to have the least vision. In every interview he gives, it's like he's being very careful, with no enthusiasm at all.
> Cerner ultimately accepted a less generous offer from Amazon, in part because the company decided Amazon was more trustworthy on security, according to one of these people.
Well, I don't know if going with Amazon is any better than going with Google, since both of them will soon be prominent players in healthcare technology. But it is more than clear that as Google obtains more health data via hospital contracts, GCP customers, etc., that data helps its deep-learning systems learn more about you, and despite releasing interesting papers about its achievements, the company has closed up some of its methods, which limits reproducible science and research.
Think very carefully about the sensitivity of this health data. Do we really want the likes of Google and Amazon analyzing this data for commercial purposes and later getting away with it? I trust neither Amazon nor Google to act in the best interests of healthcare, but I guess Cerner chose the lesser of two evils and went with the weaker devil.
>>> Yet he is reluctant to allow people to opt out of Google’s core health-search tool. He likens that to a physician knowingly offering substandard care, he says.
>>> “If you believe me that all we are doing is organizing that information to make it easier for your doctor, I’m going to get a little paternalistic here: I’m never going to let that get opted out,” Dr. Feinberg says. “It’s going to screw up your treatment. We’re not going to be able to take care of you.”
Planet Money had an interview with Dr. Feinberg, the head of the Google health initiative. He was asked about people potentially opting out. His response was extremely paternalistic.
He said that he views everybody in the system as his patients, and he thinks this data aggregation and analysis from Google will improve care. So as "our" doctor he cannot morally allow any of his patients to withdraw themselves from the program because, to him, that would be akin to giving them inferior care.
So this guy gets to unilaterally appoint himself as our doctor? I certainly didn't hire him.
>>> He boasts about his $5 Wal-Mart fleece jacket, and is an astrology enthusiast. “I’m positive,” Dr. Feinberg says to a reporter good naturedly, if inaccurately, “you’re Sagittarius.”
He's also heavily into astrology and was asking the interviewer his sign and trying to guess it. I certainly would never hire a doctor that was into astrology. How can I fire this guy?
How TF do you become a doctor that believes in astrology?
I tried to search on their site but it didn't work out (I block JS, and couldn't figure out which whitelist selections would make search work without enabling tracking. Derp NPR, derp.)
I tried too. I was just listening to it on Friday on the Planet Money podcast on Spotify. It was definitely their voices too, not All Things Considered or anything, and it had Dr. Feinberg's voice too.
(lol, that comment is back down to -1 again. Apparently you cannot say anything bad about astrology on HN either.)
My dad’s a physician that believes in homeopathy because his religious guru wrote a book about it. He also listens to Eckhart Tolle. I hope they’re not connected.
>As Google has moved to expand its data collection, some potential partners have been put off by what they viewed as the company’s aggressive maneuvers to acquire data without providing enough information on how it would be used.
>Google pushed one medical-data manager not to share data with other companies, according to a person familiar with the pitch.
>As part of its huge offer for Cerner, whose software is embedded in doctors’ offices in 30 countries, Google used its size to its advantage. Google Cloud executives offered that other arms of the conglomerate would buy unspecified other services from Cerner, people familiar with the matter say.
>Cerner ultimately accepted a less generous offer from Amazon, in part because the company decided Amazon was more trustworthy on security, according to one of these people.
>Existing players in the health-care data market also fear that the tech giant will gain too much power in their industry. Some hospital and technology executives say they declined deals with Google lest it become a future competitor.
>“We could never pin down Google on what their true business model was,” says a Cerner executive involved in the discussions.
It seems like Google is using GCP as a Venus flytrap, and Cerner was right to avoid them. The lure is cheap, bog-standard cloud services, but only if you enter a more all-encompassing partnership where your data helps them develop other Google products.
>Reiterating what Google has told lawmakers and industry executives in private meetings over the past two months, Dr. Feinberg says he operates on a personal directive from Mr. Schmidt: “Don’t worry about making money.”
This quote is extremely telling in my opinion, and it's just classic Google. The most generous interpretation of this is that Google has altruistic motivations to make advances in health and they shouldn't worry about how to monetize whatever they build. That of course is completely absurd and extremely dangerous because that is exactly how Google slides into evil (evil is banal). The less generous interpretation is that Dr. Feinberg is only there to provide a sheen of health industry credibility (key opinion leader) and he has no actual ownership of Google's health business at all.
And judging by this paragraph, I'm leaning to the latter:
>Outcry over the Ascension deal, including a federal inquiry and objections from patients, shocked executives inside Google, and opened fissures in its top ranks over how to proceed, according to people with knowledge of the discussions. The head of Google Health, Dr. Feinberg, pushed to tell the public more about his division’s operations, but met resistance from longtime staffers who cite the company’s tradition of keeping potential new products under wraps.
They're sitting in a firehose of cashflow. They waste more in a day than most people earn in a lifetime. I was working as a contractor there and the Occam's Razor hypothesis I formed to account for the massive waste I saw was that they have so much money that they can afford to sequester talent. (Meaning, they can hire a bunch of people just to keep them from working for competitors.)
I know that's not true, but it's less upsetting than admitting they're just that wasteful.
> The most generous interpretation of this is that Google has altruistic motivations to make advances in health and they shouldn't worry about how to monetize whatever they build.
You either make money from your users or off your users.
De-identified (health) data likely isn't covered by HIPAA, and with a few still-generic data points associated with the health data, such as gender, age, and location, you could work backwards to an identity. There's likely a loophole in the whole "security of health data" thing.
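To make that concrete, here's a minimal sketch (entirely made-up data and column names, nothing from the article) of how few "generic" fields it can take to single a record out:

```python
# Sketch: how often do a few "generic" fields uniquely identify a record?
# Hypothetical data and columns, purely to illustrate the re-identification idea.
import pandas as pd

records = pd.DataFrame({
    "gender":    ["F", "M", "F", "M", "F"],
    "age":       [34, 34, 71, 29, 34],
    "zip3":      ["606", "606", "303", "941", "100"],  # coarse location
    "diagnosis": ["asthma", "diabetes", "copd", "asthma", "asthma"],
})

quasi_ids = ["gender", "age", "zip3"]
group_sizes = records.groupby(quasi_ids).size()

# A record whose (gender, age, zip3) combination is unique can be matched back
# to an identity using any outside dataset that shares those same fields.
unique_share = (group_sizes == 1).sum() / len(records)
print(f"{unique_share:.0%} of records are unique on {quasi_ids}")
```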
I can't read it, but it sounds like a good thing. It's always a waste to see our smartest developers working on social media or advertising. Hopefully the teams at Google can improve our healthcare, which would be one of the most rewarding things they could work on.
Does it make sense for a tech giant employing lots of data scientists and existing health products (Google Fit) to expand further into Health? Yes.
Does it make sense for a company earning most of its money through advertising to keep the health data it processes completely separate from its core business? In the long run yes, but it could also make a lot of money in the short to medium term by breaking that commitment.
> they can also make a lot of money in the short to medium term by breaking that commitment
I personally don't see an incentive for Google to make a quick buck by sacrificing the long term. I see it as quite the opposite, since Google has a lot of cash on hand. See [1]
Article is paywalled. If someone can see it: is the premise that Google is harvesting patient data for their own AI/ML healthcare projects' benefit? Or is this just cloud storage (Cerner putting objects in a bucket)? Or both...
I find it disingenuous that the article refers to Google as a "tech giant". The implications are far more sinister if it is instead referred to as an "advertising giant", as advertising is what is driving Google to acquire this data.
Exactly. There is so much money to be made from healthcare services; it is a market and revenue stream totally separate from advertising, and it can provide great value to many people.
There are things one could be critical of regarding their acquisition of data, etc. but I don't think it is in Google's best interest to use the data for advertising.
Why would you trust a literal corporate mouthpiece? You think Zuckerberg cares about your privacy too?
No way in hell Google isn't going to use this for advertising. It's exactly the kind of huge free money that drives everyone else to collect data totally unrelated to their core business. Even if the intentions are benign now, there's no reason to trust they'll stay that way, based on Google's track record and the typical tech mindset.
Just imagine the kind of profiles you could build on people with medical data. Beyond targeted ads for drugs, medical devices, and the like, you know which people are sedentary, which play sports, who probably smokes, and I'd bet good money there are correlations between certain conditions and shopping habits, religious preference, and even political preferences. It's just begging to be clustered with the latest hot neural net.
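To be concrete (synthetic data and hypothetical feature names, not anything Google is known to do), turning such a joined profile into audience segments is a few lines of off-the-shelf clustering:

```python
# Sketch: cluster hypothetical ad-profile + health attributes into segments.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 1000
profiles = np.column_stack([
    rng.integers(0, 2, n),         # smoker? (hypothetically inferred)
    rng.normal(3000, 1500, n),     # daily step count
    rng.normal(50, 18, n),         # age
    rng.normal(200, 80, n),        # monthly shopping spend
])

features = StandardScaler().fit_transform(profiles)
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(features)

# Each cluster is a ready-made audience segment ("sedentary smokers, 45-60", ...).
print(np.bincount(labels))
```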
Wouldn't be the first time Google did something illegal or unethical or tried to keep it secret, like secretly trying to build a censored search for China, firing labor organizers, etc.
I was under the impression that HIPAA applies to anyone in possession of private healthcare information (medical information tied to personally identifiable information)
As that link states, HIPAA extends to "business associates". Google isn't a "covered entity" as that has a specific definition, but it is a business associate in relation to a covered entity (e.g. health care providers). The reason for the two different terms is because covered entities deal with patients directly, thus have certain rules that only apply to them in relation to interacting with said patients. Since Google is a business associate that does not directly interact with patients, only the rules specifically in relation to PHI and other business activities apply as others (e.g. patient interaction) are moot.
Protected Health Information is protected under HIPAA for both covered entities and business associates alike. Otherwise, HIPAA would be pointless if covered entities could just pass the PHI to business associates or shell companies unfettered.
Note: I would not consider myself a "HIPAA expert", but I'm a clinical researcher that has to ensure HIPAA compliance for my lab.
At least from the last time this came up on HN, it was about Google's software being used via a server actually owned by the partner hospital, and Google's access was "only via a few employees"; IIRC the data wasn't stored on Google's servers, or it was only stored via Google Cloud Storage (and I doubt they'd compromise security on GCP buckets for advertising purposes).
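For what it's worth, that kind of "only a few employees" access is easy to express as bucket-level IAM; a rough sketch with placeholder names (an assumption on my part, not the actual setup described in the article):

```python
# Sketch: restrict read access on a GCS bucket to a handful of named accounts.
# Bucket name and emails are placeholders.
from google.cloud import storage

client = storage.Client()
bucket = client.bucket("hospital-phi-bucket")  # hypothetical bucket

policy = bucket.get_iam_policy(requested_policy_version=3)
policy.bindings.append({
    "role": "roles/storage.objectViewer",
    "members": {
        "user:alice@example.com",  # the "few employees"
        "user:bob@example.com",
    },
})
bucket.set_iam_policy(policy)
```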
Right so how does "Tech company. Automation. Advertising." become "Google has a track record of collecting data totally unrelated to its core business and using it to inform advertising."
If there's a track record of this, you can give other examples of how Google has done that, right? Where they've gone out and collected data under the guise of doing something benign and then used it for advertisements.
The track record is that of them being morally bankrupt. Not a track record of having applied their moral bankruptcy to this particular scenario as of yet.
Tech company = making the world more digital. Perverting humanity. Moving us further away from our natural state. Ripping away any innate semblance of purpose we had.
Automation = Removing the stand in sense of purpose people had found in the modern world. Increasing inequality. Increasing the needed level of education to be productive.
Advertising = creates and reinforces wants. Makes people unhappy with what they already have. Causes them to then have to seek to be productive which is bad because of the previous two.
> Beyond targeted drug advertisement, medical devices, and the like, you know which people are sedentary, which play sports, who probably smokes, and I'd bet good money there are correlations between certain conditions and shopping habits, religious preference, and even political preferences.
Well, I can tell you that Google's advertising profile for me would already have a pretty good idea of all the attributes you listed, based on my search history, location history, Google Maps usage, YouTube usage, etc.
The healthcare market is ~$8.5 trillion globally [1]. Advertising is ~$1 trillion [2,3]. I admit that I don't know the exact quality of the data cited in these, but they do clearly demonstrate that healthcare is a huge industry. Given that Google wants another profit source besides advertising, I think it's just conspiratorial to suggest they would throw away the medical opportunity for some marginal advertising gains. Dragonfly leaked, and employees are otherwise frustrated with certain things; if employees start talking about what you describe, I'll listen to them. But you are just making senseless accusations.
>you know which people are sedentary, which play sports, who probably smokes, and I'd bet good money there are correlations between certain conditions and shopping habits, religious preference, and even political preferences.
They could already do this with data from Android, Chrome, Google Search, Maps, or YouTube.
> Even if the intentions are benign now, there's no reason to trust they'll be so later, based on Google's track record and the typical tech mindset.
I have never heard of Google doing outright illegal things. They make very liberal interpretations of the law in their own favor, but patient data protection is very clear and well understood so there isn't much room to play there.
1. How would you know? How many engineers does it really take, in a massive company with billions of dollars to waste and tens of thousands of employees, to break the law for profit in secret?
2. Do you really trust that the law is sufficiently protective? Do you believe it to be crafted for the novel dangers that come from mining aggregations of data and building individual profiles with next generation ML?
3. What stops Google from throwing its weight around to lobby for changes to the law which would allow it to use data in ways the law protects against now, once it has the data?
Sure, that's a lot of what-ifs. But this particular scenario is dangerous because it affects all of us, and it only takes one ambitious authoritarian to turn all of this adtech fun and games into the most comprehensive and fine-grained surveillance apparatus that has ever existed. Stalin, Lenin, Hitler, Pol Pot: just imagine the temptation of being one "lawful" seizure away from such power. That's what slimy adtech companies are building; they don't care, and the average person doesn't understand.
Google is really trying to diversify revenue, since they know global regulation of web-based advertising is coming (see YT Premium, YT TV, Google Store products, Fi), so I can see this being purely a way to enter the healthcare software business.
Rather, say that Google _took_ everything else. Few realized the value of what was up for grabs at the time, but by making life easy for web developers with Google Analytics, Google created a firehose of user data. By using Google Analytics, we gave the web away.
Where is this firehose of health data going to lead?
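Under the hood, every analytics.js/gtag hit is just one HTTP request to Google's collect endpoint, which is why the firehose was so easy to build. A rough Python sketch of a Universal Analytics Measurement Protocol hit, with placeholder IDs:

```python
# Sketch: a single Universal Analytics "pageview" hit via the Measurement Protocol.
# Tracking ID and client ID below are placeholders, not real values.
import requests

requests.post(
    "https://www.google-analytics.com/collect",
    data={
        "v": "1",              # protocol version
        "tid": "UA-000000-1",  # property (tracking) ID, placeholder
        "cid": "555",          # anonymous client ID, placeholder
        "t": "pageview",       # hit type
        "dp": "/some/page",    # page path being reported
    },
    timeout=5,
)
```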