Funny comments, but jokes aside, I think we're missing a very obvious reason why he has Signal. Mark most likely has contacts who are super high up in governments and corporations who use Signal, since they can't trust any other app. To the people who say he was just testing out the competition: I think you underestimate his technological knowledge and his resources way too much. If he wanted to do that, he could have easily ordered someone to get him a clean phone with a SIM card not linked to his name and let that device sit on his desk for testing.
He wouldn't need to use his real number/name just to do product research though. It would just needlessly confuse any contacts who have his actual number and use signal.
Signal also has nothing he can emulate. Its most meaningful selling point is that it doesn't have an association or integration with a scummy social media company, and people using it for that reason are already lost to Facebook. This is a feature he cannot ever hope to copy.
> Signal also has nothing he can emulate. Its most meaningful selling point is that it doesn't have an association or integration with a scummy social media company, and people using it for that reason are already lost to Facebook. This is a feature he cannot ever hope to copy.
Even if there is nothing to emulate, downloading the app at least gives him a feel for how close it is to WhatsApp's capabilities and would help him understand if it is a major competitive threat or not.
An even more obvious answer is he doesn't want his employees or his competitors to be able to monitor his private conversations which they would be able to do if he were to use most commercial messaging systems.
They allegedly can't monitor WhatsApp. That's why this is news. If WA is so secure (supposedly implementing the Signal protocol), then what use does ZuckerB have for Signal?
Maybe you're right, but considering how easy it is to set up a Signal account (<5 minutes) vs. messaging your assistant to provision a phone and a SIM card (or a couple of them, just so he could text someone), I'm not so sure about it.
Assuming that his threat model is "my phone number will not be leaked", signing up on Signal with his personal number seems fine.
Yes, and, just a guess: Were I on Team Zuck, I'd have every competing messaging app installed, just to see what everyone's doing. eg I'm certain everyone touching Instagram also uses Snap.
Further: LAN Manager-era Microsoft had a Novell NetWare group, continued to use a Windows port of sendmail while selling Exchange, and had an entire Macintosh-enthused business group.
... which should be taken as a sign of Signal's technical excellence.
Just consider: if you're looking for privacy-respecting, actually secure, reviewed, audited, tested-to-the-bone tech -- short of replicating the work yourself, whose assessment can you trust? Perhaps paradoxically, you should foremost trust the unspoken opinions of criminals, because those people are likely betting their lives and freedom on the tech choice. Barely any other group will approach such a choice with more vigor.
So, morals aside, that a technology is being used by criminals is actually a compliment to that technology.
A number of my friends and I use it for cross-platform group chats since we have a mix of android and iOS users. It’s a convenient messaging app, the privacy is a bonus for most of them...
Is it really surprising that he would have all of the most popular messaging apps? Facebook’s playbook of copying popular features from apps gaining traction is by no means a secret.
I’m not totally convinced by the narrative that Mark doesn’t trust his own tools so he prefers Signal.
Yeah, it is likely that he was just checking up on the competition. I wouldn't be surprised if he turned the narrative on this by saying something like the following:
"I was curious to check it out.. it wasn't very good, so I reverted back to Messenger/Whatsapp."
IIRC, he made similar comments when he was spotted using G+.
Disclaimer: My views are my own (and not necessarily shared by my employers).
It doesn't seem like you are aware that WhatsApp hired Moxie Marlinspike, the creator of the Signal encryption protocol, to re-implement it for WhatsApp. AFAIK it is still used in WhatsApp today.
I will take this moment also to mention that "re-implement" isn't exactly right in that they modified the protocol slightly to allow for someone in control of the administration server to change a user's private key without their knowing, so that the admin can decrypt the E2E communications using the known key.
> I will take this moment also to mention that "re-implement" isn't exactly right in that they modified the protocol slightly to allow for someone in control of the administration server to change a user's private key without their knowing, so that the admin can decrypt the E2E communications using the known key.
I think that is a very charitable assumption about the GP claim. The linked article describes a very specific implementation vulnerability around handling of offline messages, which would appear to be rooted in user experience being ranked higher than operational security by WhatsApp (understandably). In this case it also notifies the user once they are back online, and the original phone is logged out, alerting the compromised user.
The GP claim is far broader: that all E2E communication can be compromised without user awareness, permitting ongoing communication between two unaware parties to be monitored.
> I think that is a very charitable assumption about the GP claim. The linked article describes a very specific implementation vulnerability around handling of offline messages, which would appear to be rooted in user experience being ranked higher than operational security by WhatsApp (understandably).
Both points (security vulnerability and user experience prioritization) can be true simultaneously. This is the root of all plausible deniability when it comes to installing vulnerabilities in technologies.
I don't see why we should care at all about WhatsApp's intentions with the change when the effects are so pernicious. Facebook et al. definitely do not deserve the benefit of our doubt anymore.
This is true. But I would suggest your operational security has bigger issues than this potential vulnerability if you are using WhatsApp.
Regardless - you still haven't given a source for your original claim. "Not deserving the benefit of the doubt" does not qualify. If the linked article is in fact your source, then in the future please do not exaggerate such claims as you have done. I would have expected a claim from the article to read (along with a link to the source!):
> WhatsApp has modified the protocol slightly, ostensibly for user experience, but this allows a third-party attacker to intercept messages sent while the recipient is offline, only alerting the sender after they have been disclosed.
> they modified the protocol slightly to allow for someone in control of the administration server to change a user's private key without their knowing, so that the admin can decrypt the E2E communications using the known key.
Many people who are incompetent with technology have a weird, paradoxical attitude towards apps: they will install random apps without thinking twice and have loads of apps on their phone that are basically just mobile web sites, but when it comes to messengers it's the opposite, and they feel overwhelmed by having more than WA, FB and IG.
On the other side of the spectrum are people who won't just install some random app without due diligence, pass on appified websites, and don't care how many messengers they have on the phone because it doesn't really matter anyway, due to the reactive usage pattern. I suppose MZ is like that.
If he really wanted to do that, he would have several phones with separate numbers for exploring other messaging apps, so as not to cross business and personal matters.
(1) You can't always trust the opinions of 100 developers, even if you can somehow consolidate them. Certain things are best seen firsthand, even if you have opinions of top 20 people you trust.
(2) Lugging multiple phones is clumsy enough that most people won't consider it, most CEOs, even more so. Two is somehow tolerable. And you need competitors' apps always available, to try things while you have an idea, and to easily compare.
Everybody should have a shutter over the camera, just to protect from blunders and misclicks.
No matter how much you trust your software, some malware may want to activate it, especially on a high-value target like Zuckerberg (or Bezos, or Nadella, etc.).
Why not cover your camera? Some MacBooks are so delicate that Apple itself warns stickers can break them IIRC¹, but to me that’s just one more great reason never to buy a MacBook. For other cameras, if someone can look into my windows, they can also watch a lock-picking video on YouTube and IIRC learn that something like ≥70% of American homes have locks from the same 1–2 brands, and break in easily, but I still close curtains, because why not?
Don't know about laptops but on recent desktop motherboards the jack detection and output/input switching is done in software. Probably the case on many laptops too. So this approach shouldn't be considered a "hardware switch" as it's still possible to capture audio from another mic even when a headset is plugged in.
On both of my XPSs I can switch to the internal or an external mic regardless of a headset being plugged in, so the snarky suggestion to use a cut-off headphone jack is misguided and won't work on many modern laptops.
“Zuckerberg reportedly took action after he learned that a developer wanted to purchase one of his neighbor's homes and use the fact that Zuckerberg lived close by as a marketing tactic. ... Zuckerberg will lease the four homes he just bought back to its current residents.”
Not sure how this paints Zuckerberg in a bad light. The bad guy seems to be the developer who was going to use his fame to advertise, thus leaving Zuckerberg with way less privacy than a normal person. This was Zuckerberg's way to eke out a bit more privacy in the vein of a normal human. I mean, he left the people in their homes.
Not really; it just kind of proves the point. He didn't want people selling houses near him as "buy now to live next to the CEO of Facebook" -- that sounds like a HUGE safety problem for him AND his family.
Rich people have existed before today, you know, and yet they didn't do such things.
> sounds like a HUGE safety problem for him AND his family.
You watch too many action movies. No one's going to spend the months of background checks and negotiation and tens of millions of dollars it would take to buy a mansion next to Zuckerthing's just to kidnap him or something.
The threat to rich people is from career criminals, not other rich people.
Maybe I should have said he sounded like a feudal lord, it doesn't matter to me. I wasn't diving into the circumstances, just the phrasing of that last sentence I quoted.
He literally owns the property of his neighbors in order to make his living situation more comfortable. It's sort of like a king granting lands to lords he approves of.
Also, what's to stop him from kicking those people out if he changes his mind? I doubt Zuck would ever write an honest contract that held him accountable for anything.
Like everything else in a free economy, it was a consensual transaction. If Zuck offered me $1 mil for my small apartment and said I could live in it, I'd take the deal. He purchased their homes and then leased them back to them; what's the problem here, other than how it can be construed to "sound bad"?
That's how America works. It is a feudal system where the rich buy the laws, gated communities, college admissions, genes, designer babies, and pandemic vaccines they want.
Interesting aside. Back on topic: the homes were not granted to Zuckerberg; and his residents are not serfs. So they are not "perfectly comparable," they are barely comparable.
(not to mention serfs could not be bought alone, they could be sold with the land. Maybe you are thinking of slaves?)
So the transfer of state-owned assets to oligarchs is significantly comparable to buying four houses and renting them back to their owners? Do you even know what happened after the fall of the Soviet Union?
Man, you're reading too far into my comment. I was just saying the sentence reads like:
A wealthy individual (WI) comes to his neighbors.
WI: "I'm worried your homes might be bought by individuals that affect my privacy, I know you don't intend to move, so let's make a deal that you can still live here and pay rent."
Neighbors: "Ok"
Random YC Comment: "(WI) left them in their homes, rather than reneging on his deal and casting them out of their familial homes to the street."
----
A Feudal Lord (FL) dies and his heir receives the fiefdom.
FL: "You are now my serfs and I've decided you may continue to live here and pay me a percentage of the crop yield, I am a generous lord."
Serf: "OK, m'lord"
Random YC Comment: "(FL) left the peasants in their huts like the generous lord he is."
---
A state industry has been privatized and is now owned by an Oligarch, to include the housing provided to the workers.
Oligarch: "Party land now my land, you may continue to live and work here to provide me profit, I am a generous businessman."
Worker: "ХОРОШО" ("OK")
Random YC Comment: "The Oligarch left the workers in their housing, as long as it still provided him a profit."
> after he learned that a developer wanted to purchase one of his neighbor's homes and use the fact that Zuckerberg lived close by as a marketing tactic
That's just... creepy from the developer.
Also, how is living next to Mark Zuckerberg a perk? Unless you want to build an illegally zoned startup incubator focused on IoT, and the main acquirer you're targeting happens to live next door and notices your product every day...
finding the homes of the rich and famous is absolutely trivial.
>I mean he left the people in their homes.
How good of him. /s
Generally, if someone does something you don't like, you ask them to stop; if they don't, you sue them, or if conditions allow, you ask law enforcement to step in.
>This was Zuckerberg’s way to eke out a bit more privacy in the vein of a normal human.
On what planet do normal humans buy all the surrounding real estate in some of the most expensive places to live to 'eke out a bit more privacy'?
This is quite clearly a show of finance and power that few 'normal human' people would ever be able to demonstrate themselves.
Good on him, I'm not upset about it; he SHOULD spend his money how he wants -- I'm upset that people try to paint the behavior as normal and run-of-the-mill.
There is nothing normal/every-man/run-of-the-mill about Mark Zuckerberg's existence.
You know actually there are additional benefits and privileges of this.
In tax-exempt organizations or any organization with "self-dealing" prohibitions, you can still get any benefit you want with asset prices.
So, for example, your own private foundation can own all the houses around yours (renting them out if so desired), and they can be sold strategically to give newer indications of market value for your own house. Sell them all at once and you may be cratering the market value of your personal property, or you may be raising it -- either way works, as long as the funds come from the market and you don't have a self-dealing prohibition. Your personally held property is also eligible for contribution to that foundation (or to another type of tax-exempt organization that you might not have any control over), with its current value being a consideration. Although Zuckerberg famously uses an LLC, which is not tax-exempt, for his philanthropic missions, that doesn't mean he/they don't have any tax-exempt organizations, and they're definitely not precluded from forming or using one in the future and transferring those assets when convenient.
Yes, you also have the privilege of aiming to get more privacy.
Latching on to any one thing just reveals how little privilege you have in comparison. Many other people wouldn't talk about it, as they employ similar strategies.
Because 1 is not an opinion but a fact, and probably a fact he disagrees with, like you.
People don't care about privacy, but it's not exactly his fault. His job, then, is to capitalize on it. Why would he care? You would probably do the same...
Ask people for unfettered access to their unlocked phone, even with them sitting there and watching you to prevent you from doing anything but poke around, and you will find a whole lot of people suddenly care more about privacy.
What is probably closer to the truth is that people care little about privacy from someone they think they will never come face to face with, and who they believe won't leak that information to anyone they will come face to face with.
The problem comes when people don't have the knowledge necessary to assess the risk of that happening in any given scenario.
I would definitely not do the same, and it's not a fact. Plenty of people value their privacy; they are just not as tech-savvy and not aware that they are giving it up.
Funny you phrase it like that. When I discuss 'privacy' with people, nobody has anything to hide. When I take 2-3 minutes explaining to them why 'someone' out there (data markets) knows what type of porn gets them off, what they shop for, how much money they have, what diseases they have (which may impact their chances at life insurance - theirs and their kids'), their drinking habits (which may impact their car insurance premiums), who they meet/greet/f..k..
Then their expression changes a bit. Ignorance is bliss. I awaken 1-2 people at a time. BUT (big but - sorry for the caps) "all my friends are on FB, and Chrome is such a nice browser".
> you would probably do the same
I used to know a guy who genuinely had this opinion: if it wasn't illegal to sell heroin to kids, I'd be a billionaire right now, selling to all the kids, starting with my kids' friends.
Unfortunately this person has kids, voting rights, and walks among us. Yes, I want to be a billionaire (I'm many, many zeros away from this target). No, I wouldn't sell heroin (or the photos of your kids that you post on social media) to anyone else. Zuck has no problem getting 13-year-olds on Instagram, profiling them, and trading their data. So NO.
But it's not the privacy itself that people care about.
The reason people care is that this information can be used to harm them.
The reason I care about my credit card number being public is that it can be used to steal my money, but if my credit card number could be public while my money still stayed mine, I would much prefer that.
The better way would be to make everything as public as possible, then fix the issues that arise from that information being public.
I think this is an important point to make. I think most people, myself included, are fine with sharing personal information if the person I'm sharing it with isn't going to use it against me. That's always been the case dealing with people.
I think it is clear that when you post on Facebook about my/your/someone's cancer treatment, data aggregators will mark this in your medical records. An insurer 20-30 years from now will easily correlate that my/your/someone's kid has me/you/etc. as a father, and therefore has increased chances of cancer (or some other disease), and this will affect their chance to get life insurance or the price of their premiums.
When you share with FB (imho), it is clear that you don't share with a person, but with a hungry-for-information beast. Didn't people hear about Cambridge Analytica? Is anyone so naive as to believe that this was 'the end of information leakage' by FB? (Or by other similar platforms, such as Pinterest - in case someone makes a "medicine for my X disease" board.)
It's definitely not a fact, but rather an argument.
Non-technical people within my circle appreciate GDPR. They buy into Apple's recent privacy advertising, FWIW. Many of my non-IT business peers moved to private email services long ago — e.g. Protonmail, Mailbox.org, Fastmail, etc. — regardless of Google's Eric Schmidt[1] and his anti-privacy stance back in the day. Almost everyone I constantly communicate with uses Telegram, and privacy of comms (FWIW) was among the selling points at the time of switching IMs.
I won't exaggerate the value of privacy for the general public. But since Snowden's publications, the Cambridge Analytica scandal and the GDPR discussion, all with widespread media coverage, ordinary citizens have become much more privacy-conscious, at least in Europe.
Below I quote highlights from the recent FRA (EU Agency for Fundamental Rights) survey report "Your rights matter: Data protection and privacy - Fundamental Rights Survey", June 2020[2]:
> 41% do not want to share any personal data with private companies, almost double the number compared to public bodies;
> the type of personal data influences people’s willingness to share. Only around 5% want to share their facial images or fingerprints with private companies;
> 72% know the privacy settings on their smart phones. But 24% do not know how to check the privacy settings on their apps;
> 55% fear criminals or fraudsters accessing their personal data. Around 30% worry about advertisers, businesses and foreign governments’ access to information without them knowing;
> 33% do not read the terms and conditions when using online services compared with 22% who always read them;
> 69% know about the GDPR. A similar number know their national data protection supervisory authority (71%);
> only 51% are aware that they can access their personal data held by companies.
Regarding your second point, I think it's more that people don't care as much about privacy if the consequences seem to be low/nonexistent, and they weigh their interest in platforms like Instagram, TikTok, etc. more highly than the potential negative impacts of their privacy being violated. I know users of Instagram, Facebook, TikTok, Slack, etc. who will almost definitely never move to encrypted email because they like the platform and the community. Folks on HN and in the tech sector in general can have quite different experiences with those in the world at-large, and one's viewpoint on this will differ based on who's in your networks. I think it's likely if you walk down the street in Manhattan or LA and ask someone if they know what DRM, copyleft, or GDPR is, the answer will be no. But they will care if the music they have can't be copied/shared, they'll know Wikipedia, and they'll care if their information with Google can't be deleted or is sold to other companies.
Well, certainly an anonymity flaw. Privacy is different from anonymity. Signal does not purport to provide anonymity. Anyone on the system can check for any other user by knowing their phone number. That tends to be the cost of the convenience of the automatic contact discovery provided by phone-based messengers like Signal.
Anonymous messaging is an entire category by itself. It is a much more difficult problem. Fortunately, most people don't need anonymity most of the time. I really don't care who knows that I am communicating with friends, family or the people related to my business. If anything, I care even less whether someone finds out I use Signal. If I encounter a situation where I need to be anonymous, I can and will have to take special measures for a while. Temporary anonymity is relatively easy; the identity management is a cinch.
It is. It's also a security issue -- phone numbers aren't unique. It was interesting seeing Signal say my deceased father was 'on Signal'; it's just the same number recycled to someone else.
I think "unique" has a specific meaning in your context which isn't the standard definition of unique. There are never two people in a carrier's database assigned to the same number. It can, however, be assigned to another person, as long as it is de-assigned from the previous person. The number, at any point in time, is unique.
Privacy flaw and a security flaw: it is feasible to take over somebody's phone number through a social-engineering attack on the phone company's tech support. Tech support has the ability to issue replacement SIM cards for users, and if an attacker socially engineers them into sending a SIM card with your number to the attacker, it's game over for you.
That is aside from the obvious problem of phone numbers being recycled for re-issue after having been abandoned for a stretch of time.
Not exactly a security flaw, as the new owner of the phone number won't have the original owner's secret keys (they never leave the phone, of course), and Signal will still send the messages to the previously authenticated client.
Taking over a Signal account requires access to the phone number and the password used to register it (or waiting a week). And this will still not give the attacker access to the private keys; when these keys change, Signal will put an alert message in ongoing chats and ask for confirmation.
It gets really interesting when you don't allow apps to see your contacts. Take Clubhouse for example, I never allowed it to see my contacts but it does show me all the people who have leaked my personal info to the app because it's made them auto follow me.
So yeah, cool, I can see that my boss, 4 ex-coworkers, my ex-boss's ex-wife, and I think a recruiter have all leaked my personal info to Clubhouse.
For a normal consumer it's not a flaw.
But if you are a journalist, or in any profession where sharing your number is a big NO, then yes, it's not a good design.
Case 1:
My friend and I want to communicate on Signal. If he messages me on Signal without a phone number to identify him, there won't be any way to know it's actually him; I would have to call him and confirm it's really him. For a tech-savvy person who is crazy about privacy this is not a big deal, but for a normal consumer this makes them remove Signal.
Case 2:
You want to message, on Signal, someone with whom you already communicate via WhatsApp or calls. Again, from an average consumer's point of view, calling them and asking for their Signal ID is inconvenient. Imagine doing this for 400 contacts if you try to move from WhatsApp to Signal.
There are other secure messaging apps which don't require a number or even an email. But then again, you have to meet the person or call them and then get their ID.
Not sharing your mobile number is really for a few select cases, and those people form a very small percentage of the market.
Having a feature similar to Telegram's, where you share your number only with selected people, or having a disposable ID which can be used to start a chat and is then discarded so that it can't be reused or linked to your account, would make more sense and probably serve that small percentage of people as well: an ID which gets destroyed once it's used to make a connection, a one-time-use ID only. Secret chats which stay on one device only, or which require separate credentials, are a good option as well.
I should be able to make a sample app like this. It can be a good demo project.
There was a recent issue where Chinese citizens pointed out that IMEs with telemetry enabled could capture what you type into Signal, and that it didn't warn you or try to mitigate this; the Signal guy ignored them, then dodged them for weeks and told everyone they were harassing him; then a white guy pointed out the same thing and his suggestions were immediately taken up.
If I remember correctly, the claim was from a Chinese citizen who pointed this out, and when Signal didn't treat it as a major security issue (which I felt was understandable, given that her concern was mostly along the lines of "you can't say that Signal is secure given that people install other software on their phone that could be a keylogger, so you should have a disclaimer"), she decided to keep escalating on social media until she was, IMO, harassing Moxie over his "lack of engagement". I didn't come away particularly impressed by that exchange.
And what was the suggestion? The flow of information is (You typing → IME converts key presses into text → Signal receives text) so Signal can't do anything to prevent a malicious IME from sending the text elsewhere. The best they could do is raise users' awareness that Signal won't protect them if there's spyware with privileged access on their device, and that their IME might be such spyware.
Mainly, the OS has security options to prevent IMEs from making network connections; they aren't always on by default, but users can enable them. (They might be off because they disable some features that need internet access.)
IIRC there was also some Android API that could be used in the app to help, but I don't remember what it actually did atm.
Perhaps you mean the incognito keyboard option? Signal uses it, but no API I can imagine will protect against a malicious IME.
Worse still, even if you block IME internet access on a device with a factory-malicious IME, it could just upload the data using some other service on the device.
Installing a known-good IME seems like the only fix I can think of?
How so? I don't find it violating in any way that someone knows I'm using a messaging app based on my phone number. Phone numbers aren't exactly private in the first place.
It would be the same if there were usernames, although those would be far harder to assign to a person. Thankfully, Signal is working on that feature and hopefully it's released soon. I'd like to one day not have any phone number at all. It's a system that feels very much antiquated...
You're assuming it takes considerable skill to fuzz an algorithm or look for backdoors and easter eggs? To answer your question, I don't know if Wire has been audited by paid researchers. I personally prefer a customer-exposed product whose every commit you can look up, rather than the remote promise of security in a locked and hidden program.
I sent Mark a message yesterday on Signal. It was "delivered" but not "read", obviously. Today, I don't see that contact as a Signal user anymore. I see a "Invite to Signal" button on that conversation view. Guess he deleted the account?
That is if all your friends live in the same country, I have quite a few friends all over the world, and especially in Asia, people prefer specific apps. For friends in China it's WeChat, for Japan it's LINE. Other friends use Facebook, my family uses WhatsApp. Here in HK it's a mix between WhatsApp, Telegram or Signal.
I mean, I think it would be reasonable for Zuckerberg's friends to assume that he wouldn't want to use a competing product? Nobody would be offended if the CEO of Coca-Cola declined a can of pepsi.
He is probably aware that other Facebook employees can crack into Whatsapp traffic at will. It might be unwise to be caught doing it to his account; and probably best to make it look like it was somebody else doing it.
Obviously I have no way to show you Facebook proprietary source code, but equally obviously Facebook can make their code do anything they like, including MITMing on an account-by-account basis.
That is always true when all the code in use is controlled by a single closed organization.
Nope, just an expectation that sweeping claims get backed up either by reasoning (like you just did) or a source
As for your reasoning vs your claim: you had made it seem like there was some well-known flaw or tool within FB to disable/intercept E2E. Or that we should expect an E2E-disabler functionality already exists.
I'm no FB developer, but I doubt it's as simple as one or two rogue developers adding in an "intercept mark's messages" functionality.
> That is always true when all the code in use is controlled by a single closed organization.
Fair enough, but FB/WhatsApp messengers are probably some of the most scrutinized by third parties, as well as by developers who would sooner or later blow the whistle (I would hope so, anyway). I would not take "Mark is on Signal, so his messengers suck" to be a reasonable conclusion -- and I'm not even FB's biggest fan.
Do you imagine it is even possible that FB has not yet been served a National Security letter demanding backdoor access, on demand, to any WA account of the spooks' choice? Or that there would be any technical difficulty in complying with such an order?
Going to signal from other messengers replaces unknown/hard to reason about privacy issues with a single easy to reason about issue. The idea was to do that for the broad group of people using messengers on their phones.
It would have been weird if he didn't have any accounts on competitors' platforms. You want to check the features they offer, the UX... This article doesn't say anything about how often he uses it.
This isn't a story. I've installed competitors' apps many times to see what they are like. It doesn't even mean that I use them beyond seeing what they are like. And even so, it's not a big deal to occasionally use a competitor.
But naturally! He is worried about his internal IT team getting access to his own messages, like the Register's BOFH[0] /s
[0] https://www.theregister.com/offbeat/bofh/
"So he put his tongue in, and took a large lick. “Yes,” he said, “it is. no doubt about that. And honey, I should say, right down to the bottom of the jar. Unless, of course,” he said, “somebody put cheese in at the bottom just for a joke. Perhaps I had better go a little further... just in case...”
- Winnie the pooh, Chapter 5 in which Piglet meets a Heffalump