What does “our computer” mean when it is not owned by you, but issued to you by your employer to perform a task with? Does that also apply to the operator at a switchboard in a nuclear missile launch facility?
Does the switchboard in a nuclear missile launch facility run Crowdstrike? I picture it as a high-quality analog circuit board that does one thing and one thing only. No way to run anything else.
Globally networked personal computers were a kind of cultural revolution against the setting you describe. Everyone had their own private compute and compute time, and everyone could share their own opinion. Computers became our personal extensions. This is what IBM, Atari, Commodore, Be, Microsoft and Apple (and later desktop Linux) sold. Now, given this ideology: can a company own my limbs? If not, they can't own my computers.
> What does “our computer” mean when it is not owned by you, but issued to you to perform a task with by your employer?
Well, presuming that:
1. the employee is issued a computer that they have possession of, even if not ownership (e.g. they bring the computer home with them, etc.)
2. and the employee is required to perform creative/intellectual labor activities on this computer — implying that they do things like connecting their online accounts to this computer; installing software on this computer (whether themselves or by asking IT to do it); doing general web-browsing on this computer; etc.
3. and where the extent of their job duties blurs the line between "work" and "not work" (most salaried intellectual-labor jobs are like this), such that the employee basically "lives in" this computer, even when not at work...
4. ...to the point that the employee could reasonably conclude that it'd be silly for them to maintain a separate "personal" computer — and so would potentially sell any such devices (if they owned any), leaving them dependent on this employer-issued computer for all their computing needs...
...then I would argue that, by the same chain of reasoning as in the GP post, employers should not be legally permitted to “issue” employees such devices.
Instead, the employer should either purchase such equipment for the employee, giving it to them permanently as a taxable benefit; or they should require that the employee purchase it themselves, and recompense them for doing so.
Cyberpunk analogy: imagine you are a brain in a vat. Should your employer be able to purchase an arbitrary android body for you; make you use it while at work; and stuff it full of monitoring and DRM? No, that'd be awful.
Same analogy, but with the veil stripped off: imagine you are paraplegic. Should your employer be allowed to issue you a specific wheelchair of their choosing, and require you to use it at work, and then monitor everything you do with it / limit what you can do with it because it’s “theirs”? No, that’d be ridiculous. And humanity already knows that — employers already can't do that, in any country with even a shred of awareness about accessibility devices. The employer — or very much more likely, the employer's insurance provider — just buys the person the chair. And then it's the employee's chair.
And yes, by exactly the same logic, this also means that issuing an employee a company car should be illegal — at least in cases where the employee lives in a non-walkable area, and doesn't already have another car (that they could afford to keep + maintain + insure); and/or where their commute is long enough that they'd do most non-employment-related car-requiring things around work, and thus use their company car. Just buy them a car. (Or, if you're worried they might run away with it, then lease-to-own them a car — i.e. where their "equity in the car" is in the form of options that vest over time, right alongside any equity they have in the company itself.)
> Does that also apply to the operator at a switchboard…
Actually, no! Because an operator of a switchboard is not a “user” of the computer that powers the switchboard, in the same sense that a regular person sitting at a workstation is a "user" of the workstation.
The system in this case is a “kiosk computer”, and the operator is performing a prescribed domain-specific function through a limited UX they’re locked into by said system. The operator of a nuclear power plant is akin to a customer ordering food from a fast-food kiosk — just providing slightly more mission-critical inputs. (Or, for a maybe better analogy: they're akin to a transit security officer using one of those scanner kiosk-handhelds to check people's tickets.)
If the "computer" the nuclear-plant operator was operating, exposed a purely electromechanical UX rather than a digital one — switches and knobs and LEDs rather than screens and keyboards[1] — then nothing about the operator's workflow would change. Which means that the operator isn't truly computing with the computer; they're just interacting with an interface that happens to be a computer.
[1] ...which, in fact, "modern" nuclear plants are. The UX for a nuclear power plant control-center has not changed much since the 1960s; the sort of "just make it a touchscreen"-ification that has infected e.g. automotive has thankfully not made its way into these more mission-critical systems yet. (I believe it's all computers under the hood now, but those computers are GPIO-relayed up to panels with lots and lots of analogue controls. Or maybe those panels are USB HID devices these days; I dunno, I'm not a nuclear control-systems engineer.)
Anyway, in the general case, you can recognize these "the operator is just interacting with an interface, not computing on a computer" cases because:
• The machine has separate system administrators who log onto it frequently — less like a workstation, more like a server.
• The machine is never allowed to run anything other than the kiosk app (which might be some kind of custom launcher providing several kiosk apps, but where these are all business-domain-specific apps, with none of them being general-purpose "use this device as a computer" apps). A minimal launcher loop of this shape is sketched after this list.
• The machine is set up to use domain login rather than local login, and keeps no local per-user state; or, more often, the machine is configured to auto-login to an "app user" account (in modern Windows, this would be a Mandatory User Profile) — and then the actual user authentication mechanism is built into the kiosk app itself.
• Hopefully, the machine is using an embedded version of the OS, which has had all general-purpose software stripped out of it to remove vulnerability surface.
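To make the "never runs anything but the kiosk app" property concrete, here's roughly what such a launcher reduces to. The binary path and flag are hypothetical, and a real deployment would use the OS's own kiosk/assigned-access facilities (plus the embedded-OS stripping mentioned above) rather than a hand-rolled loop, but the observable behavior is the same:

```python
# Minimal kiosk-launcher sketch: the machine never presents a desktop or
# shell, just one domain-specific app, relaunched forever if it exits.
# The binary path and flag below are made up for illustration.
import subprocess
import time

KIOSK_APP = ["/opt/kiosk/switchboard-ui", "--fullscreen"]

while True:
    subprocess.run(KIOSK_APP)  # blocks until the app exits or crashes
    time.sleep(1)              # brief backoff so a crash loop can't peg a core
```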
> the employee could reasonably conclude that it'd be silly for them to maintain a separate "personal" computer — and so would potentially sell any such devices
What a bizarre leap of logic. Can FedEx employees reasonably sell their non-uniform clothes? Just because the employer in this scenario didn't 100% lock down the computer (which is a good thing, because the alternative would be incredibly annoying for day-to-day work) doesn't mean the employee can treat it as their own. Even from the privacy perspective, it would be pretty silly. Are you going to use the employer-provided computer to apply to your next job?
People do do it, though. Especially poor people, who might not use their personal computers very often.
Also, many people don't own a separate "personal" computer in the first place. Especially, again, poor people. (I know many people who, if needing to use "a PC" for something, would go to a public library to use the computers there.)
Not every job is a software dev position in the Bay Area, where everyone has enough disposable income to have a pile of old technology lying around. Many jobs for which you might be issued a work laptop still might not pay enough to get you above the poverty line. McDonald's managers are issued work laptops, for instance.
(Also, disregarding economic class for a moment: in the modern day, most people who aren't in tech solve most of their computing problems by owning a smartphone, and so are unlikely to have a full PC at home. But their phone can't do everything, so if they have a work computer they happen to be sat in front of for hours each day — whether one issued to them, or a fixed workstation at work — then they'll default to doing their rare personal "productivity" tasks on that work computer. And yes, this does include updating their CV!)
---
Maybe you can see it more clearly with the case of company cars.
People sometimes don't own any other car (that actually works) until they get issued a company car; so they end up using their company car for everything. (Think especially: tradespeople using their company-logo-branded work box-truck for everything. Where I live, every third vehicle in any parking lot is one of those.)
And people — especially poorer people — also often sell their personal vehicle when they are issued a company car, because this 1. releases them from the need to pay a lease + insurance on that vehicle, and 2. gets them possibly tens of thousands of dollars in a lump sum (that they don't need to immediately reinvest into another car, because they can now rely on the company car.)
The point is that if you do do it, it's on you to understand the limitations of using someone else's property. Just like the difference between rental vs owned housing.
There are also fairly obvious differences between work-issued computers and all of your other analogies:
1. A car (and presumably the cyberpunk android body) is much more expensive than a computer, so the downside of owning both a personal and a work one is much higher.
2. A chair or a wheelchair doesn't need security monitoring, because it's a chair (I guess you could come up with an incredibly convoluted scenario where it would make sense to put GPS tracking in a wheelchair, but come on).
> just buys the person the chair. And then it's the employee's chair.
It's not because there's a law against loaning chairs; it's because the chair is likely customized for a specific person and can't be reused. Or, if you're talking about WFH scenarios, they just don't want to bother with return shipping.
No, it's the difference between owned housing and renting from a landlord who is also your boss in a company town, where the landlord has a vested interest in e.g. preventing you from using your apartment to also do work for a competitor.
Which is, again, a situation so shitty that we've outlawed it entirely! And then also imposed further regulations on regular, non-employer landlords, about what kinds of conditions they can impose on tenants. (E.g. in most jurisdictions, your landlord can't restrict you from having guests stay the night in your room.)
Tenants' rights are actually a great analogy for what I'm talking about here. A company-issued laptop is very much like an apartment, in that you're "living in it" (literally and figuratively, respectively), and that you therefore should deserve certain rights to autonomous possession/use, privacy, freedom from restriction/compromise in use, etc.
While you don't literally own an apartment you're renting, the law tries to, as much as possible, give tenants the rights of someone who does own that property; and to restrict the set of legal justifications that a landlord can use to punish someone for exercising those (temporary) rights over their property.
IMHO having the equivalent of "tenants' rights" for something like a laptop is silly, because that'd be a lot of additional legal edifice for not-much gain. But, unlike with real-estate rental, it'd actually be quite practical to just make the "tenancy" case of company IT equipment use impossible/illegal — forcing employers to do something else instead — something that doesn't force employees into the sort of legal area that would make "tenants' rights" considerations applicable in the first place.
No, that would be more like sleeping at the office (purely because of employee preferences, not because the employer forces you to or anything like that) and complaining about security cameras.
Tangent — a question you didn't ask, but I'll pretend you did:
> If employers allowed employees to "bring their own devices", and then didn't force said employees to run MDM software on those devices, then how in the world could the employer guarantee the integrity of any line-of-business software the employee must run on the device; impose controls to stop PII + customer-shared data + trade secrets from being leaked outside the domain; and so forth?
My answer to that question: it's safe to say that most people in the modern day are fine with the compromise that your device might be 100% yours most of the time; but, when necessary — when you decide it to be so — 99% yours, 1% someone else's.
For example, anti-cheat software in online games.
The anti-cheat logic in online games is this little nugget of code that runs on a little sub-computer within your computer (Intel SGX or equivalent). This sub-computer acts as a "black box" — it's something the root user of the PC can't introspect or tamper with. (The shape of the attestation handshake involved is sketched after the list below.) However:
• Whenever you're not playing a game, the anti-cheat software isn't loaded. So most of the time, your computer is entirely yours.
• You get to decide when to play an online game, and you are explicitly aware of doing so.
• When you are playing an online game, most of your computer — the CPU's "application cores", and 99% of the RAM — is still 100% under your control. The anti-cheat software isn't actually a rootkit (despite what some people say); it can't affect any app that doesn't explicitly hook into it.
• In a brute-force sense, you still "control" the little sub-computer as well — in that you can force it to stop running whatever it's running whenever you want. SGX and the like aren't like Intel's Management Engine (which really could be used by a state actor to plant a non-removable "ring -3" rootkit on your PC); instead, SGX is more like a TPM, or an FPGA: it's something that's ultimately controlled by the CPU from ring 0, just with a very circumscribed API that doesn't give the CPU the ability to "get in the way" of a workload once the CPU has deployed that workload to it, other than by shutting that workload off.
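To show the shape of that handshake — and only the shape: real SGX quotes are produced and verified through Intel's attestation infrastructure with asymmetric, hardware-rooted keys, and the symmetric HMAC below is a deliberate simplification. Every name here is hypothetical:

```python
# Illustrative sketch of the enclave attestation handshake. An HMAC stands
# in for the hardware-rooted signature; in real SGX, the signing key never
# leaves the CPU, and the server verifies quotes via Intel's services.
import hashlib
import hmac
import os

ENCLAVE_KEY = os.urandom(32)  # stands in for a key fused into the CPU

def enclave_quote(anticheat_code: bytes, nonce: bytes) -> tuple[bytes, bytes]:
    """The 'sub-computer': measure the code it was handed, then sign
    (measurement, nonce) with a key the host OS can never read."""
    measurement = hashlib.sha256(anticheat_code).digest()
    signature = hmac.new(ENCLAVE_KEY, measurement + nonce, hashlib.sha256).digest()
    return measurement, signature

def game_server_accepts(measurement: bytes, signature: bytes, nonce: bytes,
                        expected_measurement: bytes) -> bool:
    """The game server: admit the player only if the quote is genuine and
    matches the anti-cheat build the server shipped."""
    expected_sig = hmac.new(ENCLAVE_KEY, measurement + nonce, hashlib.sha256).digest()
    return (hmac.compare_digest(signature, expected_sig)
            and measurement == expected_measurement)

# The server issues a fresh nonce per session to prevent replayed quotes.
code = b"anti-cheat build 1.2.3"
nonce = os.urandom(16)
m, sig = enclave_quote(code, nonce)
print(game_server_accepts(m, sig, nonce, hashlib.sha256(code).digest()))  # True
```

The thing to notice is what the server gates on: not "what else is on your PC", just "is the measured anti-cheat code the one I shipped".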
As much as people like Richard Stallman might freak out at the above design, it really isn't the same thing as your employer having root on your wheelchair. It's more like how someone in a wheelchair knows that if they get on a plane, then they're not allowed to wheel their own wheelchair around on the plane, and a flight attendant will instead be doing that for them.
How does that translate to employer MDM software?
Well, there's no clear translation currently, because we're currently in a paradigm that favors employer-issued devices.
But here's what we could do:
• Modern PCs are powerful enough that anything a corporation wants you to do can be done in a corporation-issued VM that runs on the computer.
• The employer could then require the installation of an integrity-verification extension (essentially "anti-cheat for VMs") that ensures that the VM itself, the hypervisor software that runs it, and the host kernel the hypervisor is running on top of all haven't been tampered with. (If any of them had been, then the extension wouldn't be able to sign a remote-attestation packet, and the employer's server in turn wouldn't return a decryption key for the VM, so the VM wouldn't start. This key-release flow is sketched after the list.)
• The employer could feel free to MDM the VM guest kernel — but they likely wouldn't need to, as they could instead just lock it down in much-more-severe ways (the sorts of approaches you use to lock down a server! or a kiosk computer!) that would make a general-purpose PC next-to-useless, but which would be fine in the context of a VM running only line-of-business software. (Remember, all your general-purpose "personal computer" software would be running outside the VM. Web browsing? Outside the VM. The VM is just for interacting with Intranet apps, reading secure email, etc.)
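And a sketch of that "no attestation, no decryption key" flow, under the same stand-in crypto as the anti-cheat sketch above (HMAC in place of a TPM- or SGX-rooted signature; all component names and keys hypothetical):

```python
# Employer-server side of the VM key release. The client measures its
# host kernel + hypervisor + VM image and sends a signed measurement;
# the server releases the VM's disk-decryption key only on a match.
import hashlib
import hmac

ATTESTATION_KEY = b"stand-in-for-hardware-rooted-key"  # see caveats above
VM_DECRYPTION_KEY = b"stand-in-vm-disk-key"            # hypothetical
EXPECTED_STACK = hashlib.sha256(
    b"host-kernel-6.1|hypervisor-1.0|corp-vm-image-v42"  # hypothetical stack
).digest()

def release_vm_key(measurement: bytes, signature: bytes) -> bytes | None:
    expected_sig = hmac.new(ATTESTATION_KEY, measurement, hashlib.sha256).digest()
    if hmac.compare_digest(signature, expected_sig) and measurement == EXPECTED_STACK:
        return VM_DECRYPTION_KEY
    return None  # tampered stack: no key, so the corporate VM simply won't boot
```

Note the scoping: the employer's leverage is limited to whether the corporate VM boots; nothing about the host outside the VM is theirs to control.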
There you go. An anti-cheat rootkit so ineptly coded it serves as literal privilege escalation as a service. Can we stop normalizing this stuff already?
My computer is my computer, and your computer is your computer.
The game company owns their servers, not my computer. If their game runs on my machine, then cheating is my prerogative. It is quite literally an exercise of my computer freedom if I decide to change the game's state to give myself infinite health or see through walls or whatever. It's not their business what software I run on my computer. I can do whatever I want.
It's my machine. I am the god of this domain. The game doesn't get to protect itself from me. It will bend to my will if I so decide. It doesn't have a choice in the matter. Anything that strips me of this divine power should be straight up illegal. I don't care what the consequences are for corporations, they should not get to usurp me. They don't get to create little extraterritorial islands in our domains where they have higher power and control than we do.
I don't try to own their servers and mess with the code running on them. They owe me the exact same respect in return.
> If their game runs on my machine, then cheating is my prerogative.
Sure.
However, due to the nature of how these games work, cheating cannot be prevented server-side only; the sketch below shows why.
So, if you want to play the game, you have to agree to install the anti-cheat because it's the only way to actually stop cheating.
The only other alternative is to sell a separate category of gaming machine where users wouldn't have access to install cheats, using something like the TPM to enforce that.
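A toy illustration of the "can't be prevented server-side" point, with hypothetical message shapes; both clients below receive identical, legitimate data, so the server has nothing to detect:

```python
# Why a wallhack is invisible to the server: to render enemies the instant
# they peek into view (despite latency), the server must send positions of
# enemies the player can't currently see. The occlusion check happens
# client-side, and a modified client can simply skip it.
from dataclasses import dataclass

@dataclass
class Enemy:
    name: str
    position: tuple[float, float]
    occluded: bool  # normally enforced by the client's line-of-sight check

def server_update() -> list[Enemy]:
    # Identical data goes to every client, honest or not.
    return [Enemy("A", (1.0, 2.0), occluded=False),
            Enemy("B", (5.0, 9.0), occluded=True)]

def honest_client(update: list[Enemy]) -> list[str]:
    return [e.name for e in update if not e.occluded]  # draws only visible enemies

def cheating_client(update: list[Enemy]) -> list[str]:
    return [e.name for e in update]  # draws everything: a wallhack

print(honest_client(server_update()))    # ['A']
print(cheating_client(server_update()))  # ['A', 'B']
```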
I don't have to agree to a thing. They're the ones who should have to accept our freedom. We're not about to sacrifice our power and freedom for the sake of preventing cheating in video games. Not only are we going to play the games, we're going to impose some of our terms and conditions on these things.