In 20 years, a virtual WMD may well instantly obliterate millions of people.
I'm already unsure what the worst damage someone could do today with over-the-air automobile firmware updates would be, just to take one example. What would it be like if someone put out a virus that, at 11:32:42am on March 3rd, 2036, causes every GM, Ford, and Tesla self-driving car to lock all the doors, floor the accelerator, and let the chips fall where they may?
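The trigger for a "logic bomb" like this is technically trivial, which is part of what makes the scenario unnerving: the payload can sit dormant and look inert to any analysis that runs before the hardcoded date. A minimal sketch (the trigger time is the hypothetical one from the scenario above):

```python
from datetime import datetime, timezone

# Hypothetical trigger from the scenario: 11:32:42 on March 3rd, 2036 (UTC assumed)
TRIGGER = datetime(2036, 3, 3, 11, 32, 42, tzinfo=timezone.utc)

def should_fire(now=None):
    """Return True once the hardcoded trigger time has passed.

    Until then the code does nothing observable, which is why
    dynamic analysis before the date sees only dormant code.
    """
    if now is None:
        now = datetime.now(timezone.utc)
    return now >= TRIGGER
```

The whole point is that nothing interesting happens for twenty years, so sandboxing or test-driving the firmware tells you nothing.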
Consider not just the immediate impact of the crashes, but the fact that you've completely obliterated emergency services (they couldn't hope to serve more than a tiny fraction of the victims) and choked every major road and most of the minor roads with wreckage. It would be a catastrophe so large that, while I won't predict the effects, we're talking something so defining for a generation that it would handily compete with both World Wars combined for psychological effect, with the Great Depression tossed in for good measure... it would be astonishing.
I'm not even sure we couldn't get close to that in 2018, to be honest. What if, by some horror, the Stuxnet authors were set the task of making this happen? How close could they get?
The problem all virus authors have is escaping detection. 2036 is too far out for them to count on not being detected, or on cars staying the same. Release it today, and even if you infect all cars undetected, GM's and Ford's normal update cycles are likely to change things such that, by accident, your virus can no longer spread. You could expect to get a handful of cars to accelerate out of control - and odds are the door locks won't work on those, so you've failed to lock the doors.
Infecting a car is hard for other reasons. Radios tend to be easy to update (the manufacturer can sell you new features - maps if nothing else). All the other controllers tend to be more locked down, such that a virus likely couldn't actually spread to anything that can take control of the vehicle.
Maybe - who knows how GM will change over the next 20 years. GM itself only has guesses.
"2036 is too far out for them to count on not being detected, and on cars being the same."
Sorry, I conflated two things here. I meant someone in 2036 setting a logic bomb for something like a month in advance in their time, and as a separate question, how close one could get to such a virus today. As we keep wiring up our cars to networks (not necessarily "the Internet", but networks), it's only going to get easier.
One of the problems I think will happen with cars - only accelerated by self-driving cars and the high probability that people will largely lease them rather than own them - is that the governments of the world are going to see a big pot of real-time surveillance data and real-time person-control mechanisms and won't be able to keep their hands off. They'll mandate that cars get very connected and that cars have backdoors for authorities to take over and redirect them, etc. My scenario in 2036 may not even require a brilliant virus designer, but just one person with Python scripting skills and a bit too much access to the government control system.
It's not even that hard to imagine such a disaster happening accidentally. I'm sure, no sarcasm, that protections will be put into place, but there always has to be a developer back-door mechanism of some sort, and there may not be enough controls added, or they may not be added competently enough.
(And in terms of the protections of the cars themselves, remember that Stuxnet included the use of not one, but two code signing certificates that the Stuxnet authors clearly did not have true authority to use. If there's a way from the Internet to the control mechanism, even if it requires signed code, there's no guarantee a particularly capable and motivated enemy won't penetrate the protections.)
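That's the core weakness of code signing: a valid signature proves only that the signer held the key, not that the code is benign. A toy sketch of the verification logic (using HMAC as a stand-in for the asymmetric certificate schemes real OEMs use; names are illustrative, not any vendor's API):

```python
import hmac
import hashlib

def sign(firmware: bytes, key: bytes) -> bytes:
    """Produce a signature over a firmware image with the signing key."""
    return hmac.new(key, firmware, hashlib.sha256).digest()

def verify(firmware: bytes, sig: bytes, key: bytes) -> bool:
    """Check that the signature matches - nothing more.

    The check says 'whoever signed this held the key'; it says
    nothing about whether the payload is malicious.
    """
    return hmac.compare_digest(sign(firmware, key), sig)

# Hypothetical OEM key. If an attacker obtains it (as the Stuxnet
# authors effectively did with two stolen certificates), malicious
# firmware verifies exactly as well as a legitimate update.
key = b"oem-signing-key"
good_sig = sign(b"official update v2", key)
evil_sig = sign(b"malicious payload", key)  # signed with the stolen key
assert verify(b"official update v2", good_sig, key)
assert verify(b"malicious payload", evil_sig, key)  # passes just the same
```

So "requires signed code" is a speed bump for a nation-state-class attacker, not a wall.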