I’m legitimately not sure democracy will survive the combination of modern, sophisticated propaganda techniques, an open, international Web, and losing the ability to more-or-less trust audio and video recordings, which we’ve grown used to over the last hundredish years. Between state actors and, eventually if not already, transnational corporations waging info warfare, I kinda doubt the institution can take it. Too much info, too fast, from too many sources.
Democracy is historically not the default state of human interactions. Just look at facebook/nextdoor/twitter: most people want to dictate how others behave, using whatever force is available to them, i.e. authoritarianism.
Any technology that makes it easier to vilify someone by definition can be used to weaken democracy.
If developing nations are any indication, we have other problems to worry about. While you've still got the occasional genocide, the advent of mass literacy (decades ago) and modern information technology have made those nations less authoritarian and less corrupt. Leaders now mostly try to keep their misdeeds on the down-low, because once they're out they spread over social media like wildfire. It's not great, but it's progress.
> I wonder what encryption and key based techniques can be used to verify the authenticity of audio and video records in the future.
None, since encryption isn't the answer to this problem. Take Romney's leaked "47%" comment [1] or Hillary Clinton's leaked "deplorables" comment: how would encryption have been useful either to verify the recordings' authenticity or to reject them if they had been deepfakes? It wouldn't have. Those comments were meant for private audiences, so neither speaker would ever have officially signed them. If the encryption could trace the recording back to the individual who made it, then the leaker might decide never to release the recording (since they don't want to be outed). And if all the encryption can do is trace back to a random device, why not just get a random device to sign your deepfake?
There are DRM and signing schemes, steganography, etc., but in the moment, people don't care. Or rather, the message registers in their minds whether it happens to be true or not. It's how advertising works. Beliefs are essentially tribal, and we all believe our tribal sources. If a source of news isn't part of your tribe, you're probably not going to believe what it's saying until someone from your ingroup verifies it. Crypto doesn't do that.
The irony is that we all trust crypto because of the perceived tribal affiliation of the developers as well, which doubly reduces the case for crypto verifying media.
We can barely get actual security devices to keep their keys secret. Do you expect some rando $49 Chinese video recorder to have a trusted key-management solution?
Half of the population have an IQ lower than average, so they rely on those with larger brains to help them make decisions. They are vulnerable. You are trying to make them smarter, like you, but this will not help them. You need to assist them, every day of the week, to combat enemy propaganda.
I don't get it. By the time someone is running for president, they've been around a while. Someone who doesn't have an understanding of the character or political positions of Joe Biden or Donald Trump (or whoever) by now hasn't been paying attention to anything for a good long while, so why would they start with a random deepfake?
I mean, sophisticated propaganda techniques have been utilized forever, no? How do you defeat them without getting into checksums, etc.? Would you agree that critical thinking skills, gained as part of a rounded education, would help you see through the BS? Of course, one side of the political duopoly in the US is trying very hard to keep Americans from getting educated...
> Someone who doesn't have an understanding of the character or political positions of Joe Biden or Donald Trump (or whoever) by now hasn't been paying attention to anything for a good long while, so why would they start with a random deepfake?
Because they saw it on an ad, perhaps even a targeted one. I think deepfakes are going to be a "push" kind of thing: more used to corrupt the background information environment than be engaged with directly.