They are implying that future versions of Signal will drop random files on your phone that "may or may not" cause damage to Cellebrite systems.
They are basically putting out the threat that if you use Cellebrite on Signal in the future, you might not get the data you expect, and at worst, it may corrupt the report/evidence.
This also calls the chain of custody into question, since an untrusted device being imaged can alter the reports of unrelated devices.
Damn, a chain of custody where the thing in evidence is also part of not only its own chain but also those of other evidence acquired afterwards? I can't imagine what kind of case law exists around that, but I'm sure it's hilarious!
Which is the part I don't really understand - it seems like Cellebrite could spin this in their favor, so that law enforcement would need to purchase a new kit for each device?
More realistically, it is like dropping a file named DONT_RUN_THIS_BLOWS_UP_YOUR_COMPUTER.exe on your private file server. You never run it, but maybe somebody exploits your file server, grabs all your files, and automatically runs them?
It really is like dropping a file named DONT_RUN_THIS_BLOWS_UP_YOUR_COMPUTER.exe on your private file server - but contrary to your expectations, it's not "oh well". If you placed it there with the intent to trap someone you expect to be looking at your computer, you may well be liable if their computer blows up. There's no significant difference from active retaliation - the consequences are there, the intent is there, the act is there; it's pretty much the same.
Of course, if some criminal exploits your file server, they are not likely to press charges, but if it triggers on law enforcement who have a warrant to scan your fileserver, that's a different issue.
You'd be just as liable as for physical boobytraps on your property, with pretty much the same reasoning.
The beauty, though, is that law enforcement now can't even know, before plugging in and scanning a device, whether they'll actually be pwned.
They have to actually run the exploit to figure out whether the phone can nuke that hardware's future usability, or the integrity of any locally stored, non-offsited data.
UNLESS Cellebrite can publicly produce, for a court of law, proof that any potential exploit isn't a valid concern - which means spilling implementation details about how the device works.
Nobody can keep quiet AND maintain the status quo. Either everyone clams up, and Signal gets to sow reasonable doubt without challenge, crippling Cellebrite's value as a forensic tool; or someone has to open up about the details of their tool, which, like it or not, will speak very loudly about the ways and methods behind these exploits.
The Checkmate is implied, and oh my, is it deafening.
> if you placed it there with the intent to trap someone who you expect to be looking at your computer, you may well be liable if their computer blows up
Liable for what? You haven’t promised that the code is safe, and they chose to run it.
> there's no significant difference from active retaliation
There is a significant difference: in active retaliation you choose to attack someone else's computer, whereas with a trap file the attacker chooses to run files they have stolen from you. Big difference.
> You'd be just as liable as for physical boobytraps on your property, with pretty much the same reasoning.
The reasoning is different: lethal or injurious mantraps are prohibited because you don't respond to trespassing with lethal force, and because you don't know who or what may trigger the trap. Mantraps that lock the intruder in a room without injuring them are fine, and are used in high-security installations.