>Both Android & iOS also supports disk encryption and are also locked for most of the day.
That lock does not delete the secret key material from the phone. That is how things like the Cellebrite forensics box can still crack phones. An encrypted desktop stores the secret key material in the user's head; such a system is, for all practical purposes, unbreakable when it is shut off. The exposure comes from usage: it is impossible to create a system that is easily available to a user without it also being somewhat available to an attacker.
An open source program maintained in the open by the users of that program is going to be safer than a hostile proprietary program kept in the sandbox of Flatpak, Snap or whatever.
While it's likely Cellebrite makes use of some known (and possibly some unknown) exploits to bypass the screen lock on some phone models and get at user data, sometimes they require the device to be unlocked, and they no doubt use adb to pull out user data.
Encryption on Android devices has changed over the last few years, first introducing the option of File Based Encryption (FBE) and later making it mandatory.
File based encryption lets apps encrypt their data with keys that are evicted from memory when the screen is locked. Many security-conscious apps like password managers and OTP apps make use of this.
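As a rough sketch of what that looks like (not any particular app's implementation; the key alias and parameter choices here are my own), an app can create an Android Keystore key that refuses to operate while the device is locked:

    import android.security.keystore.KeyGenParameterSpec
    import android.security.keystore.KeyProperties
    import javax.crypto.KeyGenerator

    // Sketch: an AES key in the Android Keystore that is only usable while
    // the device is unlocked. "example_key" is a made-up alias.
    fun createLockBoundKey() {
        val spec = KeyGenParameterSpec.Builder(
            "example_key",
            KeyProperties.PURPOSE_ENCRYPT or KeyProperties.PURPOSE_DECRYPT
        )
            .setBlockModes(KeyProperties.BLOCK_MODE_GCM)
            .setEncryptionPaddings(KeyProperties.ENCRYPTION_PADDING_NONE)
            // Refuse to use the key while the device is locked (API 28+).
            .setUnlockedDeviceRequired(true)
            .build()

        val generator = KeyGenerator.getInstance(
            KeyProperties.KEY_ALGORITHM_AES, "AndroidKeyStore"
        )
        generator.init(spec)
        generator.generateKey()
    }

Anything encrypted under such a key can only be decrypted while the phone is unlocked; while it is locked the Keystore refuses to use the key.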
If you feel the need, you can use 'multiple users' to create an extra user, or use a work profile (Island, Shelter, Insular) to keep sensitive apps and data separate. These have separate encryption keys that are flushed when the work profile is switched off or, for a secondary user, when the phone restarts. You can still use the main user on the phone while the work profile / secondary user encryption keys are not held in memory.
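As a small illustration (the helper name is mine), an app running inside a work profile or secondary user can check whether the user it runs as currently has its credential-encrypted storage unlocked, i.e. whether that user's FBE keys are presently available:

    import android.content.Context
    import android.os.UserManager

    // Sketch: true while the calling user's credential-encrypted storage is
    // unlocked; false before that user (e.g. a work profile) has been
    // unlocked and had its keys loaded.
    fun isStorageUnlocked(context: Context): Boolean {
        val userManager = context.getSystemService(UserManager::class.java)
        return userManager.isUserUnlocked
    }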
Disappointing to hear that mobile disk encryption is subpar; I have heard of stuff like Cellebrite but didn't know much about it.
> An open source program maintained in the open by the users of that program is going to be safer than a hostile proprietary program kept in the sandbox of Flatpak, Snap or whatever.
The number of users who have the motivation (let alone skill) to maintain their programs is approximately zero. This is how you get vulnerabilities like the log4j ones, which exposed almost the whole machine. This is how you get malware from npm packages. OSS works as long as lots of people are constantly paying close attention, but that's a tall order for the majority of OSS. "Just use a massive OSS project" isn't even feasible in many real use cases. So no, I don't think it's safer.