Duplicates aren’t always bad… some files naturally exist in many places, and removing them from some of those places makes that directory/app incomplete.
If you do want to save space by storing one copy of the bits/blocks, and still retain an index of all the original locations… you can store all your backups on a ZFS file system with dedup turned on (this uses memory and has performance implications).
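As a rough sketch of what that looks like, assuming a hypothetical pool/dataset named `tank/backups` (the names are illustrative, and you’d want to check the memory tradeoff before committing):

```shell
# Simulate dedup on the existing pool first: the dedup table has to fit
# in RAM, and a poor ratio isn't worth the memory/performance hit.
zdb -S tank

# Dedup is a per-dataset property, so enable it for the backup dataset only.
zfs set dedup=on tank/backups

# Confirm the property; `zpool list` shows the realized dedup ratio
# in its DEDUP column once data starts landing on the dataset.
zfs get dedup tank/backups
zpool list tank
```

Note that dedup only applies to blocks written after the property is set; existing data isn’t retroactively deduplicated.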
Or back everything up with restic:
https://github.com/restic/restic
…restic stores files encrypted in a tree keyed by hash, so it naturally stores one copy of each file with as many references to it as needed. It has lookup and list functions that can tell you what’s duplicated.
To simply find/report dups to be dealt with manually, you could quite easily md5/SHA-1 your entire file tree, storing the output in a text file, which you can then pipe through sort, awk, and uniq to see which hashes occupy multiple lines… this is labor-intensive… I just let my backup tools “compress” by saving one copy of each hash, and then it doesn’t matter as much (in my opinion).
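That manual hash-and-sort approach can be sketched in a few lines of shell, assuming GNU coreutils (`sha1sum`; on macOS/BSD substitute `shasum`). The `/tmp` file names are illustrative:

```shell
# Hash every file under the current directory into a sorted list.
# (Write the list outside the tree being scanned so it doesn't hash itself.)
find . -type f -exec sha1sum {} + | sort > /tmp/hashes.txt

# Keep only the hashes that occupy more than one line.
awk '{print $1}' /tmp/hashes.txt | uniq -d > /tmp/dup-hashes.txt

# Map the duplicated hashes back to their full hash+path lines.
grep -F -f /tmp/dup-hashes.txt /tmp/hashes.txt
```

Because the list is sorted by the hash column first, `uniq -d` only has to spot adjacent repeats, and `grep -F -f` then recovers the paths for each duplicated hash.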
If it’s pictures or some other specific file type that you want to focus on most… I’d pick an app that’s intended for cataloging those. Example: Adobe Lightroom shows me my duplicate pics and I can deal with them easily there.
You would need to take your access point and wave it around in various directions in the space around the microwave. The “leak” could occur in a direction that doesn’t have significant signal. A better test might be to heat a large bowl of water while testing your phone (outside the oven) on 2.4 GHz… holding it on all sides of the oven to see if any areas degrade the signal. Even then, this testing approach isn’t that conclusive.
Agreed. I just tested with two phones: one phone timed out but the other was able to maintain a connection. That would suggest that my microwave may be leaking. However, I’m able to use the microwave without any noticeable effect on 2.4 GHz devices.
Epik is an interesting company. Their previous CTO (who came from an acquisition, BitMitigate, which is/was a sad excuse for an nginx reverse proxy on Voxility) is a script kiddie who DDoSed Trump and the Trump campaign site with a message along the lines of (paraphrased) “you should buy DDoS protection from me” open in a Notepad instance.