Why do people use xz anyway? As for me, I just use tar.gz when I need to back up a piece of a Linux file system into a universally compatible archive, zip when I need to send some files to a non-geek, and 7z to back up a directory of plain data files for myself. And I dream of the world just switching to 7z altogether, but that hardly seems possible, as nobody appears interested in adding tar-like unix-specific metadata support to it.
xz has substantially better compression than gz or bz2, especially with the -9e flags. You can use all your cores with -T0, or set an explicit number of threads. I find it to be on par with 7-zip.
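For reference, that looks something like this (directory and file names here are just placeholders):

    # rough sketch: best compression, all cores; "mydir" is a placeholder
    tar -cf - mydir | xz -9e -T0 > mydir.tar.xz
    # or compress an existing file in place
    xz -9e -T0 bigfile.tar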
Perhaps folks are trying to stick with packages that are in their base repo. p7zip is usually outside of the standard base repos.
"Substantially" is a relative term. There are niche cases, but how many people really care, or need to care, about the last few bytes that can be squeezed out?
Packing a bunch of files together as a .tgz is quite universal, and it compresses most of the redundancy out. It has some pathological cases, but those are rare, and for general files it's still in the same ballpark as other compressors.
I remember using .tbz2 around the turn of the millennium, because back then download/upload times really did matter, and in some cases it was actually faster to compress with bzip2 and then send less data over the wire.
But DSL broadband pretty much made that stop mattering: transfers became fast enough that I don't think I've downloaded or created a .tbz2 archive in years. Good old .tgz is more than enough. Files are usually copied in seconds instead of minutes, and really big files still take hours and hours either way.
None of the compressors really turn a 15-minute download into a 5-minute download consistently. And the download is likely to be fast enough anyway. Disk space is cheap enough that you haven't needed the best compression methods for ages just to cram as much data as possible onto portable or backup media.
Ditto for p7zip. It has more features and compresses faster and better, but for all practical purposes zip is just as good. Even though it's slower, the archive still takes only moments to create and transfer, and it unzips virtually everywhere.
I never thought bz2 was worth it over gzip, but xz is much, much better in many common cases (particularly text files, but other things too). Source code can often be xz-compressed to about half the size gzip manages. If you are downloading multiple things at once, or a whole operating system, or uploading something, then even on slower DSL lines it makes a huge difference IMO. I wish more package systems provided deltas.
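If you want to see the difference on your own files, a quick comparison looks something like this (the directory name is made up, and -k needs a reasonably recent gzip):

    tar -cf src.tar some-source-tree/
    gzip -k -9 src.tar       # keeps src.tar, writes src.tar.gz
    xz -k -9e src.tar        # keeps src.tar, writes src.tar.xz
    ls -l src.tar.gz src.tar.xz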
The only issue I've had with xz is that, unlike other utilities, it doesn't notice when it isn't actually compressing the data and fall back to storing it uncompressed. So if you try to xz a tar file full of already highly compressed media files, it both takes forever and leaves you with a nontrivially larger file than you started with.
Also, I like that, unlike gzip, xz can SHA-256 the uncompressed data if you use the -C sha256 option, which gives a good integrity check. Yes, I really do want a format that doesn't silently decompress incorrect data, and I can't understand why the author of this article thinks that is a bad thing. For backups I keep an mtree file inside the tar file with a SHA-512 of each file, and then use the -C sha256 option so I can easily test the compressed tar file without needing a separate checksum file. In some cases I encrypt the .txz with the scrypt utility (which stores an HMAC-SHA256 of the encrypted data).
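Roughly, the workflow I mean looks like this (file names are made up; mtree here is the BSD tool, whose checksum keyword name varies between implementations, and scrypt is Tarsnap's utility):

    mtree -c -K sha512digest -p ./data > data.mtree    # per-file SHA-512 manifest
    tar -cf backup.tar data.mtree data/
    xz -C sha256 backup.tar                            # writes backup.tar.xz with a SHA-256 check
    xz -t backup.tar.xz                                # verify integrity without extracting
    scrypt enc backup.tar.xz backup.tar.xz.enc         # optional encryption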
A major problem with zip is the "codepage hell" around file names (it has been almost eradicated in browsers but still lives on in zip archives, e-mails and non-.NET Windows programs). With 7z you always know nobody is going to have problems decoding the names of the files inside it, whatever languages they are in, regardless of the system locale.