...relative to ... ? Is it better than lzip? lzip sounds like it would also use LZMA-based compression, right? This [1] sounds like an interesting and more detailed/up-to-date comparison. Also by the same author BTW.
Relative to the compression formats people were aware of at the time (which didn't include lzip.)
People began using xz mostly because they (e.g. distro maintainers like Debian) had started seeing 7z files floating around, thought they were cool, and so wanted a format that did what 7z did but was an open standard rather than being dictated by some company. xz was that format, so they leapt on it.
As it turns out, lzip had already been around for a year (though I'm not sure in what state of usability) before the xz project was started, but the people who created xz weren't looking for just anything that compressed better; they were looking for something that compressed better the way 7z did, and xz is that.
(Meanwhile, what 7z/xz is actually better at, AFAIK, is long-range identical-run deduplication; this is what makes it the tool of choice in the video-game archival community for making archives of every variation of a ROM file. Stick 100 slight variations of a 5MB file together into one .7z (or .tar.xz) file, and they'll compress down to roughly 1.2x the size of a single variant of the file.)
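You can see the long-range-dedup effect with nothing but Python's stdlib lzma module (the "ROM variant" setup here is a made-up toy, not any real archival workflow): concatenate a bunch of near-identical blobs, tar-style, and compress them as one stream. Because the preset-9 dictionary (64 MiB) spans the whole input, every variant after the first compresses down to almost nothing.

```python
import lzma
import random

# Toy stand-in for "100 slight variations of a ROM": 20 copies of an
# incompressible base, each with one byte flipped.
random.seed(0)
base = random.randbytes(200_000)
variants = []
for i in range(20):
    b = bytearray(base)
    b[i * 1000] ^= 0xFF  # make each variant slightly different
    variants.append(bytes(b))
blob = b"".join(variants)  # concatenated, tar-style

single = lzma.compress(base, preset=9)    # one variant alone
together = lzma.compress(blob, preset=9)  # all 20 in one stream
# The 20-variant blob compresses to barely more than one variant,
# because later variants are long-range matches against the first.
print(len(single), len(together))
```

A gzip-style compressor with its 32 KB window can't do this: by the time the second variant starts, the first has long since left the window.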
7z, xz, and lzip all use the same compression algorithm (LZMA). The differences between them are in the container format that stores the data, not in the compression of the data. 7z is akin to zip in that it functions as an archive in addition to compressing the data. xz and lzip both accomplish the same goal, which is to store the LZMA-compressed stream while letting some other tool handle archival if desired, as is traditional on unixy systems: archival is usually handled by tar (though you will sometimes run into cpio or ar archives), while compression is handled by gzip (same compression algorithm as zip), bzip2, or whatever else.
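Python's lzma module makes the algorithm/container split concrete. The stdlib can't write the lzip container, so the legacy .lzma ("alone") format stands in here as a second wrapper; the point is the same LZMA stream, different framing:

```python
import lzma

data = b"same algorithm, different containers\n" * 2000

# Same compressor, two container formats around the output.
as_xz = lzma.compress(data, format=lzma.FORMAT_XZ)        # .xz framing
as_alone = lzma.compress(data, format=lzma.FORMAT_ALONE)  # legacy .lzma framing

# Sizes differ only by container overhead (headers, checksums, index),
# not by how well the payload compressed.
print(len(as_xz), len(as_alone))
```

That per-container overhead is exactly what the empty-file and hosts-file comparisons below are measuring.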
Thus, the proposed benefits of the compression ratio apply equally to lzip as they do to 7z and xz. When the article talks about shortcomings of the xz file format compared to the lzip file format, it's talking about file structure and metadata, not compression algorithm. Just running some informal comparisons on my machine, an empty (zero byte) file results in a 36 byte lzip file and a 32 byte xz file, while my hosts file of 1346 bytes compresses to a 738 byte lzip file and a 772 byte xz file. An mtree file listing my home directory comes to 268 Mbytes uncompressed, resulting in an 81M lzip file and an 80M xz file (a difference of 720 Kbytes, less than 0.9% overhead). Suffice it to say, the compression of the two formats is comparable. Yet, the lzip file format also has the advantages discussed in the article.
That said, for long-term archival I wouldn't use any of the above: I prefer zpaq, which offers better compression than straight LZMA, along with dedup, journaling, incremental backup, append-only archives, and some other desirable features for archival. Together with an mtree listing (to capture some metadata that zpaq doesn't record) and some error recovery files (par2 or zfec), this makes a good archival solution, though I hesitate to call it perfect.
Maybe xz is not good for long-term archiving, but it's both faster and produces smaller files in most scenarios. However, I'm sticking with gz for backups, mainly because of its speed and popularity. If I want to compress something to the smallest possible size without any regard for CPU time, then I use xz.
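The ratio side of that trade-off is easy to sanity-check from the stdlib (the log-line sample is a toy; real results depend heavily on the data):

```python
import gzip
import lzma

# Redundant, log-like input; gzip (DEFLATE) vs xz (LZMA) at max level.
data = b"".join(b"2024-01-01 12:00:%02d INFO request served\n" % (i % 60)
                for i in range(20_000))

as_gz = gzip.compress(data, compresslevel=9)
as_xz = lzma.compress(data, preset=9)
# xz typically wins on size here; gzip typically wins on time.
print(len(data), len(as_gz), len(as_xz))
```

The speed gap is the reason gz keeps winning for routine backups: DEFLATE is dramatically cheaper per byte, and nearly everything can read it.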
[1] https://www.nongnu.org/lzip/lzip_benchmark.html#xz