Interesting that the warez scene adopted RAR as the standard for packaging up pirated works. The alternatives like 7z save more bytes, yet RAR somehow became the winner. Anyone know why?
RAR's initial release was 1993, 7z's 1999. IIRC I started seeing RARs in the scene around 1995 or 1996. My take: RAR handled splitting into multiple arbitrary-sized volumes more gracefully than ARJ/ARC, which were CLI-first and never had a Windows GUI as nicely polished as WinRAR. SFVs made error checking and redownloading corrupted volumes easy without relying on the compression format to handle it. ACE just came too late.
Edit: another major use case overlooked here: a considerable number of media applications will stream files from inside a RAR archive without unarchiving them manually first, making them more accessible on file storage sites like mega or 1fichier without carrying the external appearance/negative baggage of an mp4 or mkv.
> I started seeing RARs in the scene around 1995 or 1996. My take: RAR handled splitting into multiple arbitrary-sized volumes more gracefully than ARJ/ARC, which were CLI-first and never had a Windows GUI as nicely polished as WinRAR.
Of course there were still people who screwed that up. I remember downloading some software once that came in the form of a multi-volume RAR, maybe 20 floppy disk-sized RARs. Download all 20 of them, get them in the same directory, then un-rar them, and lo and behold out the other end comes a single .RAR file that was inside! So I un-rar that single RAR, and out the other end comes 20 separate floppy disk images. Obviously the "scene" distributors weren't always the computing world's best and brightest...
What really bothers me now are movies being distributed over torrents as a multi-volume RAR.
Just...why?
The BitTorrent protocol will detect corrupt data and fix it at the piece level. You won't have to redownload an entire file.
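For a sense of what "piece level" means: a .torrent stores a SHA-1 hash for every fixed-size piece, and the client re-requests any piece whose hash doesn't match. A minimal sketch of that check (parsing the bencoded metadata is left out; assume the expected digests are already in hand):

    import hashlib

    def piece_is_valid(piece_bytes, expected_sha1_digest):
        # BitTorrent v1 stores a 20-byte SHA-1 digest per piece in the
        # .torrent's "pieces" field; the client hashes each piece it receives
        # and re-requests only the ones that don't match.
        return hashlib.sha1(piece_bytes).digest() == expected_sha1_digest

    # Hypothetical usage, with `pieces` and `expected` pulled from a .torrent:
    # bad = [i for i, p in enumerate(pieces) if not piece_is_valid(p, expected[i])]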
Splitting a file into pieces made sense 20+ years ago, when connections (both your physical ISP link and the logical TCP connection) were unstable, web servers didn't always support resuming downloads, and software didn't handle unexpected disconnections gracefully, but those days are long behind us. We transfer data over encrypted channels with per-packet checksums. We use software that handles disconnections, automatically reconnects, and resumes where it left off, not to mention detects when data went bad and re-downloads just the bad part.
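Resuming is a one-header affair these days. A rough sketch of the kind of thing any modern downloader does under the hood, assuming the server honors Range requests (the URL and filename below are made up):

    import os
    import requests

    def resume_download(url, path):
        # Ask the server for only the bytes we don't have yet.
        pos = os.path.getsize(path) if os.path.exists(path) else 0
        headers = {"Range": f"bytes={pos}-"} if pos else {}
        with requests.get(url, headers=headers, stream=True, timeout=30) as r:
            r.raise_for_status()  # 206 Partial Content when the server resumes
            mode = "ab" if r.status_code == 206 else "wb"
            with open(path, mode) as f:
                for chunk in r.iter_content(chunk_size=1 << 16):
                    f.write(chunk)

    # resume_download("https://example.com/big.mkv", "big.mkv")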
There's just no damn reason to split a 5 GB .mkv into a hundred 50 MB .rar files, which will then take my poor RPi 15 minutes to decompress.
I think what we're seeing can be chalked up to distributors adhering to their group's internal rules, which just haven't changed since the 90s. Another likely "scene rule" that lasted a long while was encoding movies to hit a particular size. The quality was tuned up or down so that every release was exactly 700 MB. Why 700 MB? Well, it turns out there is an ancient form of data storage called the CD-ROM, which is limited to 700 MB, and the rule (or habit) of aiming for that size just never changed.
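Hitting an exact 700 MB wasn't magic, just arithmetic: take the target size, subtract the audio track, and spread what's left over the runtime to get the video bitrate for a two-pass encode. A back-of-the-envelope version (the numbers are illustrative, not any group's actual settings):

    def target_video_kbps(size_mb, runtime_min, audio_kbps):
        # Total bit budget for the file, minus what the audio uses, spread
        # over the runtime. Ignores container overhead for simplicity.
        total_kbits = size_mb * 8 * 1024
        audio_kbits = audio_kbps * runtime_min * 60
        return (total_kbits - audio_kbits) / (runtime_min * 60)

    # A 105-minute movie with 128 kbps MP3 audio, squeezed onto one 700 MB CD:
    print(round(target_video_kbps(700, 105, 128)))  # ~782 kbps for the video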
It's just a part of the scene. Momentum is a hell of a thing to stop in that respect. I don't know if 7-Zip or any other format will spit out chunked 50 MB or 100 MB part files for "easier" distribution. I say "easier" because it's the poor man's way of sending files without using offsets to resume downloads.
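The "poor man's" part really is that simple: chop the file into fixed-size chunks so a bad or interrupted transfer only costs you one chunk, then concatenate them on the other end. A toy illustration of the idea (not any archiver's actual on-disk format):

    def split_file(path, chunk_size=50 * 1024 * 1024):
        # Writes path.part000, path.part001, ... each at most chunk_size bytes.
        with open(path, "rb") as src:
            index = 0
            while True:
                chunk = src.read(chunk_size)
                if not chunk:
                    break
                with open(f"{path}.part{index:03d}", "wb") as dst:
                    dst.write(chunk)
                index += 1

    def join_files(parts, out_path):
        # Concatenate the parts back together, in order. Reading each part
        # whole is fine for a toy; stream in blocks for anything serious.
        with open(out_path, "wb") as dst:
            for part in parts:
                with open(part, "rb") as src:
                    dst.write(src.read())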