The ultimate answer is accessibility, with a secondary answer of purpose.

Reasons why XZ is not necessarily as suitable as Gzip:

Embedded and legacy systems are far more likely to lack sufficient available memory to decompress LZMA/LZMA2 archives such as XZ. As an example, if XZ can shave 400 KiB (vs. Gzip) off of a package destined for an OpenWrt router, what good is the minor space savings if the router has 16 MiB of RAM? A similar situation appears with very old computer systems. One might scoff at the thought of downloading and compiling the latest version of Bash on an ancient SparcStation LX with 32 MB of RAM, but it happens. Such systems usually have slow processors, and decompression time increases can be very high. Three seconds extra to decompress on your Core i5 can be severely long on a 200 MHz ARM core or a 50 MHz microSPARC. Gzip decompression is extremely fast on such processors when compared to better compression methods such as XZ or even Bzip2.

Compression is useless without the ability to decompress it. Gzip is pretty much universally supported by every UNIX-like system (and nearly every non-UNIX-like system too) created in the past two decades.

If compression time is more important than compression ratio, Gzip beats XZ. I routinely shuffle folders quickly across a trusted LAN connection with commands such as "tar -c * | lzop -1 | socat -u - tcp-connect:192.168.0.101:4444", and Gzip could be used similarly over a much slower link (i.e. doing the same thing I just described through an SSH tunnel over the Internet). Honestly, lzop is much faster than Gzip and still compresses okay, so applications that need the fastest compression possible and don't require Gzip's ubiquity should look at that instead.

Now, on the flip side, there are situations where XZ compression is vastly superior: the Linux 3.7 kernel source code is 34 MiB smaller in XZ format than in Gzip format.
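The streaming idea behind that LAN pipeline can be demonstrated without any network at all. The sketch below replaces socat/ssh with a plain local pipe and uses gzip -1 in place of lzop -1 (the directory and file names are arbitrary examples, not from the article):

```shell
#!/bin/sh
# Local sketch of the streaming pattern: tar -> compressor ->
# decompressor -> tar. The LAN pipeline does exactly this, with a
# network transport (socat or ssh) sitting between the two halves.
set -e
mkdir -p src dest
echo "hello" > src/file.txt

# gzip -1 favors speed over ratio, the same tradeoff lzop -1 makes.
tar -C src -c . | gzip -1 | gzip -d | tar -C dest -x

cat dest/file.txt    # the file arrives intact on the other side
```

Because every stage streams, nothing is ever written to a temporary archive; the data is compressed, transferred, and unpacked in flight.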
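The memory argument is easy to see firsthand. A minimal sketch, assuming gzip and xz are installed (the file name and sample data are arbitrary; exact sizes vary by version and input):

```shell
#!/bin/sh
# Compare gzip -9 and xz -9 on the same input, then show why an
# xz -9 archive is painful on a memory-constrained box.
set -e
seq 1 200000 > sample.txt      # compressible test data

gzip -k -9 sample.txt          # -> sample.txt.gz (keeps the original)
xz   -k -9 sample.txt          # -> sample.txt.xz (keeps the original)

wc -c sample.txt sample.txt.gz sample.txt.xz

# xz -9 records a 64 MiB dictionary in the archive, so decompression
# needs roughly 65 MiB of RAM no matter how small the input was.
# Capping the decompressor at 16 MiB (an OpenWrt-class router) makes
# xz refuse to run:
if xz -t --memlimit=16MiB sample.txt.xz 2>/dev/null; then
    echo "decompressed within 16 MiB"
else
    echo "16 MiB is not enough to decompress an xz -9 archive"
fi
```

On this kind of input xz wins the ratio contest handily, but the memory-limit test fails, which is the whole point: the smaller archive is worthless on a machine that cannot afford the dictionary.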