Only the normal mode was tested because it is faster. Times are taken from the 'user' line of the `time` command's output and rounded. Because of this, the compression and decompression time and speed tables should be taken as indicative rather than as absolute truth.
In practice, the bigger test files should be more reliable for speed comparisons. When reading the tables, it is important to keep in mind which settings are the default in each program. In this test bzip2 is a tough adversary for lzmash in the fast modes. The XMMS source file was first gunzipped to its uncompressed size before testing. For some reason, "bzip2 -6" took more time than even "bzip2 -9".
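The timing method described above can be sketched as a small script. The sample file, the loop, and the chosen levels below are illustrative placeholders, not the article's actual benchmark:

```shell
#!/bin/bash
# Sketch of the timing method: take the 'user' line from `time -p`
# for each compressor/level pair. The input file is a made-up sample.
yes "a reasonably compressible line of benchmark text" | head -n 20000 > sample.txt

for cmd in "gzip -1" "gzip -9" "bzip2 -9"; do
    cp sample.txt work.dat
    # `time -p` prints real/user/sys on stderr; keep only the 'user' figure
    user=$( { time -p $cmd -f work.dat ; } 2>&1 | awk '/^user/ {print $2}' )
    echo "$cmd: user ${user}s"
    rm -f work.dat work.dat.gz work.dat.bz2
done
```

Note that `time -p` reports even when the timed command fails, so a missing tool shows up as a near-zero time rather than silently breaking the loop.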
The result did not change when the test was repeated. From your data, it looks like xz is worth testing if you have already tried bzip2 and gzip. PNG is probably your best bet for lossless image compression; it should do better than generalised compressors. I did some extremely thorough tests of this and found that xz -1 is nearly always a better drop-in for gzip -9, with quite impressive gains for little to no increase in cost. If you run tests on a lot of data you might notice that xz is a bit special.
Sometimes higher compression levels will slightly worsen the compression ratio. As for how bzip2 relates to this, I am not sure. Thanks for the information. I agree, bzip2 does not seem very useful anymore now that xz is around. As far as I know, xz only has parallel compression, not parallel decompression. There was also a post by Antonio Diaz Diaz concerning the longevity of data compressed by xz versus, say, bzip2.
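The commenter's xz -1 versus gzip -9 claim is easy to spot-check with a sketch like the following. The sample file is illustrative, not the commenter's data set, and the resulting percentages will vary with real input:

```shell
#!/bin/bash
# Quick sketch comparing xz -1 against gzip -9 on one sample file.
# 'sample.txt' is an illustrative input, not the commenter's data.
yes "some moderately repetitive benchmark text" | head -n 50000 > sample.txt

gzip -9 -c sample.txt > sample.txt.gz
xz   -1 -c sample.txt > sample.txt.xz

orig=$(wc -c < sample.txt)
for f in sample.txt.gz sample.txt.xz; do
    size=$(wc -c < "$f")
    # report compressed size as a percentage of the original (smaller is better)
    awk -v f="$f" -v s="$size" -v o="$orig" \
        'BEGIN {printf "%s: %d bytes (%.2f%% of original)\n", f, s, 100*s/o}'
done
```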
This would be quite interesting for archivists, but I imagine keeping backups of the original uncompressed data obviates the concern. Excellent work here, but I think redefining compression ratio to fit your explanation is a little confusing. In your conclusion it may be a good idea to state under which conditions it is wise to change compressors. Great article! This answered a lot of questions for me. Will probably be using xz in the future :) Thanks!
Note this may be biased; I did not read it. Good article. One suggested improvement: add a chart showing the overall performance of the three methods, with time on the x axis and compression ratio on the y axis. That would make the trade-off between the options easy to see.
RootUsers: Guides, tutorials, reviews and news for System Administrators.
The Benchmarking Process. Bring on the compression wars! Gather together a variety of compression tools, test them head-to-head against a variety of file types, and see how they perform. A few different file types need to be involved, as certain files compress more easily than others. For example, text files should compress a lot more than video, because video codecs already contain compression algorithms. I have therefore split the test into the following categories:
Each test has been run from a script 10 times and an average taken to make the results as fair as possible. Although there are many compression tools available, I decided to use the five that I consider the most common.
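The "run each test 10 times and average" methodology can be sketched as follows. The input file, the use of `gzip -9`, and the nanosecond timing via GNU `date` are placeholders standing in for the article's unpublished script:

```shell
#!/bin/bash
# Sketch of the averaging methodology: time the same command 10 times
# and report the mean. Input file and compressor choice are placeholders.
runs=10
printf 'benchmark line %.0s' {1..50000} > input.dat

total_ns=0
for i in $(seq "$runs"); do
    start=$(date +%s%N)          # GNU date: seconds + nanoseconds
    gzip -9 -c input.dat > /dev/null
    end=$(date +%s%N)
    total_ns=$(( total_ns + end - start ))
done
avg_ms=$(( total_ns / runs / 1000000 ))
echo "average over $runs runs: ${avg_ms} ms"
```

Averaging over repeated runs smooths out cache and scheduler noise, which is why single-run figures (like the rounded 'user' times above) should only be read as indicative.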
We also have the option of using the zless program, which performs the same function as the previous pipes. Another advantage of gzip to keep in mind is that it supports compression levels; three are commonly used, as below. To compress the file called ubunlog, we run gzip at the chosen level. Another possibility that gzip offers us is concatenating multiple compressed files into one. We can do this in the following way: the two commands above compress ubunlog1 and ubunlog2 into a single file.
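A minimal sketch of both features, levels and concatenation, assuming two small illustrative files (the original tutorial's exact commands are not reproduced here):

```shell
#!/bin/bash
# Sketch of the two gzip features described: compression levels and
# concatenating compressed files. File contents are illustrative.
echo "first part"  > ubunlog1
echo "second part" > ubunlog2

gzip -1 ubunlog1   # --fast: least compression, produces ubunlog1.gz
gzip -9 ubunlog2   # --best: most compression, produces ubunlog2.gz

# gzip members can simply be concatenated; gunzip decompresses them in order
cat ubunlog1.gz ubunlog2.gz > combined.gz
gunzip -c combined.gz   # prints "first part" then "second part"
```

Concatenation works because a .gz file may contain multiple members back to back, and gunzip processes them all in sequence.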
We can view the contents of the files ubunlog1 and ubunlog2. The bzip2 program is very similar to gzip. The main difference is that it uses a different compression algorithm: Burrows-Wheeler block-sorting text compression combined with Huffman coding. Files compressed with bzip2 end with the extension .bz2.
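A minimal bzip2 sketch mirroring the gzip examples above; the file name 'notes.txt' is a placeholder:

```shell
#!/bin/bash
# Basic bzip2 round trip: compress keeping the original, then
# decompress to stdout. 'notes.txt' is an illustrative file.
echo "some text to compress" > notes.txt
bzip2 -f -k notes.txt        # -k keeps notes.txt; creates notes.txt.bz2
bunzip2 -c notes.txt.bz2     # decompress to stdout
```

The command-line interface is deliberately gzip-compatible, which is why the two tools feel interchangeable despite the different algorithms underneath.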