f4df Wrote:
Stone Wrote:
So, your contention is that every 1 and 0, after it is unpacked, is identical to the file before it was packed? I know this is theoretically possible, but my understanding is that these programs use some rounding algorithms to compress the files, which to me means the files cannot end up exactly like the original file.
Yes, that is what lossless means. If a computer program came back from decompression missing a single letter or line of code... no workee, right? It has to be lossless.
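You can check the bit-for-bit claim yourself with a quick Python sketch using zlib (the same DEFLATE family that ZIP uses); the input bytes here are arbitrary, just for illustration:

Code:
import os
import zlib

# Arbitrary input: some random bytes plus some repetitive text
original = os.urandom(1000) + b"some structured text " * 50

compressed = zlib.compress(original)      # pack
restored = zlib.decompress(compressed)    # unpack

# Lossless means the round trip is bit-for-bit identical
assert restored == original
print(len(original), "->", len(compressed), "bytes")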
Compression can simply mean a more economical representation of information. So instead of saying
a) "I like bananas because bananas make me go bananas" - you could represent it like
b) "I like *s because *s make me go *s" along with the info that says something like: c) "* = banana"
When the 'compressed' file is processed it simply rewrites each * as 'banana' on output. Obviously the savings (amount of compression) is roughly len(a) - (len(b) + len(c)). That would be an example of lossless compression.
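That substitution idea fits in a few lines of Python; pack/unpack are made-up names for this sketch, not any real library:

Code:
# Toy dictionary substitution, like the banana example above
def pack(text, word, marker="*"):
    # body with every occurrence of word replaced by the marker,
    # plus the dictionary entry ("* = banana") that rides along
    return text.replace(word, marker), (marker, word)

def unpack(body, entry):
    marker, word = entry
    return body.replace(marker, word)

a = "I like bananas because bananas make me go bananas"
b, c = pack(a, "banana")
assert unpack(b, c) == a                  # lossless round trip
print(b)                                  # I like *s because *s make me go *s
print("savings:", len(a) - (len(b) + len(c[0]) + len(c[1])))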
Another way to represent information more economically is to not represent all of it, but only what functionally matters. So for instance, mp3 compression works off psychoacoustic models of human hearing to determine which bits of information can be left out without affecting (or only minimally affecting) perception. JPEG etc. work on similar principles of approximating the input.
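For contrast, here's the lossy principle in miniature; this is just a toy quantizer that throws away precision, not real mp3 or JPEG:

Code:
# Store each sample as a small integer: smaller, but not reversible
samples = [0.1234, -0.5678, 0.9999, 0.0001]

def quantize(x, levels=16):
    return round(x * levels)              # keep only coarse information

def dequantize(q, levels=16):
    return q / levels                     # best-effort reconstruction

packed = [quantize(s) for s in samples]
restored = [dequantize(q) for q in packed]
print(restored)                           # close to the input, not identical
assert restored != samples                # information was discarded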
Well, I understand this, but I also understood (although maybe incorrectly) that compression/decompression was not perfect. A quick search on the innernets resulted in this, which was what my understanding was before we started this debate:
Quote:
An algorithm that is asserted to be able to losslessly compress any data stream is probably impossible. While there have been many claims through the years of companies achieving "perfect compression" where an arbitrary number of random bits can always be compressed to N-1 bits, these kinds of claims can be safely discarded without even looking at any further details regarding the purported compression scheme.
But admittedly, that does not mean that an algorithm cannot compress a certain specific data stream losslessly, which is maybe what ZIP and RAR do.
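You can see the quoted claim in action: hand a lossless compressor truly random bytes and it won't shrink them, while redundant text shrinks a lot (again a Python/zlib sketch; exact sizes will vary run to run):

Code:
import os
import zlib

random_data = os.urandom(10_000)          # no structure to exploit
text_data = b"I like bananas because bananas make me go bananas. " * 200

print("random:", len(random_data), "->", len(zlib.compress(random_data)))
print("text:  ", len(text_data), "->", len(zlib.compress(text_data)))
# The random input typically comes back slightly *larger*;
# ZIP and RAR only win on streams with exploitable redundancy.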