Quantization (High Data Loss)
Quantization adapts the encoding precision of the data to the capacity of human perception. Because the eye cannot resolve changes in fine detail (such as the parked cars in the image) very well, the observer does not notice the slightly reduced display precision. On close inspection, however, a certain softening effect can be seen that smudges sharp edges. In addition, visible so-called artifacts occur if the degree of quantization is too high.
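As a minimal sketch of the idea, the following Python function maps 8-bit sample values (0–255) onto a smaller number of levels; the number of levels is an illustrative parameter, not taken from the text. With few levels, nearby values collapse into the same output, which is exactly the precision loss and banding described above.

```python
def quantize(values, levels):
    """Quantize 8-bit samples (0-255) down to `levels` discrete steps.

    Each input value is snapped to the midpoint of its step. Fewer
    levels mean coarser precision: fine detail is lost, and with very
    few levels visible banding (artifacts) appears.
    """
    step = 256 / levels
    return [int(int(v / step) * step + step / 2) for v in values]

# Nearby values become indistinguishable after coarse quantization:
samples = [12, 13, 14, 200, 201, 202]
print(quantize(samples, 4))  # → [32, 32, 32, 224, 224, 224]
```

Note that the three slightly different dark values and the three slightly different bright values each map to a single level: the fine differences are discarded, which is where the data loss comes from.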
Code Optimization 1: Run Length Encoding
Run Length Encoding groups elements that occur repeatedly and encodes each group with a count value. Because the counter itself also requires space, elements that occur only twice or three times remain uncoded. This type of compression is used in the graphics field, for example to store smooth surfaces with a minimal byte count.
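A simple Python sketch of this scheme, assuming a threshold of four repeats (matching the point above that runs of two or three are not worth coding), could look like this:

```python
from itertools import groupby

def rle_encode(data, min_run=4):
    """Run Length Encoding sketch.

    Runs of at least `min_run` identical elements are replaced by a
    (value, count) pair; shorter runs stay uncoded, since storing the
    counter would cost more space than it saves.
    """
    out = []
    for value, group in groupby(data):
        count = len(list(group))
        if count >= min_run:
            out.append((value, count))   # long run: store count once
        else:
            out.extend([value] * count)  # short run: leave literal
    return out

print(rle_encode("AAAAAABBCDDDDD"))
# → [('A', 6), 'B', 'B', 'C', ('D', 5)]
```

The long runs of 'A' and 'D' are collapsed to count pairs, while the short 'BB' and single 'C' are kept as-is, just as on a smooth image surface long runs of identical pixels dominate.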
Code Optimization 2: Huffman Encoding
The Huffman method encodes frequently occurring elements with few bits and rare ones with more bits. The frequency with which each element occurs determines its respective bit encoding.
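A compact Python sketch of Huffman code construction follows; the heap-based tree building is the standard textbook algorithm, and the tie-break counter is an implementation detail added here so that partial trees are never compared directly.

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Build a Huffman code for `text`.

    Frequent symbols receive short bit strings, rare symbols longer
    ones. Returns a symbol -> bit-string mapping.
    """
    freq = Counter(text)
    # Heap entries: (frequency, tie-break id, tree node).
    # A tree node is either a symbol or a (left, right) pair.
    heap = [(f, i, sym) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    next_id = len(heap)
    while len(heap) > 1:
        # Repeatedly merge the two least frequent subtrees.
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, next_id, (left, right)))
        next_id += 1
    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:
            codes[node] = prefix or "0"  # single-symbol edge case
    walk(heap[0][2], "")
    return codes

codes = huffman_codes("abracadabra")
# 'a' occurs 5 times, 'c' only once, so 'a' gets the shorter code.
print(sorted((s, c) for s, c in codes.items()))
```

Because the code is built from the actual symbol frequencies, the most common element always ends up near the root of the tree and thus with the fewest bits, which is exactly the principle stated above.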