We use it every day without realizing it. From the latest rap album to that new film everyone’s talking about (which you may have downloaded through questionable means, but hey, I’m not judging), digital compression is everywhere. We take these mathemagical, file-shrinking formulas for granted. For the average user, compression might mean saving a few hundred megabytes of hard drive space, but for large corporations housing massive databases of user-uploaded data, it can mean a difference to the tune of several million dollars. From Morse code assigning shorter sequences of dits and dahs to common letters like “e” and “a”, all the way to advanced motion-compensation-based video codecs like H.265 and VP9, data compression has indeed come a long way.

However, like all good things in life, data compression comes at a cost. First, it takes processing power to encode and decode compressed data. Every time you open a JPEG file, your computer isn’t just reading an image off disk; it’s rebuilding the image using fancy math like the discrete cosine transform (DCT) and entropy coding. Second, some compression schemes throw data away for good (they’re “lossy”). Lossy formats like JPEG and MP3 expose an adjustable trade-off between file size and quality. We’re going to be focusing on the latter drawback.
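To get a feel for that size-versus-quality trade-off, here’s a deliberately toy sketch (not a real codec like JPEG or MP3): we “lossily” quantize a byte signal to coarser and coarser steps, then compress the result with zlib. Coarser quantization loses more detail but leaves fewer distinct values, so the compressed output shrinks. All names here (`quantize`, `compressed_size`, the synthetic `signal`) are made up for illustration.

```python
import zlib

def quantize(samples, step):
    """Round each sample down to a multiple of `step` (lossy for step > 1)."""
    return [(s // step) * step for s in samples]

def compressed_size(samples):
    """Size in bytes after (lossless) zlib compression."""
    return len(zlib.compress(bytes(samples)))

# A slowly varying synthetic "signal": a smooth ramp with a little noise.
signal = [(i // 3 + (i * 7) % 5) % 256 for i in range(1000)]

for step in (1, 8, 32):  # step=1 keeps everything; larger steps are lossier
    q = quantize(signal, step)
    error = max(abs(a - b) for a, b in zip(signal, q))
    print(f"step={step:2d}  size={compressed_size(q):4d} bytes  max error={error}")
```

Running this, the maximum error grows with the quantization step (zero at step 1, since nothing is discarded) while the compressed size drops, which is exactly the knob a JPEG “quality” slider is turning, just with far cleverer math behind it.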