
Google open sources very slow compression algorithm

Your server will hate it but your mobe will love it

Google has open sourced a new compression algorithm called Zopfli that it says is a slower-but-stronger data squasher than the likes of zlib.

The product of Googler Lode Vandevenne's 20 per cent time, the day a week Google allows its staff to work on side projects, Zopfli is said to produce files 3.7–8.3 per cent smaller than its rivals' (PDF). The catch is that it takes one hundred times longer to do so.

Google's 'fessed up to that delay, noting that Zopfli needs two to three orders of magnitude more CPU time than its rivals and is therefore “best suited for applications where data is compressed once and sent over a network many times, for example, static content for the web.”

Over the years many have claimed to have created compression systems that cram data into wonderfully tiny bundles. Several have been revealed as scammers. So while Zopfli's gains are modest and come at a steep cost in time, eight per cent is still nothing to be sneezed at in applications like content delivery to mobile devices, where any reduction in the time radios spend working means better battery life and lower data bills. And while Zopfli takes ages to pack data, its output is standard deflate, so existing tools unpack it at the same speed as rival algorithms' output, meaning no processing penalty on the device.
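
To see that trade-off in code, here is a minimal round-trip sketch in C. It assumes the API declared in the project's zopfli.h header (ZopfliInitOptions, ZopfliCompress, ZOPFLI_FORMAT_ZLIB) plus a system zlib; the sample input, iteration count, and build command are illustrative, not gospel.

/* Round-trip sketch: pack with Zopfli, unpack with stock zlib.
 * Assumes zopfli.h's C API (ZopfliInitOptions, ZopfliCompress) and
 * links against both libraries, e.g.: cc roundtrip.c -lzopfli -lz  */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <zlib.h>
#include "zopfli/zopfli.h"

int main(void) {
    /* Illustrative input; the real wins come on larger, static web assets. */
    const unsigned char in[] =
        "static web content, compressed once, served many times over";
    size_t insize = sizeof(in);

    /* This is where the extra CPU time goes: numiterations is Zopfli's
       main quality/speed knob (more iterations, smaller output, slower). */
    ZopfliOptions options;
    ZopfliInitOptions(&options);
    options.numiterations = 15;

    unsigned char* packed = NULL;  /* ZopfliCompress allocates the output */
    size_t packedsize = 0;
    ZopfliCompress(&options, ZOPFLI_FORMAT_ZLIB, in, insize,
                   &packed, &packedsize);
    printf("%zu bytes in, %zu bytes out\n", insize, packedsize);

    /* The output is ordinary zlib-wrapped deflate, so plain zlib unpacks
       it at full speed -- no new code needed on the receiving device. */
    unsigned char unpacked[sizeof(in)];
    uLongf unpackedsize = sizeof(unpacked);
    if (uncompress(unpacked, &unpackedsize, packed, (uLong)packedsize) != Z_OK
        || unpackedsize != insize || memcmp(unpacked, in, insize) != 0) {
        fprintf(stderr, "round trip failed\n");
        free(packed);
        return 1;
    }
    puts("round trip OK");
    free(packed);
    return 0;
}

The expensive ZopfliCompress call is the part a server runs once per asset, at build or deploy time; the uncompress half is the ordinary zlib path that devices already take.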

Open sourcing the algorithm therefore makes a lot of sense, given that Google (and world+dog) is keen on getting more content into more mobile devices more often in more places, and that almost everyone who owns a mobile device wishes it would do things faster.

And let's not forget the looming spectrum crunch, predicted to push mobile data costs to unpleasant levels, a phenomenon a little more compression could ease.

Zopfli's code is available on Google Code. ®
