Google just launched Zopfli, a new open source compression algorithm that can compress web content about three to eight percent more densely (PDF) than the standard zlib library. Because Zopfli is compatible with the decompression algorithms already built into all modern web browsers, using Google’s new algorithm and library on a server could lead to faster data transmission and lower web page latencies, ultimately making the web a little bit faster.
The new algorithm, which Zurich-based Google engineer Lode Vandevenne created as a 20% project, is an implementation of Deflate – the same algorithm used in the ZIP and gzip file formats and the PNG image format. Zopfli’s output is compatible with zlib, but it uses a different and more effective method to compress data.
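That zlib compatibility is the key point: a Zopfli-compressed stream is decoded by the exact same code path as any other zlib-format deflate stream, so nothing changes on the client. Zopfli itself is a separate C library and is not in Python’s standard library, so the sketch below uses the stdlib `zlib` compressor as a stand-in for the compression side; the decompression call is the part that would be identical for Zopfli output.

```python
import zlib

# Deflate data in the zlib container format (RFC 1950), which Zopfli can
# also emit. The standard zlib compressor stands in for Zopfli here, since
# Zopfli is a separate C library; its output decompresses the same way.
original = b"static web content " * 100
compressed = zlib.compress(original, 9)

# The client side is unchanged: any zlib-format stream -- whether produced
# by zlib or by Zopfli -- is decoded by the same call.
restored = zlib.decompress(compressed)
assert restored == original
print(len(original), "->", len(compressed), "bytes")
```

This is why a server can adopt Zopfli unilaterally: only the encoder changes, and every existing decoder keeps working.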
As Vandevenne writes in the announcement today, “the exhaustive method is based on iterating entropy modeling and a shortest path search algorithm to find a low bit cost path through the graph of all possible deflate representations.”
There is, however, a price that needs to be paid for this: it takes significantly longer to compress files with Zopfli (decompression times are virtually the same, though). Indeed, as Vandevenne notes, “due to the amount of CPU time required — 2 to 3 orders of magnitude more than zlib at maximum quality — Zopfli is best suited for applications where data is compressed once and sent over a network many times, for example, static content for the web.”
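A smaller version of this effort-versus-density tradeoff already exists inside zlib itself, in its compression levels; Zopfli simply pushes much further along the same axis, spending far more CPU per byte for an extra few percent of density. A minimal stdlib sketch of the tradeoff, using zlib’s own levels as a rough stand-in for Zopfli’s extra effort:

```python
import zlib

payload = b"<html><body>" + b"<p>hello world</p>" * 500 + b"</body></html>"

fast = zlib.compress(payload, 1)   # cheap, looser compression
dense = zlib.compress(payload, 9)  # more CPU per byte, denser output
assert len(dense) <= len(fast)

# Either stream is decoded by the same call a client would use, so the
# extra compression effort is invisible to the decompression side.
assert zlib.decompress(fast) == zlib.decompress(dense) == payload
```

For static content that is compressed once and served many times, the one-time CPU cost of the denser encoder is amortized across every transfer, which is exactly the deployment Vandevenne recommends for Zopfli.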
Image credit: Volvo