Mozilla’s Mozjpeg Should Make Firefox Faster

    March 6, 2014
    Zach Walton

The JPEG has been around for more than 20 years now. When technology gets that old, you either take it out back or teach it some new tricks. Mozilla is opting for the latter even as it prepares for a future where the former is a reality.

Mozilla announced Wednesday that it’s working on a new project called mozjpeg that will improve JPEG compression without breaking browsers. The non-profit says it’s doing this because the modern Web uses more images than ever before, and images can really slow down a page’s loading time. With new compression techniques, Mozilla can decrease the time it takes Firefox to load a page full of images.

Even though it’s building mozjpeg, Mozilla doesn’t see JPEG remaining the dominant image format on the Web. Of course, moving to a new image format brings with it its own unique challenges, so mozjpeg is being built to improve JPEG encoding even while the Web moves toward a new format.

Production JPEG encoders have largely been stagnant in terms of compression efficiency, so replacing JPEG with something better has been a frequent topic of discussion. The major downside to moving away from JPEG is that it would require going through a multi-year period of relatively poor compatibility with the world’s deployed software. We (at Mozilla) don’t doubt that algorithmic improvements will make this worthwhile at some point, possibly soon. Even after a transition begins in earnest though, JPEG will continue to be used widely.

Given this situation, we wondered if JPEG encoders have really reached their full compression potential after 20+ years. We talked to a number of engineers, and concluded that the answer is “no,” even within the constraints of strong compatibility requirements. With feedback on promising avenues for exploration in hand, we started the ‘mozjpeg’ project.

If you want to try out mozjpeg for yourself, Mozilla released version 1.0 today. It’s a fork of libjpeg-turbo with “jpgcrush” functionality added for good measure. They found this combination can reduce JPEG file sizes by 10 percent, and they’re obviously hoping to improve on that with help from the community.

Image via Wikimedia Commons

  • guest

    Regarding data compression, it’s worth mentioning that there is a new approach to entropy coding (ANS): while Huffman coding is fast but inaccurate and arithmetic coding is accurate but slow, this one is both fast and accurate.
    Some implementation and sources: https://github.com/Cyan4973/FiniteStateEntropy/tree/master/data

  • Lorea

    The title of the article is a bit misleading. They are working on better compression for JPEG. It affects the encoding stage, and they want (at least for the moment) to keep the same decompression algorithms. Thus, there is not going to be anything special in Firefox to take advantage of this work.
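For readers curious about the ANS approach mentioned in the first comment, here is a toy range-ANS (rANS) encoder and decoder sketch in Python. It is a deliberately simplified illustration, not the linked FiniteStateEntropy code: it keeps the whole state in one Python big integer with no renormalization, and the symbol alphabet and frequencies (which must sum to a power of two, here 16) are made up for the demo.

```python
# Toy rANS (range Asymmetric Numeral Systems) round trip.
# Simplifications: the state is a single Python big integer, there is no
# renormalization or byte output, and symbol frequencies must sum to M.
# Illustrative only; real coders (e.g. FiniteStateEntropy) are far more
# engineered.

M = 16  # total of all symbol frequencies (a power of two)

def build_cum(freqs):
    """Cumulative frequency table: cum[s] = sum of frequencies before s."""
    cum, total = {}, 0
    for s in sorted(freqs):
        cum[s] = total
        total += freqs[s]
    assert total == M, "frequencies must sum to M"
    return cum

def rans_encode(msg, freqs, cum):
    """Fold each symbol into the integer state x."""
    x = 1  # initial state
    for s in msg:
        f = freqs[s]
        x = (x // f) * M + cum[s] + (x % f)
    return x

def rans_decode(x, n, freqs, cum):
    """Invert the encoder: pop n symbols back out of the state."""
    out = []
    for _ in range(n):
        slot = x % M  # which frequency band the state falls into
        s = next(t for t in freqs if cum[t] <= slot < cum[t] + freqs[t])
        out.append(s)
        x = freqs[s] * (x // M) + slot - cum[s]
    out.reverse()  # symbols come back out in reverse (stack) order
    return ''.join(out)

# Made-up model for the demo: 'a' is twice as likely as 'b' or 'c'.
freqs = {'a': 8, 'b': 4, 'c': 4}
cum = build_cum(freqs)
state = rans_encode('aabcb', freqs, cum)
assert rans_decode(state, 5, freqs, cum) == 'aabcb'
```

Note the stack-like behavior: rANS decodes symbols in the reverse of encoding order, which is why the decoder reverses its output (production coders typically encode the message backwards instead).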