Google Guetzli Image Compression in Sage

Does anybody know how to integrate the new Google Guetzli image compression into Sage development?

I would just love to benefit from 20 to 30% smaller images in my web project.


You could probably pipe it in with a small shell script as part of your build; a rough sketch follows below.
From the docs, though, it seems like Guetzli requires quite a bit of time to run and optimize your images, so I'd rather manage it myself from my own bash script or something like that, without interfering with theme development 🙂
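For illustration, here's a minimal sketch of that idea, assuming the `guetzli` binary is on your PATH and that your built images end up in `dist/images` (adjust the path for your setup; none of this is an official Sage integration):

```bash
#!/usr/bin/env bash
# Hypothetical post-build step: run every JPEG in the build output
# through guetzli, replacing the original only if re-encoding succeeds.
set -euo pipefail
shopt -s nullglob  # skip the loop entirely if there are no JPEGs

for img in dist/images/*.jpg; do
  if guetzli --quality 90 "$img" "${img%.jpg}.guetzli.jpg"; then
    mv "${img%.jpg}.guetzli.jpg" "$img"
  fi
done
```

Run it after your normal build so Guetzli's long run time never blocks theme development.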

Would love to see this in Trellis!


Great post! I second this.

From the usage notes on the GitHub page:

Note: Guetzli uses a large amount of memory. You should provide 300MB of memory per 1MPix of the input image.

Note: Guetzli uses a significant amount of CPU time. You should count on using about 1 minute of CPU per 1 MPix of input image.

Doesn’t sound great for on-the-fly compression. By those numbers, a single 12-megapixel upload would want roughly 3.6GB of memory and about 12 minutes of CPU.

And…

Note that Guetzli is designed to work on high quality images. You should always prefer providing uncompressed input images (e.g. that haven’t been already compressed with any JPEG encoders, including Guetzli).

I can’t speak for everyone, but a high percentage of the images I’d be handling would have been compressed at some point, even if that’s just from saving out of Photoshop.

ImageOptim is integrating it as an option, which will be interesting to experiment with: see the commit “Integrate Guetzli into ImageOptim for jpeg files” (ImageOptim/ImageOptim@64c11c6) on GitHub.



Thank you for your reply; this makes a lot of sense. It seems like we’ll have to wait and see as the technology develops further.

My take on it is that I can’t wait to have it on my server. Perhaps have the current libjpeg (or whatever it is) process images to the relevant sizes in the WP Media Library, then have Guetzli run as a background process; I’ve sketched that idea below. Alternatively, the Delicious Brains mob could add an intermediate Guetzli render step as an option in their S3 Offload plugin.

But maaaan, if it really cuts file sizes by 30% whilst maintaining quality, then I have no issue with finding a way to deal with the slow compression time.
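For what it’s worth, a rough sketch of that background idea, assuming `guetzli` is installed on the server and uploads live under the standard `wp-content/uploads` path (the schedule, paths, and quality are all placeholders):

```bash
#!/usr/bin/env bash
# Hypothetical nightly job, e.g. via cron:
#   0 3 * * * /usr/local/bin/guetzli-uploads.sh
# Re-encodes the last day's JPEG uploads at low CPU priority so the
# web server isn't starved. Note the README discourages feeding Guetzli
# already-compressed JPEGs, so gains here may be limited.
set -euo pipefail

find /var/www/html/wp-content/uploads -name '*.jpg' -mtime -1 -print0 |
  while IFS= read -r -d '' img; do
    if nice -n 19 guetzli --quality 90 "$img" "$img.tmp"; then
      mv "$img.tmp" "$img"
    fi
  done
```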

From the feedback so far, a 3MB image can take around 20 minutes to compress on an i7 machine with plenty of RAM, with the results often coming in larger than ImageOptim’s. The resources used could also have a huge impact on your server.

It’s also lossy, so it depends whether that bothers you, although you can specify the target quality.
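For reference, that knob is Guetzli’s `--quality` flag, expressed as a libjpeg-equivalent quality value (the default is 95, and the encoder rejects values below 84):

```bash
# Trade file size against visual quality explicitly
guetzli --quality 90 input.png output.jpg
```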

It’s early days, and the devs have stated that they haven’t optimised resource usage at all yet, but I guess it boils down to whether the implementation can be optimised or whether the algorithm itself is inherently resource-intensive.


+1 for ImageOptim; they already have a beta version with Guetzli support.

Also, if you’re using ImageOptim, make sure you enable Tools → Lossy Minification. Otherwise you won’t be getting much out of it 😉