2018.05.11 01:43 "[Tiff] LZ4 compression", by fx HAYAKAWA MICHIO

2018.05.11 20:54 "Re: [Tiff] LZ4 compression", by Kemp Watson

Interesting about cloud-optimized GeoTIFF; I wasn't aware of the project - we've been doing exactly that for 12 years with our ZIF format (although, in the pre-BigTIFF version, we needed a bit of server-side JavaScript before range requests were available). See http://zif.photo.

So much duplication of effort since the Internet - connection is not communication.

W. Kemp Watson
Objective Pathology Services Limited
Halton Data Center
8250 Lawson Road
Milton, Ontario
Canada L9T 5C6

http://www.objectivepathology.com

kemp@objectivepathology.com
tel. +1 (416) 970-7284

> On May 11, 2018, at 4:14 PM, Even Rouault <even.rouault@spatialys.com> wrote:

On Friday, 11 May 2018 14:50:36 CEST, Bob Friesenhahn wrote:

Development libtiff now supports Facebook's Zstandard (zstd) compression which is pretty fast and does seem to be effective with image data.

One other alternative is filesystem-level compression. I've been using ZFS with no compression, with lz4, and with gzip at various levels. The advantage of doing it at the filesystem level is complete transparency to the application, including memory mapping: neither the writer nor the reader needs to take any action for compression to happen. This gives you all the benefits of LZ4 compression without the need to alter libtiff or any of your own code. It's also very fast with lz4 (less so with gzip; you'll need a decent CPU to avoid slowness).
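The transparency point can be sketched in a few lines: the application code below is identical whether the file sits on a plain filesystem or on a dataset with transparent compression enabled (for example, a ZFS dataset configured with `zfs set compression=lz4 pool/data` - the pool name is hypothetical); any compression happens entirely below the read/mmap interface.

```python
import mmap
import os
import tempfile

# Ordinary application-level I/O: nothing here knows or cares whether the
# underlying filesystem compresses blocks on disk.
payload = b'strip of TIFF image data' * 100

with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(payload)
    path = f.name

with open(path, 'rb') as f:
    # Memory mapping works unchanged too: the kernel/filesystem presents
    # the logical, uncompressed bytes to the process.
    with mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as m:
        assert m[:5] == b'strip'
        assert bytes(m) == payload

os.unlink(path)
```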

ZFS with lz4 is good (and what I use), but libtiff adds additional smarts to re-arrange the data so that it compresses more easily, via its 'predictor' mechanism.
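For Predictor=2, the re-arrangement mentioned above is simple horizontal differencing: each sample in a row is replaced by its difference from the previous one, so smooth gradients become runs of small values that general-purpose codecs handle far better. A minimal Python sketch of the idea (not libtiff's actual implementation, and ignoring multi-sample pixels and bit depths other than 8):

```python
import zlib

def predict(row):
    """Horizontal differencing: out[i] = row[i] - row[i-1] (mod 256)."""
    out = [row[0]]
    for i in range(1, len(row)):
        out.append((row[i] - row[i - 1]) % 256)
    return bytes(out)

def unpredict(row):
    """Inverse transform: a running sum restores the original samples."""
    out = [row[0]]
    for i in range(1, len(row)):
        out.append((out[-1] + row[i]) % 256)
    return bytes(out)

# A smooth ramp compresses poorly as-is, but after differencing it becomes
# 0x00, 0x01, 0x01, ... which deflate (used here as a stand-in codec)
# squeezes down very well.
gradient = bytes(range(200))
raw = zlib.compress(gradient)
diffed = zlib.compress(predict(gradient))
assert unpredict(predict(gradient)) == gradient
assert len(diffed) < len(raw)
```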

Another advantage of compression codecs inside the file (compared to whole-file compression) is that they make efficient random reads possible for TIFFs stored in cloud storage. See http://www.cogeo.org/
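The cloud layout works because a reader can fetch the file header and IFDs with one small HTTP ranged GET (e.g. `Range: bytes=0-16383`), then fetch only the compressed tiles it actually needs. As a small illustration, here is a sketch of parsing the fixed 8-byte classic TIFF header from such a ranged read; the input bytes here are synthetic, and real readers would of course go on to walk the IFD:

```python
import struct

def parse_tiff_header(buf):
    """Parse the 8-byte classic TIFF header: byte order, magic 42, first IFD offset."""
    order = buf[:2]
    if order == b'II':
        endian = '<'          # little-endian
    elif order == b'MM':
        endian = '>'          # big-endian
    else:
        raise ValueError('not a TIFF')
    magic, ifd_offset = struct.unpack(endian + 'HI', buf[2:8])
    if magic != 42:           # BigTIFF uses magic 43 and a longer header
        raise ValueError('not classic TIFF')
    return endian, ifd_offset

# First 8 bytes of a little-endian TIFF whose first IFD starts at byte 8:
header = b'II' + struct.pack('<HI', 42, 8)
# parse_tiff_header(header) -> ('<', 8)
```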