2017.06.27 20:51 "[Tiff] Excessive memory allocation while chopping strips", by Nicolas RUFF

2017.06.29 05:52 "Re: [Tiff] Excessive memory allocation while chopping strips", by Paavo Helde

On 28.06.2017 23:37, Bob Friesenhahn wrote:

wrt the question:

Very large strips and/or tiles are used by people who expect that it will increase the performance of their application in some sense, and yes, there might be some gain. Most computers can handle MB easily (even ...

Even now, most (Intel) CPUs have L2 caches optimized for looping over no more than 256 KB of data at a time. Even if more data is read into RAM, the processing loops should usually not work on more than about 128 KB at once, and performance definitely drops off once the working set exceeds 256 KB.

Yes, capping the strip or tile size at some pretty large limit would be reasonable. However, people are not reasonable. Also, a lot of processing software reads the whole image into memory before doing any processing, so the potential performance benefit of piecewise processing would be lost anyway.
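
For the reading side, here is a rough sketch of what a lenient-but-bounded strip reader could look like against the libtiff 4.x API. The 256 MB cap and the read_strips name are my own illustration, not anything libtiff itself prescribes or enforces:

#include <stdio.h>
#include <stdint.h>
#include <tiffio.h>

/* Illustrative, application-chosen cap on the decoded size of one strip. */
#define MAX_STRIP_BYTES ((tmsize_t)256 * 1024 * 1024)

int read_strips(const char *path)
{
    TIFF *tif = TIFFOpen(path, "r");
    if (!tif)
        return -1;

    tmsize_t stripsize = TIFFStripSize(tif);   /* decoded bytes per strip */
    if (stripsize <= 0 || stripsize > MAX_STRIP_BYTES) {
        fprintf(stderr, "strip size %lld exceeds limit, refusing to allocate\n",
                (long long)stripsize);
        TIFFClose(tif);
        return -1;
    }

    void *buf = _TIFFmalloc(stripsize);
    if (!buf) {
        TIFFClose(tif);
        return -1;
    }

    uint32_t nstrips = TIFFNumberOfStrips(tif);
    for (uint32_t s = 0; s < nstrips; s++) {
        tmsize_t n = TIFFReadEncodedStrip(tif, s, buf, stripsize);
        if (n < 0)
            break;                 /* decode error on this strip */
        /* ... process n bytes of pixel data here ... */
    }

    _TIFFfree(buf);
    TIFFClose(tif);
    return 0;
}

The point is simply that the buffer size is checked against an application-chosen limit before anything is allocated, instead of trusting whatever the file claims.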

Writing the whole image as a single strip is also simpler than chopping it up into strips or tiles, so I am sure there are lots of pieces of software out there that write single-strip TIFF files without using libtiff or any other library (this is actually pretty simple), and those strips keep growing larger as hardware advances. A perfectly reasonable 2000x2000x16bpp image is already 8 MB, and it doesn't stop there. The reading part (libtiff) should be lenient and allow reading such files as long as they fit in memory.
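
To show how little it takes (and hence why such writers keep appearing), here is a rough sketch of a bare single-strip writer for that 2000x2000x16bpp case, with no library at all. It assumes a little-endian host, writes zeroed pixel data, and omits non-essential tags; the file name and layout choices are just for illustration:

#include <stdio.h>
#include <stdint.h>
#include <stdlib.h>

/* Write one little-endian IFD entry: tag, type, count, value/offset. */
static void put_entry(FILE *f, uint16_t tag, uint16_t type,
                      uint32_t count, uint32_t value)
{
    fwrite(&tag, 2, 1, f);
    fwrite(&type, 2, 1, f);
    fwrite(&count, 4, 1, f);
    fwrite(&value, 4, 1, f);
}

int main(void)
{
    const uint32_t w = 2000, h = 2000;        /* 2000x2000, 16 bits/pixel */
    const uint32_t databytes = w * h * 2;     /* 8,000,000 bytes, ~8 MB   */

    FILE *f = fopen("single_strip.tif", "wb");
    if (!f) return 1;

    /* TIFF header: "II" little-endian, magic 42, IFD placed after the data. */
    uint32_t ifd_offset = 8 + databytes;
    fwrite("II", 2, 1, f);
    uint16_t magic = 42;
    fwrite(&magic, 2, 1, f);
    fwrite(&ifd_offset, 4, 1, f);

    /* Pixel data: the whole image as one strip, starting at offset 8. */
    uint16_t *row = calloc(w, 2);
    if (!row) { fclose(f); return 1; }
    for (uint32_t y = 0; y < h; y++)
        fwrite(row, 2, w, f);
    free(row);

    /* IFD: 9 entries, then a zero "next IFD" offset. Types: 3=SHORT, 4=LONG.
       Resolution tags are omitted for brevity. */
    uint16_t nentries = 9;
    fwrite(&nentries, 2, 1, f);
    put_entry(f, 256, 4, 1, w);          /* ImageWidth                 */
    put_entry(f, 257, 4, 1, h);          /* ImageLength                */
    put_entry(f, 258, 3, 1, 16);         /* BitsPerSample = 16         */
    put_entry(f, 259, 3, 1, 1);          /* Compression = none         */
    put_entry(f, 262, 3, 1, 1);          /* Photometric = BlackIsZero  */
    put_entry(f, 273, 4, 1, 8);          /* StripOffsets: data at 8    */
    put_entry(f, 277, 3, 1, 1);          /* SamplesPerPixel = 1        */
    put_entry(f, 278, 4, 1, h);          /* RowsPerStrip = whole image */
    put_entry(f, 279, 4, 1, databytes);  /* StripByteCounts            */
    uint32_t next_ifd = 0;
    fwrite(&next_ifd, 4, 1, f);

    fclose(f);
    return 0;
}

All 8 MB of pixel data end up in one strip with RowsPerStrip equal to the image height, which is exactly the kind of file that libtiff's strip chopping then has to cope with on the reading side.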

Just my 2 euro cents

Paavo