2004.02.11 18:39 "RE: [Tiff] Photoshop 8.0 and libtiff", by Don Ellis
I have just finished some testing using libtiff with only tsize_t changed to a 64-bit signed integer, and I have been able to compress and uncompress images between 2GB and 4GB. To break the 4GB barrier, I think you are correct that it will take a 64-bit computer. There may be something I am missing, but I have not found it yet. I will keep testing and keep you informed.
Both td_imagewidth and td_imagelength are unsigned 32-bit integers, and the specification defines them that way as well. So a TIFF image could conceivably hold 2^32 x 2^32 (about 1.8 x 10 to the 19th) pixels. Multiply this by 4 for four color samples per pixel and the file size could be huge. Strips, tiles, and most of the other limiting variables are just used to speed things up. For very large files you may have to forgo the use of strips, or make some modifications, such as limiting a strip to the size of available memory and widening some offsets to uint64. I see nothing in the code that prevents huge images.
I agree. If the libtiff API bases toff_t and tsize_t on the operating system's off_t and size_t definitions, then the artificial limitations imposed by libtiff's interface are removed. Both will be 64 bits on 64-bit systems using a 64-bit compilation environment. On 32-bit systems that support large files, off_t will be 64 bits and size_t will be 32 bits.
The alternative is for libtiff to always use 64-bit types. I have not yet run across a modern CPU that doesn't support native 64-bit types, although some may rely on assembly routines for some 64-bit math operations.
Should I continue to copy firstname.lastname@example.org?
Why not? Files which stress the TIFF limits will become common over the next several years. Just as we passed the 640K "limit", we will soon be passing the 4GB "limit".