AWARE SYSTEMS
TIFF and LibTiff Mail List Archive

Thread

2003.12.26 16:00 "[Tiff] [ANNOUNCE]: Libtiff 3.6.1 released", by Andrey Kiselev
2004.01.14 12:01 "[Tiff] COLORMAP and byte padding", by Stephan Assmus
2004.01.14 15:07 "Re: [Tiff] COLORMAP and byte padding", by Frank Warmerdam
2004.01.14 17:17 "Re: [Tiff] COLORMAP and byte padding", by Joris
[...]

2004.01.14 12:01 "[Tiff] COLORMAP and byte padding", by Stephan Assmus

Hello,

I'm adding write support to the libtiff-based OpenBeOS TIFF Translator, and I want to support palette images. The format of the colormap that

TIFFSetField(tif, TIFFTAG_COLORMAP, ???);

expects is unclear to me. In the documentation, it just says

Tag Name           Count  Type      Notes

TIFFTAG_COLORMAP   3      uint16*   1<<BitsPerSample arrays

OK, so there are 256 entries in the palette if I have 8 bits per sample. That much I follow, but what's with the uint16*?!? And why Count == 3? The BeOS colormap bitmaps all use the same system-wide palette, which consists of 256 entries, each representing an RGB color with 3 (4, actually) 8-bit values for r, g, b (and alpha). Am I supposed to fit a 24-bit RGB value into 16 bits and have 256 uint16s? Or do I have 3 * 256 uint16s, with 16 bits for each of r, g, and b?

I also have a couple more questions about strip size and byte padding. There seem to be two related fields that determine strip size:

TIFFTAG_ROWSPERSTRIP rows per strip of data
TIFFTAG_STRIPBYTECOUNTS byte counts for strips

Can both be used at the same time to tell libtiff that

strip byte counts != rows per strip * samples per pixel * width?

(So that there are some padding bytes at the end of each row...)

Best regards,
-Stephan