2016.11.21 20:28 "Re: [Tiff] 12 bit byte format", by Kemp Watson

If it helps, there's a world of difference between "used" bits and "encoded/stored" bits. The software and files don't care about "used" bits…

If the data is stored raw, there's an impact on file size; but with LZW or another compression scheme, storing 12 used bits in a 16-bit field won't affect file size noticeably.
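
Since the claim above is about compression behavior, here is a minimal sketch of what "12 used bits in a 16-bit field" looks like through libtiff with LZW; the filename, dimensions, and ramp pattern are placeholders, not from the thread. Because the top four bits of every sample are constant zero, LZW should bring the file close to what true 12-bit storage would give.

/* Minimal sketch: 12 "used" bits stored in 16-bit samples, LZW-compressed. */
#include <stdint.h>
#include <stdlib.h>
#include <tiffio.h>

int main(void)
{
    const uint32_t width = 640, height = 480;          /* placeholder size */
    TIFF *tif = TIFFOpen("example-12in16.tif", "w");   /* placeholder name */
    if (!tif)
        return 1;

    TIFFSetField(tif, TIFFTAG_IMAGEWIDTH, width);
    TIFFSetField(tif, TIFFTAG_IMAGELENGTH, height);
    TIFFSetField(tif, TIFFTAG_SAMPLESPERPIXEL, 1);
    TIFFSetField(tif, TIFFTAG_BITSPERSAMPLE, 16);      /* stored bits */
    TIFFSetField(tif, TIFFTAG_PHOTOMETRIC, PHOTOMETRIC_MINISBLACK);
    TIFFSetField(tif, TIFFTAG_PLANARCONFIG, PLANARCONFIG_CONTIG);
    TIFFSetField(tif, TIFFTAG_COMPRESSION, COMPRESSION_LZW);

    uint16_t *row = malloc(width * sizeof *row);
    for (uint32_t y = 0; y < height; y++) {
        for (uint32_t x = 0; x < width; x++)
            row[x] = (uint16_t)((x + y) & 0x0FFF);     /* only 12 bits used */
        TIFFWriteScanline(tif, row, y, 0);
    }
    free(row);
    TIFFClose(tif);
    return 0;
}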

W. Kemp Watson

kemp@objectivepathology.com

Objective Pathology Services Limited
www.objectivepathology.com
tel. +1 (416) 970-7284

On 2016-11-21, 2:54 PM, "Bob Friesenhahn" <tiff-bounces@lists.maptools.org on behalf of bfriesen@simple.dallas.tx.us> wrote:

TIFFTAG_BITSPERSAMPLE equals 12

TIFFTAG_SAMPLESPERPIXEL equals 3

I guess there is no standard format for 12-bit, in terms of bits on disk.

The above seems to be the standard to me.
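
For what it's worth, here is a sketch of what those two tags imply for the bits on disk: with BitsPerSample of 12 and TIFF's default MSB-first FillOrder, consecutive 12-bit samples pack back-to-back, so each pair of samples occupies three bytes. The pack12() helper below is hypothetical, for illustration only; a writer would pack scanlines this way and hand them to TIFFWriteScanline with TIFFTAG_BITSPERSAMPLE set to 12.

#include <stdint.h>

/* Hypothetical helper (not a libtiff function): pack two 12-bit
 * samples into three bytes, MSB-first (TIFF's default FillOrder). */
static void pack12(uint16_t a, uint16_t b, uint8_t out[3])
{
    out[0] = (uint8_t)(a >> 4);                        /* a bits 11..4    */
    out[1] = (uint8_t)(((a & 0x0F) << 4) | (b >> 8));  /* a 3..0, b 11..8 */
    out[2] = (uint8_t)(b & 0xFF);                      /* b bits 7..0     */
}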

Since you are likely doing DCI-related things: the 12-bits-in-16-bits issue comes up because the original sample scans were done this way (on a Filmlight), both because TIFF decoders for 12-bit data were rare and because they did not trust that DPX readers would support 12 bits. Because of this, the original JPEG 2000 encoders targeting DCI were implemented for this form of input.