2016.11.21 21:05 "Re: [Tiff] 12 bit byte format", by Aaron Boxer
On Mon, Nov 21, 2016 at 3:28 PM, Kemp Watson <email@example.com> wrote:
If it helps, there's a world of difference between "used" bits and "encoded/stored" bits. The software and files don't care about "used" bits…
If data is stored raw, there's an impact on file size; but with LZW or other compression, storing 12 used bits in a 16-bit field won't affect file size noticeably.
Thanks, yes, I am looking primarily at uncompressed data, so 12 bits helps a lot.
On 2016-11-21, 2:54 PM, "Bob Friesenhahn" <firstname.lastname@example.org on behalf of email@example.com> wrote:
TIFFTAG_BITSPERSAMPLE equals 12
TIFFTAG_SAMPLESPERPIXEL equals 3
I guess there is no standard format for 12 bit, in terms of bits on disk.
The above seems to be the standard to me.
Since you are likely doing DCI-related things, the 12-bits-in-16-bits issue comes up because the original sample scans were done this way (on a Filmlight): 12-bit TIFF decoders were rare, and they did not trust that DPX readers would support 12 bits. Because of this, the original JPEG 2000 encoders targeting DCI were implemented for this form of input.
firstname.lastname@example.org, http://www.simplesystems.org/users/bfriesen/