1999.11.02 10:35 "Re: grayscale tiff", by Ivo Penzar
I have a CCD camera with a software package that saves TIFF images as grayscale, but with 16 bits per sample. (The baseline TIFF 6.0 standard only allows 4 or 8 bits per sample, so most readers either cannot read the images at all or convert them to 8-bit images, and then the image contrast is lost.) What is the best solution to this problem, without losing the information contained in an image?
TIFF 6.0 (and at least 5.0, too) even allows, say, 13-bits-per-sample grayscale images, with 9 extra (unused) samples per pixel. Granted, baseline TIFF readers are not required to know how to handle such images...
You can check MinSampleValue and MaxSampleValue (if they are not present, you can compute them for each image, or examine a few of your images to see how that driver/package actually stores the data) and then apply a more appropriate remapping of the 16 bpp grayscale data to 8 bpp. You can even apply gamma correction (a nonlinear mapping) to enhance the contrast. For example, your grayscale samples might all lie within the range 0 to 2^12 or so. Most readers would probably divide them by 256 (shift by 8 bits) instead of only 16 (shift by 4 bits), and thus they would really lose the contrast (the image would appear posterized).
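A minimal sketch of that remapping idea in plain Python, operating on a flat list of 16-bit sample values (reading the actual TIFF data and tags is left out; the function name and gamma parameter are my own illustration, not part of any TIFF package):

```python
def remap_16_to_8(samples, gamma=1.0):
    """Map 16-bit grayscale samples to 0-255 using their actual value range,
    with an optional gamma correction for contrast enhancement."""
    lo, hi = min(samples), max(samples)   # stand-ins for Min/MaxSampleValue
    span = max(hi - lo, 1)                # avoid division by zero on flat images
    out = []
    for s in samples:
        x = (s - lo) / span               # normalize to 0.0 .. 1.0
        x = x ** (1.0 / gamma)            # nonlinear (gamma) mapping; 1.0 = linear
        out.append(round(x * 255))
    return out

# A 12-bit-range image naively shifted right by 8 bits would only use
# output values 0..15; range-based remapping spreads it over 0..255.
pixels = [0, 1024, 2048, 3072, 4095]
print(remap_16_to_8(pixels))  # -> [0, 64, 128, 191, 255]
```

The same scan over the data that finds the minimum and maximum can be done once per image, which is exactly the fallback when the MinSampleValue/MaxSampleValue tags are missing.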
With the technique described above (an appropriate remapping of 16 bpp grayscale samples to 8 bpp), I can't believe that you would visibly lose any information. After all, (most) graphics cards and CRTs use at most 8 bits per RGB component, and thus they cannot show more than 256 levels of gray.