AWARE [SYSTEMS]
AWare Systems, Home TIFF and LibTiff Mail List Archive

LibTiff Mailing List

TIFF and LibTiff Mailing List Archive
January 2019


The TIFF Mailing List Homepage
Archive maintained by AWare Systems






Thread

2019.01.12 17:25 "[Tiff] tiffcp altering image contents (in contrast to what the manual says)?", by Binarus
[...]
2019.01.13 21:11 "Re: [Tiff] Tiff Digest, Vol 3, Issue 5", by Richard Nolde
2019.01.13 23:12 "Re: [Tiff] tiffcp altering image contents (in contrast to what the manual says)?", by Binarus
2019.01.14 17:44 "Re: [Tiff] tiffcp altering image contents (in contrast to what the manual says)?", by Daniel McCoy
[...]

2019.01.13 23:12 "Re: [Tiff] tiffcp altering image contents (in contrast to what the manual says)?", by Binarus

Dear Richard,

thank you very much for your impressive answer.

On 13.01.2019 22:11, Richard Nolde wrote:

Bob is certainly correct in stating that the issue is that the output is written using YCBCR encoding.

One of my previous messages shows the output from tiffinfo for each of the source files. tiffinfo shows that the source files are indeed written using YCbCr encoding, so this is true.

But I don't understand why this poses a problem. Since OJPEG does not seem to be involved in this case, why does tiffcp (obviously) alter the image data? Why doesn't it just copy the YCbCr-encoded data byte by byte when merging the images (adjusting only the directory, endianness, offsets, and so on)?
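As a side note, one way I could check whether tiffcp actually rewrote the compressed data would be to compare the strip byte counts before and after the copy (filenames here are hypothetical; tiffinfo's "-s" option prints strip offsets and byte counts per its manual):

```shell
# Hypothetical filenames. "-s" asks tiffinfo for strip offsets/byte counts.
tiffinfo -s input1.tif > before.txt
tiffcp input1.tif input2.tif merged.tif   # no -c given
tiffinfo -s merged.tif > after.txt
# If the strip byte counts differ, the JPEG data was decoded and re-encoded
# rather than copied byte for byte.
diff before.txt after.txt
```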

Why not simply use another compression algorithm, and why use Graphics Magick at all if you are just compressing and combining them?

Two questions in one sentence :-)

We can't use another compression algorithm for the 24 BPP files because they will get huge if we do. On the one hand, we can afford the degradation caused by encoding as JPEG with 90% quality if the degradation is guaranteed to happen only once. On the other hand, the size of those files will be at least 5 times their current size with any compression other than JPEG. Since we will have to handle some 100,000 of them, this is a problem.

The current compression scheme is well-crafted and approved. The problem is that degradation must happen only once. We can't accept tiffcp re-encoding the image data, so I would like to understand exactly what is happening here and why tiffcp touches the image data at all.

Likewise, we don't want to use another compression scheme for the 1 BPP images, because ZIP has turned out to be the most efficient for our images, and as a bonus, it is lossless. Using JPEG compression for this image type is not an option: it would simplify things, but it would also make the images 10 times their current size.

Tiffcp or tiffcrop can compress an uncompressed file on the fly while making a multi-page TIFF from a series of uncompressed single page TIFF files.

I think that this is a very good idea. It will need thorough testing, though, and replacing gm with tiffcp will require the whole process to be re-approved, documented and so on.
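For instance, compressing a series of uncompressed single-page files on the fly while merging them could look like this (filenames are hypothetical; "-c zip" is a documented tiffcp option):

```shell
# Hypothetical filenames; tiffcp takes any number of inputs,
# and the last argument names the multi-page output file.
tiffcp -c zip page1.tif page2.tif page3.tif multipage.tif
```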

If your files have various bit depths for which the same compression algorithm cannot be used, you would have to first combine all of them at a given bit depth and specify a compression algorithm that is appropriate for that bit depth. After the subsets are combined, i.e., all the 1-bit bilevel images with CCITT Group 3 or Group 4 encoding, all the RGB images with ZIP or LZW, you call tiffcp with no compression algorithm specified to build the final version.
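If I understand the suggestion correctly, the two-stage procedure would be something like the following sketch (filenames are hypothetical; the "-c" arguments are from the tiffcp manual):

```shell
# Stage 1: combine each bit depth with a codec suited to it.
tiffcp -c g4  bw_*.tif  bilevel.tif   # 1-bit pages, CCITT Group 4 (lossless)
tiffcp -c zip rgb_*.tif color.tif     # 24-bit pages, ZIP/Deflate (lossless)
# Stage 2: merge with no -c, so each page keeps its existing compression.
tiffcp bilevel.tif color.tif final.tif
```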

By this, do you mean simply omitting the "-c" switch, as I did? The manual says (in the section where the -c switch is explained):

"... By default tiffcp will compress  data  according  to  the va