2008.02.18 14:23 "Re: [Tiff] bug? JPEGQUALITY seems not to be read correctly", by
Joris Van Damme (AWare Systems) wrote:
One point that makes this whole argument moot is that the LibJpeg quality setting we're all familiar with, ranging from 0 to 100, is itself a LibJpeg-specific measure.
I did forget to mention one last but important detail that invalidates the whole concept.
JPEG, in TIFF, can use the JPEGTables tag to hold a JPEG stream with tables only and no image data. The strips/tiles can then contain JPEG streams with image data and no tables. In this case, there's one consistent set of tables for the whole image.
However, the JPEGTables tag can also be omitted, and the tables can instead be embedded in the strip/tile JPEG streams. In that case, each strip/tile can carry its own tables, so you'd end up with a different quality estimate for each individual strip/tile, if your estimate is to reflect any reality at all. Even when the JPEGTables tag is used, an individual strip/tile can override the image-global tables in that tag and specify other tables for use on that specific strip/tile only. This can be used to get better compression on some image regions and better quality on others.
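To make the two layouts concrete, here is a minimal sketch (not libtiff code, and the helper name is my own) that scans a JPEG byte stream for DQT (quantization table) markers. Run it against the JPEGTables tag data and against each strip/tile stream, and you can see for yourself where the tables actually live in a given file:

```python
# Sketch: find DQT (0xFFDB) segments in a JPEG stream, to show that
# quantization tables may sit in the JPEGTables tag stream, in each
# strip/tile stream, or in both.

def dqt_segments(jpeg_bytes):
    """Return the payloads of all DQT segments in a JPEG stream."""
    tables = []
    i = 2  # skip the SOI marker (0xFFD8)
    while i + 4 <= len(jpeg_bytes):
        if jpeg_bytes[i] != 0xFF:
            break  # not at a marker; stop scanning
        marker = jpeg_bytes[i + 1]
        if marker in (0xD8, 0xD9):  # SOI/EOI carry no length field
            i += 2
            continue
        length = int.from_bytes(jpeg_bytes[i + 2:i + 4], "big")
        if marker == 0xDB:  # DQT: keep its payload
            tables.append(jpeg_bytes[i + 4:i + 2 + length])
        if marker == 0xDA:  # SOS: entropy-coded data follows, stop
            break
        i += 2 + length
    return tables

# A tables-only stream, as you'd find in the JPEGTables tag:
# SOI, one DQT segment (precision/id byte + 64 table entries), EOI.
tables_only = (b"\xff\xd8"
               + b"\xff\xdb" + (67).to_bytes(2, "big")
               + b"\x00" + bytes(range(1, 65))
               + b"\xff\xd9")
print(len(dqt_segments(tables_only)))  # -> 1
```

A strip stream that starts with SOI and goes straight to SOF/SOS would yield no DQT segments here, confirming it relies on the image-global tables; a strip that does yield DQT segments is overriding them.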
Add this last fact to the ones mentioned already, and indeed you end up asking yourself what quality we're discussing here. The concept is moot.
Joris Van Damme
Download your free TIFF tag viewer for Windows here: