AWARE SYSTEMS
TIFF and LibTiff Mail List Archive

Thread

2000.01.27 23:05 "latest libs - where?", by Paul Kaiser
[...]
2000.02.01 04:16 "Re: G4 and DCT documentation [was: tiff2ps: G4 compression in PostScript]", by Tom Lane
2000.02.01 05:11 "Re: G4 and DCT documentation [was: tiff2ps: G4 compression in PostScript]", by Joris Van Damme
2000.02.01 12:47 "Re: G4 and DCT documentation [was: tiff2ps: G4 compression in PostScript]", by Tom Kacvinsky
[...]

2000.02.01 12:47 "Re: G4 and DCT documentation [was: tiff2ps: G4 compression in PostScript]", by Tom Kacvinsky

Hi Tom,

>> Well, good luck finding out whether Adobe has determined it has to be "this way." If I am not mistaken, the problem with the DCTDecode filter in Adobe CPSI is this (from GS's gsjmorec.h):

> (Was there something about DCTDecode elsewhere in the thread? If so I missed it...)

No, I sort of hijacked the thread. The point I was trying to make is that I thought (notice the past tense) that Adobe hadn't done a good job of documenting the DCT filter's blocks/MCU limit, and as such would not document the CCITT filters that well, either.

>> /*
>>  * Read "JPEG" files with up to 64 blocks/MCU for Adobe compatibility.
>>  * Note that this #define will have no effect in pre-v6 IJG versions.
>>  */
>> #define D_MAX_BLOCKS_IN_MCU 64
>>
>> whereas the standard for blocks/MCU is lower (I think).

> Yes, the standard specifies a limit of 10 blocks/MCU. IIRC, that was driven by 1988-vintage considerations of silicon chip area for hardware JPEG implementations. It looks a little silly now, but it's still the standard.
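
To make those numbers concrete, here is a minimal sketch of where the blocks/MCU figure comes from; this is my own illustration rather than anything from the IJG or GS sources, and the struct and helper names are made up. In an interleaved scan, each component contributes h x v 8x8 blocks to every MCU, and it is the sum of those products that a decoder checks against its compiled-in limit:

#include <stdio.h>

/* Per-component sampling factors; T.81 allows 1..4 for each. */
struct component { int h, v; };

/* An interleaved-scan MCU holds h*v 8x8 blocks per component; ITU-T
   T.81 (B.2.3) caps the total over all scan components at 10. */
static int blocks_per_mcu(const struct component *c, int n)
{
    int total = 0;
    for (int i = 0; i < n; i++)
        total += c[i].h * c[i].v;
    return total;
}

int main(void)
{
    /* Ordinary 4:2:0 YCbCr: 2x2 luma plus 1x1 per chroma = 6 blocks. */
    struct component ycc420[] = { {2, 2}, {1, 1}, {1, 1} };

    /* A made-up four-component scan with 4x4 sampling on two of the
       components: 16 + 16 + 1 + 1 = 34 blocks/MCU -- over the spec's
       10, but within the 64 that the gsjmorec.h define allows for. */
    struct component odd[] = { {4, 4}, {4, 4}, {1, 1}, {1, 1} };

    int a = blocks_per_mcu(ycc420, 3);
    int b = blocks_per_mcu(odd, 4);

    printf("4:2:0 YCbCr:  %d blocks/MCU (<=10 %s, <=64 %s)\n",
           a, a <= 10 ? "yes" : "no", a <= 64 ? "yes" : "no");
    printf("odd sampling: %d blocks/MCU (<=10 %s, <=64 %s)\n",
           b, b <= 10 ? "yes" : "no", b <= 64 ? "yes" : "no");
    return 0;
}

So a stock IJG build, whose decoder limit matches the spec's 10, would refuse the second layout at scan setup, while a build configured like the GS one above (D_MAX_BLOCKS_IN_MCU at 64) would read it.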

>> As far as I know, this was (is) undocumented.

> You're being unfair to Adobe: they spelled this out in their TN 5116, which gives substantial details about their DCTEncode/DCTDecode implementations. I've never rooted through the Various Color Books to see how much of the tech note made it into the books, but it's not as though they never published the info at all...

Thanks for the clarification. You are right; I am being unfair to Adobe...

Tom