1999.06.09 18:41 "Large File Support", by Chris Barker
I think getting Adobe involved is a bad idea. We have already suffered too long because TIFF is not in the public domain. I don't think they can have a patent on the idea of tags, but if they can, we are screwed. If we leave it to Adobe, we will get stuck with FlashPix or something similar that is full of someone or other's intellectual property.
We need a public process for designing a new format to handle large files. While we are at it, we can make TIFF better.
I would suggest improving the tiling specification (a natural for large formats anyway) to include a "tile type" (this allows for zero-size white or black tiles, and optimal compression for complex tiles). Anyway, I'm getting ahead of myself here.
We need a real honest-to-goodness *standard* (without options) so we can write good (and freely available) software. While standardizing the format, it would also be an excellent idea to standardize the programming interface to read/write/verify/info-extract libraries. Decoupling compression methods and photometric stuff from page, tiling, and overview info is definitely the right way to go. This will allow the new spec to be used as a component in something like IIP (the Internet Imaging Protocol), where a client can request the tiles it needs at the nearest available resolution, and receive the minimal set of tiles. This can make internet-based imaging practical over lower-bandwidth connections.
Barker's law: For every inaction, there is an equal and opposite excuse.
Check out Interlinear's web page at http://www.ilt.com