2004.04.22 23:59 "Re: [Tiff] Large TIFF files", by Chris Cox
Revising the spec. is yet another issue (which I'm working on).
- Is good news
- and thanks
- and best of luck
Any chance of an answer on the memory management issue?
As an example, changing the Software tag from 'abc' to 'abcdef copyright 2003' in a 10-gigabyte file, if there is no such thing as memory management, involves either
- doing hacky stuff, like appending the new data to the file and changing pointers, so that the old data becomes virtually untraceable garbage inside the file that is never cleaned up
The IFD already allows you to do things like that (solution a). Yes, the current approach leaves a hole and appends to the end. But you rarely end up with too large a hole.
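The append-and-repoint rewrite being discussed can be sketched as follows. This is a toy layout invented purely for illustration (a single offset/length slot pointing at a value), not real TIFF IFD structures; the point is only that the rewrite touches a few bytes while the old value lingers as a hole.

```python
# Toy model of "solution a": append the new value at end-of-file and
# repoint the entry; the old bytes stay behind as unreachable garbage.
# Layout (hypothetical, for illustration): 4-byte offset + 4-byte length
# slot at the start of the file, followed by value data.
SLOT = 8

def write_initial(value: bytes) -> bytearray:
    buf = bytearray(SLOT)
    buf[0:4] = SLOT.to_bytes(4, "little")          # value starts after the slot
    buf[4:8] = len(value).to_bytes(4, "little")
    buf += value
    return buf

def rewrite_value(buf: bytearray, new_value: bytes) -> bytearray:
    """Append new_value and repoint the slot; old bytes become a hole."""
    new_offset = len(buf)
    buf += new_value
    buf[0:4] = new_offset.to_bytes(4, "little")
    buf[4:8] = len(new_value).to_bytes(4, "little")
    return buf

def read_value(buf: bytearray) -> bytes:
    off = int.from_bytes(buf[0:4], "little")
    n = int.from_bytes(buf[4:8], "little")
    return bytes(buf[off:off + n])
```

Changing 'abc' to 'abcdef copyright 2003' this way rewrites only the 8-byte slot and appends 21 bytes, regardless of file size; the original 3 bytes remain in place as the hole mentioned above.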
With memory management, the task becomes simply freeing the old data block, which then becomes traceable and reusable free but managed space, and asking the memory manager for a new data block in the file of suitable size for the new data, which may be (part of) an old reused block or... A very small I/O operation instead. TIFF instantly becomes an 'editable' format this way. Seeing that the new file format is specifically targeted at huge files, the benefit could be substantial.
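A minimal sketch of that free-list idea, again purely hypothetical (the class and its methods are invented here, not part of any TIFF proposal): the manager tracks free extents in the file, freeing a block records it, and a later allocation reuses a large-enough hole instead of growing the file.

```python
# Hypothetical free-space manager for byte extents inside a file.
class FileSpaceManager:
    def __init__(self, file_size: int):
        self.file_size = file_size
        self.free = []  # sorted list of (offset, size) free extents

    def free_block(self, offset: int, size: int) -> None:
        """Record a freed extent and coalesce adjacent holes."""
        self.free.append((offset, size))
        self.free.sort()
        merged = []
        for off, sz in self.free:
            if merged and merged[-1][0] + merged[-1][1] == off:
                merged[-1][1] += sz          # holes touch: grow the last one
            else:
                merged.append([off, sz])
        self.free = [(o, s) for o, s in merged]

    def alloc(self, size: int) -> int:
        """First-fit: reuse a hole if possible, else extend the file."""
        for i, (off, sz) in enumerate(self.free):
            if sz >= size:
                if sz == size:
                    del self.free[i]
                else:
                    self.free[i] = (off + size, sz - size)
                return off
        off = self.file_size
        self.file_size += size
        return off
```

Under this scheme the tag rewrite is: free the old value's extent, allocate a new extent for the new value, update the pointer; the hole stays tracked and can be reused by any later allocation.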
OK - that's much too complex.
Now you're trying to implement a file system within a file. That's been tried before - and it was a nightmare.
I've been doing some experimenting with such a 'memory-managed file format' myself, though in a more general-purpose way, but I've never been quite satisfied with what I came up with. Anyone willing to comment? Is there any standard or known scheme for this? How do you feel about such a scheme in conjunction with the new gigatiff?
The most complex I'd go is something like the ZIP archive format. And that's only really useful if you're dealing with several independent chunks of data.