2009.04.26 17:22 "[Tiff] Packbits worst case encoded length", by Simon Berger

2009.05.05 22:57 "Re: [Tiff] Packbits worst case encoded length", by Graeme Gill

Yes and no. You're wasting address space, not the RAM itself. There won't be a big chunk of unused RAM because of this.

At worst, some systems will reserve some swap space.

In many systems this amounts to the same thing. Lots of 32-bit systems have 3-4 GB of memory now, and it's really interesting to see the sorts of problems that crop up from virtual address space exhaustion when you actually try to use all that memory (i.e. address space fragmentation in the memory allocators).
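
As a rough illustration, a tiny C program that just reserves address space in 64 MB chunks until malloc() refuses. Built as a 32-bit binary it will typically stop well short of 4 GB, and fragmentation only lowers that further; on most systems this reserves address space rather than touching RAM, which is exactly the distinction above:

    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        const size_t chunk = (size_t)64 * 1024 * 1024;
        size_t total = 0;

        /* Keep reserving 64 MB chunks until malloc() gives up. */
        while (malloc(chunk) != NULL)
            total += chunk;   /* leaked deliberately; the process exits */

        printf("got ~%lu MB of address space\n",
               (unsigned long)(total / (1024 * 1024)));
        return 0;
    }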

I'm suggesting using a different allocator for risky allocations. These would primarily be the buffers that you decompress into.

Since this requires judgment on the programmer's part, it would seem to map right back to the need to write perfect code.
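
For concreteness, a minimal sketch of the kind of allocator being suggested; the name risky_malloc and the 256 MB cap are illustrative assumptions, not anything from libtiff:

    #include <stdint.h>
    #include <stdlib.h>

    /* Hypothetical cap on any single allocation whose size comes out
     * of the file; 256 MB is an arbitrary illustrative figure. */
    #define RISKY_ALLOC_CAP ((uint64_t)256 * 1024 * 1024)

    /* Allocator for "risky" buffers, e.g. decompression targets sized
     * from an untrusted field in the file. */
    void *risky_malloc(uint64_t requested)
    {
        if (requested == 0 || requested > RISKY_ALLOC_CAP)
            return NULL;   /* refuse absurd or zero-sized requests */
        return malloc((size_t)requested);
    }

The judgment call is still there, of course: someone has to decide which allocations count as "risky" and route them through this path.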

I actually think it is possible to make a 100% bulletproof file parser, assuming the format isn't stupidly intricate or performance sensitive, and the way to do it is to funnel all the file accesses and resulting memory allocations through one small set of routines that can reasonably be made bug-free.

That won't prevent higher-level attacks (i.e. triggering memory problems by exploiting bugs in the code that interprets the contents of the parsed file), but it would prevent any exploits in the actual parsing of the file itself.
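
For illustration, a minimal sketch of that funneling approach, assuming the whole file has been read into memory first; all the names here are hypothetical:

    #include <stdint.h>
    #include <stdlib.h>
    #include <string.h>

    /* The whole file, read into memory up front. */
    typedef struct {
        const uint8_t *data;
        size_t         len;
        size_t         pos;   /* invariant: pos <= len */
    } parse_buf;

    /* Every byte the parser consumes passes through this one routine,
     * so the bounds check only has to be correct in one place. */
    static int pb_read(parse_buf *pb, void *dst, size_t n)
    {
        if (n > pb->len - pb->pos)   /* safe: pos <= len */
            return -1;               /* truncated or malicious file */
        memcpy(dst, pb->data + pb->pos, n);
        pb->pos += n;
        return 0;
    }

    /* All parser allocations likewise go through one overflow-checked
     * routine instead of ad hoc count * size arithmetic. */
    static void *pb_alloc_array(size_t count, size_t elem_size)
    {
        if (elem_size != 0 && count > SIZE_MAX / elem_size)
            return NULL;   /* multiplication would overflow */
        return malloc(count * elem_size);
    }

With the bounds and overflow checks concentrated in those two routines, the rest of the parser cannot read outside the file or under-allocate, whatever the file contains.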

Graeme Gill.