AWARE SYSTEMS
TIFF and LibTiff Mail List Archive

Thread

2002.08.20 07:31 "memory allocation", by Peter Majer
[...]
2002.08.21 08:34 "Re: OT: large memory allocation in Windows", by Rob van den Tillaart
2002.08.21 11:30 "Re: OT: large memory allocation in Windows", by Peter Majer
2002.08.21 13:47 "Re: OT: large memory allocation in Windows", by Rob van den Tillaart
2002.08.21 11:48 "Re: OT: large memory allocation in Windows", by Peter Nielsen
[...]

2002.08.21 11:30 "Re: OT: large memory allocation in Windows", by Peter Majer

Dear Rob,

The problem of heap fragmentation cannot be that rare. What are the standard, proven ways to solve it? I am quite surprised by the lack of information on this topic.

The solutions that spring to my mind are:

  1. 64-bit compilation.
  2. Reorganizing the memory requirements of the application so that contiguous large blocks are not necessary, e.g. storing the data of a large image as an array of rows (see the sketch after this list).
  3. Making sure the application can run forever with the memory that can be allocated at the beginning. In the case of applications working with images, we might allocate the one image we load plus another one for some processing purposes, and make sure that we never ever free these and never need to allocate new large blocks.
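For point 2, a minimal, untested sketch of what I mean; the names alloc_image/free_image are made up for illustration:

    // Allocate a large image as an array of rows, so no contiguous
    // block larger than one row is ever requested from the heap.
    #include <cstddef>

    unsigned char** alloc_image(std::size_t width, std::size_t height)
    {
        unsigned char** rows = new unsigned char*[height];
        for (std::size_t y = 0; y < height; ++y)
            rows[y] = new unsigned char[width];  // one block per row
        return rows;
    }

    void free_image(unsigned char** rows, std::size_t height)
    {
        for (std::size_t y = 0; y < height; ++y)
            delete[] rows[y];
        delete[] rows;
    }

Pixel (x, y) is then addressed as rows[y][x] instead of buf[y * width + x].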
Also, I have been thinking about the third point you mention as a solution as well. Does anyone by chance know if there is an autopointer class out there that is of help with memory reorganization? Such a class would have to manage separate blocks of memory and present them as some virtual contiguous memory. (Looks a hell of a job to me.) Memory allocated by this class could only be accessed through the class, not by pointer arithmetic like "x = *p + 35".
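To make the idea concrete, here is a rough, untested sketch of such a class; the name ChunkedBuffer and the 64 KB chunk size are arbitrary choices for illustration:

    // Stores its bytes in fixed-size chunks but is indexed as if it
    // were one contiguous buffer. Because callers never hold a raw
    // pointer into the storage, the class would be free to reorganize
    // its chunks behind their back.
    #include <vector>
    #include <cstddef>

    class ChunkedBuffer {
    public:
        explicit ChunkedBuffer(std::size_t size)
            : chunks_((size + kChunk - 1) / kChunk)
        {
            for (std::size_t i = 0; i < chunks_.size(); ++i)
                chunks_[i] = new unsigned char[kChunk];
        }
        ~ChunkedBuffer()
        {
            for (std::size_t i = 0; i < chunks_.size(); ++i)
                delete[] chunks_[i];
        }
        // All access goes through operator[]; there is no way to do
        // "x = *p + 35" style pointer arithmetic on the storage.
        unsigned char& operator[](std::size_t i)
        {
            return chunks_[i / kChunk][i % kChunk];
        }
    private:
        static const std::size_t kChunk = 64 * 1024;  // 64 KB chunks
        std::vector<unsigned char*> chunks_;
        ChunkedBuffer(const ChunkedBuffer&);             // not copyable
        ChunkedBuffer& operator=(const ChunkedBuffer&);  // in this sketch
    };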

--- Roger Bedell <roger@sylvanmaps.com> wrote:

I am also confused about memory allocation. For the life of me, I cannot get Windows to allocate more memory than is contiguous in physical memory. In other words, if I do a GlobalAlloc for a fairly large chunk (perhaps 1 GB or so), it always fails, even though I have lots of virtual memory. Any suggestions?
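(For reference, a minimal Win32 sketch of the kind of call being described. Note that GlobalAlloc and VirtualAlloc both need a contiguous range of the process's virtual address space, not of physical memory, so on 32-bit Windows a fragmented 2 GB address space can make a ~1 GB request fail even with plenty of free RAM and page file.)

    #include <windows.h>
    #include <stdio.h>

    int main()
    {
        SIZE_T size = 1024u * 1024u * 1024u;  // ~1 GB

        // Classic heap-style allocation, as described above.
        HGLOBAL h = GlobalAlloc(GMEM_FIXED, size);
        printf("GlobalAlloc:  %s (error %lu)\n",
               h ? "ok" : "failed", h ? 0ul : GetLastError());
        if (h) GlobalFree(h);

        // Direct reservation of address space; fails the same way if
        // no contiguous 1 GB range of addresses is free.
        void* p = VirtualAlloc(NULL, size,
                               MEM_RESERVE | MEM_COMMIT, PAGE_READWRITE);
        printf("VirtualAlloc: %s (error %lu)\n",
               p ? "ok" : "failed", p ? 0ul : GetLastError());
        if (p) VirtualFree(p, 0, MEM_RELEASE);

        return 0;
    }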