2002.08.21 11:30 "Re: OT: large memory allocation in Windows", by Peter Majer
The problem of heap fragmentation cannot be that rare. What
are the standard, proven ways to solve it? I am quite surprised
by the lack of information on this topic.
The solutions that spring to my mind are:
- 64-bit compilation
- reorganizing the memory requirements of the application so that large contiguous blocks are not necessary, e.g. storing the data of a large image as an array of rows.
- make sure the application can run forever with the memory that can be allocated at the beginning. In the case of applications working with images, we might allocate the one image we load plus another one for processing purposes, and make sure that we never free these and never need to allocate new large blocks.
- Also, I have been thinking about the third point you mention as a solution. Does anyone by chance know of an auto-pointer class out there that helps with memory reorganization?
- a last (awful) solution is to write a memory class which maps chunks onto some virtual contiguous memory (looks like a hell of a job to me). Memory allocated by this class could only be accessed through the class, not by pointer arithmetic like "x = *p + 35".
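To make the row-wise idea concrete, here is a minimal C++ sketch (the class name, pixel type, and dimensions are invented for illustration): each row gets its own allocation, so the allocator only has to find many small blocks instead of one huge one, and fragmentation matters far less.

```cpp
#include <cstddef>
#include <vector>

// Store a large image as one allocation per row instead of a single
// contiguous block. No single allocation is larger than one row, so a
// fragmented heap can still satisfy the request.
class RowImage {
public:
    RowImage(std::size_t width, std::size_t height)
        : rows_(height, std::vector<unsigned char>(width)) {}

    unsigned char& at(std::size_t x, std::size_t y) { return rows_[y][x]; }

    std::size_t width() const { return rows_.empty() ? 0 : rows_[0].size(); }
    std::size_t height() const { return rows_.size(); }

private:
    std::vector<std::vector<unsigned char>> rows_;  // one block per row
};
```

The price is that the pixels can no longer be walked with plain pointer arithmetic over a single buffer; all access has to go through the row lookup.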
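The last ("awful") idea might look roughly like this sketch (class name and chunk size are invented for illustration): a class that presents a logically contiguous index range but stores the bytes in fixed-size chunks, so it never needs one large allocation, and elements can only be reached through operator[], never through raw pointer arithmetic across the whole range.

```cpp
#include <cstddef>
#include <vector>

// "Memory class" sketch: a logically contiguous byte array backed by
// many small chunks. The index is split into a chunk number and an
// offset within that chunk.
class ChunkedBuffer {
public:
    explicit ChunkedBuffer(std::size_t size)
        : size_(size),
          chunks_((size + kChunkSize - 1) / kChunkSize,
                  std::vector<unsigned char>(kChunkSize)) {}

    unsigned char& operator[](std::size_t i) {
        return chunks_[i / kChunkSize][i % kChunkSize];
    }

    std::size_t size() const { return size_; }

private:
    static constexpr std::size_t kChunkSize = 64 * 1024;  // 64 KB per chunk
    std::size_t size_;
    std::vector<std::vector<unsigned char>> chunks_;
};
```

It is indeed a fair amount of work once iterators, resizing, and spans that cross chunk boundaries are needed, which is presumably why it feels like "a hell of a job".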
--- Roger Bedell <firstname.lastname@example.org> wrote:
I am also confused about memory allocation. For the life of me, I cannot get Windows to allocate more memory than is contiguous in physical memory. In other words, if I do a GlobalAlloc for a fairly large chunk (perhaps 1 GB or so), it always fails, even though I have lots of virtual memory. Any suggestions?