Hi, using native C++ in DLLs called from a VB.NET application, I want to dynamically allocate up to 3 GB of memory, in stages, using the "new" operator.
The following code (an example to illustrate the problem) produces this error: "Unhandled exception at 0x7c81eb33 in VMTest.exe: Microsoft C++ exception: std::bad_alloc at memory location 0x0012fe10."
I have configured the virtual memory under Control Panel | System Properties | Advanced tab | Settings, first as "System managed size" and then customised with Initial size = 756 and Maximum size = 8192, with no success. I am using VS2005 on XP SP2.
Please can anybody advise me on a way forward using the "new" command or an alternative.
#include <tchar.h>
using namespace std;
int _tmain(int argc, _TCHAR* argv[]) {
    int count = 1000000000;
    char* buffer = new char[count]; // throws std::bad_alloc when the request cannot be satisfied
    delete[] buffer;
    return 0;
}
Hi Cedric, system requirements change. The application I am working on now has to process large amounts of raw (binary) data. I was under the impression that Windows XP virtual memory management would take care of data larger than the RAM size. I could change the program so that I read chunks, e.g. 0.5 GB at a time, but this would impact other parts of the system.
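For what it's worth, reading in chunks doesn't have to mean one giant contiguous block. A minimal sketch (function and variable names are mine, not from the original post) that allocates a large total as a list of fixed-size chunks, so each allocation only needs a contiguous run of that chunk size in the address space:

```cpp
#include <vector>
#include <cstddef>
#include <new>

// Allocate `total` bytes as a series of chunks of at most `chunkSize` bytes.
// Each chunk is a separate new[] allocation, so the address space does not
// need one huge contiguous free range.
std::vector<char*> AllocateInChunks(std::size_t total, std::size_t chunkSize)
{
    std::vector<char*> chunks;
    for (std::size_t done = 0; done < total; done += chunkSize) {
        std::size_t thisChunk = (total - done < chunkSize) ? (total - done) : chunkSize;
        chunks.push_back(new char[thisChunk]); // still throws std::bad_alloc on failure
    }
    return chunks;
}
```

Each element would later be released with delete[]. The trade-off the replies below discuss still applies: touching all of it at once will page heavily regardless of how it was allocated.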
My current development PC has 1 GB of memory, so I will be upgrading as soon as possible.
Windows XP virtual memory management would take care of data larger than RAM size
It will, but if you're going to be manipulating those huge chunks of memory at once - as in repeatedly allocating 4 GB and then running through the entire 4 GB allocation - your system is going to be constantly thrashing on the paging file. The additional time spent accessing the hard drive is probably going to be greater than whatever time you would gain by rewriting your logic to use the bigger chunks of memory. Processor speeds blow away disk access speeds.
Also, even if you put more than 4 GB of physical memory in your machine, XP cannot access it. It has a hard upper limit of 4 GB of address space (2 GB per process using the default settings).
If you want to do this sort of thing you're probably best off using the Virtual Memory Win32 API, ::VirtualAlloc etc. Windows doesn't cope all that well if you try to grab all the RAM off it through the normal heap mechanism. You're also incurring CRT overhead, and possibly tripping over inherent 2 GB limits in the CRT malloc etc. implementations, when you use new.

If you don't want to use virtual memory you could use the Win32 Heap API, e.g. ::HeapAlloc, instead. Depending on how much RAM you've actually got, you might get away with it. If you do, my advice is to make sure your allocation sizes are multiples of 64 KB and don't make them larger than about 4 MB at a time. That might help Windows cope well enough.

There's all sorts of weirdness going on underneath, as ever, with disk cache sizes and paging things in and out, so you might want to target a specific and relatively recent Windows version as well, e.g. at least Server 2003, by defining _WIN32_WINNT 0x0502 and WINVER 0x0502. I'm not certain this will help, but it's worth a try.
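To make the ::VirtualAlloc suggestion concrete, here is a minimal sketch of the usual reserve-then-commit pattern (sizes and function names are illustrative, not from the original post):

```cpp
#include <windows.h>

// MEM_RESERVE claims a contiguous range of address space without backing
// it with any storage; nothing is paged in yet.
char* ReserveBigBuffer(SIZE_T reserveBytes)
{
    return (char*)::VirtualAlloc(NULL, reserveBytes, MEM_RESERVE, PAGE_NOACCESS);
}

// MEM_COMMIT then backs a sub-range with pagefile/RAM as it is actually
// needed, so you never ask the OS for 3 GB of storage in one go.
bool CommitRange(char* base, SIZE_T offset, SIZE_T bytes)
{
    return ::VirtualAlloc(base + offset, bytes, MEM_COMMIT, PAGE_READWRITE) != NULL;
}
```

The whole range is released with ::VirtualFree(base, 0, MEM_RELEASE). Note the reservation still has to fit in the process's user address space, so the 2 GB / 3 GB limits discussed below apply here too.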
"The secret of happiness is freedom, and the secret of freedom, courage."
Thucydides (B.C. 460-400)
I believe that applications need to be flagged to indicate that they are able to handle a memory address space larger than 2 GB. In VC++ 6.0 this was the /LARGEADDRESSAWARE linker switch.
Remember that allocation of memory requires two things: (1) you need available free memory that can be allocated, and (2) you need enough contiguous addresses in your application's address space to access that memory through.
When you start heading beyond 2GB, you may need to be starting the OS with special flags (/3GB in the boot.ini file) which forces it to only reserve 1GB of address space instead of 2GB.
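For reference, the /3GB switch goes on the OS line in boot.ini; the exact ARC path varies per install, so this line is purely illustrative, and the application must also be linked with /LARGEADDRESSAWARE for the extra address space to be usable:

```
multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Microsoft Windows XP" /fastdetect /3GB
```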
You should Google for the linker switch I mentioned above, for "AWE" (Address Windowing Extensions), and for the "/3GB" switch for more information. Also, look into the VirtualAlloc(...) and related functions for handling large amounts of memory.
-=- James
"If you think it costs a lot to do it right, just wait until you find out how much it costs to do it wrong!"
then, using the GetCommState() function for reading the settings of the port and SetCommState() for setting the desired values, and trying to read the input buffer with ReadFile(), I stumbled on a problem:
When starting my app, this reports no error at all, but it doesn't detect any input on the serial port (in my case a barcode scanner); the reading function stops on a timeout and the handle is closed. If I now start the Windows HyperTerminal with the appropriate settings for the scanner communication (even while my app is still running) and quit HyperTerminal again (even if I didn't scan any code - just establishing the connection and terminating it), and then start my app again, the scanned codes are shown. If I check the settings of the DCB, they are the same whether I run my app or HyperTerminal.
Do I have to define something, somewhere regarding the "COM1" string used in the CreateFile section, and if so how is it done?
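Regarding the timeout mentioned above: how long ReadFile() waits on a COM port before returning empty-handed is controlled by SetCommTimeouts(), separately from the DCB. A sketch with illustrative values (a zero-byte read with no error is a timeout, not a failure):

```cpp
#include <windows.h>

// Configure how long ReadFile() blocks on the already-opened port handle.
void ConfigureTimeouts(HANDLE h)
{
    COMMTIMEOUTS to = {0};
    to.ReadIntervalTimeout        = 50;   // max ms between two received bytes
    to.ReadTotalTimeoutMultiplier = 0;    // no per-byte component
    to.ReadTotalTimeoutConstant   = 5000; // overall wait for a scan, in ms
    ::SetCommTimeouts(h, &to);
}
```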
then using the GetCommState() function for reading the settings of the port and SetCommState() for setting the desired values
What are you setting with SetCommState? The baud rate?
Make sure that you specify the correct parity, stop bits and byte size. Does your barcode scanner use some kind of handshaking? That might be the problem.
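One hedged guess about the "works only after HyperTerminal" symptom: HyperTerminal raises the DTR and RTS lines when it opens the port, and some scanners will not transmit until they see DTR asserted. A sketch of opening and configuring the port with those lines set explicitly (the 9600 8N1 settings are illustrative, not taken from the thread):

```cpp
#include <windows.h>

// Open COM1 and configure it explicitly, including the DTR/RTS line state.
HANDLE OpenScannerPort()
{
    HANDLE h = ::CreateFile(TEXT("COM1"), GENERIC_READ | GENERIC_WRITE, 0, NULL,
                            OPEN_EXISTING, 0, NULL);
    if (h == INVALID_HANDLE_VALUE)
        return h;

    DCB dcb = {0};
    dcb.DCBlength = sizeof(dcb);
    ::GetCommState(h, &dcb);
    dcb.BaudRate = CBR_9600;      // adjust to the scanner's settings
    dcb.ByteSize = 8;
    dcb.Parity   = NOPARITY;
    dcb.StopBits = ONESTOPBIT;
    dcb.fDtrControl  = DTR_CONTROL_ENABLE; // assert DTR, as HyperTerminal does
    dcb.fRtsControl  = RTS_CONTROL_ENABLE; // assert RTS
    dcb.fOutxCtsFlow = FALSE;              // no hardware handshaking
    dcb.fOutxDsrFlow = FALSE;
    ::SetCommState(h, &dcb);
    return h;
}
```

If the scanner really needs no handshaking, this should make the port state identical whether or not HyperTerminal ran first.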
If I look in HyperTerminal, the flow control is set to NONE, so I will try to set it to NONE in the DCB as well.
But how is it possible that the same app runs well when HyperTerminal has been started and stopped before the app starts to communicate, yet doesn't communicate when I start it after rebooting Windows without opening HyperTerminal?