Rich
Hello all
I have an application that is segmented into various DLLs, one of
which contains caching mechanisms that create very large in-memory
data structures.
No matter what I do to allow the application to utilize 2-3GB of RAM,
both through the standard /3GB, /4GT, and /PAE switches for the system
and combinations of /LARGEADDRESS... etc. for the app config, the app
always fails right at the *1GB* mark.
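For reference, here is the combination I understand those switches to mean, in case I have one of them wrong (the boot.ini entry is illustrative, and I believe the full name of the image flag is /LARGEADDRESSAWARE):

```shell
rem boot.ini entry (illustrative): /3GB gives each large-address-aware
rem process a 3:1 user/kernel split instead of the default 2:2
multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Windows" /3GB

rem Mark the image as able to use addresses above 2GB. If I understand
rem correctly, the flag must be on the .exe that hosts the process, so
rem for a Java server that would be java.exe itself, not the JNI DLLs.
editbin /LARGEADDRESSAWARE java.exe
```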
Now this is very confusing to me. I have read everything I can get my
hands on about the NT(ish) kernel, and there should be at least 2GB of
user space available with the regular 2:2 split and 3GB available with
the 3:1 split.
Why is it dying at 1GB?
I found some things about shared memory in older Windows systems with
limits at 1GB, but I have found no smoking gun that indicates why this
would be happening here.
The machine has 4GB of physical RAM, the OS recognizes all of it,
there is a 4096k swap file allocated, and nothing else is loaded or
running on the machine during testing.
The only complicating factor is that all the DLLs are JNI wrappers for
C++ libraries, and the application is executing through a Java server.
I mention the Java part last because I have completely ruled out
causes from that direction and am left with what I believe to be a
Windows/C++/DLL problem.
My best guesses right now, and I may be totally off, are any of:
- a project configuration problem in VC++
- a server configuration problem
- an undocumented DLL memory limitation
- an effect of running through Java, which may be executing in some
protected legacy mode in the OS with a 1GB limit
I also found some materials about forcing relocation of the DLLs into
lower memory to make more space available via /rebase.
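For what it's worth, the invocation I found for the Platform SDK rebase tool looks like this (the base address and DLL names here are just placeholders, not my actual files):

```shell
rem Pack the JNI wrapper DLLs into a contiguous range low in the
rem address space so they don't fragment the region a large heap
rem needs (0x10000000 and the DLL names are illustrative placeholders)
rebase -b 0x10000000 wrapper1.dll wrapper2.dll wrapper3.dll
```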
Any thoughts from anyone? Please!
Thanks
Rich