64 bit memory usage
I am trying to understand how much memory is available to a 64-bit Python process running under Windows XP 64-bit.

When I run tests that just create a series of large dictionaries containing string keys and float values, I do not seem to be able to grow the process beyond the amount of RAM present.

For example, on a box with 2GB RAM and a 3GB pagefile the process stalls at around 2GB. On another machine with 16GB RAM and a 24GB pagefile the process stalls at 16GB.

In other tests, where a C++ program loads and runs the Python DLL, memory usage grows to 40GB if the allocations are made on the C++ side, but if Python is used to grab the memory it can still only grow to 16GB. With this program, if memory usage is pushed past 16GB on the C++ side, attempting to grab any more from Python crashes the process.

I was under the impression that Python could grab as much memory as other programs. Can anyone tell me what is happening, or where I may be going wrong?

Thanks,
Rob Randall
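[For reference, a minimal sketch of the kind of test described above. The dictionary count, entries per dict, and key format are assumptions, not the exact script from this post; it simply builds large dicts of string keys and float values until the process stalls or raises MemoryError.]

import sys

def grow_memory(num_dicts=200, entries_per_dict=1000000):
    # Keep every dict alive in a list so the process keeps growing.
    holders = []
    try:
        for i in range(num_dicts):
            d = {}
            for j in range(entries_per_dict):
                d["key-%d-%d" % (i, j)] = float(j)
            holders.append(d)
            print("dicts held: %d" % len(holders))
            sys.stdout.flush()
    except MemoryError:
        print("MemoryError after %d dicts" % len(holders))
    return holders

if __name__ == "__main__":
    grow_memory()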
Re: 64 bit memory usage
But the C++ program using up memory does not slow down. It has gone to 40GB without much trouble.

Does anyone have a 64-bit Python application that uses more than 2GB?

On 9 December 2010 16:54, Antoine Pitrou wrote:
> On Wed, 8 Dec 2010 14:44:30 +, Rob Randall wrote:
> > I am trying to understand how much memory is available to a 64 bit python
> > process running under Windows XP 64 bit.
> > [...]
> > On another machine with 16GB RAM and 24GB pagefile the process stalls at
> > 16GB.
>
> How is it surprising? When you go past the available RAM, your process
> starts swapping and everything becomes incredibly slower.
>
> Regards
>
> Antoine.
Re: 64 bit memory usage
Basically the process runs at around 1% (CPU) and never seems to grow in size again.

When running the C++-with-Python app, the process slows when a new 'page' is required but then goes back to 'full' speed. It does this until basically all the virtual memory is used.

I have had memory exceptions when running the same sort of stuff on 32-bit, but never on 64-bit.

On 9 December 2010 16:54, Antoine Pitrou wrote:
> [...]
> How is it surprising? When you go past the available RAM, your process
> starts swapping and everything becomes incredibly slower.
Re: 64 bit memory usage
I will give it a try with the garbage collector disabled.

On 9 December 2010 17:29, Benjamin Kaplan wrote:
> On Thursday, December 9, 2010, Rob Randall wrote:
> > But the C++ program using up memory does not slow down.
> > It has gone to 40GB without much trouble.
>
> Your C++ program probably doesn't have a garbage collector traversing
> the entire allocated memory looking for reference cycles.
>
> > Does anyone have a 64-bit Python application that uses more than 2GB?
> > [...]
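[A minimal illustration of what "with the garbage collector disabled" means in practice; the allocation step below is a placeholder, not Rob's actual test.]

import gc

gc.disable()      # stop the cyclic garbage collector from scanning every object
try:
    # ... run the dictionary-building test here ...
    pass
finally:
    gc.enable()   # reference counting still frees objects throughout;
                  # only cycle detection was switched off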
Re: 64 bit memory usage
You guys are right. If I disable the gc it will use all the virtual RAM in my test.

The application I have been running these tests for is a port of a program written in a LISP-based tool running on Unix. It does a mass of stress calculations.

The port has been written using a Python-based toolkit I am responsible for. This toolkit offers much of the same functionality as the LISP tool. It is based around the use of demand-driven/declarative programming.

When the porting project started, no one realised just how much memory the heaviest of the test cases used. It uses 40+ GB on an HP Unix machine.

It is easy to see now that the port should have been written differently, but it is essentially complete now.

This has led me to see whether a hardware solution can be found using 64-bit Windows machines.

I will try running one of the tests next to see what impact disabling the gc will have.

Thanks,
Rob.

On 9 December 2010 22:44, John Nagle wrote:
> On 12/8/2010 10:42 PM, Dennis Lee Bieber wrote:
>> On Wed, 8 Dec 2010 14:44:30 +, Rob Randall declaimed the following in
>> gmane.comp.python.general:
>>> I am trying to understand how much memory is available to a 64 bit python
>>> process running under Windows XP 64 bit.
>>> [...]
>
> If you get to the point where you need multi-gigabyte Python
> dictionaries, you may be using the wrong tool for the job.
> If it's simply that you need to manage a large amount of data,
> that's what databases are for.
>
> If this is some super high performance application that needs to keep a
> big database in memory for performance reasons, CPython
> is probably too slow. For that, something like Google's BigTable
> may be more appropriate, and will scale to terabytes if necessary.
>
> John Nagle
Re: 64 bit memory usage
I managed to get my Python app past 3GB on a smaller 64-bit machine.

On a test to check memory usage with the gc disabled, only an extra 6MB was used: the figures were 1693MB versus 1687MB.

This is great. Thanks again for the help.

On 10 December 2010 13:54, Rob Randall wrote:
> You guys are right. If I disable the gc it will use all the virtual RAM in
> my test.
> [...]
> I will try running one of the tests next to see what impact disabling the
> gc will have.
>
> Thanks,
> Rob.
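[For anyone who wants to repeat that before/after comparison, one way it could be measured is sketched below. This is only a sketch: it assumes the third-party psutil package (never mentioned in this thread) for reading the process working set, and allocate() is a made-up stand-in workload, not the real application.]

import gc
import psutil

def working_set_mb():
    # resident/working set size of the current process, in MB
    return psutil.Process().memory_info().rss / (1024.0 * 1024.0)

def allocate(num_dicts=20, entries_per_dict=500000):
    # stand-in workload: large dicts with string keys and float values
    return [{"k-%d-%d" % (i, j): float(j) for j in range(entries_per_dict)}
            for i in range(num_dicts)]

def run_test(disable_gc):
    if disable_gc:
        gc.disable()
    try:
        before = working_set_mb()
        data = allocate()
        after = working_set_mb()
        print("gc disabled=%s: %.0f MB -> %.0f MB" % (disable_gc, before, after))
        return data
    finally:
        gc.enable()

run_test(disable_gc=False)
run_test(disable_gc=True)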
