@Daniel Eggleston You may be right, as I was writing a very large NumPy 
array to a file in Python, but due to some issues I had to terminate the 
execution partway through. The RAM usage may be due to that processed 
NumPy array that needed to be written at the end of the script. I was 
monitoring RAM and other resources with System Monitor in Ubuntu. Yes, 
the swap usage dropped from 14 GB to 8 GB. Could the file buffers be the 
cause of this huge RAM usage? How can I reclaim this unused, wasted RAM? 
Please help
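A quick way to check whether that "used" RAM is really the kernel's page cache (which Linux reclaims automatically when programs need memory) is to read /proc/meminfo. This is a minimal sketch for a Linux system with /proc mounted; the drop_caches and swapoff commands at the end are shown as comments because they need root and are normally unnecessary:

```shell
# "Cached" is the kernel page cache built up while reading/writing files
# (e.g. a large NumPy array). It is reclaimable, not leaked memory.
# "MemAvailable" estimates how much RAM is effectively free, cache included.
grep -E '^(MemTotal|MemFree|MemAvailable|Cached|SwapTotal|SwapFree):' /proc/meminfo

# To force the kernel to drop clean page cache, dentries, and inodes
# (root only; usually pointless, since the cache is reclaimed on demand):
#   sync && echo 3 > /proc/sys/vm/drop_caches
# To pull pages back out of swap into RAM (root only; needs enough
# free RAM to hold the swapped-out pages):
#   swapoff -a && swapon -a
```

If MemAvailable is large, the memory is not "wasted": the kernel is simply caching file data and will hand it back the moment a process asks for it.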

On Thursday, May 28, 2020 at 5:54:27 PM UTC+5:30, Rahul Gupta wrote:
>
> I have an Ubuntu system with 125 GB of RAM. I executed a few Python 
> scripts on it that use NumPy arrays and pandas. The execution is over, 
> but 50 GB of RAM, 2 GB of cache, and 8.4 GB of swap are still occupied. 
> At this moment nothing is running on the system. I have googled this; 
> most of the results say that the Python garbage collector performs 
> poorly. I want this memory to be cleaned up and reclaimed. The easiest 
> way is to restart the system, but I don't want to restart; I want a way 
> to do this while the system is up and running. Kindly tell me how to do 
> this. Thanks
>

-- 
You received this message because you are subscribed to the Linux Users Group.
To post a message, send email to [email protected]
To unsubscribe, send email to [email protected]
For more options, visit our group at 
http://groups.google.com/group/linuxusersgroup
References can be found at: http://goo.gl/anqri
Please remember to abide by our list rules (http://tinyurl.com/LUG-Rules)