I am creating and deleting several filters, and I thought I had noticed 'memory leakage' in ParaView when creating a filter and then deleting it; now I am sure I am experiencing it. My RAM is enough to open an image once with the ImageReader() function, which creates the reader on the pipeline. This consumed a lot of RAM and worked correctly, but then I selected the created object on the pipeline and deleted it, and in the Python console I imported gc and ran gc.collect(), yet the RAM is still 'consumed'. If I run the image read again, ParaView closes for lack of memory.
Is there any way in Python to free this memory once it is no longer used? I would like to open the files sequentially, one by one, in the same ParaView instance, as I am limited in RAM.
I am opening the raw data that I shared with you yesterday (as the TIFF importer looks like it has a bug, I skipped the intermediate TIFF format and successfully imported the raw data directly using the image reader). So I have a .vol file containing the raw data; I open it in ParaView (Open File → show all files → select the .vol file), then select ImageReader to import it, and it works as expected, reading the data correctly.
This takes a big RAM consumption, 32 GB, which I have enough of. I would like to import the file, cut a slice, export the slice, and then continue with another similar file of the same size. The problem is that when I delete the first filter, the memory is not freed: I still have 47 GB consumed in total (shown by htop and by the RAM bar in ParaView). Therefore if I open a second file, ParaView will crash, as I don't have 47 + 32 GB.
I tried gc.collect() but nothing changed. I would like to keep the rest of the session pipeline unmodified, so I am not resetting the session.
That is more or less my case: I want to loop over several files, import each one, do some processing, extract a small section to export, and then import the next one.
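The loop I have in mind looks roughly like this as a pvpython sketch. The raw-reader properties (FilePrefix, extent, scalar type, byte order) and the Slice/SaveData parameters below are illustrative placeholders; a Python trace of the working GUI import would give the authoritative values:

```python
def process_volumes(vol_paths, out_dir):
    """Runs inside pvpython / ParaView's Python shell.

    Reads each .vol file, exports one slice, and tears the pipeline
    down before loading the next (equally large) file.
    """
    import gc
    from paraview.simple import ImageReader, Slice, SaveData, Delete

    for i, path in enumerate(vol_paths):
        # Raw-reader properties (DataExtent, DataScalarType, byte order)
        # omitted here; copy them from a trace of the working GUI import.
        reader = ImageReader(FilePrefix=path)

        sl = Slice(Input=reader)
        sl.SliceType.Origin = [0.0, 0.0, 0.0]   # illustrative values
        sl.SliceType.Normal = [0.0, 0.0, 1.0]

        SaveData('%s/slice_%03d.vtp' % (out_dir, i), proxy=sl)

        # Delete the proxies and drop the Python references before
        # the next file is read, downstream filter first.
        Delete(sl)
        Delete(reader)
        del sl, reader
        gc.collect()
```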
Thanks in advance.
In the case of the GUI, I select the filter and simply delete it.
In the case of Python, I tested both options: selecting the filter and simply deleting it, and using Delete(filter).
In neither case is the RAM freed, so when I run the Python function again OR import exactly the same file via the GUI (so memory should be enough; when importing the first time I had around 5 GB of extra free memory), ParaView simply crashes for lack of memory.
Opening and closing the file repeatedly shows only a marginal increase in memory usage here. While the RAM may not appear freed as soon as the source is deleted, it is usable by the application when reallocating. There is no leak as far as I can tell.
I'll need to run this test on a machine able to actually open the file, because I had to simulate it on my machine with only 32 GB; this will take me some time, I'm afraid.
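This "not returned to the OS, but reusable" behavior is how most allocators work: freed memory can stay attached to the process (so htop still counts it) while the next allocation of similar size lands in the freed space. A minimal pure-Python illustration, unrelated to ParaView itself, showing peak usage staying roughly flat across a free/reallocate cycle (the `resource` module is POSIX-only, and `ru_maxrss` units differ between Linux and macOS, so only the relative growth is meaningful):

```python
import gc
import resource

def peak_rss():
    # Peak resident set size of this process
    # (kilobytes on Linux, bytes on macOS).
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss

block = bytearray(200 * 1024 * 1024)    # ~200 MB allocation
peak_after_first = peak_rss()

del block                               # free it ...
gc.collect()

block = bytearray(200 * 1024 * 1024)    # ... then allocate the same amount

peak_after_second = peak_rss()

# If the freed memory were truly leaked, the second allocation would push
# the peak up by roughly another 200 MB; instead it reuses the freed space.
grew = (peak_after_second - peak_after_first) / peak_after_first
print('peak RSS grew by %.1f%%' % (100 * grew))
```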
Maybe a way to test it is to open an image that the machine can handle once but not twice. For example, if you have 16 GB, open something in the 8 GB realm and then re-open it?
I don't know; I suspect it is something in the realm of ParaView reusing the 'pre-allocated clean/not-clean memory' only when it needs a partial piece of it, or something like that, since on the second pass I don't have enough free memory and would need to reuse a lot of that 'pre-allocated clean/not-clean memory'.