So I’m wondering whether there’s a limit on input file size in ParaView, or whether the MemoryError happened just because my local machine doesn’t have enough memory. And is there any way I can run this on my local machine, or does it have to be on HPC?
Yeah, this could be a solution; I’ll try it out. And yes, I’m using the PVGeo-CMAQ reader, and the file contains 25 timesteps (it looks like the file I sent you last time, but at a bigger scale). Do I have to update the script in both PVGeo and PVGeo-HDF5, or only in PVGeo-HDF5?
Thanks!
You will only need to update the code for the PVGeo-CMAQ reader in PVGeo-HDF.
To outline the changes needed: instead of reading all of the data up front (which is what the PVGeo-CMAQ reader currently does), you’ll update it to get the requested time step and pass that timestep to the reading function, so that only the requested part of the file is read.
So change lines 190-193 to pass the requested timestep on every RequestData call. Then change _ReadUpFront to take a timestep index, and restructure that function so it only grabs the data needed for that one time step.
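The restructuring above could look roughly like the sketch below. This is only an illustration of the lazy, per-timestep pattern, not the actual PVGeo-CMAQ code: the class and method names (`LazyTimestepReader`, `request_data`, `_read_timestep`) are hypothetical, and a plain dict stands in for the open HDF5 file so the example is self-contained.

```python
class LazyTimestepReader:
    """Sketch: read one timestep per request instead of loading the whole file."""

    def __init__(self, source):
        # `source` stands in for an open HDF5 file handle; here it is a
        # dict of {timestep_index: data} so the example runs on its own.
        self._source = source
        self._cache = {}  # hold at most the most recently read step

    def _read_timestep(self, i):
        # Analogous to a restructured _ReadUpFront that takes a timestep
        # index: only the requested slice is pulled from the file.
        if i not in self._cache:
            self._cache.clear()      # drop the previous step to bound memory
            self._cache[i] = self._source[i]
        return self._cache[i]

    def request_data(self, requested_time_index):
        # Analogous to RequestData: on every call, map the pipeline's
        # requested time to a timestep index and read only that slice.
        return self._read_timestep(requested_time_index)


# Usage: a fake "file" with 25 timesteps, mirroring the file described above.
fake_file = {i: [i] * 4 for i in range(25)}
reader = LazyTimestepReader(fake_file)
step = reader.request_data(3)   # only timestep 3 is ever held in memory
```

The key design point is that memory use is bounded by one timestep's worth of data rather than the full file, which is what makes a 50GB file viable on a local machine.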
This might be a bit clunky, but it should allow you to visualize the 50GB data file on your local machine.