I have ParaView 5.10 installed on Ubuntu 20.04. The workstation has two AMD Radeon VII cards. However, when I run it, the cards appear not to be utilised and RAM usage shoots up.
How can I make the best use of my resources?
Please share your Help → About content.
Qt Version: 5.12.9
vtkIdType size: 64bits
Embedded Python: On
Python Library Path: /home/ws1/paraview/lib/python3.9
Python Library Version: 3.9.5 (default, Sep 24 2021, 21:29:01) [GCC 7.3.1 20180303 (Red Hat 7.3.1-5)]
Python Numpy Support: On
Python Numpy Path: /home/ws1/paraview/lib/python3.9/site-packages/numpy
Python Numpy Version: 1.21.1
Python Matplotlib Support: On
Python Matplotlib Path: /home/ws1/paraview/lib/python3.9/site-packages/matplotlib
Python Matplotlib Version: 3.2.1
Python Testing: Off
MPI Enabled: On
ParaView Build ID: superbuild 804a787c9ac8f9b0f8abc7c3001d1e61eeb15208 (!922)
Disable Registry: Off
SMP Backend: TBB
SMP Max Number of Threads: 48
OpenGL Vendor: AMD
OpenGL Version: 4.6 (Core Profile) Mesa 21.0.3
OpenGL Renderer: AMD Radeon VII (VEGA20, DRM 3.40.0, 5.11.0-41-generic, LLVM 12.0.0)
Remote Connection: No
You are already using your GPU.
I don't see any benefit from the GPU. My computer freezes as soon as I load a 4 GB unstructured dataset.
Sounds like a RAM issue, not a GPU issue. Monitor your memory.
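One way to monitor it is a small one-shot Python reader of `/proc/meminfo` (Linux-specific); run it under `watch` in a second terminal while loading the dataset in ParaView. This is just a sketch — `tools` like `free -h` or `htop` report the same numbers:

```python
def meminfo_mb():
    """Parse /proc/meminfo into a dict of values in MB (fields are kB)."""
    info = {}
    with open('/proc/meminfo') as f:
        for line in f:
            key, value = line.split(':')
            info[key.strip()] = int(value.split()[0]) // 1024  # kB -> MB
    return info

# One-shot report; poll it live with:  watch -n 2 python3 memwatch.py
m = meminfo_mb()
print(f"available: {m['MemAvailable']} MB / total: {m['MemTotal']} MB")
```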
I notice that ParaView is using as much as 33 GB of memory (out of 64 GB total) before the system freezes. ParaView also defaults to 'Surface' representation even after I tried to set 'Outline' as the default.
radeontop output suggests that the GPUs are idle.
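If the GUI keeps reverting to Surface, one workaround is to open the file from `pvpython` and force the Outline representation per source, so the full surface geometry is never built. A minimal sketch, assuming ParaView's `paraview.simple` module is on the path (the dataset path is a placeholder):

```python
# pvpython sketch: load a dataset and show only its outline
from paraview.simple import OpenDataFile, GetActiveViewOrCreate, Show, Render

reader = OpenDataFile('/path/to/dataset.vtu')   # placeholder path
view = GetActiveViewOrCreate('RenderView')

rep = Show(reader, view)
rep.Representation = 'Outline'  # skip building the full surface geometry
Render()
```

This does not change the application-wide default, but it keeps the initial memory footprint small while you inspect the data.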
Please share your dataset.