NVIDIA IndeX renderer failed

Hi all. I am trying to use NVIDIA IndeX for volume rendering. I managed to load the plugin with the latest version of ParaView 5.7. Unfortunately, it shows a bunch of errors when rendering.

For your information, I am using an NVIDIA GTX 980 Ti with the proprietary driver (NVIDIA-SMI 430.26, Driver Version 430.26, CUDA Version 10.2). I am not sure whether my graphics card is not powerful enough, since the case contains 47 million cells, or whether my graphics driver has a problem.

Thank you for your concern.
[Screenshot from 2019-10-08 18-50-05 showing the error window]

The third line in the error window in the screenshot you posted indicates the problem. More RAM was requested than is on your GPU (which, btw, has a lot of RAM). 47 million cells is a good size dataset for a single GPU - if you want to do parallel rendering, you can contact NVIDIA about a license for IndeX that covers parallel usage (the version that ships with the plugin operates only in a single process).


Hi Cory. Thank you for your reply. Does that mean I need a graphics card with at least 13 GB of RAM to render this?

Besides, can you provide more information about parallel rendering?

Thank you very much

That is what the error message suggests. Otherwise, you can reduce the amount of data you are trying to render by subsetting it in some way.
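Subsetting can be as simple as taking every other sample along each axis, which cuts the cell count (and memory) by roughly 8x for a 3D structured volume. A minimal NumPy sketch of the idea, using a made-up 360³ array that happens to match your ~47 million cells (in ParaView itself you would do the equivalent with the Extract Subset filter and its sample-rate settings):

```python
import numpy as np

# Hypothetical structured volume for illustration; real data would come
# from your file reader. 360^3 is about 47 million cells.
volume = np.random.rand(360, 360, 360).astype(np.float32)

# Stride-2 subsampling along each axis: ~1/8 the cells, ~1/8 the memory.
subset = volume[::2, ::2, ::2]

print(volume.nbytes // 2**20, "MB ->", subset.nbytes // 2**20, "MB")
# prints: 177 MB -> 22 MB
```

Note this is only the raw scalar array; the renderer's own working memory on the GPU can be a large multiple of that, which is why the request in your error message is so much bigger than the dataset itself.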

Please see the NVIDIA IndeX web page for how to obtain support on using IndeX in parallel.

> I need a graphics card that has at least 13 GB of RAM to render?

The only consumer-grade NVIDIA GPU with more than 13 GB of memory currently is the TITAN RTX.
It's quite an investment.