OptiX Ray Tracing

Hello community,

I was looking forward to the 5.7 release when I read that it would support the new NVIDIA RTX cards, so I downloaded the first RC of 5.7 and set up a scene to try it with the RTX 2080 in my workstation.
I can now choose the OptiX backend when ray tracing is on, but how can I check whether it actually works? My CPU shows usage peaks around 100% when I change the view, which was typical when using OSPRay, but I was expecting a lot more GPU load when choosing OptiX.

Or was my intuition just wrong?

So, is there a way to know whether the RTX cores are actually being used?
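In the meantime, one rough check I can think of is watching GPU utilization with nvidia-smi while interacting with the view. This is just a sketch, assuming nvidia-smi (which ships with the NVIDIA driver) is on the PATH; it shows overall GPU load, not RT-core usage specifically:

```python
# Sketch: sample GPU utilization while rotating the view in ParaView.
# Assumes nvidia-smi is on PATH; it reports overall GPU load and memory,
# not RT-core activity specifically.
import subprocess

QUERY = [
    "nvidia-smi",
    "--query-gpu=utilization.gpu,memory.used",
    "--format=csv,noheader,nounits",
]

def parse_utilization(csv_line):
    """Parse one 'util, mem' CSV line from nvidia-smi into two ints."""
    util, mem = (int(field.strip()) for field in csv_line.split(","))
    return util, mem

def sample_gpu():
    """Return a list of (utilization %, memory MiB) tuples, one per GPU."""
    out = subprocess.run(QUERY, capture_output=True, text=True, check=True)
    return [parse_utilization(line) for line in out.stdout.splitlines()]

if __name__ == "__main__":
    try:
        for util, mem in sample_gpu():
            print(f"GPU load: {util}%  memory used: {mem} MiB")
    except FileNotFoundError:
        print("nvidia-smi not found; is the NVIDIA driver installed?")
```

If OptiX is really rendering, the utilization figure should jump well above idle while the view is being redrawn; near-zero GPU load during rendering would suggest the CPU path is still in use.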

Also, I can still choose the OptiX backend even on my laptop, which does not have an RTX card; is this expected?


It is a bug that the dialog shows OptiX when it is not available. With our RC1 and nightly binaries, only the Linux packages actually include OptiX; in all other cases the current behavior is to silently fall back to OSPRay.

Before the RC cycle is through, OptiX should be included in the Windows binaries, and the OptiX choice should be disabled on Mac.

Tracking the issue here: https://gitlab.kitware.com/paraview/paraview/issues/19180

Thanks for the quick reply.

So does that mean that my GPU is not used and ParaView silently falls back to OSPRay? (I have a Windows workstation.)

It looks like it is not working with the Linux package either.

It works great with my own build of 5.7.0-RC1, which outputs:

VisRTX 0.1.6, using devices:
 0: GeForce RTX 2080 Ti (Total: 11.5 GB, Available: 10.9 GB)

The downloaded release, on the other hand, does not output anything and falls back to OSPRay.

(On Arch Linux with a GeForce RTX 2080 Ti)
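For anyone else who wants to run the same check: launch ParaView from a terminal and look for that VisRTX banner in its output. A minimal sketch, assuming you capture the output yourself (e.g. redirecting stdout/stderr to a log file); `has_visrtx_banner` is a hypothetical helper, not part of any ParaView API:

```python
# Sketch: scan ParaView's captured terminal output for the VisRTX
# startup banner. Capture the output yourself, e.g.:
#   paraview > pv.log 2>&1
# has_visrtx_banner is a hypothetical helper, not a ParaView API.

def has_visrtx_banner(captured_output: str) -> bool:
    """Return True if the captured stdout/stderr mentions VisRTX."""
    return any("VisRTX" in line for line in captured_output.splitlines())

# The working build prints lines like these at startup:
sample = ("VisRTX 0.1.6, using devices:\n"
          " 0: GeForce RTX 2080 Ti (Total: 11.5 GB, Available: 10.9 GB)")
print(has_visrtx_banner(sample))  # → True

# The broken package prints nothing, so its (empty) log yields False.
print(has_visrtx_banner(""))      # → False
```

No banner in the log means VisRTX never initialized and the OptiX choice in the dialog is silently using OSPRay.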
