As shown in the screen recording below, rendering of small grids (here with an edge length of 0.3e-6) is buggy. This erratic behavior does not occur for meshes larger than about 0.5e-6. If the mesh gets even smaller, the character of the rendering artifacts changes, which makes me suspect that the rendering strategy involves two planes about 0.1e-6 apart.
If true, it would be useful to have those plane spacings scaled automatically relative to the bounding box, or to the smallest distances in the geometry, instead of being hardwired.
diffusion_couple.vti (1.3 KB)
- If one zooms out of the scene far enough and then rotates the object, the clipping no longer occurs.
- Exporting a screenshot (File → Save Screenshot) does not show the same color clipping but produces a correct visualization, even when highly zoomed in.
ParaView 5.10, MacOS 12.2.1, 15 in MacBook Pro 2018 with Radeon Pro Vega 20 4 GB
Could you use the Transform filter to work around this?
I am not sure what the benefit of a transform would be here. If I apply the Transform filter but leave it “empty”, i.e. with no translation, rotation, or scaling, the transformed grid still shows the same rendering glitches.
@mwestphal, can you reproduce the issue based on the attached vtkImageData mentioned in my above post?
By the way, I can confirm that the same issue is present in 5.9.1.
With the Transform filter, you can scale the data up.
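To illustrate why scaling up helps (this is a sketch of the presumed mechanism in plain Python, not ParaView code): once coordinates at the 1e-7 scale get combined with O(1) quantities on the GPU, float32 rounding destroys their relative precision, whereas the same data scaled up to O(0.1) survives intact:

```python
import struct

def f32(x):
    """Round a Python float (double precision) to the nearest IEEE-754 float32."""
    return struct.unpack('<f', struct.pack('<f', x))[0]

# A 3e-7 feature combined with an O(1) offset (e.g. from a camera transform):
small = f32(1.0 + 3e-7) - f32(1.0)      # float32 quantizes the sum first
err_small = abs(small - 3e-7) / 3e-7    # roughly 19% relative error

# The same feature after scaling the data up by 1e6 (edge length 0.3):
scaled = f32(1.0 + 0.3) - f32(1.0)
err_scaled = abs(scaled - 0.3) / 0.3    # roughly 1e-7 relative error

print(err_small, err_scaled)
```

The tiny feature is not the problem on its own; it is the mixing of scales during rendering that loses the information, which is exactly what scaling the data up avoids.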
I see. I was not aware of that filter, since I usually do this directly under “advanced properties → Transform”.
Nevertheless, it is clear that one can work around the issue by avoiding data at this small physical scale. My concern is more fundamental: I consider this scale-dependent rendering a bug worth fixing. If one imagines a visualization combining many small-scale vtkObjects, it becomes quite tedious to rescale each and every one of them consistently just to avoid the rendering glitch…
Most definitely, fixing this would require some investigation effort though.
Cannot reproduce with the given dataset on Arch Linux, NVIDIA GeForce GTX 1650 Ti. Could you give some steps to reproduce?
You may want to try scaling the data down; the tolerances could be hardware-specific. In my case, everything works fine with the original data, but I can reproduce the issue if I scale the cube down by a factor of 0.1 in every dimension.
ParaView 5.10, Ubuntu 20.04, NVidia RTX A2000.
You are most likely running into the limitations of single-precision floating-point operations on graphics hardware at this scale. ParaView rescales data behind the scenes to avoid this in many cases, but there is a known remaining issue: when no surface normals are supplied, you can see this behavior, because the normals that are then estimated (I believe in the shader program) are not accurate. To reproduce, try this:
- Add Sphere source, Radius 1e-7. Things look great.
- Now change the Normal Array property to “None”. The lighting is wrong.
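The inaccuracy of estimated normals at this scale can be sketched in plain Python (a hedged illustration of the float32 effect, not ParaView's actual shader code): quantizing vertex coordinates to float32 visibly rotates the direction of a sub-micron edge once an O(1) offset is involved, and normals derived from such edges inherit that error:

```python
import math
import struct

def f32(x):
    """Round a Python float (double precision) to the nearest IEEE-754 float32."""
    return struct.unpack('<f', struct.pack('<f', x))[0]

# Two vertices ~2.5e-7 apart, offset from the origin by ~1 unit,
# roughly as the GPU sees them after an O(1) transform is applied.
p0 = (1.0, 1.0)
p1 = (1.0 + 2.5e-7, 1.0 + 1.0e-7)

# Edge direction in full (double) precision:
exact = math.atan2(p1[1] - p0[1], p1[0] - p0[0])

# Same edge after the coordinates are quantized to float32:
quantized = math.atan2(f32(p1[1]) - f32(p0[1]), f32(p1[0]) - f32(p0[0]))

err_deg = math.degrees(abs(quantized - exact))
print(err_deg)  # several degrees of direction error
```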
@martink I’m pretty sure there is an issue open for this… do you happen to know which one? Basic searches didn’t turn it up.
A workaround is to use the Generate Surface Normals filter to add normals to the geometry. That should take care of the rendering artifacts.
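For context on why this works: Generate Surface Normals computes normals once, on the CPU, in double precision from the cell geometry, so the renderer no longer has to estimate them per pixel at this scale. A minimal sketch of the underlying idea (Newell's method; an illustration, not ParaView's actual implementation):

```python
import math

def polygon_normal(pts):
    """Unit normal of a planar polygon via Newell's method (double precision)."""
    nx = ny = nz = 0.0
    n = len(pts)
    for i in range(n):
        x0, y0, z0 = pts[i]
        x1, y1, z1 = pts[(i + 1) % n]
        nx += (y0 - y1) * (z0 + z1)
        ny += (z0 - z1) * (x0 + x1)
        nz += (x0 - x1) * (y0 + y1)
    length = math.sqrt(nx * nx + ny * ny + nz * nz)
    return (nx / length, ny / length, nz / length)

# A sub-micron quad (edge length 3e-7) still yields an exact unit normal:
quad = [(0, 0, 0), (3e-7, 0, 0), (3e-7, 3e-7, 0), (0, 3e-7, 0)]
print(polygon_normal(quad))  # (0.0, 0.0, 1.0)
```

Because the computation happens in double precision before the data reaches the GPU, the tiny edge lengths pose no problem.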