Bad display for large dataset

Hi everyone,

Recently I have been working with this dataset:
VTU :: restart_00000000.vtu - Google Drive
HDF5 + XDMF :: restart_00000000.h5 - Google Drive + restart_00000000.xdmf - Google Drive

It’s large but not huge, I would say (only 50 million cells), so I wasn’t expecting any trouble, but it turns out that:
1/ although ParaView can display the mesh itself fine (when no field is chosen for color mapping),
2/ pieces of the data literally disappear when I choose to display a field on it.

I tried VTU, HDF5+XDMF, VTK legacy … nothing works, it’s always the same issue.
Above are links to the data in case someone wants to try it: I left only one array, “Density”, which is supposed to be constant everywhere with a value of 1.4.
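
For anyone who would rather check the data than the rendering, a quick pvpython sketch along these lines should confirm whether the array really is constant (the file name comes from the links above; I am not certain offhand whether Density is attached to cells or points, so the snippet checks both):

```python
# Sanity check: confirm the Density array itself is intact, i.e. that the
# disappearing pieces are a rendering problem rather than a data problem.
from paraview.simple import XMLUnstructuredGridReader

reader = XMLUnstructuredGridReader(FileName=['restart_00000000.vtu'])
reader.UpdatePipeline()

info = reader.GetDataInformation()
print('number of cells:', info.GetNumberOfCells())

# Density may live on cells or on points depending on how it was written.
density = (reader.CellData.GetArray('Density')
           or reader.PointData.GetArray('Density'))
print('Density range:', density.GetRange(0))  # expected: (1.4, 1.4)
```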

I tried ParaView 5.9.0_RC2 and ParaView 5.9.1 on macOS Big Sur. I am sorry I cannot provide screenshots right now, as I do not have them with me, but the problem is quite clear as soon as one tries to display the Density field … only two small pieces are still displayed, the rest disappears …

Thanks for your help!!

Thibault

Unable to reproduce here.

ParaView 5.9.1, Arch Linux.

Thank you, Mathieu.

Turns out I found a ParaView 5.8.0 on CentOS 7 and it displays fine too. The machine I used here has 64 GB of memory, and opening the *.vtu leads to about 50% memory use.

Is it therefore possible that on my Mac, the lack of memory leads to display glitches?


To add to the discussion, here is what I see when I try to visualize the mesh (it’s fine) and the density (not fine!).

Here it is using 13 GB of CPU RAM and 2 GB of GPU RAM.

Check your GPU RAM and GPU driver.

As it turns out it uses 12.43 GB of CPU RAM, but since the GPU is embedded in the CPU on the MacBook Pro, I cannot really give you the consumption on that end … :confused:
Is there a way to dump that from ParaView, maybe?
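
The closest thing I found is dumping the OpenGL renderer string from ParaView’s built-in Python shell; it at least identifies which GPU is being used, though as far as I can tell not its memory consumption (a sketch, assuming the GetOpenGLInformation helper that recent ParaView releases expose):

```python
# Sketch: ask ParaView what it knows about the OpenGL context. This prints
# the vendor/renderer strings (i.e. which GPU is driving the render view),
# but not the GPU memory use. Assumes GetOpenGLInformation is available,
# as in recent ParaView releases.
from paraview.simple import GetOpenGLInformation

info = GetOpenGLInformation()
print('Vendor:  ', info.GetVendor())
print('Version: ', info.GetVersion())
print('Renderer:', info.GetRenderer())
```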

Yeah, I think your integrated GPU is not able to handle rendering of a dataset that big.

You can try the --mesa option to avoid using your GPU at all.

OK, thanks. The executable I got from the ParaView 5.9.1 dmg does not seem to like “paraview --mesa”; it tries to open a file called “--mesa” … do I have to compile ParaView by hand?

It should be in the release; no idea how to set the option on macOS. Maybe @cory.quammen knows.

The macOS binaries do not include Mesa, so that option is not available to you.

Thanks @cory.quammen .

Sorry @tbridel , not sure there is a solution here.

Ah … OK …
I tried using --force-offscreen-rendering, but I think it also relies on the GPU somewhat; in any case, it does not work either.
So on Macs there is no way to disable GPU use?

I think a feature should be added for Macs. I mean, MacBooks rarely have dedicated GPUs, and a lot of ParaView users have those machines, so I find it strange that I’m the first one to notice this shortcoming … but okay, I’ll find a way!

The macOS integrated GPU can show data without issue; your dataset is simply too big for your GPU. You just do not have the usual fallback of using Mesa that the other OSes have.
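
One practical workaround, since your CentOS 7 machine displays the data fine: run a pvserver of the same ParaView version on that machine and connect to it from your Mac, so the heavy rendering happens remotely. The connection can be made from the GUI via File > Connect, or from pvpython; a rough sketch of the client side, where the hostname and path are placeholders:

```python
# Workaround sketch: use the Mac only as a client and let a remote
# pvserver (started beforehand on the CentOS machine, default port 11111)
# do the rendering. 'centos-host' and the file path are placeholders.
from paraview.simple import Connect, OpenDataFile, Show, Render

Connect('centos-host')                         # pvserver at centos-host:11111
reader = OpenDataFile('restart_00000000.vtu')  # path as seen by the server
Show(reader)
Render()
```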

You can open a feature request in the dedicated category:

Okay.
I can still compile ParaView with Mesa by hand, right?

Perhaps. That configuration on macOS is untested, so I’m not sure how successful it will be.