And I could not confirm it myself yet… It looks like the combined data space becomes so huge that the vector visualization filter crashes ParaView.
I don’t know what to do. I have no option other than ParaView, but it was not working with integrated Intel graphics, and it still keeps crashing with an NVIDIA 1030 board. Maybe that is because of a crappy Windows compiler? Are things better on Linux? Or is the 5.x.x branch bad?
An NVIDIA 1030 should be reasonably capable with 2 GB of RAM. Maybe you could provide some more information about your data. How many points, for example? Can you provide a data set with instructions for reproducing the crash?
ParaView is quite capable of rendering millions of points and cells on laptops these days, and there is no appreciable difference between the Visual Studio compiler on Windows and gcc on Linux in terms of how much data can be rendered.
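One thing worth checking is how much geometry the vector glyph filter is actually generating. As a rough illustration (the per-glyph numbers below are assumptions for a back-of-envelope estimate, not ParaView's actual memory model), glyphing every point of a large data set with arrows can easily exceed a 2 GB card:

```python
def glyph_memory_bytes(n_points, verts_per_glyph=30, floats_per_vertex=6):
    """Rough estimate of glyph geometry size: assume ~30 vertices per
    arrow glyph and 6 floats per vertex (position + normal), 4 bytes each.
    These are illustrative assumptions, not ParaView internals."""
    return n_points * verts_per_glyph * floats_per_vertex * 4

# Glyphing all of a 10-million-point data set:
est = glyph_memory_bytes(10_000_000)
print(f"{est / 2**30:.1f} GiB")  # ~6.7 GiB, well beyond a 2 GB board
```

If an estimate like this lands near or above your GPU memory, the usual workaround is to glyph only a subset of the points (the Glyph filter's masking/sampling options) rather than every point.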