MPI Integer Limit Error

Hi everyone,

I'm coming back to an older issue regarding the MPI error

"vtkOutputWindow.cxx:86 WARN| Generic Warning: In /home/paraview_5.8.1/ParaView-v5.8.1/VTK/Parallel/MPI/vtkMPICommunicator.cxx, line 220
This operation not yet supported for more than 2147483647 objects "

that was discussed by someone else back in 2017 (https://public.kitware.com/pipermail/paraview/2017-June/040405.html)

We are using a multiblock structured mesh with about 1.3e9 cells. We have implemented a parallel HDF5 reader for ParaView based on vtkMultiBlockDataSet, and everything runs smoothly for smaller meshes in client-server sessions. With this larger mesh, however, the behaviour matches the 2017 issue: the file can be opened and the data even loads, so the outlines are displayed, but when I switch to the 'surface' representation I get the error above. This is independent of the number of cores I use on the server. We use ParaView 5.8.1.
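For context on where the 2147483647 figure comes from: that is INT_MAX for a 32-bit signed integer, and MPI describes message counts with a C int, so a single communication of more than 2**31 - 1 elements trips the check in vtkMPICommunicator. A rough back-of-the-envelope sketch (my reading of the error, with illustrative numbers, not code from ParaView itself):

```python
# Why a 1.3e9-cell mesh can exceed the MPI count limit even though
# 1.3e9 < 2147483647: what gets communicated is element counts of
# flat arrays, not cells.
INT_MAX = 2**31 - 1          # 2147483647, the number in the warning

cells = 1_300_000_000        # mesh size reported in the thread
floats_per_point = 3         # x, y, z coordinates per point (illustrative)

# A structured mesh has roughly as many points as cells, so a flat
# coordinate array already holds ~3.9e9 values, well past INT_MAX:
coordinate_values = cells * floats_per_point
print(coordinate_values > INT_MAX)   # True: too large for one MPI call
```

This is why the outline (a handful of values per block) renders fine while the surface representation, which has to move the actual geometry, fails.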

Has anyone experienced the same and might have a remedy?

Best wishes
Marian

Please try with the latest release of ParaView.

That fixed it, thank you!

I am getting a similar issue when trying to read a VTM file written by pvbatch (on 1008 cores) into ParaView 5.9.1. It reads fine into a serial instance of ParaView, but when AutoMPI is turned on it hangs (on Windows the pvservers crash; on Linux it just keeps outputting error messages). The data set is 2.36GB across 3,018 VTU objects.

( 74.052s) [paraview ] vtkOutputWindow.cxx:86 WARN| Generic Warning: In /builds/gitlab-kitware-sciviz-ci/build/superbuild/paraview/src/VTK/Parallel/MPI/vtkMPICommunicator.cxx, line 206
This operation not yet supported for more than 2147483647 objects
( 74.066s) [paraview ] vtkOutputWindow.cxx:76 ERR| ERROR: In /builds/gitlab-kitware-sciviz-ci/build/superbuild/paraview/src/VTK/IO/Legacy/vtkDataReader.cxx, line 544
vtkGenericDataObjectReader (0x172c32a0): Unrecognized file type: for file: (Null FileName)

Please try with the latest release of ParaView, 5.10-RC1.

The latest release doesn't even have AutoMPI enabled (and it may be deprecated?). Also, in 5.9.1 and 5.10, when you try to set up a localhost client-server connection, you can't enter a command into the server setup dialog without ParaView instantly crashing, on both Windows and Linux. I had to write an external batch/bash script to start a server and automatically connect to it. The files then read in successfully.

The discussion about removal happened here:

Also in 5.9.1 and 5.10 when you try and set up a localhost client server you can’t enter a command into the server setup dialog without ParaView instantly crashing…on both Windows and Linux.

This issue has been fixed already in master and will not be present in ParaView 5.10

I had to write an external batch/bash script to start a server and automatically connect to it. The files then read in successfully.

You can write a .pvsc server configuration file instead.
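For anyone unfamiliar with that file, here is a minimal sketch of a server configuration that starts pvserver under MPI and connects to it; the executable paths, rank count, port, and server name are placeholders to adapt to your setup:

```xml
<!-- servers.pvsc: a minimal ParaView server configuration (sketch).
     Names, paths, and port below are placeholders, not values from this thread. -->
<Servers>
  <Server name="localhost-mpi" resource="cs://localhost:11111">
    <CommandStartup>
      <!-- Launch 4 MPI ranks of pvserver; the client then connects
           to the resource URL above. -->
      <Command exec="mpiexec" timeout="5" delay="0">
        <Arguments>
          <Argument value="-np"/>
          <Argument value="4"/>
          <Argument value="pvserver"/>
          <Argument value="--server-port=11111"/>
        </Arguments>
      </Command>
    </CommandStartup>
  </Server>
</Servers>
```

Once imported via File → Connect → Load Servers, selecting the entry starts the server and connects in one step, replacing the external script.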

Good news about the server setup dialog being fixed, thank you. As for MPI not being required, I don't think that is true: creating contours, slices, and stream tracers all speed up with the benefit of MPI. None of those calculations are threaded, as far as I can tell. The only obviously threaded process is rendering, using OSPRay.

Angus,
MPI is not required, in the sense that the filters you mentioned still work without it. If you want the extra performance, you can always build and connect to a remote server and enjoy the improvements there. With regard to the built-in server (i.e., the one that just works out of the box), the solution is threading. This is on our plate and will be done as time and resources allow.
Alan

Also, MPI has not been removed; AutoMPI has been removed, which is not the same thing.

The only obviously threaded process is rendering, using OSPRay.

This is definitely not true.