Python plugin failing to load in script using paraview.simple.LoadState(), but works fine in GUI

I have created a Python plugin using vtkmodules.util.vtkAlgorithm.VTKPythonAlgorithmBase. The plugin works great in the GUI, but I am trying to use paraview.simple.LoadState() in a script so that I can convert my .pvsm state file into a .py state file. This operation fails because one of the components of my plugin calls vtk.vtkMultiProcessController.GetGlobalController().GetCommunicator(). In the GUI, that call returns a vtkMPICommunicator; through LoadState(), it appears to return a vtkDummyCommunicator and the plugin fails to load.

The relevant parts of the plugin that I am trying to use:

 29 import vtk
 30 _GC = vtk.vtkMultiProcessController.GetGlobalController()
 31 _RANK = _GC.GetLocalProcessId()
 32 _COMM = vtk.vtkMPI4PyCommunicator.ConvertToPython(_GC.GetCommunicator())
 33 print(type(_GC.GetCommunicator()))

Using the print statement on Line 33, I can tell that when it loads in the Paraview GUI, the datatype is vtkMPICommunicator.

Next I use that plugin filter in Paraview and save a .pvsm state file. After the .pvsm is saved, in my workflow I use another script to convert that .pvsm state file to .py. That script is reproduced below:

  import sys
  from paraview import simple as pvs
  from paraview import smstate

  pvsm_file = sys.argv[1]

  # Load pvsm state file
  pvs.LoadState(pvsm_file)
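For context, a sketch of what the rest of the conversion could look like, assuming smstate.get_state() is used to serialize the loaded session; the output-path convention and function names here are my assumptions, not the script from this thread. The ParaView imports are kept inside the function so the path helper can be exercised on its own:

```python
import os

def pvsm_to_py_path(pvsm_file):
    """Derive the .py output path from the .pvsm input.
    (An assumed naming convention; the original script does not show this part.)"""
    base, _ = os.path.splitext(pvsm_file)
    return base + ".py"

def convert_state(pvsm_file):
    # ParaView imports are local so this sketch can be read and tested
    # without a ParaView build on sys.path.
    from paraview import simple as pvs
    from paraview import smstate

    # Load the .pvsm state, then write the equivalent Python trace.
    pvs.LoadState(pvsm_file)
    with open(pvsm_to_py_path(pvsm_file), "w") as out:
        out.write(smstate.get_state())
```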

When I run that script, it crashes with the error trace in Python being:

  Failed to call `paraview.detail.pythonalgorithm.load_plugin`.
  Traceback (most recent call last):
    File "/opt/Software/paraview/5.9.0_server/lib64/python3.7/site-packages/paraview/detail/", line 509, in load_plugin
      spec.loader.exec_module(module)
    File "<frozen importlib._bootstrap_external>", line 728, in exec_module
    File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
    File "path_to_plugin/", line 30, in <module>
      _COMM = vtk.vtkMPI4PyCommunicator.ConvertToPython(_GC.GetCommunicator())
  TypeError: ConvertToPython argument 1: method requires a vtkMPICommunicator, a vtkDummyCommunicator was provided.

From that error trace, I am led to believe that something in LoadState() is causing _GC.GetCommunicator() to return a vtkDummyCommunicator instead of the vtkMPICommunicator it returns in the ParaView GUI.
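One way to make the failure less abrupt is to guard the module-level ConvertToPython() call so the plugin still imports when only a vtkDummyCommunicator is available. This is a minimal sketch, not the plugin's actual code: the helper name is mine, and the converter is passed in as an argument so the guard itself has no hard VTK dependency.

```python
def mpi4py_comm_or_none(controller, convert_to_python):
    """Return an mpi4py communicator, or None when no real MPI is available.

    `controller` is expected to behave like vtkMultiProcessController and
    `convert_to_python` like vtk.vtkMPI4PyCommunicator.ConvertToPython
    (passed in so the guard can be exercised without VTK installed).
    Instead of raising the TypeError shown in the traceback, this returns
    None when the controller only provides a vtkDummyCommunicator.
    """
    comm = controller.GetCommunicator()
    if comm is not None and comm.IsA("vtkMPICommunicator"):
        return convert_to_python(comm)
    return None
```

In the plugin this would be called as `_COMM = mpi4py_comm_or_none(_GC, vtk.vtkMPI4PyCommunicator.ConvertToPython)`, with later code checking `_COMM is not None` before using it.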

Is there a solution to this? Is it a problem with paraview.simple, or is it a problem with my implementation?

Can you provide simple steps to reproduce this with ParaView?

Here is a step-by-step using simple inputs that I’ve confirmed reproduces what I’m seeing. I did not provide a source dataset because I believe the problem occurs regardless of which source is used. My datasets are not shareable.

  1. Create a Python-based plugin that calls vtk.vtkMultiProcessController.GetGlobalController().GetCommunicator() within a vtk.vtkMPI4PyCommunicator.ConvertToPython() call. I have attached such a plugin. (1.8 KB) [EDIT: fixed a typo in plugin]
  2. Load that plugin into ParaView and make sure it is auto-loaded so that other scripts based on the ParaView Python APIs will recognize it. (I am less confident about how this is done in general; within my company, this is handled by someone else.)
  3. Using some source (as stated above, I believe the problem appears with any source), apply the filter from the plugin in Step 1. Save this as a state file in .pvsm format.
  4. Create a script that applies paraview.simple.LoadState() to the .pvsm file from Step 3. I have attached such a script. (146 Bytes)
  5. Run the script.
  6. If the same problem occurs that I am seeing, the script will crash because vtk.vtkMultiProcessController.GetGlobalController().GetCommunicator() returns a vtkDummyCommunicator when it should return a vtkMPICommunicator.

If that process doesn’t reproduce the same problem, then it could be particular to how I am using Paraview, in which case I would appreciate any help determining what is causing it.

FYI @utkarsh.ayachit (I did not test)

Are you using pvpython to execute your script? Use pvbatch instead. pvpython does not initialize MPI by default, since it’s not intended to be used in distributed mode. Use pvbatch and you’ll always have an MPI controller available in an MPI-enabled build, irrespective of whether you’re actually running on more than one rank.
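Concretely, the difference in launch commands might look like this (the script name and rank count are placeholders):

```shell
# pvpython: MPI is not initialized by default, so
# GetGlobalController().GetCommunicator() yields a vtkDummyCommunicator.
pvpython convert_state.py my_state.pvsm

# pvbatch: MPI is initialized in an MPI-enabled build, even on one rank.
pvbatch convert_state.py my_state.pvsm

# pvbatch under mpiexec for a multi-rank run:
mpiexec -np 4 pvbatch convert_state.py my_state.pvsm
```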

That worked! I wasn’t aware the same script could be run with either pvpython or pvbatch, so that’s a good lesson to know. I was just executing it on the command line as python <script_name>, and that must behave like pvpython.

Thanks for your help!