Calling a pvbatch script with python

Hi all,
I cannot get the following simple script to work when run with python (it works fine with pvbatch). I am using ParaView master. Any ideas?
Thanks!

import sys

# needed when called from python; ensures MPI is initialized before ParaView
from mpi4py import MPI

import paraview
paraview.options.batch = True
paraview.options.symmetric = True

from paraview.simple import *

sphere = Sphere()
rep = Show()
ColorBy(rep, ("POINTS", "vtkProcessId"))
Render()
rep.RescaleTransferFunctionToDataRange(True)
Render()
WriteImage("parasphere.png")

# needed when called from python; uncomment so MPI is cleaned up properly
# servermanager.Finalize()

It works fine if I call it with pvbatch on 2 nodes, even though it prints a warning (I see the sphere partitioned into two halves, each colored differently):

[~/projects/ParaView/build]$ PYTHONPATH=~/projects/ParaView/build/lib/python2.7/site-packages/ mpiexec -np 2 bin/pvbatch test.py
Warning: In /home/danlipsa/projects/ParaView/src/ParaViewCore/ServerManager/Rendering/vtkSMPVRepresentationProxy.cxx, line 277
vtkSMPVRepresentationProxy (0x55c20cfa9b00): Could not determine array range.

I see only one color (and two warnings) if I call it with python on 2 nodes. Note that I need to uncomment the last line, otherwise MPI is not cleaned up properly, and I also need to import mpi4py, otherwise MPI is not initialized properly.

[~/projects/ParaView/build]$ PYTHONPATH=~/projects/ParaView/build/lib/python2.7/site-packages/ mpiexec -np 2 python test.py
Warning: In /home/danlipsa/projects/ParaView/src/ParaViewCore/ServerManager/Rendering/vtkSMPVRepresentationProxy.cxx, line 277
vtkSMPVRepresentationProxy (0x5563d395afd0): Could not determine array range.

Warning: In /home/danlipsa/projects/ParaView/src/ParaViewCore/ServerManager/Rendering/vtkSMPVRepresentationProxy.cxx, line 277
vtkSMPVRepresentationProxy (0x55cd078ecd90): Could not determine array range.

vtkProcessId is an array that gets added by the representation when it updates, and hence it may not be available at the point where you’re trying to color by it. Either manually set the LUT range, or call Render or view.Update before the ColorBy call so the representation updates and generates the vtkProcessId array. Even cleaner, just plop the Process Id Scalars filter on the sphere instead and color by the ProcessId array it generates. This should fix the warning. Let’s start there.
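For reference, a minimal sketch of the Render-before-ColorBy variant (the ProcessIdScalars-based fix appears in the next script below):

from paraview.simple import *

sphere = Sphere()
rep = Show()
# Updating the representation via Render generates the vtkProcessId array,
# so ColorBy can find it afterwards.
Render()
ColorBy(rep, ("POINTS", "vtkProcessId"))
rep.RescaleTransferFunctionToDataRange(True)
Render()
WriteImage("parasphere.png")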

As a side note, I’d also like to highlight this PEP, which proposes a design to unify the Python initialization logic and avoid the pitfalls you ran into.

Thanks Utkarsh. The warning went away, but I get the same behavior: two colors for pvbatch and one color for python. Printing of the rank works fine in both cases.

This is the script I have:

import sys

# needed when called from python
from mpi4py import MPI

import paraview
paraview.options.batch = True
paraview.options.symmetric = True

from paraview.simple import *

print("rank {}".format(MPI.COMM_WORLD.Get_rank()))

Sphere()
ProcessIdScalars()
rep = Show()
ColorBy(rep, ("POINTS", "ProcessId"))
Render()
WriteImage("parasphere.png")

# needed when called from python
# servermanager.Finalize()

The following script, run through python, produces wrong.png, while run through pvbatch it produces correct.png (on 2 nodes):

import sys

# needed when called from python
from mpi4py import MPI

import paraview
paraview.options.batch = True
paraview.options.symmetric = True

from paraview.simple import *

print("rank {}".format(MPI.COMM_WORLD.Get_rank()))

Sphere()
ProcessIdScalars()
elevation1 = Elevation()
elevation1.LowPoint = [0.0, -0.5, 0.0]
elevation1.HighPoint = [0.0, 0.5, 0.0]
rep = Show()
ColorBy(rep, ("POINTS", "Elevation"))
Render()
ResetCamera()
rep.RescaleTransferFunctionToDataRange(True)
Render()
WriteImage("parasphere.png")

# needed when called from python
servermanager.Finalize()
wrong.png: produced with python
correct.png: produced with pvbatch

@danlipsa, filenames don’t show up in Discourse, so please add heading text before or next to the images to indicate which is which. Assuming the top is wrong and the bottom is correct: you just need to manually reset the LUT range in your script. Ranges are not computed in parallel in symmetric mode by design; this avoids unintentional MPI sync between ranks, but it does mean your script has to take on the extra burden of syncing ranges if needed.
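A minimal sketch of what syncing the range by hand could look like, assuming each rank’s proxy data information reflects only its local data in symmetric mode; the mpi4py reduction step is my illustration, not something the thread prescribes:

from mpi4py import MPI
from paraview.simple import *

# ... pipeline set up as above, with an 'Elevation' point array present ...

# Assumption: in symmetric batch mode each rank sees only its local array
# range, so reduce the per-rank ranges to a global one explicitly.
localMin, localMax = GetActiveSource().PointData['Elevation'].GetRange()
globalMin = MPI.COMM_WORLD.allreduce(localMin, op=MPI.MIN)
globalMax = MPI.COMM_WORLD.allreduce(localMax, op=MPI.MAX)

lut = GetColorTransferFunction('Elevation')
lut.RescaleTransferFunction(globalMin, globalMax)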

Thanks Utkarsh! I clarified the images in discourse.

Should there be a difference between pvbatch and python? pvbatch produces the correct result and python does not, with the same script. (A small difference is the MPI init and cleanup needed for python.)

Yes. You’re running pvbatch and not pvbatch -sym or pvbatch --symmetric. You’ll see the same thing if you run pvbatch in symmetric mode.
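For example, this invocation should reproduce the single-color result you get with python (same build layout as above):

[~/projects/ParaView/build]$ PYTHONPATH=~/projects/ParaView/build/lib/python2.7/site-packages/ mpiexec -np 2 bin/pvbatch --symmetric test.py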

I see. Great. Thanks Utkarsh!

Dan

Hi Utkarsh,
Here is another difference between the behavior of a script called through pvbatch and python:

The script is the same as before, with the range set properly and a CreateWriter at the end to write the data.

import sys

# needed when called from python
from mpi4py import MPI
print("rank {}".format(MPI.COMM_WORLD.Get_rank()))

import paraview
paraview.options.batch = True
paraview.options.symmetric = True
from paraview.simple import *

Sphere()
ProcessIdScalars()
elevation1 = Elevation()
elevation1.LowPoint = [0.0, -0.5, 0.0]
elevation1.HighPoint = [0.0, 0.5, 0.0]
rep = Show()
ColorBy(rep, ("POINTS", "Elevation"))
elevationLUT = GetColorTransferFunction('Elevation')
elevationLUT.RescaleTransferFunction(0, 1)
ResetCamera()
Render()
WriteImage("parasphere.png")
writer = CreateWriter("test.pvd", elevation1)
writer.UpdatePipeline()


# needed when called from python
servermanager.Finalize()

When I call this from pvbatch it works correctly. When I call it from python I get:

ERROR: In /home/danlipsa/projects/ParaView/src/VTK/Parallel/Core/vtkMultiProcessController.cxx, line 207
vtkMPIController (0x5623638d6110): Communicator not set.

ERROR: In /home/danlipsa/projects/ParaView/src/VTK/Parallel/Core/vtkMultiProcessController.cxx, line 207
vtkMPIController (0x5556f47e7d30): Communicator not set.

ERROR: In /home/danlipsa/projects/ParaView/src/VTK/Parallel/Core/vtkMultiProcessController.cxx, line 207
vtkMPIController (0x5623638d6110): Communicator not set.

ERROR: In /home/danlipsa/projects/ParaView/src/VTK/Parallel/Core/vtkMultiProcessController.cxx, line 207
vtkMPIController (0x5556f47e7d30): Communicator not set.

Do you have any suggestions on how to initialize the communicator so that this code works from python as well? Thanks!

The error goes away if I skip servermanager.Finalize() when using the writer, even when I call this example from python.
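A sketch of that workaround as it would look at the end of the script; wrote_data is a hypothetical flag of my own, just to make the condition explicit:

# Workaround sketch: skip Finalize when a writer was used, since calling it
# after the writer triggers the "Communicator not set" errors shown above.
wrote_data = True  # set wherever CreateWriter/UpdatePipeline ran
if not wrote_data:
    servermanager.Finalize()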