calling pvbatch script with python


(Dan Lipsa) #1

Hi all,
I cannot get the following simple script to work when run with python (it works fine with pvbatch). I am using ParaView master. Any ideas?
Thanks!

import sys
from mpi4py import MPI

import paraview
paraview.options.batch = True
paraview.options.symmetric = True

from paraview.simple import *

sphere = Sphere()
rep = Show()
ColorBy(rep, ("POINTS", "vtkProcessId"))
Render()
rep.RescaleTransferFunctionToDataRange(True)
Render()
WriteImage("parasphere.png")

# servermanager.Finalize()

It works fine if I call it with pvbatch on 2 nodes, even though it prints a warning (I see the sphere partitioned in two halves, each colored differently):

[~/projects/ParaView/build]$ PYTHONPATH=~/projects/ParaView/build/lib/python2.7/site-packages/ mpiexec -np 2 bin/pvbatch test.py
Warning: In /home/danlipsa/projects/ParaView/src/ParaViewCore/ServerManager/Rendering/vtkSMPVRepresentationProxy.cxx, line 277
vtkSMPVRepresentationProxy (0x55c20cfa9b00): Could not determine array range.

I see only one color (and two warnings) if I call it with python on 2 nodes. Note that I need to uncomment the last line, otherwise MPI is not cleaned up properly; I also need to import mpi4py, otherwise MPI is not initialized properly.

[~/projects/ParaView/build]$ PYTHONPATH=~/projects/ParaView/build/lib/python2.7/site-packages/ mpiexec -np 2 python test.py
Warning: In /home/danlipsa/projects/ParaView/src/ParaViewCore/ServerManager/Rendering/vtkSMPVRepresentationProxy.cxx, line 277
vtkSMPVRepresentationProxy (0x5563d395afd0): Could not determine array range.

Warning: In /home/danlipsa/projects/ParaView/src/ParaViewCore/ServerManager/Rendering/vtkSMPVRepresentationProxy.cxx, line 277
vtkSMPVRepresentationProxy (0x55cd078ecd90): Could not determine array range.


(Utkarsh Ayachit) #2

vtkProcessId is an array that gets added by the representation when it updates, so it may not be available yet at the point where you’re trying to color by it. Either manually set the lut range, or call Render or view.Update before the ColorBy call so the representation updates and generates the vtkProcessId array. Even cleaner, just plop the Process Id Scalars filter on the sphere instead and color by the ProcessId array it generates. This should fix the warning. Let’s start there.
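For reference, a minimal sketch of the render-first alternative (untested here; it assumes the same paraview.simple setup as the original script):

```python
from paraview.simple import *

sphere = Sphere()
rep = Show()
Render()  # the first render updates the representation, which is what
          # adds the vtkProcessId point array
ColorBy(rep, ("POINTS", "vtkProcessId"))  # the array exists now, so no warning
rep.RescaleTransferFunctionToDataRange(True)
Render()
WriteImage("parasphere.png")
```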


(Utkarsh Ayachit) #3

As a side note, I’d also like to highlight this PEP, which proposes a design to unify Python initialization logic and avoid pitfalls like the ones you ran into.


(Dan Lipsa) #4

Thanks Utkarsh. The warning went away, but I get the same behavior: two colors for pvbatch and one color for python. Printing the rank works fine in both cases.

This is the script I have:

import sys

# needed when called from python
from mpi4py import MPI

import paraview
paraview.options.batch = True
paraview.options.symmetric = True

from paraview.simple import *

print("rank {}".format(MPI.COMM_WORLD.Get_rank()))

Sphere()
ProcessIdScalars()
rep = Show()
ColorBy(rep, ("POINTS", "ProcessId"))
Render()
WriteImage("parasphere.png")

# needed when called from python
# servermanager.Finalize()

(Dan Lipsa) #5

The following script run through python produces wrong.png while run through pvbatch produces correct.png (with 2 nodes)

import sys

# needed when called from python
from mpi4py import MPI

import paraview
paraview.options.batch = True
paraview.options.symmetric = True

from paraview.simple import *

print("rank {}".format(MPI.COMM_WORLD.Get_rank()))

Sphere()
ProcessIdScalars()
elevation1 = Elevation()
elevation1.LowPoint = [0.0, -0.5, 0.0]
elevation1.HighPoint = [0.0, 0.5, 0.0]
rep = Show()
ColorBy(rep, ("POINTS", "Elevation"))
Render()
ResetCamera()
rep.RescaleTransferFunctionToDataRange(True)
Render()
WriteImage("parasphere.png")

# needed when called from python
servermanager.Finalize()

wrong.png: produced with python
correct.png: produced with pvbatch

(Utkarsh Ayachit) #6

@danlipsa, filenames don’t show up in Discourse, so please use heading text before or next to the images to indicate which is which. Assuming the top is wrong and the bottom is correct, you just need to manually reset the range for the LUT in your script. Ranges are not computed in parallel when in symmetric mode, by design: this avoids unintentional MPI synchronization between ranks. It does mean that your script has to take on the extra burden of syncing ranges if needed.
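The manual sync described above could be sketched as a small helper that reduces each rank’s local array range to the global one before rescaling the LUT. The paraview.simple wiring in the trailing comments is hypothetical (how I would expect it to plug into the script above); the helper itself is plain mpi4py-style reduction:

```python
def global_range(local_min, local_max, comm=None):
    """Reduce this rank's (min, max) for an array to the global range.

    comm is an mpi4py communicator; None means serial, in which case the
    local range already is the global range.
    """
    if comm is None:
        return local_min, local_max
    from mpi4py import MPI  # imported lazily so the helper also runs serially
    return (comm.allreduce(local_min, op=MPI.MIN),
            comm.allreduce(local_max, op=MPI.MAX))

# Hypothetical wiring in the symmetric-mode script above:
#   lo, hi = elevation1.PointData["Elevation"].GetRange()   # local range
#   lut = GetColorTransferFunction("Elevation")
#   lut.RescaleTransferFunction(*global_range(lo, hi, MPI.COMM_WORLD))
```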


(Dan Lipsa) #7

Thanks Utkarsh! I clarified the images in discourse.

Should there be a difference between pvbatch and python? pvbatch produces the correct result and python does not, with the same script. (A small difference is the MPI init and cleanup needed for python.)


(Utkarsh Ayachit) #8

Yes. You’re running pvbatch, not pvbatch -sym or pvbatch --symmetric. You’ll see the same thing if you run pvbatch in symmetric mode.
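In other words, a sketch of the three invocations (assuming the same test.py and build layout as earlier in the thread):

```shell
# Non-symmetric batch: ParaView gathers ranges across ranks -> two colors
mpiexec -np 2 bin/pvbatch test.py

# Symmetric batch: no automatic range sync -> same one-color result as python
mpiexec -np 2 bin/pvbatch --symmetric test.py

# Plain python with paraview.options.symmetric = True behaves like symmetric pvbatch
mpiexec -np 2 python test.py
```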


(Dan Lipsa) #9

I see. Great. Thanks Utkarsh!

Dan