A problem with parallel processing in ParaView

I have set up two machines as a remote server, and it seemed to work well when I added something from the Sources menu.

But when I load data from somewhere else, the parallel processing doesn't work.

The bunny is rendered in only a single color, so it looks like only one CPU is doing the rendering work.
Do you have any idea about this problem?

Certain readers, like the OBJ reader, are not parallel-aware. In such cases, all data is generally read on the root node. You can then apply filters such as Redistribute DataSet (5.8 and newer) or D3 (for earlier versions) to redistribute the dataset among all ranks.
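For example, in a pvpython script this might look like the following (a sketch; the file path is a placeholder, and the filter availability per version is as noted above):

```python
# Sketch: redistribute data from a non-parallel reader across all MPI ranks.
# Run under pvpython/pvbatch with an MPI-enabled ParaView server.
from paraview.simple import *

# A non-parallel reader (e.g. OBJ) loads everything on rank 0
reader = OpenDataFile('bunny.obj')  # placeholder path

# Newer ParaView versions: Redistribute DataSet filter
redistributed = RedistributeDataSet(Input=reader)

# Earlier versions: use D3 instead
# redistributed = D3(Input=reader)

Show(redistributed)
Render()
```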

Thank you! My problem is partly solved. I can apply D3 to redistribute the dataset. But when I use Redistribute DataSet (my ParaView version is 5.8.0), I get a "connection refused" error, and I have no idea what causes it.

I am not sure what that is. It looks like it's coming from OpenMPI code. Is it causing any issues? Maybe try updating to a newer version of OpenMPI, if possible. You can simply continue to use D3 in 5.8.
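As a side note, one way to confirm that D3 actually spread the data across ranks is to color by process id (a sketch; the file path is a placeholder, and the Process Id Scalars filter attaches each point's owning rank as a scalar array):

```python
# Sketch: verify data distribution by coloring each rank's piece differently.
# Run under pvpython with an MPI-enabled ParaView server.
from paraview.simple import *

reader = OpenDataFile('bunny.obj')   # placeholder path; loads on rank 0
redistributed = D3(Input=reader)     # redistribute across ranks

# Attach each point's owning MPI rank as a 'ProcessId' scalar array
pids = ProcessIdScalars(Input=redistributed)

display = Show(pids)
ColorBy(display, ('POINTS', 'ProcessId'))
Render()  # seeing several colors means the data really is distributed
```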

Is the CONVERGECFD reader not parallel-aware?

I had a state file (*.py) prepared using non-MPI ParaView. When I tried to run it in an MPI ParaView build, it crashed. Is this the reason?

From the docs, it seems it does not support loading data in parallel.

cc: @cory.quammen

Indeed, the CONVERGECFD reader does not read in parallel.

Thanks @utkarsh.ayachit, @cory.quammen.

In my state file (prepared for the non-MPI version), I have added the line below at the beginning of the pipeline:

redistributeDataSet1 = RedistributeDataSet(registrationName='RedistributeDataSet1', Input=venturi_data)

My expectation was that I would be able to load this state file, but it looks like it just hangs, without any error message.

Is there anything else I need to add ?

You shouldn’t need to add anything. If you can provide more details about your workflow that is leading to the hang, that would be helpful.
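For reference, the redistribution step would normally sit directly after the reader, before any heavy filters. A minimal sketch (the file name is a placeholder, and `venturi_data` is assumed to be the reader proxy as in your state file):

```python
# Sketch: minimal MPI-friendly pipeline around a non-parallel reader.
# Run under pvpython with an MPI-enabled ParaView server (5.9+ for this filter).
from paraview.simple import *

# Reader output lands entirely on rank 0 (CONVERGECFD is not parallel-aware)
venturi_data = OpenDataFile('venturi_case.h5')  # placeholder file name

# Redistribute immediately so all downstream filters run in parallel
redistributeDataSet1 = RedistributeDataSet(
    registrationName='RedistributeDataSet1', Input=venturi_data)

# ...the rest of the pipeline should take redistributeDataSet1 as Input...
Show(redistributeDataSet1)
Render()
```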

thanks @cory.quammen.

The attachment contains two state files. MPI_Test_01.py was my older state file, created in a non-MPI version of ParaView. To use it with the MPI version, I modified MPI_Test_01.py into MPI_Test_02.py. You will see that in line 69 I have added the RedistributeDataSet filter.

When I use MPI_Test_02.py, the ParaView GUI crashes without giving any error message.

MPI_Test_01.py (25.7 KB) MPI_Test_02.py (25.8 KB)