Line probe fails in parallel for Nek5000 data

Hello,

I’m using ParaView 5.7.0. When I apply Plot Over Line to Nek5000 data with ParaView running in serial, it works fine, but when I run in parallel, the line probe fails: it doesn’t show the Nek5000 variables, only arrays like Points_X, Points_Y, etc. The same thing happened with versions 5.4 to 5.6. Is this a bug? Is there a workaround?

Thanks,
Juan Diego

The Nek5000 reader is not behaving correctly in parallel.
This is a bug, but the workaround is easy; there are actually two:

Easy, but may take some time to compute with big datasets (a pvpython sketch of this pipeline follows the list):

  • Open your .nek5000 file with a parallel ParaView
  • Add a D3 filter, Apply (this redistributes the data)
  • Plot Over Line works perfectly
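
For reference, here is a minimal pvpython sketch of this first pipeline, assuming pvserver is running in parallel. The file name and line endpoints are placeholders, and the proxy names follow the ParaView 5.7 Python API:

  from paraview.simple import *

  reader = OpenDataFile('case.nek5000')  # placeholder file name; the Nek5000 reader is picked automatically
  d3 = D3(Input=reader)                  # redistributes the data across the MPI ranks
  line = PlotOverLine(Input=d3)          # the line probe
  line.Source.Point1 = [0.0, 0.0, 0.0]   # placeholder endpoints; in 5.7 the line is the "Source" sub-proxy
  line.Source.Point2 = [1.0, 0.0, 0.0]
  Show(line)
  Render()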

More complex, but more efficient (see the sketch after this list):

  • Open your .nek5000 file with a parallel ParaView
  • Add a MergeBlocks filter, Apply
  • Add an AppendReduce filter, set it to All Data to All Processors, Apply
  • Plot Over Line works perfectly
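
And a rough pvpython sketch of this second pipeline; the AppendReduce proxy name and its all-data-to-all-processors option are taken from the GUI names above, so treat the exact Python names as assumptions:

  from paraview.simple import *

  reader = OpenDataFile('case.nek5000')   # placeholder file name
  merged = MergeBlocks(Input=reader)      # collapse the multi-block output into a single dataset
  reduced = AppendReduce(Input=merged)    # assumed proxy name for the AppendReduce filter;
                                          # configure it to send all data to all processors
  line = PlotOverLine(Input=reduced)
  line.Source.Point1 = [0.0, 0.0, 0.0]    # placeholder endpoints
  line.Source.Point2 = [1.0, 0.0, 0.0]
  Show(line)
  Render()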

Hi Mathieu,

Thank you for your reply. The first option worked well for me. The second option caused ParaView to crash. I was running in client-server mode, and these are the last few lines from the console where I was running pvserver:

( 468.216s) [pvserver.0      ]vtkPVDataInformation.cx:1446   ERR| vtkPVDataInformation (0x90eae70): Error parsing class name of data.
( 468.216s) [pvserver.0      ]vtkPVDataInformation.cx:1446   ERR| vtkPVDataInformation (0x90eae70): Error parsing class name of data.
( 468.316s) [pvserver.0      ] vtkMPICommunicator.cxx:71    WARN| MPI had an error
------------------------------------------------
Message truncated, error stack:
PMPI_Gatherv(416)...................: MPI_Gatherv failed(sbuf=0xae09040, scount=757, MPI_UNSIGNED_CHAR, rbuf=0xa721180, rcnts=0xa87bc30, displs=0xa87bed0, datatype=MPI_UNSIGNED_CHAR, root=0, comm=MPI_COMM_WORLD) failed
MPIR_Gatherv_impl(184)..............:
MPIR_Gatherv_intra_auto(93).........:
MPIR_Gatherv_allcomm_linear(99).....:
MPIC_Irecv(596).....................:
MPIDI_CH3U_Request_unpack_uebuf(516): Message truncated; 926 bytes received but buffer size is 757
------------------------------------------------
application called MPI_Abort(MPI_COMM_WORLD, 808023310) - process 0

Thank you,
Juan Diego

I can’t reproduce your issue. Can you share your dataset?