Catalyst 2 API C++: Documentation and architecture

Hello, I am new to in situ visualization. I would like to implement it in my C++ simulation software.
I have looked into the documentation: Catalyst — Catalyst documentation.
Is there a place with more detailed documentation?

Also, where can I find a process flow or system architecture diagram showing how Catalyst integrates with simulation code and ParaView?

You can find some examples in the ParaView source code. They are a good starting point.

The examples do not say anything about how to actually run them.
Do I have to link libraries from my ParaView build, or just Catalyst2?

Also, how do I connect with ParaView? Will the default port 22222 work?

Building

  • No need for ParaView at build time
  • MPI is required
  • An installed version of Catalyst2 is required
  • Use CMake to configure the project and find the packages

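As a sketch, the bullets above typically translate into a CMakeLists.txt along these lines. This is based on the structure of the ParaView CxxFullExample; the source file names are taken from the example discussed later in this thread, and it assumes Catalyst2's CMake package exports a catalyst::catalyst imported target, as the examples do:

```cmake
cmake_minimum_required(VERSION 3.13)
project(CxxFullExample CXX)

# Find an installed Catalyst2 -- ParaView itself is not needed at build time.
find_package(catalyst REQUIRED)
find_package(MPI REQUIRED COMPONENTS CXX)

add_executable(CxxFullExample FEDriver.cxx FEDataStructure.cxx)
target_link_libraries(CxxFullExample
  PRIVATE
    catalyst::catalyst
    MPI::MPI_CXX)
```

Note that only the generic Catalyst2 stub library is linked here; the ParaView-specific implementation is located at run time (see Running below).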
Running

Run the generated executable with a Catalyst pipeline script as a parameter (the script can be generated from ParaView). The command is also registered as a test (you can enable it in CMake and then use ctest).

At runtime, the executable must find the correct Catalyst implementation library to do the actual work. That is where ParaView comes in: its distribution provides this library.

export CATALYST_IMPLEMENTATION_PATHS="/path/to/ParaView/lib/catalyst" 
export CATALYST_IMPLEMENTATION_NAME=paraview
./bin/CxxFullExample catalyst_pipeline.py

Be careful: the MPI used to build your simulation must be the same MPI that ParaView was built with.
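For reference, on the simulation side the Catalyst 2 API boils down to three calls, each taking a Conduit node. Below is a heavily simplified sketch following the node paths used in the ParaView Catalyst examples; the mesh description is omitted, and the loop bounds and field names are illustrative, so check the details against the CxxFullExample:

```cpp
// Sketch of the Catalyst 2 entry points; requires an installed Catalyst2.
#include <catalyst.h>
#include <catalyst_conduit.hpp>

#include <string>

int main(int argc, char* argv[])
{
  // -- initialize: register the pipeline script(s) passed on the command line
  conduit_cpp::Node initialize;
  for (int i = 1; i < argc; ++i)
  {
    initialize["catalyst/scripts/script" + std::to_string(i - 1)].set_string(argv[i]);
  }
  catalyst_initialize(conduit_cpp::c_node(&initialize));

  // -- main simulation loop (10 illustrative timesteps)
  for (int cycle = 0; cycle < 10; ++cycle)
  {
    conduit_cpp::Node exec;
    exec["catalyst/state/timestep"].set(cycle);
    exec["catalyst/state/time"].set(cycle * 0.1);
    // The simulation mesh and fields are described under
    // "catalyst/channels/..." following the Conduit Mesh Blueprint;
    // omitted here for brevity.
    catalyst_execute(conduit_cpp::c_node(&exec));
  }

  // -- finalize
  conduit_cpp::Node finalize;
  catalyst_finalize(conduit_cpp::c_node(&finalize));
  return 0;
}
```

With the environment variables below, catalyst_initialize dispatches these calls to the ParaView implementation, which runs the pipeline script.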

Connecting

This is an option you can enable in your Python script. See Save as Catalyst script in ParaView to enable it and optionally configure the port (the default works).

Use the Catalyst / Connect menu in ParaView so that ParaView listens for Catalyst connections. Then run your simulation as explained above. The configured pipeline will appear in your ParaView session.
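For context, the live-connection part of a script saved with Save as Catalyst script looks roughly like this in ParaView 5.10. This is a sketch, not a complete pipeline; the option names follow what ParaView generates, and the port shown is the default:

```python
from paraview import catalyst

options = catalyst.Options()
# Enable the live connection back to a listening ParaView GUI.
options.EnableCatalystLive = 1
options.CatalystLiveURL = 'localhost:22222'  # default port
```

If you change the port here, configure the same port in the Catalyst / Connect dialog in ParaView.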

Tip: use the Catalyst / Pause Simulation menu before starting the simulation, so it is paused on the first timestep, letting you inspect what you want. This is useful because the simulation closes the connection when it terminates (and the example codes run fast).

Thank you so much for this thorough instruction!

I am not sure what you mean about MPI. Do you mean we should use this?

/path/to/ParaView/bin/mpiexec

Currently, when I execute ./CxxFullExampleV2 I get this:

Loguru caught a signal: SIGSEGV
Stack trace:
10 0x55f00158a27a ./bin/CxxFullExampleV2(+0x427a) [0x55f00158a27a]
9 0x7fceb6d25bf7 __libc_start_main + 231
8 0x55f00158f942 ./bin/CxxFullExampleV2(+0x9942) [0x55f00158f942]
7 0x55f00158de36 ./bin/CxxFullExampleV2(+0x7e36) [0x55f00158de36]
6 0x7fceb7d5e69d catalyst_initialize + 174
5 0x7fceaf5eaec3 /home/karol/ParaView-5.10.0-MPI-Linux-Python3.9-x86_64/lib/catalyst//libcatalyst-paraview.so(+0x9ec3) [0x7fceaf5eaec3]
4 0x7fceaf3d5897 vtkInSituInitializationHelper::Initialize(unsigned long long) + 87
3 0x7fceacb0cea9 vtkMPICommunicator::InitializeExternal(vtkMPICommunicatorOpaqueComm*) + 121
2 0x7fceacb0cc08 vtkMPICommunicator::InitializeNumberOfProcesses() + 40
1 0x7fceb77262d7 PMPI_Comm_size + 55
0 0x7fceb6d43040 /lib/x86_64-linux-gnu/libc.so.6(+0x3f040) [0x7fceb6d43040]
( 0.003s) [main thread ] :0 FATL| Signal: SIGSEGV
Segmentation fault (core dumped)

The problem is that the distributed ParaView binaries are built with a given MPI, and when you build your example it may pick up another, incompatible MPI.

The correct fix is to build ParaView yourself, using your version of MPI.

A workaround for testing purposes is to comment out any lines related to MPI in FEDataStructure.cxx and in CMakeLists.txt (and set startXPoint=0 and endXPoint=numPoints[0]).

I followed your instructions.

catalyst_initialize fails and returns only this:

Attempting to use an MPI routine before initializing MPICH

That is weird. I reproduced it with the 5.10.0 release.
But I do not have the problem with a nightly release, so something has been fixed since then.

Also, I just wrote up this tip based on our discussion. Feel free to add some feedback from your experience!

Thank you for creating these tips, they are a very good and clear start!
The original Catalyst script from the example seems to work now, but when I try to load your catalyst2_pipeline.py I get this generic error:

(   1.827s) [pvbatch         ]vtkXOpenGLRenderWindow.:465    ERR| vtkXOpenGLRenderWindow (0x562e039aeb60): bad X server connection. DISPLAY=localhost:10.0. Aborting.
...

Can you detail your environment? Specifically, which OS is it, and are any graphical capabilities installed on the system?

Ubuntu 18.04 with a GeForce RTX 2070 GPU.

Can you try the following, please?

  • run it from the command line: ./bin/pvbatch catalyst2_pipeline.py
  • load it from the ParaView GUI: File / Load State

I use the nightly build: ParaView-master-5.10.0-657-gbd25839943-MPI-Linux-Python3.9-x86_64
When I load the state in ParaView it all looks OK; a grid is loaded.
The first command gives me a segmentation fault:

Loguru caught a signal: SIGSEGV
Stack trace:
45            0x401dbf /home/karol/Downloads/ParaView-master-5.10.0-657-gbd25839943-MPI-Linux-Python3.9-x86_64/bin/pvpython-real() [0x401dbf]
44      0x7f45c85d8bf7 __libc_start_main + 231
43            0x4023aa /home/karol/Downloads/ParaView-master-5.10.0-657-gbd25839943-MPI-Linux-Python3.9-x86_64/bin/pvpython-real() [0x4023aa]
42      0x7f45c5e5d0a4 vtkPythonInterpreter::PyMain(int, char**) + 916
41      0x7f45c46d2bf0 Py_RunMain + 1936
40      0x7f45c46ae948 PyRun_SimpleFileExFlags + 376
39      0x7f45c46aca25 /home/karol/Downloads/ParaView-master-5.10.0-657-gbd25839943-MPI-Linux-Python3.9-x86_64/bin/../lib/libpython3.9.so.1.0(+0x207a25) [0x7f45c46aca25]
38      0x7f45c466831b PyEval_EvalCode + 27
37      0x7f45c46682ee PyEval_EvalCodeEx + 62
36      0x7f45c46682a2 _PyEval_EvalCodeWithName + 82
35      0x7f45c4667f09 /home/karol/Downloads/ParaView-master-5.10.0-657-gbd25839943-MPI-Linux-Python3.9-x86_64/bin/../lib/libpython3.9.so.1.0(+0x1c2f09) [0x7f45c4667f09]
34      0x7f45c4512dcb _PyEval_EvalFrameDefault + 22283
33      0x7f45c450c187 /home/karol/Downloads/ParaView-master-5.10.0-657-gbd25839943-MPI-Linux-Python3.9-x86_64/bin/../lib/libpython3.9.so.1.0(+0x67187) [0x7f45c450c187]
32      0x7f45c4512d2d _PyEval_EvalFrameDefault + 22125
31      0x7f45c45622d1 _PyFunction_Vectorcall + 177
30      0x7f45c4667f09 /home/karol/Downloads/ParaView-master-5.10.0-657-gbd25839943-MPI-Linux-Python3.9-x86_64/bin/../lib/libpython3.9.so.1.0(+0x1c2f09) [0x7f45c4667f09]
29      0x7f45c4513750 _PyEval_EvalFrameDefault + 24720
28      0x7f45c450d2a7 /home/karol/Downloads/ParaView-master-5.10.0-657-gbd25839943-MPI-Linux-Python3.9-x86_64/bin/../lib/libpython3.9.so.1.0(+0x682a7) [0x7f45c450d2a7]
27      0x7f45c4565be3 /home/karol/Downloads/ParaView-master-5.10.0-657-gbd25839943-MPI-Linux-Python3.9-x86_64/bin/../lib/libpython3.9.so.1.0(+0xc0be3) [0x7f45c4565be3]
26      0x7f45c45622d1 _PyFunction_Vectorcall + 177
25      0x7f45c4667f09 /home/karol/Downloads/ParaView-master-5.10.0-657-gbd25839943-MPI-Linux-Python3.9-x86_64/bin/../lib/libpython3.9.so.1.0(+0x1c2f09) [0x7f45c4667f09]
24      0x7f45c450f3e9 _PyEval_EvalFrameDefault + 7465
23      0x7f45c456211e _PyObject_Call + 94
22      0x7f45c45bd50b /home/karol/Downloads/ParaView-master-5.10.0-657-gbd25839943-MPI-Linux-Python3.9-x86_64/bin/../lib/libpython3.9.so.1.0(+0x11850b) [0x7f45c45bd50b]
21      0x7f458379632d /home/karol/Downloads/ParaView-master-5.10.0-657-gbd25839943-MPI-Linux-Python3.9-x86_64/lib/python3.9/site-packages/paraview/modules/vtkRemotingAnimation.so(+0x3332d) [0x7f458379632d]
20      0x7f45bd544983 vtkSMSaveAnimationExtractsProxy::SaveExtracts() + 1395
19      0x7f45bd543715 vtkSMAnimationSceneWriter::Save() + 53
18      0x7f45c2a0adcf vtkAnimationCue::Tick(double, double, double) + 511
17      0x7f45bd53ecbd vtkSMAnimationScene::TickInternal(double, double, double) + 1901
16      0x7f45bcfbc453 vtkSMViewProxy::StillRender() + 307
15      0x7f45c68e8fc5 vtkPVSessionBase::ExecuteStream(unsigned int, vtkClientServerStream const&, bool) + 53
14      0x7f45c68e9f9b vtkPVSessionCore::ExecuteStream(unsigned int, vtkClientServerStream const&, bool) + 59
13      0x7f45c68ea162 vtkPVSessionCore::ExecuteStreamInternal(vtkClientServerStream const&, bool) + 242
12      0x7f45c5c291dd vtkClientServerInterpreter::ProcessStream(vtkClientServerStream const&) + 29
11      0x7f45c5c28f3e vtkClientServerInterpreter::ProcessOneMessage(vtkClientServerStream const&, int) + 1294
10      0x7f45c5c2880d vtkClientServerInterpreter::ProcessCommandInvoke(vtkClientServerStream const&, int) + 1229
9       0x7f45c5c281a9 vtkClientServerInterpreter::CallCommandFunction(char const*, vtkObjectBase*, char const*, vtkClientServerStream const&, vtkClientServerStream&) + 345
8       0x7f45c72d8810 vtkPVRenderViewCommand(vtkClientServerInterpreter*, vtkObjectBase*, char const*, vtkClientServerStream const&, vtkClientServerStream&, void*) + 8496
7       0x7f45bcf188d1 vtkPVRenderView::StillRender() + 97
6       0x7f45bcf246ea vtkPVRenderView::Render(bool, bool) + 2106
5       0x7f45ac70bcb0 vtkXOpenGLRenderWindow::Render() + 32
4       0x7f45ac66d151 vtkOpenGLRenderWindow::Render() + 81
3       0x7f45a7c006c9 vtkRenderWindow::Render() + 169
2       0x7f45ac31a69c vtkXRenderWindowInteractor::Initialize() + 204
1       0x7f45ac01225b XSync + 27
0       0x7f45c85f6040 /lib/x86_64-linux-gnu/libc.so.6(+0x3f040) [0x7f45c85f6040]
(   1.841s) [paraview        ]                       :0     FATL| Signal: SIGSEGV

Well, no problem for me with the same binary (on Arch Linux), and I am out of ideas …

As it seems unrelated to Catalyst, maybe open a new, clean thread about this pvbatch issue to catch the attention of people besides me …

Yes, I realize that the problem is most likely specific to my machine. I will try on another system and update you about the outcome. Thank you so much again for all the help.