Catalyst V2 for OpenFOAM

I have an interesting use-case for in-situ processing and have been looking to become a Catalyst user for a while. I read the blog post about the new API, went through some examples in the ParaView source tree and the basic Conduit examples, and, thus encouraged… dived into OpenFOAM.

I am aware of the function object based on the legacy API, but before trying it on my application I decided to adapt it to the new API. I did not have any particular reason other than wanting to learn more about the internals, and also to future-proof the code, as the team I am in tends to use the latest ESI OpenFOAM together with the latest ParaView and VTK.

Unfortunately, I underestimated the difficulty. I think I understand the overall structure, and I am very impressed with the progress that has been made on incorporating the different sources of information that OpenFOAM produces, but I am finding it hard to understand what is happening at the lower levels. I am happy to do some of the programming work over the next 2-3 months, but could use some supervisory support. This is all in preparation for a bigger project where I expect that flexibility will be important.

My current, and still fairly early, stumbling block is passing the mesh information. I can see that the current function object uses the vtuAdaptor and vtuCells/vtuSizing classes from the main source code. Do I understand correctly that the translation from OpenFOAM internals to VTK happens through instances of these classes? Under the new API, is it advisable to translate into vtkUnstructuredGrid and pass that through Conduit, or should I rather translate the full OpenFOAM mesh into the Conduit mesh format?


Let me caveat this by saying I don’t know much about how OpenFOAM’s current Catalyst adaptor works. However, I’d guess it takes the data arrays from OpenFOAM, converts them to a vtkMultiBlockDataSet comprising vtkUnstructuredGrids, and passes that on to ParaView using the legacy Catalyst API.

What changes with Catalyst V2 is this need to convert to VTK data types. You won’t be creating a VTK data type of any sort at all; if you ever see an include <vtk....h>, that’s a clear indication that something is not right. What you pass through Conduit is the arrays directly. To make sense of these arrays, you provide some metadata that lets the other side know how to interpret them. The Conduit Mesh Blueprint describes a way to express several types of meshes.
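
To make that concrete, here is a minimal, self-contained sketch (assuming the standalone Conduit library headers conduit.hpp and conduit_blueprint.hpp; the node and coordset names are just illustrative) of the idea: raw data plus descriptive metadata, verified against the Mesh Blueprint, and no VTK types anywhere.

#include <conduit.hpp>
#include <conduit_blueprint.hpp>

int main()
{
    conduit::Node mesh;

    // A tiny 2x2x2-point uniform grid, described purely through metadata
    mesh["coordsets/coords/type"].set_string("uniform");
    mesh["coordsets/coords/dims/i"].set(2);
    mesh["coordsets/coords/dims/j"].set(2);
    mesh["coordsets/coords/dims/k"].set(2);

    mesh["topologies/topo/type"].set_string("uniform");
    mesh["topologies/topo/coordset"].set_string("coords");

    // Verify that the description conforms to the Mesh Blueprint
    conduit::Node info;
    return conduit::blueprint::mesh::verify(mesh, info) ? 0 : 1;
}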

Currently, it’s easy to describe a mesh split into partitions (e.g. vtkPartitionedDataSet); describing a collection of partitioned datasets together with relationships between them is still on the todo list. I plan to work on that next, so we’ll have something for it in the next few weeks.

Thanks for clarifying.

Yes, the current code calls several classes sitting in $FOAM_SRC/conversion/vtk and $FOAM_SRC/fileFormats/vtk. I have still not understood exactly how they relate to each other, but it seems to me that the copying of the mesh representation data happens in vtuSizing::populateArrays, and that leads to the creation of a vtkUnstructuredGrid object.

I have read about the Mesh Blueprint and I am currently experimenting with pure hex meshes. The old-API adaptor covers many different element topologies, so focusing on hexes is a massive simplification, but I think it’s OK for my educational purposes.

I have not completely understood your last paragraph. Are you saying that partitioned datasets will not be seen as a single object by Catalyst2?

Currently in Catalyst2, you can describe a single partitioned dataset just fine. Thus a collection of unstructured grids that are simply partitions of the whole is representable. You can even represent multiple such partitioned grids. What’s missing is the case where you have multiple such grids each forming different components of an assembly, for example; there’s currently no way to provide that information to Catalyst. Ideally, we’d have an extension to the protocol on the ParaView side that lets you provide it.

Thanks!

I have made some progress, although only because I made some sacrifices in generality. For now I will focus on static meshes with hex elements. I am aiming for zero-copy, and therefore I went with constructs like this (iter is a mesh region):

const label nPoints = iter.val()->nPoints();
meshData["coordsets/coords/type"].set_string("explicit");
// TODO: Test this in parallel.
meshData["coordsets/coords/values/x"].set_external(
    const_cast<double *>(iter.val()->points().cdata()->cdata()),
    nPoints,
    /*offset=*/0,
    /*stride=*/3 * sizeof(double));
meshData["coordsets/coords/values/y"].set_external(
    const_cast<double *>(iter.val()->points().cdata()->cdata()),
    nPoints,
    /*offset=*/sizeof(double),
    /*stride=*/3 * sizeof(double));
meshData["coordsets/coords/values/z"].set_external(
    const_cast<double *>(iter.val()->points().cdata()->cdata()),
    nPoints,
    /*offset=*/2 * sizeof(double),
    /*stride=*/3 * sizeof(double));

I think ultimately there will be a need to create classes in src/conversion and src/fileFormats to handle this more generically.

I have several questions:

  1. Is it possible with the new API to pipe the objects back into an interactive ParaView session?
  2. Are there any working examples for multi-rank producers? The examples I found in Examples/Catalyst2 appear to be all single-rank.
  3. Is it possible to suppress the missing 'options' warning?
  4. To test velocity transfer I created a script which draws glyphs and colours them by magnitude. It seems to work, but it appears to produce an SOA warning. If you have an idea what may be causing it, please advise.

Options warning

(  13.617s) [pvbatch         ]        v2_internals.py:150   WARN| Module 'slice_extractor' missing Catalyst 'options', will use a default options object

SOA warning

WARN| 23vtkSOADataArrayTemplateIdE (0x55a3e7539640): GetVoidPointer called. This is very expensive for non-array-of-structs subclasses, as the scalar array must be generated for each call. Using the vtkGenericDataArray API with vtkArrayDispatch are preferred. Define the environment variable VTK_SILENCE_GET_VOID_POINTER_WARNINGS to silence this warning.

Great!

As far as the SOA warning goes, it’s just an indication that some filter needs to be updated. Typically, a debugger will help isolate the filter that needs to be updated to stop using raw array pointers and instead use the array access infrastructure documented here.
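
For illustration, a hedged sketch of that pattern (not tied to any particular ParaView filter): dispatch on the concrete array type and iterate over a value range instead of calling GetVoidPointer(), which works for both AOS and SOA arrays without forcing a copy:

#include <vtkArrayDispatch.h>
#include <vtkDataArray.h>
#include <vtkDataArrayRange.h>

struct SumWorker
{
    double Sum = 0.0;

    template <typename ArrayT>
    void operator()(ArrayT* array)
    {
        // The value range handles both AOS and SOA layouts in place
        for (const auto value : vtk::DataArrayValueRange(array))
        {
            this->Sum += static_cast<double>(value);
        }
    }
};

double sumValues(vtkDataArray* array)
{
    SumWorker worker;
    if (!vtkArrayDispatch::Dispatch::Execute(array, worker))
    {
        worker(array); // fallback: generic (slower) vtkDataArray path
    }
    return worker.Sum;
}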

  1. Is it possible with the new API to pipe the objects back into an interactive ParaView session?

It’s experimental, but Live Catalyst is still supported. When exporting the Catalyst script, check the option to enable Live.

Are there any working examples for multi-rank producers? The examples I found in Examples/Catalyst2 appear to be all single-rank.

Catalyst2/CxxImageDataExample is indeed multi-rank.

  1. Is it possible to suppress the missing 'options' warning?

Not currently, but I think we can make it not a warning and just log it as TRACE.

Hi @robertsawko and @utkarsh.ayachit

I have been delaying any updates on the OpenFOAM/catalyst interfaces, pending time and waiting for the dust to settle a bit on the conduit interface. I’m not actually sure how the new interface will work for us. There are probably enough details that it might make more sense to discuss as a gitlab issue - either with the kitware.com gitlab or the openfoam.com one.

The current setup of the OpenFOAM/catalyst interface attempts to have a minimal copying mechanism. Since OpenFOAM is face-based, we don’t have any mesh primitives (hex, tet etc) to hand over, but have to determine them and copy them into the VTK format, while also handling things like flipping the orientation of prisms etc.

To an outsider, the code will look fairly opaque, but essentially it does the following:

  • use OpenFOAM to walk across the mesh to determine the overall sizing: counts of the different cell types and the required allocation
  • use this information to allocate vtkCellArray directly
  • wrap the VTK WritePointers as various OpenFOAM UList<T> types (zero-copy)
  • use OpenFOAM to walk the wrapped types and fill in the values.

This keeps all of the sizing and populating of arrays within the realms of OpenFOAM, where we have the best overview of how things are set up, but delegates all of the memory management and final data structures to VTK.
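
Purely to illustrate those steps, a hypothetical sketch of the allocate-then-wrap pattern (the names are illustrative, not the actual vtuSizing code, and it assumes the VTK 9 offsets/connectivity layout of vtkCellArray):

#include "UList.H"

#include <vtkCellArray.h>
#include <vtkIdTypeArray.h>
#include <vtkNew.h>

void buildCells(vtkCellArray* cells, vtkIdType nCells, vtkIdType connectSize)
{
    // 1. Allocate the VTK storage directly, using the counts from the
    //    OpenFOAM sizing pass
    vtkNew<vtkIdTypeArray> offsets;
    vtkNew<vtkIdTypeArray> connectivity;
    offsets->SetNumberOfValues(nCells + 1);
    connectivity->SetNumberOfValues(connectSize);

    // 2. Wrap the VTK write pointers as OpenFOAM lists (zero-copy)
    Foam::UList<vtkIdType> foamOffsets(offsets->WritePointer(0, nCells + 1), nCells + 1);
    Foam::UList<vtkIdType> foamConnect(connectivity->WritePointer(0, connectSize), connectSize);

    // 3. ... OpenFOAM mesh-walking code fills foamOffsets/foamConnect here ...

    // 4. Hand the filled arrays over; VTK owns and manages the memory
    cells->SetData(offsets, connectivity);
}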

On the receiving (VTK) side, we then have some unstructured grid bits that are cached as smart pointers. Assembling the multiblocks is then fairly straightforward and doesn’t really involve OpenFOAM at all. Since we have cached smart pointers to the geometry bits, it is very simple to handle static meshes and/or point motion without triggering a full update. The output fields are just added on top of a shallow copy of the geometry.
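
Something like the following (illustrative names only) captures that shallow-copy idea:

#include <vtkCellData.h>
#include <vtkDoubleArray.h>
#include <vtkSmartPointer.h>
#include <vtkUnstructuredGrid.h>

vtkSmartPointer<vtkUnstructuredGrid> makeOutput(
    vtkUnstructuredGrid* cachedGeometry, vtkDoubleArray* field)
{
    auto output = vtkSmartPointer<vtkUnstructuredGrid>::New();
    output->ShallowCopy(cachedGeometry);    // reuse the cached points and cells
    output->GetCellData()->AddArray(field); // only the field data is new
    return output;
}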

/mark

Addendum:

  • the geometry for OpenFOAM/catalyst is all vtkMultiPieceDataSet

Eg, from the catalystFvMesh header:

    The output block structure:
    \verbatim
    |-- region0
    |   |-- internal
    |   |   |-- piece0
    |   |   |-- ...
    |   |   \-- pieceN
    |   \-- boundary
    |       |-- patch0
    |       |   |-- piece0
    |       |   |-- ...
    |       |   \-- pieceN
    |       |-- ...
    |       \-- patchN
    |           |-- piece0
    |           |-- ...
    |           \-- pieceN
    |-- ...
    \-- regionN
        |-- internal
        |   \-- ...
        \-- boundary
            \-- ...
    \endverbatim

where each piece corresponds to a rank.
It would be very easy to make other multi-piece composite datasets.

@olesenm, many thanks for commenting on this. I have studied your current implementation and have come across this block structure as well as the sizing and allocation code. I am happy to start a GitLab issue, but I also don’t want to push anyone into developing a general solution at this early stage of the new API.

My understanding is that the only difference with the new API is that we will be passing pointers to the node.set_external function instead of creating VTK objects.

My objective is to support some high-fidelity VOF work with in-situ post-processing, so even a naive implementation for hexes and the internal mesh only would work for our use-case. I am forcing myself to study the low-level details a little here partly for educational purposes, but also because in my previous experience with VTK and OpenFOAM this proved quite beneficial for automating large-scale studies.

Based on what you said, here’s how Catalyst V2 helps.

  1. Removes the dependency on VTK (or ParaView) during the build. You should be able to build the Catalyst adaptor without linking against VTK/ParaView. This is huge, since it saves having to build an SDK for that purpose (see the sketch after this list).
  2. Makes it easy to test with a newer release of ParaView. If ParaView 5.11 is released, but you have an older build of OpenFOAM that uses ParaView 5.10, fret not. You can still test out ParaView 5.11 for in situ processing without rebuilding OpenFOAM, simply by changing LD_LIBRARY_PATH.
  3. Uses the most suitable VTK API / data structure. Take an example: the vtkCellArray layout recently changed. Has the OpenFOAM adaptor been updated to use the new layout? The same can be true for other data structure changes; we are now replacing vtkMultiBlockDataSet. With the old approach, we then have to update the OpenFOAM-Catalyst adaptor as well. With the new approach, this becomes unnecessary: OpenFOAM simply puts out data as it knows best, fitting the mesh description defined by the Conduit Blueprint, and the ParaView side handles the conversion to VTK. Thus, each release of ParaView can make the best-case decisions for what that release supports.
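
As a rough sketch of what point 1 means in code, assuming the catalyst.hpp C++ wrapper shipped with the Catalyst stub library (the script path, channel name and the wrapper function names are illustrative):

#include <catalyst.hpp>  // only the Catalyst stub and Conduit, no VTK/ParaView

void catalystInitialize()
{
    conduit_cpp::Node node;
    node["catalyst/scripts/script/filename"].set_string("pipeline.py");
    catalyst_initialize(conduit_cpp::c_node(&node));
}

void catalystExecute(int timeStep, double time)
{
    conduit_cpp::Node exec;
    exec["catalyst/state/timestep"].set(timeStep);
    exec["catalyst/state/time"].set(time);

    // One channel per dataset; each MPI rank publishes its local piece here
    exec["catalyst/channels/grid/type"].set_string("mesh");
    auto mesh = exec["catalyst/channels/grid/data"];
    // ... fill 'mesh' with a Mesh Blueprint description (coordsets,
    //     topologies, fields), e.g. via set_external as shown earlier ...

    catalyst_execute(conduit_cpp::c_node(&exec));
}

void catalystFinalize()
{
    conduit_cpp::Node node;
    catalyst_finalize(conduit_cpp::c_node(&node));
}

Which ParaView build actually services these calls is then a run-time choice (e.g. via LD_LIBRARY_PATH, as in point 2), not a build-time one.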

Hi @robertsawko
Good to hear that you are working on this. I am wary of fully embracing it at this stage, despite the glowing endorsement from @utkarsh.ayachit . I certainly really like the idea of having an agnostic stub, but we will need to see how (or if) it works in detail. The function object itself is fortunately pretty small, but the devil is always in the details.

I would welcome it if you open an openfoam visualization issue to discuss/clarify OpenFOAM-specific aspects. Since you are not screaming for attention, it shouldn’t be too bad. Can probably leave the more general VTK-related bits here.

If we get in early with some ideas, we will eventually get there. We will probably need to add a conduit-tools header/code similar to what we have in fileFormats and conversion. From the way that conduit sounds, I’m not sure that we’ll get enough code reuse to warrant putting it into the OpenFOAM source tree at the moment. The regular vtkTools, on the other hand, are currently recycled for writing the legacy/xml formats as well as for the internal format (ie, plugins and catalyst v1).


Cheers,
/mark

I am not advocating a full embrace at all, especially right now. The Catalyst API changes are still under development. They should be considered preview-release at best, intended for early adopters and experimenters, not for production.

Hi @utkarsh.ayachit,

I certainly like the idea of all the benefits, but fear that we (OpenFOAM) will be a bit of an outlier in terms of what our meshes look like. A quick scan earlier of the conduit docs made it look like even things like prisms and pyramids are not considered to be primitives.

For our general polyhedrals, this means that we will still be building a ridiculous number of face, size and offset lists for each cell. Flipping the face points for half of the cells. Sending out the same face twice. Always.
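
For reference, here is a hedged sketch of roughly how the Mesh Blueprint spells an unstructured polyhedral topology (the Blueprint keys follow the Conduit documentation; the list arguments are placeholders for data that would be built from the OpenFOAM faces/owner/neighbour information), which is exactly the face, size and offset bookkeeping described above:

#include <conduit.hpp>
#include <vector>

void describePolyhedra(conduit::Node& topo,
                       const std::vector<conduit::int64>& cellFaces,     // face ids, cell by cell
                       const std::vector<conduit::int64>& facesPerCell,
                       const std::vector<conduit::int64>& cellOffsets,
                       const std::vector<conduit::int64>& facePoints,    // point ids, face by face
                       const std::vector<conduit::int64>& pointsPerFace,
                       const std::vector<conduit::int64>& faceOffsets)
{
    topo["type"].set_string("unstructured");
    topo["coordset"].set_string("coords");

    // cells -> faces
    topo["elements/shape"].set_string("polyhedral");
    topo["elements/connectivity"].set(cellFaces);
    topo["elements/sizes"].set(facesPerCell);
    topo["elements/offsets"].set(cellOffsets);

    // faces -> points
    topo["subelements/shape"].set_string("polygonal");
    topo["subelements/connectivity"].set(facePoints);
    topo["subelements/sizes"].set(pointsPerFace);
    topo["subelements/offsets"].set(faceOffsets);
}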

I agree with the potential future-proofing, and it will certainly make life easier for the Catalyst developers if you can make internal changes without breaking things for everyone else. The vtkCellArray change made a lot of sense though. Got ahead of that one in January 2020 already. Since the wrapper is header-only, we can handle different VTK versions without issue.

Cheers,
/mark

There are two ways to address this:

  1. propose extensions to upstream.
  2. add custom extensions to Catalyst to support more mesh descriptions.

I’d imagine we’ll end up with a combination of the two approaches over time.


Thanks for all the comments.

@olesenm, I just want to reiterate that I am not in any rush. I have a working prototype for what I need, which I called naiveHexCatalyst2FunctionObject. Later this week I can create an issue on GitLab to discuss it and the possibility of something more general. As @utkarsh.ayachit said, it may at least help inform the development of Conduit.

My prototype appears to be working with extractor objects. I am now going to build a more complex pipeline in Catalyst. I expect to encounter some issues, particularly when processing in parallel.