I am visualizing a mesh that has 155,520 parts (>10B elements) on 48 processes using pvbatch. As you can imagine, almost any contour value I choose is going to be empty on a large number of those parts. I assume this message is emitted once for every one of those empty parts:
( 396.069s) [pvbatch.39 ] vtkPVContourFilter.cxx:183 INFO| Contour array is null.
It pretty much floods my screen and slows down the run just from writing it all out. I wanted to confirm that this is all the message is trying to tell me (i.e., no other error, which seems to be the case since the image output looks good), and if so, whether there is something better than `2> /dev/null`, which I would rather avoid in case a real error message appears.
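One possibility that is more targeted than redirecting stderr might be to raise the vtkLogger stderr verbosity threshold inside the pvbatch script itself, so INFO-level chatter is suppressed but warnings and errors still get through. This is a sketch, not something I have tested at scale; the import path for `vtkLogger` can vary between ParaView builds:

```python
# Suppress INFO-level vtkLogger output (e.g. "Contour array is null.")
# while still letting WARNING and ERROR messages reach stderr.
# Import path is build-dependent; paraview.vtk.vtkCommonCore may also work.
from vtkmodules.vtkCommonCore import vtkLogger

vtkLogger.SetStderrVerbosity(vtkLogger.VERBOSITY_WARNING)
```

Since the flood of messages is tagged `INFO|`, this should silence exactly that class of output without hiding genuine errors the way `2> /dev/null` does.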
I have a workaround for this: inserting a MergeBlocks filter just ahead of the Contour filter makes the problem go away. I had ruled out this option earlier when I tried placing the MergeBlocks right after reading the data from files; that doubled the memory footprint and was therefore not feasible. It works in this case because I take a slice of the 3D data first and then contour the slice, so the merged data set is much smaller.
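For anyone hitting the same message, the pipeline I am describing looks roughly like this in `paraview.simple`. The reader call, array name, slice normal, and isovalue are placeholders for my actual setup:

```python
# Sketch of the workaround: Slice -> MergeBlocks -> Contour.
# MergeBlocks sits after the slice, so it only merges 2D data and
# avoids the memory blow-up of merging the full 3D multiblock input.
from paraview.simple import OpenDataFile, Slice, MergeBlocks, Contour

reader = OpenDataFile('data.vtm')        # placeholder for the real input files

slc = Slice(Input=reader)
slc.SliceType.Normal = [0.0, 0.0, 1.0]   # placeholder slice orientation

merged = MergeBlocks(Input=slc)          # collapses the many (mostly empty) parts

ctr = Contour(Input=merged)
ctr.ContourBy = ['POINTS', 'pressure']   # hypothetical array name
ctr.Isosurfaces = [0.5]                  # placeholder isovalue
```

With MergeBlocks in this position the "Contour array is null." messages disappear, presumably because the contour filter no longer visits empty blocks one by one.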
I expect I will still hit this problem in future visualizations where I want to contour the full 3D data without paying the memory penalty of MergeBlocks, so I hope someone has a more general solution to the contour filter's complaints.