Normalize Transfer Function based on Data Resolution

Simple question: I have 128x128x128 data. I use a combination of 50% opaque points in the transfer function, fine-tune it to my liking, and leave sections of the volume semi-transparent. Then, when I load 1024x1024x1024 data with the same (exported) transfer function, my data looks essentially opaque (not see-through anymore).

As far as I know, this is because the sampling of points is not normalized with respect to the data resolution: as rays have more points to sample (higher data resolution means more samples per ray), we reach 100% opacity much sooner than for lower-resolution data.
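To make the effect above concrete, here is a small sketch (my own illustration, not ParaView internals) using a simple front-to-back alpha-compositing model with a fixed per-sample opacity; the alpha value 0.01 is an arbitrary example:

```python
def accumulated_opacity(per_sample_alpha, n_samples):
    """Front-to-back compositing with constant per-sample alpha:
    total opacity = 1 - (1 - a)^n."""
    return 1.0 - (1.0 - per_sample_alpha) ** n_samples

# Same transfer function (alpha = 0.01 per sample), different resolutions:
low_res = accumulated_opacity(0.01, 128)    # ray crosses ~128 voxels
high_res = accumulated_opacity(0.01, 1024)  # ray crosses ~1024 voxels
print(f"128 samples:  {low_res:.4f}")   # ~0.72  -> still semi-transparent
print(f"1024 samples: {high_res:.5f}")  # ~0.99997 -> effectively opaque
```

With eight times as many samples per ray, the same per-sample opacity saturates to full opacity, which matches the behaviour described above.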

This seems like a feature too simple for ParaView not to have, since it is a basic normalization. Where can I toggle this option? (If it does not exist, it should be a top priority in my opinion!)

NOTE: The Log Scale opacity checkbox (which is not really the same as the normalization discussed here) also seems to fail for me:
Ranges not valid for log-space. Changed the range to (7.54269e-06, 0.0754269).
Then it seems to continue “working”, but I stop seeing the lines and can only move some dots around; I am not sure how to use it properly after that point.

Update: I have found the relevant display property, called ScalarOpacityUnitDistance, which Kenneth Moreland confirmed in the following forum post is exactly what I asked about:
https://public.kitware.com/pipermail/paraview/2011-August/022327.html

However, since that post is from 2011 and the UI has changed, I no longer see a Scale button in the Color Map Editor. You can update the value manually in the Python Shell with:
Display.ScalarOpacityUnitDistance = 0.084 (for example), …
… where Display is the representation object (not the view itself), created earlier as follows:
# show data in view
Display = Show(datareaderobject, renderView1, 'UniformGridRepresentation')
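Since the unit distance is proportional to the voxel size, you can rescale it yourself when swapping in a dataset of a different resolution over the same bounding box. The helper below is hypothetical (my own name, not a ParaView API); it just applies the ratio of resolutions:

```python
def rescaled_unit_distance(old_unit_distance, old_dims, new_dims):
    """Unit distance is proportional to voxel edge length, so scale it by
    the ratio of resolutions (assuming the bounding box is unchanged)."""
    return old_unit_distance * old_dims / new_dims

# 128^3 -> 1024^3 over the same 6.28^3 box: voxels are 8x smaller,
# so the unit distance shrinks by 8x to keep the rendering comparable.
print(rescaled_unit_distance(0.0849799, 128, 1024))  # ~0.0106
```

With the Display handle from above, you would then assign the result to Display.ScalarOpacityUnitDistance.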

However, my question remains: how can this be done automatically? What function is used for this scaling?
I discovered this while generating a Python Trace. The scaling is apparently applied when the data is loaded; however, my transfer functions still do not scale, as described in my original post from 1 September 2022.

Hello @Dosmi ,

Scaling is no longer done in the Color Map Editor but in the display properties panel, under Display -> Volume Rendering -> ScalarOpacityUnitDistance. There is a button to automatically compute the scaling for your current representation, right next to the value editor widget.

However, there may be an issue with how this value is saved relative to the colormap. Could you please share your datasets, or an equivalent dataset that reproduces the issue?

Hello, unfortunately I cannot see the ScalarOpacityUnitDistance button you are referring to.

My Display → Volume Rendering menu just has these two options:

[screenshot: Display → Volume Rendering options]

My color map is saved to a JSON file and applied to the other, higher-resolution dataset. I am quite sure it would be fine, provided I can find that opacity-scaling button.

You need to activate the advanced options to be able to see it:
[screenshot: advanced options toggle in the display properties panel]


Perfect, I see it now. Thanks for pointing it out.

When doing Python tracing and loading in various datasets, I saw that these values are computed:
Display.ScaleFactor = 0.6280000000000001
Display.GaussianRadius = 0.031400000000000004
Display.ScalarOpacityUnitDistance = 0.0849799064386218
Display.Slice = 63

This was for a 128x128x128 dataset with a bounding box of size 6.28x6.28x6.28.

How can I programmatically determine these values myself?

Got it. All the values below are computed automatically when you create the display and pass it the data reader object, like so:
Display = Show(data_reader_object, renderView1, 'UniformGridRepresentation')

But for anyone wondering in the future how to compute those values yourself, it is as follows:

  1. Display.ScaleFactor
    This is computed automatically from the reader …
    … and corresponds to Bounding Volume Dimension / 10.
    (Note: you can find the bounding volume dimensions as described here: [General] Size of object in ParaView -- CFD Online Discussion Forums)

  2. Display.GaussianRadius = Display.ScaleFactor / 20

  3. Display.ScalarOpacityUnitDistance = Voxel Dimension * sqrt(3), where Voxel Dimension = Bounding Volume Dimension / Data Dimensions. For example, Voxel Dimension = 6.28 / 128 = 0.0490625, therefore Display.ScalarOpacityUnitDistance = 0.0490625 * sqrt(3) ≈ 0.08498 for data resolution 128x128x128 and a 6.28x6.28x6.28 bounding box.
    The square-root factor comes from this being the corner-to-corner diagonal of a voxel, and it is mentioned in volume rendering papers and blogs.

  4. Display.Slice = Data Dimensions / 2 - 1
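The formulas above can be collected into one small sketch. This assumes a uniform grid with cubic voxels and the same extent on every axis; the function and dictionary key names are mine, chosen to mirror the display properties, not ParaView internals:

```python
import math

def display_defaults(data_dims, bbox_size):
    """data_dims: number of points per axis (e.g. 128);
    bbox_size: bounding box edge length (e.g. 6.28)."""
    voxel_size = bbox_size / data_dims          # e.g. 6.28 / 128
    return {
        "ScaleFactor": bbox_size / 10.0,
        "GaussianRadius": bbox_size / 10.0 / 20.0,
        # Corner-to-corner diagonal of one voxel: edge length * sqrt(3).
        "ScalarOpacityUnitDistance": voxel_size * math.sqrt(3),
        "Slice": data_dims // 2 - 1,
    }

defaults = display_defaults(128, 6.28)
print(defaults)
# ScaleFactor 0.628, GaussianRadius 0.0314,
# ScalarOpacityUnitDistance ~0.08498, Slice 63
```

For the 128x128x128 dataset in the 6.28x6.28x6.28 box, this reproduces the values from the Python trace above.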