I’ve noticed that when a dataset is transformed, not only the coordinates are scaled but also vector data such as the velocity field.

I’ve tried turning off the “Transform All Input Vectors” option, but nothing changed.

Any hint?

The Transform filter will always transform the array currently identified as the “Vectors” in a VTK sense.

A very simple workaround is the following:

- Open ParaView
- Open your dataset, Apply
- Add a calculator filter, click on “Vectors”, select your vector, Apply
- Add a transform filter, configure it, uncheck “transform all input vectors”, Apply
- Your vector has **not** been transformed.
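
The steps above can be sketched in scripting terms (a hedged sketch for `pvpython` using `paraview.simple`; the file name, the array name `velocity`, and the scale factors are placeholders, and property names may differ slightly between ParaView versions):

```python
# Sketch of the GUI workaround as a pvpython script (run with pvpython).
# Assumes the dataset has a vector array named "velocity"; adjust the
# file name, array name, and scale factors to your own data.
from paraview.simple import *

reader = OpenDataFile('dataset.vtk')

# Calculator: selecting the vector here makes the Calculator result the
# active "Vectors" array, so the original array loses that attribution.
calc = Calculator(Input=reader)
calc.Function = 'velocity'
calc.ResultArrayName = 'velocity_copy'

# Transform with "Transform All Input Vectors" unchecked.
transform = Transform(Input=calc)
transform.Transform.Scale = [2.0, 2.0, 2.0]
transform.TransformAllInputVectors = 0
```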

Hi Mathieu.

Thanks a lot for the info and the workaround.

Much appreciated.

For anyone looking at this in the future, the reasoning for why the vectors are scaled is given in a reply by Kenneth Moreland in this old mailgroup thread:

https://public.kitware.com/pipermail/paraview/2015-January/032957.html

To add a bit more detail to the answer, the reason the transform filter is not scaling all the vectors is that the proper transformation of a vector depends on its type, so the filter is being conservative and not transforming most of the vectors at all. To understand this, let’s back up and look at the different types of vectors that might be in scientific data.

First, there are vectors that deal with spatial movements. Things like displacements, velocities, and accelerations. These have units like m, m/s, and m/s^2, respectively. They clearly should be scaled along with the space like you described.

But other types of vectors shouldn’t be scaled. For example, consider electrical current. Current is the amount of electricity that passes through a point per unit of time, and a vector can give the direction of the movement. However, it is not clear that scaling up the model means that more electricity passes through the point. Other “vectors” might just be triples that have no spatial meaning and therefore should also not be scaled. Other spatial vectors have different types of transformations when being scaled. A normal vector, for instance, is transformed by the inverse transpose to maintain its perpendicular-to-surface property. A flux vector, which is related to surface area, would be scaled quadratically.
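
The normal-vector case can be checked numerically (an illustrative sketch with numpy, not ParaView code): under a non-uniform scale that doubles x, naively scaling a normal breaks its perpendicularity to the surface, while the inverse transpose preserves it.

```python
import numpy as np

# Non-uniform scaling of the model: x doubled, y and z unchanged.
M = np.diag([2.0, 1.0, 1.0])

# A displacement-like vector transforms linearly with the space.
velocity = np.array([1.0, 1.0, 0.0])
velocity_t = M @ velocity                    # becomes [2, 1, 0]

# A surface tangent also transforms linearly...
tangent = np.array([1.0, 1.0, 0.0])
# ...but the normal perpendicular to it must use the inverse transpose,
# or it stops being perpendicular to the transformed surface.
normal = np.array([1.0, -1.0, 0.0])          # perpendicular to tangent
normal_naive = M @ normal
normal_correct = np.linalg.inv(M).T @ normal

print(np.dot(M @ tangent, normal_naive))    # 3.0: naive scaling breaks perpendicularity
print(np.dot(M @ tangent, normal_correct))  # 0.0: inverse transpose preserves it
```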

So the problem is that the transform filter has no way to reliably determine how each vector field should be scaled. VTK data sets have the ability to identify certain fields with special attributes. For example, one field can be attributed as the “vectors” and another can be attributed as the “normals”. The transform filter assumes that the vectors are to be scaled linearly and the normals are transformed by the inverse transpose. The rest of the arrays are left alone.
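
The dispatch rule described above can be sketched as follows (an illustrative Python sketch, not the actual VTK implementation; the array names are made up):

```python
import numpy as np

def transform_arrays(matrix, arrays, vectors_name=None, normals_name=None):
    """Transform point-data arrays the way the rule above describes:
    the active "vectors" array linearly, the active "normals" array by
    the inverse transpose, and everything else left alone."""
    inv_transpose = np.linalg.inv(matrix).T
    out = {}
    for name, data in arrays.items():
        if name == vectors_name:
            out[name] = data @ matrix.T            # linear transform
        elif name == normals_name:
            out[name] = data @ inv_transpose.T     # inverse transpose
        else:
            out[name] = data.copy()                # left untouched
    return out

arrays = {
    'velocity': np.array([[1.0, 1.0, 0.0]]),   # attributed as "vectors"
    'normals':  np.array([[1.0, -1.0, 0.0]]),  # attributed as "normals"
    'current':  np.array([[1.0, 0.0, 0.0]]),   # no attribute
}
scale = np.diag([2.0, 1.0, 1.0])
result = transform_arrays(scale, arrays,
                          vectors_name='velocity', normals_name='normals')
print(result['velocity'])  # scaled linearly to [2, 1, 0]
print(result['current'])   # left alone
```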

So in summary, in a sense this is in fact a “bug” in that the fields are not being handled correctly. But the filter is doing about as best as it can, so there is no real fix for this bug.

-Ken
