Parallel rendering - crash when opening a large raw dataset

Hello,
I am working in a cluster environment using parallel rendering.
So far, it works with some datasets (TIFF stacks), but it crashes when I load a .raw file:
http://cdn.klacansky.com/open-scivis-datasets/pig_heart/pig_heart_2048x2048x2612_int16.raw

I’m attaching the log in case you want to read it: paraview_log.txt (13.3 KB)

The file is too large to open on a single client and convert to a TIFF stack.
Any thoughts? Any ideas on how to convert it to a TIFF stack or something ParaView can read?

Thanks

Your file can be read in parallel without any issue with the following code:

from paraview.simple import *

reader = ImageReader(FileNames=['pig_heart_2048x2048x2612_int16.raw'])
reader.DataScalarType = 'unsigned short'
reader.DataExtent = [0, 2047, 0, 2047, 0, 2611]

The only issue is making sure you have enough RAM on your remote nodes and on your GPUs. Reading your data on two nodes showed 10 GB of memory being used on each GPU for volume rendering (default settings).
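
In case it helps as context, here is a minimal sketch of connecting to a parallel pvserver session from the Python Shell before creating the reader above; the host name, port, and launch command are placeholders for illustration, not values from this thread:

from paraview.simple import *

# Hypothetical two-node pvserver, e.g. started on the cluster with
# `mpiexec -np 2 pvserver`. Once connected, pipeline objects created in the
# Python Shell (such as the ImageReader above) live on the server, so the
# .raw file is read in parallel across the server ranks.
Connect('cluster-node', 11111)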

Hello @jfavre
This is a Programmable Filter, correct?

No, this is not intended for a Programmable Filter. It is part of a regular pipeline; this code is meant for the embedded Python Shell.
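
For example, once the Python Shell is open (View -> Python Shell), a minimal sketch of the full pipeline could look like this; the volume representation and camera calls are my addition for illustration, not part of the reply above:

from paraview.simple import *

# Create the reader exactly as in the earlier reply.
reader = ImageReader(FileNames=['pig_heart_2048x2048x2612_int16.raw'])
reader.DataScalarType = 'unsigned short'
reader.DataExtent = [0, 2047, 0, 2047, 0, 2611]

# Show it as a volume in the active render view and render.
display = Show(reader)
display.Representation = 'Volume'
ResetCamera()
Render()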

It works! Thanks @jfavre.
Do you know why it wasn't opening using the regular File -> Open option?

I can only guess… perhaps the wrong reader was inadvertently chosen, or the extents or data scalar type were wrong. Anyway, problem solved. :grinning: