Is there a max file size in the IOSS Exodus reader? I have a large-memory node and a large dataset (about 500 billion elements). When I load this dataset with multiple processes, or with a single process using the legacy reader, the full data is read in. When I load it with the IOSS reader, it only picks up about 150 billion elements, but it doesn't error out or fail.
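For reference, here is a rough sketch of how I'm comparing the totals from the two readers. The file name is hypothetical, and this assumes a VTK build with IOSS support and 64-bit `vtkIdType` (needed once you go past ~2 billion cells):

```python
import vtk

def total_cells(composite):
    """Sum cell counts over all leaf datasets of a composite reader output."""
    total, it = 0, composite.NewIterator()
    it.InitTraversal()
    while not it.IsDoneWithTraversal():
        leaf = it.GetCurrentDataObject()
        if leaf is not None:
            total += leaf.GetNumberOfCells()
        it.GoToNextItem()
    return total

# Legacy Exodus reader (vtkMultiBlockDataSet output).
legacy = vtk.vtkExodusIIReader()
legacy.SetFileName("big_mesh.e")  # hypothetical path
legacy.UpdateInformation()
legacy.SetAllArrayStatus(vtk.vtkExodusIIReader.ELEM_BLOCK, 1)  # make sure every element block is enabled
legacy.Update()
print("legacy:", total_cells(legacy.GetOutput()))

# IOSS reader (vtkPartitionedDataSetCollection output).
ioss = vtk.vtkIOSSReader()
ioss.AddFileName("big_mesh.e")  # same hypothetical path
ioss.Update()
print("ioss:  ", total_cells(ioss.GetOutputDataObject(0)))
```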
What type of cells do you have? Also, how were you able to create such a big mesh? What are your compute nodes?
These are all hex cells in a multiblock grid; it's just a very highly resolved simulation. I'm not sure what information you're looking for regarding the nodes, but they have 2 TB of memory each.
Are there blocks missing, or do the existing blocks just have fewer cells? And do all of them have fewer cells, or just some of them?
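If it helps, something along these lines should print a per-block cell count from the IOSS reader's output, so you can compare against what you expect per block. This is just a sketch against the reader's default `vtkPartitionedDataSetCollection` output; the file name is hypothetical:

```python
import vtk

reader = vtk.vtkIOSSReader()
reader.AddFileName("big_mesh.e")  # hypothetical path
reader.Update()
collection = reader.GetOutputDataObject(0)  # vtkPartitionedDataSetCollection

# One partitioned dataset per element block; sum cells over its partitions.
for i in range(collection.GetNumberOfPartitionedDataSets()):
    pds = collection.GetPartitionedDataSet(i)
    name = (collection.GetMetaData(i).Get(vtk.vtkCompositeDataSet.NAME())
            if collection.HasMetaData(i) else "<unnamed>")
    cells = sum(pds.GetPartition(j).GetNumberOfCells()
                for j in range(pds.GetNumberOfPartitions())
                if pds.GetPartition(j) is not None)
    print(f"block {i} ({name}): {cells} cells")
```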
Given that there is no way for you to send me the dataset due to its size, the only way I can think of to figure this out is to meet and debug it together.
I checked with @Gregory_Sjaardema, the main author of IOSS, and there is no inherent limitation in IOSS itself that would lead to this behavior.