The Paraview import issue for exported compressed VTU and VTI files #23

Closed
MaygurovMV opened this issue Jan 29, 2023 · 6 comments


MaygurovMV commented Jan 29, 2023

Hello. Thanks for the crate! I've run into an issue when using ParaView to view compressed VTU and VTI files generated by vtkio.
Example of the VTI export code:

// Point data: a single F64 array with the concentration values for this time step.
let point = vec![
    Attribute::DataArray(DataArray {
        name: String::from("concentration"),
        elem: ElementType::default(),
        data: IOBuffer::F64(
            cloud
                .c
                .index_axis(Axis(0), t_index)
                .iter()
                .copied()
                .collect(),
        ),
    }),
];

// Image data (VTI) with a single inline piece covering the whole extent.
let vti_vtk = Vtk {
    title: String::from("cloud_vti"),
    file_path: Some(PathBuf::from(path.clone())),
    byte_order: ByteOrder::LittleEndian,
    version: vtkio::model::Version { major: 2, minor: 2 },
    data: DataSet::ImageData {
        extent: Extent::Ranges([-x_ext..=x_ext, 0..=y_ext, 0..=vert_ext - 1]),
        origin: [0., 0., 0.],
        spacing: [
            simulation.step as f32,
            simulation.step as f32,
            simulation.step as f32,
        ],
        meta: None,
        pieces: vec![Piece::Inline(Box::new(ImageDataPiece {
            extent: Extent::Ranges([-x_ext..=x_ext, 0..=y_ext, 0..=vert_ext - 1]),
            data: Attributes {
                point,
                cell: Vec::new(),
            },
        }))],
    },
};

// Serialize to the XML format with zlib compression (level 9) and write to disk.
let mut file = File::create(path.as_str()).unwrap();
let vtk_file = vti_vtk
    .try_into_xml_format(vtkio::xml::Compressor::ZLib, 9)
    .unwrap();
file.write_all(vtk_file.to_string().as_bytes()).unwrap();

If I use any type of compression (Zlib, Lzma, or Lz4), ParaView reports the following errors:

ERROR: In vtkXMLDataParser.cxx, line 564
vtkXMLDataParser (000001CAFC5A13B0): Error reading compression header.

ERROR: In vtkXMLDataParser.cxx, line 881
vtkXMLDataParser (000001CAFC5A13B0): ReadCompressionHeader failed. Aborting read.

ERROR: In vtkXMLStructuredDataReader.cxx, line 345
vtkXMLImageDataReader (000001CAED603050): Error reading extent -262 262 0 262 0 25 from piece 0

ERROR: In vtkXMLDataReader.cxx, line 410
vtkXMLImageDataReader (000001CAED603050): Cannot read point data array "concentration" from PointData in piece 0.  The data array in the element may be too short.

(The same four errors are repeated several more times in the ParaView output.)

If I use Compressor::None there is no issue and everything works, but the file size is too large for me.
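For reference, the uncompressed variant that works is just the same call with Compressor::None (a minimal sketch reusing vti_vtk and file from the snippet above; I'm assuming the level argument is simply ignored when no compressor is selected):

// Same vti_vtk and file as above; only the compressor choice changes.
let vtk_file = vti_vtk
    .try_into_xml_format(vtkio::xml::Compressor::None, 0)
    .unwrap();
file.write_all(vtk_file.to_string().as_bytes()).unwrap();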

I also tried the fix-pygmsh branch, vtkio = { version = "0.6.3", git = "https://github.com/elrnv/vtkio.git", branch = "fix-pygmsh" }, because its commit message says:

Implement compression support for individual (inline) DataArrays.

But, unfortunately, it still doesn't work.
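For completeness, the exact Cargo.toml entry I used for that branch (the same spec as above, formatted as a dependency table):

[dependencies]
vtkio = { version = "0.6.3", git = "https://github.com/elrnv/vtkio.git", branch = "fix-pygmsh" }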
Version of Rust: rustc 1.66.1
Version of Paraview: 5.11.0


elrnv commented Jan 29, 2023

Thank you for the issue!
Do you have a sample .vti file I can use to test/debug this issue (preferably one with reduced size)?
It seems I haven't quite finished fixing that branch.

MaygurovMV commented:

cloud.zip
I've attached a zip with 2 VTI files (the smallest and the biggest one) and 1 VTU file; they are meant to be rendered as plots in ParaView.
The compressor in all files is Zlib, and the compression level in all files is 9.


elrnv commented Feb 16, 2023

@MaygurovMV I have reworked the compression mechanism in release-0.7. Could you try your code with that branch and see if it is still causing an issue in ParaView?
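For example, something along these lines in Cargo.toml should pull that branch (just a sketch, adjust to your setup):

[dependencies]
vtkio = { git = "https://github.com/elrnv/vtkio.git", branch = "release-0.7" }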

MaygurovMV commented:

Hello, @elrnv.
Thanks for the fast support. I can't try the new features yet because I'm having trouble resolving the dependencies.
The first error from cargo:

error: no matching package named `quick-xml` found
location searched: https://github.com/elrnv/vtkio.git?branch=release-0.7#44da5dbd
required by package `vtkio v0.7.0 (https://github.com/elrnv/vtkio.git?branch=release-0.7#44da5dbd)`

I fixed it by manually cloning your repository and quick-xml. But then I hit another problem:

error[E0603]: trait `Write` is private
  --> D:\Code\vtkio\src\xml\se.rs:8:21
   |
8  | use quick_xml::{se::Write, DeError};
   |                     ^^^^^ private trait
   |
note: the trait Write is defined here
  --> D:\Code\quick-xml\src\se\mod.rs:92:5
   |
92 | use std::io::Write;
   |     ^^^^^^^^^^^^^^

error[E0603]: trait Write is private
    --> D:\Code\vtkio\src\xml.rs:3602:53
     |
3602 |     pub fn write(&self, writer: impl quick_xml::se::Write) -> Result<()> {
     |                                                     ^^^^^ private trait
     |
note: the trait Write is defined here
    --> D:\Code\quick-xml\src\se\mod.rs:92:5
     |
92   | use std::io::Write;
     |     ^^^^^^^^^^^^^^

Now I'm sorting through versions of quick-xml to fix it. Could you tell me which version of quick-xml you used? Or maybe you could fix the above issues yourself?
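For reference, my current workaround is roughly the following (just a sketch of what I did with the local clones shown in the errors above; the paths are specific to my machine):

# In my project's Cargo.toml: depend on the local vtkio clone instead of the git URL.
[dependencies]
vtkio = { path = "D:/Code/vtkio" }

# And inside D:/Code/vtkio/Cargo.toml I point quick-xml at its local clone:
# quick-xml = { path = "D:/Code/quick-xml" }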


elrnv commented Feb 17, 2023

Thank you, that was my fault, I hadn't pushed my changes to quick-xml yet. I pushed them now, and updated the Cargo.toml dependency so it should work out of the box.

Edit: Also the changes to quick-xml are being upstreamed here

@elrnv elrnv mentioned this issue Feb 17, 2023
MaygurovMV commented:

Thanks for the prompt support. Everything works even better than I expected.
