
Larger TDMS files return errors #15

Closed
jeremypmeyers opened this issue Jun 10, 2014 · 5 comments

@jeremypmeyers

There seems to be an upper limit on how large a TDMS file I can extract information from with npTDMS.

I have several TDMS files, which were all generated from the same test system. They can all be opened using the Excel Importer, and the extraction through Excel returns no errors. The files of size 20.7 MB and 39 MB open fine, but the file of size 54 MB returns an error.

Are there some settings that I can change or suggestions about how to handle the larger files with npTDMS?

@adamreeve (Owner)

What is the error? There shouldn't be any issues opening a file that size. For very large files (gigabytes) the data can be loaded into memmapped files by passing a directory as the memmap_dir parameter when creating a TdmsFile object, but that shouldn't be necessary.
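
For reference, a minimal sketch of the memmap_dir usage described above (the file name and temp directory are placeholders, not from this thread):

    import tempfile
    from nptdms import TdmsFile

    # Back the channel data with memory-mapped files on disk instead of
    # holding it all in RAM; only worthwhile for very large (GB) files.
    tdms_file = TdmsFile("large_file.tdms", memmap_dir=tempfile.gettempdir())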

@jeremypmeyers (Author)

Thanks for your help!

I've confirmed that the TDMS files can be opened with the Excel importer without issue up to a significantly larger size.

The error returned is:

ValueError: Data size 18446744073709531243 is not a multiple of the chunk size 2056


@jeremypmeyers (Author)

Here is the entire error message on a different file:

WARNING:nptdms.tdms:Last segment of file has unknown size

Traceback (most recent call last):
  File "funwithtdms.py", line 8, in <module>
    dms_file = TdmsFile("zee.tdms")  # dmc.tdms can be opened
  File "/Library/Python/2.7/site-packages/nptdms/tdms.py", line 148, in __init__
    self._read_segments(tdms_file)
  File "/Library/Python/2.7/site-packages/nptdms/tdms.py", line 160, in _read_segments
    previous_segment)
  File "/Library/Python/2.7/site-packages/nptdms/tdms.py", line 364, in read_metadata
    self.calculate_chunks()
  File "/Library/Python/2.7/site-packages/nptdms/tdms.py", line 391, in calculate_chunks
    "chunk size %d" % (total_data_size, data_size))
ValueError: Data size 18446744073709531163 is not a multiple of the chunk size 2032
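
As an aside, that implausible data size is itself a clue: 18446744073709531163 sits just below 2^64, which is what you would expect if the size were derived from the 0xFFFFFFFFFFFFFFFF "unknown next segment offset" sentinel that TDMS writers leave behind when a file isn't closed cleanly. A hypothetical sketch of the arithmetic (the overhead figure is assumed, chosen only so the subtraction reproduces the reported value):

    # Hypothetical sketch: an "unknown size" sentinel surfacing as a
    # near-2**64 data size after unsigned 64-bit arithmetic.
    UNKNOWN_OFFSET = 0xFFFFFFFFFFFFFFFF  # TDMS sentinel: next segment offset unknown
    metadata_overhead = 20452            # assumed lead-in/metadata bytes (illustrative)
    data_size = UNKNOWN_OFFSET - metadata_overhead
    print(data_size)                     # 18446744073709531163, not a multiple of 2032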


@adamreeve (Owner)

What version of npTDMS are you using? It looks like the last segment of the file has an unknown size, which can happen if LabVIEW crashes before closing the file properly. Version 0.5.0 fixed an issue where npTDMS would keep trying to read such a file and fail with an error like this, so updating to the latest version should fix your problem. Note, however, that npTDMS currently doesn't try to read the last segment when its size is unknown, so you will be missing a small amount of data.
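
For anyone hitting this later, a quick way to check which version is installed (this assumes the package exposes a __version__ attribute; the upgrade itself goes through standard pip):

    import nptdms

    # Print the installed npTDMS version; if it is older than 0.5.0,
    # upgrade with: pip install --upgrade npTDMS
    # (Assumes the nptdms package exposes __version__.)
    print(nptdms.__version__)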

@jeremypmeyers (Author)

I thought that I was using the most up-to-date version (0.6.1), but I did a full uninstall and reinstall and it seems to be working now. Thank you for your help!

