Error when downloading log from Aqualung i450T #1142
@rz83 seems that the parser in the library which we use to parse dives from the DC failed with: is there a way for you to extract the dive data and share it here?
I have extracted dive data in many formats from the official Aqualung DiverLog software. I hope you can use one of them...
thank you, i've examined the data you've provided. basically our library (libdivecomputer) for parsing RAW data from the dive computer complains that some samples (moments in time when the DC records information) have bad time stamps, for example:
i wrote a script to parse your data, yet i cannot find any samples that have times like the previous samples or samples that move back in time. it's either a case where the
Can you attach a memory dump of your dive computer to the bug report? Simply enable both the libdivecomputer logfile and dumpfile checkboxes in the Subsurface download dialog, and attach those two files.
Here it is @jefdriesen:
When I inspect the raw sample data, I see this:
It's the middle sample that is problematic. This looks like a completely bogus sample, with a timestamp that jumps forwards several hours. And thus when parsing the next sample, libdivecomputer detects that the time jumped backwards. But it's actually the previous sample that is bogus. Unfortunately I don't really have a solution for this problem.
@jefdriesen do you think you can turn this into a warning instead of an error and ignore such bogus samples?
@jefdriesen, I agree with @neolit123. This error stops the download of previous dives, so I can't get all of them.
If I could then I certainly would, but ignoring the bogus sample is not that easy. As you can see from my explanation, it's not the bogus sample that triggers the error, but the next good sample. At that point it's already too late, because the bogus sample has already been passed to the application. Simply logging the error and continuing is also a bad idea, because in that case you end up with timestamps that are no longer increasing monotonically. That will no doubt lead to very interesting problems in the application. That's why the error is there in the first place.

Note that the fact that the download is aborted when a dive fails to parse is an issue in Subsurface, and not in libdivecomputer.

@rz83 I doubt Diving Log is able to parse your dive, because it's using libdivecomputer as well.
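To illustrate the ordering problem described above, here is a minimal Python sketch (not actual libdivecomputer code; the function name and sample values are invented for illustration): a monotonicity check can only fire on the sample *after* the bogus one, by which point the bogus sample has already been handed to the caller.

```python
def parse_samples(timestamps):
    """Yield samples in order, raising on the first timestamp that moves backwards."""
    parsed = []
    prev = None
    for t in timestamps:
        if prev is not None and t < prev:
            # By now the bogus sample (prev) has already been emitted;
            # we only notice the problem when the next, valid sample arrives.
            raise ValueError(f"timestamp moved backwards: {prev} -> {t}")
        parsed.append(t)
        prev = t
    return parsed

# Samples at 10 s intervals, with one corrupted timestamp jumping hours ahead:
samples = [0, 10, 20, 14420, 40, 50]
try:
    parse_samples(samples)
except ValueError as e:
    print(e)  # "timestamp moved backwards: 14420 -> 40"
```

Note that the error names 40, the *good* sample, even though 14420 is the corrupted one: there is no way to tell the two cases apart at the point where the check fires.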
understood. thanks for the explanation, @jefdriesen. |
Would it make sense to discard samples whose time difference from the previous one is bigger than some threshold?
I haven't looked at the specific libdivecomputer backend for that device, but if @jefdriesen says it's hard to do, I'd trust him. He knows that code. What I've done when a computer stomped some memory for me or some of my mates is that I've hacked libdivecomputer into either ignoring that specific sample or just emitting it anyway, and post-processed the XML in vim. It's in no way a "generic" or "user friendly" solution, but hey, it works for me.

Wth. I dug into the code. 0x04 might be some other sample type? I'm just guessing here.

Or we could just build a hacked libdivecomputer with a Subsurface for you which simply ignores that sample, so you can download the rest of your dives and go on with life. I'm no use in building Windows binaries, but @neolit123 is. He might be able to hack up a "special" version for you.
Hey, I had forgotten that Travis builds Windows binaries for us. That way anyone (@jefdriesen ?) can hack up a test version for you and let Travis build the binaries.
Of course, I trust you @jefdriesen when you are saying it is difficult. If you want to compare with Subsurface, I attach here the Diving Log v6 dump + dump log (still on win10x64). Anyway @glance-, all the attempts of the developers are welcome... Sorry for my low knowledge about GitHub, but concerning the Subsurface download interruption when a parsing error occurs (as mentioned by @jefdriesen), do I have to create a new issue, or is this current one enough? If I compare Subsurface with Diving Log, all my previous dives are downloadable in Diving Log.
yep, any PR that is submitted here will build Windows binaries that users can test. |
the current issue is enough. you will get a notification (someone will mention you via @user-name) when we have binaries for you to test.
@neolit123 Using some kind of threshold is also something that crossed my mind. But then the question is: what should the value be? The data format allows large jumps, so any value you choose might trigger on valid data. So this risks filtering out valid data, and that's worse than failing on bad data. (This timestamped data format is just a bad idea in the first place. Along with the depth-based sample rate that some other Oceanic models support. But that's another story, and not something under our control.)

@glance- The simplest way to hack up something is to remove the "return DC_STATUS_DATAFORMAT" statement, and then afterwards remove the sample with the bogus timestamp from the xml.

@rz83 I checked your divinglog memory dump, and noticed something interesting. There is one dive (on 2018-02-18 12:00:00), where the raw data is not identical:
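A sketch of the post-processing half of that hack, assuming a simplified Subsurface-style XML with `<sample time="mm:ss min">` elements (the element and attribute names here are assumptions for illustration, not the exact Subsurface schema): once the parser is patched to emit the bogus sample anyway, it can be identified afterwards as the sample that jumps ahead of its successor while the successor still follows its predecessor.

```python
import xml.etree.ElementTree as ET

def seconds(attr):
    """Convert an assumed 'mm:ss min' time attribute into seconds."""
    mm, ss = attr.split()[0].split(":")
    return int(mm) * 60 + int(ss)

def drop_bogus_samples(xml_text):
    """Remove samples whose timestamp jumps ahead of the following sample."""
    root = ET.fromstring(xml_text)
    for dc in root.iter("divecomputer"):
        samples = dc.findall("sample")
        times = [seconds(s.get("time")) for s in samples]
        for i, s in enumerate(samples):
            prev = times[i - 1] if i > 0 else None
            nxt = times[i + 1] if i + 1 < len(times) else None
            # Bogus: this sample is later than its successor, while the
            # successor still follows the predecessor normally.
            if prev is not None and nxt is not None and times[i] > nxt and nxt >= prev:
                dc.remove(s)
    return ET.tostring(root, encoding="unicode")

# Example: the third sample jumps roughly four hours ahead of its neighbours.
xml = ('<divecomputer>'
       '<sample time="0:00 min" depth="10.0 m"/>'
       '<sample time="0:10 min" depth="11.0 m"/>'
       '<sample time="240:20 min" depth="0.3 m"/>'
       '<sample time="0:20 min" depth="12.0 m"/>'
       '</divecomputer>')
print(drop_bogus_samples(xml))  # the 240:20 sample is gone; the others survive
```

This heuristic only handles an isolated bogus sample in the middle of a dive; first/last samples and runs of corruption would need extra care, which is part of why doing it generically inside the library is hard.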
The middle sample is different, and this happens to be also the sample with the timestamp jump. That's probably not a coincidence! I think this means that either this sample got corrupted somehow after it was recorded (possibly due to a firmware bug overwriting the data), or it is still stored correctly but got corrupted during the transmission somehow. (Note that the latter case may seem a bit strange, because the data packets are checksummed, and those are verified. But nevertheless, I have seen this happen before, especially with older models. Sometimes when downloading the data twice, just a few minutes apart, the result is not identical. I have no idea what causes this.) The fact that the Oceanic shows no problems could be because it downloaded the data before it got corrupted.
Can there really be several minute jumps in valid data? |
It's probably not very common, but certainly not impossible. The thing is that data corruption is also not very common. So you are basically trying to implement a workaround for a problem that is not supposed to be there, but which might end up failing on valid data in some other corner case. Difficult to tell what is worse... |
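For illustration, a hypothetical threshold filter in Python showing both sides of the trade-off just described: it removes the corrupted jump, but the same rule silently drops valid samples around a legitimate long gap. The 10-minute threshold is an arbitrary assumption, which is exactly the problem.

```python
MAX_JUMP = 600  # 10 minutes, in seconds; any choice here is a guess

def filter_by_threshold(times, max_jump=MAX_JUMP):
    """Drop any sample whose timestamp jumps more than max_jump past the last kept one."""
    kept = []
    for t in times:
        if kept and t - kept[-1] > max_jump:
            continue  # looks corrupt... but might be a valid long gap
        kept.append(t)
    return kept

# Corrupted data: the filter recovers a clean, monotonic series.
print(filter_by_threshold([0, 10, 20, 14420, 30, 40]))  # [0, 10, 20, 30, 40]

# Valid data with a real ~20-minute gap: the filter wrongly discards good samples.
print(filter_by_threshold([0, 10, 1210, 1220]))  # [0, 10]
```

The second call is the corner case the library has no safe answer for: from the data alone, a large but legitimate gap is indistinguishable from a corrupted timestamp.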
I'm getting a similar error when importing freedives from an Aqualung i750TC. My dump and log are below if they are of any help. |
Describe the issue:
Issue long description:
I have an error while I am trying to download the dive log from my Aqualung i450T dive computer. Error while parsing the header.
Operating system:
I am using Windows 10 Home edition, 64 bits.
My dive computer is Aqualung i450T.
I tried it on Linux (Debian) and it is the same issue.
Subsurface version:
Subsurface version 4.7.7.
It is an official release.
Steps to reproduce:
Current behavior:
I have an error when "parsing header", so I don't see my dives and I can't import them.
Expected behavior:
I should see my dives and be able to import them.
Additional information:
subsurface.log