Address loss of GPS Time precision over time #6
A suggested alternative that I came up with here: If USGS or the LAS committee supplied a standard VLR to encode a hash table with GPS weeks indexed by point source ID, then we could return to GPS Week Time. All of the shortcomings of storing time as a double-precision float would then be nullified. I have already implemented something along these lines in my software and encourage everyone in our offices to process with GPS Week Time encoding, but it will only be useful to others if it's integrated into a standard.

Karl's reaction: I would have no problem with this approach. Then we could improve the precision of time as well.

Martin's reaction: Interesting idea, but I have to disagree. This would only work when the LAS files store flight strips. But more often LAS files store tiled data, and points from different flight strips that are tiled into the same tile are not necessarily from the same GPS week. I have spent a lot of time with the national LiDAR mapping project of the Philippines, and the neighboring "blocks of flight lines" are often flown months or even years apart. I think the time stamp must be unique per return, not per file. Using differential coding as done by typical compressors (like open LASzip as well as proprietary MrSID or zLAS), a continuously repeated GPS week will compress away to almost nothing. We also want to get away from storing the time as a floating-point number. A second-of-the-week value stored as a floating-point number has drastically varying precision, starting at near infinite at the beginning of the week (t close to 0.0) and dropping much lower towards the end of the week (t close to 604,800.0).
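Martin's point about the varying precision of a float-encoded second-of-the-week can be checked directly. This sketch uses Python's `math.ulp` to compare the spacing of representable doubles near the start and end of a GPS week:

```python
import math

WEEK_SECONDS = 604_800.0

# Spacing between adjacent representable doubles (ULP) near the
# start of the GPS week vs. near its end.
ulp_start = math.ulp(1.0)            # ~2.2e-16 s near t = 1 s
ulp_end = math.ulp(WEEK_SECONDS)     # ~1.2e-10 s near t = 604,800 s

print(f"ULP near start of week: {ulp_start:.3e} s")
print(f"ULP near end of week:   {ulp_end:.3e} s")
print(f"precision ratio: {ulp_end / ulp_start:.1e}")
```

The spacing near the end of the week is more than five orders of magnitude coarser than near the start, which is exactly the non-uniformity Martin describes.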
Martin's suggestion: Currently the LAS format supports a spacing of 0.0596 microseconds, or 59.6 nanoseconds. Let's assume we agree that a spacing of 10 nanoseconds will suffice; then here is a very simple 64-bit representation based on the well-understood GPS week + GPS time-of-week concept:
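The concrete struct from the original comment is not preserved in this thread, so the field widths below are purely illustrative: a 16-bit GPS week plus a 46-bit time-of-week in 10 ns ticks, with 2 bits spare, packed into a single 64-bit value.

```python
# Hypothetical 64-bit layout (NOT the official proposal; field widths
# are assumptions): 16-bit GPS week | 46-bit time-of-week in 10 ns
# ticks | 2 spare bits absorbed by the week field's headroom.
TOW_BITS = 46

def pack_gps_time(week: int, tow_10ns: int) -> int:
    assert 0 <= week < 2**16
    assert 0 <= tow_10ns < 2**TOW_BITS  # 604,800 s / 10 ns fits in 46 bits
    return (week << TOW_BITS) | tow_10ns

def unpack_gps_time(value: int) -> tuple[int, int]:
    return value >> TOW_BITS, value & (2**TOW_BITS - 1)

week, tow = unpack_gps_time(pack_gps_time(2190, 12_345_678_901))
```

Because the week occupies the high bits, treating the packed value as a single U64 still sorts chronologically, which is what the next paragraph relies on.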
When they merely need a unique sort or hash key (the most common use of GPS time), folks may still decide to access these 64 bits of information as a single U64 integer or a single F64 double float using such a construct.
To store the 604,800 seconds of the week with 10 ns spacing we need to store numbers up to 60,480,000,000,000 in size. This can be done with 46 bits, as 2^46 = 70,368,744,177,664. To get 1 ns spacing (but that seems excessive to me) we could change the struct to:
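The struct for the 1 ns variant is not reproduced in this thread, but the bit-width arithmetic above can be sanity-checked in a few lines (the 50-bit figure for 1 ns is my own calculation, not from the thread):

```python
# Check the bit-width arithmetic for storing a week's worth of ticks.
WEEK_SECONDS = 604_800

ticks_10ns = WEEK_SECONDS * 100_000_000      # 10 ns ticks per week
assert ticks_10ns == 60_480_000_000_000
assert ticks_10ns < 2**46                    # 46 bits suffice for 10 ns
assert 2**46 == 70_368_744_177_664

ticks_1ns = WEEK_SECONDS * 1_000_000_000     # 1 ns ticks per week
assert ticks_1ns >= 2**49                    # 49 bits are not enough...
assert ticks_1ns < 2**50                     # ...so 1 ns spacing needs 50 bits
```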
In response, Lewis expressed concern that we should retain byte compatibility between revisions of a particular version. However, this might be a viable option for a new version of LAS (1.5/2.0?).
From Martin: Can we introduce a "Newly Adjusted GPS Standard Time Stamp" using another of the global encoding bits that subtracts 1.2 or 1.3 billion instead of 1 billion and still do this as a LAS 1.4 revision?
Some related discussions on this topic are happening in the LAStools user forum.
A couple of comments here, since this was brought up as relevant to potentially adding a LAS 1.5:
This came up on our last call in July, when @lgraham-geocue suggested the following (my words summarizing his):
I really like this idea because it's simple, easy to implement, and retains compatibility with existing point records. However, adding this option opens the possibility of wildly different offset values entering the ecosystem, which would potentially make interoperability between datasets extremely painful. I see two main paths to prevent this:

Option 1: Encode the offset as some large increment to make consistency easier to accomplish. For example...
Option 2: Use the wiki to publish recommended offset values at a regular interval, such as every decade. That way, anyone doing a survey in 2020-2030 will use the same offset value and data interoperability becomes straightforward, making it unlikely to have mixed offsets for a given project or application. These published values would be a recommendation, not a requirement, so a data buyer with a multi-year contract could dictate a specific offset value that's optimized for their particular application.

My thoughts: The two options aren't mutually exclusive. Personally, I think I like option 1.2 in combination with option 2. In this case the offset would be integer increments of 100M, but we'd publish recommended offset values for each decade that attempt to minimize precision loss for the decade. For example...
Values in between, such as 6, 7, or 12, would be valid for users targeting specific time periods, but the defaults would be the recommended values. There's also nothing special about decades... we could also do 5-year recommendations.

Question: What do you guys think? I haven't done the math to know whether 100M seconds is sufficient to prevent measurable precision loss, but my hunch is that it'd be okay.
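A rough version of that math (my own arithmetic, not from the thread): with offsets in 100M-second increments, a timestamp's adjusted value never exceeds 100M seconds before the next increment kicks in, so the worst-case double spacing is the ULP just under 1e8:

```python
import math

# Worst-case spacing of a double whose magnitude stays below 100M
# seconds (the span covered by one 100M-second offset increment).
worst = math.ulp(100_000_000.0)
print(f"worst-case spacing: {worst:.3e} s")   # about 0.015 microseconds

# Comfortably finer than the 0.1 microsecond target discussed below.
assert worst < 0.1e-6
```

So 100M-second increments would keep precision loss well below anything measurable at current pulse rates.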
Variant of option 1.2: encode the offset as a signed int8.
I suggest not making it an 8-bit field but a 32-bit field, and specifying the offset in millions of seconds. For any LAS 1.4 to LAS 1.5 converted LiDAR using Standard Adjusted GPS time, the number stored to that new field would then be 1000. If we need to subtract another billion seconds (or maybe only 500 million seconds) to regain precision, then this number becomes 2000 (or maybe 1500). So 1000 or 500 would be the specification-recommended increments. We could even add a simple table that suggests which offsets should be used for which years. However, should we ever need more resolution in the GPS time, we can lower these 1000 or 500 increments to a smaller number without having to change the specification. Only the recommended offset table would need to change.
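A minimal sketch of the decoding this implies, assuming a 32-bit header field holding the offset in millions of seconds (the function and field names are hypothetical, not from the draft spec):

```python
# Hypothetical decoder for the proposed offset field: the header stores
# the offset in millions of seconds, so a LAS 1.4 file converted from
# Adjusted Standard GPS Time (offset = 1 billion s) would carry 1000.
def decode_timestamp(stored: float, offset_millions: int) -> float:
    """Recover the full GPS time from a stored point timestamp."""
    return stored + offset_millions * 1_000_000

full = decode_timestamp(316_000_000.0, 1000)
assert full == 1_316_000_000.0
```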
Thanks for the input @rapidlasso! Since you've done the math on precision loss and I haven't, I'll defer to your judgment on the recommended interval size. 1M is fine with me. I like the idea of having the option to modify precision without changing the specification. I don't have any real objections to a 32-bit integer, but it seems to me that a 16-bit field would be more than enough. Even at the max value, 32768 * 1e6 seconds is ~1039 years.
I finally took the time to try to recreate Martin's math from back in the day, and came up with pretty similar results. I'm relying pretty heavily on his explanation, but this spreadsheet will hopefully help folks trying to wrap their brains around this issue: Adjusted_Time_Table_20211221.xlsx

As things stand in LAS 1.4 Adjusted Standard GPS Time, the 1e9 offset results in a zero value around September 14, 2011 (highlighted in yellow). On that date we had the maximum possible precision and the maximum possible pulse rate that can be discretely encoded. The spreadsheet shows that we currently have around 0.1192 microsecond precision (max 8.4 MHz) and will degrade to 0.2384 microsecond precision (max 4.2 MHz) around 2028. It doesn't take much imagination to picture 4 MHz systems in seven years, given that systems with 2 MHz are already on the market.

Hope that helps people understand the problem, what we're doing about it, and how to determine which LAS 1.5 time value they'll want to use. Entering a value of 65535 results in a zero date of Sept 24, 4056, so a uint16 will be more than enough.
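The core of the spreadsheet can be approximated in a few lines. This sketch measures the double-precision spacing (ULP) of the adjusted timestamp; note the spreadsheet's quoted precisions appear to be twice these ULP values (a different but defensible definition of usable precision), while the doubling date around 2028 comes out the same:

```python
import math
from datetime import datetime, timezone

GPS_EPOCH = datetime(1980, 1, 6, tzinfo=timezone.utc)
ADJUSTMENT = 1_000_000_000  # subtracted by Adjusted Standard GPS Time

def adjusted_spacing(when: datetime) -> float:
    """Spacing of representable doubles (ULP) for the adjusted timestamp
    at `when`. Leap seconds are ignored; they don't change the result."""
    adjusted = (when - GPS_EPOCH).total_seconds() - ADJUSTMENT
    return math.ulp(adjusted)

# The spacing doubles each time the adjusted value crosses a power of
# two; the next doubling lands in 2028, matching the spreadsheet.
late_2021 = adjusted_spacing(datetime(2021, 12, 21, tzinfo=timezone.utc))
after_2028 = adjusted_spacing(datetime(2029, 1, 1, tzinfo=timezone.utc))
assert after_2028 == 2 * late_2021
```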
As a matter of fact, that spreadsheet leads us straight to our recommended offset table, too. If I arbitrarily pick 0.1 microseconds (100 nanoseconds) as the target precision, then we have the constraint we need to provide recommended offsets:
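The table itself did not survive into this thread, but here is one way such offsets could be generated; this is my own sketch of the idea, not the spreadsheet's table, and it assumes the 32-bit "millions of seconds" field proposed earlier:

```python
from datetime import datetime, timezone

GPS_EPOCH = datetime(1980, 1, 6, tzinfo=timezone.utc)

def recommended_offset_millions(decade_start_year: int) -> int:
    """Offset (in millions of seconds, per the 32-bit-field proposal)
    that re-zeroes the adjusted timestamp near the start of the given
    decade, keeping precision near its maximum for that decade."""
    start = datetime(decade_start_year, 1, 1, tzinfo=timezone.utc)
    gps_seconds = (start - GPS_EPOCH).total_seconds()
    return int(gps_seconds // 1_000_000)

for year in (2020, 2030, 2040):
    print(year, recommended_offset_millions(year))
```

Since a decade is ~315M seconds and the 0.1 microsecond target tolerates adjusted magnitudes of a couple hundred million seconds, one recommended offset per decade lands close to the target.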
What do you all think about the name Offset GPS Time?
I like Offset GPS Time.
Lewis Graham
GeoCue Group, Inc.
During today's LWG call we completed a VERY rough first draft of the Offset GPS Time language for LAS 1.5. You can review the PDF here: https://github.com/ASPRSorg/LAS/actions/runs/5754635108

Specifically, we resolved that we do in fact want to add a new Global Encoding bit to indicate that the Time Offset value in the header should be used, and that it is to be used in concert with the existing GPS Time Type bit. That is, both bits must be set if Offset GPS Time is being used in the LAS 1.5 file, and it's invalid to have the Offset Time flag set and the GPS Time Type bit NOT set. In other words....

We will continue this work next month. Please feel free to add any edits or comments between now and then.
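The validity rule resolved above can be sketched as a simple check. GPS Time Type is bit 0 of the Global Encoding field in LAS 1.4; the position of the new Time Offset bit is not fixed in this thread, so bit 5 below is a placeholder assumption:

```python
# Bit 0 is the existing LAS 1.4 GPS Time Type bit; the Time Offset
# bit's position is a placeholder, not from the draft spec.
GPS_TIME_TYPE_BIT = 1 << 0
TIME_OFFSET_BIT = 1 << 5      # hypothetical position

def offset_flags_valid(global_encoding: int) -> bool:
    """The Offset flag set without GPS Time Type set is invalid."""
    if global_encoding & TIME_OFFSET_BIT:
        return bool(global_encoding & GPS_TIME_TYPE_BIT)
    return True

assert offset_flags_valid(GPS_TIME_TYPE_BIT | TIME_OFFSET_BIT)  # Offset GPS Time
assert offset_flags_valid(GPS_TIME_TYPE_BIT)                    # Adjusted Std GPS Time
assert not offset_flags_valid(TIME_OFFSET_BIT)                  # invalid combination
```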
During today's LWG call we added the above table to the GPS Time definition and made some wording changes. I think this section is pretty close to being complete. You may review the PDF with these changes here: https://github.com/ASPRSorg/LAS/actions/runs/6113674176

Please do let me know if you see anything that needs to change.
Rupesh noted at the end of today's LWG call that the Adj Std GPS Time example was unclear. That was clarified in this build: |
One more update in this month's LWG meeting to clarify the point's timestamp description: Last chance for comments! Will merge the PR #140 into the 1.5 Draft during the Nov 2023 meeting if there is no further discussion. |
Re: the above INVALID case of GPS week time + offset: I don't see why this should be the case. In fact, receiving a file with GPS week time and no knowledge of the particular week is highly annoying, and I'd argue there is a strong case for making the offset mandatory and making GPS week time without an offset the invalid case. The 2 bytes for the time offset, interpreted as a week number, can cover a millennium. It would only be a matter of choosing where the most suitable zero is: the week of 1 Jan 1970, 6 Jan 1980, or wherever the +10^9 ends up.
While I like the idea of deprecating GPS Week Time and agree with all points, I think it's a lost cause for us to fully deprecate GPS Week Time in LAS 1.x at this point. We can review the language to ensure the preference for Offset GPS Time is as strong as possible. |
PR#140 has been merged into the 1.5 draft branch. |
In the current implementation, Adjusted Standard GPS Time is gradually losing precision. Currently it can store timestamps at a resolution of 0.0596 microseconds; in three years that degrades to 0.1192 microseconds.
Relevant discussion here: https://groups.google.com/forum/#!msg/lasroom/s3-OR4LP1IE/Br-PndbgCwAJ;context-place=forum/lasroom