It would be useful if HashTrack could restore the visible state of the map so that sharing the URL with someone else or bookmarking it for yourself would restore the map as it looked before.
Currently, the following are tracked:
- Map center and scale
- The list of visible layers
To restore the map to its previous visible state, at least the following would additionally need to be tracked:
- Layer z-order
- Layer opacity
- Drawings
- Query results
While all of this state may be technically trackable, tracking it all would encode a lot of data and produce exceptionally long URLs that in many instances would exceed the lengths browsers allow. Compressing the data could mitigate this somewhat, but it wouldn't remove the length limit and would make testing and manual inspection or editing of the URL significantly more painful, so it may not be worth the complexity. We will therefore not be able to reliably track everything and will need to be selective about what is and isn't tracked.
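For illustration, here is a minimal sketch of what selective tracking could look like: only the center, scale, and visible layer list are written to the URL hash, using standard browser APIs. The `TrackedState` shape and both function names are hypothetical, not the actual HashTrack implementation.

```typescript
// Hypothetical shape of the state we choose to track; the real
// HashTrack keys may differ.
interface TrackedState {
  x: number;        // map center x
  y: number;        // map center y
  z: number;        // scale / zoom level
  layers: string[]; // visible layer names
}

// Write only the selected state into the URL hash.
function writeHash(state: TrackedState): void {
  const params = new URLSearchParams({
    loc: `${state.z}/${state.x.toFixed(5)}/${state.y.toFixed(5)}`,
    layers: state.layers.join(','),
  });
  window.location.hash = params.toString();
}

// Restore the tracked state from the hash, if present.
function readHash(): TrackedState | null {
  const params = new URLSearchParams(window.location.hash.slice(1));
  const loc = params.get('loc');
  if (!loc) return null;
  const [z, x, y] = loc.split('/').map(Number);
  return {
    x, y, z,
    layers: (params.get('layers') ?? '').split(',').filter(Boolean),
  };
}
```

Extending this scheme to drawings or query results would mean embedding geometries or result sets in the hash, which is exactly where the URL-length problem comes from.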
Is there a simple example of what can be bookmarked that we can include in the demo? At one point we had a bookmarking example, but I think that was back in 2.8-2.9?
The list @klassenjs mentioned is what is currently bookmarkable. I do have a request to recreate the "bookmark" tool which will put the URL into a text field and offer to copy it to the clipboard.
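A sketch of what that tool might look like, assuming a plain DOM text field and button; the function name and wiring are hypothetical, though `navigator.clipboard.writeText` is the standard API:

```typescript
// Hypothetical "bookmark" tool: show the current URL (including the
// hash-tracked state) in a text field and copy it on demand.
function initBookmarkTool(input: HTMLInputElement, button: HTMLButtonElement): void {
  input.value = window.location.href;
  button.addEventListener('click', async () => {
    // navigator.clipboard requires a secure (https) context.
    await navigator.clipboard.writeText(input.value);
  });
}
```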