
No longer getting GSV depth data #2362

Closed
misaugstad opened this issue Nov 12, 2020 · 6 comments
@misaugstad
Member

misaugstad commented Nov 12, 2020

As of about 3 days ago, it seems that we've started receiving 404 errors on any requests for depth data for GSV panoramas. This is happening on the main Sidewalk websites.

The end result on the main website is that we are unable to compute latitude and longitude when someone places a label, which means that labels are not showing up on maps and such.

The audit and validation interfaces seem to be functioning normally; everything that does not explicitly require lat/lngs is working. The practical impact is that new labels are not shown on maps and can't be included in nightly clustering.

If at some point we are able to get the depth data for the panos that have new labels on them, we should be able to write some custom code to fill in the lat/lng values in the table, since we save all the other info we would need to compute the lat/lng, provided we have associated depth data.

I have yet to find anyone talking about this or a potential fix online. Additionally, this endpoint is not part of GSV's official API, which means Google really could remove it at any time, and that is what I'm worried about right now.

@jonfroehlich I think the immediate question is whether we should make the audit page "closed for maintenance" for now. There is quite a bit of data coming in right now in both SPGG and Seattle. If we are able to acquire the depth data soon and backfill the tables, then I would hate to halt people's momentum by closing the audit page. But if we are never able to get the depth data back and have to completely rethink the audit interface, then we'll be wasting a lot of people's effort in the near future.

@misaugstad
Member Author

Here is an example link that should get us the depth data:
https://maps.google.com/cbk?output=json&cb_client=maps_sv&v=4&dm=1&pm=1&ph=1&hl=en&panoid=G-JGjkcSgQkSxCFqqoZv0w

And you can see that there is imagery available for this pano, because we can still successfully query for a piece of the imagery:
http://maps.google.com/cbk?output=tile&zoom=5&x=5&y=6&cb_client=maps_sv&fover=2&onerr=3&renderer=spherical&v=4&panoid=G-JGjkcSgQkSxCFqqoZv0w
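A minimal sketch (in Python; the `gsv_url` helper name is mine, not Project Sidewalk code) showing how the two requests above differ only in their query parameters, yet the `output=json` depth request now returns 404 while the `output=tile` request for the same pano still returns imagery:

```python
from urllib.parse import urlencode

def gsv_url(output, pano_id, **extra):
    """Build a maps.google.com/cbk URL for the given output type.
    Hypothetical helper; it just reproduces the URLs quoted above."""
    params = {"output": output, "cb_client": "maps_sv", "v": "4",
              "panoid": pano_id, **extra}
    return "https://maps.google.com/cbk?" + urlencode(params)

pano = "G-JGjkcSgQkSxCFqqoZv0w"

# Depth-data request: started returning 404 around Nov 2020.
depth_url = gsv_url("json", pano, dm=1, pm=1, ph=1, hl="en")

# Tile request for the same pano: still succeeds.
tile_url = gsv_url("tile", pano, zoom=5, x=5, y=6, fover=2,
                   onerr=3, renderer="spherical")
```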

@jonfroehlich
Member

jonfroehlich commented Nov 12, 2020

This is definitely not good but, perhaps, not as bad as it might seem (as I think we have some reasonable mitigating steps). As expressed over Slack, here are a few thoughts:

Estimating label position without depth data
We can still estimate the label's lat/lng by (in decreasing granularity): (1) taking the pano's lat/lng and projecting a fixed distance from the car (say, the average distance for that label type across all the data collected so far) along the user's POV heading; or (2) barring that, just setting the label's lat/lng to the pano's lat/lng (obviously the worst case: it ignores the POV heading and certainly doesn't account for labels placed far in the distance).
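Option (1) can be sketched as a standard destination-point calculation on a spherical Earth. This is a hedged sketch, not Project Sidewalk's actual code; the function name and the fixed-distance input are my own, and the spherical-Earth model is an approximation:

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius, spherical approximation

def estimate_label_latlng(pano_lat, pano_lng, heading_deg, distance_m):
    """Project a point distance_m meters from the pano location along
    heading_deg (degrees clockwise from north), on a spherical Earth."""
    lat1 = math.radians(pano_lat)
    lng1 = math.radians(pano_lng)
    brg = math.radians(heading_deg)
    d = distance_m / EARTH_RADIUS_M  # angular distance

    lat2 = math.asin(math.sin(lat1) * math.cos(d)
                     + math.cos(lat1) * math.sin(d) * math.cos(brg))
    lng2 = lng1 + math.atan2(math.sin(brg) * math.sin(d) * math.cos(lat1),
                             math.cos(d) - math.sin(lat1) * math.sin(lat2))
    return math.degrees(lat2), math.degrees(lng2)
```

With `distance_m = 0` this degenerates to option (2): the label simply inherits the pano's own lat/lng.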

Examining problem more thoroughly

  • How do we perform label type location inference backoff in the mini-map right now?
  • Is anyone else talking about this online? In a quick Google search (performed on my phone), there were a ton of blog posts and GitHub projects talking about using this "secret" method to obtain GSV depth data, along with lots of example projects, artsy visualizations, and actual GIS-based projects like ours (see the MIT Senseable City Lab's work). Did we check social media? Reddit? Discord? Twitter?
    • Update: we haven't seen anything on social media yet. Mikey did find this post on Stack Overflow, but it doesn't exactly fit our problem.
  • Related to the above, can you @misaugstad make some posts to Reddit (e.g., communities like https://www.reddit.com/r/gis/) and a post to Stack Overflow about this? You might also be able to find pre-existing questions about depth data on Stack Overflow that you can respond to.
  • At some point, we may also want to reach out to my contacts at Google about this.

Some example projects that use GSV depth data

Here's a detailed explanation of obtaining and using GSV depth data from Marco Cavallo at Univ. Illinois Chicago (now at Apple):

[image: Marco Cavallo's explanation of obtaining and using GSV depth data]

GSV itself relies on depth data
We know that GSV itself relies on depth data to, I believe, adapt the size and shape of the cursor and differentiate between "street" and "wall." See the animated gif below. Unfortunately, I don't have a video showing how this worked, say, last week before we identified this problem (so it's not easy to tell whether GSV itself has changed its interactivity).

[animated gif: GSV cursor adapting its size and shape using depth data]

@jonfroehlich
Member

As an update, our internal sources at Google are signaling that this feature is now retired due to infrastructural upgrades and will not be coming back—at least not in the near future.

@jonfroehlich
Member

jonfroehlich commented Nov 13, 2020

So, next steps:

  • We need to do a severity analysis to determine how and where this affects us. In a video meetup with @misaugstad, we concluded that the most immediate impact is, obviously, the precision of each label's lat/lng. This will affect clustering and our determination of whether multiple labels correspond to the same accessibility feature (e.g., consider labeling the same curb ramp from two different GSV car locations; our clustering code is typically able to determine that those labels correspond to the same thing).
  • We need to fix the core Project Sidewalk code so that even with no depth data, each label is given a lat,long based on an estimate (there is currently some code to do this but this is just to calculate mini-map position and doesn't make it into the metadata for the label).
  • Related to the above, @misaugstad did a quick assessment of the mini-map and thinks he can improve on the location estimate. Improving this location estimate will not just improve the mini-map but the actual lat,lng of the labels stored in the database, which is crucial.
  • And then we may need to analyze and rethink our clustering code.
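To illustrate why lat/lng precision matters for clustering: a simple distance-threshold check (a hypothetical sketch, not our actual clustering code; the 10 m threshold is made up) stops merging two labels for the same feature once their estimated positions drift farther apart than the threshold:

```python
import math

def haversine_m(lat1, lng1, lat2, lng2):
    """Great-circle distance in meters between two lat/lng points."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lng2 - lng1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def same_feature(label_a, label_b, threshold_m=10.0):
    """Heuristic: two (lat, lng) labels refer to the same feature if
    they fall within threshold_m meters of each other."""
    return haversine_m(*label_a, *label_b) <= threshold_m
```

If coarse position estimates add tens of meters of error, labels for the same curb ramp seen from two car positions can land outside any reasonable threshold, so the clustering parameters (or the approach itself) may need revisiting.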

@jonfroehlich
Member

Given #2372, perhaps we close this ticket and split it into appropriately smaller ones (I just made #2374, for example).

@misaugstad
Member Author

Agreed.
