
New feature: Geo-positioned AR Object #99

Open · rlazom opened this issue Dec 22, 2020 · 5 comments
Labels: enhancement (New feature or request)

Comments

@rlazom
rlazom commented Dec 22, 2020

The idea of this new feature is to place an object at a given set of geocoordinates and show it in the ARCoreView when the user points the camera towards that position.

It could also be convenient to take into account a minimum distance at which the object is shown.
Thanks
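
A minimal sketch of the distance check, in plain Dart with no plugin dependencies; the haversine formula and the 100 m default threshold are just illustrative choices, not part of this plugin:

```dart
import 'dart:math' as math;

/// Great-circle (haversine) distance in metres between two lat/long points.
double distanceMeters(double lat1, double lon1, double lat2, double lon2) {
  const r = 6371000.0; // mean Earth radius in metres
  const degToRad = math.pi / 180.0;
  final dLat = (lat2 - lat1) * degToRad;
  final dLon = (lon2 - lon1) * degToRad;
  final sLat = math.sin(dLat / 2);
  final sLon = math.sin(dLon / 2);
  final a = sLat * sLat +
      math.cos(lat1 * degToRad) * math.cos(lat2 * degToRad) * sLon * sLon;
  return 2 * r * math.asin(math.sqrt(a));
}

/// Show the AR object only once the device is within [showWithinMeters] of it.
bool shouldShowObject(
  double deviceLat,
  double deviceLon,
  double objectLat,
  double objectLon, {
  double showWithinMeters = 100.0,
}) {
  return distanceMeters(deviceLat, deviceLon, objectLat, objectLon) <=
      showWithinMeters;
}
```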

@RobinLbt

I've worked on it for a few days and managed to get something that works.
I need to do further testing though!

Here is how I've done it:

The ARCore coordinate system is different each time a session starts. This is the main challenge when working on geo-positioned AR objects.
Basically, we need to find true north at a starting point and place everything relative to it (a rough Dart sketch of these steps follows the list below).
  • Convert the GPS coordinates (lat/long) to Cartesian coordinates (look at UTM)
  • Calculate the coordinates in the local (ARCore) system using the difference between the device's UTM coordinates and the object's UTM coordinates
  • Calculate the rotation using the new position and the azimuth angle:
    The idea is to get the azimuth angle from north using the sensors or the flutter_compass package (not true north, but it seems to work)
  • Convert the X, Y, Z rotation you calculated before to a quaternion (because ARCore rotations use quaternions)
  • Finally, add the AnchorNode with the position and rotation calculated before to the scene
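
A minimal Dart sketch of the list above, under a few assumptions I should flag: a local tangent-plane approximation stands in for a full UTM conversion, headingDegrees is whatever azimuth your compass source reports (e.g. flutter_compass), ARCore's -Z axis is assumed to line up with that heading at calibration time, and the "face the device" yaw is just one possible choice of rotation. The quaternion comes from the vector_math package.

```dart
import 'dart:math' as math;
import 'package:vector_math/vector_math_64.dart';

/// Converts a geographic offset (device -> object) into an approximate
/// position in the ARCore session frame.
///
/// A local tangent-plane approximation is used instead of a full UTM
/// projection; it is fine for a few hundred metres. [headingDegrees] is the
/// compass azimuth (clockwise from north) the camera was facing when the
/// session/calibration happened.
Vector3 geoToArCorePosition({
  required double deviceLat,
  required double deviceLon,
  required double objectLat,
  required double objectLon,
  required double headingDegrees,
  double heightOffset = 0.0,
}) {
  const earthRadius = 6378137.0; // WGS84 equatorial radius in metres
  const degToRad = math.pi / 180.0;

  // Steps 1-2: geographic difference -> metres east/north of the device.
  final north = (objectLat - deviceLat) * degToRad * earthRadius;
  final east = (objectLon - deviceLon) *
      degToRad *
      earthRadius *
      math.cos(deviceLat * degToRad);

  // Step 3: rotate east/north into the session frame, assuming the session's
  // -Z axis is the horizontal direction the camera faced at calibration time.
  final h = headingDegrees * degToRad;
  final x = east * math.cos(h) - north * math.sin(h);
  final z = -(east * math.sin(h) + north * math.cos(h));

  return Vector3(x, heightOffset, z);
}

/// Step 4: express the rotation as a quaternion. Here a yaw-only rotation
/// around +Y that turns the node towards the device (one common choice).
Quaternion facingDeviceRotation(Vector3 localPosition) {
  final yaw = math.atan2(localPosition.x, localPosition.z);
  return Quaternion.axisAngle(Vector3(0.0, 1.0, 0.0), yaw);
}
```

Altitude is ignored here (the node sits at heightOffset relative to the AR origin), and magnetic declination would have to be corrected if true north matters.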

Tip: add the object when the user taps a plane, so the calibration is good before the object is added (see the usage sketch below).
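
A hedged usage sketch of the last step with arcore_flutter_plugin and flutter_compass, reusing the helpers from the previous sketch. The coordinates are made up, and I'm assuming ArCoreController.onPlaneTap, ArCoreNode(position:, rotation:), ArCoreSphere/ArCoreMaterial and FlutterCompass.events behave as in the plugins' README examples, with rotation read as quaternion components (x, y, z, w); please verify against the versions you use.

```dart
import 'package:arcore_flutter_plugin/arcore_flutter_plugin.dart';
import 'package:flutter/material.dart';
import 'package:flutter_compass/flutter_compass.dart';
import 'package:vector_math/vector_math_64.dart' as vector;

// Made-up coordinates for the demo; in a real app the device position
// would come from a location plugin (e.g. geolocator) at tap time.
const _objectLat = 48.8584, _objectLon = 2.2945;
const _deviceLat = 48.8590, _deviceLon = 2.2950;

void onArCoreViewCreated(ArCoreController controller) {
  // Place the node when the user taps a detected plane, so tracking is
  // already calibrated (the tip above). `hits` is unused here because the
  // node is placed from the geo math rather than the tap pose.
  controller.onPlaneTap = (List<ArCoreHitTestResult> hits) async {
    // Sample the compass heading at tap time (azimuth in degrees).
    final heading = (await FlutterCompass.events?.first)?.heading ?? 0.0;

    // Helpers from the previous sketch.
    final position = geoToArCorePosition(
      deviceLat: _deviceLat,
      deviceLon: _deviceLon,
      objectLat: _objectLat,
      objectLon: _objectLon,
      headingDegrees: heading,
    );
    final q = facingDeviceRotation(position);

    final node = ArCoreNode(
      shape: ArCoreSphere(
        radius: 0.2,
        materials: [ArCoreMaterial(color: Colors.red)],
      ),
      position: position,
      // Assumption: the plugin reads Vector4 as quaternion components.
      rotation: vector.Vector4(q.x, q.y, q.z, q.w),
    );
    controller.addArCoreNode(node);
  };
}
```

Sampling the heading at tap time (rather than at app start) keeps the compass reading and the tracked frame roughly in sync.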

@giandifra added the enhancement (New feature or request) label on Dec 23, 2020
@lFitzl

lFitzl commented Nov 8, 2021

Hi @RobinLbt, can you share some sample code?

@cosmopolit

Hello @RobinLbt, I'm also interested in a geo-positioned object view. I would like to use it as the basis for a team project at my university.

@mpapado3

I am also interested in the GPS implementation for a project that I am running.

@samatzp

samatzp commented Apr 19, 2022

Any solution, guys?
