
Question regarding coordinate systems and determining the yaw angle from bearing #42

Closed
stanathong opened this issue Oct 26, 2015 · 30 comments


@stanathong

Hi, I would really appreciate your help in answering my questions below. I have been struggling with this for over three weeks now.

(1) I have 5 cameras mounted on a van with a GPS receiver (only 3 are used in the testing with CamOdoCal). I obtained the position (lat/lon) and bearing angle from the GPS. As I don't have an IMU, I computed the yaw angle, which gives the orientation around the Z axis.

   double yaw_degree = -(bearing_degree_from_GPS - 90.0);

Here, I assume the coordinate system used in CamOdoCal is ENU, with the vehicle's positive x-axis pointing in the direction of movement and the positive y-axis pointing to the vehicle's left. I wonder if my assumption is right; please help me confirm.

In the test to perform extrinsic calibration for the multi-camera rig, we made only yaw motions, ensuring that the ground was flat (no pitch or roll).

(2) I converted those yaw angles to a rotation matrix and then a quaternion, and input them into CamOdoCal in the form:
[timestamp] CAM [camera_id] [image filename]
[timestamp] IMU [quaternion x] [quaternion y] [quaternion z] [quaternion w]
[timestamp] GPS [latitude] [longitude] [alt]
and, despite changing some settings such as the detector and descriptor, I still couldn't get CamOdoCal to run past stage 3 of CameraRigBA::Run.

[PASSED] stage 1 - triangulate 3D points with feature correspondences from mono VO and run BA
[PASSED ] stage 2 - run robust pose graph SLAM and find inlier 2D-3D correspondences from loop closures
[PROGRAM STOPPED HERE] stage 3 - find local inter-camera 3D-3D correspondences
with the following log messages (no crash):

Could you please suggest what I should do to make it work with my system? Your help is much appreciated. Thank you very much.
.......

Ceres Solver Report: Iterations: 31, Initial cost: 2.702882e+01, Final cost: 1.206417e+01, Termination: FUNCTION_TOLERANCE.
# INFO: After refinement:
H_cam_odo =
0.693116 0.168301 0.700903 -0.100646
-0.144071 0.985086 -0.0940688 0.325688
-0.706282 -0.0357792 0.707026 0
0 0 0 1
scales =
-0.0966268
# INFO: Calibrating odometry - camera 2...
Rotation:
0.693116 0.168301 0.700903
-0.144071 0.985086 -0.0940688
-0.706282 -0.0357792 0.707026
Translation:
-0.100646 0.325688 0
Ceres Solver Report: Iterations: 25, Initial cost: 1.799722e+01, Final cost: 1.064531e+01, Termination: FUNCTION_TOLERANCE.
# INFO: After refinement:
H_cam_odo =
0.713364 -0.280516 -0.642202 -0.113423
0.0840694 0.944029 -0.31897 0.294703
0.695733 0.173552 0.697019 0
0 0 0 1
scales =
0.0699959
# INFO: Calibrating odometry - camera 1...
Rotation:
0.713364 -0.280516 -0.642202
0.0840694 0.944029 -0.31897
0.695733 0.173552 0.697019
Translation:
-0.113423 0.294703 0
Ceres Solver Report: Iterations: 75, Initial cost: 2.484698e+01, Final cost: 1.217647e+01, Termination: FUNCTION_TOLERANCE.
# INFO: After refinement:
H_cam_odo =
0.0996435 -0.98357 0.150533 -0.111327
0.710327 -0.0356262 -0.702969 0.349939
0.696783 0.176974 0.695107 0
0 0 0 1
scales =
0.0813187
# INFO: Calibrating odometry - camera 0...
Rotation:
0.0996435 -0.98357 0.150533
0.710327 -0.0356262 -0.702969
0.696783 0.176974 0.695107
Translation:
-0.111327 0.349939 0
# INFO: Reprojection error for camera 2: avg = 0.516288 px | max = 801.914 px
# INFO: Reprojection error for camera 0: avg = 0.565628 px | max = 469.029 px
# INFO: Reprojection error for camera 1: avg = 0.304846 px | max = 84.1896 px
# INFO: Completed camera-odometry calibration for all cameras.
# INFO: Saving intermediate data...
Done. Took 24.46s.
# INFO: Running camera rig calibration.
# INFO: # segments = 1
# INFO: Segment 0: # frame sets = 59
# INFO: Reprojection error: avg = 0.45 px | max = 493.62 px | # obs = 389194
# INFO: Triangulating feature correspondences...
# INFO: Reprojection error after triangulation: avg = 9400363031163346944.00 px | max = 1178337959001900481249280.00 px | # obs = 269012
# INFO: # 3D scene points: 101400
# INFO: Checking the validity of the graph...
# INFO: Finished checking the validity of the graph.
# INFO: Reprojection error after pruning: avg = 632419244318232832.00 px | max = 146216321810127040544768.00 px | # obs = 245574
# INFO: # 3D scene points: 97063
# INFO: Running BA on odometry data...
# INFO: Done.
# INFO: Reprojection error after BA (odometry): avg = 45.00 px | max = 10016.56 px | # obs = 233509
# INFO: # 3D scene points: 92924
# INFO: Running robust pose graph optimization...
# INFO: Building odometry edges...
# INFO: Built 58 odometry edges.
# INFO: Building loop closure edges...
# INFO: Built 0 loop closure edges.
Ceres Solver Report: Iterations: 0, Initial cost: 2.572128e-31, Final cost: 2.572128e-31, Termination: PARAMETER_TOLERANCE.
# INFO: # inlier 2D-3D correspondences: 0
# INFO: Merged 0 3D scene points.
# INFO: Running BA on odometry data...
# INFO: Reprojection error after robust pose-graph optimization: avg = 43.25 px | max = 4145.38 px | # obs = 229033
# INFO: # 3D scene points: 90947
# INFO: Finding inter-map 3D-3D correspondences...
# INFO: Entering frame set node 0-0 [ 1 0 1 ]

@hengli
Owner

hengli commented Oct 26, 2015

Hi,

From this line: # INFO: Reprojection error after triangulation: avg = 9400363031163346944.00 px | max = 1178337959001900481249280.00 px | # obs = 269012
and assuming that you are inputting correct latitude and longitude values, it most likely seems that you are not entering the correct attitude quaternion.

The correct attitude quaternion should correspond to a yaw around the
z-axis starting from north. A yaw of zero means that the vehicle is facing
north, while a yaw of M_PI / 2.0 means that the vehicle is facing west.
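A minimal sketch of this convention (the function name here is mine, not part of CamOdoCal; GPS bearing is assumed to be measured clockwise from north):

```python
import math

def bearing_to_attitude_quaternion(bearing_deg):
    # Bearing is measured clockwise from north; the yaw described above is
    # measured counter-clockwise from north (0 = north, pi/2 = west), so negate.
    yaw = -math.radians(bearing_deg)
    # Rotation about the z-axis only (roll = pitch = 0), quaternion as (x, y, z, w).
    return (0.0, 0.0, math.sin(yaw / 2.0), math.cos(yaw / 2.0))

# Facing west (bearing 270 deg) should correspond to a yaw of +pi/2.
qx, qy, qz, qw = bearing_to_attitude_quaternion(270.0)
```

Note that a quaternion and its negation describe the same rotation, so the sign of (z, w) may flip depending on the bearing.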

  • Lionel


@stanathong
Author

@hengli
Thanks for getting back to me. It's clearer now, and I have recalculated my yaw angle. However, the results I got from "# INFO: Reprojection error after triangulation" are still large, although I see a large number of matches within the image sequence. I'm writing to confirm my understanding in case I made a mistake; please let me know. Or what else should I check? It always stopped in the middle of Stage 3.

Here is what I got from my GPS.
Image 1: Timestamp: 16, Lon:-1.04630424605523, Lat: 53.9488094156805, and Bearing 134
Image 2: Timestamp: 78, Lon:-1.04630291107824, Lat: 53.9488086513807, and Bearing 134

I have cross-checked, and the bearing from my GPS ranges over [0, 359], is measured from North, and increases toward the east.

[image attached]

Here, I compute my yaw angle as
double yaw_degree = fmod(360.0 - bearing_degree, 360.0);

which, for Image 1, gives yaw_degree = 226 degrees.
So the quaternion (x, y, z, w), with roll and pitch = 0, is (0.0, 0.0, 0.92050485345244, -0.390731128489274).
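As a sanity check, the conversion above can be reproduced in a few lines (a sketch; yaw_to_quaternion is a hypothetical helper name, not CamOdoCal code):

```python
import math

def yaw_to_quaternion(bearing_deg):
    # Yaw in degrees, counter-clockwise from north, as computed above.
    yaw = math.radians(math.fmod(360.0 - bearing_deg, 360.0))
    # Quaternion (x, y, z, w) for a rotation about z only (roll = pitch = 0).
    return (0.0, 0.0, math.sin(yaw / 2.0), math.cos(yaw / 2.0))

# Image 1: bearing 134 deg -> yaw 226 deg.
qx, qy, qz, qw = yaw_to_quaternion(134.0)
# qz, qw should match the values quoted above.
```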

Here is my input Event.dat for CamOdoCal. Please let me know if anything is wrong on my side. Or would it be possible for me to see part of your CamOdoCal input (Event.dat) file? If you are not comfortable with that, it's fine. Thank you very much for your help.

16 CAM 0 Front_1.jpg
16 CAM 1 Left_1.jpg
16 CAM 2 Rear_1.jpg
16 IMU 0 0 0.92050485345244 -0.390731128489274
16 GPS 53.9488094156805 -1.04630424605523 0
78 CAM 0 Front_2.jpg
78 CAM 1 Left_2.jpg
78 CAM 2 Rear_2.jpg
78 IMU 0 0 0.92050485345244 -0.390731128489274
78 GPS 53.9488086513807 -1.04630291107824 0

@Hu20130201
Are you talking about my data? I can show you my full Event.dat as required for CamOdoCal, but my system only has a GPS receiver, so I only have the yaw angle. However, I still cannot make it run either. My email address is on my GitHub page if you want to contact me. Thanks.

@hengli
Owner

hengli commented Oct 28, 2015

Hi,

Can you provide a plot of the GPS positions and bearing angles so that I
can have a rough idea of how good the GPS data is?

Also, is the GPS data time-synchronized to the camera data?

Thanks,
Lionel


@stanathong
Author

@hengli
Hi Lionel,

Thank you for your response. Here is the plot of the GPS data input to CamOdoCal.

[image: gps_plot]

Plots of the bearing angle and computed yaw angle are shown here. The bearing angle obtained from my GPS is measured from North (0 is toward North, 90 is toward east). So I determined my yaw angle as

double yaw_degree = 360.0 - bearing_degree

[image: bearing_from_northing2_edited]

More plots for the yaw angle:

[image: plot_yaw_angle]

Also, is the GPS data time-synchronized to the camera data?
Answer: Yes, it is. On the same wheel trigger, images are taken and GPS is recorded at the same time.

Could you please advise whether anything might be wrong with my data, i.e. the yaw angle or the trajectory of the collected data? Thank you very much for your help.

@hengli
Owner

hengli commented Nov 7, 2015

The trajectory and bearing plots look OK to me, though I have never tested the calibration on GPS data before; just odometry data and GPS/INS data. The calibration works very well with locally smooth poses; regular pose jumps will give rise to high reprojection errors. However, with GPS data, the overall reprojection errors shouldn't be too high, but here we are looking at very large errors. I suspect it's an issue with the data that is being fed into the calibration.

One thing I could recommend to aid debugging is to triangulate the feature correspondences between 2 frames (>= 0.3 m apart) using GPS data, and see what the reprojection errors look like.
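The debugging step Lionel suggests can be sketched with a plain two-view DLT triangulation; the intrinsics, poses, and point below are synthetic placeholders, not from the dataset:

```python
import numpy as np

def triangulate_dlt(P1, P2, x1, x2):
    # Linear (DLT) triangulation of one point from two 3x4 projection matrices.
    A = np.array([x1[0] * P1[2] - P1[0],
                  x1[1] * P1[2] - P1[1],
                  x2[0] * P2[2] - P2[0],
                  x2[1] * P2[2] - P2[1]])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X / X[3]

# Synthetic setup: two camera poses >= 0.3 m apart (here 0.5 m along x).
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-0.5], [0.0], [0.0]])])

# Project a known 3D point into both views, then triangulate it back.
X_true = np.array([1.0, 0.5, 5.0, 1.0])
x1 = P1 @ X_true; x1 = x1[:2] / x1[2]
x2 = P2 @ X_true; x2 = x2[:2] / x2[2]

X_est = triangulate_dlt(P1, P2, x1, x2)
reproj = P1 @ X_est
err = np.linalg.norm(reproj[:2] / reproj[2] - x1)  # reprojection error in px
```

With consistent poses the error is near zero; with the real GPS poses, a large error at this step would point to the pose input rather than the optimizer.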

  • Lionel


@Hu20130201

@hengli
Hi,
Here I am confused about the data format. The system consists of GPS/INS. In my data, I provided (roll, pitch, yaw, lat/lon); is that right? Also, I want to know about the vocabulary tree. Do I need to add code to detect features? Thank you very much!

@stanathong
Author

@hengli
Hi Lionel,

Thanks very much. I'm not sure if that's the case. Using a dataset from one camera, the program could finish its processing without a problem. I'm setting up a clean Ubuntu machine to run CamOdoCal on. The reason the program broke at Stage 3 is its use of SURF GPU, while I have no GPU on my VirtualBox. I will keep you posted once I have finished. Thanks.

Cheers,
Supannee

@stanathong
Author

@Hu20130201

Hi, please allow me to answer your question. CamOdoCal is very easy to use. What you have to do is:

  1. Run intrinsic_calib to get the camera parameters, stored in the file [camera-name]_camera_calib.yaml, which will be used later in the extrinsic calibration step.
  2. Run extrinsic_calib; as mentioned in the build instructions on the main page of this GitHub repo, you have to copy data/vocabulary/surf64.yml.gz into the same directory as extrinsic_calib.

The important thing is to make sure that you put everything in the correct format and folder structure. Here is an example of my data structure:

[image: structure2]

The GPS/INS data should be formatted correctly; please refer to my previous question here:
#30

I hope this helps.

@Hu20130201

@stanathong Is the image data a set of continuous frames, not keyframes?
Thank you

@stanathong
Author

It's a sequence of images ordered by time. Thanks.


@ethcamera

Hi Stanathong,

Would it be possible to share your data for running CamOdoCal? Dropbox is fine. Thanks a lot.

@stanathong
Author

@ethcamera
Hi, I have my data shared on my Google Drive. If you still want to try it, could you let me know your email address so that I can share it with you? I believe you know how to run the program with my data; if not, please let me know. Thanks very much.

@ethcamera

Hi Stanathong,

May I know how to access your Google Drive?
My email is ethcameras@gmail.com.
Could you also help write some instructions on how to run the program with the data?

Thank you a lot!

@ahundt
Collaborator

ahundt commented Jan 5, 2016

@ethcamera if you write instructions, please submit a pull request! I will be happy to merge it.

@stanathong
Author

I think my question has been answered, so I'm closing this issue.

@YenJuChou0305

@stanathong
Hi, can I try your dataset from your Google Drive?
Also, can I ask you how to set up the instructions?
My email is beargg0305@gmail.com.

Thank you so much for considering my issue.

@Unlingius

Unlingius commented Aug 14, 2018

Hi! Thank you for this repo! I do not understand my extrinsic calibration results. Can you help me, please? I have only one front camera, looking exactly forward, and I use a synthetic dataset. I suppose the resulting extrinsic matrix should be almost the identity, because yaw, pitch, and roll are all zero. But the result is different. What should the extrinsic matrix be in my case? Not the identity?

@yuyadanyadan

@Unlingius my result is also not good. But when I use the simulation data, I can get the exact result! Did you solve your problem? If you can help me, I'd be very grateful! Thank you.

@Unlingius

@yuyadanyadan I tried executing several times on the same simulation dataset and got angles within +- 1 degree. But the main problem is understanding the coordinate systems. I looked at the source:
file CamRigOdoCalibration.cc, lines 142-155
file CamOdoThread.cc, lines 378-393
It seems the body coordinate system is: x right, y back, z down, and the optical system is: z forward, x right, y down. The result is the rotation from body to optical. I'm not sure about all that. What angle accuracy did you get on the simulation datasets?
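If that reading of the source is correct (it is the commenter's interpretation, not confirmed by the author), the fixed axis permutation between the two frames can be written down and sanity-checked:

```python
import numpy as np

# Assumed body frame: x right, y back, z down.
# Assumed optical frame: x right, y down, z forward.
# Rows of R are the optical axes expressed in body coordinates,
# so R maps body-frame vectors into optical-frame coordinates.
R_body_to_optical = np.array([[1.0, 0.0, 0.0],    # optical x = body x (right)
                              [0.0, 0.0, 1.0],    # optical y = body z (down)
                              [0.0, -1.0, 0.0]])  # optical z = -body y (forward)

# In this body frame, "forward" is -y; it should map to the optical +z axis.
forward_body = np.array([0.0, -1.0, 0.0])
print(R_body_to_optical @ forward_body)  # forward maps to optical +z
```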

@yuyadanyadan

Hi Unlingius,
I get roll and pitch below 1 degree, but the yaw differs by 12 degrees from the exact result. And x and y are not very exact, about 3~4 cm off. I only use CamOdoCalibration.cc to calibrate the camera and odometry. Are your results exact?

@MohrMa2

MohrMa2 commented Dec 14, 2018

@stanathong I would be interested in your google drive link as well

@ManyuChang

ManyuChang commented Sep 16, 2019

@stanathong
Hi, can I try your dataset from your Google Drive?
My e-mail is [ changmanyu@stu.xmu.edu.cn ]
Thanks!

@wk199

wk199 commented Apr 29, 2020

@stanathong @yuyadanyadan .
Hi, I'm using this project, but I have many problems and don't know the dataset format... Can you share your data? My e-mail is 1019220367@qq.com.
Thanks!

@Chatoyant19


Hi,if you got the data, can you share it with me? Thank you very much!
My email is 634866510@qq.com

@kunnalparihar

@stanathong @yuyadanyadan @Chatoyant19 @wk199
Hi, if you got the data, can you share it with me? Thank you very much!
My email is kunalsinghparihar.17@gmail.com

@jd110145derek

@stanathong @yuyadanyadan @Chatoyant19 @wk199 @kunnalparihar

Hi, if you got the data, can you also share it with me? Thank you very much!
My email is jd110145derek@gmail.com

@chunxiaoqiao

chunxiaoqiao commented May 17, 2021

@stanathong
Hi, can I try your dataset from your Google Drive?
My e-mail is [chxqiao@gmail.com ]
Thanks!

@ynma-hanvo

@stanathong
Hi, can you share the data with me? My email is 017@hanvo.com.
Thanks!

@wxz1996

wxz1996 commented Feb 22, 2022

@stanathong
Hi, can you share the data with me? My email is 156220821@qq.com. Thank you very much.

@bereze

bereze commented Nov 14, 2022

@stanathong
Hi, can I try your dataset from your Google Drive?
My e-mail is luomh666@qq.com
Thanks!
