Please help me modify code to fit my sensor! #17

Closed
SlimeVRX opened this issue May 4, 2022 · 5 comments
SlimeVRX commented May 4, 2022

Hi!

I'm trying to reproduce the paper's results using lower-cost IMUs.

Since I am not tall enough (1.63 m) and have not edited the leg_length information, the estimate is not accurate.
I tried and tested a lot to get the first results: Video Link

IMUs: I'm using six Hi229 9-DOF sensors from HiPNUC (datasheet)
Frequency: 50 Hz (recorded at 100 Hz, step=2)

I recorded the data (data_align, data_T_Pose) and modified live_hi229.py based on live_demo.py:

  • Place the IMUs (x = Left, y = Up, z = Forward)
    • record data_align, 10 s long: 500 frames
    • data_align[50:300]: align
  • Stand straight in T-pose
    • record data_T_Pose, 60 s long: 2999 frames
    • data_T_Pose[50:300]: T-pose
  • Demo
    • data_T_Pose[500:2980]: action (a slicing sketch follows this list)
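A minimal sketch of this slicing, assuming each recording is saved as a numpy array of shape [frames, 6, ...] (the .npy file names here are hypothetical placeholders):

import numpy as np

# Assumption: each recording is a numpy array of shape [frames, 6, ...];
# the file names are hypothetical.
data_align = np.load('data_align.npy')    # ~500 frames (10 s at 50 Hz)
data_t_pose = np.load('data_T_Pose.npy')  # ~2999 frames (60 s at 50 Hz)

align_frames = data_align[50:300]       # frames used for IMU alignment
t_pose_frames = data_t_pose[50:300]     # frames used for T-pose calibration
action_frames = data_t_pose[500:2980]   # frames fed to the live demo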

The acceleration read from the Noitom sensors is global. But in my data the acceleration is local, and the quaternion is local.
I don't know how to convert the local acceleration to global, so the local acceleration is passed to the model. This could be the reason the body estimate is not accurate.

Can you help me revise the code to fit my data? (live_hi229.py)
If you are too busy, read through the code once and give me some advice. Thanks a lot!

Xinyu-Yi (Owner) commented May 5, 2022

Hi!

I have several questions. Could you find out these things:

  1. Which coordinate frame is the sensor "quaternion" expressed in? Is it the global inertial frame (shared by all IMUs)?

  2. Which coordinate frame is the sensor acceleration expressed in? Is it the sensor frame (rotated together with the sensor)?

  3. What is the unit of the acceleration? Is it g or m/s^2? Is the acceleration equal to g when the sensor is still (or equal to 0)?

If you can find out these things, I can help you with your code.

p.s. If you speak Chinese, you can send your wechat to me.

Xinyu-Yi (Owner) commented May 5, 2022

To transform the acceleration from the sensor frame (local) to the inertial frame (global), just multiply the sensor rotation matrix R by the acceleration column vector a (i.e., compute Ra). Maybe you can try this.
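A minimal sketch of this conversion, assuming the sensor reports a unit quaternion in (w, x, y, z) order (the Hi229's quaternion ordering should be checked against its datasheet):

import numpy as np
from scipy.spatial.transform import Rotation

def local_to_global_acc(quat_wxyz, acc_local):
    # Rotate a sensor-frame (local) acceleration into the global frame: a_global = R @ a_local.
    # quat_wxyz: unit quaternion in (w, x, y, z) order (assumption -- adjust to your sensor).
    # acc_local: acceleration measured in the sensor frame, shape (3,).
    w, x, y, z = quat_wxyz
    R = Rotation.from_quat([x, y, z, w])  # scipy expects (x, y, z, w) order
    return R.apply(acc_local)             # same as R.as_matrix() @ acc_local

# Example: a sensor at identity orientation measuring 9.81 m/s^2 on its local z axis
# keeps that reading in the global frame.
acc_global = local_to_global_acc([1.0, 0.0, 0.0, 0.0], np.array([0.0, 0.0, 9.81]))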

SlimeVRX (Author) commented May 5, 2022

Hi!

I have several questions. Could you find out these things:

  1. Which coordinate frame is the sensor "quaternion" expressed in? Is it the global inertial frame (shared by all IMUs)?
  2. Which coordinate frame is the sensor acceleration expressed in? Is it the sensor frame (rotated together with the sensor)?
  3. What is the unit of the acceleration? Is it g or m/s^2? Is the acceleration equal to g when the sensor is still (or equal to 0)?

I set the sensors to 9-DOF AHRS mode. I record the data of each sensor individually, without combining the sensors together.

  1. The quaternion is local and not shared by all IMUs.
  2. The acceleration is expressed in the sensor frame (rotated together with the sensor).
  3. The unit of the acceleration is m/s^2. The raw acceleration is in mG, but I converted it to m/s^2 using the factor 9.81/2^10. The acceleration values when the sensor is still are shown in the figure below (a small conversion sketch follows it).

[Figure: acceleration readings while the sensors are still]
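A minimal sketch of that unit conversion (the 9.81/2^10 factor is the one stated above; whether the mG divisor should be 2^10 = 1024 or 1000 is worth checking against the Hi229 datasheet):

import numpy as np

def mg_to_ms2(acc_mg):
    # Convert raw acceleration from mG to m/s^2 using the factor stated above.
    return np.asarray(acc_mg, dtype=float) * 9.81 / 2**10  # 2**10 = 1024

# Example: a still sensor reading about 1024 mG on one axis maps to about 9.81 m/s^2.
print(mg_to_ms2([0, 0, 1024]))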

Xinyu-Yi (Owner) commented May 6, 2022

You just need to convert the acceleration to global, then check whether your code works.

You can debug like this:

  1. Put all 6 IMUs on the desk and keep them still.
  2. Run your live demo (you do not need to wear or align the IMUs) and print the calibrated measurements sent to the network. You should get identity rotation matrices and nearly-zero accelerations. Then throw a sensor into the air (if you can do this), and you will get a g acceleration on the y axis. (A stillness-check sketch follows this list.)
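A minimal sketch of such a stillness check, assuming the calibrated measurements arrive as numpy arrays ori of shape [6, 3, 3] and acc of shape [6, 3] (the names and tolerances are assumptions):

import numpy as np

def check_still_measurements(ori, acc, rot_tol=0.05, acc_tol=0.5):
    # Sanity-check calibrated measurements while all 6 IMUs lie still on a desk.
    # ori: rotation matrices, shape [6, 3, 3] -- each should be close to identity.
    # acc: accelerations, shape [6, 3] -- each should be close to zero.
    ori = np.asarray(ori)
    acc = np.asarray(acc)
    rot_err = np.abs(ori - np.eye(3)).max(axis=(1, 2))  # per-IMU deviation from identity
    acc_mag = np.linalg.norm(acc, axis=1)               # per-IMU acceleration magnitude
    for i, (re, am) in enumerate(zip(rot_err, acc_mag)):
        ok = re < rot_tol and am < acc_tol
        print(f"IMU {i}: rot_err={re:.3f}, |acc|={am:.3f} m/s^2 -> {'OK' if ok else 'CHECK'}")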

SlimeVRX (Author) commented May 6, 2022

I followed your instructions to debug:

  1. Put all 6 IMUs on the desk and kept them still. Recorded the data (Stand).
  2. Ran live_debug.py. I get the rotation matrix values and nearly-zero accelerations, shown below.

[Figure: printed Acc values, shape [6, 3] (nearly zero)]
[Figure: printed Ori values, shape [6, 3, 3] (one 3x3 rotation matrix per IMU)]

SlimeVRX closed this as completed May 6, 2022