
Sensor Fusion Linear Acceleration


Tilt vs Linear Acceleration

An accelerometer can measure the static gravitation field of earth (like a tilt sensor) or it can measure linear acceleration (like accelerating in a vehicle), but it cannot measure both at the same time. When talking about linear acceleration in reference to an acceleration sensor, what we really mean is Linear Acceleration = Measured Acceleration - Gravity (so we can determine the actual acceleration of the device no matter how the device is oriented/tilted). The hard part is determining what part of the signal is gravity.

The Underlying Problem

It is difficult to separate the gravity component of the signal from the linear acceleration. Some Android devices implement Sensor.TYPE_LINEAR_ACCELERATION and Sensor.TYPE_GRAVITY, which perform the calculations for you. Most of these devices are new, high-end and equipped with a gyroscope. It is also worth noting that, on many of the devices I have tested, Sensor.TYPE_LINEAR_ACCELERATION does not actually work very well while the device is accelerating. This is because the acceleration sensor is often used to compensate for the drift of the gyroscope, which creates a feedback loop between the two sensors. When the device is accelerating, the drift-compensation algorithm assumes the device is tilted rather than accelerating and begins to correct the gyroscope erroneously. This effect can be observed with this project.
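If a device does implement these composite sensors, they are requested like any other sensor. A minimal sketch (assuming this runs inside an Activity and that listener is a SensorEventListener you have defined elsewhere):

SensorManager sensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);

// Either call may return null on devices that do not implement the sensor.
Sensor linearAcceleration = sensorManager.getDefaultSensor(Sensor.TYPE_LINEAR_ACCELERATION);
Sensor gravity = sensorManager.getDefaultSensor(Sensor.TYPE_GRAVITY);

if (linearAcceleration != null) {
    // event.values[0..2] arrive with gravity already removed by the vendor's fusion.
    sensorManager.registerListener(listener, linearAcceleration, SensorManager.SENSOR_DELAY_GAME);
}
if (gravity != null) {
    sensorManager.registerListener(listener, gravity, SensorManager.SENSOR_DELAY_GAME);
}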

If you have an older device that does not have a gyroscope, you are going to face some limitations with Sensor.TYPE_ACCELEROMETER. The tilt of the device can only be measured accurately assuming the device is not experiencing any linear acceleration. The linear acceleration can only be measured accurately if the tilt of the device is known. The approach taken with Cardan Linear Acceleration is to use Sensor.TYPE_ACCELEROMETER in conjunction with Sensor.TYPE_MAGNETIC_FIELD to determine the linear acceleration. This has some advantages over using a gyroscope in that almost all Android devices have acceleration and magnetic sensors (while many do not have gyroscopes) and it does not rely on circular feedback via a complementary filter. The acceleration and magnetic sensor fusion also has an advantage over a low-pass filter in that there is no lag in the tilt compensation.

There are some limitations to the magnetic sensor which make the algorithm, overall, a poor choice for determining linear acceleration. An accelerometer alone will tell you pitch and roll, but not yaw, because gravity is parallel to yaw (z); a magnetometer alone will tell you roll and yaw, but not pitch, because the Earth's magnetic field is parallel to pitch (x). Note that the Earth's magnetic field is only horizontal to the Earth's surface at the equator. In North America and the UK, the Earth's magnetic field may be inclined between 50 and 70 degrees to the horizontal. As a result, pitch, roll and yaw will all be erroneous. The orientation may only be considered correct around the two perpendicular rotational axes that are orthogonal to the true direction of the Earth's magnetic field. Thus, the magnetic sensor cannot be used to determine the pitch and roll of the device, only the yaw, and we will not use the yaw to perform any of our calculations. It is only because the Android API requires both the magnetic and acceleration sensors to determine the orientation that the magnetic sensor is used (which is still a sensor fusion), but we will only use the accelerometer to determine the gravity components of the acceleration of the device. There are simpler methods of doing this, one of which is discussed in Simple Linear Acceleration. However, this article does give an explanation of the coordinate systems used, and demonstrates how to determine the gravity components of the acceleration once the orientation of the device is known.
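To make the acceleration/magnetic fusion concrete, here is a minimal sketch of producing a rotation matrix from the two sensors (accelerationValues and magneticValues are assumed to be float[3] copies of the latest accelerometer and magnetic field events; this is not code from the project itself):

float[] rotationMatrix = new float[9];

// Fuse the accelerometer and magnetometer readings into a rotation matrix.
// getRotationMatrix() returns false when the readings are unusable,
// for example while the device is in free fall.
boolean haveOrientation = SensorManager.getRotationMatrix(rotationMatrix, null,
        accelerationValues, magneticValues);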

Gyroscopes

Most gyroscopes on Android devices are vibrational and measure the rotation of a device with a pair of vibrating arms that take advantage of the Coriolis effect: when the device rotates, the direction of the vibrating arms is deflected, and by measuring that deflection an estimation of the rotation can be produced. The gyroscope is one of the three sensors that are always hardware-based on Android devices (the other two are the magnetic and acceleration sensors). In conjunction with the acceleration sensor, the gyroscope can be used to create other sensors like gravity, linear acceleration or rotation sensors. These sensors are all useful for detecting the movement of the device, which can either be a result of user input (moving the device to control a character in a game) or of the external physical environment (like the movement of a car). They can also be used indirectly to determine the position of a device, like tilt compensation on the magnetic sensor for a compass.

Gyroscopes Drift Over Time

Like all sensors, a gyroscope is not perfect and has small errors in each measurement. Since the measurements from a gyroscope are integrated over time, these small errors start to add up and result in what is known as drift. Over time, the results of the integration can become unreliable and some form of compensation is required to correct for the drift. This requires another sensor to provide a second measurement of the device's orientation that can then be used to pull the gyroscope's integration back towards the actual rotation of the device. This second sensor is usually an acceleration or magnetic sensor, or sometimes both. A weighted average, a Kalman filter or a complementary filter are common ways of fusing other sensors with the gyroscope, each with their own advantages and disadvantages. When you really get down into the implementations, you also run into real limitations with the "support" sensors as well. For instance, an acceleration sensor cannot determine the difference between the tilt of the device and linear acceleration, which makes for a vicious circular reference when trying to implement a linear acceleration sensor. In fact, the Android Sensor.TYPE_LINEAR_ACCELERATION is terrible at measuring linear acceleration under the influence of a physical environment such as the acceleration of a car because of the circular reference. The magnetic sensor is another option, but it is limited by the effects of hard and soft iron offsets and it can only measure roll and yaw, so it isn't perfect, either. It can take a lot of effort, fine tuning and possibly multiple sensor fusions and calibrations to get reliable estimations.
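To illustrate the idea of drift compensation, here is a minimal single-angle complementary filter sketch; the coefficient and names are illustrative and not taken from this project:

// A filter coefficient close to 1 trusts the integrated gyroscope for fast
// changes, while the accelerometer/magnetometer orientation slowly pulls
// the estimate back toward the true orientation, compensating for drift.
private static final float FILTER_COEFFICIENT = 0.98f;
private float fusedAngle = 0;

public void onNewSample(float gyroRate, float accelMagAngle, float dt) {
    fusedAngle = FILTER_COEFFICIENT * (fusedAngle + gyroRate * dt)
            + (1 - FILTER_COEFFICIENT) * accelMagAngle;
}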

An Aside into Coordinate Systems

There are a number of coordinate systems to be aware of when developing for Android devices, and they need to be covered before going further.

World Coordinate System

The world coordinate system in Android is the ENU (east, north, up) coordinate system. This is different from the NED (north, east, down) coordinate system that is commonly used in aviation.
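The difference is only a relabeling of the axes. As a small illustrative helper (not part of the project), a vector in ENU can be re-expressed in NED by swapping the first two components and negating the third:

// Convert an (east, north, up) vector into (north, east, down).
private float[] enuToNed(float[] enu) {
    return new float[] { enu[1], enu[0], -enu[2] };
}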

Local Coordinate System

The local coordinate system describes the coordinate system of the device. For most sensors, the coordinate system is defined relative to the device's screen when the device is held in its default orientation. When a device is held in its default orientation, the X axis is horizontal and points to the right, the Y axis is vertical and points up, and the Z axis points toward the outside of the screen face. In this system, coordinates behind the screen have negative Z values. The most important point to understand about this coordinate system is that the axes are not swapped when the device's screen orientation changes; the sensor's coordinate system never changes as the device moves.

Other Coordinate Systems

The method SensorManager.getOrientation(), which is commonly used to get the orientation vector of the device from a rotation matrix, uses a third reference coordinate system that is not the world coordinate system. A WND (west, north, down) coordinate system is used, which is different from both the ENU and NED coordinate systems that are more common. Also worth noting is that the order of the axes in the values returned by the method (Z, X, Y) is different from the order used by the sensors (X, Y, Z).

When SensorManager.getOrientation() returns, the values array is filled with the result (a short example follows the list):

  • values[0]: azimuth, rotation around the Z axis.
  • values[1]: pitch, rotation around the X axis.
  • values[2]: roll, rotation around the Y axis.
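For example, a minimal sketch of reading those values (rotationMatrix is assumed to come from SensorManager.getRotationMatrix() or from the gyroscope integration described below):

float[] values = new float[3];
SensorManager.getOrientation(rotationMatrix, values);

float azimuth = values[0]; // rotation around the Z axis, in radians
float pitch = values[1];   // rotation around the X axis, in radians
float roll = values[2];    // rotation around the Y axis, in radians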

Gyroscope Integration Over Time

Gyroscopes integrate measurements over time to create an estimation of orientation over time. Android does not do the heavy lifting for you. You have to integrate the measurements yourself.

The gist of the integration comes from the Android Developer documentation. First, the axis-angle representation is normalized (a requirement of quaternions) by taking the magnitude of the vector and then dividing each element in the vector by the magnitude. The angular speed is then integrated over the timestep to get the rotation angle for the sample, and the axis-angle representation is converted into a quaternion. The quaternion is then converted into a rotation matrix (via SensorManager.getRotationMatrixFromVector()) which can be applied to another rotation matrix representing the current orientation of the device. Finally, the matrix representing the current orientation is converted into a vector representing the orientation of the device (via SensorManager.getOrientation()).

Some things that aren't explained very well in the documentation...

Why divide each element in the axis-angle vector by the magnitude of the vector? SensorManager.getRotationMatrixFromVector() requires a unit quaternion and, by definition, the three elements of the rotation vector are equal to the last three components of a unit quaternion <cos(θ/2), x*sin(θ/2), y*sin(θ/2), z*sin(θ/2)>. A quaternion, 'u', is a unit quaternion if its norm is one, |u| = 1. We normalize the axis-angle vector so that the quaternion we build from it is a unit quaternion.

Why is the sensor's delta time (sensor period, dt, time between updates, etc.) divided by 2? This comes from the θ/2 in each of the quaternion's elements, because we want half angles for our quaternions. Instead of dividing theta by 2 in each of the four elements, we divide once when we determine the sensor's delta time.

What is a good value for EPSILON? A value just larger than 0: small enough that real rotations are still normalized, but large enough to avoid dividing by a near-zero magnitude when the device is not rotating.

private static final float NS2S = 1.0f / 1000000000.0f;
// A value just larger than 0, as discussed above.
private static final float EPSILON = 0.000000001f;
private final float[] deltaRotationVector = new float[4];
private float timestamp;
 
public void onSensorChanged(SensorEvent event) {
     // This timestep's delta rotation to be multiplied by the current rotation
     // after computing it from the gyro sample data.
     if (timestamp != 0) {
         final float dT = (event.timestamp - timestamp) * NS2S;
         // Axis of the rotation sample, not normalized yet.
         float axisX = event.values[0];
         float axisY = event.values[1];
         float axisZ = event.values[2];
 
         // Calculate the angular speed of the sample
         float omegaMagnitude = (float) Math.sqrt(axisX*axisX + axisY*axisY + axisZ*axisZ);
 
         // Normalize the rotation vector if it's big enough to get the axis
         if (omegaMagnitude > EPSILON) {
             axisX /= omegaMagnitude;
             axisY /= omegaMagnitude;
             axisZ /= omegaMagnitude;
         }
 
         // Integrate around this axis with the angular speed by the timestep
         // in order to get a delta rotation from this sample over the timestep
         // We will convert this axis-angle representation of the delta rotation
         // into a quaternion before turning it into the rotation matrix.
         float thetaOverTwo = omegaMagnitude * dT / 2.0f;
         float sinThetaOverTwo = (float) Math.sin(thetaOverTwo);
         float cosThetaOverTwo = (float) Math.cos(thetaOverTwo);
         deltaRotationVector[0] = sinThetaOverTwo * axisX;
         deltaRotationVector[1] = sinThetaOverTwo * axisY;
         deltaRotationVector[2] = sinThetaOverTwo * axisZ;
         deltaRotationVector[3] = cosThetaOverTwo;
     }
     timestamp = event.timestamp;
     float[] deltaRotationMatrix = new float[9];
     SensorManager.getRotationMatrixFromVector(deltaRotationMatrix, deltaRotationVector);
     // User code should concatenate the delta rotation we computed with the current rotation
     // in order to get the updated rotation.
     // rotationCurrent = rotationCurrent * deltaRotationMatrix;
}

We will also need to get the initial rotation matrix from the acceleration and magnetic sensors. Most folks will want to start with a standard basis oriented to the world frame (East, North, Up) based on the current orientation of the device. This requires determining the current orientation of the device in the world coordinate system and can be done with the acceleration and magnetic sensors with a call to SensorManager.getRotationMatrix(). The matrix that is returned will then be multiplied by the delta rotation matrix that is produced from the gyroscope measurements to produce a new rotation matrix representing the current orientation.

If you do not take this step, you can instead start the algorithm from the identity matrix. This creates a world frame that is oriented to the initial rotation of the device, and all further device-frame rotations will be relative to that initial frame. The identity matrix is then multiplied by the delta rotation matrices produced from the gyroscope measurements to produce new rotation matrices representing the current orientation.
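A minimal sketch of this alternative (the field name is illustrative): the current rotation simply starts as the 3x3 identity, so the orientation of the device at start-up becomes the reference frame for everything that follows.

// Start the integration from the identity matrix; the device's orientation
// when the algorithm starts becomes the reference for all later rotations.
private float[] currentRotationMatrix = new float[] {
        1, 0, 0,
        0, 1, 0,
        0, 0, 1 };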

The Initial Orientation

private void calculateInitialOrientation()
{
    hasInitialOrientation = SensorManager.getRotationMatrix(
            initialRotationMatrix, null, acceleration, magnetic);
}
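The acceleration and magnetic arrays above are assumed to hold the latest raw sensor values. A minimal sketch of how they might be copied out of onSensorChanged() (field names are illustrative):

public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
        // acceleration is assumed to be a float[3] field.
        System.arraycopy(event.values, 0, acceleration, 0, 3);
    }
    if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) {
        // magnetic is assumed to be a float[3] field.
        System.arraycopy(event.values, 0, magnetic, 0, 3);
    }

    if (!hasInitialOrientation) {
        calculateInitialOrientation();
    }
}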

Concatenate the Rotation

We can apply a rotation matrix to another rotation matrix by multiplying the two rotation matrices together. This is how the axis-angle rotations produced by the gyroscope are integrated into the orientation: a rotation matrix is produced from each axis-angle vector and then applied to the rotation matrix representing the last known orientation of the device. The easiest way to do this is to create a function that multiplies two 3x3 matrices together.

private float[] matrixMultiplication(float[] a, float[] b)
{
     float[] result = new float[9];
 
     result[0] = a[0] * b[0] + a[1] * b[3] + a[2] * b[6];
     result[1] = a[0] * b[1] + a[1] * b[4] + a[2] * b[7];
     result[2] = a[0] * b[2] + a[1] * b[5] + a[2] * b[8];
 
     result[3] = a[3] * b[0] + a[4] * b[3] + a[5] * b[6];
     result[4] = a[3] * b[1] + a[4] * b[4] + a[5] * b[7];
     result[5] = a[3] * b[2] + a[4] * b[5] + a[5] * b[8];
 
     result[6] = a[6] * b[0] + a[7] * b[3] + a[8] * b[6];
     result[7] = a[6] * b[1] + a[7] * b[4] + a[8] * b[7];
     result[8] = a[6] * b[2] + a[7] * b[5] + a[8] * b[8];
 
     return result;
}
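With that helper in place, the concatenation itself is one line per gyroscope sample, mirroring the rotationCurrent = rotationCurrent * deltaRotationMatrix comment in the integration code above (currentRotationMatrix is the field sketched earlier):

// Apply this sample's delta rotation to the last known orientation.
currentRotationMatrix = matrixMultiplication(currentRotationMatrix, deltaRotationMatrix);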

The Orientation

Once we have our new rotation matrix, all we have to do is make a call to SensorManager.getOrientation() to get the orientation of the device. Note that the reference coordinate system used is different from the world coordinate system defined for the rotation matrix, and the order of the axes returned by the method is different from the order used by the sensors.

SensorManager.getOrientation(currentRotationMatrixCalibrated, gyroscopeOrientationCalibrated);

Rotating Gravity into Device Frame

Now comes the core of the algorithm, determining the gravity components of the acceleration based on the tilt of the device. Trigonometry, linear algebra and rotation matrices are used to determine the functions.

To determine the functions, we will perform a composite rotation of the device's orientation in the order of yaw, pitch and then roll. We will assume that the device is initially oriented along the global coordinate frame, so that gravity is gvec := {0, 0, -g}.

We have to determine our rotation matrices. Rotating the coordinate system about the x-, y- and z-axes in a counterclockwise direction (when looking towards the origin) gives the following matrices.

X-Axis/Roll =

{1,0,0}

{0, cos(roll), sin(roll)}

{0, -sin(roll), cos(roll)}

Y-Axis/Pitch =

{cos(pitch), 0, -sin(pitch)}

{0,1,0}

{sin(pitch), 0, cos(pitch)}

Z-Axis/Yaw/Azimuth =

{cos(yaw), sin(yaw), 0}

{-sin(yaw), cos(yaw), 0}

{0, 0, 1}

Once we have the rotation matrices, we must create the composite rotation by multiplying the rotation matrices by our initial gvec= {0,0,-g} in the order of yaw, pitch and then roll. We are essentially rotating gravity into the orientation of our device.

yaw_matrix * gvec = {0, 0, -g}

pitch_matrix * yaw_matrix * gvec = {g*sin(pitch), 0, -g*cos(pitch)}

roll_matrix * pitch_matrix * yaw_matrix * gvec = {-g sin(pitch),-g cos(pitch) sin(roll),g cos(pitch) cos(roll)}

We now have the functions we need to determine the gravity components of each axis. Note that the rotation matrices above use the aeronautical naming convention (roll about the X axis, pitch about the Y axis), while SensorManager.getOrientation() reports pitch as the rotation about the X axis and roll as the rotation about the Y axis, so the angle names are swapped in the formulas and code below.

X-Axis/Pitch = -g * cos(pitch)  * sin(roll)

Y-Axis/Roll = -g * sin(pitch)

Z-Axis/Yaw = g * cos(pitch) * cos(roll)

The implementation in Android would work as follows:

Find the gravity components of each axis...

// Find the gravity component of the X-axis
// = -g*cos(pitch)*sin(roll);
components[0] = (float) (-SensorManager.GRAVITY_EARTH * Math.cos(values[1]) * Math.sin(values[2]));

// Find the gravity component of the Y-axis
// = -g*sin(pitch);
components[1] = (float) (-SensorManager.GRAVITY_EARTH * Math.sin(values[1]));

// Find the gravity component of the Z-axis
// = g*cos(pitch)*cos(roll);
components[2] = (float) (SensorManager.GRAVITY_EARTH * Math.cos(values[1]) * Math.cos(values[2]));

Once we have the gravity components, the final step is to subtract them from the measured acceleration. Note the conversion from units of m/s² to g's (multiples of the gravity of Earth).

// Subtract the gravity component of the signal
// from the input acceleration signal to get the
// tilt compensated output.
linearAcceleration[0] = (this.acceleration[0] - components[0])/SensorManager.GRAVITY_EARTH;
linearAcceleration[1] = (this.acceleration[1] - components[1])/SensorManager.GRAVITY_EARTH;
linearAcceleration[2] = (this.acceleration[2] - components[2])/SensorManager.GRAVITY_EARTH;