Hardware Sensor Fusion Solutions
One of the great innovations in motion control in the last few years is the embedding of a dedicated processor along with 6 and 9 DoF motion sensors in a single package, providing sensor fusion solutions whose resulting quaternions can be read directly by a microcontroller via I2C or SPI. I call this hardware sensor fusion, but in reality the algorithms are executed in software on the embedded processor rather than on the host microcontroller. The advantages are many: the sensor fusion code doesn't take up limited flash memory on the host; the sensor fusion workload is off-loaded from the host to the embedded processor, which is optimized for the task; and highly efficient and sophisticated filtering and autocalibration of the sensors can be accomplished automagically without the user having to become an expert in sensor fusion theory. I will study three of these hardware sensor fusion solutions here: the BNO055 from Bosch, the MAX21100 from Maxim Integrated, and the EM7180 from EM Microelectronic. Although Invensense was among the first to produce a hardware sensor fusion solution with its MPU6050, I am ignoring the 9 DoF MPU9250 here since its embedded DMP can only provide a 6 DoF solution; a 9 DoF solution still requires host processing. Likewise, I just became aware of Fairchild's FIS1100 IMU with an embedded motion co-processor, but it looks like this also requires software running on the host to perform the full 9 DoF sensor fusion.
The BNO055 combines Bosch's flagship 9 DoF motion sensor, the BMX055 (itself an agglomeration of the BMA055 accelerometer, BMI055 gyro, and BMM055 magnetometer, all also packaged as separate sensors), with an ARM Cortex M0 processor. The board I am using is this one. The sensor fusion relies on an extended Kalman filter for the fusion proper, plus low- and high-pass filtering, autocalibration, temperature compensation, etc., to fuse the three sensor data streams into a quaternion representation of absolute orientation, as well as providing linear acceleration, gravity, and Euler angle outputs directly readable via I2C or UART from internal registers.
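For reference, pulling the fused quaternion off the BNO055 amounts to reading eight bytes from the quaternion data registers and scaling them; per the datasheet, the four signed 16-bit components start at register 0x20 (w, x, y, z, little-endian) with 1 quaternion unit = 2^14 LSB. A minimal parsing sketch (Python for illustration only; the actual I2C read depends on your platform):

```python
import struct

# Register map assumptions from the BNO055 datasheet: quaternion data
# starts at 0x20 (w, x, y, z as little-endian int16), 1 unit = 2^14 LSB.
BNO055_QUA_DATA_START = 0x20
QUAT_SCALE = 1.0 / (1 << 14)

def parse_bno055_quaternion(raw):
    """Convert the 8 raw bytes read starting at 0x20 into (w, x, y, z)."""
    w, x, y, z = struct.unpack('<hhhh', bytes(raw))
    return (w * QUAT_SCALE, x * QUAT_SCALE, y * QUAT_SCALE, z * QUAT_SCALE)
```

On the Teensy the same scaling is done in C after a Wire library burst read; only the byte order and the 2^-14 scale factor matter.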
The MAX21100 is an accelerometer/gyro combination that has an embedded motion merging engine performing the same sensor fusion tasks as the BNO055, but in this case the magnetometer data must come from an external magnetometer. On the board I am using for the following tests I pair the MAX21100 with STMicroelectronics' fine LIS3MDL 16-bit magnetometer.
The EM7180 is not a motion sensor, but rather a motion sensor hub that takes data from a variety of external motion sensors and performs 9 DoF sensor fusion using an extended Kalman filter, autocalibration, low- and high-pass filtering, and magnetic anomaly detection, all at very low power consumption. For the following tests I am using a board with the BMX055 as the sensor data input to the EM7180. This allows a certain amount of cross comparison, since for both the BNO055 and the EM7180 I am using the same BMX055 motion sensor as input to the respective sensor fusion engines.
I am just beginning detailed comparative studies of these hardware motion sensor solutions and so I will begin with the simplest of tests. I will start the sensors in the hardware fusion mode, each controlled by a Teensy 3.1 microcontroller and mounted on a breadboard, and capture the Yaw (heading), Roll, and Pitch via Serial output to my laptop. The procedure is to orient the breadboard parallel to the edge of my desk (which should be 45 degrees from true North), then every 120 seconds rotate the breadboard by ninety degrees. The first series of test data is shown below. I'll start with the MAX21100; since the data is rather busy, let's take some time to understand what is happening.
The nearly ideal behavior of the MAX21100 hardware sensor fusion solution. Madgwick MARG is doing at least as well!
Since the breadboard remains relatively flat on my desk during this experiment, the Roll and Pitch are not very interesting. Besides, an accelerometer/gyro combination is sufficient to get accurate roll and pitch; the hardest part of sensor fusion is getting an accurate Yaw, or heading. Here I am plotting the heading in degrees versus time in seconds. The light blue is the heading derived from the quaternions produced by the MAX21100 motion merging engine (hardware sensor fusion), while the dark blue is the heading derived from the quaternions produced by the open-source Madgwick MARG algorithm (software sensor fusion) using the scaled sensor data reported by the MAX21100. Both hardware and software solutions use the same underlying scaled sensor data; they differ only in the particulars of the fusion filter and where the filtering takes place: hardware sensor fusion on the processor inside the MAX21100, software sensor fusion on the Freescale Kinetis ARM Cortex M4 of the Teensy 3.1.
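In both cases the plotted heading comes from the quaternion via the standard Tait-Bryan angle conversion; a minimal sketch (Python for illustration, not the actual Teensy code):

```python
from math import atan2, asin, degrees

def quaternion_to_yaw_pitch_roll(q):
    """Tait-Bryan angles in degrees from a unit quaternion (w, x, y, z)."""
    w, x, y, z = q
    yaw = atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
    pitch = asin(max(-1.0, min(1.0, 2.0 * (w * y - z * x))))  # clamped for safety
    roll = atan2(2.0 * (w * x + y * z), 1.0 - 2.0 * (x * x + y * y))
    return degrees(yaw), degrees(pitch), degrees(roll)
```

The zero point of the yaw depends entirely on the reference frame the fusion engine chose, which is exactly where the hardware and software solutions differ below.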
It takes several tens of seconds for the sensors to initialize and for basic bias calibration to complete. Once the heading started to spew to the serial monitor at 1 Hz, I moved the board to the edge of the table. After some initial motion, the heading quickly settles down to ~143 degrees for the hardware solution and -40 degrees for the software solution. There is a definite iterative roll-up in the software solution, which approaches the stable heading over several seconds, whereas the hardware solution responds much more rapidly. Notice that the hardware and software solutions are almost exactly 180 degrees apart; this results from a difference in the orientation reference frame. The Madgwick frame is chosen such that the heading is zero when the x-axis of the MAX21100 accelerometer is aligned with true North. The MAX21100 hardware solution apparently uses a different convention. I have asked Maxim about this but was told this information could not be provided. It should be straightforward to figure out what the convention is, but I haven't been able to yet! It won't stop us from learning about the quality of the sensor fusion solution though.
At the 120 second mark, I rotated the breadboard (sensor) by ninety degrees. The software solution takes a few seconds but does indeed settle in at ~51 degrees, while the hardware sensor solution transitions almost immediately to -131 degrees. This is excellent performance: the total change is 91 degrees for the software solution and 88 degrees for the hardware solution. At 240 seconds came another ninety-degree rotation; the Madgwick filter shows ~139 degrees heading and the MAX21100 motion merging engine shows -43 degrees. Again this is an 88 degree change for the software solution versus 88 degrees for the hardware solution. I repeated this pattern until the 800 second mark, where I picked up the breadboard, waved it vigorously in all directions for twenty seconds, then set it back down in the same orientation it started from. We can see that both headings returned to their previous values within the one or two degree margin I am able to maintain by placing the board manually.
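Because the reported heading wraps at +/- 180 degrees, the change after each turn has to be computed modulo 360. A small helper (illustrative Python, not the analysis code itself) makes the arithmetic explicit:

```python
def heading_change(h0, h1):
    """Signed change from heading h0 to h1 in degrees, wrapped into [-180, 180)."""
    return ((h1 - h0 + 180.0) % 360.0) - 180.0

heading_change(-40.0, 51.0)    # software solution, first turn: 91 degrees
heading_change(-131.0, -43.0)  # hardware solution, second turn: 88 degrees
```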
Here is a plot of the estimated heading change as a function of time for the same experiment.
The heading changes should be 90 +/- 1 degree; deviations are a measure of the quality of the sensor (data and) fusion solution.
Both the Madgwick sensor fusion filter and the MAX21100 motion merging engine are doing an excellent job here. The ideal change in heading after each of the six ninety-degree turns would be 90 degrees, of course! And after the last jostling we should expect the change to be zero if the heading returns to where it was before the breadboard was picked up and waved about. You can see clearly how well each of these sensor fusion solutions does. The standard deviation of the difference between averaged headings for each turn (after a few seconds of initial settling) is 2.2 degrees for Madgwick and 4.3 degrees for the MAX21100 motion merging engine. That is, the average change in heading when the sensor is rotated by ninety degrees is 90.2 +/- 2.2 and 89.0 +/- 4.3 degrees for the software and hardware sensor fusion filters, respectively, with the same scaled MAX21100+LIS3MDL data source. This is consistent with the ideal 90 +/- 1 degree response, given the accuracy with which I can manually place the breadboard on the desk.
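For the record, the quoted averages and standard deviations are just the mean and sample standard deviation of the per-turn heading changes; in Python (with hypothetical numbers, not my measured data):

```python
from statistics import mean, stdev

# Hypothetical per-turn heading changes in degrees -- not the measured data.
# Each value would come from averaging the settled heading before and after
# a turn, then differencing (modulo 360).
changes = [91.0, 88.0, 88.0, 92.5, 93.1, 88.6]
print(f"{mean(changes):.1f} +/- {stdev(changes):.1f} degrees")
```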
That the simple Madgwick filter does as well as (or better than) the dedicated and (presumably) optimized MAX21100 sensor fusion solution is somewhat surprising. Can we do better by optimizing the filter parameters, data sample rates, or low-pass filtering? Perhaps. We could almost certainly do better with a test fixture not subject to the inaccuracies of manually placing the breadboard platform as done here. Given the (somewhat) sloppy experiment, basic bias calibration, and no real attempt to optimize sensor performance, these results are remarkably accurate. Can the other two hardware sensor fusion solutions match this accuracy?
The same experiment for the BNO055 was performed and the results are shown below:
The BNO055 hardware sensor fusion solution displays a few degrees of heading drift even when turned through ninety degrees lying flat on the desk.
Again, the BNO055 hardware sensor fusion heading stabilized rather quickly, after a second or two of transition, to a steady value. (For some reason I wasn't able to get magnetometer data out to simultaneously calculate the Madgwick quaternion.) The average difference between the headings is 87.1 +/- 1.1 degrees. How can the average difference of a series of ninety-degree turns be less than 90 degrees? Because there is drift in the result. Compare the heading at 150 seconds with the heading at 650 seconds, after a 360 degree integrated turn. Somehow ten degrees have been lost! This kind of drift shouldn't occur and is not observed with the MAX21100. I think the reason is that the BNO055 has an automatic sensor calibration algorithm that is constantly updating the bias registers such that the heading estimate is continually "improving". But here it is constantly drifting! This can be seen more clearly at the 800 second mark, where several times I picked up the breadboard containing the BNO055 sensor and waved it around, then set it back down in the same orientation (to the best of my ability to do so manually). Each time I did this the heading changed, first from ~350 to ~330, then to ~325, and finally to ~20 degrees. What is going on here!?
Part of the answer lies in the behavior of the underlying BMX055 sensor data. It appears that I can't get magnetometer readings when the BNO055 sensor fusion is running in the forced mode (a fixed setting when running sensor fusion). So I ran the BNO055 in the non-fusion AMG (accel/gyro/magnetometer) mode and used the scaled sensor data to calculate quaternions, and heading, with the Madgwick MARG fusion filter. The results are similar to what I already measured here for the BMX055. I plot them below:
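The Madgwick MARG filter is too long to reproduce here, but to illustrate how a heading follows from the raw accelerometer and magnetometer data alone, here is a simple tilt-compensated compass calculation (a sketch only, ignoring the gyro entirely; the axis and sign conventions are assumptions, and this is not the Madgwick algorithm itself):

```python
from math import atan2, sqrt, sin, cos, degrees

def tilt_compensated_heading(ax, ay, az, mx, my, mz):
    """Heading in degrees from one accelerometer and one magnetometer sample.

    Assumed conventions: z-axis up when flat (az positive at rest), heading
    is 0 with the sensor x-axis toward magnetic North and increases as the
    board turns clockwise (toward East).
    """
    roll = atan2(ay, az)
    pitch = atan2(-ax, sqrt(ay * ay + az * az))
    # Rotate the magnetometer reading back into the horizontal plane
    mx_h = mx * cos(pitch) + mz * sin(pitch)
    my_h = (mx * sin(roll) * sin(pitch) + my * cos(roll)
            - mz * sin(roll) * cos(pitch))
    return degrees(atan2(my_h, mx_h)) % 360.0
```

With no gyro in the loop, any jitter or residual bias in the accelerometer and especially the magnetometer shows up directly in the heading, which is why the quality of the underlying BMX055 data matters so much.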
Results of applying our turn test to the BMX055 embedded inside the BNO055 are similar to those already measured for a standalone BMX055.
The ninety-degree turns result in heading changes between ~79 and ~98 degrees, with an average heading change of 88.8 degrees and a standard deviation of 7.4 degrees. The large standard deviation is an indication of a large skew in the magnetometer response surface, among other possibilities. This skew, coupled with the automatic calibration, is somehow causing the heading drift seen in the BNO055 in hardware sensor fusion mode.
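A skewed magnetometer response surface typically reflects an uncorrected hard-iron offset, which shifts the response sphere away from the origin. A crude but common correction is to rotate the sensor through as many orientations as possible and take the midpoint of the per-axis extremes as the bias; a sketch (illustrative only, not what the BNO055 autocalibration actually does):

```python
def hard_iron_offset(samples):
    """Estimate per-axis hard-iron bias as the midpoint of min and max.

    samples: iterable of (mx, my, mz) magnetometer readings collected
    while the sensor is rotated through as many orientations as possible.
    """
    xs, ys, zs = zip(*samples)
    return tuple((max(c) + min(c)) / 2.0 for c in (xs, ys, zs))
```

Subtracting the estimated offset from each raw reading recenters the response surface; soft-iron (scale/skew) correction would additionally require fitting an ellipsoid.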
Can we get better performance out of this sensor with a different sensor fusion engine? We've already seen that the Kalman filter running on the Cortex M0 of the BNO055 and the Madgwick filter running on the Cortex M4 of the Teensy 3.1 produce very similar results when fed the same sensor data from the BMX055 motion sensor. Shouldn't we expect similar results with yet a third fusion engine? Below we see the results of sensor fusion using the Kalman filter of the EM7180 sensor hub with BMX055 source data as input.
Heading produced by the EM7180 sensor hub using a Kalman sensor fusion filter with BMX055 source data. Notice the near-instantaneous change in heading upon ninety-degree turns of the sensor board.
I started the experiment by moving the sensor board around to let the autocalibration routines in the EM7180 perform whatever bias corrections they might, in an effort to see if the heading drift we saw with the BNO055 could be corrected at the beginning of the experiment. At 30 seconds I placed the sensor board at the edge of the table and got the first heading reading of 28.42 degrees. The heading output by the EM7180 is remarkably stable, showing changes of less than +/- 0.01 degree when the board is at rest. It also settles nearly instantaneously when the heading is changed (meaning in much less than 1 s at the 1 Hz output rate). This is outstanding performance. The problem is that ninety-degree turns of the board do not register as ninety-degree changes in the heading output from the EM7180. The subsequent reading of the heading (at ~500 s), after a 360 degree integrated turn, was 7.04 degrees. Somehow a drift of 21 degrees is exhibited even though we tried to pre-calibrate with 30 seconds of initial board motion. The average heading change for all six turns was 93 +/- 10 degrees. If we throw out the first heading change, the average is a more reasonable 89.4 +/- 4.8 degrees, but that is still not particularly good and similar to the result we observed with the BNO055. The Madgwick sensor fusion filter performed about the same, with an average heading change of 90.2 +/- 6.9 degrees. At the end of the test I moved the board around for ten seconds at a time, then returned the board to the edge of the table to check global stability. The results were somewhat better here, with the EM7180 trying to reach a stable solution near -170 degrees. But on the whole this is disappointing performance.
Maybe the disappointment can be laid at the feet of the underlying motion sensor rather than the sensor fusion solution. Support for this hypothesis can be found in the results of our first test, where the Madgwick sensor fusion did as well as or better than the excellent performance shown by the MAX21100 Motion Merging Engine both using the same scaled gyro/accel data from the MAX21100 and magnetometer data from the LIS3MDL. But the same Madgwick sensor fusion algorithm is not doing well with the BMX055 data. I think some residual bias or jitter in the underlying BMX055 sensor is throwing off the sensor fusion algorithms here and producing the unsatisfactory results. Is there a way to further test this hypothesis? Fortunately, there is.
The EM7180 is a sensor fusion hub and can, in principle, accept data sources from almost any motion sensor. In this case, I have designed boards that use the LSM9DS0 or MPU6500+AK8963C as motion sensor sources to test the same EM7180 sensor fusion algorithms with different data sources. If the sensor fusion algorithm itself dominates heading determination accuracy, we should see the same results no matter which sensor provides the data to the fusion engine. If, on the other hand, the quality of the sensor data matters at least as much as the sensor fusion engine (as I suspect), we should see different (I hope better) performance when we use different motion sensor input with the same hardware sensor fusion engine. Let's start with the LSM9DS0.
For this test I simply designed a breakout board very similar to the EM7180+BMX055 board used above but with the LSM9DS0 replacing the BMX055. The turn test was identical to that just discussed and the results are shown below.
Results from both Madgwick (software) and EM7180 (hardware) sensor fusion using the same underlying LSM9DS0 motion sensor data.
The EM7180 again produces very steady headings that stay within +/- 0.01 degree of the value recorded when the board becomes stationary after each turn. The Madgwick fusion result has a lot more jitter and often asymptotes slowly to its less steady solution. The same procedure was followed: data was spewed to the serial monitor while the board underwent ninety-degree turns every 120 seconds. At the 800 second mark I picked up the board, moved it around vigorously for ten seconds, then returned the board to its previous position to check reproducibility. Unlike the BNO055 and the EM7180+BMX055, but like the MAX21100, the LSM9DS0+EM7180 is performing pretty well. The heading is repeatable after 360 degree turns and doesn't suffer from the large heading drifts seen with the BMX055 sensor. The average heading change after six ninety-degree turns was 88.5 +/- 4.1 degrees for the EM7180 (hardware) sensor fusion and 88.8 +/- 6.2 degrees for the Madgwick (software) sensor fusion solutions. At least the hardware sensor fusion result is comparable to that obtained with the MAX21100, and is approaching ideal behavior.
So far we can draw at least two conclusions. One is that the quality of the underlying motion sensor data does seem to matter to the accuracy of the heading derived from sensor fusion. The other is that even the best of the available hardware sensor fusion engines can only manage ~4 degree heading accuracy. Is it possible to do better than this?
More to come...