Improving orientation tracking – handling sensor fusion

One of the limitations of sensor-based tracking is the sensors themselves. As we introduced before, some of the sensors are inaccurate, noisy, or subject to drift. A technique to compensate for their individual shortcomings is to combine their values, improving the overall rotation estimate you can get from them. This technique is called sensor fusion. There are different methods for fusing sensors; we will use the method presented by Paul Lawitzki, with source code under the MIT License available at http://www.thousand-thoughts.com/2012/03/android-sensor-fusion-tutorial/. In this section, we will briefly explain how the technique works and how to integrate sensor fusion into our JME AR application.
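To illustrate the general idea before diving in, here is a minimal complementary-filter sketch in plain Java (deliberately free of Android APIs so it can run anywhere). It is not Lawitzki's exact implementation; the class name, the `alpha` weight, and the single-axis simplification are all illustrative assumptions. The intuition it captures is the one used throughout this section: trust the gyroscope over short intervals (it is smooth but drifts) and the accelerometer/magnetometer over long intervals (it is noisy but drift-free).

```java
// Illustrative single-axis complementary filter (not Lawitzki's code):
// the gyroscope rate is integrated (high-pass path) and blended with an
// absolute angle derived from the accelerometer/magnetometer (low-pass path).
public class ComplementaryFilter {
    private final double alpha; // weight of the gyroscope path, e.g. 0.98
    private double angle;       // current fused orientation angle, in radians

    public ComplementaryFilter(double alpha, double initialAngle) {
        this.alpha = alpha;
        this.angle = initialAngle;
    }

    /**
     * @param gyroRate   angular velocity from the gyroscope (rad/s)
     * @param accelAngle absolute angle from accelerometer/magnetometer (rad)
     * @param dt         time since the last update (s)
     * @return the new fused angle (rad)
     */
    public double update(double gyroRate, double accelAngle, double dt) {
        // Integrate the gyroscope for responsiveness, then pull the result
        // toward the absolute (drift-free) angle so drift stays bounded.
        angle = alpha * (angle + gyroRate * dt) + (1 - alpha) * accelAngle;
        return angle;
    }

    public static void main(String[] args) {
        ComplementaryFilter filter = new ComplementaryFilter(0.98, 0.0);
        // Simulate a stationary device: the gyroscope has a small constant
        // bias (0.01 rad/s) while the accelerometer reads the true angle 0.
        double fused = 0.0;
        for (int i = 0; i < 500; i++) {
            fused = filter.update(0.01, 0.0, 0.02); // 50 Hz for 10 seconds
        }
        // Pure gyroscope integration would have drifted to 0.1 rad by now;
        // the fused estimate instead settles near a small bounded offset.
        System.out.printf("Fused angle after 10 s: %.4f rad%n", fused);
    }
}
```

Note how the low-pass term keeps the gyroscope's bias from accumulating: integration alone would drift without bound, while the blended estimate converges to a small steady-state offset.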

Sensor fusion in a nutshell ...
