Understanding Sensor Fusion and Tracking, Part 2: Fusing a Mag, Accel, & Gyro Estimate
Summary
TL;DR: In this video, Brian from MATLAB Tech Talk explains how sensor fusion can be used to estimate an object's orientation, also known as attitude or heading. He covers the use of magnetometers, accelerometers, and gyros, focusing on their roles and the challenges involved. The video discusses common issues, such as linear accelerations corrupting accelerometer readings and magnetic disturbances corrupting the magnetometer, and introduces fusion algorithms like the complementary and Kalman filters to improve accuracy. Practical demonstrations using an Arduino and MATLAB illustrate these concepts in action.
Takeaways
- 📐 Sensor fusion is used to estimate an object's orientation by combining data from multiple sensors.
- 🌐 Orientation can also be referred to as attitude or heading, depending on the context.
- 📱 Common sensors for orientation estimation in phones and autonomous systems include a magnetometer, accelerometer, and gyro.
- 🛩️ Examples of orientation estimation include satellites using star trackers and airplanes using angle of attack sensors.
- 🔄 Orientation can be represented using roll, pitch, and yaw, or more complex methods like Direction Cosine Matrix (DCM) and quaternions.
- 📏 The goal is to determine the rotation between an object's coordinate frame and an external reference frame.
- 🔧 For a phone on a table, orientation can be estimated using a magnetometer and accelerometer, with cross-products used to calculate north, east, and down vectors.
- 🧲 Magnetometers can be affected by magnetic disturbances, which can be calibrated out if they rotate with the magnetometer.
- 📊 Linear accelerations and rotations can corrupt accelerometer data, which can be mitigated by predicting and removing linear acceleration or using a gyro.
- 🤖 Combining sensor data using algorithms like complementary filter or Kalman filter can provide more accurate orientation estimates by balancing the strengths and weaknesses of each sensor.
Q & A
What is sensor fusion and how is it used to estimate an object's orientation?
-Sensor fusion is the process of combining data from multiple sensors to estimate an object's orientation, also known as attitude or heading. It uses various sensors like magnetometers, accelerometers, and gyros to determine how an object is facing relative to a reference frame.
What are some alternative names for orientation in different contexts?
-In different contexts, orientation can be referred to as attitude or heading. Heading specifically refers to the direction along a 2D plane.
What is an attitude and heading reference system (AHRS)?
-An Attitude and Heading Reference System (AHRS) is a system that uses sensor fusion algorithms to determine an object's orientation or attitude relative to an inertial reference frame, which is often the stars for satellites or the local horizon for aircraft.
What are the three main sensors typically used in a sensor fusion system for orientation estimation?
-The three main sensors typically used in a sensor fusion system for orientation estimation are a magnetometer, an accelerometer, and a gyroscope.
How can a magnetometer and accelerometer be used to determine the absolute orientation of a stationary phone on a table?
-A magnetometer and accelerometer can determine the absolute orientation of a stationary phone by measuring the direction of gravity and the magnetic field in the body frame. 'Down' is the direction opposite the measured acceleration vector, 'east' is the cross-product of down and the magnetic field, and 'north' is the cross-product of east and down. Together, these three vectors define the orientation.
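The cross-product recipe above can be sketched in a few lines. This is a Python/NumPy illustration rather than the MATLAB used in the video, and `dcm_from_accel_mag` is a hypothetical helper name:

```python
import numpy as np

def dcm_from_accel_mag(accel, mag):
    """Estimate the body-to-NED rotation from one accelerometer reading and
    one magnetometer reading (both in the body frame), assuming the device
    is stationary so the accelerometer measures only the gravity reaction."""
    # 'Down' is opposite the measured specific force.
    down = -np.asarray(accel, dtype=float)
    down /= np.linalg.norm(down)
    # East is perpendicular to both down and the magnetic field vector.
    east = np.cross(down, np.asarray(mag, dtype=float))
    east /= np.linalg.norm(east)
    # North completes the right-handed triad.
    north = np.cross(east, down)
    # Rows are the north, east, and down axes expressed in the body frame.
    return np.vstack((north, east, down))
```

For a phone lying flat and aligned with north, the result is the identity matrix; by construction the rows are always orthonormal.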
What is the issue with using a magnetometer for orientation estimation when the system is in motion?
-When the system is in motion, the magnetometer not only measures the Earth's magnetic field but also any disturbances caused by the motion, which can corrupt the estimate of the orientation.
What are hard iron and soft iron sources in the context of magnetometer calibration?
-Hard iron sources are objects that generate their own magnetic fields, like magnets in an electric motor or coils with current. Soft iron sources are magnetic materials that distort the magnetic field as it passes through them. Both types can affect the magnetometer readings and need to be calibrated out.
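Once the hard and soft iron effects are known, each raw reading can be corrected with an offset plus a matrix multiply. A minimal Python/NumPy sketch follows; the `A` and `b` values below are invented for illustration, and in practice they come from a calibration routine such as MATLAB's magcal:

```python
import numpy as np

# A (3x3) undoes soft-iron distortion; b (3,) removes the hard-iron offset.
# These numbers are made up for illustration only.
A = np.array([[1.10, 0.02, 0.00],
              [0.02, 0.95, 0.00],
              [0.00, 0.00, 1.00]])
b = np.array([12.0, -3.5, 8.0])

def calibrate_mag(m_raw):
    """Map a raw magnetometer reading back onto the ideal sphere
    centered at the origin: corrected = (raw - b) @ A."""
    return (np.asarray(m_raw, dtype=float) - b) @ A
```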
How can linear accelerations affect the accuracy of an accelerometer's measurement of gravity?
-Linear accelerations can affect the accuracy of an accelerometer's measurement of gravity because the accelerometer measures all linear accelerations, not just gravity. If the system is moving or accelerating, the accelerometer will sense this acceleration in addition to gravity, which can throw off the estimate of 'down'.
What is the concept of dead reckoning in the context of gyroscope-based orientation estimation?
-Dead reckoning is the process of estimating the orientation of an object by integrating the angular rate measurements from a gyroscope over time. It assumes that the initial orientation is known and updates the orientation based on the change in angle due to the angular rate.
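Dead reckoning for a single axis can be sketched as follows. This is an illustrative Python reduction to one angle; a real AHRS integrates a 3-D rate vector into a quaternion or DCM:

```python
def dead_reckon_yaw(yaw0, rates, dt):
    """Integrate gyro angular-rate samples (rad/s) about one axis.
    yaw0 is the known initial orientation; dt is the sample time."""
    yaw = yaw0
    history = []
    for w in rates:
        yaw += w * dt        # delta angle = angular rate * sample time
        history.append(yaw)
    return history
```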
How does sensor fusion help to address the limitations of using a gyroscope alone for orientation estimation?
-Sensor fusion helps to address the limitations of using a gyroscope alone by combining the gyroscope's measurements with those from other sensors like the magnetometer and accelerometer. This combination can correct for the drift that occurs in the gyroscope's estimation due to sensor bias and noise, providing a more accurate and stable orientation estimate.
What are some common sensor fusion algorithms mentioned in the script?
-Some common sensor fusion algorithms mentioned in the video include the complementary filter, the Kalman filter, and the more specialized Madgwick and Mahony filters.
How does the complementary filter work in blending the estimates from the magnetometer-accelerometer and the gyroscope?
-The complementary filter works by placing a 'slider' at a position on a scale that represents trust in each solution. The designer manually decides the position of the slider, which determines the weighting of the magnetometer-accelerometer solution versus the gyroscope's dead reckoning solution. This blending emphasizes the strengths of each while minimizing their weaknesses.
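One step of this blend, for a single angle, might look like the following Python sketch. Here `alpha` plays the role of the slider, and the 0.98 default is just a typical illustrative value:

```python
def complementary_update(prev_angle, gyro_rate, accel_mag_angle, dt, alpha=0.98):
    """One complementary-filter step for a single angle.
    alpha weights the integrated-gyro (dead reckoning) solution;
    (1 - alpha) weights the absolute accel/mag solution."""
    gyro_angle = prev_angle + gyro_rate * dt   # dead-reckoning branch
    return alpha * gyro_angle + (1.0 - alpha) * accel_mag_angle
```

With `alpha` near 1, the estimate follows the smooth gyro integration but is gently pulled back toward the accel/mag measurement each step, which keeps the gyro bias from accumulating.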
What is the purpose of the Kalman filter in sensor fusion?
-The Kalman filter automatically calculates the optimal gain, or position of the trust slider, based on the noise levels in the measurements and the accuracy of the system model. It performs a kind of weighted averaging between the two solutions to provide a more accurate orientation estimate.
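The "weighted averaging" idea can be illustrated with a one-dimensional fusion of two noisy estimates of the same quantity. This is a textbook minimum-variance blend, not the video's implementation; the gain `k` is the automatically computed analog of the trust slider:

```python
def fuse_scalar(x1, var1, x2, var2):
    """Minimum-variance blend of two independent estimates of the
    same quantity. The gain k shifts trust toward whichever estimate
    has the smaller variance."""
    k = var1 / (var1 + var2)          # trust x2 more when var1 is large
    fused = x1 + k * (x2 - x1)
    fused_var = (1.0 - k) * var1      # fused estimate is more certain
    return fused, fused_var
```

Note the fused variance is smaller than either input variance, which is why combining two imperfect solutions beats using either one alone.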
How does the integration of gyroscope measurements lead to drift in orientation estimation over time?
-The integration of gyroscope measurements, which acts like a low-pass filter, smooths out high-frequency noise but also accumulates errors over time due to sensor bias and random walk. This results in a gradual drift of the estimated orientation away from the true value.
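The drift can be demonstrated numerically. The Python/NumPy sketch below (with an invented bias value) dead-reckons a gyro signal for an object that is not rotating at all; the estimate wanders away from zero at roughly bias times elapsed time:

```python
import numpy as np

dt = 0.01                       # 100 Hz sample rate
bias = 0.02                     # rad/s constant gyro bias (made up)
n = 6000                        # one minute of samples
rng = np.random.default_rng(0)

true_rate = np.zeros(n)         # the object is perfectly still
meas = true_rate + bias + rng.normal(0.0, 0.05, n)  # bias + noise

angle = np.cumsum(meas) * dt    # dead-reckoned angle estimate
# After 60 s the estimate has drifted by roughly bias * t = 1.2 rad,
# even though the true angle never changed.
```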
Outlines
🔍 Introduction to Sensor Fusion and Orientation Estimation
This video discusses how sensor fusion is used to estimate an object's orientation, also known as attitude or heading. The focus is on using a magnetometer, accelerometer, and gyro, common in modern phones and autonomous systems, to conceptualize the system and explain the contribution of each sensor. The video aims to provide a basic understanding rather than a complete inertial measurement system.
📐 Defining and Representing Orientation
Orientation refers to how far an object is rotated from a reference frame. Methods to represent orientation include roll, pitch, and yaw; the direction cosine matrix (DCM); and the quaternion. These methods describe the rotation between an object's coordinate frame and an external coordinate frame. The video explains the importance of choosing the right reference frame and representation method.
📱 Using Magnetometer and Accelerometer for Orientation
To determine a phone's orientation, the video explains using a magnetometer and accelerometer. The accelerometer measures gravity to find the 'down' direction, while the magnetometer determines 'north'. However, magnetic field lines point up or down depending on the hemisphere, requiring cross-products to find the true north. The video demonstrates building a DCM from these vectors.
🔧 Implementing and Visualizing the Fusion Algorithm
The video showcases an implementation of the fusion algorithm using an MPU 9250 IMU connected to an Arduino and MATLAB. It highlights how the accelerometer and magnetometer readings are used to visualize orientation. Problems with linear accelerations and magnetic disturbances are discussed, along with potential solutions such as calibration and thresholding accelerometer readings.
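The thresholding idea mentioned above can be sketched as a simple gate on the measured magnitude. This is an illustrative Python helper (the tolerance value is invented); readings far from 1 g are rejected because they must contain linear acceleration:

```python
import numpy as np

G = 9.81     # standard gravity, m/s^2
TOL = 0.5    # tolerance band around 1 g (illustrative value)

def accel_is_trustworthy(accel):
    """Only use an accelerometer sample for the 'down' estimate when
    its magnitude is close to 1 g, i.e. when linear acceleration is
    probably small."""
    return abs(np.linalg.norm(accel) - G) < TOL
```

As the video notes, this keeps corrupted samples out of the fusion algorithm, at the cost of pausing the accel/mag update while the system is accelerating.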
🧲 Calibrating Magnetometers
The video addresses the issue of magnetic disturbances from hard iron and soft iron sources. It explains how to calibrate a magnetometer by collecting measurements in different orientations and fitting them to an ideal sphere. Calibration helps remove biases and distortions, resulting in more accurate measurements.
🔄 Using Gyros for Orientation Estimation
Gyros measure angular rates and can estimate orientation through dead reckoning. The video explains the integration process and its limitations, such as bias and drift over time. It emphasizes the need to combine gyro data with accelerometer and magnetometer readings for more accurate orientation estimation.
🔀 Combining Sensors with Sensor Fusion
Sensor fusion combines the strengths of different sensors to estimate orientation. The video describes complementary and Kalman filters, which blend accelerometer, magnetometer, and gyro data to correct for drift and biases. The approach involves adjusting trust levels between sensors to achieve a balanced and accurate estimation.
🔧 Practical Sensor Fusion Implementation
The video concludes with a high-level overview of sensor fusion implementation using a Kalman filter. It suggests using MATLAB's ahrsfilter function for practical exercises. The next video will cover integrating GPS data with IMU for enhanced position estimation. Viewers are encouraged to subscribe for more tutorials and check out additional control theory topics on the presenter's channel.
Keywords
💡Sensor Fusion
💡Orientation
💡Magnetometer
💡Accelerometer
💡Gyro
💡Dead Reckoning
💡Direction Cosine Matrix (DCM)
💡Hard Iron Source
💡Soft Iron Source
💡Kalman Filter
Highlights
The video discusses sensor fusion for estimating an object's orientation, also known as attitude or heading.
Orientation is determined relative to a reference frame using various sensors like star trackers and angle of attack sensors.
The focus is on using a magnetometer, accelerometer, and gyro found in modern phones and autonomous systems.
The video aims to conceptualize the sensor fusion system without developing a full inertial measurement system.
Different methods to represent rotation, such as roll, pitch, yaw, and quaternion, are mentioned.
The orientation of a stationary phone can be estimated using a magnetometer and accelerometer.
The magnetic field vector also has a large vertical component (it dips steeply toward the ground in the northern hemisphere), so cross-products with the down vector are needed to isolate north.
An implementation example using an IMU (MPU-9250) connected through an Arduino to MATLAB is demonstrated.
The script visualizes orientation and updates using a MATLAB viewer, performing cross products and building a direction cosine matrix.
Problems with simple implementations, such as the effect of linear accelerations on the accelerometer's readings, are highlighted.
Magnetometer readings can be corrupted by magnetic disturbances, which can be mitigated through calibration.
Hard and soft iron sources affect magnetometer readings and can be calibrated out if they rotate with the system.
Calibration involves measuring distortion and applying a transformation matrix to correct measurements.
Dead reckoning, using gyro measurements, is introduced as a method to estimate orientation for rotating objects.
Integration of gyro measurements can lead to drift due to sensor bias and noise.
Sensor fusion algorithms, such as complementary and Kalman filters, are used to combine estimates from different sensors.
The video concludes with a teaser for the next video, which will include GPS and its integration with IMU for improved positioning.
Transcripts
in this video we're gonna talk about how we can use sensor fusion to estimate an
object's orientation now you may call orientation by other names like attitude
or maybe heading if you're just talking about direction along a 2d plane this is
why the fusion algorithm can also be referred to as an attitude and heading
reference system but it's all the same thing we want to figure out which way an
object is facing relative to some reference and we can use a number of
different sensors to do this for example satellites can use star trackers to
estimate attitude relative to the inertial star field whereas an airplane
could use an angle of attack sensor to measure orientation of the wing relative
to the incoming free air stream now in this video we're gonna focus on using a
very popular set of sensors that you're gonna find in every modern phone and a
wide variety of autonomous systems a magnetometer and accelerometer and a
gyro and the goal of this video is not to develop a fully fleshed-out inertial
measurement system there's just too much to cover to really do a thorough job
instead I want to conceptually build up the system and explain what each sensor
is bringing to the table and a few things to watch out for along the way
I'll also call out some other really good resources that I've linked to below
where you can dive into more of the details so let's get to it
I'm Brian and welcome to a MATLAB Tech Talk when we're talking about
orientation we're really describing how far an object is rotated away from some
known reference frame for example the pitch of an aircraft is how far the
longitudinal axis is rotated off of the local horizon so in order to define an
orientation we need to choose the reference frame that we want to describe
the orientation against and then specify the rotation from that frame using some
representation method and we have several different ways to represent a
rotation and perhaps the easiest to visualize and understand at first is the
idea of roll pitch and yaw and this representation works great in some
situations however it has some widely-known drawbacks in others so we
have other ways to define rotations for different situations things like
direction cosine matrix and the quaternion now the important thing for
this discussion is not what a quaternion is or how it
is formulated but rather just to understand that these groups of numbers
all represent a three dimensional rotation between two different
coordinate frames the object's own coordinate frame that is fixed to the
body and rotates with it and some external coordinate frame and it's this
rotation or these sets of numbers that were trying to estimate by measuring
some quantity with sensors so let's get to our specific problem let's say we
want to know the orientation of a phone that's sitting on a table so the phone's
body coordinate frame relative to the local northeast and down coordinate
frame we can find the absolute orientation using just a magnetometer
and an accelerometer now a little later on we're gonna add a gyro to improve
accuracy and correct for problems that occur when the system is moving but for
now we'll just stick with these two sensors simply speaking we could measure
the phone's acceleration which would just be due to gravity since it's
sitting stationary on the table and we would know that that direction is up the
direction opposite the direction of gravity and then we can measure the
magnetic field in the body frame to determine north but here's something to
think about the mag field points north but it also points up or down depending
on the hemisphere you're in and it's not just a little bit in north america the
field lines are angled around sixty to eighty degrees down which means it's
mostly in the gravity direction now the reason a compass points north and not
down is that the needle is constrained to rotate within a 2d plane however our
mag sensor has no such constraint so it's going to return a vector that's
also in the direction of gravity so to get north we need to do some cross
products we can start with our measured mag and
accel vectors in the body frame down is the opposite direction of the
acceleration vector and then East is the cross-product of down and the magnetic
field and finally north is the cross-product of East and down so the
orientation of the body is simply the rotation between the body frame and the
Northeast down frame and I can build the direction cosine matrix directly from
the north east and down vectors that I just calculated let's go check out an
implementation of this fusion algorithm I have a physical IMU it's the MPU-9250
and it has an accelerometer magnetometer and gyro although for now we're not
going to use the gyro I've connected it to an Arduino through I2C which
is then connected to MATLAB through USB I've pretty much just followed along
with this example from the MathWorks website which provides some of the
functions that I'm using and I've linked to below if you want to do the same but
let me show you my simple script first I connect to the Arduino and the IMU and
I'm using a MATLAB viewer to visualize the orientation and I update the viewer
each time I read the sensors this is a built-in function with the sensor fusion
and tracking toolbox the small amount of math here is basically reading the
sensors performing the cross products and building the DCM and that's pretty
much the whole of it so if I run this we can watch the
algorithm in action
notice that when it's sitting on the table it does a pretty good job of
finding down it's in the positive x-axis and if I rotate it to another
orientation you can see that it follows pretty well with my physical movements
so overall pretty easy and straightforward right
well there are a few problems with this simple implementation I want to
highlight two of them the first is that accelerometers aren't just measuring
gravity they measure all linear accelerations so if the system is moving
around a lot it's gonna throw off the estimate of where down is you can see
here that I'm not really rotating the sensor much but the viewer is jumping
all over the place and this might not be much of a problem if your system is
largely not accelerating like a plane while it's cruising at altitude or a
phone that's sitting on a table but linear accelerations aren't the only
problem even rotations can throw off the estimate because an accelerometer that's
not located at the center of rotation will sense an acceleration when the
system rotates so we have to figure out a way to deal with these corruptions the
second problem is that magnetometers are affected by disturbances in the magnetic
field obviously you can see that if I get a magnet near the IMU the estimate
is corrupted so what can we do about these two problems well let's start with
this magnetometer problem if the magnetic disturbance is part of the
system and rotates with the magnetometer then it can be calibrated out these are
the so-called hard iron and soft iron sources a hard iron source is something
that generates its own magnetic field this would be an actual magnet like the
ones in an electric motor or it could be a coil that has a current running
through it from the electronics themselves and if you tried to measure
an external magnetic field a hard iron source near the magnetometer would
contribute to the measurement and if we rotate the system around a single axis
and measure the magnetic field the result would be a circle that is offset
from the origin so your magnetometer would read a larger intensity in one
direction and a smaller intensity in the opposite direction a soft iron source is
something that doesn't generate its own magnetic field
but is what you would call magnetic you know like a nail that is attracted to a
magnet or the metallic structure of your system this type of metal will bend the
magnetic field as it passes through and around it and the amount of bending
changes as that metal rotates so a soft iron source that rotates with the
magnetometer would distort the measurement creating an oval rather than
a circle so even if you had a perfect noiseless magnetometer it's gonna still
return an incorrect measurement simply because of the hard and soft iron
sources that are near it and your phone and pretty much all systems have both of
those so let's talk about what we can do with calibration if the system had no
hard or soft iron sources and you rotated the magnetometer all around in
4 pi steradians worth of directions then the magnetic field vector would trace out a
perfect sphere with the radius being the magnitude of the field now a hard iron
source would offset the sphere and a soft iron source would distort it into
some odd shaped spheroid if we can measure this distortion ahead of time we
could calibrate the magnetometer by finding the offset and transformation
matrix that would convert it back into a perfect sphere centered at the origin
this transformation matrix and bias would then be applied to each
measurement essentially removing the hard and soft iron sources this is
exactly what your phone does when it asks you to spin it around in all
directions before using the compass here I'm demonstrating this by calibrating my
IMU using the MATLAB function magcal I'm collecting a bunch of measurements
in different orientations and then finding the calibration coefficients
that will fit them to an ideal sphere and now that I have an A matrix that
will correct for soft iron sources and a b vector that will remove hard iron bias
I can add a calibration step to the fusion algorithm that I showed you
previously and this will produce a more accurate result than what I had before
all right now let's go back to solving the other problem of the corrupting
linear accelerations and one way to address this is by predicting linear
acceleration and removing it from the measurement prior to using it and this
might sound difficult to do but it is possible if the acceleration is the
result of the system actuators you know rather than an unpredictable external
disturbance what we can do is take the commands that are sent to the actuators
and play it through a model of the system to estimate the expected linear
acceleration and then subtract that value from the measurement this is
something that is possible if say your system is a drone and you're flying it
around by commanding the four propellers now if we can't predict the linear
acceleration or the external disturbances are too high another option
is to ignore accelerometer readings that are outside of some threshold from a 1g
measurement if the magnitude of the reading is not close to the magnitude of
gravity then clearly the sensor is picking up on other movement and it
can't be trusted this keeps corrupted measurements from getting into our
fusion algorithm but it's not a great solution because we stopped estimating
orientation during these times and we lose track of the state of the system
again it's not really a problem if we're trying to estimate orientation for a
static object this algorithm would work perfectly fine however often we want to
know the orientation of something that is rotating and accelerating so we need
something else here to help us out what we can do is add a gyro into the mix to
measure the angular rate of the system in fact the combination of magnetometer
accelerometer and gyro are so popular that they're often packaged together as
an inertial measurement unit like I have with my MPU-9250 so the question is
how does the gyro help to start I think it's useful to think about how we can
estimate orientation for a rotating object with just the gyro on its own no
accelerometer and no magnetometer for this we can multiply the angular rate
measurement by the sample time to get the change in angle during that time and
then if we knew the orientation of the phone at the previous sample time we
could just add this Delta angle to it and have an updated estimate of the
current orientation and if the object isn't rotating
then the Delta angle would be zero and the orientation wouldn't change so it
all works out and by repeating this process for the next sample time and the
one after that we're going to know the orientation of the phone over time this
process is called dead reckoning and essentially it's just integrating the
gyro measurement now there are downsides to dead reckoning one you still need to
know the initial orientation before you begin so we need to figure that out and
two sensors aren't perfect they have bias and other high frequency
noises that will corrupt our estimation now integration acts like a low-pass
filter so that high frequency noise is smoothed out a little bit which is good
but the result drifts away from the true position due to random walk as well as
integrating any bias and the measurements so over time the
orientation will smoothly drift away from the truth
so at this point we have two different ways to estimate orientation one using
the accelerometer and the magnetometer and the other using just the gyro and
each has its own respective benefits and problems and this is where
sensor fusion comes in once again we can use it to combine these two estimates in
a way that emphasizes each of their strengths and minimizes their weaknesses
now there's a number of sensor fusion algorithms that we can use like a
complementary filter or a Kalman filter or the more specialized Madgwick
or Mahony filters but at their core every one of them does essentially the
same thing they initialize the attitude either by setting manually or using the
initial results of the mag and accelerometer and then over time they
use the direction of the mag field and gravity to slowly correct for the drift
in the gyro now I go into a lot more detail on this in my video on the
complementary filter and MathWorks has a series on the mechanics of the Kalman
filter both are linked below but just in case you don't go and watch them right
away let me go over a really high-level concept of how this blending works let's
put our two solutions at opposite ends of a scale that represents our trust in
each one and we can place a slider that specifies which
solution we trust more if the slider is all the way to the left then we trust
our mag and accel solution 100% and we just use that value for our orientation
all the way to the right and we use the dead reckoning solution 100% when the
slider is in between this is saying that we trust both solutions some amount and
therefore want to take a portion of one and add it to the complementary portion
of the other by putting the slider almost entirely to the dead reckoning
solution we're mostly trusting the smoothness and quick updates of the
integrated gyro measurements which gives us good estimates during rotations and
linear accelerations but we are ever so gently correcting that solution back
towards the absolute measurement of the mag and accel to remove the bias before
it has a chance to grow too large so these two approaches complement each
other now for the complementary filter you as the designer figure out manually
where to place this slider how much you trust one measurement over the other but
with a Kalman filter the optimal gain or the optimal position of the slider is
calculated for you after you specify things like how much noise there is in
the measurements and how good you think your system model is so the bottom line
is that we're doing some kind of fancy averaging between the two solutions
based on how much trust we have in each of them now if you want to practice this
yourself the MATLAB tutorial I used earlier goes through a Kalman filter
approach using the function ahrsfilter and that's where I'm going to leave this
video in the next video we're gonna take this one step further and add GPS and
show how our IMU and orientation estimate can help us improve the
position that we get from the GPS sensor so if you don't want to miss that or
other future Tech Talk videos don't forget to subscribe to this channel also
if you want to check out my channel control system lectures I cover more
control theory topics there as well I'll see you next time