Understanding Sensor Fusion and Tracking, Part 2: Fusing a Mag, Accel, & Gyro Estimate

MATLAB
22 Oct 2019 · 16:00

Summary

TL;DR: In this video, Brian from MATLAB Tech Talk explains how sensor fusion can be used to estimate an object's orientation, also known as attitude or heading. He covers the roles of magnetometers, accelerometers, and gyros and the challenges each one brings. The video discusses common issues such as linear accelerations corrupting the accelerometer and magnetic disturbances corrupting the magnetometer, and introduces fusion algorithms like the complementary and Kalman filters to improve accuracy. Practical demonstrations using an Arduino and MATLAB illustrate these concepts in action.

Takeaways

  • 📐 Sensor fusion is used to estimate an object's orientation by combining data from multiple sensors.
  • 🌐 Orientation can also be referred to as attitude or heading, depending on the context.
  • 📱 Common sensors for orientation estimation in phones and autonomous systems include a magnetometer, accelerometer, and gyro.
  • 🛩️ Examples of orientation estimation include satellites using star trackers and airplanes using angle of attack sensors.
  • 🔄 Orientation can be represented using roll, pitch, and yaw, or more complex methods like Direction Cosine Matrix (DCM) and quaternions.
  • 📏 The goal is to determine the rotation between an object's coordinate frame and an external reference frame.
  • 🔧 For a phone on a table, orientation can be estimated using a magnetometer and accelerometer, with cross-products used to calculate north, east, and down vectors.
  • 🧲 Magnetometers can be affected by magnetic disturbances, which can be calibrated out if they rotate with the magnetometer.
  • 📊 Linear accelerations and rotations can corrupt accelerometer data, which can be mitigated by predicting and removing linear acceleration or using a gyro.
  • 🤖 Combining sensor data using algorithms like complementary filter or Kalman filter can provide more accurate orientation estimates by balancing the strengths and weaknesses of each sensor.

Q & A

  • What is sensor fusion and how is it used to estimate an object's orientation?

    -Sensor fusion is the process of combining data from multiple sensors to estimate an object's orientation, also known as attitude or heading. It uses various sensors like magnetometers, accelerometers, and gyros to determine how an object is facing relative to a reference frame.

  • What are some alternative names for orientation in different contexts?

    -In different contexts, orientation can be referred to as attitude or heading. Heading specifically refers to the direction along a 2D plane.

  • What is an attitude and heading reference system (AHRS)?

    -An Attitude and Heading Reference System (AHRS) is a system that uses sensor fusion algorithms to determine an object's orientation or attitude relative to an inertial reference frame, which is often the stars for satellites or the local horizon for aircraft.

  • What are the three main sensors typically used in a sensor fusion system for orientation estimation?

    -The three main sensors typically used in a sensor fusion system for orientation estimation are a magnetometer, an accelerometer, and a gyroscope.

  • How can a magnetometer and accelerometer be used to determine the absolute orientation of a stationary phone on a table?

    -A magnetometer and accelerometer can determine the absolute orientation of a stationary phone by measuring the direction of gravity and the magnetic field in the body frame. The 'down' vector is the opposite of the measured acceleration, 'east' is the cross-product of down and the magnetic field, and 'north' is the cross-product of east and down. Together, the north, east, and down vectors form the basis for the orientation.

  • What is the issue with using a magnetometer for orientation estimation?

    -The magnetometer measures not only the Earth's magnetic field but also any nearby magnetic disturbances, such as magnets or the metallic structure of the system. These disturbances can corrupt the orientation estimate unless they are calibrated out.

  • What are hard iron and soft iron sources in the context of magnetometer calibration?

    -Hard iron sources are objects that generate their own magnetic fields, like magnets in an electric motor or coils with current. Soft iron sources are magnetic materials that distort the magnetic field as it passes through them. Both types can affect the magnetometer readings and need to be calibrated out.

  • How can linear accelerations affect the accuracy of an accelerometer's measurement of gravity?

    -Linear accelerations can affect the accuracy of an accelerometer's measurement of gravity because the accelerometer measures all linear accelerations, not just gravity. If the system is moving or accelerating, the accelerometer will sense this acceleration in addition to gravity, which can throw off the estimate of 'down'.

  • What is the concept of dead reckoning in the context of gyroscope-based orientation estimation?

    -Dead reckoning is the process of estimating the orientation of an object by integrating the angular rate measurements from a gyroscope over time. It assumes that the initial orientation is known and updates the orientation based on the change in angle due to the angular rate.

  • How does sensor fusion help to address the limitations of using a gyroscope alone for orientation estimation?

    -Sensor fusion helps to address the limitations of using a gyroscope alone by combining the gyroscope's measurements with those from other sensors like the magnetometer and accelerometer. This combination can correct for the drift that occurs in the gyroscope's estimation due to sensor bias and noise, providing a more accurate and stable orientation estimate.

  • What are some common sensor fusion algorithms mentioned in the script?

    -Some common sensor fusion algorithms mentioned in the script include the complementary filter, the Kalman filter, and the more specialized Madgwick and Mahony filters.

  • How does the complementary filter work in blending the estimates from the magnetometer-accelerometer and the gyroscope?

    -The complementary filter works by placing a 'slider' at a position on a scale that represents trust in each solution. The designer manually decides the position of the slider, which determines the weighting of the magnetometer-accelerometer solution versus the gyroscope's dead reckoning solution. This blending emphasizes the strengths of each while minimizing their weaknesses.

  • What is the purpose of the Kalman filter in sensor fusion?

    -The Kalman filter automatically calculates the optimal gain, or position of the trust slider, based on the noise levels in the measurements and the accuracy of the system model. It performs a kind of weighted averaging between the two solutions to provide a more accurate orientation estimate.

  • How does the integration of gyroscope measurements lead to drift in orientation estimation over time?

    -The integration of gyroscope measurements, which acts like a low-pass filter, smooths out high-frequency noise but also accumulates errors over time due to sensor bias and random walk. This results in a gradual drift of the estimated orientation away from the true value.

Outlines

00:00

🔍 Introduction to Sensor Fusion and Orientation Estimation

This video discusses how sensor fusion is used to estimate an object's orientation, also known as attitude or heading. The focus is on using a magnetometer, accelerometer, and gyro, common in modern phones and autonomous systems, to conceptualize the system and explain the contribution of each sensor. The video aims to provide a basic understanding rather than a complete inertial measurement system.

05:01

📐 Defining and Representing Orientation

Orientation describes how far an object is rotated from a reference frame. Ways to represent it include roll, pitch, and yaw, the direction cosine matrix (DCM), and the quaternion. These methods describe the rotation between an object's coordinate frame and an external coordinate frame. The video explains the importance of choosing the right reference frame and representation method.

10:04

📱 Using Magnetometer and Accelerometer for Orientation

To determine a phone's orientation, the video explains using a magnetometer and accelerometer. The accelerometer measures gravity to find the 'down' direction, while the magnetometer measures the Earth's magnetic field. Because the field also dips up or down depending on the hemisphere, cross-products are needed to isolate the horizontal north direction. The video demonstrates building a DCM from the resulting north, east, and down vectors.
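
As a rough illustration of this cross-product construction (not code from the video; the sensor readings below are made-up body-frame values), the north-east-down basis and the DCM could be assembled in MATLAB roughly like this:

    % Hypothetical body-frame readings for a stationary device:
    % the accelerometer reports ~1 g opposing gravity, the magnetometer
    % reports a field that points north-ish and dips steeply downward.
    accelReading = [0.02 -0.01 9.80];   % m/s^2
    magReading   = [18.0 -4.0 -45.0];   % microtesla

    % Down is opposite the measured acceleration vector.
    down = -accelReading / norm(accelReading);

    % East is perpendicular to both down and the magnetic field.
    east = cross(down, magReading);
    east = east / norm(east);

    % North completes the right-handed north-east-down triad.
    north = cross(east, down);

    % Rows of the DCM are the NED axes expressed in the body frame,
    % so it rotates body-frame vectors into the local NED frame.
    dcmBodyToNED = [north; east; down];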

15:06

🔧 Implementing and Visualizing the Fusion Algorithm

The video showcases an implementation of the fusion algorithm using an MPU-9250 IMU connected to an Arduino, which streams data to MATLAB. It highlights how the accelerometer and magnetometer readings are used to visualize orientation. Problems with linear accelerations and magnetic disturbances are discussed, along with potential solutions such as calibration and thresholding accelerometer readings.
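
One way to realize the thresholding idea is to gate accelerometer updates whenever the measured magnitude strays too far from 1 g. A minimal sketch, with an arbitrary tolerance and hypothetical sample values:

    % Latest accelerometer sample (hypothetical), m/s^2 in the body frame.
    accelReading = [1.2 0.3 9.6];

    g   = 9.81;   % expected magnitude of gravity
    tol = 0.5;    % arbitrary acceptance band; tune for your system

    if abs(norm(accelReading) - g) < tol
        % Close to a pure-gravity measurement: safe to update "down".
        down = -accelReading / norm(accelReading);
    else
        % Dominated by linear acceleration or rotation effects: skip this
        % update and hold the previous estimate (the drawback noted in the
        % video: no new orientation information while the gate is closed).
    end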

🧲 Calibrating Magnetometers

The video addresses the issue of magnetic disturbances from hard iron and soft iron sources. It explains how to calibrate a magnetometer by collecting measurements in different orientations and fitting them to an ideal sphere. Calibration helps remove biases and distortions, resulting in more accurate measurements.
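
The calibration described here maps onto the Sensor Fusion and Tracking Toolbox function magcal, which fits the logged samples to a sphere and returns a soft-iron correction matrix A, a hard-iron offset b, and the expected field strength. A sketch with synthetic data standing in for the logged MPU-9250 measurements:

    % N-by-3 magnetometer samples gathered while rotating the sensor through
    % many orientations. Here a synthetic ellipsoid stands in for real logs:
    % points on a sphere, squashed (soft iron) and offset (hard iron).
    n    = 500;
    dirs = randn(n, 3);
    dirs = dirs ./ vecnorm(dirs, 2, 2);                % random unit vectors
    rawMag = 47 * dirs * diag([1.2 1.0 0.9]) + [12 -7 30];

    % Fit the samples to an ideal sphere.
    [A, b, expectedFieldStrength] = magcal(rawMag);

    % Apply the correction to each measurement: remove the hard-iron offset,
    % then undo the soft-iron distortion.
    calibratedMag = (rawMag - b) * A;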

🔄 Using Gyros for Orientation Estimation

Gyros measure angular rates and can estimate orientation through dead reckoning. The video explains the integration process and its limitations, such as bias and drift over time. It emphasizes the need to combine gyro data with accelerometer and magnetometer readings for more accurate orientation estimation.
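
To make the drift concrete, here is a toy single-axis dead-reckoning loop (all numbers hypothetical): even with the object perfectly still, a small constant gyro bias integrates into a steadily growing angle error.

    dt   = 0.01;                              % sample time, s (100 Hz)
    t    = 0:dt:10;                           % ten seconds of samples
    bias = 0.02;                              % small constant gyro bias, rad/s
    gyroMeas = bias + 0.01*randn(size(t));    % true angular rate is zero

    % Dead reckoning: integrate the measured rate from a known initial angle.
    angleEst = zeros(size(t));
    for k = 2:numel(t)
        angleEst(k) = angleEst(k-1) + gyroMeas(k)*dt;
    end

    % After 10 s the estimate has drifted by roughly bias*10 = 0.2 rad even
    % though the object never rotated.
    plot(t, angleEst), xlabel('time (s)'), ylabel('estimated angle (rad)')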

🔀 Combining Sensors with Sensor Fusion

Sensor fusion combines the strengths of different sensors to estimate orientation. The video describes complementary and Kalman filters, which blend accelerometer, magnetometer, and gyro data to correct for drift and biases. The approach involves adjusting trust levels between sensors to achieve a balanced and accurate estimation.
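
A one-axis complementary filter makes the trust-slider picture concrete: the gain alpha is the slider position the designer chooses. A minimal sketch with illustrative values (not the video's code):

    alpha = 0.98;    % slider: 1 = trust gyro only, 0 = trust mag/accel only
    dt    = 0.01;    % sample time, s

    angleEst      = 0;      % current fused estimate, rad (assume known start)
    gyroRate      = 0.05;   % latest gyro sample, rad/s (hypothetical)
    angleMagAccel = 0.01;   % absolute angle from the mag/accel solution, rad

    % Mostly follow the integrated gyro, but gently pull the result toward
    % the absolute mag/accel measurement so gyro drift cannot grow unbounded.
    angleEst = alpha*(angleEst + gyroRate*dt) + (1 - alpha)*angleMagAccel;

Sliding alpha toward 1 leans on the smooth, fast gyro integration; sliding it toward 0 leans on the absolute but noisier mag/accel solution.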

🔧 Practical Sensor Fusion Implementation

The video concludes with a high-level overview of sensor fusion implementation using a Kalman filter. It suggests using MATLAB's ahrsfilter function for practical exercises. The next video will cover integrating GPS data with IMU for enhanced position estimation. Viewers are encouraged to subscribe for more tutorials and check out additional control theory topics on the presenter's channel.
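
For reference, a hedged sketch of how the ahrsfilter object from the Sensor Fusion and Tracking Toolbox is typically called; the synthetic stationary readings below only stand in for logged IMU data, and sign conventions depend on how the sensor is mounted:

    Fs = 100;                                % sample rate, Hz
    N  = 200;                                % two seconds of samples
    fuse = ahrsfilter('SampleRate', Fs);

    % Synthetic stationary readings (hypothetical values).
    accelData = repmat([0 0 9.81], N, 1);    % m/s^2
    gyroData  = zeros(N, 3);                 % rad/s
    magData   = repmat([19 0 -45], N, 1);    % microtesla

    % Fuse the logs into quaternion orientation estimates.
    [orientation, angularVelocity] = fuse(accelData, gyroData, magData);

    % Inspect the final estimate as ZYX Euler angles in degrees.
    eulerd(orientation(end), 'ZYX', 'frame')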

Keywords

💡Sensor Fusion

Sensor fusion involves combining data from multiple sensors to estimate an object's orientation. It is crucial in applications where accurate orientation data is needed, such as in smartphones and autonomous systems. In the video, sensor fusion is used to integrate data from a magnetometer, accelerometer, and gyro to improve orientation accuracy.

💡Orientation

Orientation refers to the direction an object is facing relative to a reference frame. It is also known as attitude or heading, especially when describing direction along a 2D plane. The video focuses on estimating orientation using various sensors, highlighting its importance in describing how an object is rotated relative to its surroundings.

💡Magnetometer

A magnetometer measures magnetic fields and is used to determine an object's orientation relative to Earth's magnetic field. In the video, the magnetometer helps find the north direction, which is essential for calculating the phone's orientation on a table.

💡Accelerometer

An accelerometer measures acceleration forces, which can include gravity. It is used in the video to determine the direction opposite to gravity, helping to find the 'down' direction for orientation estimation. The accelerometer's role is crucial in static situations but can be corrupted by linear accelerations.

💡Gyro

A gyro measures angular rate or rotational speed. It is used in the video to help estimate orientation by integrating angular rate measurements over time. The gyro is introduced to improve accuracy and handle movements that affect the accelerometer and magnetometer readings.

💡Dead Reckoning

Dead reckoning is a method of estimating orientation by integrating angular rate measurements from a gyro over time. In the video, dead reckoning is described as a process where the orientation is updated based on previous measurements and current angular rates, though it can suffer from drift over time.

💡Direction Cosine Matrix (DCM)

A DCM is a matrix used to represent the rotation between two coordinate frames. In the video, the DCM is built from north, east, and down vectors to represent the phone's orientation relative to the local frame. It is a key element in sensor fusion algorithms.

💡Hard Iron Source

A hard iron source generates its own magnetic field and can cause distortions in magnetometer readings. In the video, examples include magnets in electric motors and current-carrying coils. Calibration is needed to correct for these disturbances.

💡Soft Iron Source

A soft iron source does not generate its own magnetic field but can distort the magnetic field around it. In the video, metallic structures of a system can bend the magnetic field, creating distortions that need to be calibrated out to ensure accurate magnetometer readings.

💡Kalman Filter

A Kalman filter is an advanced sensor fusion algorithm that optimally combines measurements from multiple sensors to estimate the state of a system. In the video, the Kalman filter is mentioned as a method to blend the estimates from the accelerometer, magnetometer, and gyro, adjusting the trust in each sensor based on noise and system model accuracy.

Highlights

The video discusses sensor fusion for estimating an object's orientation, also known as attitude or heading.

Orientation is determined relative to a reference frame using various sensors like star trackers and angle of attack sensors.

The focus is on using a magnetometer, accelerometer, and gyro found in modern phones and autonomous systems.

The video aims to conceptualize the sensor fusion system without developing a full inertial measurement system.

Different methods to represent rotation, such as roll, pitch, yaw, and quaternion, are mentioned.

The orientation of a stationary phone can be estimated using a magnetometer and accelerometer.

The magnetic field has a large component along the gravity direction (the field lines dip steeply toward the ground), so cross-products are needed to isolate the north direction.

An implementation example using an IMU (MPU-9250) connected through an Arduino to MATLAB is demonstrated.

The script reads the sensors, performs the cross products, builds the direction cosine matrix, and updates a MATLAB viewer to visualize the orientation.

Problems with simple implementations, such as the effect of linear accelerations on the accelerometer's readings, are highlighted.

Magnetometer readings can be corrupted by magnetic disturbances, which can be mitigated through calibration.

Hard and soft iron sources affect magnetometer readings and can be calibrated out if they rotate with the system.

Calibration involves measuring distortion and applying a transformation matrix to correct measurements.

Dead reckoning, using gyro measurements, is introduced as a method to estimate orientation for rotating objects.

Integration of gyro measurements can lead to drift due to sensor bias and noise.

Sensor fusion algorithms, such as complementary and Kalman filters, are used to combine estimates from different sensors.

The video concludes with a teaser for the next video, which will include GPS and its integration with IMU for improved positioning.

Transcripts

play00:00

in this video we're gonna talk about how we can use sensor fusion to estimate an

play00:04

object's orientation now you may call orientation by other names like attitude

play00:09

or maybe heading if you're just talking about direction along a 2d plane this is

play00:14

why the fusion algorithm can also be referred to as an attitude and heading

play00:18

reference system but it's all the same thing we want to figure out which way an

play00:23

object is facing relative to some reference and we can use a number of

play00:27

different sensors to do this for example satellites can use star trackers to

play00:31

estimate attitude relative to the inertial star field whereas an airplane

play00:35

could use an angle of attack sensor to measure orientation of the wing relative

play00:39

to the incoming free air stream now in this video we're gonna focus on using a

play00:45

very popular set of sensors that you're gonna find in every modern phone and a

play00:48

wide variety of autonomous systems a magnetometer and accelerometer and a

play00:53

gyro and the goal of this video is not to develop a fully fleshed-out inertial

play00:57

measurement system there's just too much to cover to really do a thorough job

play01:01

instead I want to conceptually build up the system and explain what each sensor

play01:06

is bringing to the table and a few things to watch out for along the way

play01:09

I'll also call out some other really good resources that I've linked to below

play01:13

where you can dive into more of the details so let's get to it

play01:17

I'm Brian and welcome to a MATLAB Tech Talk when we're talking about

play01:23

orientation we're really describing how far an object is rotated away from some

play01:27

known reference frame for example the pitch of an aircraft is how far the

play01:31

longitudinal axis is rotated off of the local horizon so in order to define an

play01:36

orientation we need to choose the reference frame that we want to describe

play01:39

the orientation against and then specify the rotation from that frame using some

play01:44

representation method and we have several different ways to represent a

play01:47

rotation and perhaps the easiest to visualize and understand at first is the

play01:51

idea of roll pitch and yaw and this representation works great in some

play01:55

situations however it has some widely-known drawbacks in others so we

play02:00

have other ways to define rotations for different situations things like

play02:05

direction cosine matrix and the quaternion now the important thing for

play02:09

this discussion is not what a quaternion is or how it

play02:13

is formulated but rather just to understand that these groups of numbers

play02:17

all represent a three dimensional rotation between two different

play02:21

coordinate frames the object's own coordinate frame that is fixed to the

play02:25

body and rotates with it and some external coordinate frame and it's this

play02:30

rotation or these sets of numbers that we're trying to estimate by measuring

play02:34

some quantity with sensors so let's get to our specific problem let's say we

play02:41

want to know the orientation of a phone that's sitting on a table so the phone's

play02:45

body coordinate frame relative to the local north, east, and down coordinate

play02:48

frame we can find the absolute orientation using just a magnetometer

play02:52

and an accelerometer now a little later on we're gonna add a gyro to improve

play02:56

accuracy and correct for problems that occur when the system is moving but for

play03:01

now we'll just stick with these two sensors simply speaking we could measure

play03:06

the phone's acceleration which would just be due to gravity since it's

play03:09

sitting stationary on the table and we would know that that direction is up the

play03:14

direction opposite the direction of gravity and then we can measure the

play03:17

magnetic field in the body frame to determine north but here's something to

play03:23

think about the mag field points north but it also points up or down depending

play03:28

on the hemisphere you're in and it's not just a little bit in north america the

play03:32

field lines are angled around sixty to eighty degrees down which means it's

play03:36

mostly in the gravity direction now the reason a compass points north and not

play03:41

down is that the needle is constrained to rotate within a 2d plane however our

play03:46

mag sensor has no such constraint so it's going to return a vector that's

play03:50

also in the direction of gravity so to get north we need to do some cross

play03:55

products we can start with our measured mag and

play03:59

accel vectors in the body frame down is the opposite direction of the

play04:04

acceleration vector and then East is the cross-product of down and the magnetic

play04:09

field and finally north is the cross-product of East and down so the

play04:15

orientation of the body is simply the rotation between the body frame and the

play04:19

Northeast down frame and I can build the direction cosine matrix directly from

play04:23

the north east and down vectors that I just calculated let's go check out an

play04:28

implementation of this fusion algorithm I have a physical IMU it's the MPU-9250

play04:34

and it has an accelerometer magnetometer and gyro although for now we're not

play04:38

going to use the gyro I've connected it to an Arduino through I2C which

play04:43

is then connected to MATLAB through USB I've pretty much just followed along

play04:48

with this example from the MathWorks website which provides some of the

play04:51

functions that I'm using and I've linked to below if you want to do the same but

play04:56

let me show you my simple script first I connect to the Arduino and the IMU and

play05:00

I'm using a MATLAB viewer to visualize the orientation and I update the viewer

play05:04

each time I read the sensors this is a built-in function with the sensor fusion

play05:08

and tracking toolbox the small amount of math here is basically reading the

play05:13

sensors performing the cross products and building the DCM and that's pretty

play05:18

much the whole of it so if I run this we can watch the

play05:22

algorithm in action

play05:25

notice that when it's sitting on the table it does a pretty good job of

play05:28

finding down it's in the positive x-axis and if I rotate it to another

play05:33

orientation you can see that it follows pretty well with my physical movements

play05:37

so overall pretty easy and straightforward right

play05:41

well there are a few problems with this simple implementation I want to

play05:45

highlight two of them the first is that accelerometers aren't just measuring

play05:49

gravity they measure all linear accelerations so if the system is moving

play05:54

around a lot it's gonna throw off the estimate of where down is you can see

play05:58

here that I'm not really rotating the sensor much but the viewer is jumping

play06:01

all over the place and this might not be much of a problem if your system is

play06:05

largely not accelerating like a plane while it's cruising at altitude or a

play06:09

phone that's sitting on a table but linear accelerations aren't the only

play06:14

problem even rotations can throw off the estimate because an accelerometer that's

play06:18

not located at the center of rotation will sense an acceleration when the

play06:22

system rotates so we have to figure out a way to deal with these corruptions the

play06:28

second problem is that magnetometers are affected by disturbances in the magnetic

play06:32

field obviously you can see that if I get a magnet near the IMU the estimate

play06:37

is corrupted so what can we do about these two problems well let's start with

play06:42

this magnetometer problem if the magnetic disturbance is part of the

play06:46

system and rotates with the magnetometer then it can be calibrated out these are

play06:51

the so-called hard iron and soft iron sources a hard iron source is something

play06:57

that generates its own magnetic field this would be an actual magnet like the

play07:01

ones in an electric motor or it could be a coil that has a current running

play07:05

through it from the electronics themselves and if you tried to measure

play07:08

an external magnetic field a hard iron source near the magnetometer would

play07:12

contribute to the measurement and if we rotate the system around a single axis

play07:17

and measure the magnetic field the result would be a circle that is offset

play07:21

from the origin so your magnetometer would read a larger intensity in one

play07:25

direction and a smaller intensity in the opposite direction a soft iron source is

play07:31

something that doesn't generate its own magnetic field

play07:33

but is what you would call magnetic you know like a nail that is attracted to a

play07:38

magnet or the metallic structure of your system this type of metal will bend the

play07:43

magnetic field as it passes through and around it and the amount of bending

play07:47

changes as that metal rotates so a soft iron source that rotates with the

play07:52

magnetometer would distort the measurement creating an oval rather than

play07:55

a circle so even if you had a perfect noiseless magnetometer it's gonna still

play08:01

return an incorrect measurement simply because of the hard and soft iron

play08:04

sources that are near it and your phone and pretty much all systems have both of

play08:09

those so let's talk about what we can do with calibration if the system had no

play08:16

hard or soft iron sources and you rotated the magnetometer all around in

play08:20

4 pi steradian directions then the magnetic field vector would trace out a

play08:24

perfect sphere with the radius being the magnitude of the field now a hard iron

play08:30

source would offset the sphere and a soft iron source would distort it into

play08:34

some odd shaped spheroid if we can measure this distortion ahead of time we

play08:40

could calibrate the magnetometer by finding the offset and transformation

play08:44

matrix that would convert it back into a perfect sphere centered at the origin

play08:48

this transformation matrix and bias would then be applied to each

play08:52

measurement essentially removing the hard and soft iron sources this is

play08:57

exactly what your phone does when it asks you to spin it around in all

play09:01

directions before using the compass here I'm demonstrating this by calibrating my

play09:06

IMU using the MATLAB function magcal I'm collecting a bunch of measurements

play09:10

in different orientations and then finding the calibration coefficients

play09:14

that will fit them to an ideal sphere and now that I have an A matrix that

play09:19

will correct for soft iron sources and a B matrix that will remove hard iron bias

play09:24

I can add a calibration step to the fusion algorithm that I showed you

play09:27

previously and this will produce a more accurate result than what I had before

play09:34

all right now let's go back to solving the other problem of the corrupting

play09:39

linear accelerations and one way to address this is by predicting linear

play09:43

acceleration and removing it from the measurement prior to using it and this

play09:47

might sound difficult to do but it is possible if the acceleration is the

play09:51

result of the system actuators you know rather than an unpredictable external

play09:56

disturbance what we can do is take the commands that are sent to the actuators

play09:59

and play it through a model of the system to estimate the expected linear

play10:04

acceleration and then subtract that value from the measurement this is

play10:08

something that is possible if say your system is a drone and you're flying it

play10:12

around by commanding the four propellers now if we can't predict the linear

play10:16

acceleration or the external disturbances are too high another option

play10:21

is to ignore accelerometer readings that are outside of some threshold from a 1g

play10:26

measurement if the magnitude of the reading is not close to the magnitude of

play10:29

gravity then clearly the sensor is picking up on other movement and it

play10:33

can't be trusted this keeps corrupted measurements from getting into our

play10:37

fusion algorithm but it's not a great solution because we stopped estimating

play10:41

orientation during these times and we lose track of the state of the system

play10:45

again it's not really a problem if we're trying to estimate orientation for a

play10:49

static object this algorithm would work perfectly fine however often we want to

play10:53

know the orientation of something that is rotating and accelerating so we need

play10:57

something else here to help us out what we can do is add a gyro into the mix to

play11:03

measure the angular rate of the system in fact the combination of magnetometer

play11:08

accelerometer and gyro are so popular that they're often packaged together as

play11:12

an inertial measurement unit like I have with my MPU-9250 so the question is

play11:18

how does the gyro help to start I think it's useful to think about how we can

play11:23

estimate orientation for a rotating object with just the gyro on its own no

play11:28

accelerometer and no magnetometer for this we can multiply the angular rate

play11:33

measurement by the sample time to get the change in angle during that time and

play11:38

then if we knew the orientation of the phone at the previous sample time we

play11:42

could just add this Delta angle to it and have an updated estimate of the

play11:45

current orientation and if the object isn't rotating

play11:48

then the Delta angle would be zero and the orientation wouldn't change so it

play11:52

all works out and by repeating this process for the next sample time and the

play11:57

one after that we're going to know the orientation of the phone over time this

play12:01

process is called dead reckoning and essentially it's just integrating the

play12:05

gyro measurement now there are downsides to dead reckoning one you still need to

play12:11

know the initial orientation before you begin so we need to figure that out and

play12:15

two, sensors aren't perfect they have bias and other high frequency

play12:20

noises that will corrupt our estimation now integration acts like a low-pass

play12:25

filter so that high frequency noise is smoothed out a little bit which is good

play12:29

but the result drifts away from the true position due to random walk as well as

play12:35

integrating any bias in the measurements so over time the

play12:38

orientation will smoothly drift away from the truth

play12:42

so at this point we have two different ways to estimate orientation one using

play12:48

the accelerometer and the magnetometer and the other using just the gyro and

play12:52

each has its own respective benefits and problems and this is where

play12:57

sensor fusion comes in once again we can use it to combine these two estimates in

play13:03

a way that emphasizes each of their strengths and minimizes their weaknesses

play13:07

now there's a number of sensor fusion algorithms that we can use like a

play13:12

complementary filter or a Kalman filter or the more specialized Madgwick

play13:17

or Mahony filters but at their core every one of them does essentially the

play13:22

same thing they initialize the attitude either by setting manually or using the

play13:27

initial results of the mag and accelerometer and then over time they

play13:32

use the direction of the mag field and gravity to slowly correct for the drift

play13:36

in the gyro now I go into a lot more detail on this in my video on the

play13:41

complementary filter and MathWorks has a series on the mechanics of the Kalman

play13:45

filter both are linked below but just in case you don't go and watch them right

play13:50

away let me go over a really high-level concept of how this blending works let's

play13:55

put our two solutions at opposite ends of a scale that represents our trust in

play14:00

each one and we can place a slider that specifies which

play14:04

solution we trust more if the slider is all the way to the left then we trust

play14:09

our mag and accel solution 100% and we just use that value for our orientation

play14:14

all the way to the right and we use the dead reckoning solution 100% when the

play14:20

slider is in between this is saying that we trust both solutions some amount and

play14:25

therefore want to take a portion of one and add it to the complimentary portion

play14:29

of the other by putting the slider almost entirely to the dead reckoning

play14:33

solution were mostly trusting the smoothness and quick updates of the

play14:37

integrated gyro measurements which gives us good estimates during rotations and

play14:41

linear accelerations but we are ever so gently correcting that solution back

play14:46

towards the absolute measurement of the mag and accel to remove the bias before

play14:51

it has a chance to grow too large so these two approaches complement each

play14:55

other now for the complementary filter you as the designer figure out manually

play15:01

where to place this slider how much you trust one measurement over the other but

play15:05

with a Kalman filter the optimal gain or the optimal position of the slider is

play15:10

calculated for you after you specify things like how much noise there is in

play15:14

the measurements and how good you think your system model is so the bottom line

play15:18

is that we're doing some kind of fancy averaging between the two solutions

play15:22

based on how much trust we have in each of them now if you want to practice this

play15:27

yourself the MATLAB tutorial I used earlier goes through a Kalman filter

play15:30

approach using the function ahrsfilter and that's where I'm going to leave this

play15:36

video in the next video we're gonna take this one step further and add GPS and

play15:41

show how our IMU and orientation estimate can help us improve the

play15:45

position that we get from the GPS sensor so if you don't want to miss that or

play15:49

other future Tech Talk videos don't forget to subscribe to this channel also

play15:52

if you want to check out my channel control system lectures I cover more

play15:56

control theory topics there as well I'll see you next time


Related Tags
Sensor Fusion, Orientation Estimation, Attitude Heading, Inertial Measurement, Magnetometer, Accelerometer, Gyroscope, IMU, Dead Reckoning, MATLAB Tech Talk