Understanding Sensor Fusion and Tracking, Part 3: Fusing a GPS and IMU to Estimate Pose

MATLAB
23 Oct 2019 · 14:00

Summary

TL;DR: This MATLAB Tech Talk explores sensor fusion for precise positioning and localization. Brian demonstrates how integrating IMU and GPS sensors can enhance object tracking, especially in high-speed scenarios. The video visually compares the performance of a GPS-only system to one augmented with IMU sensors, highlighting the importance of sensor fusion in maintaining accurate position estimates. It delves into the workings of an extended Kalman filter, emphasizing the need for proper initialization and the impact of asynchronous sensor data on the estimation process.

Takeaways

  • 📐 The video discusses the use of sensor fusion for positioning and localization, particularly combining IMU and GPS sensors to estimate an object's orientation, position, and velocity.
  • 🌐 GPS sensors provide absolute measurements for position and velocity, which can be integrated with IMU sensors to correct for drift and improve accuracy in dynamic environments.
  • 🔍 The video's goal is to build an intuitive understanding of how each sensor contributes to the final estimate of an object's state, rather than to give a detailed technical description of the fusion algorithm.
  • 🤖 The video uses MATLAB's sensor fusion and tracking toolbox to demonstrate pose estimation from asynchronous sensors, showing how different sensor configurations affect estimation accuracy.
  • 🔄 The script in the toolbox allows for adjusting sensor sample rates and even removing sensors to observe the impact on the estimation process, providing a hands-on learning experience.
  • 🔢 In scenarios with slow movement and less stringent accuracy requirements, GPS alone may suffice for position estimation, but for high-speed motion or when accuracy is critical, additional sensors like IMU are necessary.
  • ⏱ The video illustrates the limitations of relying solely on GPS for fast-moving objects, where the velocity changes rapidly and the one-second interval between GPS updates can lead to significant errors (see the quick calculation after this list).
  • 🔧 The importance of sensor bias estimation is highlighted, as biases can drift over time and affect the accuracy of the sensor fusion algorithm if not accounted for.
  • 🛠 The fusion algorithm used in the video is a continuous-discrete extended Kalman filter (EKF) that can handle asynchronous sensor measurements, which is beneficial for systems with different sensor sample rates.
  • 🔄 The EKF operates on a predict-correct cycle, using the model of the system's dynamics to predict state changes and then correcting these predictions with actual sensor measurements.
  • 🚀 The video encourages viewers to experiment with the example code, adjusting parameters and observing the effects on the state estimation to deepen their understanding of sensor fusion.
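
For a quick feel for those last takeaways, here is a back-of-the-envelope MATLAB check using the numbers from the video's fast scenario (a 15-meter circle at 12.5 m/s with 1 Hz GPS); the calculation itself is ours, not from the video:

    % Drift of a constant-velocity prediction over one second of circular
    % motion (radius ~15 m, speed 12.5 m/s, as in the video's fast run).
    r = 15;  v = 12.5;                          % m, m/s
    w = v / r;                                  % angular rate, rad/s
    t = 1;                                      % seconds between GPS fixes
    truePos  = r * [sin(w*t), 1 - cos(w*t)];    % where the circle really goes
    deadReck = [v*t, 0];                        % straight-line propagation
    err = norm(truePos - deadReck)              % about 5 m of error in 1 s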

Q & A

  • What is the main topic of the video?

    -The video discusses the use of sensor fusion for positioning and localization, particularly focusing on how GPS and IMU sensors can be combined to estimate an object's orientation, position, and velocity.

  • Why is it necessary to correct the drift from the gyro using absolute measurements from the accelerometer and magnetometer?

    -The gyro can accumulate errors over time, causing a drift in the estimated orientation. Absolute measurements from the accelerometer and magnetometer help correct this drift by providing a reference to the actual orientation with respect to gravity and the Earth's magnetic field.

  • What is the role of GPS in the sensor fusion algorithm discussed in the video?

    -GPS is used to measure the position and velocity of an object. It provides absolute measurements that can be integrated into the fusion algorithm to improve the estimation of the object's state.

  • How does the video demonstrate the impact of different sensor sample rates on the estimation process?

    -The video uses a MATLAB example that allows changing the sample rates of the sensors or removing them from the solution to visually demonstrate how these changes affect the estimation of orientation and position.

  • What is the significance of the IMU sensors in improving the position estimate when the system is moving fast?

    -IMU sensors, which include accelerometers and gyros, provide high-frequency updates that can capture rapid changes in motion. When the system is moving fast, these sensors help in estimating the velocity and rotation more accurately, which is crucial for maintaining an accurate position estimate.

  • Why is sensor bias estimation important in the fusion algorithm?

    -Sensor bias estimation is important because biases can drift over time, affecting the accuracy of the measurements. If not corrected, these biases can cause the estimate to deviate from the true state, especially when relying on the affected sensor for an extended period.

  • What is the purpose of the predict and correct steps in a Kalman filter?

    -The predict step uses the current state estimate and a model of the system dynamics to forecast the state at the next time step. The correct step then updates this prediction with new measurements, weighting the uncertainty in both the prediction and the measurement to produce a more accurate state estimate (a minimal numeric sketch appears after this list).

  • How does the video illustrate the limitations of using GPS alone for fast-moving systems?

    -The video shows an example where, with only GPS data and a slow update rate, the position estimate quickly diverges from the actual path when the system is moving fast and changing directions rapidly.

  • What is the benefit of using an Extended Kalman Filter (EKF) for sensor fusion?

    -An EKF is beneficial for sensor fusion because it can handle nonlinear systems by linearizing the models around the current estimate. This allows for the integration of measurements from different sensors, even when their update rates are asynchronous.

  • How does the video suggest initializing the filter for accurate state estimation?

    -The video suggests initializing the filter close to the true state to ensure accurate linearization. In simulations, this can be done knowing the ground truth, but in real systems, it may involve using direct sensor measurements or allowing the filter to converge on a stationary system.

  • What advice does the presenter give for better understanding the concepts discussed in the video?

    -The presenter encourages viewers to experiment with the MATLAB example, changing sensor settings and observing the effects on estimation. They also suggest diving into the code to understand how the EKF is implemented and how different functions update the state vector.
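
To make the predict-correct cycle and the EKF's linearization concrete, here is a minimal, self-contained MATLAB sketch. It is not the 28-state filter from the video; it is a hypothetical one-dimensional system (constant-velocity motion with a nonlinear range measurement), invented purely for illustration:

    % Toy EKF: constant-velocity motion, nonlinear range measurement
    % z = sqrt(x^2 + 1). Hypothetical system, not the toolbox filter.
    dt = 0.01;                     % 100 Hz prediction rate
    F  = [1 dt; 0 1];              % motion model: position, velocity
    Q  = 1e-4 * eye(2);            % process noise: trust in the prediction
    R  = 0.01;                     % measurement noise: trust in the sensor
    x  = [0; 1];  P = eye(2);      % initial estimate and covariance
    for k = 1:500
        x = F * x;                 % predict: propagate the state...
        P = F * P * F' + Q;        % ...and grow its uncertainty
        if mod(k, 100) == 0        % a slow, asynchronous sensor (1 Hz)
            z = sqrt((0.01*k)^2 + 1) + sqrt(R)*randn;  % noisy range to truth
            H = [x(1)/sqrt(x(1)^2 + 1), 0];  % Jacobian: linearize at estimate
            K = P * H' / (H * P * H' + R);   % gain from relative confidence
            x = x + K * (z - sqrt(x(1)^2 + 1));   % correct with the residual
            P = (eye(2) - K * H) * P;             % shrink uncertainty
        end
    end
    disp(x')                       % should land near [5 1]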

Outlines

00:00

📍 Introduction to Sensor Fusion for Positioning and Localization

This paragraph introduces the topic of using sensor fusion for accurate positioning and localization. It explains the previous video's focus on combining IMU sensors to estimate an object's orientation and correct gyro drift with accelerometer and magnetometer data. The current video aims to extend this concept by incorporating GPS data to estimate position and velocity, enhancing the fusion algorithm. The goal is to provide an intuitive understanding of how each sensor contributes to the final solution, rather than a technical explanation of the fusion algorithm itself. The video will use MATLAB's sensor fusion and tracking toolbox to demonstrate the impact of different sensors on position and orientation estimation through a visual example.

05:05

🚀 GPS and IMU Integration for Enhanced Positioning

The second paragraph delves into the practical application of GPS in conjunction with IMU sensors for precise positioning. It contrasts the use of GPS alone for slow-moving systems with the need for additional sensors like IMU for fast-moving systems requiring high update rates and accuracy. The paragraph uses a MATLAB example to illustrate the difference in position and orientation estimation with and without IMU sensors. It highlights the limitations of GPS in fast-moving scenarios and the benefits of sensor fusion, especially in dynamic situations like drone navigation. The example also demonstrates how the fusion algorithm handles asynchronous sensor data and the importance of sensor bias estimation for maintaining accuracy over time.

10:10

🔍 Understanding the Continuous-Discrete EKF for Sensor Fusion

The final paragraph provides an in-depth look at the underlying algorithm of the sensor fusion process, which is a continuous-discrete extended Kalman filter (EKF). It explains the filter's ability to handle asynchronous sensor measurements and its large state vector that includes not only the primary states like orientation and velocity but also secondary states like sensor biases. The paragraph emphasizes the importance of initializing the filter correctly and the challenges of doing so without knowing the true state. It outlines the two-step process of the EKF, which involves predicting the state based on the model and then correcting it with new measurements. The explanation also covers the filter's approach to handling changing velocities and the significance of quick updates from IMU sensors in maintaining accurate state estimation.
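
For reference, the continuous-discrete EKF described above corresponds to the insfilterAsync object in the Sensor Fusion and Tracking Toolbox. A minimal sketch of the pattern (the reading and noise values here are illustrative, not the example's actual tuning):

    % The filter at the core of the example: a continuous-discrete EKF.
    filt = insfilterAsync;
    disp(numel(filt.State))            % 28 states estimated simultaneously:
                                       % orientation, angular velocity,
                                       % position, velocity, acceleration,
                                       % sensor biases, geomagnetic field
    predict(filt, 1/100);                   % propagate the model by 10 ms
    fusegyro(filt, [0.01 0 0], 1e-5);       % a rad/s reading and its variance
    [position, orientation, velocity] = pose(filt);   % main states of interest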

Keywords

💡Sensor Fusion

Sensor fusion is the process of intelligently combining data from multiple sensors to achieve more accurate and reliable information than any single sensor could provide alone. In the context of the video, sensor fusion is used to enhance the accuracy of an object's position and orientation estimation by integrating data from an IMU (Inertial Measurement Unit) and a GPS sensor. The script discusses how sensor fusion can correct for drift in the gyroscope and improve position estimation, especially in dynamic scenarios.

💡IMU (Inertial Measurement Unit)

An IMU is a device that measures a body's angular rate and specific force using a combination of gyroscopes and accelerometers, often supplemented by a magnetometer. The video emphasizes the role of an IMU in providing high-rate data for orientation estimation and how it can be combined with GPS data to improve the accuracy of position and velocity estimates, especially in situations requiring high update rates and precision.

💡GPS (Global Positioning System)

GPS is a satellite-based navigation system that provides location and time information in all weather conditions, anywhere on or near the Earth where there is an unobstructed line of sight to four or more GPS satellites. The script explains how GPS can be used to measure an object's position and velocity, and how its integration with IMU data can enhance the accuracy of position estimation in fast-moving systems.

💡Position Estimation

Position estimation refers to the process of determining an object's location in space. The video script discusses how position estimation is crucial for applications like drone navigation and how it can be improved by combining GPS and IMU data through sensor fusion, particularly in scenarios where high precision and update rates are necessary.

💡Localization

Localization is the process of determining the position of an object within a known environment. In the video, the concept is discussed in the context of using sensor fusion for accurate positioning and orientation, which is essential for applications like autonomous vehicles or drones navigating through a known space.

💡Drift Correction

Drift correction is the process of compensating for errors that accumulate over time in a system, such as the drift in a gyroscope's measurements. The video script illustrates how the absolute measurements from the accelerometer and magnetometer in an IMU can be used to correct for the drift in the gyroscope, improving the accuracy of orientation estimation.

💡Gyroscope

A gyroscope is a device used for measuring or maintaining orientation and angular velocity. In the context of the video, the gyroscope is part of the IMU and is used to detect and measure angular motion. The script mentions how gyroscopes are prone to drift, which can be corrected using sensor fusion techniques.

💡Accelerometer

An accelerometer is a sensor that measures acceleration forces. It is used in conjunction with gyroscopes in an IMU to provide data on an object's linear acceleration and help correct for gyroscopic drift. The video script discusses how accelerometer data is used in sensor fusion to improve the accuracy of an object's orientation and position estimation.

💡Magnetometer

A magnetometer is an instrument that measures the strength and direction of the magnetic field in the vicinity of the sensor. In the video, the magnetometer is part of the IMU and is used in conjunction with the accelerometer and gyroscope to provide absolute orientation measurements that help correct for drift in the gyroscope.

💡Extended Kalman Filter (EKF)

An Extended Kalman Filter is a type of Kalman filter that is used for systems with non-linear dynamics. The video script explains that the fusion algorithm used in the example is a continuous-discrete EKF that can accept sensor measurements asynchronously and estimate a wide range of states, including sensor biases, which are crucial for maintaining the accuracy of the estimation over time.

💡State Estimation

State estimation is the process of determining the state of a system based on noisy or incomplete measurements. The video script discusses how state estimation is performed using an EKF in the context of sensor fusion, where the filter estimates not only the primary states like position and velocity but also secondary states such as sensor biases to improve the overall accuracy of the system.

💡Asynchronous Measurements

Asynchronous measurements refer to the process where different sensors provide data at different rates or times. The video script explains how the fusion algorithm can handle asynchronous measurements from various sensors, such as a gyroscope running at 100 Hz, an accelerometer and magnetometer at 50 Hz, and a GPS at 1 Hz, allowing for more flexible and efficient data processing.

💡Bias Estimation

Bias estimation is the process of determining and compensating for systematic errors in sensor measurements. The video script highlights the importance of estimating sensor biases in the EKF, as these biases can drift over time and affect the accuracy of the state estimation if not accounted for.

💡Predict-Correct Cycle

The predict-correct cycle is a fundamental process of the Kalman filter, where the state of the system is first predicted based on a model and then corrected using new measurements. The video script describes how this cycle is applied in the EKF to continuously improve the state estimation by combining predictions with actual sensor measurements.

Highlights

The video discusses sensor fusion for positioning and localization, focusing on the integration of IMU and GPS sensors.

IMU sensors are used to estimate an object's orientation, and GPS sensors measure position and velocity for enhanced estimation.

The fusion algorithm structure is explained, emphasizing the visual contribution of each sensor to the final solution.

GPS alone is sufficient for applications with modest accuracy requirements and slow-moving systems.

High-accuracy and high-update-rate applications, such as drones, may require additional sensors like IMU to complement GPS.

MATLAB's sensor fusion and tracking toolbox is demonstrated with an example of pose estimation from asynchronous sensors.

The example shows how the fusion algorithm uses GPS, accelerometer, gyro, and magnetometer data to estimate orientation and position.

The script allows for adjusting sensor sample rates and removing sensors to observe the impact on estimation.

Removing all sensors except GPS results in significant orientation error but reasonable position accuracy.

Adding IMU sensors to the GPS-only setup shows minor improvements in position estimation for slow movements.

For faster movements and rapid changes in velocity, the IMU's high update rate is crucial for accurate estimation.

The video explains the continuous-discrete extended Kalman filter used in the fusion algorithm, which handles asynchronous sensor measurements.

The filter's 28-element state vector covers not only the main states but also the sensor biases.

Estimating sensor biases is crucial because they drift over time, and unmodeled bias degrades the accuracy of the estimation.

The video discusses the importance of filter initialization and its impact on convergence and accuracy.

The predict-correct process of the EKF is explained, highlighting how it uses predictions and measurements to refine state estimation.

The video concludes by encouraging viewers to experiment with the example and explore the code for a deeper understanding.

Transcripts

00:00

Let's continue our discussion on using sensor fusion for positioning and localization. In the last video we combined the sensors in an IMU to estimate an object's orientation, and showed how the absolute measurements of the accelerometer and magnetometer were used to correct the drift from the gyro. In this video we're going to do a similar thing, but we're going to add a GPS sensor. GPS can measure position and velocity, and so in this way we can extend the fusion algorithm to estimate them as well. Just like the last video, the goal is not to fully describe the fusion algorithm; that's again too much for one video. Instead, I mostly want to go over the structure of the algorithm and show you visually how each sensor contributes to the final solution, so you have a more intuitive understanding of the problem. I hope you stick around for it. I'm Brian, and welcome to a MATLAB Tech Talk.

00:55

Now, it might seem obvious to use a GPS if you want to know the position of something relative to the surface of the Earth: just strap a GPS sensor onto your system and you've got latitude, longitude, and altitude. Simple enough, and this is perfectly fine in some situations, like when the system is accelerating and changing directions relatively slowly and you only need position accuracy to a few meters. This might be the case for a system that's determining directions in your car; as long as the GPS locates you to within a few meters of your actual spot, the map application can figure out which road you're on and therefore where to go next. On the other hand, imagine if the system requires position information to a few feet or less, and it needs position updates hundreds of times per second to keep up with the fast motion of your system, like, for example, trying to follow a fast trajectory through obstacles with a drone. In this case, GPS might have to be paired with additional sensors, like the sensors in an IMU, to get the accuracy that you need. To give you a more visual sense of what I'm talking about, let's run an example from the MATLAB Sensor Fusion and Tracking Toolbox called Pose Estimation From Asynchronous Sensors.

02:05

This example uses a GPS, accelerometer, gyro, and magnetometer to estimate pose, which is both orientation and position, as well as a few other states. The script generates a true path and orientation profile that the system follows: the true orientation is the red cube and the true position is the red diamond. The pose algorithm uses the available sensors to estimate orientation and position, and it shows the results as the blue cube and the blue diamond, respectively. So that's what we want to watch: how closely do the blue objects follow the red objects? The graph on the right plots the error if you just want a more quantitative result. And the cool thing about this is that while the script is running, the interface allows us to change the sample rates of each of the sensors, or remove them from the solution altogether, so that we can see how it impacts the estimation.
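
In the example, the truth trajectory is turned into synthetic readings by the toolbox's sensor models. A rough sketch of that pattern, with the rates from the video and a single stationary sample standing in for the script's real trajectory data:

    % Simulating sensors from ground truth (illustrative, stationary sample).
    imu = imuSensor('accel-gyro-mag', 'SampleRate', 100);  % accel, gyro, mag
    gps = gpsSensor('SampleRate', 5);                      % GPS read 5x/second
    trueOrient = quaternion(1, 0, 0, 0);                   % level, at rest
    [accelReading, gyroReading, magReading] = ...
        imu([0 0 0], [0 0 0], trueOrient);   % true accel, angular velocity
    [llaReading, gpsVelReading] = gps([0 0 0], [0 0 0]);   % NED position, vel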

03:00

So let's start by removing all of the sensors except for the GPS, and we'll read the GPS five times a second. The default trajectory in the script is to follow a circle with a radius of about 15 meters, and you can see that it's moving around this circle pretty slowly. Now, the orientation estimate is way off, as you'd expect, since we don't have any orientation sensors active, but the position estimate isn't too bad. After the algorithm settles and removes that initial bias, we see position errors of around plus and minus 2 meters in each axis. So now let me add back in the IMU sensors and we can see if our result is improved. Well, it's taking several seconds for the orientation to converge, but you can see that it's slowly correcting itself back to the true orientation. Also, the position estimate is, well, about the same: plus or minus 2 meters, maybe a little less than that. This is a relatively slow movement, and it's such a large trajectory that the IMU sensors modeled here are only contributing a minor improvement over the GPS alone. The GPS velocity measurement is enough to predict how the object moves over the 0.2 seconds between measurements, since the object isn't accelerating too quickly. This setup is kind of analogous to using GPS to get directions from a map app on your phone while you're driving; adding those additional sensors from the IMU isn't really going to help too much. So now let's go in the opposite direction and create a trajectory that is much faster. In the trajectory generation script, I'll just speed up the velocity of the object going around the circle from 2.5 to 12.5 meters per second. This is going to create more angular acceleration in a shorter amount of time, and to really emphasize the point I'm trying to make, I'm going to slow the GPS sample time down to once per second. So let's give this a shot.

05:05

Okay, so what's happening here is that when we get a GPS measurement, we get both position and velocity. So once a second we get a new position update that puts the estimate within a few meters of the truth, but we also get the current velocity, and for one second the algorithm propagates that velocity forward to predict what the object is doing between measurements. This works really well if the velocity is near constant for that one second, but poorly, as you can see, when the velocity is rapidly changing. This is the type of situation that is similar to a drone that has to make rapid turns and avoid obstacles, and it's here where the addition of the IMU will help, because we won't have to rely on propagating a static velocity for one second; we can estimate velocity and rotation using the IMU sensors. Now, to see the improvement, I've placed two different runs next to each other: the left is the GPS only that we just saw, and the right is with the addition of the IMU. You can see, at least visually, how the GPS with the IMU differs from the GPS alone; it's able to follow the position of the object more closely and creates a circular result rather than a saw blade. So adding an IMU seems to help estimate position. The question at this point might be, why is this the case? How does the algorithm combine these sensors to get this result in the first place? Well, intuitively we can imagine that the IMU is allowing us to dead reckon the state of the system between GPS updates, similar to how we used the gyro to dead reckon between the mag and accel updates in the last video. And this is true, except it's not as cut and dried as that; it's a lot more intertwined than you might think. To understand why, we need to explore the code a little bit.

06:56

The fusion algorithm is a continuous-discrete extended Kalman filter, and this particular one is set up to accept the sensor measurements asynchronously, which means each of the sensors can be read at its own rate. This is beneficial if you want to run, say, your gyro at 100 Hz, your mag and accelerometer at 50 Hz, and your GPS at 1 Hz. You're going to see below how this is handled, but the thing I want to point out here is that this is a massive Kalman filter: the state vector has 28 elements that are being estimated simultaneously. There are the obvious states like orientation, angular velocity, linear position, velocity, and acceleration, but the filter is also estimating the sensor biases and the magnetic field vector. Estimating sensor bias is extremely important because bias drifts over time. This means that even if you calculate sensor bias before you operate your system and hard-code that calibration value into your software, it's not going to be accurate for long, and any bias that we don't remove will be integrated and cause the estimate to walk away from the truth when we rely on that sensor. Now, if you don't have a good initial estimate of sensor bias when you start your system, then you can't just turn on your filter and trust it right away. You have to give it some time to estimate not just the main states that you care about, like position and velocity, but also some of the secondary states, like bias. Usually you let the Kalman filter converge on the correct solution while the system is stationary and not controlled, or maybe while you're controlling it using a different estimation algorithm, or maybe you just let it run and don't really care that the system performs poorly while the filter converges. But this is one of the things you need to consider during initialization of your system.

08:43

Now, another thing we need to talk about is how to initialize the filter. This is an EKF, and it can estimate state for nonlinear systems. It does this by linearizing the models around its current estimate and then using that linear model to predict the state into the future. So if the filter is not initialized close enough to the true state, the linearization can be so far off that it causes the filter to never actually converge. This isn't really a problem for this example, because the ground truth is known in the simulation, so the filter is simply initialized to a state close to truth. But in a real system, you need to think about how to initialize the filter when you don't know that truth. Often this can be done by just using the measurements from the sensors directly, like using the last GPS reading to initialize position and velocity, and using the gyro to initialize your angular rate, and so on.
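
One way to handle that stationary convergence period in code is simply to run the filter in place before trusting it. A hedged sketch, using simulated stationary readings and illustrative noise values rather than the example's actual tuning:

    % Let the filter converge while the system sits still.
    imu  = imuSensor('accel-gyro-mag', 'SampleRate', 100);
    filt = insfilterAsync;
    for k = 1:2000                                % about 20 s at 100 Hz
        [a, g, m] = imu([0 0 0], [0 0 0], quaternion(1,0,0,0));
        predict(filt, 1/100);                     % predict every step
        fusegyro(filt, g, 1e-5);                  % gyro at 100 Hz
        if mod(k, 2) == 0                         % accel and mag at 50 Hz
            fuseaccel(filt, a, 1e-3);
            fusemag(filt, m, 0.5);
        end
    end
    % Orientation and bias states have now had time to settle.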

09:44

All right, with the filter initialized, we can start running it, and every Kalman filter consists of the same two-step process: predict and correct. To understand why, we can think about it like this. If we want to estimate the state of something, you know, where it is or how fast it's going, there are two general ways to do it: we could just measure it directly, or we could use our knowledge of dynamics and kinematics and predict where it is. For example, imagine a car driving down the road and we want to know its location. We could use GPS to measure its position directly; that's one way. But if we knew where it started and its average speed, we could also predict where it'll be after a certain amount of time, with some accuracy, and using those predictions alongside a measurement can produce a better estimate. So the question might be, why wouldn't we just trust our measurement completely here? It's probably better than our prediction. Well, as sort of an extreme example, what if you checked your watch and it said it was 3:00 p.m., and then you waited a few seconds, checked it again, and it said 4:00 p.m.? Well, you wouldn't automatically assume an hour had passed just because your measurement said so. This is because you have a basic understanding of time; that is, you have an internal model that you can use to predict how much time has passed, and that would cause you to be skeptical of your watch if you thought seconds had passed and it said an hour. On the other hand, if you thought about an hour had passed but the watch said 65 minutes, you'd probably be more inclined to believe the watch over your own estimate, since you'd be less confident in your prediction. And this is precisely what a Kalman filter is doing. It's predicting how the states will change over time based on a model that it has, and along with the states it's also keeping track of how trustworthy the prediction is, based on the process noise that you've given it. The longer the filter has to predict the state, the less confidence it has in the result. Then, whenever a new measurement comes in, which has its own measurement noise associated with it, the filter compares the prediction with the measurement and corrects its estimate based on the relative confidence in both.

11:57

And this is what the script is doing. The simulation runs at 100 Hz, and at every time step it predicts forward the estimate of the states; then, if there's a new measurement from any of the sensors, it runs the update portion of the Kalman filter, adjusting the states based on the relative confidence in the prediction and the specific measurement. It's in this way that the filter can run with asynchronous measurements.
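
That predict-every-step, correct-when-data-arrives loop looks roughly like this with insfilterAsync (a sketch: the noise values are illustrative, and gyroData, accelData, magData, llaData, and gpsVelData are stand-ins for the logged sensor streams):

    % 100 Hz loop: predict always, fuse each sensor only when it reports.
    for k = 1:numSteps
        predict(filt, 1/100);                    % model-based prediction
        fusegyro(filt, gyroData(k,:), 1e-5);     % gyro: every step (100 Hz)
        if mod(k, 2) == 0                        % accel and mag: 50 Hz
            fuseaccel(filt, accelData(k,:), 1e-3);
            fusemag(filt, magData(k,:), 0.5);
        end
        if mod(k, 100) == 0                      % GPS: once per second
            j = k / 100;
            fusegps(filt, llaData(j,:), 4, gpsVelData(j,:), 0.1);
        end
        [estPos(k,:), estOrient(k,:)] = pose(filt);   % current best estimate
    end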

12:23

Now, with the GPS-only solution that we started with, the prediction step could only assume that the velocity wasn't changing over the one second, and since there were no updates to correct that assumption, the estimate would drastically run away from truth. With the IMU, however, the filter is updating a hundred times a second and looking at the accelerometer and seeing that the velocity is in fact changing. So the filter can react to a changing state faster with the quick updates of the IMU than it can with the slower updates of the GPS, and once the filter converges and has a good estimate of the sensor biases, that gives us an overall better prediction, and therefore a better overall state estimate. And this is the power of sensor fusion.

Now, I know this explanation might not have been perfectly clear, and it was probably a bit fast, but I think it's hard to really grasp the topic just by watching a video. So I would encourage you to play around with this example: turn sensors on and off, change the rates, the noise characteristics, and the trajectory, and see how the estimation is affected yourself. You can even dive further into the code and see how the EKF is implemented. I found it helpful to place breakpoints and pause the execution of the script so that I could see how the different functions update the state vector.

Okay, this is where I'm going to leave this. In the next video, we'll start to look at estimating the state of other objects when we talk about tracking algorithms. If you don't want to miss that and future Tech Talk videos, don't forget to subscribe to this channel. And if you want, you can check out my channel, Control System Lectures, where I cover more control theory topics as well. Thanks for watching.

Related Tags
Sensor Fusion, IMU, GPS, Positioning, Localization, MATLAB, Tech Talk, Drone Tracking, Estimation Algorithm, Velocity Tracking, Sensor Bias