Understanding Sensor Fusion and Tracking, Part 1: What Is Sensor Fusion?

MATLAB
21 Oct 2019 · 12:35

Summary

TL;DR: This video delves into sensor fusion, a technique pivotal for autonomous systems like self-driving cars and IoT devices. It explains how combining data from multiple sensors, or even from mathematical models, enhances the accuracy, reliability, and coverage of a system's measurements. The script outlines four key benefits: improving data quality, increasing system reliability, estimating unmeasured states, and expanding coverage areas. It also introduces the Kalman filter as a common fusion algorithm and promises deeper exploration in subsequent videos, making it an engaging introduction to sensor fusion for new learners.

Takeaways

  • 🧩 Sensor fusion is a critical part of designing autonomous systems such as self-driving cars and IoT devices: it combines data from multiple sources to build a more accurate and reliable understanding of the system and its environment.
  • 🔍 The high-level definition of sensor fusion is the combination of two or more data sources to create a solution that is more consistent, accurate, and dependable than any single data source alone.
  • 📏 Sensor data is often noisy and unreliable on its own; sensor fusion helps to reduce noise and uncertainty by averaging readings from multiple sensors or using different types of sensors to cross-validate measurements.
  • 🔄 Sensor fusion can increase the reliability of data by providing backup measurements in case one sensor fails, and by using algorithms to detect and exclude outlier data from a set of sensors.
  • 📍 Localization and positioning are key responsibilities within the 'perceive' step of autonomous systems, and sensor fusion plays a role in enhancing these capabilities by combining sensor data with mathematical models.
  • 🔎 Sensor fusion aids in situational awareness by detecting and tracking objects in the environment, which is essential for planning and decision-making in autonomous systems.
  • 🛠️ Kalman filters are mentioned as a common method for sensor fusion, which not only blends sensor measurements but also incorporates a mathematical model of the system for improved accuracy.
  • 🚫 Sensor fusion must consider the potential for single failure modes that could affect all sensors simultaneously, emphasizing the importance of redundancy and diverse sensor types.
  • 📊 Sensor fusion can estimate unmeasured states that individual sensors cannot measure directly, such as using two optical sensors to determine the three-dimensional information of a scene.
  • 🌐 Sensor fusion increases the coverage area by combining the measurements from multiple sensors with limited individual ranges to create a comprehensive understanding of the surroundings.
  • 🔑 The overarching benefit of sensor fusion is its ability to improve measurement quality, reliability, and coverage, as well as estimate states that are not directly measured by any single sensor.

Q & A

  • What is sensor fusion and why is it important in autonomous systems?

    -Sensor fusion is the process of combining data from two or more sources to generate a better understanding of the system. It is crucial in autonomous systems like self-driving cars and IoT devices because it enhances the accuracy, consistency, and reliability of the system's perception of its environment.

  • How does sensor fusion contribute to the 'perceive' step in autonomous systems?

    -Sensor fusion aids in the 'perceive' step by interpreting raw sensor data into meaningful information that the system can understand and act upon. It combines multiple sensor measurements and possibly mathematical models to achieve a more comprehensive and accurate understanding of the environment.

  • What are the four main capabilities that autonomous systems need to have?

    -Autonomous systems need the capabilities to sense, perceive, plan, and act. Sense involves collecting raw information from the system and the environment with sensors, perceive means interpreting that data into something meaningful, plan is about determining what to do and how to do it, and act is executing the plan. Self-awareness (localization) and situational awareness are responsibilities within the perceive step.

  • How can sensor fusion improve the quality of data in autonomous systems?

    -Sensor fusion can improve data quality by reducing noise and uncertainty. For example, by averaging readings from multiple accelerometers or combining different types of sensors, the system can achieve cleaner data with less deviation from the true values.
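
As a concrete illustration, here is a minimal MATLAB sketch of that averaging idea; the sensor count, noise level, and sample size are illustrative assumptions, not values from the video:

```matlab
% Minimal sketch: averaging several identical, independently noisy sensors.
% All numbers are illustrative assumptions.
rng(0);                                  % reproducible noise
g = 9.81;                                % true acceleration, m/s^2
N = 4;                                   % number of accelerometers
samples = 1000;
readings = g + 0.05*randn(samples, N);   % each column is one noisy sensor
fused = mean(readings, 2);               % simple fusion: average across sensors

fprintf('single-sensor noise std: %.4f\n', std(readings(:,1)));
fprintf('fused noise std:         %.4f\n', std(fused));
% With N uncorrelated sensors, the fused standard deviation is roughly
% 1/sqrt(N) of a single sensor's, so four sensors give about half the noise.
```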

  • What is a Kalman filter and how does it relate to sensor fusion?

    -A Kalman filter is a mathematical algorithm that fuses sensor measurements with a model of the system to estimate its state. It is one of the most common sensor fusion methods because incorporating the system's physical model reduces noise and improves the accuracy of the measurements.
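
To make the idea tangible, here is a minimal one-dimensional Kalman filter sketch in MATLAB, estimating a constant quantity from noisy measurements; the noise variances and measurement count are illustrative assumptions, not values from the video:

```matlab
% Minimal 1-D Kalman filter sketch: fusing a constant-state model
% with noisy measurements. All numbers are illustrative assumptions.
truth = 9.81;                  % true state (e.g., gravity along one axis)
z = truth + 0.05*randn(200,1); % noisy sensor measurements
Q = 1e-6;                      % process noise variance (trust in the model)
R = 0.05^2;                    % measurement noise variance (trust in sensor)
xhat = z(1);                   % initial state estimate
P = R;                         % initial estimate variance
for k = 2:numel(z)
    P = P + Q;                 % predict: constant-state model, estimate unchanged
    K = P / (P + R);           % Kalman gain: how much to trust the measurement
    xhat = xhat + K*(z(k) - xhat);   % update: blend prediction and measurement
    P = (1 - K)*P;             % shrink the estimate variance
end
fprintf('final estimate: %.4f (truth %.2f)\n', xhat, truth);
```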

  • How does sensor fusion increase the reliability of sensor data?

    -Sensor fusion increases reliability by providing redundancy. If one sensor fails, the system can still function using the data from other sensors. Moreover, a fusion algorithm can discard data from a sensor that significantly deviates from the others, ensuring the integrity of the overall measurement.
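
A hedged sketch of that three-sensor voting idea in MATLAB; the readings and the agreement threshold are hypothetical, chosen only to illustrate discarding an outlying sensor:

```matlab
% Sketch: simple outlier rejection across three redundant sensors.
% Hypothetical readings, e.g., three pitot-tube airspeeds in m/s.
readings = [102.1, 101.8, 55.0];   % third sensor has failed low
med = median(readings);
ok = abs(readings - med) < 10;     % keep sensors that agree with the median
fused = mean(readings(ok));        % fuse only the agreeing sensors
fprintf('fused airspeed: %.1f m/s (used %d of %d sensors)\n', ...
        fused, nnz(ok), numel(readings));
```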

  • Can you give an example of how sensor fusion can handle sensor failure?

    -In the case of an aircraft using pitot tubes for airspeed measurement, if one tube fails, the system can rely on the remaining two tubes. If all tubes are affected by a single failure mode, like freezing, the system can use alternative sensors or models, such as GPS and wind models, to estimate airspeed.

  • What is the role of sensor fusion in estimating unmeasured states?

    -Sensor fusion can estimate unmeasured states by combining data from different sensors that individually cannot measure the state of interest. For example, two optical sensors can be used to extract 3D information about a scene, estimating distances between objects that are not directly measurable by a single sensor.
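
The stereo case can be made concrete with the idealized pinhole relation depth = focal length x baseline / disparity. The following MATLAB sketch uses hypothetical camera numbers, not values from the video:

```matlab
% Sketch of the stereo-depth idea: two cameras a known baseline apart
% see the same point at slightly different image columns (disparity).
% Idealized pinhole model; all numbers are illustrative assumptions.
f = 800;              % focal length in pixels
B = 0.12;             % baseline between the two cameras, meters
xL = 412;  xR = 380;  % pixel column of the same object in each image
d = xL - xR;          % disparity in pixels
Z = f * B / d;        % estimated depth, meters
fprintf('disparity %d px -> depth %.2f m\n', d, Z);
```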

  • How can sensor fusion increase the coverage area of a sensor suite?

    -Sensor fusion can increase coverage by integrating measurements from multiple sensors with different fields of view. For instance, ultrasonic sensors for parking assist on a car can be combined to provide a larger, more comprehensive field of view around the vehicle.
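
As a rough sketch of that kind of fusion, the MATLAB snippet below combines hypothetical readings from four narrow-field ultrasonic sensors into one report while preserving which sensor saw the obstacle (the sensor names and ranges are invented for illustration):

```matlab
% Sketch: combining several narrow-field sensors into one coverage report.
% Hypothetical ranges (m) from four ultrasonic sensors on a front bumper.
names  = {'front-left','front-center-left','front-center-right','front-right'};
ranges = [1.8, 0.4, Inf, Inf];     % Inf = nothing in that sensor's field of view
for k = 1:numel(ranges)
    if isfinite(ranges(k))
        fprintf('%s: object at %.1f m\n', names{k}, ranges(k));
    end
end
% No averaging here: keeping each reading separate preserves *where* the
% obstacle is, while the collection still acts as one fused field of view.
```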

  • What are some challenges in sensor fusion related to correlated noise?

    -Correlated noise is a challenge in sensor fusion because it affects multiple sensors in the same way. For example, if multiple magnetometers in a phone are affected by the phone's internal magnetic fields, averaging their readings won't reduce the noise. In such cases, fusing with sensors that measure different quantities or are less susceptible to the noise source is necessary.
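
A small MATLAB sketch makes the point: averaging shrinks only the independent part of the noise, while a disturbance shared by both sensors survives. The heading value and disturbance model here are illustrative assumptions:

```matlab
% Sketch: averaging cannot remove noise that is common to all sensors.
% Illustrative magnetometer headings with a shared, bias-like disturbance.
samples = 1000;
shared = 2*sin(2*pi*(1:samples)'/50);     % correlated disturbance (phone electronics)
m1 = 30 + shared + 0.5*randn(samples,1);  % heading in degrees, sensor 1
m2 = 30 + shared + 0.5*randn(samples,1);  % sensor 2 sees the same disturbance
avg = (m1 + m2)/2;
fprintf('std sensor 1: %.2f deg, std of average: %.2f deg\n', std(m1), std(avg));
% Only the independent noise shrinks; the shared disturbance remains,
% which is why a different sensor type (e.g., a gyro) is fused in instead.
```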

  • How does sensor fusion relate to the concept of localization in autonomous systems?

    -In autonomous systems, sensor fusion plays a critical role in localization by combining various sensor inputs to determine the system's position and orientation accurately. This is essential for the system to understand where it is and how it is moving within its environment.

Outlines

00:00

🤖 Introduction to Sensor Fusion in Autonomous Systems

This paragraph introduces the concept of sensor fusion, which is essential for the design of autonomous systems such as self-driving cars, radar tracking stations, and the Internet of Things. It aims to explain what sensor fusion is and how it aids in the development of autonomous systems. The video will provide a broad definition and illustrate various use cases. Brian, the presenter, welcomes viewers to a MATLAB Tech Talk and emphasizes that sensor fusion involves combining multiple data sources to achieve a more consistent, accurate, and dependable understanding of the system than any single data source could offer.

05:03

🔍 The Role of Sensor Fusion in Autonomous System Design

This section delves into the high-level definition of sensor fusion and its role in an autonomous system's ability to sense, perceive, plan, and act. It explains that sensor fusion combines sensor data with information from mathematical models to enhance the system's understanding of its environment. The paragraph outlines the four main capabilities required of autonomous systems: sense, perceive, plan, and act, with sensor fusion straddling the sense and perceive steps. It also discusses the two responsibilities of the perceive step, self-awareness (localization) and situational awareness, which sensor fusion helps to achieve by combining multiple sensor inputs with models of the physical world.

10:04

📈 Enhancing Data Quality and Reliability Through Sensor Fusion

This paragraph explores how sensor fusion can improve the quality and reliability of data in autonomous systems. It uses the example of accelerometers to illustrate how combining data from multiple sensors can reduce noise and uncertainty in measurements. The paragraph also discusses the use of different sensor types, such as magnetometers and gyros, to mitigate the effects of correlated noise sources. It introduces the concept of Kalman filters as a common fusion algorithm that incorporates mathematical models of the system to improve sensor measurements. The benefits of having backup sensors and the ability to estimate states when primary sensors are unavailable are also highlighted.

📐 Estimating Unmeasured States and Increasing Coverage with Sensor Fusion

The final paragraph discusses two additional benefits of sensor fusion: estimating unmeasured states and increasing the coverage area of sensor systems. It explains how the combination of different optical sensors can provide three-dimensional information that individual sensors cannot capture alone. The paragraph also addresses how sensor fusion can be used to expand the sensing range by combining measurements from multiple short-range sensors, such as ultrasonic sensors on a car for parking assistance. The video concludes with a teaser for upcoming videos that will cover sensor fusion for localization and multi-object tracking in more detail.

Keywords

💡Sensor Fusion

Sensor fusion is the process of combining data from multiple sensors to achieve a more accurate, consistent, and dependable understanding of the system and its environment. It is integral to the design of autonomous systems such as self-driving cars, where it helps in improving the quality of data by reducing noise and uncertainty. The script illustrates sensor fusion as a method to enhance the system's ability to perceive its surroundings and make informed decisions.

💡Autonomous Systems

Autonomous systems are self-directed entities that can perform tasks without human intervention. In the context of the video, autonomous systems like self-driving cars rely on sensor fusion to interact with the world around them. These systems need to perceive, plan, and act based on the data they collect and interpret, which is where sensor fusion plays a crucial role in enhancing their capabilities.

💡Perceive

In the video, 'perceive' refers to the system's ability to measure the environment directly with sensors and collect information from both the system and the external world. It is a critical step in the autonomous system's operation, where sensor fusion helps in interpreting the raw data into meaningful insights that the system can act upon.

💡Localization

Localization is the process of determining the position or location of an object within a known environment. In the script, it is mentioned as part of the 'perceive' step, where sensor fusion aids in self-awareness by helping the system answer questions like 'where am I?' and 'what state am I in?'.

💡Situational Awareness

Situational awareness involves detecting and tracking other objects in the environment, which is another responsibility of the 'perceive' step. Sensor fusion contributes to situational awareness by combining multiple sensor inputs to provide a comprehensive understanding of the surroundings, which is essential for safe and effective operation of autonomous systems.

💡Noise Reduction

Noise reduction in sensor fusion refers to the process of minimizing the random fluctuations or errors in sensor data. The script uses the example of accelerometers to illustrate how averaging the readings from multiple sensors can reduce noise and provide cleaner, more reliable data for the autonomous system to make decisions.

💡Reliability

Reliability in the context of sensor fusion means the dependability of the data and the system's ability to function correctly even when one sensor fails. The script explains how duplicating sensors and using a fusion algorithm can provide backup measurements and maintain the system's operation, thus increasing its overall reliability.

💡Unmeasured States

Unmeasured states are those that cannot be directly measured by a single sensor but can be estimated through sensor fusion. The video script gives the example of using two optical sensors to estimate the three-dimensional information and distances of objects in the scene, which neither sensor could do individually.

💡Coverage Area

Coverage area refers to the total spatial range within which an autonomous system can detect and monitor objects. The script discusses how sensor fusion can be used to combine the measurements from multiple short-range sensors to create a larger field of view, which is vital for applications like parking assist in cars.

💡Kalman Filter

The Kalman filter is a common sensor fusion algorithm that combines sensor measurements with a mathematical model of the system to estimate the system's state. The script highlights it as probably one of the more common fusion methods and as an example of how sensor fusion can leverage both sensor data and physical knowledge to improve the accuracy of the measurements.

Highlights

Sensor fusion is crucial for autonomous systems like self-driving cars, radar tracking stations, and the Internet of Things.

Sensor fusion combines multiple data sources to achieve a more consistent, accurate, and dependable understanding of the system.

Data sources include both sensor measurements and mathematical models based on physical world knowledge.

Autonomous systems require the capabilities to sense, perceive, plan, and act, with sensor fusion straddling the sense and perceive steps.

Localization and positioning are key aspects of self-awareness in autonomous systems, facilitated by sensor fusion.

Situational awareness involves detecting and tracking objects in the environment, aided by sensor fusion.

Sensor fusion can increase data quality by reducing noise and uncertainty through the combination of multiple sensor readings.

Fusing different types of sensors can help in reducing correlated noise and improve measurement accuracy.

Kalman filters, which incorporate a mathematical model of the system, are among the most common methods for sensor fusion, blending sensor measurements with physical knowledge.

Sensor fusion increases reliability by providing backup measurements in case of sensor failure.

Fusing sensors that measure different quantities can help maintain measurements when primary sensors are temporarily unavailable.

Sensor fusion can estimate unmeasured states, such as distance, by combining data from multiple sensors.

Estimating orientation using a combination of an accelerometer, magnetometer, and gyro is a practical application of sensor fusion.

Sensor fusion increases the coverage area by combining measurements from multiple sensors with limited individual ranges.

The general idea behind sensor fusion is to improve measurement quality, reliability, and coverage, and estimate unmeasured states.

Upcoming videos will delve into sensor fusion for localization and multi-object tracking, showcasing practical implementations.

The broad appeal of sensor fusion across different types of autonomous systems makes it an interesting and rewarding field of study.

Transcripts

00:00

Sensor fusion is an integral part of the design of autonomous systems. Things like self-driving cars, radar tracking stations, and the Internet of Things all rely on sensor fusion of one sort or another. So the questions I'd like to answer in this video are: what is sensor fusion, and how does it help in the design of autonomous systems? This will be a good first video if you're new to the topic, because we're going to go over a broad definition of what it is and then show a few scenarios that illustrate the various ways sensor fusion can be used. So I hope you stick around. I'm Brian, and welcome to a MATLAB Tech Talk.

00:35

The high-level definition is that sensor fusion is combining two or more data sources in a way that generates a better understanding of the system. "Better" here refers to the solution being more consistent over time, more accurate, and more dependable than it would be with a single data source. For the most part, we can think of data as coming from sensors, and what they're measuring provides the understanding of the system: things like how fast it's accelerating or the distance to some object. But a data source could also be a mathematical model, because as designers we have some knowledge of the physical world, and we can encode that knowledge into the fusion algorithm to improve the measurements from the sensors.

01:20

To understand why, let's start with the big picture. Autonomous systems need to interact with the world around them, and in order to be successful there are certain capabilities that the system needs to have. We can divide these into four main areas: sense, perceive, plan, and act. Sense refers to directly measuring the environment with sensors; it's collecting information from the system and the external world. For a self-driving car, for example, this sensor suite might include radar, lidar, visible cameras, and a whole bunch more. But simply gathering data with sensors isn't good enough, because the system needs to be able to interpret the data and turn it into something that can be understood and acted on by the autonomous system. This is the role of the perceive step: to make sense of, well, the sensed data. For example, let's say that this is an image from a vehicle camera sensor. The car has to ultimately interpret the blob of pixels as a road with lane lines, and recognize that there's something off to the side that could be a pedestrian about to cross the street, or maybe a stationary mailbox. This level of understanding is critical in order for the system to determine what to do next. This is the planning step, where it figures out what it would like to do and finds a path to get there. And lastly, the system calculates the best actions that get the system to follow that path; this last step is what the controller and the control system are doing.

02:55

All right, let's go back to the perceive step, because I want to go into a little more detail here. This step has two different but equally important responsibilities. It's responsible for self-awareness, which is referred to as localization or positioning: answering questions like where am I, what am I doing, and what state am I in? But it's also responsible for situational awareness: things like detecting other objects out in the environment and tracking them. So where does sensor fusion come in? Well, it sort of straddles sense and perceive, as it has a hand in both of these capabilities. It's the process of taking multiple sensor measurements, combining them, and mixing in additional information from mathematical models, with the goal of having a better understanding of the world which the system can use to plan and act. So with that in mind, let's walk through four different ways that sensor fusion can help us do a better job at localization and positioning of our own system, as well as detecting and tracking other objects.

03:59

All right, for the first scenario, let's talk about possibly one of the more common reasons for sensor fusion, and that's to increase the quality of the data. We always want to work with data that has less noise, less uncertainty, and fewer deviations from the truth: just overall nice, clean data. As a simple example, let's take a single accelerometer and place it on a table so that it's only measuring the acceleration due to gravity. If this was a perfect sensor, the output would read a constant 9.81 m/s^2. However, the actual measurement will be noisy. How noisy depends on the quality of the sensor, and this is unpredictable noise, so we can't get rid of it through calibration. But we can reduce the overall noise in the signal if we add a second accelerometer and average the two readings. As long as the noise isn't correlated across the sensors, fusing them together like this reduces the combined noise by a factor of the square root of the number of sensors. So four identical sensors fused together will have half the noise of a single one. In this case, all that makes up this very simple fusion algorithm is an averaging function.

05:09

Now, we can also reduce noise by combining measurements from two or more different types of sensors, and this can help if we have to deal with correlated noise sources. For example, let's say we're trying to measure the direction your phone is facing relative to north. We could use the phone's magnetometer to measure the angle from magnetic north, so that's pretty easy. However, just like with the accelerometer, this sensor measurement will be noisy, and if we want to reduce that noise then we may be tempted to add a second magnetometer. However, at least some contribution of noise is coming from the moving magnetic fields created by the electronics within the phone itself. This means that every magnetometer will be affected by this correlated noise source, and so averaging the sensors won't remove it. Now, two ways to solve this problem are to simply move the sensors away from the corrupting magnetic fields, which is hard to do with a phone, or to filter the measurement through some form of a low-pass filter, which would add lag and make the measurement less responsive. But another option is to fuse the magnetometer with an angular rate sensor, a gyro. The gyro will be noisy as well, but by using two different sensor types we're reducing the likelihood that that noise is correlated, and so they can be used to calibrate each other. The basic gist is that if the magnetometer measures a change in the magnetic field, the gyro can be used to confirm whether that rotation came from the phone physically moving or whether it's just noise. There are several different fusion algorithms that can accomplish this blending, but a Kalman filter is probably one of the more common methods. And the interesting thing about Kalman filters is that a mathematical model of the system is already built into the filter, so you're getting the benefit of fusing together sensor measurements and your knowledge of the physical world. If you want to learn more about Kalman filters, check out the MATLAB Tech Talk video that I've linked to in the description.

07:03

All right, the second benefit of sensor fusion is that it can increase reliability. If we have two identical sensors fused together, like we have with the averaged accelerometers, then we have a backup in case one fails. Of course, with this scenario we lose quality if one sensor fails, but at least we don't lose the whole measurement. We can also add a third sensor into the mix, and the fusion algorithm could throw out the data of any single sensor that's producing a measurement that differs from the other two. An example here could be using three pitot tubes to have a reliable measure of an aircraft's airspeed. If one breaks or reads incorrectly, then the airspeed is still known using the other two. So duplicating sensors is an effective way to increase reliability. However, we have to be careful of single failure modes that can affect all of the sensors at the same time. An aircraft that flies through freezing rain might find that all three pitot tubes freeze up, and no amount of voting or sensor fusion will save the measurement. Again, this is where fusing together sensors that measure different quantities can help the situation. The aircraft can be set up to supplement the airspeed measurements from the pitot tubes with an airspeed estimate using GPS and atmospheric wind models. In this case, airspeed can still be estimated when the primary sensor suite is unavailable. Again, quality may be reduced, but the airspeed can still be determined, which is important for the safety of the aircraft.

08:31

Now, losing a sensor doesn't always mean that the sensor failed; it could mean that the quantity it's measuring drops out momentarily. For example, take a radar system that's tracking the location of a small boat on the ocean. The radar station is sending out a radio signal which reflects off the boat and back again. The round-trip travel time, the Doppler shift of the signal, and the azimuth and elevation of the tracking station are all combined to estimate the location and range rate of the boat. However, if a larger cargo ship gets between the radar station and the smaller boat, then the measurement will shift instantly to the location and range rate of that blocking object. So in this case we don't need an alternate sensor type or a secondary radar tracking station to help when the measurement drops out, because we can use a model of the physical world. An algorithm could develop a speed and heading model of the object that's being tracked, and when the object is out of the radar's line of sight, the model can take over and make predictions. This, of course, only works when the object you're tracking is relatively predictable and you don't have to rely on your predictions long term, which is pretty much the case for slow-moving ships.

09:39

Okay, the third benefit of sensor fusion is that it can be used to estimate unmeasured states. Now, it's important to recognize that unmeasured doesn't mean unmeasurable; it just means that the system doesn't have a sensor that can directly measure the state we're interested in. For example, a visible camera can't measure the distance to an object in its field of view; a large object far away can have the same number of pixels as a small but close object. However, we can add a second optical sensor and, through sensor fusion, extract three-dimensional information. The fusion algorithm would compare the scene from the two different angles and measure the relative distances between the objects in the two images. So, in this way, these two sensors can't measure distance individually, but they can when combined. Now, in the next video we're going to expand on this concept of using sensors to estimate unmeasured states by showing how we can estimate position using an accelerometer and a gyro. For now, though, I want to move on to the last benefit that I'm going to cover in this video.

10:39

Sensor fusion can be used to increase the coverage area. Let's imagine the short-range ultrasonic sensors on a car which are used for parking assist. These are the sensors that measure the distance to nearby objects, like other parked cars and the curb, to let you know when you're close to impact. Each individual sensor may only have a range of a few feet and a narrow field of view. Therefore, if the car needs to have full coverage on all four sides, additional sensors need to be added and the measurements fused together to produce a larger total field of view. Now, more than likely these measurements won't be averaged or combined mathematically in any way, since it's usually helpful to know which sensor is registering an object so that you have an idea of where that object is relative to the car. But the algorithm that pulls all of these sensors together into one coherent system is still a form of sensor fusion.

11:34

So hopefully you can start to see that there are a lot of different ways to do sensor fusion, and even though the methods don't necessarily share a common algorithm or even have the same design objective, the general idea behind them is ubiquitous: use multiple data sources to improve measurement quality, reliability, and coverage, as well as to estimate states that aren't measured directly. The fact that sensor fusion has this broad appeal across completely different types of autonomous systems is what makes it an interesting and rewarding topic to learn. In the next two videos we're going to go into more detail on sensor fusion for localization and for multi-object tracking. In the next video in particular, we're going to show how we can combine an accelerometer, a magnetometer, and a gyro to estimate orientation. So if you don't want to miss that and future Tech Talk videos, don't forget to subscribe to this channel. Also, if you want to check out my channel, Control System Lectures, I cover more control theory topics there as well. I'll see you next time.


Related Tags
Sensor Fusion · Autonomous Systems · Self-Driving Cars · Radar Tracking · IoT · Data Accuracy · Reliability · Perception · Localization · MATLAB Tech Talk