Coming soon: robots that drive

roboticsqut
9 Apr 2017, 03:18

Summary

TL;DR: The script delves into the evolution of self-driving car technology, starting from the 1960s with the Stanford Cart and Shakey, which utilized vision systems to navigate. Fast forward to the DARPA Urban Challenge in 2007, where 'Boss' from Carnegie Mellon University excelled. By 2014, Google's autonomous vehicle demonstrated advanced sensors and point cloud imaging for a more sophisticated understanding of the environment. The narrative highlights the progression from rudimentary robotic navigation to the sophisticated, sensor-rich systems capable of making complex driving decisions.

Takeaways

  • 🤖 The concept of self-driving cars has been a staple in science fiction for a long time, but their technological origins can be traced back to the 1960s.
  • 👓 The Stanford Cart, developed by Hans Moravec, was an early attempt at creating a self-driving machine using stereo vision to navigate and avoid obstacles.
  • 📈 The limited computing power of the 1960s greatly affected the speed and capabilities of early self-driving technology like the Stanford Cart.
  • 🔍 Shakey, developed at SRI International, was another pioneering robot that used vision to map its environment, contributing to the evolution of autonomous vehicles.
  • 🏁 The DARPA Urban Challenge in 2007 was a significant milestone, showcasing the progress of self-driving cars and their ability to perform tasks comparable to human drivers.
  • 🏆 Carnegie Mellon University's 'Boss' won the DARPA Urban Challenge, highlighting the importance of advanced sensors and computing equipment in autonomous vehicles.
  • 🚗 By 2014, Google's self-driving cars had become more streamlined, with the Velodyne scanning laser range finder being the most prominent sensor.
  • 🌐 The point cloud image, generated by the Velodyne scanner, provides a 3D geometric model that allows the car's software to make driving decisions.
  • 🛑 The software in self-driving cars processes rich sensory information to create a driving plan, including adjusting the steering, throttle, and brakes.
  • 🚦 Modern self-driving cars can identify and respond to various objects and road signs, demonstrating an advanced level of environmental awareness and decision-making.
  • 📊 The evolution of self-driving car technology from the 1960s to the present shows rapid advancements in sensors, computing power, and software capabilities.

Q & A

  • What is the significance of the Stanford cart in the history of self-driving car technology?

    -The Stanford cart is significant because it was one of the earliest research robots to use a stereo vision system to perceive its environment and plan a path to avoid obstacles, marking an early step towards self-driving car technology.

  • What was the primary limitation of the Stanford cart?

    -The primary limitation of the Stanford cart was its excruciatingly slow speed, which was mainly due to the limited computing power available in 1964.

  • What was Shakey, and how did it contribute to the development of self-driving cars?

    -Shakey was a famous robot developed at SRI International in the late 60s and 70s. It used vision to build a map of its environment and navigate, contributing to the advancement of self-driving car technology.

  • What was the DARPA Urban Challenge and why was it important?

    -The DARPA Urban Challenge was a competition in 2007 where teams built robot cars to perform tasks like human drivers, such as parking and intersection management. It was important because it significantly advanced the technology and capabilities of self-driving cars.

  • Which team won the DARPA Urban Challenge and what was their robot car called?

    -The team from Carnegie Mellon University won the DARPA Urban Challenge, and their robot car was called 'Boss'.

  • How did the appearance of self-driving cars change from the DARPA Urban Challenge to 2014?

    -By 2014, self-driving cars like Google's became much sleeker, with only one obvious sensor, a Velodyne scanning laser range finder, making them look more like ordinary cars.

  • What is a point cloud image and how is it used in self-driving cars?

    -A point cloud image is a three-dimensional representation of the world surrounding the car, created by sensors like the Velodyne scanner. It helps the car's software make decisions about navigation by identifying objects, humans, and road signs.

  • How do self-driving cars process the sensory information they collect?

    -Self-driving cars process sensory information by creating a three-dimensional geometric model of the environment, which the onboard software uses to make decisions and send commands to control the vehicle.

  • What are the key components of a self-driving car's sensory system?

    -Key components of a self-driving car's sensory system include cameras, LiDAR sensors like the Velodyne scanner, and other sensors that help in perceiving the environment and planning the vehicle's movements.

  • How do the colors in a point cloud image represent different features of the environment?

    -In a point cloud image, cool colors like blue represent the ground plane, green indicates points above the ground plane, and red is used for points that are very high above the ground plane.

  • What role does high-performance computing play in self-driving cars?

    -High-performance computing is crucial in self-driving cars as it processes and analyzes the vast amounts of sensory data in real-time, enabling the car to make quick decisions and navigate safely.

Outlines

00:00

🤖 Early Self-Driving Car Technology

The script begins by discussing the historical roots of self-driving car technology, dating back to the 1960s with two notable research robots. The Stanford cart, developed by Hans Moravec, utilized a stereo vision system to navigate its environment and avoid obstacles, albeit at a slow pace due to limited computing power in 1964. Shakey, developed at SRI International, also relied on visual input to map its surroundings, marking significant advancements in autonomous vehicle technology.

Keywords

💡Self-driving cars

Self-driving cars, also known as autonomous vehicles, are vehicles that have the capability to sense their environment and navigate without the need for human input. In the video's context, self-driving cars represent the main theme, showcasing the evolution of technology from early research robots to modern vehicles equipped with advanced sensors and computing systems.

💡Stereo vision

Stereo vision is a method used by robots and vehicles to perceive depth and reconstruct the three-dimensional world around them. It involves using two cameras to mimic the way human eyes perceive depth. In the script, the Stanford cart is described as using a stereo vision system to plan a path and avoid obstacles, which is fundamental to the development of self-driving car technology.
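The depth recovery behind stereo vision can be sketched with the standard pinhole-camera relation: depth is proportional to focal length times the camera baseline, divided by the disparity (the pixel shift of a feature between the two images). The camera parameters below are illustrative only, not values from the Stanford Cart.

```python
# Sketch of the core stereo-vision depth calculation. A feature seen by
# both cameras shifts horizontally between the two images; that shift
# (disparity) determines its distance. focal_px and baseline_m here are
# assumed example values.

def depth_from_disparity(disparity_px, focal_px=700.0, baseline_m=0.12):
    """depth (m) = focal length (px) * camera baseline (m) / disparity (px)."""
    if disparity_px <= 0:
        raise ValueError("feature must shift between the two images")
    return focal_px * baseline_m / disparity_px

# A feature shifting 28 pixels between cameras mounted 12 cm apart:
print(round(depth_from_disparity(28.0), 2))  # 3.0 (metres)
```

Nearby obstacles produce large disparities and small depths, which is what lets a robot like the Stanford Cart decide which regions of the scene to avoid.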

💡Computing power

Computing power refers to the ability of a computer to perform operations and process information. The script mentions that the Stanford cart was slow due to the limited computing power available in the 1960s, highlighting the importance of computational advancements in the progress of self-driving car technology.

💡Shakey

Shakey is a famous early robot developed at SRI International, known for its ability to use vision to build environmental maps for navigation. The script describes Shakey as a precursor to modern self-driving cars, emphasizing the historical development of autonomous navigation systems.

💡DARPA Urban Challenge

The DARPA Urban Challenge was a competition organized by the Defense Advanced Research Projects Agency (DARPA) to promote the development of autonomous vehicles. The script highlights the 2007 event as a major step forward in self-driving car technology, where teams competed to build vehicles capable of performing tasks comparable to human drivers.

💡Boss

Boss is the name of the robot car developed by Carnegie Mellon University, which won the DARPA Urban Challenge. The script describes Boss as an example of the technological advancements in self-driving cars, equipped with various sensory devices and high-performance computing equipment.

💡Google cars

Google cars refer to the self-driving vehicles developed by Google's autonomous car project, now known as Waymo. The script mentions the evolution of Google cars by 2014, noting their sleek design with a single obvious sensor, the Velodyne scanning laser range finder, which is indicative of the progress in making self-driving cars more integrated and less obtrusive.

💡Velodyne scanner

A Velodyne scanner is a type of LiDAR (Light Detection and Ranging) sensor used in autonomous vehicles to create a 3D map of the surrounding environment. The script describes the Velodyne scanning laser range finder mounted on the roof of Google cars, which generates a point cloud image, essential for the car's perception and navigation.

💡Point cloud

A point cloud is a collection of data points in three-dimensional space, typically used to represent the environment in which a self-driving car operates. The script explains how the point cloud, generated by the Velodyne scanner, is color-coded to differentiate between the ground plane and objects above it, which is crucial for decision-making in navigation.
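The height-based color coding described above can be sketched as a simple per-point classification. The thresholds and the flat-ground assumption are illustrative, not taken from the actual Google system.

```python
# Sketch: color-coding point cloud points by height above the estimated
# ground plane, as the video describes (blue = ground, green = above the
# ground plane, red = very high). low/high are assumed example thresholds.

def color_point(z_above_ground, low=0.2, high=2.0):
    """Return a display color for one point, given its height in metres
    above the ground plane."""
    if z_above_ground < low:
        return "blue"   # ground plane: drivable
    elif z_above_ground < high:
        return "green"  # above the ground plane: imprudent to drive into
    else:
        return "red"    # very high above the ground plane

heights = [0.05, 0.9, 3.5]  # three example point heights (metres)
print([color_point(z) for z in heights])  # ['blue', 'green', 'red']
```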

💡High-performance computing

High-performance computing refers to the use of supercomputers and computing techniques that are significantly faster than a standard computer. In the context of the video, high-performance computing is essential for processing the vast amounts of data collected by self-driving cars to make real-time navigation decisions.

💡Onboard software

Onboard software is the set of programs and algorithms that run within a vehicle to control its functions and operations. The script discusses how the onboard software of a self-driving car takes sensory information, processes it, and sends commands to adjust the vehicle's steering, throttle, and brakes, illustrating the integral role of software in autonomous driving.
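The sense-plan-act cycle the keywords above describe can be sketched as a tiny loop: perceive, decide, and issue steering/throttle/brake commands. The planner rule, distances, and command interface here are hypothetical stand-ins, not the real Boss or Google software.

```python
# Minimal sketch of an onboard sense-plan-act step. Everything below is
# an illustrative toy: real planners reason over a full 3D model, not a
# single obstacle distance.

from dataclasses import dataclass

@dataclass
class Command:
    steering: float  # radians, positive = steer left
    throttle: float  # 0.0 .. 1.0
    brake: float     # 0.0 .. 1.0

def plan(nearest_obstacle_m: float) -> Command:
    """Toy planner: brake hard for close obstacles, ease around
    mid-range ones, otherwise cruise straight ahead."""
    if nearest_obstacle_m < 5.0:
        return Command(steering=0.0, throttle=0.0, brake=1.0)
    if nearest_obstacle_m < 20.0:
        return Command(steering=0.1, throttle=0.3, brake=0.0)
    return Command(steering=0.0, throttle=0.5, brake=0.0)

print(plan(3.0))   # obstacle close: full braking
print(plan(50.0))  # clear road: cruise
```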

Highlights

Interest in self-driving cars has deep roots in both real-world development and science fiction.

The Stanford Cart from the 1960s represents an early milestone in autonomous vehicle technology.

Hans Moravec's Stanford Cart utilized stereo vision to navigate and avoid obstacles.

Early self-driving technology was constrained by the limited computing power of the time.

Shakey, developed at SRI International, was another pioneering robot that used vision to map its environment.

The DARPA Urban Challenge in 2007 was a significant event that spurred advancements in autonomous vehicle technology.

Boss, developed by Carnegie Mellon University, won the DARPA Urban Challenge, showcasing advanced capabilities in autonomous driving.

Modern self-driving cars, like Google's, have evolved to have more streamlined and less obtrusive sensor systems.

The Velodyne scanner is a key sensor used in modern autonomous vehicles for creating a 3D model of the environment.

Point cloud technology allows self-driving cars to perceive their surroundings in three dimensions.

Color coding in point cloud images helps differentiate between the ground plane and objects above it.

Autonomous vehicle software processes sensory data to make driving decisions and control the vehicle.

Self-driving cars can detect and respond to various objects, including humans, other cars, and road signs.

Onboard software creates a driving plan and sends commands to adjust vehicle components like the steering wheel, throttle, and brake.

The evolution of self-driving car technology reflects significant improvements in sensor integration and computational efficiency.

The DARPA Urban Challenge was a pivotal event that accelerated the development of autonomous driving capabilities.

The Boss robot car's victory in the DARPA Challenge demonstrated the potential for autonomous vehicles to perform complex driving tasks.

Google's autonomous vehicles represent a leap forward in the practical application of self-driving technology.

Transcripts

[00:03] There is a lot of interest today in robots that drive, otherwise known as self-driving cars, and such technology has been depicted in fiction for a very long time. The origin of self-driving car technology can probably be traced back to these two research robots from the 1960s.

[00:22] On the left we have a machine known as the Stanford cart. This one here was built by Hans Moravec. What this robot is doing is using a vision system, in fact a stereo vision system, to reconstruct the three-dimensional nature of the world in which it's driving. It uses that information to plan a path so that it can avoid hitting any of these obstacles. This machine was excruciatingly slow, mostly dictated by the limited computing power that was available for the problem back in 1964.

[00:53] The robot on the right is also pretty famous. It's known as Shakey and was developed at SRI International in the late 60s, and its career went on through the 70s. This robot also used its sense of vision to build a map of the environment in which it was navigating.

[01:04] A major step forward in self-driving car technology was the DARPA Urban Challenge in 2007. A number of teams competed to build robot cars that could perform as well as human drivers. They had to perform tasks like moving into parking bays. They had to do the right things at the intersections. They had to demonstrate that they could do overtaking, and all of this safely, with skill levels comparable to human drivers.

[01:37] The winner of that competition was this robot car called 'Boss', developed by Carnegie Mellon University, and we can see that it doesn't look anything like an ordinary car. It's bristling with all sorts of sensory devices, and a large part of the car is filled with high-performance computing equipment.

[01:54] Now technology has evolved pretty rapidly, so by the year 2014, Google cars look much more sleek. There is really in fact only one sensor that's obvious when you look at the car, and that is the device known as a Velodyne scanning laser range finder on the roof of the car.

[02:11] The way the robot car sees its world is shown here in what we call a point cloud image, and this is generated by that Velodyne scanner that we saw on top of the Google car. The point cloud is a number of points in three-dimensional space, and they are typically color coded. So the colors blue, the cool colors, indicate the ground plane on which the robot is driving, and points above the ground plane, where it might be imprudent to drive, are colored green, or red for those points that are very high above the ground plane. So from this fairly simple three-dimensional geometric model of the world surrounding the car, the software on board the car is able to make a number of decisions about which direction it should drive.

[02:55] It's able to see other objects, perhaps human beings, perhaps other cars, perhaps road signs. The software on board the vehicle has to take all of this rich sensory information and create a plan, and then send commands to the car to adjust the steering wheel or to adjust the throttle or to adjust the brake.

