Dot products and duality | Chapter 9, Essence of linear algebra

3Blue1Brown
24 Aug 2016 · 14:11

Summary

TL;DR: This video delves into the concept of dot products in linear algebra, traditionally introduced early but discussed here in the context of linear transformations for a deeper understanding. It explains the numerical computation of dot products and their geometric interpretation in terms of vector projection. It explores the surprising symmetry of dot products and their relation to linear transformations, particularly transformations from multiple dimensions down to one. It introduces the idea of duality, where vectors and linear transformations are intrinsically linked, with the dot product serving as the bridge between the two. The beauty of this relationship is that vectors can be seen as representations of transformations, offering a new perspective on their role in mathematics.

Takeaways

  • 📚 Dot products are traditionally introduced early in linear algebra but are better understood in the context of linear transformations.
  • 🔢 The numerical definition of a dot product involves multiplying corresponding components of two vectors and summing the results (a quick numerical check against the geometric formula appears after this list).
  • 📏 The geometric interpretation of a dot product involves projecting one vector onto the other and multiplying the length of that projection by the length of the vector being projected onto.
  • 📉 A dot product can be negative, zero, or positive depending on the relative directions of the two vectors.
  • 🤔 The order of vectors in a dot product does not matter due to the symmetrical nature of projection and scaling.
  • 🔄 Scaling one vector affects the dot product in a consistent way whether you project onto the scaled vector or from it.
  • 🔍 The connection between the numerical process of dot product and geometric projection is clarified through the concept of duality.
  • 📈 Linear transformations from multiple dimensions to one (the number line) maintain even spacing of points, a key property of linearity.
  • 📋 A 1x2 matrix representation is used to describe linear transformations to one dimension, akin to a vector turned on its side.
  • 🔑 The dot product with a unit vector can be seen as projecting a vector onto the line spanned by the unit vector and measuring the length of that projection.
  • 🌐 Duality in mathematics, including linear algebra, reveals surprising correspondences between different mathematical concepts, such as vectors and transformations.
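
As a quick sanity check of the first two takeaways, here is a minimal Python sketch (the vectors are arbitrary illustrative choices) comparing the component-wise sum with the standard |v||w|cos(θ) formula, where the angle is obtained from atan2 rather than from the dot product itself:

```python
import math

def dot(v, w):
    # Component-wise products, summed: the numerical definition.
    return sum(a * b for a, b in zip(v, w))

def geometric_dot(v, w):
    # |v| * |w| * cos(angle between them), with the angle taken
    # from atan2 so the dot product itself is never used here.
    angle = math.atan2(w[1], w[0]) - math.atan2(v[1], v[0])
    norm = lambda u: math.hypot(u[0], u[1])
    return norm(v) * norm(w) * math.cos(angle)

v, w = (1.0, 2.0), (3.0, 4.0)
print(dot(v, w))            # 11.0
print(geometric_dot(v, w))  # ~11.0, up to floating-point error
```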

Q & A

  • What is the standard way of introducing dot products in linear algebra?

    -The standard way of introducing dot products is by taking two vectors of the same dimension, pairing up all of the coordinates, multiplying those pairs together, and then adding the result.

  • How does the geometric interpretation of the dot product relate to projections?

    -The geometric interpretation involves projecting one vector onto the line that passes through the origin and the tip of the other vector, then multiplying the length of this projection by the length of the vector to get the dot product.

  • Why is the dot product between two vectors positive, zero, or negative?

    -The dot product is positive when the vectors are generally pointing in the same direction, zero when they are perpendicular, and negative when they point in generally opposite directions.

  • Why doesn't the order of the vectors matter in the dot product calculation?

    -The order doesn't matter because projecting one vector onto the other and multiplying the projection's length by the other vector's length yields the same result regardless of which vector is projected.

  • How does scaling one vector affect the dot product with another vector?

    -Scaling one vector by a constant scales the dot product by that same constant. Under one interpretation the projection is unchanged but the vector being projected onto gets longer; under the other, the projection itself gets longer; either way, the effect on the dot product's value is the same.

  • What is the connection between linear transformations and vectors in the context of the dot product?

    -Linear transformations that take vectors to numbers can be described by 1x2 matrices, which are numerically similar to 2D vectors when the matrix is turned on its side. This suggests a connection between linear transformations and vectors, where the dot product can be seen as a form of such a transformation.

  • What is the significance of the 1x2 matrix in the context of linear transformations from 2D to 1D?

    -The 1x2 matrix represents a linear transformation that takes basis vectors i-hat and j-hat to specific numbers on the number line, with each column of the matrix representing where each basis vector lands.

  • How does the concept of duality relate to the dot product and linear transformations?

    -Duality refers to a natural but surprising correspondence between two types of mathematical things. In the context of linear algebra, the dual of a vector is the linear transformation it encodes, and vice versa.

  • What role does the unit vector u-hat play in the explanation of the dot product as a projection?

    -The unit vector u-hat is used to define a linear transformation from 2D vectors to numbers by projecting onto a diagonal number line. The dot product with u-hat is computationally identical to this projection transformation.

  • How does scaling a unit vector affect the interpretation of the dot product?

    -Scaling a unit vector by some factor scales the output of the associated linear transformation by that factor, so the dot product with the scaled vector equals the projection length multiplied by the vector's new length.

  • What is the deeper significance of the dot product in understanding vectors and linear transformations?

    -The deeper significance is that the dot product not only serves as a geometric tool for understanding projections and vector directions but also as a bridge between vectors and linear transformations, embodying the concept of duality in mathematics.

Outlines

00:00

📚 Introduction to Dot Products and Linear Transformations

This paragraph introduces the concept of dot products, traditionally taught early in linear algebra, but here discussed in the context of linear transformations for a deeper understanding. The standard introduction to dot products involves numerical computation by multiplying corresponding components of two vectors and summing the results. The geometric interpretation involves projecting one vector onto another and calculating the product of the projection's length and the original vector's length. The paragraph also touches on the surprising symmetry of the dot product, where the order of the vectors does not affect the result, and the need to delve into linear transformations to fully appreciate the role of dot products in mathematics.

05:04

🔍 Exploring Linear Transformations and Dot Product's Role

The second paragraph delves into linear transformations, specifically those mapping from multiple dimensions to a single dimension, exemplified by the number line. It explains how such transformations maintain the even spacing of points, a key characteristic of linearity. The discussion then shifts to how these transformations can be represented by 1x2 matrices, with the example of a transformation mapping i-hat to 1 and j-hat to -2. The paragraph highlights the connection between these transformations and vectors, suggesting that a 1x2 matrix can be thought of as a vector 'tilted' on its side. This leads to the revelation that the numerical operation of matrix-vector multiplication is akin to taking the dot product of two vectors, thereby bridging the gap between the numerical and geometric interpretations of the dot product.

10:04

🌐 The Duality of Vectors and Transformations Through Dot Products

The final paragraph explores the profound connection between vectors and linear transformations, a concept known as duality. It uses the example of projecting 2D vectors onto a diagonal number line to define a linear transformation from 2D vectors to numbers. This projection is shown to be equivalent to taking a dot product with a specific unit vector, u-hat, which lies along the diagonal. The paragraph explains how the components of u-hat directly relate to the transformation matrix's columns, illustrating the symmetry and elegance of this relationship. It further discusses how scaling the unit vector affects the transformation, and how the dot product can be interpreted in terms of projection and scaling. The summary concludes by emphasizing the beauty of duality in mathematics, where vectors and transformations are deeply interconnected, and how understanding a vector as a linear transformation can provide new insights into its nature.

Keywords

💡Dot Product

The dot product is a fundamental concept in linear algebra that represents a mathematical operation between two vectors. It is defined as the sum of the products of their respective coordinates. In the video, the dot product is introduced as a means to quantify the geometric relationship between two vectors, such as determining the angle between them or projecting one vector onto another. The script illustrates the dot product numerically with examples like multiplying the pairs of coordinates from two vectors and adding the results.

💡Linear Transformations

Linear transformations are functions that map vectors from one space to another while preserving the operations of vector addition and scalar multiplication. In the context of the video, linear transformations are discussed in relation to their role in understanding the dot product, particularly how they can project vectors and relate to the concept of duality. The script explains that linear transformations from multiple dimensions to one dimension maintain the even spacing of points on a line when transformed, which is a key property used to explore the connection between vectors and transformations.

💡Projection

Projection in the video refers to the process of casting a vector onto another vector or a line, resulting in a new vector that represents the component of the original vector in the direction of the second vector. The script describes how the dot product can be interpreted as the product of the length of the projection of one vector onto another and the length of that second vector. This concept is crucial for understanding the geometric interpretation of the dot product.

💡Unit Vector

A unit vector is a vector with a length of one. In the video, the unit vector u-hat is used to illustrate the concept of projection and the relationship between vectors and linear transformations. The script explains that projecting a 2D vector onto a diagonal number line effectively defines a linear transformation, and the unit vector's role in this process helps to establish the connection between the dot product and projection.

💡Duality

Duality in mathematics refers to a relationship between two concepts that are mirror images of each other in a certain sense. In the video, duality is introduced as a profound concept that shows a surprising correspondence between vectors and linear transformations. The script demonstrates this through the example of how a vector can be seen as encoding a linear transformation, and vice versa, which is a key insight into the deeper understanding of the dot product.

💡Basis Vectors

Basis vectors are the fundamental building blocks of a vector space, used to express any vector in that space as a linear combination of the basis vectors. In the video, the script mentions i-hat and j-hat, which are the standard basis vectors in two-dimensional space. These basis vectors are crucial in defining linear transformations and understanding how they map to numbers, as seen in the example where they are projected onto a diagonal number line.

💡Vector Scaling

Vector scaling is the process of multiplying a vector by a scalar, which results in a new vector that is longer or shorter than the original, depending on the scalar's value. The video script discusses how scaling a vector affects the dot product, illustrating that when one vector is scaled, the dot product scales by the same factor, and this effect is identical whether you project one vector onto the other or vice versa.

💡Geometric Interpretation

The geometric interpretation of a mathematical concept involves understanding it in terms of shapes, sizes, and spatial relationships. In the video, the geometric interpretation of the dot product is explored through the concept of projection, where the dot product is visualized as the product of the length of the projection of one vector onto another and the length of that vector. This interpretation helps to clarify the seemingly asymmetric nature of the dot product.

💡Numerical Process

A numerical process refers to a method of computation or calculation that involves numbers. In the context of the video, the numerical process of calculating the dot product is discussed, which involves multiplying corresponding components of two vectors and summing the results. The script highlights how this numerical process is intimately connected to the geometric concept of projection, providing a deeper understanding of the dot product.

💡Perpendicular Vectors

Perpendicular vectors are vectors that are at right angles to each other. In the video, the concept of perpendicularity is used to explain the dot product's value when two vectors are perpendicular; the dot product is zero because the projection of one vector onto another is the zero vector. This concept is essential for understanding the relationship between the orientation of vectors and their dot product.

Highlights

Introduction of dot products in the context of linear transformations rather than at the beginning of a linear algebra course.

Dot product definition: Numeric multiplication and addition of corresponding vector components.

Geometric interpretation of the dot product as a projection and its relation to vector directions.

Asymmetry in the geometric interpretation of dot products and its implications.

Commutative property of dot products and the intuition behind it.

Explaining the connection between scaling vectors and the effect on the dot product.

The concept of duality in mathematics and its relation to dot products.

Introduction to linear transformations from multiple dimensions to one dimension.

Visual property of linear transformations maintaining even spacing in the output space.

The role of basis vectors in determining linear transformations and their matrix representation.

Example of applying a linear transformation using matrix-vector multiplication.

Association between 1x2 matrices and 2D vectors through geometric and numerical representations.

Geometric connection between linear transformations and vectors through projection.

The significance of a unit vector in defining a linear transformation from 2D vectors to numbers.

How the dot product with a unit vector can be interpreted as a projection and length calculation.

Interpretation of the dot product with non-unit vectors in terms of projection and scaling.

The unique relationship between linear transformations and vectors described by duality.

Philosophical insight into viewing vectors as linear transformations for deeper understanding.

Preview of the next video discussing the cross product and further exploration of duality.

Transcripts

[00:16] ["Ode to Joy", by Beethoven, plays.] Traditionally, dot products are something that's introduced really early on in a linear algebra course, typically right at the start. So it might seem strange that I've pushed them back this far in the series. I did this because there's a standard way to introduce the topic, which requires nothing more than a basic understanding of vectors, but a fuller understanding of the role that dot products play in math can only really be found under the light of linear transformations.

[00:43] Before that, though, let me just briefly cover the standard way that dot products are introduced, which I'm assuming is at least partially review for a number of viewers. Numerically, if you have two vectors of the same dimension, two lists of numbers with the same lengths, taking their dot product means pairing up all of the coordinates, multiplying those pairs together, and adding the result. So the vector (1, 2) dotted with (3, 4) would be 1 times 3 plus 2 times 4. The vector (6, 2, 8, 3) dotted with (1, 8, 5, 3) would be 6 times 1 plus 2 times 8 plus 8 times 5 plus 3 times 3.
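
To make this arithmetic concrete, here is a minimal sketch of the component-wise computation (the function name `dot` is just an illustrative choice):

```python
def dot(v, w):
    # Pair up coordinates, multiply each pair, and add the results.
    assert len(v) == len(w), "vectors must have the same dimension"
    return sum(a * b for a, b in zip(v, w))

print(dot([1, 2], [3, 4]))              # 1*3 + 2*4 = 11
print(dot([6, 2, 8, 3], [1, 8, 5, 3]))  # 6*1 + 2*8 + 8*5 + 3*3 = 71
```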

[01:24] Luckily, this computation has a really nice geometric interpretation. To think about the dot product between two vectors, v and w, imagine projecting w onto the line that passes through the origin and the tip of v. Multiplying the length of this projection by the length of v, you have the dot product v dot w. Except when this projection of w is pointing in the opposite direction from v, that dot product will actually be negative. So when two vectors are generally pointing in the same direction, their dot product is positive. When they're perpendicular, meaning the projection of one onto the other is the zero vector, their dot product is zero. And if they point in generally the opposite direction, their dot product is negative.
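
A hedged numerical illustration of this sign behaviour, with example vectors chosen for the three cases: the signed length of w's projection onto v is computed here by rotating w so that v lies along the positive x-axis (so the dot product is never used in the projection step), and that signed length times |v| matches the component-wise dot product each time.

```python
import math

def dot(v, w):
    return v[0] * w[0] + v[1] * w[1]

def signed_projection_length(w, v):
    # Rotate w by the negative of v's angle; the x-coordinate of the
    # rotated w is the signed length of w's projection onto v.
    theta = -math.atan2(v[1], v[0])
    return w[0] * math.cos(theta) - w[1] * math.sin(theta)

v = (2.0, 1.0)
# Roughly same direction, perpendicular, and opposite direction:
for w in [(3.0, 2.0), (-1.0, 2.0), (-3.0, -2.0)]:
    geometric = signed_projection_length(w, v) * math.hypot(*v)
    print(round(geometric, 6), dot(v, w))   # positive, zero, negative
```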

[02:11] Now, this interpretation is weirdly asymmetric. It treats the two vectors very differently. So when I first learned this, I was surprised that order doesn't matter. You could instead project v onto w, multiply the length of the projected v by the length of w, and get the same result. I mean, doesn't that feel like a really different process? Here's the intuition for why order doesn't matter. If v and w happened to have the same length, we could leverage some symmetry: projecting w onto v, then multiplying the length of that projection by the length of v, is a complete mirror image of projecting v onto w, then multiplying the length of that projection by the length of w.

[02:57] Now, if you scale one of them, say v, by some constant like 2, so that they don't have equal length, the symmetry is broken. But let's think through how to interpret the dot product between this new vector, 2 times v, and w. If you think of w as getting projected onto v, then the dot product 2v dot w will be exactly twice the dot product v dot w. This is because when you scale v by 2, it doesn't change the length of the projection of w, but it doubles the length of the vector that you're projecting onto. But on the other hand, let's say you were thinking about v getting projected onto w. Well, in that case, the length of the projection is the thing that gets scaled when we multiply v by 2, but the length of the vector that you're projecting onto stays constant. So the overall effect is still to just double the dot product. So even though symmetry is broken in this case, the effect that this scaling has on the value of the dot product is the same under both interpretations.
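
A brief check of the two claims from this passage, using arbitrary example vectors: the order of the arguments does not change the result, and scaling one vector by 2 doubles the dot product.

```python
def dot(v, w):
    return sum(a * b for a, b in zip(v, w))

def scale(c, v):
    return [c * a for a in v]

v, w = [1, 2], [3, 4]
print(dot(v, w) == dot(w, v))                # True: order doesn't matter
print(dot(scale(2, v), w) == 2 * dot(v, w))  # True: scaling v by 2 doubles the result
```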

[03:56] There's also one other big question that confused me when I first learned this stuff. Why on earth does this numerical process of matching coordinates, multiplying pairs, and adding them together have anything to do with projection? Well, to give a satisfactory answer, and also to do full justice to the significance of the dot product, we need to unearth something a little bit deeper going on here, which often goes by the name duality. But before getting into that, I need to spend some time talking about linear transformations from multiple dimensions to one dimension, which is just the number line.

[04:32] These are functions that take in a 2D vector and spit out some number, but linear transformations are of course much more restricted than your run-of-the-mill function with a 2D input and a 1D output. As with transformations in higher dimensions, like the ones I talked about in chapter 3, there are some formal properties that make these functions linear, but I'm going to purposefully ignore those here so as to not distract from our end goal, and instead focus on a certain visual property that's equivalent to all the formal stuff.

[04:59] If you take a line of evenly spaced dots and apply a transformation, a linear transformation will keep those dots evenly spaced once they land in the output space, which is the number line. Otherwise, if there's some line of dots that gets unevenly spaced, then your transformation is not linear.
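
The even-spacing test can be checked numerically; here is a minimal sketch (the particular maps are illustrative choices, not from the video) in which a linear map to the number line keeps the gaps between evenly spaced points constant, while a nonlinear one does not.

```python
def gaps(outputs):
    # Differences between consecutive outputs on the number line.
    return [round(b - a, 6) for a, b in zip(outputs, outputs[1:])]

# Evenly spaced points along the line through (1, 1) with direction (1, 2).
points = [(1 + t, 1 + 2 * t) for t in range(5)]

linear = lambda p: 3 * p[0] - 2 * p[1]       # a linear map from 2D to the number line
nonlinear = lambda p: p[0] * p[1]            # not linear

print(gaps([linear(p) for p in points]))     # constant gaps: [-1, -1, -1, -1]
print(gaps([nonlinear(p) for p in points]))  # gaps change: [5, 9, 13, 17], so not linear
```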

[05:19] As with the cases we've seen before, one of these linear transformations is completely determined by where it takes i-hat and j-hat, but this time each one of those basis vectors just lands on a number, so when we record where they land as the columns of a matrix, each of those columns just has a single number. This is a 1x2 matrix.

[05:41] Let's walk through an example of what it means to apply one of these transformations to a vector. Let's say you have a linear transformation that takes i-hat to 1 and j-hat to negative 2. To follow where a vector with coordinates, say, (4, 3) ends up, think of breaking up this vector as 4 times i-hat plus 3 times j-hat. A consequence of linearity is that after the transformation, the vector will be 4 times the place where i-hat lands, 1, plus 3 times the place where j-hat lands, negative 2, which in this case implies that it lands on negative 2. When you do this calculation purely numerically, it's matrix-vector multiplication.
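
This specific example can be reproduced directly; a minimal sketch of applying a 1x2 matrix whose entries record where i-hat and j-hat land:

```python
def apply_1x2(matrix, v):
    # matrix = [where i-hat lands, where j-hat lands]
    # Linearity: output = v_x * (image of i-hat) + v_y * (image of j-hat).
    return matrix[0] * v[0] + matrix[1] * v[1]

L = [1, -2]                   # i-hat -> 1, j-hat -> -2
print(apply_1x2(L, [4, 3]))   # 4*1 + 3*(-2) = -2
```

Note that this is exactly the same arithmetic as dotting (4, 3) with the matrix tipped up into the vector (1, -2), which is the association described next.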

[06:25] Now, this numerical operation of multiplying a 1x2 matrix by a vector feels just like taking the dot product of two vectors. Doesn't that 1x2 matrix just look like a vector that we tipped on its side? In fact, we could say right now that there's a nice association between 1x2 matrices and 2D vectors, defined by tilting the numerical representation of a vector on its side to get the associated matrix, or tipping the matrix back up to get the associated vector. Since we're just looking at numerical expressions right now, going back and forth between vectors and 1x2 matrices might feel like a silly thing to do. But this suggests something that's truly awesome from the geometric view: there's some kind of connection between linear transformations that take vectors to numbers and vectors themselves.

[07:14] Let me show an example that clarifies the significance, and which just so happens to also answer the dot product puzzle from earlier. Unlearn what you have learned, and imagine that you don't already know that the dot product relates to projection. What I'm going to do here is take a copy of the number line and place it diagonally in space somehow, with the number 0 sitting at the origin. Now think of the two-dimensional unit vector whose tip sits where the number 1 on the number line is. I want to give that guy a name, u-hat. This little guy plays an important role in what's about to happen, so just keep him in the back of your mind. If we project 2D vectors straight onto this diagonal number line, in effect, we've just defined a function that takes 2D vectors to numbers.

[07:59] What's more, this function is actually linear, since it passes our visual test that any line of evenly spaced dots remains evenly spaced once it lands on the number line. Just to be clear, even though I've embedded the number line in 2D space like this, the outputs of the function are numbers, not 2D vectors. You should think of a function that takes in two coordinates and outputs a single coordinate. But that vector u-hat is a two-dimensional vector, living in the input space. It's just situated in such a way that it overlaps with the embedding of the number line. With this projection, we just defined a linear transformation from 2D vectors to numbers, so we're going to be able to find some kind of 1x2 matrix that describes that transformation. To find that 1x2 matrix, let's zoom in on this diagonal number line setup and think about where i-hat and j-hat each land, since those landing spots are going to be the columns of the matrix.

[08:58] This part's super cool. We can reason through it with a really elegant piece of symmetry. Since i-hat and u-hat are both unit vectors, projecting i-hat onto the line passing through u-hat looks totally symmetric to projecting u-hat onto the x-axis. So when we ask what number i-hat lands on when it gets projected, the answer is going to be the same as whatever u-hat lands on when it's projected onto the x-axis. But projecting u-hat onto the x-axis just means taking the x-coordinate of u-hat. So by symmetry, the number where i-hat lands when it's projected onto that diagonal number line is going to be the x-coordinate of u-hat. Isn't that cool? The reasoning is almost identical for the j-hat case. Think about it for a moment. For all the same reasons, the y-coordinate of u-hat gives us the number where j-hat lands when it's projected onto the number line copy. Pause and ponder that for a moment. I just think that's really cool.

[10:00] So the entries of the 1x2 matrix describing the projection transformation are going to be the coordinates of u-hat. And computing this projection transformation for arbitrary vectors in space, which requires multiplying that matrix by those vectors, is computationally identical to taking a dot product with u-hat. This is why taking the dot product with a unit vector can be interpreted as projecting a vector onto the span of that unit vector and taking the length.
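
A numerical version of this symmetry argument, with the projection onto the diagonal line computed by rotating space so that u-hat lies along the x-axis (so the dot product is never used to define the projection). The particular u-hat below is an arbitrary unit vector chosen for illustration.

```python
import math

theta = math.radians(30)                     # direction of the diagonal number line
u_hat = (math.cos(theta), math.sin(theta))   # unit vector whose tip sits at "1"

def project_to_line(p):
    # Signed coordinate of p on the diagonal number line:
    # rotate by -theta and read off the x-coordinate.
    return p[0] * math.cos(-theta) - p[1] * math.sin(-theta)

def dot(v, w):
    return v[0] * w[0] + v[1] * w[1]

print(project_to_line((1, 0)), u_hat[0])   # i-hat lands on the x-coordinate of u-hat
print(project_to_line((0, 1)), u_hat[1])   # j-hat lands on the y-coordinate of u-hat
v = (4.0, 3.0)
print(round(project_to_line(v), 6), round(dot(v, u_hat), 6))  # identical for any vector
```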

[10:34] So what about non-unit vectors? For example, let's say we take that unit vector u-hat, but we scale it up by a factor of 3. Numerically, each of its components gets multiplied by 3. So looking at the matrix associated with that vector, it takes i-hat and j-hat to three times the values where they landed before. Since this is all linear, it implies more generally that the new matrix can be interpreted as projecting any vector onto the number line copy and multiplying where it lands by 3. This is why the dot product with a non-unit vector can be interpreted as first projecting onto that vector, then scaling up the length of that projection by the length of the vector.
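
And the non-unit case, in a small self-contained sketch that leans on the unit-vector identity just established (the length-3 scaling and the sample vector are arbitrary choices): scaling u-hat by 3 scales every output by 3, which is the "project, then multiply by the length" reading of the dot product.

```python
import math

def dot(v, w):
    return v[0] * w[0] + v[1] * w[1]

theta = math.radians(30)
u_hat = (math.cos(theta), math.sin(theta))
u_scaled = (3 * u_hat[0], 3 * u_hat[1])   # a length-3 vector along the same line

v = (4.0, 3.0)
projection = dot(v, u_hat)                # projection length of v (unit-vector case)
print(round(dot(v, u_scaled), 6))         # dot with the length-3 vector...
print(round(3 * projection, 6))           # ...equals 3 times the projection length
```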

[11:17] Take a moment to think about what happened here. We had a linear transformation from 2D space to the number line, which was not defined in terms of numerical vectors or numerical dot products; it was just defined by projecting space onto a diagonal copy of the number line. But because the transformation is linear, it was necessarily described by some 1x2 matrix. And since multiplying a 1x2 matrix by a 2D vector is the same as turning that matrix on its side and taking a dot product, this transformation was inescapably related to some 2D vector. The lesson here is that any time you have one of these linear transformations whose output space is the number line, no matter how it was defined, there's going to be some unique vector v corresponding to that transformation, in the sense that applying the transformation is the same thing as taking a dot product with that vector.
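
This lesson lends itself to a small sketch: take any linear map from 2D to the number line, however it was defined, read off where it sends the two basis vectors, and the resulting vector is its dual in the sense described here (the specific map below is an arbitrary example, not one from the video).

```python
def dual_vector(f):
    # The vector whose dot product reproduces the linear map f: 2D -> number line.
    return (f((1, 0)), f((0, 1)))         # where i-hat and j-hat land

def dot(v, w):
    return v[0] * w[0] + v[1] * w[1]

f = lambda p: 2.5 * p[0] - 4 * p[1]       # some linear map to the number line
v_dual = dual_vector(f)
for v in [(1, 1), (4, 3), (-2, 5)]:
    print(f(v), dot(v, v_dual))           # applying f == dotting with its dual vector
```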

[12:09] To me, this is utterly beautiful. It's an example of something in math called duality. Duality shows up in many different ways and forms throughout math, and it's super tricky to actually define. Loosely speaking, it refers to situations where you have a natural but surprising correspondence between two types of mathematical thing. For the linear algebra case that you just learned about, you'd say that the dual of a vector is the linear transformation that it encodes, and the dual of a linear transformation from some space to one dimension is a certain vector in that space.

[12:46] So to sum up, on the surface, the dot product is a very useful geometric tool for understanding projections and for testing whether or not vectors tend to point in the same direction. And that's probably the most important thing for you to remember about the dot product. But at a deeper level, dotting two vectors together is a way to translate one of them into the world of transformations. Again, numerically, this might feel like a silly point to emphasize; it's just two computations that happen to look similar. But the reason I find this so important is that throughout math, when you're dealing with a vector, once you really get to know its personality, sometimes you realize that it's easier to understand it not as an arrow in space, but as the physical embodiment of a linear transformation. It's as if the vector is really just a conceptual shorthand for a certain transformation, since it's easier for us to think about arrows in space rather than moving all of that space to the number line.

[13:42] In the next video, you'll see another really cool example of this duality in action, as I talk about the cross product.

