Eigenvectors and eigenvalues | Chapter 14, Essence of linear algebra

3Blue1Brown
15 Sept 2016 · 17:15

Summary

TL;DR: This script delves into the concept of eigenvectors and eigenvalues, often found challenging by students. It emphasizes the importance of understanding matrices as linear transformations and being comfortable with determinants, linear systems, and change of basis. The script explains how eigenvectors remain on their span during transformations, with eigenvalues representing the stretch or squish factor. It also touches on the utility of eigenvectors in identifying axes of rotation in 3D space and on how eigenbases simplify matrix operations, making complex transformations more manageable.

Takeaways

  • 🧠 Eigenvectors and eigenvalues are foundational concepts in linear algebra that can be challenging to grasp without a strong visual understanding.
  • 🔍 Understanding matrices as linear transformations is crucial for comprehending eigenvectors and eigenvalues.
  • 📏 Eigenvectors are special vectors that remain on their span after a linear transformation, only being stretched or squished by a scalar factor known as the eigenvalue.
  • 📐 Most vectors get knocked off the line they span during a transformation; eigenvectors are exactly the vectors whose images land back on that same line.
  • 🔄 Eigenvectors with eigenvalues of 1 indicate vectors that remain fixed in place after a transformation, such as the axis of rotation in 3D space.
  • 🔢 The process of finding eigenvectors and eigenvalues involves solving for values that satisfy the equation Av = λv, where A is the matrix, v is the eigenvector, and λ is the eigenvalue (see the numeric sketch after this list).
  • 📉 Determining eigenvalues involves setting the determinant of (A - λI) to zero, where I is the identity matrix, and solving for λ.
  • 🔍 The absence of real eigenvalues, such as in the case of a 90-degree rotation, indicates that no real eigenvectors exist for the transformation.
  • 📚 A shear transformation provides an example where eigenvectors exist but are limited to a single line, highlighting the diversity of eigenvector behavior.
  • 🌐 The concept of an eigenbasis, where basis vectors are eigenvectors, simplifies matrix operations and transformations, especially when raising matrices to high powers.
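
A minimal numeric sketch of that defining equation, using the video's example matrix with columns (3, 0) and (1, 2) (numpy is assumed; the eigenpair comes from the video):

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [0.0, 2.0]])   # columns (3, 0) and (1, 2), as in the video

v = np.array([-1.0, 1.0])    # the "sneaky" eigenvector from the video
lam = 2.0                    # its eigenvalue

print(A @ v)     # [-2.  2.]
print(lam * v)   # [-2.  2.] -- A merely scales v; it stays on its span
```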

Q & A

  • What is an eigenvector?

    -An eigenvector is a special vector that remains on its own span during a linear transformation, meaning it only gets stretched or squished without being rotated.

  • What is the significance of eigenvalues?

    -Eigenvalues represent the factor by which an eigenvector is stretched or squished during a linear transformation.

  • Why are eigenvectors and eigenvalues important in understanding linear transformations?

    -Eigenvectors and eigenvalues help to understand the essence of a linear transformation without being dependent on the specific coordinate system, making it easier to analyze transformations.

  • How can eigenvectors help in visualizing a 3D rotation?

    -If you find an eigenvector for a 3D rotation, it corresponds to the axis of rotation, simplifying the understanding of the rotation to just an axis and an angle.
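
As a hedged illustration (numpy is assumed, and the rotation matrix below is a hypothetical example, not one taken from the video), the axis falls out as the eigenvector with eigenvalue 1:

```python
import numpy as np

# Hypothetical example: rotation by 90 degrees about the z-axis
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])

w, v = np.linalg.eig(R)
axis = v[:, np.isclose(w, 1.0)].real   # the column with eigenvalue 1
print(axis.ravel())                    # [0. 0. 1.] -- the axis of rotation
```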

  • What is the relationship between eigenvectors and the columns of a transformation matrix?

    -The columns of a transformation matrix record where the basis vectors land after the transformation. Eigenvectors, by contrast, are the special vectors that are merely stretched or squished without being rotated off their span; when the basis vectors themselves happen to be eigenvectors, the matrix is diagonal.

  • What does it mean for a matrix to be diagonalizable?

    -A matrix is diagonalizable if there exists a basis of eigenvectors, meaning it can be represented as a diagonal matrix in some coordinate system.

  • How do you determine if a matrix has eigenvectors?

    -You determine if a matrix has eigenvectors by finding values of lambda that make the determinant of (A - lambda * I) zero, where A is the matrix and I is the identity matrix.
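
A small symbolic sketch of that condition (sympy is assumed; any computer algebra system would do), applied to the video's matrix with columns (3, 0) and (1, 2):

```python
import sympy as sp

lam = sp.symbols('lambda')
A = sp.Matrix([[3, 1], [0, 2]])

char_poly = (A - lam * sp.eye(2)).det()   # det(A - lambda*I)
print(sp.factor(char_poly))               # (lambda - 2)*(lambda - 3)
print(sp.solve(char_poly, lam))           # [2, 3] -- the eigenvalues
```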

  • What is the role of the determinant in finding eigenvalues?

    -The determinant is used to find eigenvalues by setting it to zero after subtracting lambda from the diagonal elements of the matrix, resulting in a polynomial equation that can be solved for lambda.

  • Why might a transformation not have eigenvectors?

    -A transformation might not have eigenvectors if it rotates every vector off of its own span, as is the case with a 90-degree rotation, which results in imaginary eigenvalues.

  • What is an eigenbasis and why is it useful?

    -An eigenbasis is a set of basis vectors that are also eigenvectors. It is useful because it allows the transformation matrix to be represented as a diagonal matrix, simplifying operations like computing matrix powers.

  • How does changing the basis to an eigenbasis simplify matrix operations?

    -Changing to an eigenbasis simplifies matrix operations because the transformation matrix in this basis is diagonal, with eigenvalues on the diagonal, making operations like matrix multiplication and exponentiation much easier.
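
A minimal sketch of that claim (numpy is assumed), using a modest power so the floating-point comparison stays honest:

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [0.0, 2.0]])

w, P = np.linalg.eig(A)   # columns of P are eigenvectors
D_pow = np.diag(w ** 5)   # powering a diagonal matrix is entrywise

via_eigenbasis = P @ D_pow @ np.linalg.inv(P)   # P D^5 P^-1
direct = np.linalg.matrix_power(A, 5)

print(np.allclose(via_eigenbasis, direct))   # True
```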

Outlines

00:00

📐 Understanding Eigenvectors and Eigenvalues

This paragraph introduces the concept of eigenvectors and eigenvalues, which are often challenging for students to grasp intuitively. The author acknowledges that while the topic itself is straightforward, a solid visual understanding of prerequisite topics like linear transformations, determinants, and change of basis is crucial. The paragraph explains how eigenvectors remain on their span during a linear transformation, exemplified by a two-dimensional transformation that moves the basis vectors i-hat and j-hat to specific coordinates. The author emphasizes that eigenvectors are special vectors that only get stretched or compressed by the transformation, without being rotated off their span. The eigenvalue associated with an eigenvector is the factor by which it is stretched or compressed. The paragraph concludes with the idea that eigenvectors can simplify the understanding of complex transformations, such as finding the axis of rotation in a 3D space.

05:00

🔍 Computational Insight into Eigenvectors

The second paragraph delves into the computational aspect of finding eigenvectors and eigenvalues. It starts by defining the symbolic representation of an eigenvector equation, where a matrix A represents a transformation, v is the eigenvector, and lambda is the eigenvalue. The challenge of equating matrix-vector multiplication with scalar-vector multiplication is addressed by introducing a scaling matrix. The process of finding eigenvalues involves adjusting lambda to make the determinant of the matrix (A - lambda*I) zero, which signifies that the transformation squishes space into a lower dimension. The author uses a concrete example to illustrate how tweaking lambda affects the determinant and how finding the 'sweet spot' for lambda helps identify eigenvalues. The paragraph also touches on the importance of understanding determinants and linear systems for grasping the concept of eigenvalues and eigenvectors.

10:03

🔄 Exploring Eigenvectors in Different Transformations

This paragraph explores eigenvectors in the context of different types of transformations, such as rotations and shears. It explains that not all transformations have eigenvectors, using a 90-degree rotation as an example where no eigenvectors exist because all vectors are rotated off their span. The paragraph also discusses the eigenvalues and eigenvectors of a shear transformation, where all vectors on the x-axis are eigenvectors with an eigenvalue of 1. The concept of an eigenbasis is introduced, which is a set of basis vectors that are also eigenvectors, leading to a diagonal matrix representation of the transformation. The benefits of diagonal matrices, such as simplifying the computation of matrix powers, are highlighted. The paragraph concludes by emphasizing the utility of eigenbases in making matrix operations more manageable.

15:04

🌐 Change of Basis and Eigenbasis

The final paragraph discusses the concept of change of basis and how it relates to eigenvectors. It explains that by using eigenvectors as the new basis vectors, the transformation matrix becomes diagonal with the eigenvalues on the diagonal. This process simplifies operations like computing the power of a matrix. The paragraph provides a step-by-step guide on how to change the basis to an eigenbasis, which involves creating a change of basis matrix using the eigenvectors and then applying it to the original transformation matrix. The author points out that not all transformations can have an eigenbasis, such as a shear, which lacks enough eigenvectors to span the space. The paragraph ends with a prompt for the audience to engage with the material and a teaser for the next video on abstract vector spaces.

Keywords

💡Eigenvectors

Eigenvectors are vectors that, when subjected to a linear transformation represented by a matrix, remain on their own span: they are stretched or compressed (and possibly flipped, if the eigenvalue is negative) but not rotated off the line they sit on. The video uses the example of the basis vector i-hat, which remains on the x-axis and is stretched by a factor of 3, illustrating this concept.

💡Eigenvalues

Eigenvalues are scalar factors by which eigenvectors are stretched or compressed during a linear transformation. They are intrinsically linked to eigenvectors, as each eigenvector has a corresponding eigenvalue. The video mentions that eigenvalues can be positive, negative, or even zero, and they are crucial in determining the effect of a transformation on an eigenvector.

💡Linear Transformation

A linear transformation is a function that maps vectors from one space to another while preserving the operations of vector addition and scalar multiplication. In the video, linear transformations are represented by matrices, and the focus is on how these transformations affect vectors, particularly eigenvectors. The script describes a transformation that moves basis vectors i-hat and j-hat to new coordinates, represented by the columns of a matrix.

💡Matrix

A matrix is a rectangular array of numbers arranged in rows and columns, which can represent a linear transformation. The video explains that matrices can be thought of as describing how basis vectors are transformed, with the columns of the matrix showing the new coordinates of the basis vectors after the transformation.

💡Determinants

The determinant of a matrix is a scalar value that can be computed from the elements of a square matrix and is used to determine whether the matrix is invertible, or equivalently whether a system of linear equations involving it has non-trivial solutions. In the video, determinants are used to find eigenvalues: setting det(A - λI) to zero identifies the values of λ for which the tweaked transformation squishes space into a lower dimension.

💡Diagonal Matrix

A diagonal matrix is a square matrix in which the elements outside the main diagonal are all zero. In the video, diagonal matrices are mentioned as a result of having an eigenbasis, where the transformation matrix, when represented in the eigenbasis, becomes diagonal with eigenvalues on the diagonal, simplifying computations.
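
A quick sketch of why that matters (numpy is assumed): raising a diagonal matrix to a power just raises each diagonal entry to that power.

```python
import numpy as np

D = np.diag([-1.0, 2.0])   # eigenvalues -1 and 2 on the diagonal

print(np.linalg.matrix_power(D, 100))
print(np.diag(np.diag(D) ** 100))   # same result, computed entrywise
```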

💡Eigenbasis

An eigenbasis is a basis of a vector space consisting entirely of eigenvectors of a given linear transformation. The video explains that if you can find enough eigenvectors to span the entire space, you can change your coordinate system to use these eigenvectors as basis vectors, resulting in a diagonal transformation matrix.

💡Change of Basis

A change of basis is the process of expressing vectors in a different but equivalent coordinate system. The video touches on this concept by explaining how to convert a transformation matrix from one basis to another using the eigenvectors as the new basis, which simplifies the matrix to a diagonal form.
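
A sketch of that sandwich (numpy is assumed), using the video's matrix and its eigenvectors (1, 0) and (-1, 1) as the columns of the change of basis matrix:

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [0.0, 2.0]])

P = np.array([[1.0, -1.0],    # change of basis matrix:
              [0.0,  1.0]])   # eigenvectors (1, 0) and (-1, 1) as columns

D = np.linalg.inv(P) @ A @ P
print(D)   # [[3. 0.], [0. 2.]] -- diagonal, eigenvalues on the diagonal
```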

💡Identity Matrix

The identity matrix is a square matrix with ones on the main diagonal and zeros elsewhere, which, when multiplied with any vector, returns the original vector unchanged. In the video, the identity matrix is used in combination with eigenvalues to create a matrix that scales vectors by a certain factor, aiding in the calculation of eigenvectors.

💡Scalar Multiplication

Scalar multiplication is the operation of multiplying a vector by a scalar (a single number), which results in a new vector that is a scaled version of the original. The video uses scalar multiplication to explain how eigenvectors are stretched or compressed by the eigenvalues during a linear transformation.

Highlights

Eigenvectors and eigenvalues can be unintuitive for students without a solid visual understanding.

Understanding eigenvectors and eigenvalues requires knowledge of linear transformations, determinants, linear systems, and change of basis.

Eigenvectors remain on their span during a linear transformation, meaning they are only stretched or squished.

Eigenvalues are the factor by which eigenvectors are stretched or squished during a transformation.

Eigenvectors and eigenvalues help identify the axis of rotation in 3D transformations.

Eigenvectors and eigenvalues can be found by solving the equation Av = λv, where A is the transformation matrix, v is the eigenvector, and λ is the eigenvalue.

The process involves rewriting the equation to (A - λI)v = 0, where I is the identity matrix.

A matrix transformation squishes space into a lower dimension when its determinant is zero.

Eigenvalues are found by setting the determinant of (A - λI) to zero and solving for λ.

Eigenvectors corresponding to a given eigenvalue are found by solving (A - λI)v = 0.
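
A symbolic sketch of this step (sympy is assumed), for the video's matrix and the eigenvalue λ = 2:

```python
import sympy as sp

A = sp.Matrix([[3, 1], [0, 2]])
null = (A - 2 * sp.eye(2)).nullspace()
print(null)   # [Matrix([[-1], [1]])] -- the line spanned by (-1, 1)
```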

A 2D transformation need not have any eigenvectors; a 90-degree rotation, for example, has none.

Eigenvalues can be complex numbers, indicating no real eigenvectors exist, as seen in the case of a 90-degree rotation matrix.

Shear transformations have eigenvectors with eigenvalue 1, as all vectors on the x-axis remain fixed.

A matrix can have a single eigenvalue whose eigenvectors nevertheless span the whole plane, as with a matrix that scales everything by 2.

Eigenbasis is a set of basis vectors that are also eigenvectors, simplifying matrix operations.

Changing to an eigenbasis simplifies computing the power of a matrix, such as the 100th power.

Not all transformations have enough eigenvectors to form an eigenbasis, such as shear transformations.

Transcripts

play00:19

Eigenvectors and eigenvalues is one of those topics

play00:22

that a lot of students find particularly unintuitive.

play00:25

Questions like, why are we doing this and what does this actually mean,

play00:29

are too often left just floating away in an unanswered sea of computations.

play00:33

And as I've put out the videos of this series,

play00:36

a lot of you have commented about looking forward to visualizing this topic in particular.

play00:40

I suspect that the reason for this is not so much that

play00:43

eigenthings are particularly complicated or poorly explained.

play00:46

In fact, it's comparatively straightforward, and

play00:49

I think most books do a fine job explaining it.

play00:51

The issue is that it only really makes sense if you have a

play00:54

solid visual understanding for many of the topics that precede it.

play00:59

Most important here is that you know how to think about matrices as

play01:02

linear transformations, but you also need to be comfortable with things

play01:06

like determinants, linear systems of equations, and change of basis.

play01:10

Confusion about eigenstuffs usually has more to do with a shaky foundation in

play01:14

one of these topics than it does with eigenvectors and eigenvalues themselves.

play01:19

To start, consider some linear transformation in two dimensions, like the one shown here.

play01:25

It moves the basis vector i-hat to the coordinates 3, 0, and j-hat to 1, 2.

play01:31

So it's represented with a matrix whose columns are 3, 0, and 1, 2.

play01:36

Focus in on what it does to one particular vector,

play01:39

and think about the span of that vector, the line passing through the origin and its tip.

play01:44

Most vectors are going to get knocked off their span during the transformation.

play01:48

I mean, it would seem pretty coincidental if the place where

play01:52

the vector landed also happened to be somewhere on that line.

play01:57

But some special vectors do remain on their own span,

play02:00

meaning the effect that the matrix has on such a vector is just to stretch it or

play02:05

squish it, like a scalar.

play02:09

For this specific example, the basis vector i-hat is one such special vector.

play02:14

The span of i-hat is the x-axis, and from the first column of the matrix,

play02:19

we can see that i-hat moves over to 3 times itself, still on that x-axis.

play02:26

What's more, because of the way linear transformations work,

play02:30

any other vector on the x-axis is also just stretched by a factor of 3,

play02:34

and hence remains on its own span.

play02:38

A slightly sneakier vector that remains on its own

play02:41

span during this transformation is negative 1, 1.

play02:44

It ends up getting stretched by a factor of 2.

play02:49

And again, linearity is going to imply that any other vector on the diagonal

play02:53

line spanned by this guy is just going to get stretched out by a factor of 2.

play02:59

And for this transformation, those are all the vectors

play03:02

with this special property of staying on their span.

play03:05

Those on the x-axis getting stretched out by a factor of 3,

play03:08

and those on this diagonal line getting stretched by a factor of 2.

play03:12

Any other vector is going to get rotated somewhat during the transformation,

play03:16

knocked off the line that it spans.

play03:22

As you might have guessed by now, these special vectors are called the eigenvectors of

play03:27

the transformation, and each eigenvector has associated with it what's called an

play03:31

eigenvalue, which is just the factor by which it's stretched or squished during the

play03:36

transformation.

play03:40

Of course, there's nothing special about stretching versus squishing,

play03:43

or the fact that these eigenvalues happen to be positive.

play03:46

In another example, you could have an eigenvector with eigenvalue negative 1 half,

play03:51

meaning that the vector gets flipped and squished by a factor of 1 half.

play03:56

But the important part here is that it stays on the

play03:59

line that it spans out without getting rotated off of it.

play04:04

For a glimpse of why this might be a useful thing to think about,

play04:07

consider some three-dimensional rotation.

play04:11

If you can find an eigenvector for that rotation,

play04:14

a vector that remains on its own span, what you have found is the axis of rotation.

play04:22

And it's much easier to think about a 3D rotation in terms of some

play04:26

axis of rotation and an angle by which it's rotating,

play04:29

rather than thinking about the full 3x3 matrix associated with that transformation.

play04:37

In this case, by the way, the corresponding eigenvalue would have to be 1,

play04:40

since rotations never stretch or squish anything,

play04:43

so the length of the vector would remain the same.

play04:48

This pattern shows up a lot in linear algebra.

play04:50

With any linear transformation described by a matrix,

play04:53

you could understand what it's doing by reading off the columns of this matrix as the

play04:57

landing spots for basis vectors.

play05:00

But often, a better way to get at the heart of what the linear

play05:03

transformation actually does, less dependent on your particular coordinate system,

play05:08

is to find the eigenvectors and eigenvalues.

play05:15

I won't cover the full details on methods for computing eigenvectors

play05:18

and eigenvalues here, but I'll try to give an overview of the

play05:22

computational ideas that are most important for a conceptual understanding.

play05:27

Symbolically, here's what the idea of an eigenvector looks like.

play05:31

A is the matrix representing some transformation, with v as the eigenvector,

play05:35

and lambda is a number, namely the corresponding eigenvalue.

play05:40

What this expression is saying is that the matrix-vector product, A times v,

play05:45

gives the same result as just scaling the eigenvector v by some value lambda.

play05:51

So finding the eigenvectors and their eigenvalues of a matrix A comes

play05:55

down to finding the values of v and lambda that make this expression true.

play06:01

It's a little awkward to work with at first, because that left-hand side represents

play06:06

matrix-vector multiplication, but the right-hand side here is scalar-vector

play06:09

multiplication.

play06:11

So let's start by rewriting that right-hand side as some kind of matrix-vector

play06:15

multiplication, using a matrix which has the effect of scaling any vector by a factor

play06:20

of lambda.

play06:21

The columns of such a matrix will represent what happens to each basis vector,

play06:26

and each basis vector is simply multiplied by lambda,

play06:29

so this matrix will have the number lambda down the diagonal, with zeros everywhere else.

play06:36

The common way to write this guy is to factor that lambda out and write it

play06:40

as lambda times I, where I is the identity matrix with 1s down the diagonal.

play06:45

With both sides looking like matrix-vector multiplication,

play06:48

we can subtract off that right-hand side and factor out the v.

play06:54

So what we now have is a new matrix, A minus lambda times the identity,

play06:58

and we're looking for a vector v such that this new matrix times v gives the zero vector.

play07:06

Now, this will always be true if v itself is the zero vector, but that's boring.

play07:11

What we want is a non-zero eigenvector.

play07:14

And if you watched chapters 5 and 6, you'll know that the only way it's possible

play07:18

for the product of a matrix with a non-zero vector to become zero is if the

play07:23

transformation associated with that matrix squishes space into a lower dimension.

play07:29

And that squishification corresponds to a zero determinant for the matrix.

play07:35

To be concrete, let's say your matrix A has columns 2, 1 and 2, 3,

play07:39

and think about subtracting off a variable amount, lambda, from each diagonal entry.

play07:46

Now imagine tweaking lambda, turning a knob to change its value.

play07:50

As that value of lambda changes, the matrix itself changes,

play07:54

and so the determinant of the matrix changes.

play07:58

The goal here is to find a value of lambda that will make this determinant zero,

play08:02

meaning the tweaked transformation squishes space into a lower dimension.

play08:08

In this case, the sweet spot comes when lambda equals 1.

play08:12

Of course, if we had chosen some other matrix, the eigenvalue might not necessarily be 1.

play08:16

The sweet spot might be hit at some other value of lambda.

play08:20

So this is kind of a lot, but let's unravel what this is saying.

play08:22

When lambda equals 1, the matrix A minus lambda

play08:26

times the identity squishes space onto a line.

play08:30

That means there's a non-zero vector v such that A minus

play08:34

lambda times the identity times v equals the zero vector.

play08:40

And remember, the reason we care about that is because it means A times v

play08:46

equals lambda times v, which you can read off as saying that the vector v

play08:51

is an eigenvector of A, staying on its own span during the transformation A.

play08:58

In this example, the corresponding eigenvalue is 1,

play09:01

so v would actually just stay fixed in place.

play09:06

Pause and ponder if you need to make sure that that line of reasoning feels good.

play09:13

This is the kind of thing I mentioned in the introduction.

play09:16

If you didn't have a solid grasp of determinants and why they

play09:19

relate to linear systems of equations having non-zero solutions,

play09:22

an expression like this would feel completely out of the blue.

play09:28

To see this in action, let's revisit the example from the start,

play09:31

with a matrix whose columns are 3, 0 and 1, 2.

play09:35

To find if a value lambda is an eigenvalue, subtract it from

play09:39

the diagonals of this matrix and compute the determinant.

play09:50

Doing this, we get a certain quadratic polynomial in lambda,

play09:54

(3 minus lambda) times (2 minus lambda).

play09:57

Since lambda can only be an eigenvalue if this determinant happens to be zero,

play10:02

you can conclude that the only possible eigenvalues are lambda equals 2 and lambda

play10:08

equals 3.

play10:09

To figure out what the eigenvectors are that actually have one of these eigenvalues,

play10:14

say lambda equals 2, plug in that value of lambda to the matrix and then

play10:19

solve for which vectors this diagonally altered matrix sends to zero.

play10:24

If you computed this the way you would any other linear system,

play10:28

you'd see that the solutions are all the vectors on the diagonal line spanned

play10:33

by negative 1, 1.

play10:35

This corresponds to the fact that the unaltered matrix, 3, 0, 1,

play10:39

2, has the effect of stretching all those vectors by a factor of 2.

play10:46

Now, a 2D transformation doesn't have to have eigenvectors.

play10:50

For example, consider a rotation by 90 degrees.

play10:53

This doesn't have any eigenvectors since it rotates every vector off of its own span.

play11:00

If you actually try computing the eigenvalues of a rotation like this,

play11:04

notice what happens.

play11:06

Its matrix has columns 0, 1 and negative 1, 0.

play11:11

Subtract off lambda from the diagonal elements and look for when the determinant is zero.

play11:18

In this case, you get the polynomial lambda squared plus 1.

play11:22

The only roots of that polynomial are the imaginary numbers, i and negative i.

play11:28

The fact that there are no real number solutions indicates that there are no eigenvectors.
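
A quick numeric check of this (numpy is assumed; this aside is not part of the original transcript):

```python
import numpy as np

R = np.array([[0.0, -1.0],
              [1.0,  0.0]])   # 90-degree rotation: columns (0, 1) and (-1, 0)

w, v = np.linalg.eig(R)
print(w)   # [0.+1.j 0.-1.j] -- purely imaginary, so no real eigenvectors
```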

play11:35

Another pretty interesting example worth holding in the back of your mind is a shear.

play11:40

This fixes i-hat in place and moves j-hat 1 over, so its matrix has columns 1, 0 and 1, 1.

play11:48

All of the vectors on the x-axis are eigenvectors

play11:51

with eigenvalue 1 since they remain fixed in place.

play11:55

In fact, these are the only eigenvectors.

play11:58

When you subtract off lambda from the diagonals and compute the determinant,

play12:03

what you get is (1 minus lambda) squared.

play12:09

And the only root of this expression is lambda equals 1.

play12:14

This lines up with what we see geometrically,

play12:17

that all of the eigenvectors have eigenvalue 1.
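
The same numeric check for the shear (numpy is assumed; this aside is not part of the original transcript):

```python
import numpy as np

S = np.array([[1.0, 1.0],
              [0.0, 1.0]])   # shear: columns (1, 0) and (1, 1)

w, v = np.linalg.eig(S)
print(w)         # [1. 1.] -- the only eigenvalue is 1
print(v[:, 0])   # [1. 0.] -- the eigenvectors lie along the x-axis
```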

play12:21

Keep in mind though, it's also possible to have just one eigenvalue,

play12:25

but with more than just a line full of eigenvectors.

play12:29

A simple example is a matrix that scales everything by 2.

play12:33

The only eigenvalue is 2, but every vector in the

play12:37

plane gets to be an eigenvector with that eigenvalue.

play12:42

Now is another good time to pause and ponder some

play12:44

of this before I move on to the last topic.

play13:03

I want to finish off here with the idea of an eigenbasis,

play13:06

which relies heavily on ideas from the last video.

play13:11

Take a look at what happens if our basis vectors just so happen to be eigenvectors.

play13:17

For example, maybe i-hat is scaled by negative 1 and j-hat is scaled by 2.

play13:23

Writing their new coordinates as the columns of a matrix,

play13:27

notice that those scalar multiples, negative 1 and 2,

play13:30

which are the eigenvalues of i-hat and j-hat, sit on the diagonal of our matrix,

play13:35

and every other entry is a 0.

play13:38

Any time a matrix has zeros everywhere other than the diagonal,

play13:42

it's called, reasonably enough, a diagonal matrix.

play13:45

And the way to interpret this is that all the basis vectors are eigenvectors,

play13:50

with the diagonal entries of this matrix being their eigenvalues.

play13:57

There are a lot of things that make diagonal matrices much nicer to work with.

play14:01

One big one is that it's easier to compute what will happen

play14:05

if you multiply this matrix by itself a whole bunch of times.

play14:09

Since all one of these matrices does is scale each basis vector by some eigenvalue,

play14:14

applying that matrix many times, say 100 times,

play14:17

is just going to correspond to scaling each basis vector by the 100th power of

play14:22

the corresponding eigenvalue.

play14:25

In contrast, try computing the 100th power of a non-diagonal matrix.

play14:29

Really, try it for a moment.

play14:31

It's a nightmare.

play14:36

Of course, you'll rarely be so lucky as to have your basis vectors also be eigenvectors.

play14:42

But if your transformation has a lot of eigenvectors,

play14:45

like the one from the start of this video, enough so that you can choose a set that

play14:49

spans the full space, then you could change your coordinate system so that these

play14:54

eigenvectors are your basis vectors.

play14:57

I talked about change of basis last video, but I'll go through

play15:00

a super quick reminder here of how to express a transformation

play15:03

currently written in our coordinate system into a different system.

play15:08

Take the coordinates of the vectors that you want to use as a new basis,

play15:12

which in this case means our two eigenvectors,

play15:14

then make those coordinates the columns of a matrix, known as the change of basis matrix.

play15:20

When you sandwich the original transformation,

play15:22

putting the change of basis matrix on its right and the inverse of the

play15:26

change of basis matrix on its left, the result will be a matrix representing

play15:31

that same transformation, but from the perspective of the new basis

play15:35

vectors' coordinate system.

play15:37

The whole point of doing this with eigenvectors is that this new matrix is

play15:41

guaranteed to be diagonal with its corresponding eigenvalues down that diagonal.

play15:46

This is because it represents working in a coordinate system where what

play15:50

happens to the basis vectors is that they get scaled during the transformation.

play15:55

A set of basis vectors which are also eigenvectors is called,

play15:59

again, reasonably enough, an eigenbasis.

play16:02

So if, for example, you needed to compute the 100th power of this matrix,

play16:07

it would be much easier to change to an eigenbasis,

play16:10

compute the 100th power in that system, then convert back to our standard system.

play16:16

You can't do this with all transformations.

play16:18

A shear, for example, doesn't have enough eigenvectors to span the full space.

play16:23

But if you can find an eigenbasis, it makes matrix operations really lovely.

play16:29

For those of you willing to work through a pretty neat puzzle to

play16:31

see what this looks like in action and how it can be used to produce

play16:34

some surprising results, I'll leave up a prompt here on the screen.

play16:37

It takes a bit of work, but I think you'll enjoy it.

play16:40

The next and final video of this series is going to be on abstract vector spaces.

play16:45

See you then!
