Abstract vector spaces | Chapter 16, Essence of linear algebra

3Blue1Brown
24 Sept 2016 · 16:46

Summary

TL;DR: This video script delves into the fundamental nature of vectors, exploring whether they are best understood as arrows in a plane or as lists of numbers. It challenges the viewer to consider vectors as spatial entities or manifestations of a deeper concept. The script introduces an analogy between vectors and functions, demonstrating how functions can be treated as vectors by applying principles of linear algebra. Through the lens of linear transformations and the properties of additivity and scaling, the video builds a bridge between abstract mathematical concepts and concrete applications. The script concludes by emphasizing the abstract nature of vectors in modern linear algebra, defined by a set of axioms that allow for a wide range of 'vectorish' objects, including arrows, numbers, functions, and more.

Takeaways

  • 🔍 Vectors can be viewed in multiple ways, such as arrows in a plane or pairs of real numbers, and may have deeper spatial properties.
  • 📏 Defining vectors as lists of numbers is clear-cut, but they may represent an independent space that coordinates are arbitrarily assigned to.
  • 📊 Core linear algebra topics like determinants and eigenvectors are spatial and invariant under changes of coordinate systems.
  • 🎯 The essence of vectors might be more about the operations of addition and scaling rather than their representation as numbers or arrows.
  • 📚 Functions can also be considered vectors because they support addition and scaling, similar to vectors in linear algebra.
  • ✏️ Linear transformations, like the derivative in calculus, can be applied to functions, emphasizing the vector-like qualities of functions.
  • 📐 A linear transformation must satisfy additivity and scaling to be considered linear, preserving the operations of vector addition and scalar multiplication.
  • 📈 The concept of a matrix can be extended to function spaces, such as polynomials, to describe operations like derivatives.
  • 📘 Vector spaces are sets of objects that adhere to the rules of vector addition and scalar multiplication, forming the abstract foundation of linear algebra.
  • 📝 Axioms in linear algebra define the properties that vector addition and scalar multiplication must follow, allowing for general application across different types of vector spaces.
  • 🎓 The modern theory of linear algebra focuses on abstraction, allowing mathematicians to apply their results to any vector space that satisfies the defined axioms.

Q & A

  • What is the deceptively simple question the video aims to revisit?

    -The video revisits the question of what vectors are, exploring whether they are fundamentally arrows on a flat plane, pairs of real numbers, or manifestations of something deeper.

  • Why might defining vectors as primarily a list of numbers feel clear-cut and unambiguous?

    -Defining vectors as a list of numbers provides a clear and unambiguous way to conceptualize higher-dimensional vectors, making abstract ideas like four-dimensional vectors more concrete and workable.

  • What is the common sensation among those who work with linear algebra as they become more fluent with changing their basis?

    -The common sensation is that they are dealing with a space that exists independently from the coordinates, and that coordinates are somewhat arbitrary, depending on the chosen basis vectors.

  • Why do determinants and eigenvectors seem indifferent to the choice of coordinate systems?

    -Determinants and eigenvectors are core topics in linear algebra that are inherently spatial, and their underlying values do not change with different coordinate systems.

  • How does the concept of functions relate to the concept of vectors?

    -Functions can be seen as another type of vector because they can be added together and scaled by real numbers, similar to how vectors can be combined through addition and scalar multiplication.
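    To make the analogy concrete, here is a minimal Python sketch (an illustration, not code from the video) of pointwise addition and scaling; the helper names add and scale are hypothetical:

        # Functions support the same two operations as vectors:
        # pointwise addition and scaling by a real number.

        def add(f, g):
            # (f + g)(x) = f(x) + g(x)
            return lambda x: f(x) + g(x)

        def scale(c, f):
            # (c * f)(x) = c * f(x)
            return lambda x: c * f(x)

        f = lambda x: x ** 2         # f(x) = x^2
        g = lambda x: 3 * x + 5      # g(x) = 3x + 5

        h = add(f, g)                # h(x) = x^2 + 3x + 5
        print(h(-4))                 # 16 - 12 + 5 = 9
        print(scale(2, f)(3))        # 2 * 9 = 18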

  • What is the formal definition of a linear transformation for functions?

    -A linear transformation for functions is defined by two properties: additivity and scaling. Additivity means the transformation of the sum of two functions is the same as the sum of their individual transformations. Scaling means scaling a function and then transforming it yields the same result as transforming the function first and then scaling the result.
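    In symbols, for a transformation L, any vectors (or functions) v and w, and any scalar c, the two properties read:

    $$L(v + w) = L(v) + L(w) \quad \text{(additivity)}, \qquad L(c\,v) = c\,L(v) \quad \text{(scaling)}.$$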

  • Why is the concept of a basis important when dealing with function spaces?

    -A basis is important because it provides a coordinate system for the function space, allowing for the representation of functions as vectors with coordinates, which is essential for applying linear algebra concepts to functions.

  • How is the derivative of a function related to the concept of linear transformations?

    -The derivative is an example of a linear transformation for functions. It takes one function and turns it into another while preserving the properties of additivity and scaling.

  • What is the significance of the matrix representation of the derivative in the context of polynomials?

    -The matrix representation of the derivative for polynomials allows for the application of linear algebra techniques to function spaces, specifically polynomial functions, making it possible to visualize and calculate derivatives using matrix-vector multiplication.
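    As a rough sketch of how this plays out in code (a finite truncation of the infinite matrix described in the video, assuming numpy is available):

        import numpy as np

        def derivative_matrix(n):
            # Matrix of d/dx on polynomials of degree < n in the basis 1, x, x^2, ...
            # Column k holds the coordinates of d/dx(x^k) = k * x^(k-1),
            # so the only nonzero entries sit on an offset diagonal: D[k-1, k] = k.
            D = np.zeros((n, n))
            for k in range(1, n):
                D[k - 1, k] = k
            return D

        # x^2 + 3x + 5 has coordinates (5, 3, 1, 0) in the basis 1, x, x^2, x^3.
        coords = np.array([5, 3, 1, 0])
        print(derivative_matrix(4) @ coords)   # [3. 2. 0. 0.], i.e. 2x + 3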

  • What are vector spaces and why are they fundamental in the theory of linear algebra?

    -Vector spaces are sets of objects that follow certain rules for vector addition and scalar multiplication, known as axioms. They are fundamental because they provide a general framework for applying linear algebra concepts to various 'vectorish' things, regardless of their specific form or nature.

  • How does the modern theory of linear algebra approach the concept of vectors?

    -The modern theory of linear algebra approaches vectors abstractly, focusing on the axioms that define vector spaces rather than on the specific form vectors take, such as arrows, lists of numbers, or functions.

  • Why is it beneficial to start learning linear algebra with a concrete, visualizable setting?

    -Starting with a concrete, visualizable setting, like 2D space with arrows, helps build intuitions about linear algebra concepts. These intuitions can then be applied more efficiently to more abstract or complex scenarios as one progresses in their understanding of the subject.

Outlines

00:00

🔍 Exploring the Essence of Vectors

This paragraph delves into the fundamental nature of vectors, questioning whether they are best understood as arrows in a plane with coordinates or as pairs of real numbers. It explores the idea that vectors might be manifestations of a deeper spatial concept, independent of the coordinate system. The paragraph also introduces the notion that functions can be treated similarly to vectors, with operations like addition and scaling, and suggests that linear algebra's principles can be applied to functions, like the derivative, which is a linear transformation.

05:05

📚 Linearity in Functions and Vectors

The second paragraph discusses the concept of linearity, specifically in the context of transformations applied to vectors and functions. It explains that a transformation is linear if it maintains the properties of additivity and scaling. The paragraph uses the derivative as an example of a linear transformation for functions and illustrates how this can be represented using an infinite matrix when dealing with polynomials. It emphasizes the parallel between the operations on vectors and functions, highlighting the universality of linear algebra concepts.

10:07

🌐 The Abstract Concept of Vector Spaces

This paragraph introduces the abstract concept of vector spaces, which are sets of objects that can be scaled and added together while adhering to certain axioms. It explains that mathematicians use these axioms to ensure that their discoveries in linear algebra are applicable to any vector space that follows the rules, regardless of the specific nature of the vectors involved. The paragraph also discusses the importance of abstract reasoning in mathematics, allowing for broad application of concepts without being confined to a specific representation of vectors.

15:09

🎓 Embracing Abstraction in Linear Algebra

The final paragraph concludes the series on the essence of linear algebra by emphasizing the importance of understanding the abstract nature of vectors and vector spaces. It suggests that while concrete, visualizable examples are helpful for beginners, the true power of linear algebra comes from its broad applicability to any vector space that follows the established axioms. The paragraph encourages learners to apply their intuitions about vectors to more complex problems and wishes them well in their future studies.

Keywords

💡Vectors

Vectors are mathematical objects that can be described as arrows with direction and magnitude, or as pairs of real numbers. In the context of the video, vectors are explored as having a deeper spatial essence rather than just being lists of numbers. The script discusses vectors in relation to linear algebra, emphasizing their role in transformations and coordinate systems. For example, the video mentions two-dimensional vectors as arrows on a flat plane and four-dimensional vectors as abstract concepts.

💡Linear Algebra

Linear algebra is a branch of mathematics that deals with the study of vectors, vector spaces (also called linear spaces), and linear transformations. The video script delves into the core topics of linear algebra, such as determinants and eigenvectors, and how they are indifferent to the choice of coordinate systems. Linear algebra provides a framework for understanding and working with vectors beyond their geometric representation.

💡Coordinates

Coordinates are numerical values that define the position of a point in a space. In the video, coordinates are discussed as a means of describing vectors for convenience, but also as somewhat arbitrary depending on the chosen basis vectors. The script suggests that the space a vector represents exists independently of the coordinates assigned to it.

💡Basis Vectors

Basis vectors are a set of vectors that define the direction and scale of a vector space. They are fundamental in establishing a coordinate system. The video script explains that changing the basis vectors can alter the coordinates of a vector, but the underlying spatial essence of the vector remains the same.

💡Determinants

In linear algebra, a determinant is a scalar value that can be computed from the elements of a square matrix, which provides information about the matrix's properties, such as its invertibility and how it scales areas. The video script mentions determinants as a core topic in linear algebra that is indifferent to the choice of coordinate systems.

💡Eigenvectors

Eigenvectors are vectors that, when a linear transformation is applied to them, remain on their own span, only being scaled. They are associated with eigenvalues, which are the scaling factors. The video script discusses eigenvectors as spatial properties that remain unchanged regardless of the coordinate system.

💡Functions

Functions are mathematical mappings from one set to another, often represented as f(x) = y. The video script introduces functions as another type of vector, where they can be added together and scaled by real numbers, similar to vectors. This concept is used to illustrate that linear algebra concepts can be applied to functions as well.

💡Linear Transformation

A linear transformation, also referred to as a linear operator in the context of functions, is a function that maps one vector space to another while preserving the operations of vector addition and scalar multiplication. The video script explains that the derivative is an example of a linear transformation for functions.

💡Additivity

Additivity is a property of a function, such as a linear transformation, where the function of the sum of two vectors is equal to the sum of the function of each vector. In the video, additivity is discussed as one of the properties that define a linear transformation, with the derivative being an example that possesses this property.

💡Scaling

Scaling refers to the operation of multiplying a vector by a scalar, which is a real number. The video script discusses the scaling property as a characteristic of linear transformations, where scaling a vector and then applying a transformation yields the same result as applying the transformation first and then scaling.

💡Vector Spaces

Vector spaces, also known as linear spaces, are mathematical structures that consist of a set of vectors and two operations, vector addition and scalar multiplication, that follow certain rules (axioms). The video script emphasizes that the concept of a vector is generalized in the modern theory of linear algebra to any set of objects that adhere to these axioms, making the form that vectors take irrelevant.

💡Axioms

Axioms are the basic principles or rules from which other truths are derived. In the context of vector spaces, the video script mentions that there are eight axioms that any vector space must satisfy for the constructs of linear algebra to apply. These axioms serve as an interface between mathematicians and others who want to apply linear algebra to new types of vector spaces.
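For reference, the eight axioms, as they are standardly stated (for all vectors u, v, w in the space and all scalars a, b), are:

  1. u + (v + w) = (u + v) + w (addition is associative)
  2. u + v = v + u (addition is commutative)
  3. There is a zero vector 0 with v + 0 = v (additive identity)
  4. Every v has an additive inverse -v with v + (-v) = 0
  5. a(bv) = (ab)v (scalar multiplication is compatible with scalar products)
  6. 1v = v (the scalar 1 acts as identity)
  7. a(u + v) = au + av (distributivity over vector addition)
  8. (a + b)v = av + bv (distributivity over scalar addition)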

Highlights

Vectors can be viewed as arrows in a flat plane or pairs of real numbers, or manifestations of something deeper.

Defining vectors as lists of numbers makes higher-dimensional vectors more concrete.

Working with linear algebra suggests that vectors inhabit a space existing independently of coordinates, with coordinates being somewhat arbitrary.

Core linear algebra topics like determinants and eigenvectors are indifferent to the choice of coordinate systems.

Functions can be considered another type of vector, with operations similar to vector addition and scaling.

Linear transformations can be applied to functions, such as the derivative in calculus.

Linearity in transformations is defined by additivity and scaling properties.

A linear transformation is fully described by its effect on basis vectors.

The derivative is an example of a linear transformation that is additive and has scaling properties.

Matrix representation of the derivative can be constructed for polynomial functions.

Polynomial functions can be represented in a vector space with an infinite basis of powers of x.

An infinite matrix can describe the derivative operation in the context of polynomials.

The concept of vector spaces allows for the abstraction of different types of vectors, such as arrows, lists, or functions.

Vector spaces are sets of objects that adhere to rules of vector addition and scalar multiplication.

Axioms in linear algebra define the properties that any vector space must satisfy.

Mathematical abstraction allows for general application of linear algebra concepts across different types of vector spaces.

The modern theory of vectors focuses on the properties and operations rather than their concrete form.

Linear algebra concepts can be applied to various vectorish things following the established axioms.

The series concludes by emphasizing the importance of intuitions in understanding linear algebra.

Transcripts

[00:16] I'd like to revisit a deceptively simple question that I asked in the very first video of this series. What are vectors? Is a two-dimensional vector, for example, fundamentally an arrow on a flat plane that we can describe with coordinates for convenience? Or is it fundamentally that pair of real numbers which is just nicely visualized as an arrow on a flat plane? Or are both of these just manifestations of something deeper?

[00:42] On the one hand, defining vectors as primarily being a list of numbers feels clear-cut and unambiguous. It makes things like four-dimensional vectors or 100-dimensional vectors sound like real, concrete ideas that you can actually work with. When otherwise, an idea like four dimensions is just a vague geometric notion that's difficult to describe without waving your hands a bit.

[01:05] But on the other hand, a common sensation for those who actually work with linear algebra, especially as you get more fluent with changing your basis, is that you're dealing with a space that exists independently from the coordinates that you give it, and that coordinates are actually somewhat arbitrary, depending on what you happen to choose as your basis vectors.

[01:24] Core topics in linear algebra, like determinants and eigenvectors, seem indifferent to your choice of coordinate systems. The determinant tells you how much a transformation scales areas, and eigenvectors are the ones that stay on their own span during a transformation. But both of these properties are inherently spatial, and you can freely change your coordinate system without changing the underlying values of either one.

[01:50] But if vectors are not fundamentally lists of real numbers, and if their underlying essence is something more spatial, that just begs the question of what mathematicians mean when they use a word like space or spatial.

[02:03] To build up to where this is going, I'd actually like to spend the bulk of this video talking about something which is neither an arrow nor a list of numbers, but also has vector-ish qualities – functions. You see, there's a sense in which functions are actually just another type of vector.

[02:19] In the same way that you can add two vectors together, there's also a sensible notion for adding two functions, f and g, to get a new function, f plus g. It's one of those things where you kind of already know what it's going to be, but actually phrasing it is a mouthful. The output of this new function at any given input, like negative four, is the sum of the outputs of f and g when you evaluate them each at that same input, negative four. Or more generally, the value of the sum function at any given input x is the sum of the values f of x plus g of x.

[03:00] This is pretty similar to adding vectors coordinate by coordinate, it's just that there are, in a sense, infinitely many coordinates to deal with. Similarly, there's a sensible notion for scaling a function by a real number, just scale all of the outputs by that number. And again, this is analogous to scaling a vector coordinate by coordinate, it just feels like there's infinitely many coordinates.
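Written out, the two pointwise definitions are:

$$(f + g)(x) = f(x) + g(x), \qquad (c \cdot f)(x) = c \cdot f(x).$$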

[03:28] Now, given that the only thing vectors can really do is get added together or scaled, it feels like we should be able to take the same useful constructs and problem solving techniques of linear algebra that were originally thought about in the context of arrows and space and apply them to functions as well. For example, there's a perfectly reasonable notion of a linear transformation for functions, something that takes in one function and turns it into another.

[03:59] One familiar example comes from calculus, the derivative. It's something which transforms one function into another function. Sometimes in this context you'll hear these called operators instead of transformations, but the meaning is the same.

[04:16] A natural question you might want to ask is what it means for a transformation of functions to be linear. The formal definition of linearity is relatively abstract and symbolically driven compared to the way that I first talked about it in chapter 3 of this series. But the reward of abstractness is that we'll get something general enough to apply to functions as well as arrows.

[04:39] A transformation is linear if it satisfies two properties, commonly called additivity and scaling. Additivity means that if you add two vectors, v and w, then apply a transformation to their sum, you get the same result as if you added the transformed versions of v and w. The scaling property is that when you scale a vector v by some number, then apply the transformation, you get the same ultimate vector as if you scaled the transformed version of v by that same amount.

[05:21] The way you'll often hear this described is that linear transformations preserve the operations of vector addition and scalar multiplication. The idea of gridlines remaining parallel and evenly spaced that I've talked about in past videos is really just an illustration of what these two properties mean in the specific case of points in 2D space.

[05:44] One of the most important consequences of these properties, which makes matrix vector multiplication possible, is that a linear transformation is completely described by where it takes the basis vectors. Since any vector can be expressed by scaling and adding the basis vectors in some way, finding the transformed version of a vector comes down to scaling and adding the transformed versions of the basis vectors in that same way. As you'll see in just a moment, this is as true for functions as it is for arrows.

[06:18] For example, calculus students are always using the fact that the derivative is additive and has the scaling property, even if they haven't heard it phrased that way. If you add two functions, then take the derivative, it's the same as first taking the derivative of each one separately, then adding the result. Similarly, if you scale a function, then take the derivative, it's the same as first taking the derivative, then scaling the result.
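In the usual calculus notation, these are just the sum rule and the constant-multiple rule:

$$\frac{d}{dx}\big(f(x) + g(x)\big) = \frac{df}{dx} + \frac{dg}{dx}, \qquad \frac{d}{dx}\big(c \cdot f(x)\big) = c \cdot \frac{df}{dx}.$$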

[06:50] To really drill in the parallel, let's see what it might look like to describe the derivative with a matrix. This will be a little tricky, since function spaces have a tendency to be infinite dimensional, but I think this exercise is actually quite satisfying.

[07:04] Let's limit ourselves to polynomials, things like x squared plus 3x plus 5, or 4x to the seventh minus 5x squared. Each of the polynomials in our space will only have finitely many terms, but the full space is going to include polynomials with arbitrarily large degree.

[07:22] The first thing we need to do is give coordinates to this space, which requires choosing a basis. Since polynomials are already written down as the sum of scaled powers of the variable x, it's pretty natural to just choose pure powers of x as the basis function. In other words, our first basis function will be the constant function, b0 of x equals 1. The second basis function will be b1 of x equals x, then b2 of x equals x squared, then b3 of x equals x cubed, and so on. The role that these basis functions serve will be similar to the roles of i-hat, j-hat, and k-hat in the world of vectors as arrows.

[08:02] Since our polynomials can have arbitrarily large degree, this set of basis functions is infinite. But that's okay, it just means that when we treat our polynomials as vectors, they're going to have infinitely many coordinates.

[08:15] A polynomial like x squared plus 3x plus 5, for example, would be described with the coordinates 5, 3, 1, then infinitely many zeros. You'd read this as saying that it's 5 times the first basis function, plus 3 times that second basis function, plus 1 times the third basis function, and then none of the other basis functions should be added from that point on. The polynomial 4x to the seventh minus 5x squared would have the coordinates 0, 0, negative 5, 0, 0, 0, 0, 4, then an infinite string of zeros. In general, since every individual polynomial has only finitely many terms, its coordinates will be some finite string of numbers with an infinite tail of zeros.
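Compactly, the two examples correspond to:

$$x^2 + 3x + 5 \;\longleftrightarrow\; (5,\ 3,\ 1,\ 0,\ 0,\ \dots), \qquad 4x^7 - 5x^2 \;\longleftrightarrow\; (0,\ 0,\ -5,\ 0,\ 0,\ 0,\ 0,\ 4,\ 0,\ \dots).$$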

[09:06] In this coordinate system, the derivative is described with an infinite matrix that's mostly full of zeros, but which has the positive integers counting down on this offset diagonal. I'll talk about how you could find this matrix in just a moment, but the best way to get a feel for it is to just watch it in action. Take the coordinates representing the polynomial x cubed plus 5x squared plus 4x plus 5, then put those coordinates on the right of the matrix.

[09:40] The only term that contributes to the first coordinate of the result is 1 times 4, which means the constant term in the result will be 4. This corresponds to the fact that the derivative of 4x is the constant 4. The only term contributing to the second coordinate of the matrix vector product is 2 times 5, which means the coefficient in front of x in the derivative is 10. That one corresponds to the derivative of 5x squared. Similarly, the third coordinate in the matrix vector product comes down to taking 3 times 1. This one corresponds to the derivative of x cubed being 3x squared. And after that, it'll be nothing but zeros.

[10:26] What makes this possible is that the derivative is linear. And for those of you who like to pause and ponder, you could construct this matrix by taking the derivative of each basis function and putting the coordinates of the results in each column.
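A short Python sketch (a finite truncation, assuming numpy; an illustration rather than the video's own code) that follows this column-by-column recipe and reproduces the worked example:

    import numpy as np

    # Each column holds the derivative of a basis function,
    # written in the basis 1, x, x^2, x^3:
    # column 0: d/dx(1) = 0, column 1: d/dx(x) = 1,
    # column 2: d/dx(x^2) = 2x, column 3: d/dx(x^3) = 3x^2.
    D = np.array([
        [0, 1, 0, 0],
        [0, 0, 2, 0],
        [0, 0, 0, 3],
        [0, 0, 0, 0],
    ])

    # x^3 + 5x^2 + 4x + 5 has coordinates (5, 4, 5, 1).
    coords = np.array([5, 4, 5, 1])
    print(D @ coords)   # [ 4 10  3  0], i.e. 3x^2 + 10x + 4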

[10:59] So, surprisingly, matrix vector multiplication and taking a derivative, which at first seem like completely different animals, are both just really members of the same family. In fact, most of the concepts I've talked about in this series with respect to vectors as arrows in space, things like the dot product or eigenvectors, have direct analogs in the world of functions, though sometimes they go by different names, things like inner product or eigenfunction.

[11:28] So back to the question of what is a vector. The point I want to make here is that there are lots of vectorish things in math. As long as you're dealing with a set of objects where there's a reasonable notion of scaling and adding, whether that's a set of arrows in space, lists of numbers, functions, or whatever other crazy thing you choose to define, all of the tools developed in linear algebra regarding vectors, linear transformations and all that stuff, should be able to apply.

[11:57] Take a moment to imagine yourself right now as a mathematician developing the theory of linear algebra. You want all of the definitions and discoveries of your work to apply to all of the vectorish things in full generality, not just to one specific case.

[12:13] These sets of vectorish things, like arrows or lists of numbers or functions, are called vector spaces. And what you as the mathematician might want to do is say, hey everyone, I don't want to have to think about all the different types of crazy vector spaces that you all might come up with. So what you do is establish a list of rules that vector addition and scaling have to abide by.

[12:36] These rules are called axioms, and in the modern theory of linear algebra, there are eight axioms that any vector space must satisfy if all of the theory and constructs that we've discovered are going to apply. I'll leave them on the screen here for anyone who wants to pause and ponder, but basically it's just a checklist to make sure that the notions of vector addition and scalar multiplication do the things that you'd expect them to do.

[12:58] These axioms are not so much fundamental rules of nature as they are an interface between you, the mathematician, discovering results, and other people who might want to apply those results to new sorts of vector spaces. If, for example, someone defines some crazy type of vector space, like the set of all pi creatures with some definition of adding and scaling pi creatures, these axioms are like a checklist of things that they need to verify about their definitions before they can start applying the results of linear algebra. And you, as the mathematician, never have to think about all the possible crazy vector spaces people might define. You just have to prove your results in terms of these axioms so anyone whose definitions satisfy those axioms can happily apply your results, even if you never thought about their situation.

[13:46] As a consequence, you'd tend to phrase all of your results pretty abstractly, which is to say, only in terms of these axioms, rather than centering on a specific type of vector, like arrows in space or functions.

[14:01] For example, this is why just about every textbook you'll find will define linear transformations in terms of additivity and scaling, rather than talking about gridlines remaining parallel and evenly spaced. Even though the latter is more intuitive, and at least in my view, more helpful for first-time learners, even if it is specific to one situation.

[14:22] So the mathematician's answer to what are vectors is to just ignore the question. In the modern theory, the form that vectors take doesn't really matter. Arrows, lists of numbers, functions, pi creatures, really, it can be anything, so long as there's some notion of adding and scaling vectors that follows these rules.

[14:41] It's like asking what the number 3 really is. Whenever it comes up concretely, it's in the context of some triplet of things, but in math, it's treated as an abstraction for all possible triplets of things, and lets you reason about all possible triplets using a single idea. Same goes with vectors, which have many embodiments, but math abstracts them all into a single, intangible notion of a vector space.

[15:08] But, as anyone watching this series knows, I think it's better to begin reasoning about vectors in a concrete, visualizable setting, like 2D space, with arrows rooted at the origin. But as you learn more linear algebra, know that these tools apply much more generally, and that this is the underlying reason why textbooks and lectures tend to be phrased, well, abstractly.

[15:31] So with that, folks, I think I'll call it an end to this essence of linear algebra series. If you've watched and understood the videos, I really do believe that you have a solid foundation in the underlying intuitions of linear algebra. This is not the same thing as learning the full topic, of course, that's something that can only really come from working through problems, but the learning you do moving forward could be substantially more efficient if you have all the right intuitions in place. So, have fun applying those intuitions, and best of luck with your future learning.


Related Tags
Vectors, Linear Algebra, Mathematics, Abstract Concepts, Coordinate Systems, Space Theory, Eigenvectors, Determinants, Functions, Derivatives, Polynomials