Dear linear algebra students, This is what matrices (and matrix manipulation) really look like

Zach Star
5 Mar 2020 · 16:26

Summary

TLDR: This video, sponsored by Brilliant, explores the fascinating world of matrices and their applications beyond traditional math curriculums. The presenter delves into matrix manipulation using 3D software, illustrating how matrices can represent systems of equations and be visualized as intersecting planes or vectors. The video explains concepts like null space, row space, and column space in a 3D context, and connects these mathematical ideas to real-world applications such as circuit analysis. It also touches on graph theory, demonstrating how matrices can represent and simplify complex networks. The video encourages viewers to think visually about linear algebra and to explore Brilliant's courses for a deeper understanding of these concepts and their applications.

Takeaways

  • πŸ˜€ The video is sponsored by Brilliant, an educational platform that offers courses on various subjects including linear algebra.
  • πŸ“Š Matrices can be visualized as sets of vectors, which is useful for understanding systems of equations and their solutions.
  • πŸ” Two main ways to approach matrix problems are by finding the intersection of planes or by determining scale factors for vectors to sum to a specific vector.
  • 🎯 The null space of a matrix represents all the solutions where the equations equal zero, which can be visualized as a line or a plane depending on the matrix.
  • πŸ”„ Matrix manipulation, such as Gaussian elimination, can simplify solving systems of equations and reveal the underlying structure of the problem.
  • πŸ“ The row space of a matrix is the set of all vectors perpendicular to the null space, and it's always two dimensions less than the matrix's dimension.
  • πŸ”— The concept of linear dependence and independence of vectors is crucial for understanding the solutions that a matrix can represent.
  • πŸ” The incidence matrix is used to represent networks or circuits, and its null space can reveal important properties about the network, like the absence of loops.
  • 🌐 The row and column spaces of a matrix are always the same dimension, which is a fundamental property of linear algebra.
  • πŸ”§ Applications of matrices extend beyond traditional mathematics to include graph theory, network analysis, and even the Google page rank algorithm.

Q & A

  • What is the primary focus of the video sponsored by Brilliant?

    -The video focuses on exploring matrix manipulation and arithmetic in a visually engaging way, using 3D software and applications not typically taught in schools.

  • How does the video presenter suggest visualizing a matrix?

    -The presenter suggests visualizing a matrix as a set of vectors, either as three column vectors or three row vectors, and demonstrates this with 3x3 matrices.

  • What is the significance of representing a matrix as column vectors in the context of systems of equations?

    -Representing a matrix as column vectors allows for the system of equations to be visualized as a sum or linear combination of these vectors, where the variables x, y, and z are scale factors.

  • How does the video explain the solution to a system of equations using matrix visualization?

    -The video explains that the solution to a system of equations can be visualized as the intersection of planes or as scale factors that add vectors tip-to-tail to reach a specific vector b.

  • What is the null space in the context of the video?

    -The null space is the set of all solutions for which every equation equals zero; the video visualizes it as the intersection of the planes when the right-hand side is the zero vector.

  • How does the video demonstrate the use of Gaussian elimination in the context of matrices?

    -The video shows Gaussian elimination by manipulating the constants in the equations to cancel out variables, which results in a simpler form that is easier to analyze.

  • What is the relationship between the null space and the row space as explained in the video?

    -The video explains that the null space and the row space are always perpendicular to each other, and that they have dimensions that add up to the dimension of the matrix.

  • How does the video connect matrix concepts to graph theory and networks?

    -The video connects matrix concepts to graph theory by using an incidence matrix to represent a directed graph, and then analyzing the null space and row space in terms of the graph's properties.

  • What does the video suggest about the physical meaning of the null space in a circuit represented by an incidence matrix?

    -The video suggests that the null space of the incidence matrix corresponds to the voltage assignments that produce no potential differences and hence no current flow (all node voltages equal); Kirchhoff's voltage law instead emerges from analyzing the column space.

  • What additional topics does the video mention that are covered by Brilliant's courses?

    -The video mentions that Brilliant's courses cover topics such as adjacency matrices, the use of matrices in graph theory, and unique applications like the Google page rank algorithm, as well as differential equations and their applications.

Outlines

00:00

πŸ“Š Matrix Manipulation and Visualization

The paragraph introduces the concept of matrices and their manipulations, emphasizing how they can be visualized and understood through 3D software. It discusses the initial struggle with matrices in a school setting and aims to present them in an engaging way. The paragraph explains how matrices can be seen as sets of vectors, specifically focusing on 3x3 matrices, which can be interpreted as either three column vectors or three row vectors. It further elaborates on how these matrices can represent systems of linear equations and how they can be visualized as intersections of planes or as linear combinations of column vectors. The concept of null space is introduced as the set of solutions where all equations equal zero, which can be visualized as the intersection of the planes when every equation is set equal to zero.
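Below is a minimal NumPy sketch of the two viewpoints described above. Only the first row [1, 2, 4] and the solution (1, 1, 1) come from the video; the remaining coefficients and the vector b are made up for illustration.

```python
import numpy as np

# Illustrative 3x3 system: the first row [1, 2, 4] is from the video,
# the other coefficients are invented so the matrix is invertible.
A = np.array([[1.0, 2.0, 4.0],
              [2.0, 1.0, 1.0],
              [3.0, 1.0, 2.0]])
b = A @ np.array([1.0, 1.0, 1.0])    # chosen so the solution is (1, 1, 1)

# View 1: "where do the three planes intersect?" -- solve the system directly.
x = np.linalg.solve(A, b)
print(x)                             # [1. 1. 1.]

# View 2: the same x, y, z act as scale factors on the COLUMN vectors,
# which add tip-to-tail to reach the vector b.
combo = x[0] * A[:, 0] + x[1] * A[:, 1] + x[2] * A[:, 2]
print(np.allclose(combo, b))         # True
```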

05:01

πŸ” Exploring Row and Column Spaces

This section delves into the row and column spaces of matrices, explaining how they are related to the null space and how they can be visualized geometrically. It describes the row space as the set of all vectors perpendicular to the null space, which in the video's 3D example is the plane spanned by the row vectors. The column space, on the other hand, is the set of all possible linear combinations of the column vectors. The paragraph also touches on the concept of linear dependence and how it affects the ability to span the entire space, contrasting this with linear independence. The discussion includes the implications of these spaces for solving systems of equations and how they relate to the physical world, such as in circuits.
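The relationships in this paragraph can be checked numerically. The sketch below uses an invented singular matrix (the video's exact modified matrix is not given in the summary) whose third row is the sum of the first two, so it has a one-dimensional null space and a two-dimensional row space.

```python
import numpy as np
from scipy.linalg import null_space

# Invented singular matrix: row 3 = row 1 + row 2, so the rows are dependent.
A = np.array([[1.0, 2.0, 4.0],
              [2.0, 1.0, 1.0],
              [3.0, 3.0, 5.0]])

N = null_space(A)                 # orthonormal basis for the null space
print(N.shape[1])                 # 1 -> the null space is a line in 3D

# Every row vector is perpendicular to the null space (all dot products are zero).
print(np.allclose(A @ N, 0))      # True

# The rank is the dimension of both the row space and the column space,
# and rank + nullity equals 3, the dimension of the matrix.
rank = np.linalg.matrix_rank(A)
print(rank, rank + N.shape[1])    # 2 3
```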

10:02

πŸ”Œ Applications of Matrices in Circuits

The paragraph explores the application of matrices in the context of electrical circuits, specifically using incidence matrices to represent networks of resistors and batteries. It explains how the null space of an incidence matrix can be used to determine the conditions under which there is no current flow in a circuit, which corresponds to a state of equilibrium where all voltages are equal. The discussion also covers the row space and how it can be used to identify vectors that represent valid potential differences in the circuit. The concept of graph reduction to a tree structure is introduced, explaining how cycles in a graph correspond to dependent rows in the matrix, and how the dimension of the row space relates to the number of edges that can be added without creating loops in the graph.
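As a hedged sketch of the circuit discussion: the incidence matrix below is reconstructed to be consistent with the 4-node, 5-edge graph and the two loop equations quoted later in the transcript (b1 + b4 - b5 = 0 and b1 + b2 + b3 + b4 = 0); the exact edge orientations are an assumption.

```python
import numpy as np
from scipy.linalg import null_space

# One row per edge: -1 at the "out of" node, +1 at the "into" node (assumed orientations).
A = np.array([[-1,  1,  0,  0],   # e1: v1 -> v2
              [ 0, -1,  1,  0],   # e2: v2 -> v3
              [ 0,  0, -1,  1],   # e3: v3 -> v4
              [ 1,  0,  0, -1],   # e4: v4 -> v1
              [ 0,  1,  0, -1]],  # e5: v4 -> v2
             dtype=float)

# A @ v gives the potential difference across each edge.
v = np.array([5.0, 5.0, 5.0, 5.0])       # every node at the same voltage...
print(A @ v)                              # ...gives no potential differences (all zeros)

# The null space is spanned by the constant vector (1, 1, 1, 1):
# shifting every node voltage by the same amount produces no current.
print(null_space(A).T)

# Column-space membership = Kirchhoff's voltage law: drops around each loop sum to zero.
b = A @ np.array([0.0, 2.0, 3.0, 1.0])    # a genuine vector of potential differences
print(b[0] + b[3] - b[4])                 # 0.0  (small loop)
print(b[0] + b[1] + b[2] + b[3])          # 0.0  (larger loop)
```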

15:03

πŸŽ“ Learning Resources and Conclusion

The final paragraph serves as a conclusion and a call to action for viewers interested in learning more about matrices and linear algebra. It promotes Brilliant.org as a resource for further education, highlighting its courses on linear algebra, graph theory, and differential equations. The paragraph emphasizes the practical applications of these topics, such as the Google page rank algorithm and laser technology. It also mentions a special offer for the first 200 people who sign up through a provided link, offering a discount on the annual premium subscription. The video concludes with a thank you to supporters and a prompt to follow on social media for future content.


Keywords

πŸ’‘Matrix

A matrix is a rectangular array of numbers, symbols, or expressions, arranged in rows and columns. In the context of the video, matrices are used to represent systems of linear equations and to perform operations that help visualize and solve these systems. The video explains how matrices can be thought of as sets of vectors, which are fundamental in understanding linear combinations and systems of equations.

πŸ’‘Vector

A vector is a quantity that has both magnitude and direction. In the video, vectors are used to represent the columns or rows of a matrix. The concept is crucial in understanding how matrices can be manipulated to find solutions to systems of equations. The video demonstrates how column vectors can be visualized and used to find scale factors that satisfy a given equation.

πŸ’‘Linear Equations

Linear equations are equations in which each variable appears only to the first power; in two variables they graph as straight lines, and in three variables as planes. The video uses 3x3 matrices to represent systems of three linear equations, where each equation can be visualized as a plane in three-dimensional space. The intersection of these planes, if it exists, gives the solution to the system.

πŸ’‘Null Space

The null space of a matrix is the set of all vectors that, when multiplied by the matrix, result in the zero vector. In the video, the null space is visualized as the intersection of the planes when all equations are set to zero; in the circuit example it corresponds to the voltage assignments that produce no potential differences and therefore no current flow.

πŸ’‘Row Space

The row space of a matrix is the set of all possible linear combinations of its row vectors. The video explains that the row space is always perpendicular to the null space and contains the row vectors themselves along with all their linear combinations. This concept is used to determine whether a given vector can be represented as a combination of the rows of a matrix.

πŸ’‘Column Space

The column space of a matrix is the set of all possible linear combinations of its column vectors. In the video, the column space is discussed in the context of graph theory, where it represents all possible output vectors that can be formed from the edges of a graph, adhering to certain conditions like Kirchhoff's voltage law.
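To echo the video's non-square example (four column vectors in the plane, two row vectors in four-dimensional space), the short check below uses made-up entries; the point is only that the row space and column space always share the same dimension, the rank.

```python
import numpy as np

# Made-up 2x4 matrix: its four columns live in R^2, its two rows live in R^4.
A = np.array([[1.0, 0.0, 2.0, -1.0],
              [0.0, 1.0, 1.0,  3.0]])

# The column space fills (at most) the xy-plane, while the row space is a plane
# sitting inside R^4, yet both have the same dimension: the rank of the matrix.
print(np.linalg.matrix_rank(A))      # 2  (dimension of the column space)
print(np.linalg.matrix_rank(A.T))    # 2  (dimension of the row space)
```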

πŸ’‘Gaussian Elimination

Gaussian elimination is a method for solving systems of linear equations by performing a series of row operations to transform the matrix into a row-echelon form. The video uses this method to simplify the analysis of systems of equations, making it easier to find solutions or determine the dimensions of the row and column spaces.
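A minimal forward-elimination sketch (not the video's code; it assumes nonzero pivots and skips row swaps) that mirrors the row operations described here: each row below a pivot is replaced by itself minus a multiple of the pivot row, which preserves the solution set while zeroing entries below the diagonal.

```python
import numpy as np

def forward_eliminate(A, b):
    """Reduce the augmented system [A | b] to upper-triangular form.

    Assumes every pivot encountered is nonzero (no row swapping), which holds
    for the illustrative system below but not for every matrix.
    """
    M = np.hstack([A.astype(float), b.reshape(-1, 1).astype(float)])
    n = len(b)
    for col in range(n - 1):
        for row in range(col + 1, n):
            factor = M[row, col] / M[col, col]
            M[row] -= factor * M[col]          # row operation: preserves the solutions
    return M[:, :n], M[:, n]

A = np.array([[1.0, 2.0, 4.0],
              [2.0, 1.0, 1.0],
              [3.0, 1.0, 2.0]])
b = np.array([7.0, 4.0, 6.0])                  # chosen so the solution is (1, 1, 1)

U, c = forward_eliminate(A, b)
print(U)                                       # upper-triangular, ready for back substitution
print(np.allclose(np.linalg.solve(A, b), [1.0, 1.0, 1.0]))   # True
```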

πŸ’‘Dot Product

The dot product is an algebraic operation that takes two equal-length sequences of numbers (usually coordinate vectors) and returns a single number. In the video, the dot product is used to determine the perpendicularity between vectors, which is crucial in identifying the row space and understanding the relationship between row vectors and the null space.

πŸ’‘Linearly Dependent

Vectors are linearly dependent if one can be expressed as a linear combination of the others. The video explains that if a set of vectors is linearly dependent, they do not span the entire space and are confined to a subspace, such as a line or a plane. This concept is important for understanding the column space and the solutions to a system of equations.
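A small numerical illustration with made-up vectors: the third column is built as a combination of the first two, so the set is linearly dependent, its span is only a plane, and nonzero scale factors exist that combine the vectors to the zero vector.

```python
import numpy as np

v1 = np.array([1.0, 0.0, 2.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = 2 * v1 - v2                       # dependent on v1 and v2 by construction

A = np.column_stack([v1, v2, v3])
print(np.linalg.matrix_rank(A))        # 2 -> the columns span only a plane, not all of R^3

# Nonzero scale factors that combine the columns to the zero vector:
coeffs = np.array([2.0, -1.0, -1.0])   # 2*v1 - 1*v2 - 1*v3 = 0
print(np.allclose(A @ coeffs, 0.0))    # True
```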

πŸ’‘Graph Theory

Graph theory is the study of graphs, which are mathematical structures used to model pairwise relations between objects. In the video, graph theory is applied to understand the relationships between nodes and edges in a network, using matrices like the incidence matrix to represent these relationships and analyze the properties of the network.

Highlights

The video introduces matrix manipulation with 3D software and applications not commonly taught in schools.

A matrix can be visualized as a set of vectors, either as column or row vectors.

Matrices can represent systems of equations, where each row corresponds to a linear equation.

Visualizing a matrix as a linear combination of column vectors provides an alternative to solving systems of equations.

The intersection of planes in 3D space can represent the solution to a system of equations.

The null space of a matrix is the set of all solutions when the system of equations equals zero.

Gaussian elimination is a method for solving systems of equations, but the video presents a visual approach to understanding it.

The video demonstrates how changing a matrix can lead to different solutions, such as a line of solutions instead of a single point.

The dot product of each row vector with any null-space vector is zero, revealing the perpendicular relationship between them.

The row space of a matrix is the set of all vectors perpendicular to the null space.

The video explains how the dimensions of the null space and the row space always add up to the dimension of the matrix.

Linear dependence of vectors is demonstrated through the inability to span the entire space, confined to a line or plane.

The column space of a matrix is the set of all possible output vectors that can be formed from the column vectors.

The video connects matrix analysis to graph theory and network analysis, showing practical applications of linear algebra.

An incidence matrix is introduced as a way to represent a network or circuit in terms of nodes and edges.

The null space of an incidence matrix represents the voltage configurations that result in no current flow in a circuit.

The video concludes by emphasizing the importance of understanding the visual and practical aspects of linear algebra beyond textbook explanations.

Transcripts

play00:00

this video was sponsored by brilliant

play00:02

every matrix paints some kind of picture

play00:05

while matrix manipulation or arithmetic

play00:07

tells a story and that's not just the

play00:10

one of how boring this can be in school

play00:12

at least for me the beginning of

play00:13

matrices was one of my least favorite

play00:15

parts of math so I wanted to at least

play00:17

show you what this all looks like with

play00:20

cool 3d software as well as an

play00:22

application I never learned in school so

play00:24

here we go when you're given a matrix it

play00:27

can often be useful to think of it as a

play00:29

set of vectors I'll be working mostly

play00:31

with 3x3 matrices and you can think of

play00:33

these both as a set of three column

play00:35

vectors or three row vectors we'll look

play00:38

into each where the column vectors come

play00:41

in immediately is when we use this

play00:43

matrix to represent a system of

play00:44

equations here I'm sure most of you know

play00:47

this gives you three linear equations

play00:48

for example the first is 1x plus 2y plus

play00:52

4z equals some b1 and the rest of the

play00:56

matrix is all the other coefficients but

play00:59

another way to visualize this same thing

play01:01

is to write it as a sum or linear

play01:03

combination of the column vectors where

play01:05

XY and z are now just scale factors here

play01:09

the first equation would be 1 times X

play01:11

plus 2 times y plus 4z equals b1 the

play01:15

exact same thing so given some system to

play01:18

solve you can visually think of this two

play01:20

ways for the first option you say if I

play01:23

were to graph each of these or in this

play01:24

case three planes where do they all

play01:26

intersect because that intersection is

play01:29

our solution XYZ and in this case it'd

play01:31

be 1 comma 1 comma 1 now I'm going to

play01:35

switch to geogebra real quick because

play01:37

it's better for vectors but the second

play01:39

option says to instead take the columns

play01:41

of our matrix and consider them as

play01:43

vectors then find which scale factors

play01:45

are needed such that those vectors add

play01:47

tip-to-tail to get some other vector b1

play01:50

b2 b3 so instead of an intersection

play01:54

we're looking for scale factors and in

play01:57

this case all of them would be 1 just

play01:58

add the vectors together as they are

play02:00

thus 1 comma 1 comma 1 is our solution

play02:03

just like we saw before so we have two

play02:06

totally different visualizations for the

play02:09

exact same question

play02:11

I like using the intersection one when I

play02:13

have to solve for x y and z but when I'm

play02:16

asked what are the possible outputs here

play02:18

for B then I like thinking of vectors

play02:21

now I'm going to change the matrix just

play02:23

a bit and also make the B vector all

play02:25

zeros this then changes the other

play02:28

equations and now let's go back to the

play02:30

3d plot here we have the first and third

play02:34

equation and unless they're parallel two

play02:37

different planes will always intersect

play02:38

in a line now if the remaining plane

play02:42

happens to intersect that same line as

play02:44

well which it does then we have an

play02:46

entire set of solutions XY and Z such

play02:49

that all these equations are zero the

play02:52

name we give to those solutions is the

play02:54

null space it's just the intersection of

play02:57

all your equations when they equal zero

play02:59

often that solution is just zero comma

play03:02

zero comma zero but sometimes there's

play03:04

more here the null space is one

play03:06

dimensional just a line in 3d space now

play03:11

on your homework you won't graph three

play03:13

planes most likely you do something like

play03:15

Gaussian elimination where you take two

play03:17

equations multiply one or both by a

play03:19

constant and cancel out one of the

play03:21

variables but instead of just

play03:23

multiplying by negative two immediately

play03:25

I'm gonna sweep the constant from zero

play03:27

to negative two and watch what happens

play03:29

to the resultant function which

play03:31

currently is just that second graph in

play03:33

pink

play03:43

so look you can see when you add any two

play03:46

of these linear equations regardless of

play03:48

the scale factor in front their

play03:50

intersection or the null space in this

play03:52

case is preserved the new plane just

play03:54

rotates about that intersection so we

play03:58

may have a totally different plane here

play04:00

but we haven't lost the solutions so we

play04:02

can just replace either equation one or

play04:04

two and still go through the analysis

play04:06

but now the arithmetic is a little

play04:08

easier because one of the coefficients

play04:10

is 0 if you do the same thing with

play04:15

equations 1 & 3 then one plane actually

play04:18

becomes another

play04:27

this happens because if we replace

play04:28

equation three these last two planes are

play04:31

the exact same now that means if we were

play04:34

to continue the elimination we get a row

play04:36

of all zeros and for square matrices at

play04:39

least a single row of zeros means we

play04:41

have a single free variable this tells

play04:44

us we have infinitely many solutions to

play04:46

the system and we say Z can be anything

play04:49

it's free

play04:50

but x and y depend on that value so we

play04:53

don't just have any solution those

play04:56

dependent variables correspond to

play04:58

something called pivots and since

play05:00

there's only one free variable then our

play05:02

null space will be one dimensional and

play05:06

by the way if we did have three planes

play05:09

that only intersect at a single point

play05:10

then the elimination eventually leads to

play05:13

a plane of one variable like in this

play05:15

case Z equals one and from there we

play05:18

would back solve to get Y and X but

play05:21

anyways now I want to complete the

play05:22

picture by putting back the original

play05:24

equations and graph now what if I told

play05:27

you that the dot product of the vector 1

play05:30

comma 2 comma 4 and some random vector X

play05:33

Y Z is 0

play05:35

well that means these two vectors are

play05:37

perpendicular

play05:38

but look the actual dot product or 1 X

play05:42

plus 2y plus 4z equals 0 is our first

play05:45

equation and XYZ represents the null

play05:50

space that line of solutions so our

play05:53

equation says the first row vector of

play05:55

our matrix 1 comma 2 comma 4 is

play05:57

perpendicular to the null space and the

play06:00

second equation says the second row

play06:02

vector is also perpendicular to that

play06:04

same line and same with the third these

play06:07

are all just dot products being equal to

play06:09

0 and the set of all vectors

play06:11

perpendicular to the null space line is

play06:14

this plane here and this is what we call

play06:17

the row space this is always

play06:20

perpendicular to the null space it

play06:22

contains the three row vectors all three

play06:24

are in that plane and it also contains

play06:27

every linear combination of those row

play06:29

vectors so we have a one-dimensional

play06:31

null space and a 2-dimensional row space

play06:34

which add to 3 and that matches this

play06:38

dimension of the matrix

play06:40

just note that this will always be true

play06:44

but don't forget these equations which

play06:46

represent planes and now we know also

play06:48

dot products with the null space can

play06:50

also be thought of as the combination of

play06:52

the column vectors since this is the

play06:57

exact same question we already know

play06:59

there are XYZ solutions to this that sum

play07:02

to the zero vector it's just all the

play07:04

values that made up that null space line

play07:05

from before so there are infinitely many

play07:08

scale factors that make this work and

play07:10

when a set of vectors can combine to the

play07:13

zero vector given scale factors that

play07:15

aren't all zero then those vectors are

play07:17

linearly dependent or you can also say

play07:20

one of these vectors is just a linear

play07:22

combination of the other two same thing

play07:26

when you have a square matrix with

play07:29

linearly dependent vectors it means

play07:31

those vectors don't span the entire

play07:33

space rather they're confined to like a

play07:35

line or in this case a plane all the

play07:39

column vectors are found here and also

play07:41

all of their linear combinations all

play07:43

possible tip-to-tail summations the name

play07:46

we give to that plane that the vectors

play07:47

span is the column space see often you

play07:53

could put any 3-vector here and find

play07:55

a solution which would mean the vectors

play07:57

are linearly independent but in the

play08:01

dependent case we can't have any

play08:03

solution the output vector has to lie

play08:05

within this plane the column space in

play08:07

order for a solution to exist the column

play08:11

space and the row space which I'll throw

play08:14

in here as well usually look very

play08:16

different but they're always the same

play08:18

dimension both are 2d in this case for

play08:22

non square matrices the row and column

play08:24

space are way different here the column

play08:26

space is just the XY plane these four

play08:28

vectors can only combine to some other X

play08:31

comma Y vector but the row space is the

play08:34

plane spanned by these two vectors in

play08:36

four dimensional space however both

play08:39

those spaces are planes that are

play08:41

themselves two-dimensional so that

play08:43

aspect does match but graphically these

play08:45

are very different now with regards to

play08:48

elimination the obvious reason as to why

play08:50

this is important is because it's used

play08:52

to solve systems of equations

play08:53

when there are many of those equations

play08:56

which can come up in circuits or other

play08:57

physical systems then we might not solve

play09:00

things by hand but we do have to tell

play09:02

computers how to get a solution however

play09:05

there's even more of a picture and story

play09:07

beyond just solving these equations and

play09:09

that has to do with graph theory and

play09:10

networks let's say we have some directed

play09:14

graph with four nodes and five

play09:15

connecting edges and I'll actually label

play09:18

all these edges e1 through five and the

play09:20

nodes n1 through n4 now you can

play09:23

think of this like a circuit where the

play09:25

edges are either resistors or a battery

play09:27

or whatever where current flows and the

play09:29

nodes would all have some specific

play09:31

voltage in fact I'll change the labels

play09:34

to voltages to stay consistent with this

play09:36

then the arrows would sort of represent

play09:38

current although we can't know the

play09:40

direction yet at least until we

play09:42

know if the voltage is positive or

play09:44

negative now we can represent this

play09:46

network with something called an

play09:47

incidence matrix that will have four

play09:50

columns for the four nodes and five rows

play09:52

for the five edges to fill this in just

play09:55

consider the first edge on the graph

play09:58

it's coming out of v1 and going into v2

play10:01

so we put a negative one under v1 and a

play10:04

positive one under v2 the rest are zero

play10:08

since they aren't connected to e1 e2 is

play10:12

then coming out of v2 and going into v3

play10:15

so we put a negative one under v2 and a

play10:18

1 under v3 then zeros for the non

play10:20

connected nodes this is all there is to

play10:23

it negative ones for the out of nodes

play10:26

and positive ones for the into nodes so

play10:29

the rest of the matrix would look like

play10:30

this now when we multiply this matrix by

play10:35

a vector of the voltages it equals every

play10:38

difference between connected nodes or

play10:40

really potential differences that's like

play10:43

the voltage drop across a resistor or a

play10:45

battery so now what does the null space

play10:48

of this matrix represent well remember

play10:52

the null space is all the solutions here

play10:54

or the voltages that output all zeros or

play10:57

no potential differences which is like

play10:59

asking which voltages will result in no

play11:02

current well I'm not going to show it

play11:05

but using Gaussian elimination we get

play11:07

this

play11:07

matrix here which again has the same

play11:09

null space all we did was rotate the

play11:11

higher dimensional equations around

play11:13

their intersection and this matrix has

play11:15

three pivots and one free variable

play11:18

this means v4 can be whatever and the

play11:21

rest of the voltages are dependent on

play11:23

what we pick I'll say v4 equals some

play11:26

arbitrary T and since the other

play11:28

equations are just going to lead to V 4

play11:30

equals V 3 V 3 equals V 2 and V 2 equals

play11:33

V 1 then every variable would have to be

play11:36

T or whatever V 4 was selected this is

play11:40

our null space just a line in four

play11:42

dimensions we can pick something for V 4

play11:45

like ground or 5 volts or whatever and

play11:48

so long as everything is the same then

play11:50

we have no potential differences or

play11:52

really no current yeah it's pretty

play11:55

obvious if you know your circuits but it

play11:57

gives you an idea of what the null space

play11:58

really means here and with regards to

play12:01

the row space if you were asked whether

play12:03

some vector is a part of it or whether it can

play12:05

be made by combinations of the rows then

play12:08

all you gotta do is see if it's

play12:09

perpendicular to the null space and

play12:11

doing a dot product we see that it is

play12:13

since we get out 0 in fact so long as

play12:17

all these numbers add to 0 then it's

play12:19

definitely in the row space for this

play12:20

matrix one thing that did have some more

play12:23

meaning though is the elimination we did

play12:25

to reiterate what we have here is the

play12:28

original incidence matrix on top and the

play12:30

reduced matrix on bottom the original

play12:34

graph looked like this but now I'm going

play12:36

to plot the graph or network associated

play12:38

with the bottom or reduced incidence

play12:40

matrix which would give us this here

play12:42

it's the same graph minus 2 edges but

play12:46

the thing to realize is that it has no

play12:48

loops meaning it's a tree and it turns

play12:50

out this will always be the case every

play12:53

connected graph reduces to a tree and

play12:56

certain rows or edges that create loops

play12:59

like this one that represents this edge

play13:01

eventually reduce to all zeros so we can

play13:05

say cycles lead to dependent rows since

play13:07

they reduced to 0 also the dimension of

play13:11

the row space or 3 in this case means

play13:13

you can have three edges in this graph

play13:15

without any loops but any fourth edge

play13:17

will create one

play13:22

lastly the column space is just what all

play13:25

the columns can combine to or any

play13:27

possible output vector be from a linear

play13:30

combination of these vectors if you go

play13:33

through with the analysis you find the

play13:34

columns combined to any vector so long

play13:37

as B 1 plus B 4 minus B 5 equals 0 and B

play13:42

1 plus B 2 plus B 3 + B 4 equals 0 this

play13:46

definitely has a physical meaning I'm

play13:48

using the letter B as a filler but

play13:50

really B 1 is just the first row

play13:52

summation so really V 2 minus V 1 B 4 is

play13:59

V 1 minus V 4 and B 5 is V 2 minus V 4

play14:04

so these values really just represent

play14:07

potential differences between two

play14:08

connected nodes and bringing back our

play14:11

original graph in circuit form we find

play14:13

those are the voltage drops in this loop

play14:16

thus the potential differences in this

play14:18

loop sum to zero and this is a

play14:21

fundamental law of circuits known as

play14:23

Kirchhoff's voltage law it emerges from

play14:25

analyzing the column space of the matrix

play14:27

and by the way the other equation

play14:30

corresponds to the larger loop where the

play14:32

voltages must also sum to zero so if you

play14:35

were given a vector and had to determine

play14:37

whether it's in the column space you

play14:39

just need to see whether it obeys

play14:40

Kirchhoff's voltage law this vector does

play14:42

not cuz this loop fails to sum to zero

play14:45

for example thus it's not in the column

play14:47

space everything we've seen here might

play14:51

not be what you typically learn when it

play14:52

comes to elimination row and column

play14:55

spaces and so on but within linear

play14:57

algebra there's almost always an

play14:58

interesting picture or story going on

play15:01

beyond what your textbook is telling you

play15:03

and if you want to dive deeper into what

play15:05

we've seen here as well as more advanced

play15:07

topics you can check out brilliant org

play15:09

the sponsor of this video to continue

play15:11

with the applications of matrices and

play15:13

linear algebra brilliant actually has

play15:15

several courses to learn from first

play15:17

their linear algebra course covers all

play15:19

the basics of matrices but it even gets

play15:21

to adjacency matrices the use of

play15:23

matrices in graph theory and unique

play15:25

applications like the Google page rank

play15:27

algorithm you can go beyond this though

play15:30

in their differential equation series

play15:31

which covers

play15:33

dynamical systems matrix exponentials and

play15:35

even more advanced applications like

play15:37

laser technology and the associated

play15:39

equations covering this wide range of

play15:42

applications really does help connect

play15:43

all the little pieces of linear algebra

play15:45

from determinants to eigenvectors to

play15:48

diagonalization and so on so you gain a

play15:50

much better understanding of the big

play15:51

picture and as you can see brilliant

play15:54

courses all come with intuitive

play15:55

animations and tons of practice problems

play15:57

so you know you have a solid

play15:58

understanding of whatever topic in math

play16:00

science or engineering you're interested

play16:03

in learning also the first 200 people to

play16:05

go to brilliant org slash Zach Star or

play16:07

click the link below will get 20% off

play16:09

their annual premium subscription and

play16:11

with that I'm gonna end that video there

play16:14

thanks as always my supporters on

play16:16

patreon social media links to follow me

play16:18

are down below and I'll see you guys in

play16:20

the next video


Related Tags
Linear Algebra, Matrix Manipulation, 3D Visualization, Educational Content, Mathematics, Coding Applications, Graph Theory, Network Analysis, Brilliant.org, Learning Resources