Top 7 Algorithms for Coding Interviews Explained SIMPLY

Codebagel
12 Aug 2022 · 21:21

Summary

TL;DR: This video covers the seven most crucial algorithms for coding interviews: binary search, depth-first search, breadth-first search, insertion sort, merge sort, quick sort, and greedy algorithms. It explains each algorithm's mechanism, real-life applications, and time complexity, and when to use each one effectively. Through visualizations and relatable examples, it builds a solid understanding of these fundamental algorithms, preparing viewers to excel in coding interviews and strengthen their problem-solving abilities.

Takeaways

  • 😃 Binary search is an efficient search algorithm for sorted lists, with a time complexity of O(log n).
  • 🤖 Depth-first search (DFS) and breadth-first search (BFS) are crucial algorithms for traversing trees and graphs, commonly used in coding interviews.
  • 🧮 Insertion sort is a simple sorting algorithm with a time complexity of O(n) for best case and O(n^2) for worst case, suitable for small or mostly sorted lists.
  • 📈 Merge sort is a divide-and-conquer sorting algorithm with a time complexity of O(n log n), efficient for larger, unsorted lists.
  • 🔄 Quick sort is a divide-and-conquer sorting algorithm with a time complexity of O(n log n) for the average case, but can degrade to O(n^2) for the worst case. It is generally faster than merge sort on average but harder to implement correctly.
  • 💰 Greedy algorithms make the locally optimal choice at each step, without considering the global optimum. They are useful when finding the exact optimal solution is computationally infeasible.
  • 🧭 The traveling salesman problem is a classic example where greedy algorithms can provide a reasonably good approximation when finding the exact optimal solution is impractical due to the exponential growth of possible routes.
  • 🔑 Understanding time complexity is crucial for evaluating and comparing the efficiency of different algorithms.
  • 📚 Familiarity with basic data structures, such as arrays, trees, and graphs, is essential for understanding and implementing these algorithms effectively.
  • 🚀 Mastering these algorithms is essential for coding interviews, as they are among the most commonly asked questions, particularly for data structures and algorithms.

Q & A

  • What are the three search algorithms discussed in the script?

    -The three search algorithms discussed are binary search, depth-first search (DFS), and breadth-first search (BFS).

  • What is the time complexity of the binary search algorithm?

    -The time complexity of the binary search algorithm is O(log n).

  • What is the difference between depth-first search (DFS) and breadth-first search (BFS)?

    -DFS explores as far as possible along each branch before backtracking, while BFS explores all the nodes at the current level before moving on to the next level.

  • What is the time complexity of DFS and BFS?

    -Both DFS and BFS have a time complexity of O(V + E), where V is the number of vertices (nodes) and E is the number of edges (connections).

  • What are the three sorting algorithms discussed in the script?

    -The three sorting algorithms discussed are insertion sort, merge sort, and quick sort.

  • What is the best-case and worst-case time complexity of insertion sort?

    -The best-case time complexity of insertion sort is O(n), and the worst-case time complexity is O(n^2).

  • What is the time complexity of merge sort?

    -The time complexity of merge sort is O(n log n) for both the best and worst cases.

  • What is the best-case and worst-case time complexity of quick sort?

    -The best-case time complexity of quick sort is O(n log n), while the worst-case time complexity is O(n^2).

  • What is a greedy algorithm, and when is it used?

    -A greedy algorithm makes the best possible choice at every local decision point. It is used when computing the exact optimal solution is infeasible and a reasonably good approximation is acceptable.

  • Can you provide an example of a problem where a greedy algorithm might be useful?

    -The traveling salesman problem, where the goal is to find the shortest possible route that visits each city once and returns to the origin city, is a good example of a problem where a greedy algorithm might be useful when the number of cities becomes too large to calculate the optimal solution.

Outlines

00:00

🔍 Essential Algorithms for Coding Interviews

This segment introduces seven crucial algorithms for coding interview preparation, categorized into search, sorting, and a special type called greedy algorithms. It emphasizes the importance of understanding basic data structures and time complexity prior to learning these algorithms, suggesting resources for refreshers. The video begins with binary search, providing a visual example to demonstrate its efficiency compared to linear search, especially in large datasets. Binary search's O(log n) time complexity is highlighted as significantly more efficient, making it a preferred method for searching in sorted lists.

05:01

🌳 Search Algorithms: DFS and BFS

The video continues with depth-first search (DFS) and breadth-first search (BFS), two critical algorithms for traversing trees and graphs, essential for software engineering interviews. DFS is explained as a method of exploring as far down a branch as possible before backtracking, useful for tasks like solving mazes. BFS, on the other hand, examines all nodes at one level before moving to the next, applicable in scenarios like chess algorithms. Both have a time complexity of O(V + E) and are illustrated through visual demonstrations. The importance of these algorithms in interviews and their practical applications are underscored.

10:02

🔄 Sorting Algorithms: From Basic to Advanced

This part introduces three fundamental sorting algorithms: insertion sort, merge sort, and quick sort. Insertion sort is described as suitable for small or mostly sorted lists due to its O(n) to O(n^2) time complexity. Merge sort, a divide-and-conquer algorithm, is recommended for larger lists, offering a consistent O(n log(n)) runtime. Quick sort, although potentially as inefficient as O(n^2) in the worst case, is typically faster on average due to optimization possibilities, making it preferred for its balance of speed and memory efficiency. Each algorithm's workings, use cases, and efficiencies are explained with visual aids.

15:07

🏹 Greedy Algorithms: Efficient but Not Always Optimal

The final segment discusses greedy algorithms, which make the most advantageous local decision at each step. It details scenarios where greedy algorithms may not find the optimal solution but are useful when dealing with large datasets where calculating the optimal solution is impractical. The segment uses the travelling salesman problem to illustrate the exponential increase in complexity with more cities, highlighting the utility of greedy algorithms for finding a relatively efficient solution amidst computationally intensive problems. The video closes by reiterating the value of understanding these algorithms for interviews and encourages viewer engagement.


Keywords

💡Binary Search

Binary Search is an efficient algorithm for finding the position of an element in a sorted list. By starting in the middle of the list and eliminating half of the remaining elements with each comparison, binary search drastically reduces the number of comparisons needed to find an element. In the video, binary search is contrasted with linear search to highlight its efficiency, particularly in scenarios where the list is large, as it operates in O(log n) time complexity, significantly faster than O(n) for linear search.
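The halving strategy described above can be sketched in a few lines of Python (a minimal illustration of the idea, not code from the video; the function and variable names are our own):

```python
def binary_search(items, target):
    """Return the index of target in a sorted list, or -1 if absent."""
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2          # look at the middle element
        if items[mid] == target:
            return mid
        elif items[mid] < target:
            lo = mid + 1              # discard the lower half
        else:
            hi = mid - 1              # discard the upper half
    return -1

print(binary_search([2, 5, 8, 12, 16, 23, 38], 23))  # → 5
```

Each iteration discards half of the remaining candidates, which is exactly where the O(log n) bound comes from.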

💡Depth-First Search (DFS)

Depth-First Search (DFS) is an algorithm used for traversing or searching tree or graph data structures. The algorithm starts at the root (selecting some arbitrary node as the root in the case of a graph) and explores as far as possible along each branch before backtracking. The video explains DFS as diving deep into one path all the way to the end before exploring another path, using backtracking. DFS is represented as useful in scenarios like solving mazes, with a time complexity denoted as O(V + E) where V represents vertices and E represents edges.
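A rough Python sketch of the go-deep-then-backtrack behaviour described above (illustrative only; the adjacency-list graph format and the names are assumptions, not from the video):

```python
def dfs(graph, node, visited=None):
    """Visit nodes depth-first, returning them in visit order."""
    if visited is None:
        visited = []                       # the "visited array" from the video
    visited.append(node)
    for neighbour in graph[node]:
        if neighbour not in visited:
            dfs(graph, neighbour, visited)  # go as deep as possible, then backtrack
    return visited

graph = {"A": ["B", "C"], "B": ["D"], "C": [], "D": []}
print(dfs(graph, "A"))  # → ['A', 'B', 'D', 'C']
```

The recursion unwinding is what the video calls backtracking: when a branch is exhausted, control returns to the previous node to try its unvisited neighbours.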

💡Breadth-First Search (BFS)

Breadth-First Search (BFS) is an algorithm for searching a tree or graph data structure. It starts at the tree root and explores the neighbor nodes first, before moving to the next level neighbors. BFS uses a queue to keep track of the nodes to be explored and operates in O(V + E) time complexity, similar to DFS. In the video, BFS is described as examining all nodes at one level before moving down to the nodes of the next level, useful in situations like predicting the best move in a chess game.
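The level-by-level traversal with a queue can be sketched like this in Python (a minimal illustration under the same assumed adjacency-list format; names are our own):

```python
from collections import deque

def bfs(graph, start):
    """Visit nodes level by level, returning them in visit order."""
    visited = [start]
    queue = deque([start])                # the "neighbours queue" from the video
    while queue:
        node = queue.popleft()            # take the oldest node first (FIFO)
        for neighbour in graph[node]:
            if neighbour not in visited:
                visited.append(neighbour)
                queue.append(neighbour)   # enqueue its direct neighbours
    return visited

graph = {"A": ["B", "C"], "B": ["D"], "C": [], "D": []}
print(bfs(graph, "A"))  # → ['A', 'B', 'C', 'D']
```

Using a FIFO queue (rather than DFS's implicit stack) is what guarantees every node on one level is visited before any node on the next.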

💡Insertion Sort

Insertion Sort is a simple, intuitive sorting algorithm where the sorted array is built one element at a time. The video describes the process of comparing each element with its predecessor and moving it to its correct position in the sorted part of the array. Despite its simplicity, the video notes insertion sort's limitations, specifically its best-case time complexity of O(n) and worst-case time complexity of O(n^2), making it suitable for small or mostly sorted datasets.
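The compare-and-swap process described above might look like this in Python (a minimal sketch of the technique, not code from the video):

```python
def insertion_sort(items):
    """Sort in place by swapping each element backwards into position."""
    for i in range(1, len(items)):
        j = i
        # keep swapping with the predecessor while it is larger
        while j > 0 and items[j] < items[j - 1]:
            items[j], items[j - 1] = items[j - 1], items[j]
            j -= 1
    return items

print(insertion_sort([5, 2, 4, 1]))  # → [1, 2, 4, 5]
```

On an already-sorted input the inner loop never runs, giving the O(n) best case; on a reverse-sorted input every element swaps all the way back, giving O(n^2).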

💡Merge Sort

Merge Sort is a 'divide and conquer' algorithm that divides the list into halves, sorts each half, and then merges the sorted halves. The video highlights merge sort's efficiency with a consistent O(n log(n)) time complexity for both the best and worst cases. This makes merge sort suitable for large, unsorted lists, contrasting with insertion sort's inefficiency in similar scenarios.
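A compact Python sketch of the split-and-merge idea (purely illustrative; this version builds new lists rather than sorting in place):

```python
def merge_sort(items):
    """Recursively split the list in half, then merge the sorted halves."""
    if len(items) <= 1:
        return items                       # a 0- or 1-element list is sorted
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])
    merged = []
    while left and right:                  # merge step: always take the smaller head
        merged.append(left.pop(0) if left[0] <= right[0] else right.pop(0))
    return merged + left + right           # append whichever half has leftovers

print(merge_sort([38, 27, 43, 3, 9, 82, 10]))  # → [3, 9, 10, 27, 38, 43, 82]
```

The list is halved O(log n) times and each level does O(n) merging work, which is where the consistent O(n log n) runtime comes from.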

💡Quick Sort

Quick Sort is a divide-and-conquer sorting algorithm that selects a 'pivot' element and partitions the other elements into two sub-arrays, according to whether they are less than or greater than the pivot. The process is then repeated for the sub-arrays. The video mentions quick sort's best-case time complexity of O(n log(n)) and worst-case of O(n^2), but notes that with optimization, quick sort is usually faster and more memory-efficient than merge sort.
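A simplified Python sketch of the pivot-and-partition idea (illustrative only; this version builds extra lists rather than using the in-place two-pointer scheme the video demonstrates, which is the variant that earns quick sort its memory advantage):

```python
def quick_sort(items):
    """Partition around a pivot, then sort each side recursively."""
    if len(items) <= 1:
        return items
    pivot = items[len(items) // 2]        # middle element as a cheap pivot guess
    less = [x for x in items if x < pivot]
    equal = [x for x in items if x == pivot]
    greater = [x for x in items if x > pivot]
    return quick_sort(less) + equal + quick_sort(greater)

print(quick_sort([9, 1, 8, 2, 7]))  # → [1, 2, 7, 8, 9]
```

The worst case O(n^2) arises when the pivot repeatedly lands near the minimum or maximum, so each "half" barely shrinks; good pivot selection is what keeps real implementations near O(n log n).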

💡Greedy Algorithm

Greedy algorithms make the locally optimal choice at each stage with the hope of finding a global optimum. The video explains that greedy algorithms are not always efficient for finding the optimal solution because they do not consider the bigger picture. However, they are used when the search space is too large to compute an exact solution, as in the traveling salesman problem, to find a close-to-optimal solution in a reasonable time.
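One common greedy approach to the traveling salesman problem discussed here is the nearest-neighbour heuristic: from each city, simply travel to the closest city not yet visited. A toy Python sketch (the city names, distances, and function name are made up for illustration):

```python
def nearest_neighbour_route(distances, start):
    """Greedy TSP heuristic: always travel to the closest unvisited city."""
    route = [start]
    while len(route) < len(distances):
        current = route[-1]
        # locally optimal choice: the closest city we have not visited yet
        next_city = min(
            (c for c in distances[current] if c not in route),
            key=lambda c: distances[current][c],
        )
        route.append(next_city)
    return route + [start]                # return to the origin city

distances = {
    "A": {"B": 1, "C": 4, "D": 3},
    "B": {"A": 1, "C": 2, "D": 5},
    "C": {"A": 4, "B": 2, "D": 1},
    "D": {"A": 3, "B": 5, "C": 1},
}
print(nearest_neighbour_route(distances, "A"))  # → ['A', 'B', 'C', 'D', 'A']
```

This runs in polynomial time instead of enumerating all (n-1)!/2 routes, but because each step ignores the bigger picture, the route it returns is not guaranteed to be the shortest.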

💡Time Complexity

Time complexity is a computational concept that describes the amount of computer time it takes to run an algorithm. The video references time complexity in discussing the efficiency of different algorithms, using notations like O(n), O(log n), and O(n log n) to compare their performance. Understanding time complexity helps in selecting the most appropriate algorithm for a problem based on the size and nature of the dataset.

💡Divide and Conquer

Divide and Conquer is a strategy for solving problems by breaking them down into smaller, more manageable problems, solving each of the smaller problems independently, and then combining their solutions to solve the original problem. The video illustrates this concept through the explanations of merge sort and quick sort, which both utilize this strategy to sort lists more efficiently than simpler, more straightforward algorithms.

💡Backtracking

Backtracking is a strategy used in algorithms to find solutions to problems by incrementally building candidates to the solutions and abandoning a candidate ('backtracking') as soon as it determines that this candidate cannot possibly lead to a final solution. The video demonstrates backtracking in the context of Depth-First Search (DFS), where the algorithm returns to explore other branches when it reaches a dead end.

Highlights

These are the 7 most important algorithms you have to know if you're preparing for coding interviews: binary search, depth-first search, breadth-first search, insertion sort, merge sort, quick sort, and greedy algorithms.

Binary search is a search algorithm used to find the position of a specific element in a sorted list. It has a runtime of O(log n), which is much better than linear search's runtime of O(n).

Depth-first search (DFS) is an algorithm used to search through trees and graphs by starting at the root node and going as far down one branch as possible until reaching the deepest point, backtracking to unvisited branches. Its time complexity is O(V + E), where V represents total nodes and E represents total branches.

Breadth-first search (BFS) is an algorithm used to search through trees and graphs by looking at every node at one level before continuing further down. Its time complexity is also O(V + E).

Insertion sort is a simple sorting algorithm that works by comparing and swapping adjacent elements until the list is sorted. Its best-case runtime is O(n), and its worst-case runtime is O(n^2), making it suitable for small or mostly sorted lists.

Merge sort is a divide-and-conquer sorting algorithm that recursively splits the array in half, sorts each subarray, and then merges them back together. Its runtime is O(n log n) in both best and worst cases.

Quick sort is a divide-and-conquer sorting algorithm that picks a pivot element and partitions the array into two subarrays, one with elements less than the pivot and one with elements greater than the pivot. It has a best-case runtime of O(n log n) but a worst-case runtime of O(n^2). However, when implemented correctly, it is the fastest of the three sorting algorithms on average.

Greedy algorithms make the best possible choice at every local decision point without looking too far into the future. They are used when finding the exact optimal solution is computationally infeasible and a reasonably good approximation is acceptable.

The traveling salesman problem is a famous example of when a greedy algorithm might be used, as calculating the exact optimal route for visiting a large number of cities is computationally expensive or impossible.

Understanding basic data structures and time complexity is important before learning these algorithms.

Binary search only works when the list is sorted.

DFS and BFS are extremely important as they are used to solve the most common questions asked at coding interviews, just behind arrays.

One real-life example of where DFS might be useful is for solving mazes.

One real-life example of where BFS is used is for Chess algorithms that predict the best move at any given point in a game.

The time complexity of each algorithm is discussed, highlighting when to use one over the other based on the input size and order.

Transcripts

play00:01

These are the 7 most important algorithms you have to know if you’re preparing for

play00:05

coding interviews. We’ll start off with the three search algorithms, binary search,

play00:11

depth-first search, and breadth-first search; then, we’ll take a look at the three sorting

play00:16

algorithms, insertion sort, merge sort, and quick sort; and lastly, we’ll take a look

play00:21

at a special type of algorithm called a greedy algorithm.

play00:24

We’ll cover what each algorithm is and give a visual demonstration, explain when to use

play00:30

them, and discuss how efficient they are. Quick note. You should understand basic data

play00:36

structures before learning these algorithms. If you don’t know the data structures or

play00:40

need a refresher, check out this video here first, linked in the description below. Knowing

play00:46

time complexity will also help you during this video, so for a 2 minute recap, check

play00:50

out this video. Let’s not waste any more time and get right

play00:54

into it. Binary search is a search algorithm that is

play01:01

used to find the position of a specific element in a sorted list.

play01:06

I find it’s best explained with an example, so here’s a visual demonstration and example.

play01:11

Let’s say we were playing a guessing game with a friend, where your friend chooses a

play01:16

number from 1-100 and you have to guess what the number is. With each guess, your friend

play01:22

will tell you if the correct number is higher or lower than the one you guessed.

play01:28

One possible way of guessing is to start at 1, and go up by 1 each time until you guess

play01:32

it. This is called linear search. While this would work well if the number was close to

play01:38

1, imagine if the correct answer was 100. It would take 100 guesses to get the correct

play01:45

answer. On top of that, what if our guessing range was instead between 1 and 10,000? As

play01:53

you can see, this becomes inefficient when scaled, which is NOT optimal for a computer

play01:58

algorithm. This algorithm has a runtime of O(n).

play02:02

Now, let’s take a look at a different way of guessing the number between 1 and 100.

play02:08

Instead of starting at 1, let’s start right in the middle, at 50. Now, when our friend

play02:14

tells us if the correct answer is higher or lower, we are getting rid of half of the possible

play02:18

numbers to guess. We can keep guessing the middle number of the remaining possibilities,

play02:24

and eventually, we reach our answer. This algorithm has a runtime of O(log(base 2) n),

play02:30

commonly written as just O(logn). Let’s compare this to the linear search

play02:36

algorithm we just looked at before, with a range of 1 to 10,000.

play02:41

Well, let’s look at the worst case scenario, which is when the number is 10,000. For linear

play02:48

search, it’s pretty clear how long it would take: 10,000 guesses! We start at 1, and go

play02:54

through all the way to 10,000. But what about for binary search? Feel free to try it yourselves,

play03:00

but I’m going to go ahead and instead use binary search’s time complexity, which we

play03:04

know is O(log n). So, log (base 2) of 10,000 comes out to 13.3, so it would take 14 guesses

play03:15

to reach the worst case! That’s MUCH better than the 10,000 guesses linear search took

play03:20

us. So there we go, that’s binary search. It’s

play03:24

a pretty simple algorithm that you’ve probably used in real life to search through a sorted

play03:29

list of things. The important thing to remember here is that it only works when the list is

play03:34

sorted. If you ever see a sorted list that requires you to find an element, binary search

play03:40

is usually a good place to start. If you’ve reached this point in the video,

play03:47

don’t click off now, because these next two search algorithms are the most important

play03:52

to know. Depth-first search, and the one we’ll look at next, breadth-first search, are two

play03:58

algorithms used to search through trees and graphs, which as you might remember from the

play04:03

Software Engineering Interview Process video, are the most common questions asked at interviews

play04:09

just behind arrays. So yeah, they’re extremely important.

play04:13

We’ll be starting with depth-first search, commonly referred to as DFS. The idea with

play04:19

DFS is to start at the root node, and go as far down one branch as possible, all the way

play04:25

to the end. Once we hit the deepest point, which is where the “depth” part of DFS

play04:31

comes from, we come back to an unvisited branch, and go down that one. This process of coming

play04:38

back to an unvisited branch is called backtracking. Here’s a visual demonstration of how we

play04:44

might traverse through this graph using DFS. Before doing any type of DFS, we want to create

play04:50

a visited array, that keeps track of nodes we’ve already visited.

play04:55

To begin the DFS, first, we’ll start at the root node, and go down the first branch

play05:00

to the next node. We’ll also add both the root node and the next node to our visited

play05:06

array. From here, we continue going down this same branch, adding each node we hit along

play05:12

the way to our visited array. Once we reach the bottom, we backtrack up to the previous

play05:18

node, and see if there are any unvisited nodes we can go through. As you can see, there is

play05:25

one, so we’ll go down to it, and add it to the visited array. Now, we backtrack to

play05:30

the previous node again. Do we have any more unvisited nodes here? No, so we backtrack

play05:37

up again. We repeat this process of traversing unvisited nodes and backtracking up until

play05:43

we’ve gone through the entire graph. One real-life example of where depth-first

play05:49

search might be useful is for solving a maze, which is actually what they were originally

play05:54

created for! Starting from the entrance of the maze, we look through each path all the

play06:00

way to the very end to see if it contains the exit. If we hit a wall, we backtrack to

play06:06

the nearest opening, and continue. The time complexity for this algorithm is

play06:11

given a notation of O(V + E), where V represents total nodes AKA vertices, and E represents

play06:19

total branches AKA edges. We’ll explore the why behind this a bit more in a video

play06:25

dedicated to depth-first search, but for now, all you need to know is that the runtime is

play06:31

Big O of V + E. Now, let’s move on to DFS’s counterpart.

play06:41

Now that we’ve covered DFS, breadth-first search, commonly referred to as BFS, will

play06:47

be a bit easier to understand. In BFS, instead of going all the way down each branch and

play06:53

backtracking up, we’ll look at every node at one level before continuing further down.

play06:59

Let’s look at a visual demonstration. As was the case with depth-first search, for

play07:04

breadth-first search, we will also create a visited array to keep track of what nodes

play07:09

we’ve already visited. We’ll also create a neighbours queue, which we’ll add all

play07:14

neighbours of a node to. To begin with BFS, we start at the root node,

play07:20

add it to the visited array, and we’ll add all of the nodes it’s connected to, to the

play07:24

neighbours queue. We’ll then take a look at the first node in the neighbours queue,

play07:29

mark it as visited, and add all of its direct neighbours to the end of the queue.

play07:35

This process continues, and as you can see, we’ll have visited each node on the first

play07:41

level before we progress down to the next level of nodes.

play07:45

One real-life example of where breadth-first search is used is for Chess algorithms. For

play07:50

those of you who don’t know what this is, it’s where a computer program predicts what

play07:54

the best move is at any given point in a game. The way they work is by starting at a certain

play08:01

player’s turn, and taking a look at each possible move they have next. The algorithm

play08:06

looks at all possible moves for the next turn, and then looks at all possible moves from

play08:11

all of those possible moves. As I’m hoping you can start to see, this is just breadth-first

play08:17

search, where the nodes are moves, and the levels are turns.

play08:23

Just like with depth-first search, the runtime for this algorithm is also O(V + E). Again,

play08:30

this will be covered more in-depth in a dedicated video to breadth-first search, but we’ll

play08:35

save that for another day. And there we go. We’ve covered all 3 search

play08:40

algorithms! Next we’re going to cover the 3 sorting algorithms, but before I do, just

play08:45

a quick ask. If you’re enjoying this video, please like, subscribe, and share it with

play08:50

friends. I’m usually putting 12+ hours into each video, and trying to balance that with

play08:56

a full-time schedule sometimes requires me to stay up until 2am or 3am to get videos

play09:02

done. I’d love to keep making more and more videos, so I’d really appreciate it if you

play09:07

could share these videos with anyone you know. Thanks so much, let’s continue with the

play09:11

algorithms. Sorting algorithms are used to sort lists

play09:18

of elements, usually in ascending order. There are tons of different types of sorting algorithms,

play09:23

but today we’re looking at the 3 most important for interviews. Insertion sort is the first

play09:29

of the sorting algorithms, and the easiest to start with.

play09:32

Insertion sort works as follows. The algorithm looks at the first element in the list, and

play09:38

compares it with the second. If the second is less than the first, swap the two. Then,

play09:45

compare the element in the second position with the element in the third position. If

play09:50

the 3rd element is less than the 2nd element, swap the two. If this element is also smaller

play09:56

than the 1st element, swap them again. As you can see, we’ll continue with this pattern

play10:01

until we reach the end and voila! We have a sorted list of elements now.

play10:07

Insertion sort is a simple sorting algorithm, and that’s where its limitations are. Insertion

play10:13

sort has a best-case run-time of O(n), and a worst-case run-time of O(n^2). It is O(n)

play10:22

when everything is already sorted, as it only goes through each element, and O(n^2) when

play10:28

nothing is sorted, because it has to go through every element times the total number of elements.

play10:33

As a result, insertion sort is best used for lists that are already mostly sorted, or for

play10:41

lists of smaller sizes. Once the lists grow larger and more unordered, the O(n^2) run-time

play10:48

starts to become problematic. Now, let’s take a look at a sorting algorithm

play10:54

that might be better for large lists. Merge sort is a sorting algorithm that falls

play11:03

under the category of “divide-and-conquer” algorithms, because it breaks up the problem

play11:08

into smaller problems, and solves each smaller problem. Does this sound familiar to you?

play11:14

Well, if you watched my video explaining recursion, I hope you recognized that this is actually

play11:19

a recursive algorithm! As per usual, let’s take a look at a visualization

play11:24

for this. We start by splitting the array in half, and we continue to split each subarray

play11:29

in half until the array has been split into pairs. Then, at the same time, each pair is

play11:36

going to do a comparison of the 1st element and the 2nd element, and swap them if the

play11:42

2nd is greater than the 1st. Now we have sorted pairs. The next thing our algorithm does is

play11:49

combine two sets of pairs, and do the exact same thing as before, sorting the numbers

play11:54

for each group of 4. This process continues all the way back up until we reach the full

play12:00

array, which is now sorted. Merge sort has a run-time of O(n log(n)) in

play12:07

the best and worst cases. Comparing that to insertion sort, we can see when you might

play12:13

want to use one over the other. For smaller, already somewhat sorted lists, insertion sort

play12:19

has a runtime closer to O(n), whereas merge sort is O(n log(n)), so that’s where we

play12:25

might want to use insertion sort. For larger, less sorted lists, insertion sort has a runtime

play12:31

closer to O(n^2), whereas merge sort remains at O(n log(n)), so that’s where we might

play12:37

want to use merge sort. I’m hoping this is all making sense so far,

play12:42

and you’re understanding how the algorithms work, and when we might want to use one over

play12:46

the other. Now, time to look at our last sorting algorithm.

play12:55

Quick sort is our final sorting algorithm, and the most complicated of the three. However,

play13:01

once you understand how the other two work, it becomes a lot easier, which is why I left

play13:06

it for last. Like merge sort, quick sort is a divide-and-conquer algorithm, and is recursive.

play13:14

The idea is to pick a pivot number, ideally as close to the median of the list as possible,

play13:20

and split the list into two lists, one where all the numbers are less than the pivot, and

play13:25

one where all the numbers are greater than the pivot. We continue this process on each

play13:30

sublist, until we’ve reached our sorted list.

play13:33

Let’s look at the visualization for it. We start with our list, and try to pick a

play13:39

number close to the median. Once we’ve selected that number, we move it to the end of the

play13:44

list. Now, we place a pointer at the left-most element, and the right-most element, and we

play13:51

compare the two. If the left-most element is greater than the pivot, and the right-most

play13:57

element is less than the pivot, we swap them; otherwise, we leave them as is. We continue

play14:04

through the list until our pointers cross; once they do, we replace the element at the

play14:10

left pointer with our pivot. Now, everything before our pivot is less than the pivot, and

play14:17

everything after our pivot is greater. We will do this same process on both of these

play14:22

lists now, choosing a median pivot for both, and sorting the same way. Once we finish,

play14:29

the list is completely sorted. Quick sort has a best-case run-time of O(n

play14:35

log(n)), but a worst-case run-time of O(n^2). You might be wondering why we would ever use

play14:42

this algorithm, as it has a worse time complexity than both of our previous sorting algorithms.

play14:48

Well, this is where it gets interesting, and a bit complicated. Quick sort, when implemented

play14:55

correctly, is actually the fastest of the three on average. The complexities behind

play15:01

it are best saved for another video, but for a simple reason, the code in the inner loop

play15:06

of the quick sort algorithm can be optimized to significantly reduce probability of worst-case,

play15:12

and is on average, 2-3x faster than merge sort.

play15:18

On top of that, quick sort has a space complexity of O(log n), whereas merge sort has a space

play15:25

complexity of O(n), so quick sort is better for memory. However, one of the largest drawbacks

play15:32

to quick sort is that all of this relies on the code being optimized. Any little mistake

play15:38

or inefficiency in the quick sort code can cause the algorithm to run much slower, and

play15:43

it is a much harder algorithm to implement than merge sort.

play15:47

That wraps up our sorting algorithms. Before this video ends, I want to take a look at

play15:52

one more type of algorithm, which is a special type of algorithm.

play16:00

When you think of someone being greedy, what do you think of? Usually it’s someone who

play16:05

always wants and does the best thing for themselves in any scenario. Well, that’s exactly what

play16:12

this algorithm does. Greedy algorithms are algorithms that make

play16:16

the best possible choice at every local decision point. Put more simply, every time they have

play16:23

to make a decision, they just look at what the next best move is, without looking too

play16:28

much into the future. We’re first going to look at when NOT to

play16:32

use greedy algorithms, and then when you should use them.

play16:36

Greedy algorithms are not used when you need the guaranteed best answer, because they never look at every possible outcome, just the best outcome at each stage. Here's an example of where a greedy algorithm doesn't work optimally. Consider this scenario: for each decision you make here, you have to spend a certain amount of money, indicated by the number on the path. Think for a moment: what should you do?

Hopefully you came up with this path as the correct solution. However, using a greedy algorithm, we might not get this. The algorithm first looks at the first two choices: 7 and 8. It chooses what's best right then, which is the 7, and moves on. From here, it looks at its next two choices: 9 and 10. Again, it chooses what's best in the moment, which is the 9, and reaches the end. The greedy algorithm reached the end spending $16 total, but if we did it ourselves, we could reach the end spending only $3!

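We can sketch this greedy-versus-exhaustive comparison in a few lines of Python. The tree and its costs below are hypothetical (the video's diagram isn't reproduced here), chosen so that the cheaper-looking first step leads to a more expensive total:

```python
# A tiny decision tree as nested tuples: (cost, left_subtree, right_subtree).
# Leaves are (cost, None, None). Costs are made up for illustration.
tree = (0,
        (7, (9, None, None), (10, None, None)),
        (8, (1, None, None), (12, None, None)))

def greedy_cost(node):
    """Always descend into the cheaper-looking child, summing costs."""
    cost, left, right = node
    if left is None:
        return cost
    child = left if left[0] <= right[0] else right
    return cost + greedy_cost(child)

def optimal_cost(node):
    """Exhaustively explore both subtrees and keep the cheaper total."""
    cost, left, right = node
    if left is None:
        return cost
    return cost + min(optimal_cost(left), optimal_cost(right))

print(greedy_cost(tree))   # greedy takes 7, then 9 -> 16
print(optimal_cost(tree))  # exhaustive search finds 8, then 1 -> 9
```

The exhaustive version is exactly the kind of DFS-style search the next paragraph mentions: fine for a toy tree, infeasible when the number of outcomes explodes.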
So why ever use them if they can miss the best answer? Well, for a scenario like the one above, we could have easily written a DFS or BFS algorithm to find the optimal solution. That's because the problem is simple enough for a computer to solve that way. In fact, we could even brute force it and have the computer calculate every possible outcome. But what happens when we have impossibly large numbers of outcomes to go through? Sometimes even modern-day computing can't go through every single scenario to get the optimal answer.

This is where greedy algorithms are used. If you can't get the exact right answer with an exhaustive algorithm, it's best to use a greedy algorithm and get something that's relatively close, even if it isn't guaranteed to be optimal.

One of the most famous examples of this is called the travelling salesman problem. This problem provides a list of cities and the distances between each pair of cities, and asks for the shortest possible route that visits each city exactly once and returns to the origin city. The formula for the total number of possible routes is (n - 1)!/2. If our salesman wanted to visit 5 cities, that gives us 12 possible routes. If our salesman wanted to visit twice that many cities, 10 cities, that gives us 181,440 possible routes. Just doubling our city count makes the route possibilities explode, because factorials grow even faster than exponentials. But still, 181,440 routes can be checked by a computer in milliseconds. Now, let's consider that the travelling salesman wants to visit the capital city of every US state. As there are 50 US states, that gives us 50 total cities.

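Working those route counts out directly is a one-liner (a quick sketch of the (n - 1)!/2 formula above):

```python
from math import factorial

def tsp_route_count(n):
    """Distinct tours through n cities: (n - 1)! / 2, since a tour read
    backwards and a tour with a rotated starting point are the same route."""
    return factorial(n - 1) // 2

print(tsp_route_count(5))   # 12
print(tsp_route_count(10))  # 181440
```

Calling `tsp_route_count(50)` reproduces the 63-digit number quoted below.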
Take a guess as to what the possible route count is. A few million? Maybe even into the billions or trillions? Well, if you guessed anywhere near there, you're not even close, because the total number of routes is 49!/2, roughly 3.04 x 10^62 (written out: 304140932017133780436126081660647688443776415689605120000000000).

Yeah. And while a supercomputer might be able to calculate that one day, that's only 50 cities. Just for fun, let's say it was 1,000 cities. Ready for the number? It's 999!/2, a number over 2,500 digits long. As you can see, we definitely won't be finding
the exact optimal route by going through every possible route. Here's where our friend the greedy algorithm comes in. Let's choose an arbitrary starting city and write a simple algorithm that always travels to the nearest city we haven't visited yet. We can continue this process until every city has been visited, and then return to our starting city. This is definitely not guaranteed to give the shortest route, but as we've seen, we can't calculate the optimal solution, and we'll still get an answer that is far better than choosing cities at random.

So, to summarize all of that: greedy algorithms are used when finding the optimal solution might not be feasible, so we settle for an answer that's reasonably close to optimal.

Those are the 7 most important algorithms

for interviews. This is my longest video yet, and it took the most time to make, so if you could smash the like button and share this video I'd really appreciate it.

I put a ton of effort into these videos, and I'd love it if as many people as possible could see them and learn from them. Thanks so much for watching, and I'll catch you in the next video!
