Registers and RAM: Crash Course Computer Science #6

CrashCourse
29 Mar 2017 · 12:16

Summary

TL;DR: In this episode of Crash Course Computer Science, Carrie Anne explores computer memory, starting with a simple circuit that stores a single bit of information. She explains the function of RAM and its role in retaining game state while powered on. The video builds an AND-OR Latch, improves it into a Gated Latch, and scales up to a full memory module, moving from individual latches to a matrix arrangement for efficient storage - groundwork for building a CPU with integrated memory in future episodes.

Takeaways

  • đŸ’Ÿ The script introduces the concept of computer memory, emphasizing its importance for storing data and enabling sequential operations.
  • 🔌 It explains the function of Random Access Memory (RAM), which is volatile and requires constant power to retain data, contrasting it with persistent memory that can retain data without power.
  • đŸ› ïž The episode demonstrates building a basic memory circuit that can store a single bit of information, which is a fundamental unit of data in computing.
  • 🔄 The script illustrates how a simple OR gate can be used to create a memory circuit that 'remembers' a value, but points out its limitation of being unable to reset.
  • 🔗 It introduces the AND-OR Latch, a combination of circuits that can record both 0s and 1s, and which forms the basis for more complex memory storage.
  • 🔑 The concept of a 'latch' is explained as a device that 'latches onto' a value and retains it, which is crucial for memory functions.
  • đŸ”© The script describes the Gated Latch, an improved memory circuit that allows for controlled writing and reading of data using a single data wire and an enable line.
  • 📈 It discusses the scaling up of memory storage by combining multiple latches into a register, which can store larger amounts of data, such as 8-bit numbers.
  • đŸ—ïž The episode explains the use of a matrix arrangement for memory storage, which significantly reduces the number of wires needed compared to linear arrangements.
  • 🔱 The script introduces the idea of addresses for memory locations, necessary for identifying and accessing specific bits of data within a memory matrix.
  • 🔄 It outlines the process of reading and writing data to memory, involving enabling the appropriate latches and using the data, write enable, and read enable lines.

Q & A

  • What is the primary function of an ALU in a computer?

    -An ALU, or Arithmetic Logic Unit, performs arithmetic and logic operations, which are fundamental to computer processing.

  • Why is computer memory important in storing data?

    -Computer memory is crucial for storing data temporarily, allowing for the execution of multiple operations in sequence without losing the results.

  • What is the difference between RAM and persistent memory?

    -RAM, or Random Access Memory, requires power to maintain stored data, whereas persistent memory can retain data even when power is off.

  • How does a simple memory circuit using an OR gate work?

    -A simple memory circuit with an OR gate can store a '1' by looping the output back to one of its inputs, creating a feedback loop that maintains the state.

  • What is the purpose of a 'set' and 'reset' input in an AND-OR Latch?

    -The 'set' input sets the output to '1', and the 'reset' input resets the output to '0', allowing the latch to store and maintain a single bit of information.

  • What is a Gated Latch and how does it improve memory storage?

    -A Gated Latch is a memory storage circuit that uses a single data input wire and a write enable line to control when data can be written to memory, simplifying the input process.

  • How does a register differ from a single latch?

    -A register is a group of latches that work together to store multiple bits of information, such as an 8-bit number, making it a more complex memory unit.

  • What is the advantage of using a matrix configuration for memory storage?

    -A matrix configuration allows for a more efficient use of wiring by enabling the selection of individual latches with fewer wires, reducing the overall complexity and cost.

  • How does a multiplexer contribute to memory addressing in a computer?

    -A multiplexer selects the appropriate row or column in a memory matrix based on the address provided, enabling precise access to specific memory locations.

  • What is the significance of the term 'Random Access Memory' or RAM?

    -RAM refers to memory that can be accessed randomly, meaning any memory location can be read or written to in any order, without needing to access preceding locations first.

  • How does the size of memory addresses relate to the amount of memory a computer can have?

    -The size of memory addresses determines the number of memory locations that can be accessed. Larger addresses allow for more memory locations, scaling up to gigabytes or more.

Outlines

00:00

đŸ’Ÿ Introduction to Computer Memory

Carrie Anne introduces the concept of computer memory, emphasizing its importance for storing data, such as game states, while power is on. She differentiates between volatile RAM, which requires power to maintain data, and persistent memory, which does not. The episode's focus is on building a circuit to store a single bit of information, which will eventually be scaled up to create a memory module and combined with an ALU to form a CPU. The discussion includes the creation of a latch circuit using logic gates to 'remember' a bit, and the evolution from simple latches to gated latches, which allow for writing and reading operations controlled by a write enable line.

05:02

🔗 Building a Memory Matrix and Registers

This section delves into the process of expanding memory capacity by arranging latches in a matrix to form a grid, which significantly reduces the number of wires needed compared to a linear arrangement. The concept of a memory address is introduced, where each latch in the matrix has a unique address determined by its row and column. The use of multiplexers to select rows and columns based on address inputs is explained. The idea of a register, which is a group of latches that can store an 8-bit number, is also discussed. The summary highlights the transition from individual latches to a more complex memory system, including the practical considerations of wiring and the abstraction of these systems into higher-level components.

10:03

📚 Exploring Types of RAM and Memory Scaling

The final paragraph explores the physical manifestation of RAM, starting with a look at a vintage RAM stick from the 1980s and breaking down its memory capacity. It contrasts Static RAM (SRAM), which was discussed in the episode, with other types of RAM like DRAM, Flash memory, and NVRAM, each using different technologies to store bits. The paragraph emphasizes the vast scaling of memory from early computers to modern ones, highlighting the transition from kilobytes to gigabytes and the corresponding increase in address size. It concludes with a metaphor comparing the layers of memory abstraction to Russian dolls, suggesting the complexity hidden within seemingly simple operations.

Keywords

💡ALU

ALU stands for Arithmetic Logic Unit, which is a fundamental component of a computer's central processing unit (CPU). It performs all the arithmetic and logical operations of the system, such as addition, subtraction, and bitwise operations. In the context of the video, the ALU is built using logic gates and is essential for carrying out calculations. The script mentions building a simple ALU as a precursor to understanding memory storage, highlighting its foundational role in computer operations.

💡RAM

Random Access Memory (RAM) is a type of computer memory that can be read and changed in any order, typically used to store data that is being actively used by the system. The video script discusses RAM's role in storing game states and other temporary data, emphasizing its volatile nature, meaning it requires a constant power supply to retain information. The script also differentiates RAM from persistent memory, which can retain data without power.

💡Latch

A latch is a type of memory storage that can 'latch onto' a single bit of information and hold it indefinitely, unless explicitly reset. In the video, latches are used to build the most basic form of memory, capable of storing a single bit. The script explains how an AND-OR latch functions, using a 'set' input to store a '1' and a 'reset' input to store a '0', and how it can 'remember' this value, which is crucial for memory functionality.

💡Gated Latch

A Gated Latch is an enhancement of the basic latch that allows data to be written to it only when a 'gate' is open. This gate is controlled by a write enable signal. The video script describes how a Gated Latch works, using a single wire for data input and a write enable wire to control when data can be written, making it more versatile and controlled than a basic latch. This concept is essential for understanding how memory can be selectively updated.

💡Register

A register is a small and fast storage location within a CPU that can hold a limited amount of data. It is composed of multiple latches and is used to store and manipulate data for immediate use. In the script, registers are described as groups of latches that can hold an 8-bit number, with the number of bits determining the register's width. Registers are fundamental for the CPU's operation, as they allow it to perform operations on data quickly.

💡Matrix

In the context of the video, a matrix refers to an arrangement of memory latches in a grid pattern, allowing for a more efficient use of wiring and enabling the storage of larger amounts of data. The script explains how a 16x16 grid of latches can be used to store 256 bits of information, reducing the number of wires needed compared to a linear arrangement. This concept is crucial for scaling up memory storage in computers.

💡Address

An address in computer memory refers to the unique identifier for a specific location in memory where data can be stored or retrieved. The video script uses the analogy of a city address to explain how each latch in a memory matrix can be uniquely identified by a combination of row and column addresses. This concept is essential for understanding how data is accessed in memory and how memory can be organized and scaled.

💡Multiplexer

A multiplexer is a device that selects one of many input signals and forwards the selected input to a single output line. In the video, multiplexers are used to translate addresses into signals that select the appropriate row or column in a memory matrix. The script describes how a 1-to-16 multiplexer can select one of 16 possible outputs based on a 4-bit input, which is crucial for managing the complexity of memory selection in larger memory arrays.

💡DRAM

DRAM, or Dynamic Random-Access Memory, is a type of volatile memory that stores each bit of data in a separate capacitor within an integrated circuit. The video script contrasts DRAM with SRAM, noting that while they serve similar functions, DRAM uses capacitors to store bits, which requires periodic refreshing to maintain the stored data. This distinction is important for understanding the different technologies used in memory storage.

💡Abstraction

Abstraction in computer science refers to the practice of hiding the complex reality of a system's inner workings behind a simpler interface. The video script discusses how, as the complexity of memory circuits increases, it becomes necessary to abstract these details into simpler components. This allows for easier design, understanding, and use of memory systems without needing to understand every individual component, which is a key concept in managing complexity in computing.

Highlights

Introduction to the concept of computer memory as a means to store calculation results.

Explanation of the difference between volatile RAM and persistent memory.

Building a basic circuit to store a single bit of information.

Demonstration of a feedback loop in an OR gate to create a memory element.

The limitation of the OR gate memory circuit in reverting from storing a '1' back to '0'.

Utilization of an AND gate to create a circuit that can record a '0'.

Combining OR and AND gate circuits to form an AND-OR Latch for memory storage.

Introduction of the Gated Latch as an improvement over the basic latch.

Concept of write enable line to control the writing operation in a Gated Latch.

Building a register by combining multiple latches to store an 8-bit number.

Scaling up memory storage by arranging latches in a matrix to form a larger memory module.

Use of row and column wires with AND gates to select and enable specific latches in a matrix.

Introduction of the concept of addresses for accessing specific memory locations.

Explanation of multiplexers used to select rows or columns in a memory matrix.

Building a larger memory component by combining multiple 256-bit memory modules.

Scaling memory to larger sizes like megabytes and gigabytes by increasing address sizes.

Comparison of different types of RAM technologies like SRAM, DRAM, Flash memory, and NVRAM.

Overview of how memory is organized in a physical RAM module.

Final thoughts on the simplicity of fundamental memory operations and the complexity of abstraction layers in computing.

Transcript

00:03

Hi, I'm Carrie Anne and welcome to Crash Course Computer Science. So last episode, using just logic gates, we built a simple ALU, which performs arithmetic and logic operations, hence the 'A' and the 'L'. But of course, there's not much point in calculating a result only to throw it away - it would be useful to store that value somehow, and maybe even run several operations in a row. That's where computer memory comes in!

00:24

If you've ever been in the middle of a long RPG campaign on your console, or slogging through a difficult level on Minesweeper on your desktop, and your dog came by, tripped and pulled the power cord out of the wall, you know the agony of losing all your progress. Condolences. But the reason for your loss is that your console, your laptop and your computers make use of Random Access Memory, or RAM, which stores things like game state - as long as the power stays on. Another type of memory, called persistent memory, can survive without power, and it's used for different things; we'll talk about the persistence of memory in a later episode.

00:55

Today, we're going to start small - literally - by building a circuit that can store one.. single.. bit of information. After that, we'll scale up, and build our very own memory module, and we'll combine it with our ALU next time, when we finally build our very own CPU!

01:10

[Intro]

01:19

All of the logic circuits we've discussed so far go in one direction - always flowing forward - like our 8-bit ripple adder from last episode. But we can also create circuits that loop back on themselves. Let's try taking an ordinary OR gate, feed the output back into one of its inputs, and see what happens.

01:35

First, let's set both inputs to 0. 0 OR 0 is 0, so this circuit always outputs 0. If we were to flip input A to 1: 1 OR 0 is 1, so now the output of the OR gate is 1. A fraction of a second later, that loops back around into input B, so the OR gate sees that both of its inputs are now 1. 1 OR 1 is still 1, so there is no change in output. If we flip input A back to 0, the OR gate still outputs 1. So now we've got a circuit that records a "1" for us. Except, we've got a teensy tiny problem - this change is permanent! No matter how hard we try, there's no way to get this circuit to flip back from a 1 to a 0.

02:13

Now let's look at this same circuit, but with an AND gate instead. We'll start inputs A and B both at 1. 1 AND 1 outputs 1 forever. But if we then flip input A to 0, because it's an AND gate, the output will go to 0. So this circuit records a 0, the opposite of our other circuit. Like before, no matter what input we apply to input A afterwards, the circuit will always output 0.

02:34

Now we've got circuits that can record both 0s and 1s. The key to making this a useful piece of memory is to combine our two circuits into what is called the AND-OR Latch. It has two inputs: a "set" input, which sets the output to a 1, and a "reset" input, which resets the output to a 0. If set and reset are both 0, the circuit just outputs whatever was last put in it. In other words, it remembers a single bit of information! Memory!

02:59

This is called a "latch" because it "latches onto" a particular value and stays that way. The action of putting data into memory is called writing, whereas getting the data out is called reading.
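The latch behavior described above can be sketched in a few lines of Python. This is a toy model for illustration - it tracks the stored bit as state rather than simulating the individual gates and their feedback wiring:

```python
class AndOrLatch:
    """Toy AND-OR latch: 'set' drives the output to 1, 'reset' to 0.

    With both inputs at 0, the stored bit is simply held. (In this
    model, as in the gate circuit, reset wins if both are 1.)
    """
    def __init__(self):
        self.output = 0  # the remembered bit

    def update(self, set_bit, reset_bit):
        # OR side: set=1 forces the output to 1, and feedback keeps it there.
        # AND side: reset=1 forces the output back to 0.
        self.output = int((self.output or set_bit) and not reset_bit)
        return self.output
```

Pulsing set stores a 1, pulsing reset stores a 0, and with both inputs low the latch just keeps outputting whatever was last written.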

03:09

Ok, so we've got a way to store a single bit of information! Great! Unfortunately, having two different wires for input - set and reset - is a bit confusing. To make this a little easier to use, we really want a single wire to input data, that we can set to either 0 or 1 to store the value. Additionally, we're going to need a wire that enables the memory to be either available for writing or "locked" down - this is called the write enable line. By adding a few extra logic gates, we can build this circuit, which is called a Gated Latch, since the "gate" can be opened or closed.

03:39

Now this circuit is starting to get a little complicated. We don't want to have to deal with all the individual logic gates... so as before, we're going to bump up a level of abstraction, and put our whole Gated Latch circuit in a box - a box that stores one bit.

03:50

Let's test out our new component! Let's start everything at 0. If we toggle the Data wire from 0 to 1 or 1 to 0, nothing happens - the output stays at 0. That's because the write enable wire is off, which prevents any change to the memory. So we need to "open" the "gate" by turning the write enable wire to 1. Now we can put a 1 on the data line to save the value 1 to our latch. Notice how the output is now 1. Success! We can turn off the enable line and the output stays as 1. Once again, we can toggle the value on the data line all we want, but the output will stay the same. The value is saved in memory. Now let's turn the enable line on again and use our data line to set the latch to 0. Done. Enable line off, and the output is 0. And it works!
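The test sequence just described can be replayed against a toy model of the gated latch - again tracking stored state rather than individual gates:

```python
class GatedLatch:
    """Toy gated latch: the data bit is stored only while write_enable is 1."""
    def __init__(self):
        self.output = 0  # the stored bit

    def update(self, data, write_enable):
        if write_enable:
            self.output = data  # gate open: latch the value on the data wire
        return self.output      # gate closed: data is ignored, value is held
```

This mirrors the video's demo: toggling data with the gate closed does nothing; opening the gate saves the value; closing it again locks the value in.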

04:33

Now, of course, computer memory that only stores one bit of information isn't very useful - definitely not enough to run Frogger. Or anything, really. But we're not limited to using only one latch. If we put 8 latches side-by-side, we can store 8 bits of information, like an 8-bit number. A group of latches operating like this is called a register, which holds a single number, and the number of bits in a register is called its width. Early computers had 8-bit registers, then 16, 32, and today, many computers have registers that are 64 bits wide.

05:03

To write to our register, we first have to enable all of the latches. We can do this with a single wire that connects to all of their enable inputs, which we set to 1. We then send our data in using the 8 data wires, and then set enable back to 0, and the 8-bit value is now saved in memory.
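The register write sequence - raise the shared enable wire, present the bits, lower the enable - can be sketched like this (a toy model where each list entry stands in for one latch):

```python
class Register:
    """An 8-bit register: 8 one-bit latches sharing a single write-enable wire."""
    def __init__(self, width=8):
        self.bits = [0] * width   # each entry stands in for one gated latch
        self.width = width

    def write(self, bits, write_enable):
        if write_enable:          # the shared enable wire opens every latch at once
            self.bits = list(bits)

    def read(self):
        return list(self.bits)
```

With the enable wire at 0, values on the data wires are ignored; with it at 1, all 8 bits are latched in one step.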

05:19

Putting latches side-by-side works ok for a small-ish number of bits. A 64-bit register would need 64 wires running to the data pins, and 64 wires running to the outputs. Luckily we only need 1 wire to enable all the latches, but that's still 129 wires. For 256 bits, we end up with 513 wires! The solution is a matrix! In this matrix, we don't arrange our latches in a row; we put them in a grid. For 256 bits, we need a 16 by 16 grid of latches with 16 rows and 16 columns of wires. To activate any one latch, we must turn on the corresponding row AND column wire.

05:58

Let's zoom in and see how this works. We only want the latch at the intersection of the two active wires to be enabled, but all of the other latches should stay disabled. For this, we can use our trusty AND gate! The AND gate will output a 1 only if the row and the column wires are both 1. So we can use this signal to uniquely select a single latch.

06:15

This row/column setup connects all our latches with a single, shared, write enable wire. In order for a latch to become write enabled, the row wire, the column wire, and the write enable wire must all be 1. That should only ever be true for one single latch at any given time. This means we can use a single, shared wire for data. Because only one latch will ever be write enabled, only one will ever save the data - the rest of the latches will simply ignore values on the data wire because they are not write enabled. We can use the same trick with a read enable wire to read the data later, to get the data out of one specific latch.

06:48

This means in total, for 256 bits of memory, we only need 35 wires - 1 data wire, 1 write enable wire, 1 read enable wire, and 16 row and 16 column wires for the selection. That's significant wire savings!
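The wire counts quoted above (129, 513, 35) can be reproduced with two small formulas; the helper names here are my own, not from the video:

```python
import math

def linear_wires(n_bits):
    """Latches in a row: n data-in wires + n data-out wires + 1 shared enable."""
    return 2 * n_bits + 1

def matrix_wires(n_bits):
    """Latches in a square grid: row wires + column wires,
    plus 1 data, 1 write-enable, and 1 read-enable wire.
    Assumes n_bits is a perfect square (e.g. 256 = 16 x 16)."""
    side = math.isqrt(n_bits)
    return 2 * side + 3
```

For 256 bits the grid needs 35 wires instead of 513 - and the gap only widens as memories grow, since the matrix cost scales with the square root of the bit count.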

07:01

But we need a way to uniquely specify each intersection. We can think of this like a city, where you might want to meet someone at 12th avenue and 8th street - that's an address that defines an intersection. The latch we just saved our one bit into has an address of row 12 and column 8. Since there is a maximum of 16 rows, we store the row address in a 4-bit number: 12 is 1100 in binary. We can do the same for the column address: 8 is 1000 in binary. So the address for the particular latch we just used can be written as 11001000.

07:35

To convert from an address into something that selects the right row or column, we need a special component called a multiplexer - which is the computer component with a pretty cool name, at least compared to the ALU. Multiplexers come in all different sizes, but because we have 16 rows, we need a 1-to-16 multiplexer. It works like this: you feed it a 4-bit number, and it connects the input line to a corresponding output line. So if we pass in 0000, it will select the very first column for us. If we pass in 0001, the next column is selected, and so on. We need one multiplexer to handle our rows and another multiplexer to handle the columns.
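The address-splitting and multiplexer selection described above can be sketched in Python (function names are my own; the 1-to-16 multiplexer is modeled as a list with exactly one active line):

```python
def decode_address(address):
    """Split an 8-bit address string into its 4-bit row and column numbers."""
    row = int(address[:4], 2)   # high 4 bits pick the row
    col = int(address[4:], 2)   # low 4 bits pick the column
    return row, col

def multiplexer_select(select_bits):
    """A 1-to-16 multiplexer: a 4-bit input activates exactly one of 16 lines."""
    lines = [0] * 16
    lines[int(select_bits, 2)] = 1
    return lines
```

Feeding in the video's example address 11001000 yields row 12 and column 8, and each multiplexer drives exactly one of its 16 output lines high.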

08:10

Ok, it's starting to get complicated again, so let's make our 256-bit memory its own component. Once again, a new level of abstraction! It takes an 8-bit address for input - 4 bits for the column and 4 for the row. We also need write and read enable wires. And finally, we need just one data wire, which can be used to read or write data.

08:37

Unfortunately, even 256 bits of memory isn't enough to run much of anything, so we need to scale up even more! We're going to put them in a row, just like with the registers. We'll make a row of 8 of them, so we can store an 8-bit number - also known as a byte. To do this, we feed the exact same address into all 8 of our 256-bit memory components at the same time, and each one saves one bit of the number. That means the component we just made can store 256 bytes at 256 different addresses.

09:07

Again, to keep things simple, we want to leave behind this inner complexity. Instead of thinking of this as a series of individual memory modules and circuits, we'll think of it as a uniform bank of addressable memory. We have 256 addresses, and at each address, we can read or write an 8-bit value. We're going to use this memory component next episode when we build our CPU.
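The 256-byte bank just described - eight 256-bit components fed the same address, each storing one bit of the byte - can be modeled as eight 16x16 bit planes (the class name and structure are illustrative, not from the video):

```python
class ToyRAM:
    """256 bytes of toy memory: eight 16x16 bit planes sharing one address."""
    def __init__(self):
        # planes[i] is a 16x16 grid holding bit i of every byte
        self.planes = [[[0] * 16 for _ in range(16)] for _ in range(8)]

    def write(self, address, value):
        row, col = address >> 4, address & 0xF       # 4 row bits, 4 column bits
        for i, plane in enumerate(self.planes):
            plane[row][col] = (value >> i) & 1       # each plane stores one bit

    def read(self, address):
        row, col = address >> 4, address & 0xF
        return sum(plane[row][col] << i for i, plane in enumerate(self.planes))
```

From the outside it behaves as the episode says: a uniform bank with 256 addresses, each holding an 8-bit value.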

09:28

The way that modern computers scale to megabytes and gigabytes of memory is by doing the same thing we've been doing here - packaging up little bundles of memory into larger, and larger, and larger arrangements. As the number of memory locations grows, our addresses have to grow as well. 8 bits hold enough numbers to provide addresses for 256 bytes of our memory, but that's all. To address a gigabyte - a billion bytes of memory - we need 32-bit addresses.

09:53

An important property of this memory is that we can access any memory location, at any time, and in a random order. For this reason, it's called Random-Access Memory, or RAM.
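The relationship between memory size and address width can be checked with a one-liner. Note that 30 bits already suffice to name a decimal billion locations; practical machines round the address up to a convenient width like 32 bits, which is the figure the episode uses:

```python
def address_bits(n_locations):
    """Smallest address width (in bits) that can name n_locations distinct places."""
    return (n_locations - 1).bit_length()
```

So 256 bytes need 8-bit addresses, a billion bytes need at least 30 bits, and a full 32-bit address reaches 2^32 (about 4 billion) locations.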

10:03

When you hear people talking about how much RAM a computer has - that's the computer's memory. RAM is like a human's short-term or working memory, where you keep track of things going on right now - like whether or not you had lunch or paid your phone bill.

10:14

Here's an actual stick of RAM - with 8 memory modules soldered onto the board. If we carefully opened up one of these modules and zoomed in, the first thing you would see are 32 squares of memory. Zoom into one of those squares, and we can see each one is comprised of 4 smaller blocks. If we zoom in again, we get down to the matrix of individual bits. This is a matrix of 128 by 64 bits - that's 8,192 bits in total. Each of our 32 squares has 4 matrices, so that's 32,768 bits per square. And there are 32 squares in total. So all in all, that's roughly 1 million bits of memory in each chip. Our RAM stick has 8 of these chips, so in total, this RAM can store 8 million bits, otherwise known as 1 megabyte.

10:56

That's not a lot of memory these days - this is a RAM module from the 1980s. Today you can buy RAM that has a gigabyte or more of memory - that's billions of bytes of memory.
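The chip arithmetic above multiplies out exactly; the variable names here are mine, chosen to match the transcript's zoom levels:

```python
bits_per_matrix = 128 * 64        # one grid of individual bits
matrices_per_square = 4           # blocks inside each square of memory
squares_per_chip = 32             # squares visible inside one module
chips_per_stick = 8               # modules soldered onto the board

bits_per_chip = bits_per_matrix * matrices_per_square * squares_per_chip
total_bits = bits_per_chip * chips_per_stick
total_bytes = total_bits // 8     # 8 bits per byte
```

Each chip holds 1,048,576 bits ("roughly 1 million"), and the whole stick holds 8,388,608 bits, i.e. 1,048,576 bytes - the 1 megabyte the video arrives at.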

11:06

So, today we built a piece of SRAM - Static Random-Access Memory - which uses latches. There are other types of RAM, such as DRAM, Flash memory, and NVRAM. These are very similar in function to SRAM, but use different circuits to store the individual bits - for example, using different logic gates, capacitors, charge traps, or memristors. But fundamentally, all of these technologies store bits of information in massively nested matrices of memory cells.

11:31

Like many things in computing, the fundamental operation is relatively simple... it's the layers and layers of abstraction that are mind-blowing - like a Russian doll that keeps getting smaller and smaller and smaller. I'll see you next week.

11:44

Credits

