M1. L3. Measuring Computer Power

Farhad Akbari
30 Apr 2023 · 22:33

Summary

TLDR: This script delves into the fundamental components of computer systems, highlighting the roles of the CPU, input and output devices, and storage. It explains the CPU's internal workings, including the ALU, cache, registers, and buses, and the significance of the clock in timing operations. The Von Neumann architecture and its impact on programmable computers are discussed, along with the instruction sets of CISC and RISC. The binary system's role in digital computing is explored, as well as the evolution of data storage and addressing. The script also touches on the challenges of artificial intelligence, such as emotion recognition and independent decision-making, and the computer's reliance on human input for complex tasks.

Takeaways

  • 🏠 Computers are built with a plan, similar to a house, and consist of basic components like input, storage, output devices, and the CPU.
  • 🔒 The CPU, despite being a small chip, contains vital components such as the ALU, cache, registers, buses, and clock, which are crucial for system control.
  • 📈 The ALU in the CPU performs all arithmetic and logical operations, while the cache is high-speed RAM that stores frequently used data and instructions.
  • 🗂️ Registers within the CPU are high-speed memory sections, including general and special purpose registers, that are close to the processor for quick data access.
  • 🚦 Buses within the CPU act as internal connections, with address buses for memory addresses, data buses for data, and control buses for control signals.
  • ⏱️ The CPU's clock regulates the timing of all processes, with cycles measured in hertz, indicating the number of instructions processed per second.
  • πŸ› οΈ The Von Neumann architecture, proposed by John Von Neumann, is the foundation of modern computing, allowing for storing program instructions alongside data.
  • πŸ“š Special registers in the Von Neumann architecture, like the PC, CIR, MAR, MDR, and ACC, play specific roles in instruction fetching, decoding, and execution.
  • πŸ› οΈ Instruction sets define the commands a processor can execute, with CISC and RISC being two categories, where CISC focuses on complex instructions and RISC on simpler, more efficient ones.
  • πŸ”’ Digital computing is based on binary, with bits representing on/off states, forming the basis of all modern computing, including images, music, and movements.
  • πŸ“Š A byte, traditionally 8 bits, is the standard unit for data representation, with larger units like kilobytes, megabytes, and gigabytes used for larger data sets.

Q & A

  • What are the basic components of a computer system?

    -The basic components of a computer system include input devices, storage devices, output devices, and the CPU.

  • What is the role of the CPU in a computer system?

    -The CPU, or Central Processing Unit, is a small chip that performs most of the processing in a computer. It contains important components like the control unit, arithmetic logic unit, registers, cache, buses, and clock, which together provide system control.

  • What is the function of the Arithmetic Logic Unit (ALU) in the CPU?

    -The ALU performs all the mathematical and logical operations within the CPU, including decision-making tasks.

  • What is cache and why is it important in a CPU?

    -Cache is a small, high-speed RAM built directly into the processor. It holds data and instructions that the processor is likely to use, reducing the time needed to access frequently used data.

  • What are registers in the context of the CPU, and what is their purpose?

    -Registers are small sections of high-speed memory found inside the CPU. They store data and instructions close to the processor, facilitating faster access and processing.

  • How do buses function within a CPU?

    -Buses are internal high-speed connections that transfer data, memory addresses, and control signals between the CPU and other components, similar to roads for cars.

  • What is the significance of the clock in a CPU and how is it measured?

    -The clock in a CPU keeps time for all processes, ensuring they are precisely timed. It is measured in hertz, which are cycles per second, and indicates how many processing cycles the CPU can complete in a second.
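The arithmetic behind that claim is easy to check. A minimal Python sketch (the 2.5 GHz figure comes from the script; the variable names are ours):

```python
# What "2.5 GHz" means in time per cycle (illustrative arithmetic only).
clock_hz = 2.5e9              # 2.5 billion cycles per second

seconds_per_cycle = 1 / clock_hz
print(seconds_per_cycle)      # 4e-10 seconds, i.e. 0.4 nanoseconds per cycle
```

In other words, each cycle of a 2.5 gigahertz processor lasts less than half a nanosecond.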

  • What is the Von Neumann architecture and its significance in computing?

    -The Von Neumann architecture is a design where program instructions and data are stored together in memory. It allows computers to be programmable and to perform a wide range of tasks, limited only by the programmer's imagination.

  • What are the five special registers in the Von Neumann architecture?

    -The five special registers are the Program Counter (PC), Current Instruction Register (CIR), Memory Address Register (MAR), Memory Data Register (MDR), and the Accumulator (ACC).

  • What is the difference between CISC and RISC in terms of instruction sets?

    -CISC (Complex Instruction Set Computer) aims to complete tasks with fewer lines of assembly code, often using microcode for complex instructions. RISC (Reduced Instruction Set Computer) uses simpler instructions that can be executed in one clock cycle, requiring more lines of code but often resulting in simpler and more efficient processor design.
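The trade-off can be sketched as a toy contrast in Python (illustrative only: `cisc_mult` and `risc_mult` are hypothetical names, and real instruction sets operate on registers and opcodes, not Python functions):

```python
# Toy CISC vs RISC contrast: one complex "instruction" backed by several
# internal (microcoded) steps, versus simple steps spelled out one by one.

def cisc_mult(memory, a_addr, b_addr, out_addr):
    # CISC-style: a single complex instruction does load, multiply, store.
    memory[out_addr] = memory[a_addr] * memory[b_addr]

def risc_mult(memory, a_addr, b_addr, out_addr):
    # RISC-style: each simple step is explicit (loads, repeated add, store).
    reg_a = memory[a_addr]          # LOAD
    reg_b = memory[b_addr]          # LOAD
    acc = 0
    for _ in range(reg_b):          # ADD, executed reg_b times
        acc += reg_a
    memory[out_addr] = acc          # STORE

mem = {0: 6, 1: 7, 2: 0}
risc_mult(mem, 0, 1, 2)
print(mem[2])                       # 42
```

Both produce the same result; the RISC version simply takes more, simpler steps, which is what makes uniform timing and pipelining possible.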

  • How does the binary system form the basis of all modern computing?

    -Modern computing is based on the binary system, which uses only two states, represented by 0 and 1. These states are used to represent all data in a computer, from images and music to text and commands.
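The same point can be seen with Python's standard `format` and `ord` functions (nothing here is specific to the video):

```python
# Small values and their binary forms: everything reduces to ones and zeros.
for n in [0, 1, 2, 3, 4, 7, 8]:
    print(n, format(n, "b"))

# A character is stored as a number too: 'A' is 65, or 01000001 in 8 bits.
print(ord("A"), format(ord("A"), "08b"))   # 65 01000001
```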

  • What is a byte and how did it become the standard unit of data representation?

    -A byte is a standard unit of data representation consisting of eight bits. It became the standard as computers started using eight bits to represent characters, and the spelling 'byte' is a deliberate respelling of 'bite' to avoid accidental confusion with 'bit'.
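The power-of-two units the script mentions work out as follows (a minimal sketch; the constant names are ours, and the script uses "kilobyte" for the 1024-byte unit):

```python
# Binary-based units step up in powers of two, unlike metric units.
BYTE = 8                    # bits
KILOBYTE = 2 ** 10          # 1024 bytes
MEGABYTE = 2 ** 20          # 1024 kilobytes
GIGABYTE = 2 ** 30          # 1024 megabytes

print(KILOBYTE, MEGABYTE, GIGABYTE)   # 1024 1048576 1073741824
```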

  • Why are powers of two important in memory addressing and storage devices?

    -Powers of two are important in memory addressing and storage devices because every time the number of address bits is increased, the number of possible storage locations doubles. This ensures efficient use of memory and avoids issues with non-addressable locations.
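The doubling rule is plain arithmetic; the 4- and 5-bit figures below echo the script's example of 16 and 32 locations:

```python
# Each extra address bit doubles the number of addressable locations.
for bits in [4, 5, 6]:
    print(bits, "address bits ->", 2 ** bits, "locations")
# 4 address bits -> 16 locations
# 5 address bits -> 32 locations
# 6 address bits -> 64 locations
```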

  • What is the challenge for computers in terms of emotion and decision-making?

    -Computers struggle with showing emotion and making independent decisions because they rely on binary logic and are programmed based on specific instructions. They lack the intuitive understanding and emotional context that humans possess.

  • What is the fetch-decode-execute cycle in computing?

    -The fetch-decode-execute cycle is the process where instructions are fetched from RAM, decoded by the CPU to understand them, and then executed. This cycle is fundamental to how computers process instructions.
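The cycle can be sketched as a tiny simulation (a toy machine of our own invention, not any real instruction set; `pc` and `acc` mirror the program counter and accumulator registers named elsewhere in the script):

```python
# Minimal fetch-decode-execute loop. Program and data share one memory,
# as in the Von Neumann design.

memory = [
    ("LOAD", 6),    # ACC <- memory[6]
    ("ADD", 7),     # ACC <- ACC + memory[7]
    ("STORE", 8),   # memory[8] <- ACC
    ("HALT", None),
    None, None,
    5,              # address 6: data
    3,              # address 7: data
    0,              # address 8: result goes here
]

pc, acc = 0, 0
while True:
    opcode, operand = memory[pc]      # fetch (and decode) the instruction
    pc += 1                           # advance the program counter
    if opcode == "LOAD":              # execute
        acc = memory[operand]
    elif opcode == "ADD":
        acc += memory[operand]
    elif opcode == "STORE":
        memory[operand] = acc
    elif opcode == "HALT":
        break

print(memory[8])                      # 8
```

Each pass through the loop is one fetch-decode-execute cycle; a real CPU's clock times billions of these per second.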

  • How is a computer's speed measured and what factors influence it?

    -A computer's speed is measured by the number of cycles it can complete in a second. Factors that influence this include clock speed, cache size, and the number of cores in the CPU.

Outlines

00:00

🖥️ Basic Components and Architecture of Computer Systems

This paragraph introduces the fundamental components of computer systems, such as input and output devices, storage devices, and the Central Processing Unit (CPU). It explains the role of the CPU, which is often a small chip inside the computer case, and its internal components including the Arithmetic Logic Unit (ALU), cache, registers, buses, and clock. The Von Neumann architecture is highlighted, which allows for the storage of program instructions alongside data, enabling computers to perform a wide range of tasks. The paragraph also touches on the concept of hertz as a measure of CPU speed, with a 2.5 gigahertz processor being able to complete 2.5 billion processing cycles per second.

05:01

πŸ” Instruction Sets and the Evolution of CISC and RISC

The second paragraph delves into the concept of instruction sets, which are the commands that processors understand and execute. It discusses the evolution of two types of instruction sets: Complex Instruction Set Computer (CISC) and Reduced Instruction Set Computer (RISC). CISC aims to complete tasks with fewer lines of code using microcode, which can be updated and allows for complex operations to be performed in multiple cycles. RISC, on the other hand, focuses on simple instructions that can be executed in a single clock cycle, requiring more lines of code but benefiting from simpler and standardized instructions, more registers, and the ability to use pipelining for efficiency. The paragraph also compares CISC and RISC to an adult and a child's ability to follow instructions, highlighting the trade-offs between the two approaches.

10:01

πŸ› οΈ Digital Computing and the Binary System

This paragraph explores the basics of digital computing, which relies on binary digits (bits) that represent on and off states, forming the backbone of all modern computing. It explains the binary system and how bits are used to represent data, leading to the standardization of the byte as an eight-bit unit. The paragraph also discusses the historical development of memory addressing and the importance of powers of two in determining the size and addressing of memory. It touches on the challenges of non-power-of-two storage devices like DVDs and the complexity of their controllers, emphasizing the de facto standard of storage in powers of two.

15:02

🤖 The Limitations and Advancements in Artificial Intelligence

The fourth paragraph addresses the challenges and advancements in artificial intelligence, particularly in the areas of emotion recognition and creative generation. It notes the difficulty computers face in interpreting human expressions and making decisions based on emotion, contrasting this with human intuition. The paragraph also mentions the progress in AI, such as Google's inceptionism project, where AI creates images from white noise, and the development of self-learning systems like AlphaGo by DeepMind. It highlights the ongoing efforts to improve computers' ability to perform tasks that typically require human-like thought and decision-making.

20:02

🔄 The Fetch-Decode-Execute Cycle and Factors Affecting Computer Speed

The final paragraph explains the fetch-decode-execute cycle, which is the process by which computers execute instructions. It describes how instructions are fetched from RAM, decoded by the CPU, and then executed. The paragraph also discusses the factors that influence a computer's speed, including clock speed, cache size, and the number of cores. It emphasizes the importance of the clock in timing the cycles and how modern computers can complete billions of cycles per second. The benefits of larger caches and multiple cores in increasing processing power are also highlighted.


Keywords

💡Computer Systems

Computer systems are the core subject of the video, representing the structured assembly of components that work together to perform tasks. Defined as having basic components like input devices, storage devices, output devices, and the CPU, they form the foundation of digital computing. The script discusses how these systems are built following a plan, similar to a house, and highlights the importance of each component in the overall functionality of a computer.

💡CPU (Central Processing Unit)

The CPU is the brain of the computer, a small chip that performs the primary calculations and operations. The script clarifies that it is not the large box on the desk but a component within it. It contains crucial parts such as the control unit, arithmetic logic unit, registers, cache, buses, and clock, which work in unison to control the system's operations and process data.

💡ALU (Arithmetic Logic Unit)

The ALU is a key part of the CPU, responsible for all mathematical and logical operations. It is the component that performs the decision-making processes within the CPU. The script uses the ALU as an example to explain how the CPU processes data and instructions.

💡Cache

Cache is a small, high-speed RAM built into the processor to hold data and instructions that the processor is likely to use frequently. It is depicted as a way to speed up processing by reducing the distance data must travel, akin to keeping a pen on the desk for easy access while working. The script mentions cache size as an example of processor specifications.

💡Registers

Registers are small sections of high-speed memory inside the CPU, serving as temporary storage for data that the processor is actively using. The script differentiates between general-purpose and special-purpose registers, emphasizing their proximity to the processor and their role in efficient data handling.

💡Buses

Buses are internal connections within the CPU that facilitate the transfer of data, memory addresses, and control signals between components. The script likens them to roads, transporting 'cars' (data) to and from their destinations, illustrating their importance in the system's communication infrastructure.

💡Clock

The clock in a computer system is responsible for timing the execution of processes, operating at billions of pulses per second. The script explains that it is measured in hertz, or cycles per second, with each cycle representing the processing of a single instruction, highlighting the clock's role in determining computer speed.

💡Von Neumann Architecture

This architecture is the foundational concept for modern computers, where program instructions and data are stored together in memory. The script describes how this design, proposed by John Von Neumann, revolutionized computing by enabling computers to be programmable and versatile, contrasting it with the hardwired special-purpose computers of the past.

💡Instruction Set

An instruction set is the collection of all possible commands that a processor can execute. The script explains that these commands define the capabilities of the processor and how it interacts with the program, allowing it to perform various operations as directed by the programmer.

💡CISC and RISC

CISC (Complex Instruction Set Computer) and RISC (Reduced Instruction Set Computer) are two categories of instruction sets. CISC aims to complete tasks with fewer lines of code by using microcode, while RISC focuses on simple, one-cycle instructions. The script uses the analogy of an adult and a child to illustrate the difference in complexity and efficiency between the two architectures.

💡Binary System

The binary system is the basis of digital computing, using only two states, on (1) and off (0), to represent all data. The script explains how this system forms the fundamental building blocks of all modern computing, from images and music to robot movements, and how it is used to represent information in the simplest form of bits.

Highlights

Computer systems are built based on a plan with basic components including input devices, storage devices, output devices, and the CPU.

Input devices feed raw data into the computer, while output devices display the results of processing.

The CPU, a small chip inside the computer case, contains critical components like the ALU, cache, registers, buses, and clock.

The ALU performs all mathematical and logical operations within the CPU.

Cache is high-speed RAM used to hold frequently used data and instructions for quick access.

Registers are high-speed memory sections within the CPU, including general and special purpose registers.

Buses provide high-speed internal connections between CPU and other components, including address, data, and control buses.

The clock in a CPU times all processes and operates at billions of pulses per second, measured in hertz.

Von Neumann architecture, proposed in 1945, stores program instructions alongside data, enabling versatile computing.

Special registers in Von Neumann architecture include the program counter, current instruction register, memory address register, memory data register, and accumulator.

Instruction sets define the commands a processor can execute, evolving into CISC and RISC categories.

CISC aims to complete tasks with fewer lines of assembly code, using microcode for complex operations.

RISC focuses on simple instructions that can be executed in one clock cycle, requiring more lines of code but simplifying hardware.

Digital computing is based on binary system switches, represented by bits, forming the foundation of all modern computing.

A byte, consisting of eight bits, is the standard unit of data in computing, evolving from earlier variations.

Larger data units like kilobytes, megabytes, and gigabytes are powers of two, differing from decimal-based units.

Memory addressing in binary allows for the doubling of storage locations with each additional address bit.

Exceptional cases like DVDs demonstrate that non-power-of-two data storage is possible with complex controller decoding.

Computing faces challenges in areas such as emotion recognition and independent decision-making compared to human capabilities.

Advancements in artificial intelligence, such as Google's inceptionism, allow computers to create new content from scratch.

Computers rely on input and lack independent thought, contrasting with the natural decision-making process of humans.

The fetch-decode-execute cycle is the fundamental process of a computer's operation, timed by the CPU clock.

Modern computer speed is determined by factors like clock speed, cache size, and the number of cores.

Transcripts

00:00

Moving right into computing systems. Just like a house, a computer is built using a plan that shows where everything is placed; for a computer, there are variations in which computer systems can be built.

Computer systems have a number of basic components, such as input devices, storage devices, output devices and the CPU. These are found in pretty much every computer under the sun. As we learned in lesson one, input devices are the ones that feed raw data into the computer, and output devices are the ones that show us the results of processing. Storage devices keep data. And the CPU? The CPU is not actually that big box that sits on your desk; it is a small chip that is resident inside that box.

01:00

The CPU itself has some pretty important components inside it, namely the control unit, arithmetic logic unit, registers, cache, buses and clock. These components provide a spectacular level of system control. Let's look at each of these components one by one.

The arithmetic and logic unit, or ALU, is exactly that: it performs all the math and logic (so, decision) operations in the CPU.

01:29

The cache is a small, high-speed RAM built directly into the processor. It is used to hold data and instructions that the processor is likely to use; you may have seen this written in the processor specifications as "three megabyte cache". Much like in the real world, the further away a location is, the longer it takes to get to it. Take for example: you keep your pen on your desk while you're working, because you're likely going to be jotting down sticky notes throughout the day. When you're going for lunch and no longer need it, you will put your pen in your bag; a computer will send the data that it is not using to main memory (RAM) in much the same way.

02:19

A register is a small section of high-speed memory that is found inside the CPU; this is even closer to the processor. There are general purpose registers and special purpose registers, and we're going to look at the latter shortly.

02:36

Buses are an internal high-speed connection. Much like roads take cars to and from destinations, buses run from the CPU to other components. Address buses carry memory addresses from the processor to memory, input, storage or output devices; data buses carry data, while control buses carry control signals.

03:04

Now, the clock does exactly what it sounds like: it keeps time. All the processes in the CPU have to be timed precisely, to the tune of billions of pulses per second. This is not measured in seconds like the clocks that we're used to; rather, it is measured in hertz, which are cycles per second. A cycle is when a single instruction has been processed. If you've ever heard someone referring to a processor as a 2.5 gigahertz processor, that means that the processor does 2.5 billion processing cycles per second. Mind-blowing, isn't it?

03:48

Computers are built in pretty much the same way nowadays, following the diagram we looked at a few seconds ago. This is what is known as the Von Neumann architecture. This architecture is built on the premise of storing program instructions in memory along with the data that those instructions operate on. This means that each computer can be built to perform tasks that are limited only by the programmer's imagination.

04:20

This design was proposed in 1945 by John Von Neumann to improve on the special purpose computers of the day. In those days you had to hardwire a computer for a specific purpose. This was a tedious, error-prone and time-consuming task; making a mistake when you have thousands of wires to connect is no fun at all. The programmers spent weeks underneath bunches of wire trying to get things right for the particular task, and this had to be done for each particular task. Imagine having to open up your computer to set it up for Excel, then, when you're done, open it up again to set it up for Word.

05:08

The Von Neumann architecture uses five special registers. The program counter (PC) holds the address of the next instruction to be fetched. The current instruction register (CIR) holds the instruction that is currently being decoded and executed. The memory address register (MAR) holds the address of the current instruction that is to be fetched from memory, or the memory address to which data is to be transferred. The memory data register (MDR) holds the contents found at the address held in the MAR, or data which is to be transferred to main memory. And then the accumulator (ACC) holds the data being processed and the results of processing. With the coming of the Von Neumann architecture came programmable computers.

play06:08

a computer is able to process data

play06:11

because it has what is called an

play06:13

instruction set

play06:14

an instruction set is the set of all

play06:17

possible commands that can be issued to

play06:20

the processor

play06:21

these instructions enhance the

play06:23

capabilities of the processor in defined

play06:26

contexts instruction sets in a less

play06:30

geeky language are commands that allow a

play06:32

program to tell the processor to switch

play06:34

relevant transistors on or off in order

play06:37

to perform an operation

play06:40

these evolved and ended up as two

play06:43

categories

play06:44

cisc and risc

06:48

Now, CISC stands for complex instruction set computer, and RISC stands for reduced instruction set computer. The complex instruction set computer has a primary goal of completing a task in as few lines of assembly code as possible. A CISC computer typically has microcode that allows it to do this. This microcode is a bunch of low-level instructions stored in fast memory, and it is also updatable. This means that the instructions issued to the processor are shorter, but the processor still knows what to do. For example, for an addition operation, the computer uses the microcode to determine that it needs to move the contents of one register, put them into another register, and store the result. This cannot be executed in one cycle because it involves several steps, so it is spread out over a number of cycles. This has the overall effect of improving system performance, but the hardware itself is very complex.

08:03

The reduced instruction set computer aims at simple instructions that can be completed in one clock cycle. Now the programmer needs to code each individual step in order to perform the same addition operation that we just looked at. Sounds inefficient, right? Why would you do all that when you can just use CISC? After all, more lines of code mean more memory, right? Well, RISC systems need fewer transistors to carry out the same task, which leaves more room to add registers. The instructions are executed in a uniform amount of time, so they can be staggered using a process called pipelining.

08:51

The easiest way to compare CISC and RISC is to compare an adult and a child. While you can give an adult an instruction, say to do the laundry, an adult will immediately know that they're supposed to sort clothes, put them in the washing machine, add detergent, turn the machine on and so on. For a child, this would be an overwhelming instruction, which would have to be given step by step. In this case, CISC would be the adult and the child would be RISC. Unlike the child, though, the actual RISC has several advantages: it has simple, standardized instructions, which means the programmer has much less of a headache when programming RISC systems, as the compiler does most of the hard work. It does have drawbacks too, such as needing more RAM, which can cause bottlenecks if the RAM is limited. CISC, on the other hand, uses less RAM and has the ability to add more instruction sets, which makes it more flexible; also, the microcode can be extended to add more features to the instruction set.

10:10

Digital computing, at a very basic level, is just a bunch of switches that are either on or off depending on the machine state. The on state is represented by a one and the off state is represented by a zero. This forms the basis of all modern computing: all your pictures, music, drawings, robot movements and everything else you can think of is, at the very basic level, just a whole horde of ones and zeros. Each one of those states is called a bit, and that system of numbering is called the binary system.

10:51

The binary system is a system of two digits: no matter how you count, no digit ever goes above one. From zero, add one and it becomes one; but if we add another one, we have a problem, because we can't write the digit two. Remember from first grade math: when you run out of digits, you add another column and start counting from zero, like what we do when we count eight, nine, and then add another column and write 10. So here we now write one-zero (10). From 10 we add another one and it becomes 11; add another one and it becomes 100. The sequence ultimately becomes 0, 1, 10, 11, 100, 101, 110, 111, 1000 and so on.

12:01

But here comes the problem: there isn't much information that we can represent using two bits. There was a lot of chaos back in the day, when computers were using varied numbers of bits to represent characters. Somewhere along the course, powers of two slowly became the standard, so computers started using eight bits to represent data. This became a standard unit known as a byte.

12:32

In the 1950s the term 'byte' initially meant the addressable blocks of memory. Computers weren't that standardized back then, so a byte could mean 6, 7, 8 or even 9 bits. Eight bits eventually became the standard and represents one byte of data to this day. "Byte" is a metaphor for what a computer chews on, and it is a deliberate respelling of "bite" in order to avoid accidental confusion with "bit".

13:03

By today's standards a byte is a very small unit of data; it can represent only one dot on an image. In comparison, a 4K TV has 8,294,400 pixels. Pretty hefty! We need a way to represent this in a readable manner, and this is where we use larger units of measurement. Unlike other units that we're used to, such as kilograms, one kilobyte is not a thousand bytes. Like we saw earlier, a byte is an addressable area, and these come in powers of two.

play13:43

there are also units such as a nibble

play13:47

which is four bits and a word which is

play13:50

16 bits when you're going to store

play13:53

something in memory

play13:55

you need to tell the storage device what

play13:57

memory location to put it in

play13:59

that is done by providing a memory

play14:01

address or a set of addresses

Addresses are also provided in binary. Every time you increase the number of address bits, you double the number of possible storage locations, so generally a new storage device must be at least twice as large as the previous largest one. If it isn't, there will be potential memory locations which are not actually addressable. For example, let's say your old storage device had four address lines. This means it can store data in 16 different locations, since 2^4 = 16. If you add one address line, it can now store data in 32 different locations (2^5 = 32). Making a storage device that would hold only 25 different pieces of data would mean that there were seven addresses that were not usable.
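The doubling arithmetic above can be sketched in a few lines of Python (illustrative only):

```python
# Each additional address line doubles the number of addressable locations.
def locations(address_lines: int) -> int:
    """Number of storage locations addressable with n address lines."""
    return 2 ** address_lines

print(locations(4))   # 16
print(locations(5))   # 32

# A device holding only 25 pieces of data behind 5 address lines
# leaves some addresses with nothing behind them:
unusable = locations(5) - 25
print(unusable)       # 7
```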

If nothing else, that means some complexity must be added to the controller so that if you try to store something in one of those seven locations, an error gets sent back. Otherwise, one of the other locations will be incorrectly overwritten and/or data will be lost.
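A minimal sketch of such a controller check, using a made-up `StorageController` class (hypothetical, just to show the idea of rejecting unusable addresses rather than corrupting data):

```python
# Hypothetical controller sketch: reject writes to addresses beyond
# the device's real capacity instead of silently overwriting data.
class StorageController:
    def __init__(self, capacity: int):
        self.capacity = capacity          # number of real locations
        self.cells = [None] * capacity

    def write(self, address: int, value) -> None:
        if not 0 <= address < self.capacity:
            raise ValueError(f"address {address} is not usable")
        self.cells[address] = value

ctrl = StorageController(capacity=25)     # 5 address lines, only 25 real cells
ctrl.write(24, "ok")                      # fine: last real location
# ctrl.write(30, "oops")                  # would raise ValueError
```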

Since this is the natural progression of things in binary addressing, people have rarely made storage devices that don't work in powers of two; it's just asking for trouble. Storage sizes in powers of two have become a de facto standard.

DVDs are one of the exceptions to this rule, as the amount of information actually written to the disc may not be a power of two. This is compensated for by the fact that the drive controllers are fairly complex and can decode the data. For most other controllers, however, things are confined to powers of two.

Computing isn't an all-rosy world where you can just think of something and the computer does it for you. In some instances, humans are actually a lot better at performing tasks than computers are. That said, we can't set this in stone: in lessons one and two we saw that computers have grown phenomenally and can now do things that were unimaginable only 20 years ago.

As much as artificial intelligence has advanced, one problem that is still puzzling computer scientists is getting computers to show emotion. This is a branch of artificial intelligence known as affective computing. The computer scientists in this field are busy trying to get computers to analyze people's expressions and understand them. Humans, from a very early age, can make out facial expressions intuitively. Currently, even the fastest computers have a hard time making out what a person's expressions mean.

Humans can also make decisions based on emotion. It is very hard for a computer to, for example, give a child a less complicated game because the child looks nervous.

Computers also have a hard time creating something completely new. This is a new research field, though, and there are some pretty bizarre things happening. Google created an AI that can dream: the machine was given a blank slate of white noise and then generated some pretty bizarre images, such as a pig-snail and a camel-bird. Talk about the wild imagination of a five-year-old. This process of creating images from nowhere is called Inceptionism. Ever watch the movie Inception? Check out the bizarre things that computers have generated: multiple images layered and animated.

Computers have a hard time improving themselves. Typically, programmers have to keep cracking their heads to come up with better software for computers, and the number of transistors in a machine remains the same as it was on day one. Humans, by contrast, grow from the few things we knew at birth in our tiny bodies to the vast amount of knowledge we gather and the strength our bodies gain as we grow. As of now, the only way for a computer to improve is through the hard work of engineers and scientists.

There is progress, however, in artificial intelligence, where a computer can be presented with simple tasks, learn from them, and end up being able to perform more complex tasks. If you read up on AlphaZero by DeepMind, you will be blown away by how their program was able to teach itself games.

play18:56

there was an old joke that says whenever

play18:59

a computer makes a decision that is the

play19:02

decision of someone in the development

play19:04

team

play19:05

computers are so far only as good as the

play19:09

information they are given

play19:11

all that processing is based on what the

play19:13

programmers said the computers should do

play19:15

that's it

Computers are pretty darn good at this. However, when it comes to independent thought and decision making, computers instantly appear dumb. They rely on some form of input from somewhere so they can make a decision. Human thought looks like a simple process from the outside: everyone just goes around walking, making decisions about what to wear, sharing opinions, and all that. For a computer, this is nearly impossible.

Phones nowadays come with voice assistants such as Alexa, Bixby, Cortana, Google Assistant, and Siri. These sound smart when you ask them to turn off your lights or play your favorite music, but when you try to have a deep conversation with them, things quickly go south. Deeper conversational traits such as continuing from the previous sentence, topic changes, opinions, and figurative language are all very challenging for computers.

As we saw earlier, computers are based on the sole binary principle of whether a transistor is switched on or off. This translates to the use of true or false as the basis for decision making. In natural language processing, a single word can be pronounced differently by different people, which presents a challenge for the computer.

Fuzzy logic is a branch of computer science that attempts to solve this problem by allowing the computer to reason without forcing things into exact categories.
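A minimal Python sketch of the fuzzy-logic idea, with made-up temperature thresholds standing in for "exact categories": membership is a degree between 0.0 and 1.0 rather than a hard true/false.

```python
# Minimal fuzzy-logic sketch: membership in a category is a degree
# between 0.0 and 1.0 instead of a strict true/false.
def warm_membership(temp_c: float) -> float:
    """Degree to which a temperature counts as 'warm'
    (thresholds invented for illustration)."""
    if temp_c <= 15:
        return 0.0
    if temp_c >= 25:
        return 1.0
    return (temp_c - 15) / 10   # linear ramp between 15 and 25

print(warm_membership(10))   # 0.0 -> definitely not warm
print(warm_membership(20))   # 0.5 -> somewhat warm
print(warm_membership(30))   # 1.0 -> definitely warm
```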

Computers use a cycle known as the fetch-decode-execute cycle, which does exactly what it says: instructions are fetched from RAM, the CPU makes sense of these instructions by decoding them, and the CPU then carries out each instruction in the execute stage. A complete computer cycle includes all three stages. Remember the clock we discussed earlier? The cycles are timed precisely according to the clock, and a computer's speed is measured by the number of cycles it can complete in a second.
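The three stages can be sketched as a toy simulator in Python; the instruction names (`LOAD`, `ADD`, `HALT`) and the machine itself are invented for illustration, not a real instruction set.

```python
# Toy sketch of the fetch-decode-execute cycle on a made-up machine.
ram = [("LOAD", 7), ("ADD", 3), ("HALT", 0)]   # program stored in memory

pc = 0            # program counter: address of the next instruction
acc = 0           # accumulator: holds the running result
running = True

while running:
    opcode, operand = ram[pc]   # FETCH the instruction from RAM
    pc += 1
    if opcode == "LOAD":        # DECODE, then EXECUTE
        acc = operand
    elif opcode == "ADD":
        acc += operand
    elif opcode == "HALT":
        running = False

print(acc)   # 10
```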

Did you know that modern computers can complete billions of these cycles in a second? A number of factors determine how fast a computer is: clock speed, cache size, and number of cores. As we saw earlier, the clock determines how fast instructions are executed; the more pulses the clock produces, the faster the CPU. Cache makes it possible to fetch instructions and data faster since it is close to the CPU; the larger the cache, the faster the CPU. The CPU also has individual processing units inside it, known as cores, and the more cores a CPU has, the better. Cores typically come in multiples of two, and a CPU with four cores has significantly greater processing power than a dual-core CPU.
