Early Computing: Crash Course Computer Science #1
Summary
TLDR: In this video script, Carrie Anne takes viewers from the fundamentals of computer science, such as bits, bytes, and transistors, all the way to operating systems, virtual reality, and robots. She stresses that although computers are the lifeblood of modern society, this series will not teach programming; instead, it explores computing as a discipline and a technology. Computing has transformed nearly every aspect of our lives, from automated farming to global communications to new frontiers such as virtual reality and self-driving cars. The video also traces the origins of computing, from the abacus invented in Mesopotamia around 2500 BCE, through the mechanical computers of the 19th century, to the electronic computers of the 20th, showing how computing devices evolved over time, lowered the barrier to calculation, and amplified human abilities. It highlights the contributions of Charles Babbage and Ada Lovelace, and how the US census drove the development of computing technology, ultimately leading to the founding of IBM. This historical overview reveals not only the complexity of computers but also their role in simplifying our lives.
Takeaways
- 🌐 The series will start with fundamentals such as bits, bytes, transistors, and logic gates, and build up to advanced topics like operating systems, virtual reality, and robots.
- 🚫 The course will not teach programming; instead it explores computing as a discipline and a technology.
- 💡 Computers are the lifeblood of modern society; if they suddenly stopped working, the power grid would shut down, transportation would halt, and financial markets would freeze.
- 🔄 Manufacturing advances during the Industrial Revolution brought a new scale to human civilization, and computing technology is doing the same today, from automated farming to global telecommunications and educational opportunities.
- 📱 Computers may seem very complicated, but they are really simple machines that perform complex actions through many layers of abstraction.
- 🔢 The series starts from simple 1s and 0s and builds up to logic units, CPUs, operating systems, and the entire internet.
- 🔧 Historical computing devices, such as the abacus, astrolabe, slide rule, and various clocks, were built to make calculation easier and to amplify our mental abilities.
- ⌛️ The earliest documented use of the word "computer" dates to 1613, when it referred to a person who did calculations; only later did the term come to mean a device.
- 🔩 The Step Reckoner, built by German polymath Gottfried Leibniz, was a mechanical device that could add, subtract, multiply, and divide, and its design shaped calculator design for the next three centuries.
- 🔍 Charles Babbage proposed the Difference Engine and the Analytical Engine; the latter is considered a forerunner of the general-purpose computer and inspired the first generation of computer scientists.
- 🏆 Ada Lovelace wrote hypothetical programs for the Analytical Engine and is often considered the world's first programmer.
- 📊 By the end of the 19th century, computing devices were used for special-purpose tasks in science and engineering, but rarely appeared in business, government, or domestic life.
- 🇺🇸 The 1890 US census faced serious problems that demanded the efficiency only computers could provide; this need led to Herman Hollerith's tabulating machine, which used punch cards to represent data.
- 📈 Hollerith's machine greatly sped up data processing, laying the groundwork for business computing and the founding of IBM.
Q & A
What will the CrashCourse Computer Science series cover?
- The series will start with fundamentals such as bits, bytes, transistors, and logic gates, and work up to advanced topics like operating systems, virtual reality, and robots.
Why doesn't the CrashCourse Computer Science series teach programming?
- The series aims to explore a broad range of computing topics as a discipline and a technology, rather than teach specific programming skills.
How important are computers to modern society?
- Computers are the lifeblood of today's world; if they all suddenly shut down, the power grid would fail, cars and planes would crash, water treatment plants would stop, and stock markets would freeze, among other serious consequences.
How did manufacturing advances during the Industrial Revolution change human civilization?
- They brought a new scale to agriculture, industry, and domestic life, enabling superior harvests, cheaper and faster travel and communication, and usually a better quality of life.
How is the Electronic Age defined?
- It is the era we live in, likely to be remembered for the transformations brought by computing technology, such as automated farming, medical equipment, global telecommunications, educational opportunities, and new frontiers like virtual reality and self-driving cars.
How do computers perform complex actions through layers of abstraction?
- Computers are really simple machines that perform complex actions through many layers of abstraction, building up from simple 1s and 0s to logic units, CPUs, operating systems, and the entire internet.
What was the earliest recognized computing device?
- The abacus, invented in Mesopotamia around 2500 BCE, a hand-operated calculator that helps add and subtract many numbers.
What did the word "computer" originally refer to?
- The earliest documented use of the word "computer" is from 1613, in a book by Richard Braithwait. It was not a machine at all but a job title, referring to a person who did calculations.
What was the Step Reckoner built by Gottfried Leibniz?
- A mechanical computing device that used a series of turning gears to perform addition, subtraction, multiplication, and division; it was the first machine that could do all four operations.
What was Charles Babbage's Difference Engine for?
- It was a mechanical device that could approximate polynomials, which describe relationships between several variables, such as range and air pressure, or amount of pizza eaten and happiness.
Why is Ada Lovelace often considered the world's first programmer?
- She wrote hypothetical programs for Babbage's Analytical Engine, foreseeing "a new, a vast, and a powerful language" for the future use of analysis, and for this work she is often considered the world's first programmer.
How did Herman Hollerith's tabulating machine help the 1890 US census?
- Hollerith's tabulating machine was an electro-mechanical device that used punch cards to represent data. It greatly sped up data processing, allowing the 1890 census to be completed in just two and a half years and saving a great deal of time and money.
How was IBM (International Business Machines) founded?
- IBM was formed in 1924 when The Tabulating Machine Company, founded by Herman Hollerith, merged with other machine makers, meeting business demand for computing tools that improved labor- and data-intensive work.
Outlines
🌟 An Overview of Computer Science
Carrie Anne welcomes viewers to the CrashCourse Computer Science series. She explains that the series will start with fundamentals such as bits, bytes, transistors, and logic gates and build up to operating systems, virtual reality, and robots. It will not teach programming, but will explore a range of computing topics as a discipline and a technology. Computers are the lifeblood of the modern world; if they suddenly stopped working, the power grid would shut down, cars would crash, and planes would fall. Computing technology has transformed nearly every aspect of our lives, much as the Industrial Revolution did, from automated farming and medical equipment to global telecommunications and educational opportunities, plus new frontiers such as virtual reality and self-driving cars. The episode then turns to computing's origins: although electronic computers are relatively new, the need for computation is not.
📚 The Origins of Computing and Early Devices
The earliest computing device was the abacus, invented in Mesopotamia around 2500 BCE, a hand-operated calculator that helps add and subtract numbers and stores the current state of a computation. As the scale of society grew, the abacus became a necessary tool. There are many variants; in a basic version, each row represents a different power of ten. After the abacus, humans invented all sorts of computing devices, such as the astrolabe, the slide rule, and various clocks, which made calculation faster, easier, and more accurate. None of these devices, however, was called a "computer." The earliest documented use of the word "computer" is from 1613, in a book by Richard Braithwait, where it referred to a person who did calculations. Only in the late 1800s did the meaning of "computer" begin to shift to devices, such as the Step Reckoner built by Leibniz in 1694.
🔩 Mechanical Calculators and the Development of Early Computers
Leibniz believed excellent men should not waste their time in calculation, since any peasant could do the work just as accurately with a machine. The Step Reckoner calculated with turning gears, much like a car's odometer, and could add, subtract, multiply, and divide. Even so, real-world problems usually required many steps of computation to reach an answer, which could take hours or even days. Moreover, these hand-crafted machines were expensive and not widely accessible, so before the 20th century most people experienced computing through pre-computed tables. Accuracy and speed mattered especially on the battlefield, and militaries were among the first institutions to apply computing to complex problems. In 1822 Charles Babbage proposed the Difference Engine, a more complex mechanical device that could approximate polynomials. Although the project was ultimately abandoned, historians completed a Difference Engine in 1991 from Babbage's drawings and writings, and it worked. Babbage also conceived the Analytical Engine, a "general purpose computer" that could run many kinds of computations, with memory and even a primitive printer. Though never fully built, its concept profoundly influenced later computer scientists, and Babbage is therefore often called the "father of computing."
📈 The Census and the Commercial Application of Computing
By the end of the 19th century, with the US population booming, the 1890 census faced a serious efficiency problem. The census, required every ten years, allocates federal funds, congressional representation, and the like. The 1880 census took seven years to compile manually and was out of date by the time it was finished. To solve the problem, the Census Bureau turned to Herman Hollerith, who had built a tabulating machine. The machine was "electro-mechanical": it used traditional mechanical systems for counting, coupled with electrically-powered components. Hollerith's machine used punch cards, paper cards with a grid of locations that could be punched out to represent data. It was roughly 10x faster than manual tabulation, and the census was completed in just two and a half years, saving millions of dollars. Businesses began recognizing the value of computing and its potential to boost profits by improving labor- and data-intensive tasks such as accounting, insurance appraisals, and inventory management. To meet this demand, Hollerith founded The Tabulating Machine Company, which merged with other machine makers in 1924 to become the International Business Machines Corporation (IBM). These electro-mechanical "business machines" were a huge success, transforming commerce and government; by the mid-1900s, the explosion in world population and the rise of global trade demanded even faster and more flexible data-processing tools, setting the stage for digital computers.
Keywords
💡 Computer Science
💡 Layers of Abstraction
💡 Abacus
💡 Industrial Revolution
💡 Charles Babbage
💡 Ada Lovelace
💡 Herman Hollerith
💡 Punch Cards
💡 Electronic Age
💡 Digital Computers
💡 International Business Machines Corporation (IBM)
Highlights
The CrashCourse Computer Science series will start with fundamental concepts such as bits, bytes, transistors, and logic gates and work up to advanced topics like operating systems, virtual reality, and robots.
It will not teach programming, but will explore a broad range of computing topics as a discipline and a technology.
Computers are the lifeblood of modern society; if they all suddenly shut down, the power grid would fail, transportation would halt, and water treatment plants would stop, among other serious consequences.
Computing technology is transforming nearly every aspect of our lives, much as manufacturing technology did during the Industrial Revolution.
The earliest computing device was the abacus, a hand-operated calculator invented in Mesopotamia around 2500 BCE.
The abacus aids addition and subtraction by using different rows to represent different powers of ten.
Over the next 4,000 years, humans developed all sorts of clever computing devices, such as the astrolabe, the slide rule, and many types of clocks.
The earliest documented use of the word "computer" is from 1613, referring to a person who did calculations, sometimes with the help of machines and sometimes not.
Gottfried Leibniz built the Step Reckoner in 1694, a mechanical device that could add, subtract, multiply, and divide.
Charles Babbage proposed the Difference Engine, a more complex mechanical device that could approximate polynomials.
Babbage conceived an even more complex machine, the Analytical Engine, a general-purpose computer that could run many kinds of computations.
Ada Lovelace wrote hypothetical programs for the Analytical Engine and is often considered the world's first programmer.
Herman Hollerith's tabulating machine used punch cards and electrically-powered components to speed up data processing.
The problems facing the 1890 census pushed the US government to seek the efficiency of computing; Hollerith's machine allowed the census to be completed in two and a half years.
Hollerith's company later merged with other machine makers in 1924 to become the International Business Machines Corporation (IBM).
Compared with the mechanical devices that came before, digital computers offered faster and more flexible tools to handle the explosion in world population and the rise of global trade.
The series starts from simple 1s and 0s and builds up to logic units, CPUs, operating systems, the entire internet, and beyond.
Just as you don't need to know how a webpage was programmed in order to use it, the series will build on previous episodes without being dependent on them.
By the end of the series, viewers should be able to better contextualize computing's role in their own lives and in society, and see that computers, arguably humanity's greatest invention, are just in their infancy, with their biggest impacts yet to come.
Transcripts
Hello world, I’m Carrie Anne, and welcome to CrashCourse Computer Science!
Over the course of this series, we’re going to go from bits, bytes, transistors and logic
gates, all the way to Operating Systems, Virtual Reality and Robots!
We’re going to cover a lot, but just to clear things up - we ARE NOT going to teach
you how to program.
Instead, we’re going to explore a range of computing topics as a discipline and a
technology.
Computers are the lifeblood of today’s world.
If they were to suddenly turn off, all at once, the power grid would shut down, cars
would crash, planes would fall, water treatment plants would stop, stock markets would freeze,
trucks with food wouldn’t know where to deliver, and employees wouldn’t get paid.
Even many non-computer objects - like DFTBA shirts and the chair I’m sitting on – are
made in factories run by computers.
Computing really has transformed nearly every aspect of our lives.
And this isn’t the first time we’ve seen this sort of technology-driven global change.
Advances in manufacturing during the Industrial Revolution brought a new scale to human civilization
- in agriculture, industry and domestic life.
Mechanization meant superior harvests and more food, mass produced goods, cheaper and
faster travel and communication, and usually a better quality of life.
And computing technology is doing the same right now – from automated farming and medical
equipment, to global telecommunications and educational opportunities, and new frontiers
like Virtual Reality and Self Driving Cars.
We are living in a time likely to be remembered as the Electronic Age.
With billions of transistors in just your smartphones, computers can seem pretty complicated,
but really, they’re just simple machines that perform complex actions through many
layers of abstraction.
So in this series, we’re going break down those layers, and build up from simple 1’s
and 0’s, to logic units, CPUs, operating systems, the entire internet and beyond.
And don’t worry, in the same way someone buying t-shirts on a webpage doesn’t need
to know how that webpage was programmed, or the web designer doesn’t need to know how
all the packets are routed, or router engineers don’t need to know about transistor logic,
this series will build on previous episodes but not be dependent on them.
By the end of this series, I hope that you can better contextualize computing’s role
both in your own life and society, and how humanity's (arguably) greatest invention is
just in its infancy, with its biggest impacts yet to come.
But before we get into all that, we should start at computing’s origins, because although
electronic computers are relatively new, the need for computation is not.
INTRO
The earliest recognized device for computing
was the abacus, invented in Mesopotamia around 2500 BCE.
It’s essentially a hand operated calculator, that helps add and subtract many numbers.
It also stores the current state of the computation, much like your hard drive does today.
The abacus was created because, the scale of society had become greater than what a
single person could keep and manipulate in their mind.
There might be thousands of people in a village or tens of thousands of cattle.
There are many variants of the abacus, but let’s look at a really basic version with
each row representing a different power of ten.
So each bead on the bottom row represents a single unit, in the next row they represent
10, the row above 100, and so on.
Let’s say we have 3 head of cattle represented by 3 beads on the bottom row on the right side.
If we were to buy 4 more cattle we would just slide 4 more beads to the right for a total of 7.
But if we were to add 5 more after the first 3 we would run out of beads, so we would slide
everything back to the left, slide one bead on the second row to the right, representing
ten, and then add the final 2 beads on the bottom row for a total of 12.
This is particularly useful with large numbers.
So if we were to add 1,251 we would just add 1 to the bottom row, 5 to the second row,
2 to the third row, and 1 to the fourth row - we don’t have to add in our head and the
abacus stores the total for us.
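The place-value bookkeeping described above can be sketched in code. This is a hypothetical model of the basic abacus, not anything from the video: each row holds a digit 0-9, and a full row "slides back" and advances the row above, exactly as in the carrying examples.

```python
def abacus_add(rows, amount):
    """Add `amount` to an abacus represented as a list of digits,
    rows[0] being the ones row, rows[1] the tens row, and so on."""
    i = 0
    while amount > 0:  # distribute each decimal digit of `amount` onto its row
        rows[i] += amount % 10
        amount //= 10
        i += 1
    # Propagate carries: a row reaching 10 resets and bumps the row above,
    # like sliding all beads back and moving one bead on the next row
    for j in range(len(rows) - 1):
        if rows[j] >= 10:
            rows[j + 1] += rows[j] // 10
            rows[j] %= 10
    return rows

# The transcript's examples: 7 cattle plus 5 more, then adding 1,251
print(abacus_add([7, 0, 0, 0], 5))     # [2, 1, 0, 0], i.e. 12
print(abacus_add([2, 1, 0, 0], 1251))  # [3, 6, 2, 1], i.e. 1,263
```

As on the real abacus, the list itself stores the running total between operations, so no mental arithmetic is needed.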
Over the next 4000 years, humans developed all sorts of clever computing devices, like
the astrolabe, which enabled ships to calculate their latitude at sea.
Or the slide rule, for assisting with multiplication and division.
And there are literally hundreds of types of clocks created that could be used to calculate
sunrise, tides, positions of celestial bodies, and even just the time.
Each one of these devices made something that was previously laborious to calculate much
faster, easier, and often more accurate –– it lowered the barrier to entry, and at the same
time, amplified our mental abilities –– take note, this is a theme we’re going to touch
on a lot in this series.
As early computer pioneer Charles Babbage said: “At each increase of knowledge, as
well as on the contrivance of every new tool, human labour becomes abridged.”
However, none of these devices were called “computers”.
The earliest documented use of the word “computer” is from 1613, in a book by Richard Braithwait.
And it wasn’t a machine at all - it was a job title.
Braithwait said, “I have read the truest computer of times,
and the best arithmetician that ever breathed, and he reduceth thy dayes into a short number”.
In those days, computer was a person who did calculations, sometimes with the help of machines,
but often not.
This job title persisted until the late 1800s, when the meaning of computer started shifting
to refer to devices.
Notable among these devices was the Step Reckoner, built by German polymath Gottfried Leibniz
in 1694.
Leibniz said “... it is beneath the dignity of excellent men to waste their time in calculation
when any peasant could do the work just as accurately with the aid of a machine.”
It worked kind of like the odometer in your car, which is really just a machine for adding
up the number of miles your car has driven.
The device had a series of gears that turned; each gear had ten teeth, to represent the
digits from 0 to 9.
Whenever a gear bypassed nine, it rotated back to 0 and advanced the adjacent gear by one tooth.
Kind of like when hitting 10 on that basic abacus.
This worked in reverse when doing subtraction, too.
With some clever mechanical tricks, the Step Reckoner was also able to multiply and divide
numbers.
Multiplications and divisions are really just many additions and subtractions.
For example, if we want to divide 17 by 5, we just subtract 5, then 5, then 5 again,
and then we can’t subtract any more 5’s… so we know 5 goes into 17 three times, with
2 left over.
The Step Reckoner was able to do this in an automated way, and was the first machine that
could do all four of these operations.
And this design was so successful it was used for the next three centuries of calculator design.
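The repeated-subtraction scheme described above can be written out directly. This is a hypothetical illustration of the idea, not a model of the Step Reckoner's actual gearing:

```python
def divide_by_repeated_subtraction(dividend, divisor):
    """Divide two positive integers by repeated subtraction:
    keep subtracting the divisor until it no longer fits,
    counting how many subtractions were possible."""
    quotient = 0
    remainder = dividend
    while remainder >= divisor:
        remainder -= divisor
        quotient += 1
    return quotient, remainder

# The transcript's example: 5 goes into 17 three times, with 2 left over
print(divide_by_repeated_subtraction(17, 5))  # (3, 2)
```

Multiplication works the same way in reverse: add the multiplicand to itself, once per unit of the multiplier.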
Unfortunately, even with mechanical calculators, most real world problems required many steps
of computation before an answer was determined.
It could take hours or days to generate a single result.
Also, these hand-crafted machines were expensive, and not accessible to most of the population.
So, before the 20th century, most people experienced computing through pre-computed tables assembled
by those amazing “human computers” we talked about.
So if you needed to know the square root of 8 million 6 hundred and 75 thousand 3 hundred
and 9, instead of spending all day hand-cranking your step reckoner, you could look it up in
a huge book full of square root tables in a minute or so.
Speed and accuracy is particularly important on the battlefield, and so militaries were
among the first to apply computing to complex problems.
A particularly difficult problem is accurately firing artillery shells, which by the 1800s
could travel well over a kilometer (or a bit more than half a mile).
Add to this varying wind conditions, temperature, and atmospheric pressure, and even hitting
something as large as a ship was difficult.
Range Tables were created that allowed gunners to look up environmental conditions and the
distance they wanted to fire, and the table would tell them the angle to set the cannon.
These Range Tables worked so well, they were used well into World War Two.
The problem was, if you changed the design of the cannon or of the shell, a whole new
table had to be computed, which was massively time consuming and inevitably led to errors.
Charles Babbage acknowledged this problem in 1822 in a paper to the Royal Astronomical
Society entitled: “Note on the application of machinery to the computation of astronomical
and mathematical tables".
Let’s go to the thought bubble.
Charles Babbage proposed a new mechanical device called the Difference Engine, a much
more complex machine that could approximate polynomials.
Polynomials describe the relationship between several variables - like range and air pressure,
or amount of pizza Carrie Anne eats and happiness.
Polynomials could also be used to approximate logarithmic and trigonometric functions, which
are a real hassle to calculate by hand.
Babbage started construction in 1823, and over the next two decades, tried to fabricate
and assemble the 25,000 components, collectively weighing around 15 tons.
Unfortunately, the project was ultimately abandoned.
But, in 1991, historians finished constructing a Difference Engine based on Babbage's drawings
and writings - and it worked!
But more importantly, during construction of the Difference Engine, Babbage imagined
an even more complex machine - the Analytical Engine.
Unlike the Difference Engine, Step Reckoner and all other computational devices before
it - the Analytical Engine was a “general purpose computer”.
It could be used for many things, not just one particular computation; it could be given
data and run operations in sequence; it had memory and even a primitive printer.
Like the Difference Engine, it was ahead of its time, and was never fully constructed.
However, the idea of an “automatic computer” – one that could guide itself through a
series of operations automatically, was a huge deal, and would foreshadow computer programs.
English mathematician Ada Lovelace wrote hypothetical programs for the Analytical Engine, saying,
“A new, a vast, and a powerful language is developed for the future use of analysis.”
For her work, Ada is often considered the world’s first programmer.
The Analytical Engine would inspire, arguably, the first generation of computer scientists,
who incorporated many of Babbage’s ideas in their machines.
This is why Babbage is often considered the "father of computing".
Thanks Thought Bubble!
So by the end of the 19th century, computing devices were used for special purpose tasks
in the sciences and engineering, but rarely seen in business, government or domestic life.
However, the US government faced a serious problem for its 1890 census that demanded
the kind of efficiency that only computers could provide.
The US Constitution requires that a census be conducted every ten years, for the purposes
of distributing federal funds, representation in congress, and good stuff like that.
And by 1880, the US population was booming, mostly due to immigration.
That census took seven years to manually compile and by the time it was completed, it was already
out of date – and it was predicted that the 1890 census would take 13 years to compute.
That’s a little problematic when it’s required every decade!
The Census bureau turned to Herman Hollerith, who had built a tabulating machine.
His machine was “electro-mechanical” – it used traditional mechanical systems for keeping
count, like Leibniz’s Step Reckoner –– but coupled them with electrically-powered components.
Hollerith’s machine used punch cards which were paper cards with a grid of locations
that can be punched out to represent data.
For example, there was a series of holes for marital status.
If you were married, you would punch out the married spot, then when the card was inserted
into Hollerith’s machine, little metal pins would come down over the card – if a spot
was punched out, the pin would pass through the hole in the paper and into a little vial
of mercury, which completed the circuit.
This now completed circuit powered an electric motor, which turned a gear to add one, in
this case, to the “married” total.
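The tallying process just described can be sketched as a small program. This is a hypothetical model of the idea, with made-up card data; in code, the punched fields play the role of completed circuits:

```python
from collections import Counter

def tabulate(cards):
    """Tally punch cards the way Hollerith's machine did: each card is a
    set of punched-out fields, and every punched field closes a circuit
    that advances the matching counter by one."""
    totals = Counter()
    for card in cards:
        for field in card:       # each pin that passes through a hole...
            totals[field] += 1   # ...turns a gear to add one to that total
    return totals

# Hypothetical census cards, each listing the fields punched out on it
cards = [{"married"}, {"single"}, {"married"}]
print(tabulate(cards))  # married: 2, single: 1
```

The machine's speedup came from exactly this structure: every card is processed in one mechanical pass, with no human reading or arithmetic in the loop.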
Hollerith’s machine was roughly 10x faster than manual tabulations, and the Census was
completed in just two and a half years - saving the census office millions of dollars.
Businesses began recognizing the value of computing, and saw its potential to boost
profits by improving labor- and data-intensive tasks, like accounting, insurance appraisals,
and inventory management.
To meet this demand, Hollerith founded The Tabulating Machine Company, which later merged
with other machine makers in 1924 to become The International Business Machines Corporation
or IBM - which you’ve probably heard of.
These electro-mechanical “business machines” were a huge success, transforming commerce
and government, and by the mid-1900s, the explosion in world population and the rise
of globalized trade demanded even faster and more flexible tools for processing data, setting
the stage for digital computers, which we’ll talk about next week.