The First Programming Languages: Crash Course Computer Science #11

CrashCourse
10 May 2017 · 11:52

Summary

TLDR: This episode of CrashCourse Computer Science, hosted by Carrie Anne, dives into the history of software. Early on, programming was tedious and inflexible because it had to be done directly at the hardware level in machine language. To simplify the process, programmers sought more versatile approaches, and the concept of software was born. The video traces the development of machine language, assembly language, and high-level programming languages. It highlights Dr. Grace Hopper's contributions to high-level languages and how her compiler translated source code into machine code. It also covers early high-level languages such as FORTRAN and COBOL, which dramatically lowered the barrier to programming and turned computing from an experts-only discipline into a widely accessible, general-purpose tool. The episode closes with a look at the golden era of programming language design and how languages evolved in lockstep with advances in computer hardware, enabling more people to do more amazing things, more quickly.

Takeaways

  • 💾 Programming at the hardware level is cumbersome and inflexible, so programmers sought a more versatile way to program computers: software.
  • 📏 On the CPU designed in the series, the first four bits of an instruction are the operation code (OPCODE); for example, 0010 indicates a LOAD_A instruction, which moves a value from memory into Register A.
  • 🔢 Computer hardware can only handle raw binary instructions -- machine language, or machine code -- and early programs had to be written entirely in it.
  • 📝 Programmers would first write a high-level version of a program on paper, in a language like English -- called pseudo-code -- and then translate it into binary machine code by hand.
  • 🚀 By the late 1940s and into the 1950s, programmers had developed more readable higher-level languages that used mnemonics in place of opcodes, such as LOAD_A 14.
  • 🔩 An assembler is a program that converts text instructions into binary instructions; it hides unnecessary complexity and makes programming easier.
  • 🔗 Assembly language maps one-to-one onto machine code, so it is inherently tied to the underlying hardware and still forces programmers to think about registers and memory locations.
  • 🤔 Dr. Grace Hopper designed a high-level programming language, A-0, and built the first compiler in 1952 to transform source code into a low-level language or machine code.
  • 🔑 FORTRAN, released by IBM in 1957, let programmers write programs 20 times shorter than equivalent handwritten assembly code, which a compiler then translated into machine code.
  • 📈 COBOL is a high-level, easy-to-use language designed to work across different computers, embodying the notion of "write once, run anywhere."
  • 🌐 High-level languages lowered computing's barrier to entry, letting scientists, engineers, doctors, and others incorporate computation into their work.
  • 📚 From the 1960s to the new millennium, many languages appeared, including ALGOL, LISP, BASIC, Pascal, C, Smalltalk, C++, Objective-C, Perl, Python, Ruby, and Java.

Q & A

  • Why is programming at the hardware level considered cumbersome and inflexible?

    -Hardware-level programming uses machine language directly: binary instructions made of 0s and 1s that are hard for humans to read and write. It offers no higher-level abstraction, which makes programming complex and error-prone and hard to adapt as requirements change.

  • In computer science, what is an operation code (OPCODE)?

    -On the hypothetical CPU designed in the series, the OPCODE is the first four bits of a machine-language instruction and specifies which operation the computer should perform. For example, the opcode 0010 indicates a LOAD_A instruction, which moves a value from memory into Register A.
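The opcode/address split described above can be sketched in a few lines of Python. This is purely illustrative: it decodes the hypothetical 8-bit instructions from the series, and the opcode table contains only the LOAD_A entry mentioned in the video.

```python
# Decode one 8-bit instruction from the series' hypothetical CPU:
# upper 4 bits = opcode, lower 4 bits = memory address.
OPCODES = {0b0010: "LOAD_A"}  # opcode table; only LOAD_A is given in the video

def decode(instruction: int) -> str:
    opcode = instruction >> 4        # first four bits
    address = instruction & 0b1111   # last four bits
    mnemonic = OPCODES.get(opcode, "UNKNOWN")
    return f"{mnemonic} {address}"

print(decode(0b00101110))  # → LOAD_A 14
```

This mirrors the episode's example: the instruction 0010 1110 means "LOAD Address 14 into Register A."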

  • What is pseudo-code?

    -Pseudo-code is an informal, high-level description of a program, usually written in English or another human language. It captures the program's logic and steps without any specific programming-language syntax, helping programmers design and understand a program on paper before translating it into machine code.

  • What is the main function of an assembler?

    -An assembler reads a program written in assembly language and converts its text instructions into the corresponding machine code. This lets programmers write with human-readable mnemonics and labels instead of raw binary.
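A minimal sketch of what an assembler does, written in Python. Only the LOAD_A mnemonic and its opcode 0010 come from the video; LOAD_B and its opcode here are hypothetical placeholders added for illustration.

```python
# Toy assembler for the series' hypothetical CPU: turns text instructions
# like "LOAD_A 14" into 8-bit binary machine code (opcode << 4 | operand).
OPCODE_TABLE = {
    "LOAD_A": 0b0010,  # from the video
    "LOAD_B": 0b0001,  # hypothetical, for illustration only
}

def assemble_line(line: str) -> int:
    mnemonic, operand = line.split()
    opcode = OPCODE_TABLE[mnemonic]
    return (opcode << 4) | int(operand)

program = ["LOAD_A 14", "LOAD_B 15"]
machine_code = [assemble_line(line) for line in program]
print([f"{word:08b}" for word in machine_code])  # → ['00101110', '00011111']
```

A real assembler adds conveniences on top of this, such as resolving jump labels to addresses automatically, but the core job is exactly this text-to-binary translation.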

  • Why is assembly language still closely tied to the underlying hardware?

    -Each assembly instruction typically corresponds directly to one machine instruction -- a one-to-one mapping -- so assembly is inherently tied to a specific hardware architecture. Assembly programmers must still think about which registers and memory locations to use, which limits portability and requires some knowledge of the hardware.

  • Which high-level programming language did Dr. Grace Hopper design?

    -Dr. Grace Hopper designed a high-level programming language called "A-0" (Arithmetic Language Version 0) to make programming more efficient and readable and to reduce dependence on low-level hardware operations.

  • What is a compiler?

    -A compiler is a specialized program that transforms "source" code written in a high-level programming language into a low-level language, such as assembly or the binary machine code the CPU can directly process. Compilers greatly simplified programming, letting programmers focus on program logic rather than low-level implementation details.
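With a compiler (or interpreter) handling the translation, the programmer works with named variables instead of registers and memory addresses. A minimal Python version of the add-two-numbers program described in the video -- the specific values 5 and 3 are arbitrary:

```python
# Add two numbers and save the result -- no registers or memory
# locations to manage; the language implementation decides where
# a, b, and c actually live.
a = 5
b = 3
c = a + b
print(c)  # → 8
```

It might be that Register A is used under the hood to hold `a`, but the programmer never needs to know.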

  • How did FORTRAN influence early computer programming?

    -FORTRAN (short for "Formula Translation"), released by IBM in 1957, came to dominate early computer programming. On average, FORTRAN programs were 20 times shorter than equivalent handwritten assembly code, and the FORTRAN compiler translated them into machine code. The community was initially skeptical about its performance, but because programmers could write more code more quickly, it was an easy choice economically.

  • What are the main characteristics of COBOL?

    -COBOL (short for "Common Business-Oriented Language") is a high-level, easy-to-use language that lets programmers write code once and run it on different computers without rewriting it for each kind of hardware. This notion, "write once, run anywhere," reduced dependence on specific hardware and improved code portability.

  • How did high-level languages lower computing's barrier to entry?

    -High-level languages let non-experts -- scientists, engineers, doctors, economists, teachers, and many others -- incorporate computation into their work. By providing higher levels of abstraction and hiding low-level complexity, they made programming easier and more accessible.

  • What period is considered the "golden era" of programming language design?

    -The golden era of programming language design ran from the 1960s onward, producing many important languages such as ALGOL, LISP, BASIC, Pascal, C, Smalltalk, C++, Objective-C, Perl, Python, Ruby, and Java. These languages evolved in lockstep with dramatic advances in computer hardware.

  • Why is a language's power of abstraction essential for building complex programs?

    -Abstraction lets programmers build complex programs without getting bogged down in low-level hardware details. They can express algorithms and logic in constructs closer to natural language, reducing the effort needed to write, maintain, and understand code. This is what makes it feasible to build programs that would otherwise take millions or even tens of millions of lines of assembly code.

Outlines

00:00

📚 The origins of software and machine language

This segment reviews the hardware foundations covered so far -- electricity and circuits, registers, RAM, the arithmetic logic unit (ALU), and the CPU -- and explains why programming at the hardware level is cumbersome and inflexible, which pushed programmers toward a more versatile medium: software. Using a simple example program, it introduces machine language, including opcodes and memory addresses, as well as pseudo-code and the invention of the assembler, which lets programmers write readable text instructions instead of binary machine code.

05:03

🔍 From assembly language to high-level languages

This segment explains that assembly language, while easier to read than machine code, is still tied to the hardware and still requires thinking about registers and memory locations. It then introduces Dr. Grace Hopper and her A-0 language, an early high-level programming language, and the birth of the compiler, which translates high-level code into machine code. IBM's FORTRAN subsequently came to dominate early computer programming. The segment also covers COBOL, which allowed the same source code to run on different computer architectures -- "write once, run anywhere" -- and closes by noting how high-level languages lowered computing's barrier to entry, letting many more people incorporate computation into their work.

10:04

🚀 The future of programming languages

This segment looks ahead to the ultimate goal of programming: using "plain English" to tell a computer what to do. Such intelligent systems are still science fiction, but they may arrive someday. It also mentions CuriosityStream, a streaming service full of documentaries and nonfiction titles, including "Digits," a series about the Internet, and encourages viewers to stick around for the next few episodes to deepen their understanding of how programming languages and software do cool and unbelievable things.

Keywords

💡 Hardware

Hardware refers to a computer's physical components: electricity and circuits, registers, random access memory (RAM), the arithmetic logic unit (ALU), and the central processing unit (CPU). The video notes that hardware was the focus of the early episodes in the series; it is the foundation of programming, but programming directly at the hardware level is cumbersome and inflexible.

💡 Software

Software is the non-physical part of a computer system: all the programs, and the associated data, that a computer can execute. Because hardware-level programming was so inconvenient, programmers sought a more flexible way to program computers -- a "softer" medium.

💡 Machine language

Machine language, also called machine code, is the raw binary instruction set that computer hardware processes directly. It is a processor's "native language" and the only language a CPU can execute directly. In the early days of computing, people had to write entire programs in machine code -- an extremely tedious process.

💡 Pseudo-code

Pseudo-code is an informal, natural-language-like, high-level description of a program. It has no specific programming-language syntax but gives an overview of a program's logic. In the video, pseudo-code is the first design stage: programmers would write a high-level version of a program on paper, in English, before translating it into machine code.

💡 Assembly language

Assembly language is a low-level programming language that is easier for humans to read than machine language but still very close to the hardware. It uses mnemonics to represent the opcodes of machine instructions, letting programmers write text instructions instead of binary. The video explains that an assembler is the program that converts assembly programs into machine code.

💡 Assembler

An assembler is a software tool that converts programs written in assembly language into machine code, letting programmers use text instructions instead of writing binary directly. The video highlights features such as automatically figuring out jump addresses, which lighten the programmer's workload.

💡 Abstraction

In programming, abstraction means hiding implementation details and exposing only what is needed to solve the problem. The video shows that with high-level languages, programmers create abstractions such as variables without worrying about the underlying registers or memory locations, letting them focus on program logic rather than hardware details.

💡 Compiler

A compiler is a software tool that transforms source code written in a high-level programming language into low-level code a computer can execute, such as assembly or machine code. The video notes that Dr. Grace Hopper built the first compiler, which translated high-level code into machine instructions and greatly improved programming efficiency.

💡 FORTRAN

FORTRAN (short for "Formula Translation") is an early high-level programming language released by IBM in 1957. It had a major impact on early computer programming, making programs far shorter than handwritten assembly. The FORTRAN compiler translated high-level code into machine code; despite the community's initial skepticism about performance, the productivity gains made it an easy choice economically.

💡 COBOL

COBOL (short for "Common Business-Oriented Language") is a high-level language designed for business data processing. The video points out its key feature: cross-platform compatibility, or "write once, run anywhere." Each computing architecture needed its own COBOL compiler, but all of them could accept the same COBOL source code.

💡 Programming language evolution

The video traces the evolution of programming languages from FORTRAN and COBOL in the 1950s to Swift, C#, and Go in the new millennium. Each new language attempts to leverage new abstractions to make some aspect of programming easier or more powerful, or to take advantage of emerging technologies and platforms. This history underscores how programming languages made computing more accessible and useful.

Highlights

This episode, sponsored by CuriosityStream and hosted by Carrie Anne, covers both computer hardware and software.

Early programming at the hardware level was cumbersome and inflexible, so programmers sought a more versatile way to program.

Software is introduced as a "softer" medium for programming computers.

A simple program for the series' CPU illustrates the binary representation of instructions and the operation code (OPCODE).

High-level pseudo-code written on paper was expanded and encoded into binary machine code by hand.

By the late 1940s and into the 1950s, programmers had developed more human-readable higher-level languages and introduced mnemonics.

The assembler converts text instructions into binary machine code.

Assemblers simplified programming by automatically computing jump addresses and supporting labels.

Despite its convenience, assembly language is still tightly coupled to machine code and forces programmers to think about the underlying hardware.

Dr. Grace Hopper designed the high-level language A-0 and built the first compiler in 1952.

A-0 and its later variants were not widely used; FORTRAN, released in 1957, came to dominate early computer programming.

FORTRAN let programmers write far shorter programs, which a compiler translated into machine code.

In 1959, computer experts from industry, academia, and government formed a consortium to develop COBOL, a common programming language usable across different machines.

COBOL introduced "write once, run anywhere," reducing dependence on specific hardware.

High-level languages lowered computing's barrier to entry, letting non-experts incorporate computation into their work.

From the 1960s into the new millennium, programming language design went through a golden era, evolving in lockstep with advances in computer hardware.

Many programming languages followed: ALGOL, LISP, BASIC, Pascal, C, Smalltalk, C++, Objective-C, Perl, Python, Ruby, Java, and more.

New languages aim to leverage new abstractions to make some aspect of programming easier or more powerful, or to exploit emerging technologies and platforms.

Many 1950s languages and compilers ran on only one type of computer; upgrading your computer often meant rewriting all your code.

The ultimate goal of programming is considered to be "plain English" -- speaking what you want, and having the computer figure it out and execute it.

This episode of CrashCourse was brought to you by CuriosityStream, a streaming service full of documentaries and nonfiction titles.

Transcripts

00:03

This episode is brought to you by CuriosityStream.

00:05

Hi, I'm Carrie Anne and welcome to CrashCourse Computer Science! So far, for most of this series, we've focused on hardware -- the physical components of computing -- things like: electricity and circuits, registers and RAM, ALUs and CPUs. But programming at the hardware level is cumbersome and inflexible, so programmers wanted a more versatile way to program computers - what you might call a "softer" medium. That's right, we're going to talk about Software!

INTRO

00:36

In episode 8, we walked through a simple program for the CPU we designed. The very first instruction to be executed, the one at memory address 0, was 0010 1110. As we discussed, the first four bits of an instruction is the operation code, or OPCODE for short. On our hypothetical CPU, 0010 indicated a LOAD_A instruction -- which moves a value from memory into Register A. The second set of four bits defines the memory location, in this case, 1110, which is 14 in decimal. So what these eight numbers really mean is "LOAD Address 14 into Register A". We're just using two different languages. You can think of it like English and Morse Code. "Hello" and ".... . .-.. .-.. ---" mean the same thing -- hello! -- they're just encoded differently. English and Morse Code also have different levels of complexity. English has 26 different letters in its alphabet and way more possible sounds. Morse only has dots and dashes. But, they can convey the same information, and computer languages are similar.

01:33

As we've seen, computer hardware can only handle raw, binary instructions. This is the "language" computer processors natively speak. In fact, it's the only language they're able to speak. It's called Machine Language or Machine Code. In the early days of computing, people had to write entire programs in machine code. More specifically, they'd first write a high-level version of a program on paper, in English, for example... "retrieve the next sale from memory, then add this to the running total for the day, week and year, then calculate any tax to be added" ...and so on. An informal, high-level description of a program like this is called Pseudo-Code. Then, when the program was all figured out on paper, they'd painstakingly expand and translate it into binary machine code by hand, using things like opcode tables. After the translation was complete, the program could be fed into the computer and run.

02:19

As you might imagine, people quickly got fed up with this process. So, by the late 1940s and into the 50s, programmers had developed slightly higher-level languages that were more human-readable. Opcodes were given simple names, called mnemonics, which were followed by operands, to form instructions. So instead of having to write instructions as a bunch of 1's and 0's, programmers could write something like "LOAD_A 14". We used this mnemonic in Episode 8 because it's so much easier to understand! Of course, a CPU has no idea what "LOAD_A 14" is. It doesn't understand text-based language, only binary. And so programmers came up with a clever trick. They created reusable helper programs, in binary, that read in text-based instructions, and assemble them into the corresponding binary instructions automatically. This program is called -- you guessed it -- an Assembler. It reads in a program written in an Assembly Language and converts it to native machine code. "LOAD_A 14" is one example of an assembly instruction.

03:10

Over time, Assemblers gained new features that made programming even easier. One nifty feature is automatically figuring out JUMP addresses. This was an example program I used in episode 8. Notice how our JUMP NEGATIVE instruction jumps to address 5, and our regular JUMP goes to address 2. The problem is, if we add more code to the beginning of this program, all of the addresses would change. That's a huge pain if you ever want to update your program! And so an assembler does away with raw jump addresses, and lets you insert little labels that can be jumped to. When this program is passed into the assembler, it does the work of figuring out all of the jump addresses. Now the programmer can focus more on programming and less on the underlying mechanics under the hood, enabling more sophisticated things to be built by hiding unnecessary complexity.

03:51

As we've done many times in this series, we're once again moving up another level of abstraction.

A NEW LEVEL OF ABSTRACTION!

04:02

However, even with nifty assembler features like auto-linking JUMPs to labels, Assembly Languages are still a thin veneer over machine code. In general, each assembly language instruction converts directly to a corresponding machine instruction – a one-to-one mapping – so it's inherently tied to the underlying hardware. And the assembler still forces programmers to think about which registers and memory locations they will use. If you suddenly needed an extra value, you might have to change a lot of code to fit it in. Let's go to the Thought Bubble.

04:29

This problem did not escape Dr. Grace Hopper. As a US naval officer, she was one of the first programmers on the Harvard Mark 1 computer, which we talked about in Episode 2. This was a colossal, electro-mechanical beast completed in 1944 as part of the allied war effort. Programs were stored and fed into the computer on punched paper tape. By the way, as you can see, they "patched" some bugs in this program by literally putting patches of paper over the holes on the punch tape. The Mark 1's instruction set was so primitive, there weren't even JUMP instructions. To create code that repeated the same operation multiple times, you'd tape the two ends of the punched tape together, creating a physical loop. In other words, programming the Mark 1 was kind of a nightmare!

05:08

After the war, Hopper continued to work at the forefront of computing. To unleash the potential of computers, she designed a high-level programming language called "Arithmetic Language Version 0", or A-0 for short. Assembly languages have direct, one-to-one mapping to machine instructions. But, a single line of a high-level programming language might result in dozens of instructions being executed by the CPU. To perform this complex translation, Hopper built the first compiler in 1952. This is a specialized program that transforms "source" code written in a programming language into a low-level language, like assembly or the binary "machine code" that the CPU can directly process. Thanks, Thought Bubble.

05:45

So, despite the promise of easier programming, many people were skeptical of Hopper's idea. She once said, "I had a running compiler and nobody would touch it. … they carefully told me, computers could only do arithmetic; they could not do programs." But the idea was a good one, and soon many efforts were underway to craft new programming languages -- today there are hundreds!

06:04

Sadly, there are no surviving examples of A-0 code, so we'll use Python, a modern programming language, as an example. Let's say we want to add two numbers and save that value. Remember, in assembly code, we had to fetch values from memory, deal with registers, and other low-level details. But this same program can be written in python like so: Notice how there are no registers or memory locations to deal with -- the compiler takes care of that stuff, abstracting away a lot of low-level and unnecessary complexity. The programmer just creates abstractions for needed memory locations, known as variables, and gives them names. So now we can just take our two numbers, store them in variables we give names to -- in this case, I picked a and b but those variables could be anything - and then add those together, saving the result in c, another variable I created. It might be that the compiler assigns Register A under the hood to store the value in a, but I don't need to know about it! Out of sight, out of mind!

06:55

It was an important historical milestone, but A-0 and its later variants weren't widely used. FORTRAN, derived from "Formula Translation", was released by IBM a few years later, in 1957, and came to dominate early computer programming. John Backus, the FORTRAN project director, said: "Much of my work has come from being lazy. I didn't like writing programs, and so ... I started work on a programming system to make it easier to write programs." You know, typical lazy person. They're always creating their own programming systems. Anyway, on average, programs written in FORTRAN were 20 times shorter than equivalent handwritten assembly code. Then the FORTRAN Compiler would translate and expand that into native machine code. The community was skeptical that the performance would be as good as hand written code, but the fact that programmers could write more code more quickly, made it an easy choice economically: trading a small increase in computation time for a significant decrease in programmer time.

07:46

Of course, IBM was in the business of selling computers, and so initially, FORTRAN code could only be compiled and run on IBM computers. And most programming languages and compilers of the 1950s could only run on a single type of computer. So, if you upgraded your computer, you'd often have to re-write all the code too! In response, computer experts from industry, academia and government formed a consortium in 1959 -- the Committee on Data Systems Languages, advised by our friend Grace Hopper -- to guide the development of a common programming language that could be used across different machines. The result was the high-level, easy to use, Common Business-Oriented Language, or COBOL for short. To deal with different underlying hardware, each computing architecture needed its own COBOL compiler. But critically, these compilers could all accept the same COBOL source code, no matter what computer it was run on. This notion is called write once, run anywhere. It's true of most programming languages today, a benefit of moving away from assembly and machine code, which is still CPU specific.

08:40

The biggest impact of all this was reducing computing's barrier to entry. Before high level programming languages existed, it was a realm exclusive to computer experts and enthusiasts. And it was often their full time profession. But now, scientists, engineers, doctors, economists, teachers, and many others could incorporate computation into their work. Thanks to these languages, computing went from a cumbersome and esoteric discipline to a general purpose and accessible tool. At the same time, abstraction in programming allowed those computer experts – now "professional programmers" – to create increasingly sophisticated programs, which would have taken millions, tens of millions, or even more lines of assembly code.

09:16

Now, this history didn't end in 1959. In fact, a golden era in programming language design jump started, evolving in lockstep with dramatic advances in computer hardware. In the 1960s, we had languages like ALGOL, LISP and BASIC. In the 70's: Pascal, C and Smalltalk were released. The 80s gave us C++, Objective-C, and Perl. And the 90's: python, ruby, and Java. And the new millennium has seen the rise of Swift, C#, and Go - not to be confused with Let it Go and Pokemon Go. Anyway, some of these might sound familiar -- many are still around today. It's extremely likely that the web browser you're using right now was written in C++ or Objective-C. That list I just gave is the tip of the iceberg. And languages with fancy, new features are proposed all the time. Each new language attempts to leverage new and clever abstractions to make some aspect of programming easier or more powerful, or take advantage of emerging technologies and platforms, so that more people can do more amazing things, more quickly.

10:07

Many consider the holy grail of programming to be the use of "plain ol' English", where you can literally just speak what you want the computer to do, it figures it out, and executes it. This kind of intelligent system is science fiction… for now. And fans of 2001: A Space Odyssey may be okay with that. Now that you know all about programming languages, we're going to deep dive for the next couple of episodes, and we'll continue to build your understanding of how programming languages, and the software they create, are used to do cool and unbelievable things. See you next week.

10:34

Hey guys, this week's episode was brought to you by CuriosityStream which is a streaming service full of documentaries and nonfiction titles from some really great filmmakers, including exclusive originals. I just watched a great series called "Digits" hosted by our friend Derek Muller. It's all about the Internet - from its origins, to the proliferation of the Internet of Things, to ethical, or white hat, hacking. And it even includes some special guest appearances… like that John Green guy you keep mentioning in the comments. And Curiosity Stream offers unlimited access starting at $2.99 a month, and for you guys, the first two months are free if you sign up at curiositystream.com/crashcourse and use the promo code "crash course" during the sign-up process.


Related Tags
Computer Science · Programming Languages · Machine Code · Assembly Language · High-Level Languages · Compilers · FORTRAN · COBOL · Abstraction · History · Software Development