Computer Scientist Answers Computer Questions From Twitter

WIRED
29 Aug 2023 · 14:27

Summary

TLDR: In this video, Harvard computer science professor David J. Malan answers questions about how search engines work so fast, the impact of AI on programming jobs, how microchips work, what a computer science degree covers, why computers use binary, the "have you tried restarting?" fix for Windows, how to choose an operating system, computer price trends, cloud computing, how computer memory works, Web3, and the difference between firmware and software. He explains these technical concepts in accessible terms, aiming to raise public awareness of and interest in computer science.

Takeaways

  • 🌐 Search engines work so quickly because they rely on distributed computing, with thousands upon thousands of servers around the world.
  • 🤖 AI is unlikely to replace computer programming jobs in the next 5 to 10 years; instead, it will boost human productivity.
  • 💻 Microchips are the foundation of electronic devices; they can interpret signals, perform mathematical operations, or store information.
  • 🎓 Computer science students learn far more than programming in university, including mathematics, networking, graphics, and other broader areas.
  • 🌐 The internet is built on layers upon layers of ideas: starting from binary digits, layers of abstraction are stacked on one another to make the transmission of information possible.
  • 🔢 Computers use binary rather than ternary because binary is easier to implement and more robust against potential errors.
  • 🔄 Restarting the computer is a blunt but direct fix for common Windows problems because it resets the computer's state.
  • 🚀 The choice of the "best" operating system depends on personal preference and the application at hand; there is no absolute best option.
  • 💰 Computer hardware tends to get cheaper, but because our expectations keep rising, prices have not fallen as much as you might hope.
  • ☁️ Cloud computing lets users rent or timeshare someone else's servers, which is a cost-effective way to use computing resources.
  • 🔌 Computer memory stores and represents data with huge numbers of switches (transistors) that turn the flow of electricity on and off.
  • 🌐 Web3 aims to move from a centralized model to a distributed one, using technologies such as blockchain for distributed storage and collective ownership of data.
  • 📂 Firmware is software embedded in hardware; it is closely tied to the hardware and handles its most basic operations and functions.

Q & A

  • How do search engines work so fast?

    - Search engines achieve their speed through distributed computing. They don't have just one server; they have hundreds, thousands, probably even hundreds of thousands of servers around the world. When you type a keyword into Google or Bing, that keyword is ultimately spread across multiple servers: some grab the first 10 results, others grab the next batches, so what you see is a single merged collection of results. This distributed processing eliminates the bottleneck that could occur if all of the information had to come from one specific server.

  • Will AI take over computer programming jobs within the next 5 to 10 years?

    - Probably not. We have already seen evidence of this: early on, people wrote HTML by hand to build websites, then tools like Dreamweaver came along to help generate that code, and now sites like Squarespace and Wix let users generate a website with a few clicks. In some domains, AI is really just an evolution of that trend; it hasn't put humans out of business so much as it has made humans and AI more productive. AI, and the coming ability to program in natural language, will enhance what we can already do logically, but in a much more mechanical way. Besides, there are so many bugs in the world's software and so many features people wish existed in present and future products that this to-do list is far longer than we could ever finish in our lifetimes. So having AI boost our productivity and work alongside us means that we and the world can solve many more problems together and move forward at an even faster rate.

  • How do microchips work?

    - Microchips sit on what is called a logic board, or motherboard. The board has many ports, such as audio ports, networking ports, and USB ports. Those ports connect to various chips on the board that know how to interpret the signals coming from them. The biggest chip on the motherboard is usually the central processing unit (CPU), the brains of the computer. The board is covered in tiny wires, called traces, that connect all of these microchips. A chip might simply interpret signals coming in from the ports, perform mathematical operations on those signals to convert input into output, or store information. In fact, there are many types of memory on a motherboard, such as RAM and ROM, and some chips exist just to store information.

  • If anyone can learn to code, what do computer scientists study for four years in university?

    - In computer science, computer engineering, or a similar program, students do much more than learn to program. They might study mathematics, networking, graphics, and other fundamentals that go beyond what they learned in middle or high school and can be used to solve grander real-world problems. They might learn how to take ideas from mathematics and other fields to implement their own artificial intelligence, using probability and statistics to predict what an intelligent individual, or a computer, might say in response. Computer science is a very broad field, and programming is really just a tool you learn along the way.

  • Why do computers use binary rather than ternary encoding? Isn't ternary supposed to be faster?

    - A ternary system isn't necessarily faster than a binary one. Binary, using just zeros and ones, is simpler to implement and more robust against potential errors. In terms of voltage levels, as in a battery, it is very easy for a computer to distinguish zero volts from three volts, but if we try to draw lines in between, there is a higher chance the computer mistakes an intermediate voltage such as 1.5 volts as being closer to off or closer to on. Even though ternary might be mathematically efficient, our world now runs on electricity and there is so much momentum behind binary that sticking with it tends to be a net positive.

  • Why is the Windows solution always "Have you tried restarting?", and why does that always work?

    - It is a very heavy-handed fix for what are typically just bugs or mistakes in software. Restarting a computer starts everything from scratch: all of the computer's short-term memory is lost and everything begins exactly the way the programmers at Microsoft intended, without the possible distraction of the computer having gotten into some weird state or condition the programmers simply didn't anticipate.

  • What is the best operating system?

    - In computing this is what we'd call a religious question, because it evokes a religious debate about which is best. The most popular operating systems include Windows and macOS, but there is also one you may not have heard of, called Linux, which is omnipresent in the enterprise world: many of today's servers actually run Linux, while many desktops and laptops run Windows or macOS. There isn't one best operating system; rather, the operating systems people use correlate with the applications they have in mind. The choice sometimes comes down to what is most appropriate, most popular, or best supported, and sometimes simply to an engineer's personal preference.

  • Why aren't computers getting cheaper?

    - Computers, or at least the parts inside them, do tend to get cheaper. The catch is that your expectations and mine keep rising. We want our phones, laptops, and desktops to run more and more software, play more games, and perform ever faster. So even as the parts get less expensive, we want them to do more, be faster, and come in larger quantities, and as a result the price doesn't fall as far as you might hope. That said, for the same amount of money as in years past you now get much more computing power, which in some cases works to our benefit.

  • What is cloud computing? Can you explain it like I'm five?

    - Cloud computing is essentially you using someone else's servers, paying to rent or timeshare them. It isn't a new technology so much as better branding for a technique that has been used for years in both the computing world and the real world: a company like Google, Microsoft, or Apple can afford lots and lots of servers and then make parts of those servers available to me, to you, and to many other customers.

  • How does computer memory work?

    - Think of computer memory as driven by a whole bunch of switches that can be turned on or off. For example, if I turn a light switch off, I can say the switch represents the number zero in binary; if I turn it on, it represents the number one. With a single switch I can only count from zero to one, but if I bring over a second switch, I can set both to off for zero, then get more creative and turn this one off and that one on and claim that this is how a computer's memory represents the number two, and turn the other switch back on, giving a fourth pattern, to represent the number three. Add more and more switches, more and more light bulbs, and we can count higher than three. That is ultimately what computer memory is doing: it uses lots of tiny switches, known as transistors, to turn the flow of electricity on and off, plus other hardware such as capacitors that can hold onto some of that electricity, like a light bulb staying lit.

  • How do you explain Web3 to people?

    - Web3, like Web 2 and, retrospectively, Web 1, is really a buzzword describing phases of the internet, or the World Wide Web as we know it. In the early web, now sometimes called Web version one, information was largely static: if you created a website, you typed up your code and your content, put it on a server somewhere, and people could read it, but it was you, the developer or owner, creating the content for others to read and consume. In Web 2 the world became much more dynamic: websites now tend to have databases and more sophistication, so that much of today's content actually comes from users like you and me. But Web 2 is still very centralized: whether it's Twitter or Facebook, now Meta, or other companies, the data you and I generate, even on social media, is stored on those companies' servers. Web 3.0, or Web3, is about potentially transitioning away from that very centralized model to a more distributed one, where the data we create and consume is spread across multiple servers, for instance via a technique called blockchain, so that there isn't a single owner of the data but rather collective ownership, and with it verification that the data really did come from me and you.

  • What is the difference between firmware and software?

    - Firmware is really a synonym for a type of software. Firmware is just software, but it tends to be software that comes built into your hardware. In the simplest scenario you can think of firmware as software that is completely integrated into the hardware and cannot itself be changed or even upgraded, but that is a bit of an oversimplification, because even firmware, in a computer, a phone, or other devices, can usually be updated. Why? Because firmware is the software closest to the hardware, and in that sense it may well be the most important. If anything goes wrong with the firmware, you might not even be able to turn the device on, whether it's a phone, a computer, or even your refrigerator nowadays.

Outlines

00:00

🤖 Computer Science Support - Distributed Computing and AI

This segment introduces the concept of distributed computing, explaining how search engines such as Google and Bing work so quickly thanks to thousands of servers around the world. It also discusses the impact of artificial intelligence on programming jobs, stressing that AI is a tool that boosts human productivity rather than a replacement for people, and touches on the many bugs and yet-to-be-built features in the world's software and how AI can help us solve problems more efficiently.

05:00

🌐 Building the Internet - Binary and Operating Systems

This part explains how the internet is built on layers of ideas, starting from binary digits and using standardized patterns to represent numbers, letters, colors, and more. It discusses why computers use binary rather than ternary, emphasizing binary's ease of implementation in the real world and its robustness against errors. It also answers why restarting is the go-to fix for common Windows problems and discusses how operating system choice depends on suitability and personal preference.

10:00

💡 Cloud Computing and the Web - Explanations and Future Trends

This section explains cloud computing with a simple analogy, likening it to renting or timesharing servers. It explores how computer memory works, using switches turning on and off to picture binary numbers. It also introduces Web3, the shift of the internet from centralized to distributed, and the difference between firmware and software, before wrapping up the main questions and takeaways of this Computer Science Support video.

Keywords

💡Search engine

A search engine is a web service that lets users find and retrieve information on the internet using keywords or phrases. The video mentions that engines such as Google and Bing use distributed computing to process queries quickly: by deploying hundreds of thousands of servers around the world, a search request can be handled in a distributed way, speeding up the return of results.

💡Artificial intelligence

Artificial intelligence refers to intelligent behavior exhibited by man-made systems that can perform tasks normally requiring human intelligence, such as visual recognition, language recognition, decision-making, and translation. The video notes that AI is not replacing human work but augmenting human capability by boosting productivity.

💡Microchip

A microchip, also known as an integrated circuit, packs thousands of tiny electronic components such as transistors, resistors, and diodes onto a small piece of semiconductor material. Microchips are the core building blocks of modern electronic devices, carrying out computation and data processing.

💡Computer science

Computer science is the study of computers and their applications, spanning areas from algorithms and data structures to software engineering and artificial intelligence. In the video, the professor explains that computer science is much more than programming; it is a broad field in which programming is just one tool learned along the way.

💡Binary

Binary is a number system based on two states, usually written as 0 and 1, and it is the foundation of how computers process information internally. Its advantage is that it simplifies circuit design, since circuits only need to distinguish two voltage states.

💡Internet

The internet is a global computer network that connects billions of devices through a set of standardized communication protocols, allowing them to exchange and share data, video, audio, and other kinds of communication.

💡Operating system

An operating system is the system software that manages a computer's hardware resources and provides an environment for other software to run. It provides services to the applications running on the computer and manages its memory, processes, and all software and hardware resources.

💡Cloud computing

Cloud computing is a service model that delivers computing resources such as servers, storage, databases, networking, and software over the internet. Users can access these resources remotely on demand without buying and maintaining physical servers themselves.

💡Computer memory

Computer memory is the hardware that temporarily stores data and programs so the computer can access and process information quickly. It typically takes the form of random-access memory (RAM), whose contents are lost when the power goes off.

💡Web3

Web3, or Web 3.0, is a proposed next phase of the internet that emphasizes decentralization and the use of blockchain technology. It aims to create a more open, connected network in which data and content are no longer stored centrally on a single server but distributed across many nodes.

💡Firmware

Firmware is software embedded in a hardware device that provides the device's basic functions and control. It is usually required for the device to start up and run, managing low-level functions such as the boot process, device initialization, and hardware error detection.

Highlights

Distributed computing is the secret behind search engines' speed; Google and Bing have thousands upon thousands of servers around the world.

Search engines spread work across many servers to avoid bottlenecks and improve efficiency.

AI won't take over programming jobs in the next 5 to 10 years; it will boost human productivity instead.

The evolution of website-building tools is an example of this trend in web design.

Microchips are among the many chips on a computer's logic board, or motherboard, each responsible for different functions.

The CPU is usually the biggest chip on the board and acts as the computer's brain.

Computer science is much more than programming; it also covers mathematics, networking, graphics, and other broad areas.

The internet is built on layers of ideas, starting from binary and building up increasingly complex representations of data.

Binary is so widely used because it is simple and makes voltage levels easy to distinguish.

Restarting a computer fixes many software problems because it resets the computer's state.

The choice of operating system usually comes down to personal preference, the application at hand, and the level of support.

Although computer hardware tends to get cheaper, rising expectations mean prices haven't fallen much.

Cloud computing is essentially using servers that someone else pays to maintain.

Computer memory works like a collection of switches, storing information by turning the flow of electricity on and off.

Web3 represents a shift of the internet from centralized to distributed, potentially built on blockchain technology.

Firmware is software embedded in hardware; it is closely tied to the hardware and critically important.

Professor David J. Malan supports computer science education by answering questions from Twitter.

Professor David J. Malan stresses that computer science is a broad field and programming is just one tool learned along the way.

Professor David J. Malan explains why computers use binary rather than ternary encoding.

Professor David J. Malan discusses how the best operating system depends on individual needs and preferences.

Transcripts

00:00

- Hello world. My name is Professor David J. Malan, I teach computer science at Harvard, and I'm here today to answer your questions from Twitter. This is "Computer Science Support". [upbeat music]

00:13

First up from tadproletarian, "How do search engines work so fast?" Well, the short answer really is distributed computing, which is to say that Google and Bing, and other such search engines, they don't just have one server and they don't even have just one really big server, rather they have hundreds, thousands, probably hundreds of thousands or more servers nowadays around the world. And so when you and I go into Google or Bing and maybe type in a word to search for like, "cats," it's quite possible that when you hit enter and that keyword like cats is sent over the internet to Google or to Bing, it's actually spread out ultimately across multiple servers, some of which are grabbing the first 10 results, some of which are grabbing the next 10 results, the next 10 results, so that you see just one collection of results, but a lot of those ideas, a lot of those search results came from different places. And this eliminates what could potentially be a bottleneck of sorts if all of the information you needed had to come from one specific server that might very well be busy when you have that question.
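As a rough sketch of that scatter-gather idea, here is a minimal Python example. The shard names, the search_shard function, and the scoring are all invented for illustration; a real search engine's internals are vastly more involved.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical index shards, each holding a slice of the crawled web.
SHARDS = ["shard-us-east", "shard-eu-west", "shard-asia-1"]

def search_shard(shard: str, query: str) -> list[tuple[float, str]]:
    """Pretend to query one shard; returns (score, url) pairs.
    A real engine would do an index lookup here."""
    return [(hash((shard, query, i)) % 100 / 100, f"https://{shard}/result/{i}")
            for i in range(10)]

def distributed_search(query: str, top_k: int = 10) -> list[str]:
    # Scatter: send the query to every shard in parallel.
    with ThreadPoolExecutor(max_workers=len(SHARDS)) as pool:
        partial_results = list(pool.map(lambda s: search_shard(s, query), SHARDS))
    # Gather: merge the partial result lists and keep the best-scoring hits.
    merged = [hit for hits in partial_results for hit in hits]
    merged.sort(key=lambda pair: pair[0], reverse=True)
    return [url for _, url in merged[:top_k]]

if __name__ == "__main__":
    for url in distributed_search("cats"):
        print(url)
```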

01:10

Nick asks, "Will computer programming jobs be taken over by AI within the next 5 to 10 years?" This is such a frequently asked question nowadays and I don't think the answer will be yes. And I think we've seen evidence of this already in that early on when people were creating websites, they were literally writing out code in a language called HTML by hand. But then of course, software came along, tools like Dreamweaver that you could download on your own computer that would generate some of that same code for you. More recently though, now you can just sign up for websites like Squarespace, and Wix, and others whereby click, click, click and the website is generated for you. So I dare say certainly in some domains, that AI is really just an evolution of that trend and it hasn't put humans out of business as much as it has made you and AI much more productive. AI, I think, and the ability soon to be able to program with natural language is just going to enhance what you and I can already do logically, but much more mechanically. And I think too it's worth considering that there's just so many bugs or mistakes in software in the world and there's so many features that humans wish existed in products present and future that our to-do list, so to speak, is way longer than we'll ever have time to finish in our lifetimes. And so I think the prospect of having an artificial intelligence boost our productivity and work alongside us, so to speak, as we try to solve problems, is just gonna mean that you and I and the world together can solve so many more problems and move forward together at an even faster rate.

02:31

All right, next up Sophia, who asks, "How do microchips even work? It's just a green piece of metal." Well, here for instance, we have a whole bunch of microchips on what's called a logic board or sometimes known as a motherboard. There's a lot of ports that you might be familiar with, for instance. Like here's some ports for audio, here's some ports for networking, here's some ports for USB and some other devices as well. And those ports meanwhile are connected to lots of different chips on this board that know how to interpret the signals from those ports. And perhaps the biggest chip on this motherboard tends to be this thing here called the CPU, or the central processing unit, which is really the brains of the computer. And what you can't necessarily quite see, 'cause most of this is actually paint and not traces, but if I flip this around, you'll actually see, in the right light and with the right angle, a whole bunch of traces running up, down, left, and right on this logic board that's connecting all of these various microchips. And by trace, I mean a tiny little wire that's been etched into the top or the bottom of this circuit board that connects two parts they're on. Now, what might these microchips be doing? Well, again, they might be simply interpreting signals that are coming in from these ports, two, they might be performing mathematical operations, doing something with those signals in order to convert input into output, or they might just be storing information ultimately. In fact, there's all different types of memory on a logic board like this, be it RAM, or ROM, or the like, and so some of those chips might very well be storing information for as long as the computer's plugged in, or in some cases, depending on the device, even when the power goes off.

03:56

All right, next a question from Nke_chi. "So if anyone can learn coding, what do computer scientists do for four years in university?" Typically, in an undergraduate program in computer science, or computer engineering, or a similar field, someone spends much more time learning about the field itself than about programming specifically. So as such, you might study not only a bit of programming, but also mathematics, certain fundamentals that transcend the particular classes you might've taken in middle school or high school, but that can be used to solve grander real world problems, you might learn something about networks, how you can send information from point A to point B, you might learn about graphics, how you can display things on the screen or even create interactive animations or the like, you might learn how to leverage certain ideas from mathematics and other fields to implement your very own artificial intelligence nowadays, whereby you use probability and statistics and information more generally to try to predict what an intelligent individual, or in this case computer, might say in response to a question. So computer science itself is a very broad field and programming is really just a tool that you tend to learn along the way.

05:00

From mayashelbyy, "How do zeros and ones turn into the internet?" Well, I think the simplest answer there is that the internet is built upon layers and layers and layers of ideas. And if we start at the lowest of those levels, zeros and ones, you have something called binary where zeros and ones can be used to represent any other numbers as well. And if we use more and more zeros and ones, more and more binary digits or bits so to speak, we can count up higher and higher and higher. And then if you and I agree that all right, well, let's not just use these patterns of zeros and ones to represent numbers, what if we reserve some of these patterns to represent letters of like the English alphabet, and then maybe you and I can decide to reserve certain patterns of zeros and ones to represent colors like red and green and blue and combinations thereof. Well, once we have the ability to represent colors, we could then represent whole pictures, because what's a picture on your phone or a computer screen? Well, it's really just a grid of dots, each of which has its own color. So this is all to say that even if we start at this lowest level of just zeros and ones, so long as you and I and all of the devices we use agree to follow some standard like this, we can build these layers and layers of abstraction, so to speak, on top of one another until finally, you and I come up with a pattern of zeros and ones that represents "Send this piece of information from me over there." And thus, we have something like the internet.
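A small Python sketch of those layers of abstraction, using the real ASCII convention for letters but otherwise arbitrary values: the same kinds of bit patterns get read as a number, then a letter, then a color, and finally as bytes to send somewhere.

```python
bits = "01001000"                          # one byte: eight binary digits

# Layer 1: interpret the bits as a plain number.
number = int(bits, 2)
print(number)                              # 72

# Layer 2: interpret that number as a letter, per the agreed ASCII standard.
letter = chr(number)
print(letter)                              # 'H'

# Layer 3: interpret three such numbers as an RGB color, one byte per channel.
red, green, blue = 72, 150, 200
print(f"#{red:02x}{green:02x}{blue:02x}")  # '#4896c8', one pixel's color

# Layer 4: a picture is just a grid of such colors, and a message like
# "send these bytes over there" is yet another agreed-upon pattern of bits.
message = "HI".encode("ascii")
print(list(message))                       # [72, 73]: the bytes that travel the wire
```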

06:17

majinbuu asks, "Can someone that knows computer science explain to me why computers use binary coding and not trinary when trinary is supposed to be faster?" So it's not necessarily the case that a trinary system, which would use three symbols, for instance, zero, one, and two, would necessarily be faster than binary, because binary, using just zero and one, tends to be simpler to implement and also more robust to potential errors. Or if you're familiar with voltage levels, like in a battery, it's very easy for a computer to distinguish something for like zero volts or three volts, but it gets a little harder if we try to draw the lines somewhere in between, because there's just a higher probability that a computer might mistake a voltage level, like 1.5 in the middle, as maybe being a little closer to off than on or to on than off. Here too is where even though there might be mathematical efficiencies in real world efficiencies to using trinary, otherwise known as ternary, like a zero, a one, and a two digit instead of just zeros and ones, it turns out because our world runs on electricity nowadays and there's so much momentum behind binary that it just tends to be a net positive.
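A toy Python sketch of that robustness argument, with made-up voltage and noise numbers: with two levels the single threshold sits far from both valid voltages, while with three levels each threshold sits much closer to a valid voltage, so the same amount of electrical noise flips digits more often.

```python
import random

V_MAX = 3.0   # assumed supply voltage, made up for this example
NOISE = 0.7   # assumed noise amplitude in volts: harmless for 2 levels, not for 3

def read_binary(voltage: float) -> int:
    # One threshold at the midpoint: anything below 1.5 V reads as 0.
    return 0 if voltage < V_MAX / 2 else 1

def read_ternary(voltage: float) -> int:
    # Two thresholds at 1.0 V and 2.0 V split the range into three bands.
    if voltage < V_MAX / 3:
        return 0
    if voltage < 2 * V_MAX / 3:
        return 1
    return 2

def error_rate(levels: int, reader, trials: int = 100_000) -> float:
    errors = 0
    for _ in range(trials):
        digit = random.randrange(levels)
        ideal = digit * V_MAX / (levels - 1)           # 0/3 V, or 0/1.5/3 V
        noisy = ideal + random.uniform(-NOISE, NOISE)  # add electrical noise
        errors += reader(noisy) != digit
    return errors / trials

print("binary error rate: ", error_rate(2, read_binary))   # stays at 0.0
print("ternary error rate:", error_rate(3, read_ternary))  # middle digit gets misread
```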

07:20

rachaelp95 asks, "Why is every Windows solution, 'Have you tried restarting?' And why does that always work?" So that's a very heavy handed solution to what are typically just bugs or mistakes in software, for instance, Windows in this case. Restarting a computer just starts everything from scratch. So all of the computer's short-term memory is lost and everything starts in pristine condition, which is to say that it starts in exactly the way that the programmers at Microsoft intended without potentially the distractions of the computer being in some weird state or condition that the programmers just didn't anticipate. Maybe you clicked on some buttons in a weird order, maybe you opened a strange file, but maybe you got the computer into a state that just wasn't programmed for properly.
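A toy Python sketch of that "weird state" idea: a program accumulates in-memory state, an unanticipated sequence of actions leaves that state inconsistent, and restarting (here, simply re-creating the object) returns everything to the pristine condition the programmer expected. The MediaPlayer class and its bug are invented purely for illustration.

```python
class MediaPlayer:
    """A toy app whose behavior depends on accumulated in-memory state."""

    def __init__(self):
        # "Restarting" effectively runs this again: pristine condition.
        self.volume = 50
        self.playlist = []
        self.current = None

    def play(self, track: str):
        self.playlist.append(track)
        self.current = track

    def clear_playlist(self):
        # Bug: the programmer forgot to also reset self.current here, so an
        # unanticipated combination of actions leaves stale state behind.
        self.playlist = []

    def now_playing(self) -> str:
        return self.current or "nothing"

player = MediaPlayer()
player.play("song.mp3")
player.clear_playlist()
print(player.now_playing())   # "song.mp3": a state the programmer never expected

player = MediaPlayer()        # the "have you tried restarting?" fix
print(player.now_playing())   # "nothing": everything back to a known-good state
```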

07:59

Jason Witmer now asks, "What's the best operating system?" Well, this is one of these questions in computing we would call a religious question, since it evokes a religious debate as to which might be best. Of course, among the most popular operating systems out there are Windows and macOS, but there's also one you might not have heard of, which is called Linux, which is actually very omnipresent in the enterprise world. So many of today's servers actually run Linux and so many of today's desktops or laptops though run Windows or macOS. Now, that's not to say you couldn't run all of those operating systems in different contexts, and some of us do actually run Linux on our own computers, so a lot of it really boils down to personal preference. I wouldn't even say that there's one best operating system, but there tend to be correlations between the operating systems people use and the applications they have in mind. So Windows, for instance, is so popular in the world of PCs and desktops and laptops. macOS is to some extent, particularly in academia and certain countries, but not necessarily on the same scale. Linux, by contrast, is again, very much used heavily in the server side industry, but so is Windows as well. So a lot of the choice for operating systems sometimes comes from just what's most appropriate, what's most popular, what's most supported, but some of it comes too from just personal preference of the engineer, maybe the first engineer that you hire to make one of those decisions. So it's more about what's best for you and not so much best in general.

09:15

Next, Giulio Magnifico asks, "Why aren't computers getting cheaper?" Well, computers, or at least computer parts inside of computers, do tend to get cheaper. The catch is that your expectations and my expectations just keep rising. We want our phones, our laptops, our desktops to do more and more in the way of the software that they run, the games that we use, and just how quickly they perform for us. So even though some of those parts are getting less expensive, you and I want them to do more and more and be faster and larger in quantity, and so as a result, I dare say, that the price isn't going down as far as you might hope. That said, nowadays you can get, for the same amount of money from yesteryear, much, much more in the way of computing power. So arguably, it's working to our benefit in some cases.

09:58

Next up from DairoNabilah, "Can someone explain cloud computing to me like a five-year-old?" Cloud computing is essentially you using someone else's servers that someone is paying to rent, for instance, or timeshare. So this isn't really a new idea or a new technology, rather it's a better branding of a technique that's been used for years, not just in the computer world, but in the real world as well, whereby someone like Google or Microsoft or Apple or others nowadays might be able to afford lots and lots and lots of servers and then make those servers available in part to me, to you, and many other customers as well.

10:30

"Hey, I'm Marcus." Hey, Marcus. Well, Marcus asks, "How does computer memory work?" Think of computer memory as really being driven by a whole bunch of switches that can either be turned on and off. So for instance, if I take this here light switch, which is currently off, I could simply say that this switch here is representing the number zero in binary. But if I turn the switch on, well now I can say that I'm representing the number one. Now, of course, I can only count as high as zero to one with a single light switch, but what if I bring over a second light switch, like this one here? If we started zero in this way, turn on this switch first and claim that it's one, let me now be more creative and turn this one off and this one on, and now claim this is how a computer's memory could represent the number two. And now if I turn this switch back on, giving me a fourth pattern, this is how I might represent the number three. Now, of course, if we add more and more of these switches, more and more of these light bulbs, we can count even higher than three. And indeed that's what a computer's memory is ultimately doing. It's using lots and lots of little tiny switches, otherwise known as transistors, to turn the flow of electricity on and off, and then it's got other types of hardware called, for instance, capacitors that have a capacity to hold onto some of that electricity just like the light bulb there being on.
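A small Python sketch mirroring that light-switch demonstration: each switch is one bit, and the pattern of on/off switches is read as a binary number. The Switches class is invented purely for this illustration; real memory does this with transistors and capacitors, not Python objects.

```python
class Switches:
    """A row of light switches standing in for bits of memory."""

    def __init__(self, count: int):
        self.state = [False] * count          # all switches start off -> zero

    def set_pattern(self, *on_or_off: bool):
        self.state = list(on_or_off)

    def value(self) -> int:
        # Read the switches as a binary number, leftmost switch most significant.
        total = 0
        for is_on in self.state:
            total = total * 2 + (1 if is_on else 0)
        return total

pair = Switches(2)
pair.set_pattern(False, False); print(pair.value())  # 0: both switches off
pair.set_pattern(False, True);  print(pair.value())  # 1: right switch on
pair.set_pattern(True, False);  print(pair.value())  # 2: left switch on
pair.set_pattern(True, True);   print(pair.value())  # 3: both switches on
```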

11:45

All right, next, Donny asks, "How do you explain Web3 to people?" So Web3, like Web 2 and retrospectively, Web 1, are really just buzzwords that describe sort of phases of the internet or the worldwide web as you and I know it. For instance, back in the day, when there was just the worldwide web, now perhaps referred to as Web version one, information was largely static. If you were to create a website on the internet, you would type up your code, you would type up your content, you would put it on a server somewhere, and someone could read that information, but it was you, the web developer, or you, the owner of the website, that was creating that content for other people to actually read and consume. In Web 2, the world became much more dynamic in recent years whereby now websites tend to have databases and they have more sophistication, so that a lot of the content in websites today are actually coming from me and from you. So if you think of any social media site, it's not the owners of those sites that are creating most of the content, it's you and me as the users of those same websites. But in Web 2, everything is nonetheless very centralized, whether you're Twitter or Facebook, now Meta, or other companies, all of that data, even in the world of social media, that's coming from me and you is actually being stored centrally on those company servers. So Web 3.0 or Web3, so to speak, is really about transitioning away potentially from that very centralized model to one that's more distributed, where the data that you and I are creating, whereby the data you and I are consuming, is actually distributed over multiple servers over a technique called blockchain, for instance in some cases, whereby there's not necessarily one owner of that data, but really collective ownership and therefore verification that the data maybe indeed came from me and you.
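To make that blockchain mention slightly more concrete, here is a minimal hash-chain sketch in Python: each block records the hash of the previous block, so anyone holding a copy can detect whether an earlier record was quietly altered. This is a toy illustration of the general idea, not how any particular Web3 platform actually works.

```python
import hashlib
import json

def make_block(data: dict, prev_hash: str) -> dict:
    """Bundle some data with the hash of the previous block."""
    block = {"data": data, "prev_hash": prev_hash}
    payload = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(payload).hexdigest()
    return block

def verify_chain(chain: list[dict]) -> bool:
    """Anyone holding a copy of the chain can independently check its integrity."""
    for i, block in enumerate(chain):
        expected_prev = "0" * 64 if i == 0 else chain[i - 1]["hash"]
        if block["prev_hash"] != expected_prev:
            return False
        payload = json.dumps(
            {"data": block["data"], "prev_hash": block["prev_hash"]},
            sort_keys=True).encode()
        if hashlib.sha256(payload).hexdigest() != block["hash"]:
            return False
    return True

chain = [make_block({"post": "hello world"}, "0" * 64)]
chain.append(make_block({"post": "my second post"}, chain[-1]["hash"]))
print(verify_chain(chain))           # True: untampered history
chain[0]["data"]["post"] = "edited"  # quietly rewrite an old record...
print(verify_chain(chain))           # False: the tampering is detectable
```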

13:28

Next, a question from gomotigers, "Can someone explain to me the difference between firmware and software? Hardware is physical, software is code, wtf is firmware?" Firmware is really a synonym for a type of software. So firmware is just software, but it tends to be software that comes built into your hardware. And you can think of in the simplest scenario that firmware is software that is just completely integrated into the hardware and itself cannot be changed or even upgraded. But that's a bit of an oversimplification, because even firmware typically, when it comes in a computer, when it comes in a phone, or some other device, can very often be updated. Why? Because the firmware is the software that's really closest to the hardware, and in that sense, it might very well be the most important. And if anything goes wrong with the firmware, you might not even be able to turn that device on, whether it's a phone, a computer, or even your refrigerator nowadays.

14:17

All right, that's all the questions for today. We hope you learned a little something along the way. We'll see you next time.

Related Tags
Search engines, AI impact, how microchips work, computer science, programming education, internet fundamentals, operating systems, cloud computing, technology trends