Inside a Google data center

Google Workspace
16 Dec 2014 · 05:28

Summary

TL;DR: The video tours Google's data centres and their role in the Internet's infrastructure. Joe Kava, Vice-President of Data Centres, discusses their design, operation, and security measures, emphasizing the innovative cooling technologies and the dedication of the team maintaining these facilities 24/7.

Takeaways

  • 🧠 Data centres are the 'brains' and 'engine' of the Internet, housing numerous computers that work together to provide services for companies like Google.
  • 🏢 Data centres are large buildings that require significant power, cooling systems, and specialized hardware to operate efficiently.
  • 🛠️ Joe Kava, Google's Vice-President of Data Centres, oversees the global teams responsible for the design, construction, and operation of Google's data centres, as well as their environmental and safety standards.
  • 🌐 The data centre in South Carolina is part of a global network, emphasizing the interconnectedness of these facilities worldwide.
  • 🔒 Security at Google's data centres is paramount, with a multi-layered approach that includes pre-authorized access lists, biometric verification, and high-level security technologies like laser beams for intrusion detection.
  • 👨‍🔧 The staff at data centres are passionate about their work, maintaining the facilities 24/7 and ensuring continuous operation.
  • 🎉 The data centre team recently celebrated three million man-hours without a lost-time incident, highlighting their commitment to safety.
  • 🌡️ Google operates its data centres at a warmer temperature (around 80 degrees Fahrenheit) to improve efficiency.
  • 🔌 The data centre uses overhead power distribution and custom-designed racks for optimal server performance and efficiency.
  • 🛡️ Google has a rigorous chain-of-custody process for managing server drives, ensuring data security through erasure and physical destruction when necessary.
  • 💧 Innovation in cooling technologies at Google's data centres includes direct contact of server racks with air-conditioning units and a water-based cooling system that transfers heat to an external cooling plant.
  • 🚀 The pace of innovation at Google's data centres is relentless, with a culture that constantly challenges and improves existing practices.

Q & A

  • What is the primary function of a data centre as described in the script?

    -A data centre is described as the 'brains' and 'engine' of the Internet, housing numerous machines that work together to provide the services that make companies like Google function.

  • What is Joe Kava's role at Google?

    -Joe Kava is the Vice-President of Data Centres at Google, responsible for managing the teams that design, build, and operate Google's data centres globally, as well as overseeing environmental health, safety, sustainability, and carbon offsets.

  • How does the script characterize the employees who operate the data centres?

    -The script characterizes the employees as passionate and dedicated, working hard to keep the data centres running 24/7.

  • What is the significance of the 'three-million-man-hour' mark mentioned in the script?

    -The 'three-million-man-hour' mark is a major safety milestone: it means there were no lost-time incidents over a long stretch of time with a large number of people on site.

  • How does Google ensure the security of its data centres?

    -Google ensures data centre security through various layers of security measures, including pre-authorised access lists, biometric iris scanners, and high-level security technologies such as underfloor intrusion detection via laser beams.

  • What is the typical temperature inside a Google data centre, and why is it set this way?

    -The typical temperature inside a Google data centre is about 80 degrees Fahrenheit (roughly 27 degrees Celsius). Google runs its facilities warmer than most because it improves efficiency.

  • What is unique about the power distribution system in Google's data centres?

    -Google's data centres use an overhead power distribution system: high-voltage power is brought in from the yard outside and distributed across bus bars to customised bus taps, which are essentially plugs for the racks' extension cords.

  • How does Google manage the replacement and disposal of failed or outdated drives?

    -Google has an end-to-end chain-of-custody process for managing drives, which includes checking them out from the server, bringing them to an ultra-secure cage for erasure and crushing if necessary, and ensuring they are 100% clean before disposal.

  • What is the innovative approach Google uses for cooling its data centres?

    -Google places its server racks directly against the air-conditioning units. Hot air from the servers is contained in the hot aisle, rises, and passes over copper coils carrying cool water; the heat transfers to the water, which is piped to an external cooling plant, cooled in cooling towers, and returned to the data centre.

  • How does the script describe the pace of innovation at Google's data centres?

    -The script describes the pace of innovation at Google's data centres as remarkable, with a constant challenge to the status quo and a refusal to accept that innovation in certain areas is over.

Outlines

00:00

🌐 The Heart of the Internet: Data Centres

This paragraph introduces data centres as the core infrastructure of the Internet: massive buildings with substantial power and cooling systems, housing rows of computers that work in unison to support services like Google. Joe Kava, Google's Vice-President of Data Centres, describes his role managing the global teams responsible for the design, construction, and operation of these facilities. He also covers environmental and safety responsibilities and the security measures in place, and emphasizes the dedication of the passionate team that operates the data centres around the clock.

05:01

🛠️ Innovation and Excellence in Data Centre Operations

The second paragraph delves into the operational excellence and relentless innovation within Google's data centres. It highlights the team's commitment to safety, evidenced by the achievement of three million man-hours without lost-time incidents. Joe Kava underscores the unparalleled security provided by Google's data centres, attributing this to the top-notch information security team. The paragraph also details the multi-layered security protocols, including biometric verification and laser-based intrusion detection. Furthermore, it touches upon the efficiency-focused design of the data centre's infrastructure, such as the use of warmer operating temperatures and innovative cooling technologies that have evolved multiple times over the years.

Keywords

💡Data Centre

A data centre is a large facility that houses numerous servers, storage systems, and networking equipment. It serves as the backbone of the internet, where data is processed, stored, and managed. In the video's context, data centres are described as the 'brains of the Internet' and the 'engine of the Internet,' highlighting their critical role in enabling online services and operations for companies like Google.

💡Environmental Health and Safety

Environmental health and safety refer to the practices and measures taken to protect the environment and ensure the well-being of individuals within a specific setting. In the video, Joe Kava mentions that his team is responsible for these aspects of Google's data centres, emphasizing the company's commitment to sustainability and the safe operation of its facilities.

💡Sustainability

Sustainability in the context of data centres involves the efficient use of resources and the reduction of environmental impact. The script mentions that the data centre teams are responsible for sustainability, indicating Google's efforts to minimize the carbon footprint of its operations and promote long-term environmental health.

💡Carbon Offsets

Carbon offsets are a method of compensating for the carbon emissions a company produces by investing in projects that reduce or remove greenhouse gas emissions elsewhere. Joe Kava discusses the role of his team in managing carbon offsets for Google's data centres, which is part of the company's broader strategy to mitigate its environmental impact.

💡Information Security

Information security is the practice of protecting digital information from unauthorized access, use, disclosure, disruption, modification, or destruction. The script highlights Google's strong information security team and their role in safeguarding user data, which is a fundamental aspect of the company's data centre operations.

💡Biometric Iris Scanner

A biometric iris scanner is a security device that uses the unique patterns in a person's iris to verify their identity. In the video, Joe Kava demonstrates the use of such a scanner as part of the multi-layered security measures in place at Google's data centres, illustrating the high level of security implemented to protect the facilities.
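
To make the layered model concrete, here is a minimal Python sketch of access checks that get stricter toward the inner zones: a badge pre-authorisation list first, then a biometric match for the secure corridor. The lists, template strings, and function names are illustrative assumptions, not Google's actual access-control system.

```python
# Illustrative sketch of layered access checks: each inner zone requires
# every previous check plus one more. All names and data are hypothetical;
# this is not Google's real access-control logic.

PREAUTHORISED_BADGES = {"badge-001", "badge-002"}           # campus gate list (assumed)
CORRIDOR_BADGES = {"badge-001"}                             # stricter list for the secure corridor
ENROLLED_IRIS_TEMPLATES = {"badge-001": "iris-template-A"}  # biometric enrolment records


def iris_matches(badge_id: str, scanned_template: str) -> bool:
    """Stand-in for a biometric comparison; real systems score similarity."""
    return ENROLLED_IRIS_TEMPLATES.get(badge_id) == scanned_template


def highest_zone_cleared(badge_id: str, scanned_template: str) -> str:
    """Return the innermost zone this visit clears, checking layers in order."""
    if badge_id not in PREAUTHORISED_BADGES:
        return "denied at campus gate"
    if badge_id not in CORRIDOR_BADGES:
        return "campus only"
    if not iris_matches(badge_id, scanned_template):
        # Badge alone is not enough for the secure corridor.
        return "building only"
    return "secure corridor / data centre floor"


print(highest_zone_cleared("badge-001", "iris-template-A"))  # secure corridor / data centre floor
print(highest_zone_cleared("badge-002", "iris-template-X"))  # campus only
```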

💡Efficiency

Efficiency in the context of data centres refers to the optimization of energy and resource use to perform tasks with the least waste possible. The script mentions that Google runs its data centres at a warmer temperature to improve efficiency, demonstrating the company's focus on optimizing performance while reducing energy consumption.

💡Custom Designed Racks

Custom designed racks in data centres are specifically engineered to meet the unique needs of the organization's hardware and infrastructure. The script describes how Google's racks are not traditional and are optimized for hyper-efficiency and high-performance computing, showing the company's commitment to tailoring its infrastructure for maximum operational effectiveness.

💡Chain-of-Custody

A chain-of-custody is a process that documents and verifies the handling of physical or digital evidence or assets to ensure integrity and accountability. The script refers to Google's thorough chain-of-custody process for managing drives, which underscores the company's focus on security and proper handling of sensitive data storage media.
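
As a rough illustration of what such a record might look like, the following Python sketch logs each handling step the video mentions: check-out from the server, delivery to the secure cage, erasure, verification, and crushing/shredding when a drive cannot be verified clean. The class, field names, and states are hypothetical; Google's actual tracking system is not described in the video.

```python
# Hypothetical chain-of-custody record for a retired drive, following the
# steps described in the video. Field names and states are illustrative.

from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class DriveRecord:
    serial: str
    events: list = field(default_factory=list)

    def log(self, action: str) -> None:
        # Every handling step is timestamped so the drive's full custody
        # history can be audited later.
        self.events.append((datetime.now(timezone.utc).isoformat(), action))


def retire_drive(drive: DriveRecord, erase_verified: bool) -> None:
    drive.log("checked out from server")
    drive.log("delivered to secure cage")
    drive.log("erased")
    if erase_verified:
        drive.log("verified 100% clean")
    else:
        # Drives that cannot be verified clean are physically destroyed.
        drive.log("crushed")
        drive.log("shredded in industrial shredder")


record = DriveRecord(serial="HDD-12345")
retire_drive(record, erase_verified=False)
for timestamp, action in record.events:
    print(timestamp, action)
```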

💡Innovation

Innovation in the context of the video refers to the continuous development and implementation of new ideas and technologies to improve existing processes or create new ones. Joe Kava expresses his amazement at the pace of innovation within Google's data centres, indicating that the company is always pushing boundaries and challenging the status quo in its pursuit of excellence.

💡Cooling Technologies

Cooling technologies in data centres are systems and methods used to dissipate the heat generated by servers and other equipment. The script discusses the evolution of Google's cooling technologies, including a unique approach where server racks are placed against air-conditioning units for direct heat exchange, highlighting the company's innovative approach to thermal management.
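
For a sense of the physics behind the coil design, here is a back-of-the-envelope heat balance (heat removed = mass flow × specific heat × temperature rise) as a short Python snippet. The heat load and the water temperature rise are assumed values chosen purely for illustration; the video gives no such figures.

```python
# Back-of-the-envelope heat balance for water-cooled coils: the heat picked
# up by the water equals m_dot * c_p * delta_T. All numbers are hypothetical
# assumptions for illustration only.

HEAT_LOAD_W = 250_000          # assumed IT heat load in one hot aisle, in watts
WATER_CP_J_PER_KG_K = 4186     # specific heat of water
DELTA_T_K = 10                 # assumed water temperature rise across the coils

# Mass flow of water needed to carry away the assumed heat load.
mass_flow_kg_s = HEAT_LOAD_W / (WATER_CP_J_PER_KG_K * DELTA_T_K)

print(f"Required water flow: {mass_flow_kg_s:.1f} kg/s "
      f"(about {mass_flow_kg_s:.1f} L/s, since 1 kg of water is roughly 1 L)")
```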

Highlights

A data centre is described as the brains and engine of the Internet.

Google data centres are large buildings with significant power, cooling, and computer resources.

Joe Kava, Vice-President of Data Centres at Google, manages global teams responsible for designing, building, and operating data centres.

Google's data centres emphasize environmental health, safety, sustainability, and carbon offsets.

Only a small percentage of Google employees are authorized to enter data centre campuses.

Data centre employees work passionately 24/7 to maintain operations.

Google provides a fun work environment to balance the hard work of data centre employees.

Google data centres achieved three million man-hours without lost-time incidents, a significant safety milestone.

Google data centres offer a level of security few companies can match, with progressively stricter layers of security closer to the centre of the campus.

Advanced security measures include biometric iris scanners and underfloor intrusion detection via laser beams.

Google runs data centres warmer than most, around 80 degrees Fahrenheit, for efficiency.

Custom-designed server racks optimize hyper-efficiency and high-performance computing.

Failed drives are thoroughly managed, erased, and physically destroyed if necessary to ensure data security.

Google's cooling technology has evolved multiple times; hot-aisle air is now cooled by copper coils carrying cool water, which carries the heat out to an external cooling plant.

Innovation is a constant at Google data centres, always challenging and improving current processes.

Transcripts

[00:00] MALE SPEAKER 1: A data centre's the brains of the Internet.

[00:02] MALE SPEAKER 2: The engine of the Internet.

[00:04] FEMALE SPEAKER 1: It is a giant building with a lot of power, a lot of cooling and a lot of computers.

[00:10] MALE SPEAKER 3: It's row, upon row, upon row of machines, all working together to provide the services that make Google function.

[00:17] JOE KAVA: I love building and operating data centres. I'm Joe Kava, Vice-President of Data Centres at Google. I'm responsible for managing the teams globally that design, build and operate Google's data centres. We're also responsible for the environmental health and safety, sustainability and carbon offsets for our data centres. This data centre, here in South Carolina, is one node in a larger network of data centres all over the world. Of all the employees at Google, a very, very small percentage of those employees are authorised to even enter a data centre campus. The men and women who run these data centres and keep them up 24 hours a day, seven days a week, they are incredibly passionate about what they're doing.

[01:00] MALE SPEAKER 2: In layman's terms, what do I do here?

[01:03] FEMALE SPEAKER 1: I typically refer to myself as the herder of cats.

[01:06] MALE SPEAKER 4: I'm an engineer.

[01:07] MALE SPEAKER 3: Hardware site operations manager.

[01:09] MALE SPEAKER 2: We keep the lights on.

[01:10] MALE SPEAKER 1: And we enjoy doing it.

[01:12] JOE KAVA: And they work very hard, so we like to provide them with a fun environment where they can also play hard as well.

[01:18] FEMALE SPEAKER 2: We just went past the three-million-man-hour mark for zero lost-time incidents. Three million man-hours is a really long time, and with the number of people we have on site, that is an amazing accomplishment.

[01:35] JOE KAVA: I think that the Google data centres really can offer a level of security that almost no other company can match. We have an information security team that is truly second to none. You have the expression, "they wrote the book on that." Well, there are many of our information security team members who really have written the books on best practices in information security. Protecting the security and the privacy of our users' information is our foremost design criterion. We use various layers of higher-level security the closer into the centre of the campus you get. So, just to enter this campus, my badge had to be on a pre-authorised access list. Then, to come into the building, that was another level of security. To get into the secure corridor that leads to the data centre, that's a higher level of security. And the data centre and the networking rooms have the highest level of security. And the technologies that we use are different. Like, for instance, in our highest-level areas, we even use underfloor intrusion detection via laser beams. So, I'm going to demonstrate going into the secure corridor now. One, my badge has to be on the authorised list. And then two, I use a biometric iris scanner to verify that it truly is me.

[02:48] OK, here we are on the data centre floor. The first thing that I notice is that it's a little warm in here. It's about 80 degrees Fahrenheit. Google runs our data centres warmer than most because it helps with the efficiency. You'll notice that we have overhead power distribution. Coming from the yard outside, we bring in the high-voltage power distributed across the bus bars to all of the customised bus taps that are basically plugs, where we plug in all the extension cords. All of our racks don't really look like a traditional server rack. These are custom designed and built for Google so that we can optimise the servers for hyper-efficiency and high-performance computing. It's true that sometimes drives fail, and we have to replace them to upgrade them, because maybe they're no longer efficient to run. We have a very thorough end-to-end chain-of-custody process for managing those drives from the time that they're checked out from the server til they're brought to an ultra-secure cage, where they're erased and crushed if necessary. So any drive that can't be verified as 100% clean, we crush it first and then we take it to an industrial wood chipper, where it's shredded into these little pieces like this.

[04:02] In the time that I've been at Google – for almost six and a half years now – we have changed our cooling technologies at least five times. Most data centres have air-conditioning units along the perimeter walls that force cold air under the floor. It then rises up in front of the servers and cools the servers. With our solution, we take the server racks and we butt them right up against our air-conditioning unit. We just use cool water flowing through those copper coils that you see there. So the hot air from the servers is contained in that hot aisle. It rises up, passes across those coils, where the heat from the air transfers to the water in those coils, and then that warm water is then brought outside the data centre to our cooling plant, where it is cooled down through our cooling towers and returned to the data centre. And that process is just repeated over and over again.

[04:57] To me, the thing that amazes me about Google and the data centres is the pace of innovation and always challenging the way we're doing things. So, when people say that innovation in a certain area is over, that we've kind of reached the pinnacle of what can be achieved, I just laugh.

[05:18] [MUSIC PLAYING]

Related tags
Data Centers, Google, Innovation, Security, Efficiency, Sustainability, Carbon Offsets, Cooling Tech, Hardware Ops, Environmental Health