Google Data Center Efficiency Best Practices Full Video

Haitian Tang
3 Oct 2020 · 10:01

Summary

TL;DR: Google's data centers, crucial for delivering web services, are designed for energy and resource efficiency. Innovations in design have reduced energy usage by half compared to typical centers. Key practices include optimizing Power Usage Effectiveness (PUE), managing airflow to prevent hot and cold air mixing, increasing cold aisle temperatures, utilizing free cooling, and minimizing power conversion stages. These strategies not only save millions in energy costs but are also applicable to data centers of all sizes, promoting sustainability.

Takeaways

  • 🌐 Google prioritizes data center efficiency to deliver web services with minimal resource use.
  • 🔍 Data centers can range from small closets to large warehouse-scale buildings filled with servers.
  • 💡 Google has innovated in data center design to reduce energy, water, and other resource consumption.
  • 🌡️ The ICT sector, including data centers, contributes about 2% to global greenhouse gas emissions, with data centers accounting for about 15% of that share (roughly 0.3% of global emissions).
  • 🛠️ Energy-efficient design choices and best practices can significantly improve data center performance and reduce energy use.
  • 📈 Google's efforts in data center efficiency have resulted in substantial energy savings, amounting to millions of dollars.
  • 🔬 Measuring Power Usage Effectiveness (PUE) is crucial for managing data center efficiency and should be done continuously.
  • 🚫 Proper air flow management, including eliminating hot and cold air mixing, is essential for reducing energy load.
  • 🌡️ Increasing the temperature in the cold aisle saves on energy costs; ASHRAE guidelines support operating up to 80 degrees Fahrenheit.
  • 🌡️ Free cooling, which utilizes ambient temperatures, can significantly reduce the need for energy-intensive cooling equipment.
  • 💧 Evaporative cooling and sea water cooling are examples of efficient water use in data centers, leading to substantial water savings.
  • 🔌 Minimizing power conversion stages and using efficient power supplies can save energy and reduce costs in data centers.
  • 🔄 Implementing these efficiency practices is cost-effective and can be applied to both small and large data centers.

Q & A

  • Why are data centers crucial for Google's operations?

    -Data centers are essential for Google as they deliver all of their web services to users. They can range from small closets with a few machines to large-scale buildings optimized for power use and IT computing, housing thousands of servers.

  • What has Google done to innovate data center design and construction?

    -Google has spent a significant amount of time innovating the design and construction of their data centers to minimize energy, water, and other resource usage, focusing on sustainability and efficiency.

  • How much energy does the ICT sector contribute to global greenhouse gas emissions?

    -The ICT sector, which includes mobile phones, computers, monitors, cell phone towers, etc., contributes roughly 2% of global greenhouse gas emissions.

  • What is the significance of the data center's energy usage in the ICT sector's total emissions?

    -The data center portion of the ICT sector is responsible for about 15% of the total emissions within the sector.

  • What is the Power Usage Effectiveness (PUE) and why is it important?

    -PUE is the ratio of total facility energy to IT equipment energy within a data center. It measures how effectively power and cooling are delivered to the IT equipment. A lower PUE indicates more efficient energy use and is desirable.
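
    As an illustration beyond the video itself, PUE can be computed directly from two meter readings. The Python sketch below uses hypothetical wattage values.

    ```python
    def pue(total_facility_watts: float, it_equipment_watts: float) -> float:
        """Power Usage Effectiveness: total facility power / IT equipment power.

        1.0 is the theoretical ideal (zero overhead); 2.0 means one watt of
        power-delivery and cooling overhead per watt of IT load.
        """
        if it_equipment_watts <= 0:
            raise ValueError("IT load must be positive")
        return total_facility_watts / it_equipment_watts

    # Hypothetical readings: 1,160 kW at the utility meter, 1,000 kW at the racks.
    print(pue(1_160_000, 1_000_000))  # 1.16
    ```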

  • What was the typical PUE of an enterprise data center in 2006 and what has changed since then at Google?

    -In 2006, the typical PUE of an enterprise data center was 2.0. Google has continuously improved its PUE since then, reaching a trailing twelve-month (TTM) PUE of 1.16.

  • Why is it important to measure PUE frequently and over time?

    -Frequent and time-based measurement of PUE provides meaningful results that reflect the actual operation of the data center. Snapshots in time can be misleading and do not accurately represent the data center's efficiency.
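
    A minimal sketch of how frequent samples become a meaningful trailing figure: weight by energy rather than averaging instantaneous snapshots. The sampling interval, data, and function name below are hypothetical.

    ```python
    def trailing_pue(samples: list[tuple[float, float]], interval_hours: float) -> float:
        """Energy-weighted PUE over (facility_kW, it_kW) samples at a fixed interval.

        Summing energy before dividing weights each sample by actual load, unlike
        a simple average of instantaneous PUE snapshots, which can mislead.
        """
        facility_kwh = sum(f * interval_hours for f, _ in samples)
        it_kwh = sum(i * interval_hours for _, i in samples)
        return facility_kwh / it_kwh

    # Hypothetical hourly samples for one day: overhead rises in the warm afternoon.
    day = [(1180.0 if 10 <= h < 18 else 1150.0, 1000.0) for h in range(24)]
    print(round(trailing_pue(day, 1.0), 3))  # 1.16
    ```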

  • How does Google manage PUE measurements in their data centers?

    -Google incorporates PUE measurements into their building management system across all sites, ensuring easy access to data and enabling efficient operation of their data centers.

  • What is the significance of managing air flow in a data center?

    -Managing air flow is crucial for reducing the data center's energy load. It involves eliminating the mixing of hot and cold air, which can be achieved through various containment approaches and CFD analysis.
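
    CFD aside, a quick sanity check on airflow uses the standard sensible-heat rule of thumb for air (CFM ≈ 3.16 × watts / ΔT°F at sea level). The sketch below, with hypothetical loads, shows why a wider hot/cold differential, i.e. less mixing, means less air to move.

    ```python
    def required_cfm(it_load_watts: float, delta_t_f: float) -> float:
        """Airflow needed to remove a heat load at a given hot/cold differential.

        Rule of thumb for sea-level air: CFM ~= 3.16 * watts / delta-T (deg F).
        Better containment widens delta-T, so the same load needs less air
        (and less fan energy).
        """
        return 3.16 * it_load_watts / delta_t_f

    # Hypothetical 200 kW room: containment widens delta-T from 10F to 25F.
    print(round(required_cfm(200_000, 10)))  # ~63,200 CFM
    print(round(required_cfm(200_000, 25)))  # ~25,280 CFM
    ```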

  • How does Google use thermal modeling to improve air flow in their data centers?

    -Google uses thermal modeling to identify hot spots and understand how air flow is directed in their data centers. This helps them make simple design choices to improve air flow, such as using sheet metal extensions to increase intake to CRACs (Computer Room Air Conditioners).

  • What is the role of blanking panels in a data center and how do they contribute to energy savings?

    -Blanking panels are used in the rack space where not all equipment is populated. They help create a tight environment, similar to weatherizing a house, preventing air from bypassing the IT equipment and thus improving cooling efficiency and saving energy costs.

  • Why is raising the temperature in the cold aisle of a data center beneficial?

    -Raising the temperature in the cold aisle allows for more efficient cooling and reduces the need for heavy energy-consuming equipment like chillers. ASHRAE recommends running up to 80 degrees Fahrenheit, which Google implements, saving thousands of dollars in energy costs annually.
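
    A rough sketch of the arithmetic behind that claim; the cooling share, percent-per-degree figure, and electricity price below are assumptions, not numbers from the video.

    ```python
    # Hypothetical estimate for a small room like the ~200 kW one in the video.
    IT_LOAD_KW = 200
    COOLING_SHARE = 0.30        # assumed: cooling draws ~30% of the IT load
    SAVINGS_PER_DEG_F = 0.015   # assumed: ~1.5% of cooling energy per degree F raised
    PRICE_PER_KWH = 0.10        # assumed electricity price, USD

    cooling_kw = IT_LOAD_KW * COOLING_SHARE
    saved_kw = cooling_kw * SAVINGS_PER_DEG_F * (80 - 72)   # raising 72F -> 80F
    annual_usd = saved_kw * 8760 * PRICE_PER_KWH
    print(f"~${annual_usd:,.0f} per year")  # ~$6,300: "thousands of dollars"
    ```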

  • What is free cooling and how does Google utilize it in their data centers?

    -Free cooling is the practice of using ambient temperatures outside the data center for cooling without operating energy-intensive equipment. Google uses free cooling in all their data centers, which is reflected in their PUE data and results in significant efficiency gains.
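
    A first-pass feasibility check for free cooling is simply counting the hours the ambient air is cool enough; the sketch below assumes hourly temperature data and a hypothetical heat-exchanger approach margin.

    ```python
    import math

    def free_cooling_fraction(hourly_temps_f: list[float], setpoint_f: float,
                              approach_f: float = 5.0) -> float:
        """Fraction of hours cool enough to reject heat without running chillers.

        `approach_f` is the assumed margin a heat exchanger needs between the
        outside air and the supply air it produces.
        """
        usable = sum(1 for t in hourly_temps_f if t <= setpoint_f - approach_f)
        return usable / len(hourly_temps_f)

    # Hypothetical temperate climate: a sinusoidal year swinging 30F..80F.
    year = [55 + 25 * math.sin(2 * math.pi * (h - 2190) / 8760) for h in range(8760)]
    print(f"{free_cooling_fraction(year, setpoint_f=80):.0%}")  # ~80% of hours
    ```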

  • How does Google implement evaporative cooling in their European data centers?

    -In Belgium, Google uses evaporative towers without chillers, taking advantage of local ambient conditions. In Finland, they use sea water cooling from the Bay of Finland, which cools the servers and then is tempered before returning to the bay, maintaining environmental balance.
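
    The net water claim is straightforward arithmetic; a sketch with a hypothetical on-site usage figure:

    ```python
    # Per the video: each gallon evaporated on site avoids ~2 gallons of water
    # use on the energy production side, because less electricity is needed.
    onsite_gallons = 200_000_000                    # hypothetical annual on-site use
    avoided_at_plant = 2 * onsite_gallons
    net_saved = avoided_at_plant - onsite_gallons
    print(f"net water saved: {net_saved:,} gallons/year")  # 200,000,000
    ```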

  • What are some of the energy-efficient practices Google has implemented in their data centers?

    -Google has implemented several practices, including eliminating conversion stages in the power path, using battery-on-board trays, and ensuring AC-DC power supplies are energy efficient, which collectively save over $30 per year per server.
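
    Losses compound across cascaded conversions, which is why removing stages pays off. The per-stage efficiencies below are illustrative assumptions, not figures from the video.

    ```python
    from math import prod

    def chain_efficiency(stage_efficiencies: list[float]) -> float:
        """End-to-end efficiency of cascaded power conversions (losses compound)."""
        return prod(stage_efficiencies)

    # Conventional path through a double-conversion UPS: AC->DC rectifier,
    # DC->AC inverter, then the server's own AC->DC supply (assumed 95% each).
    ups_path = chain_efficiency([0.95, 0.95, 0.95])

    # Battery-on-the-tray approach: the UPS round trip drops out of the main path,
    # leaving one efficient AC->DC supply (assume an Energy Star unit at 92%).
    direct_path = chain_efficiency([0.92])

    print(f"conventional: {ups_path:.1%}  on-board battery: {direct_path:.1%}")
    # conventional: 85.7%  on-board battery: 92.0%
    ```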

  • How do the efficiency best practices mentioned in the script apply to data centers of different sizes?

    -The efficiency best practices are applicable to both small and large data centers. They involve following five key steps to reduce energy use, making them a universal approach for improving data center efficiency.

Outlines

00:00

🌐 Energy Efficiency in Google's Data Centers

This paragraph discusses the significance of data centers at Google and their role in delivering web services. It highlights Google's efforts to innovate in data center design to reduce energy, water, and other resource usage. The speaker emphasizes the importance of adhering to best practices for energy efficiency, which have resulted in Google using half the energy of a typical data center. The paragraph introduces the concept of Power Usage Effectiveness (PUE) as a measure of energy efficiency and mentions Google's impressive PUE of 1.16, which indicates a highly efficient data center operation. The speaker also stresses the importance of continuous measurement for meaningful results and the incorporation of PUE measurements into building management systems.

05:01

🛠️ Best Practices for Data Center Efficiency

The second paragraph delves into specific best practices for improving data center efficiency. It starts with managing air flow to prevent the mixing of hot and cold air, using computational fluid dynamics (CFD) analysis to model and improve airflow. The use of cost-effective solutions like meat locker curtains and sheet metal doors to create hot and cold aisle containment is mentioned. The paragraph also addresses the misconception that data centers need to be kept cold, citing ASHRAE guidelines that recommend operating temperatures up to 80 degrees Fahrenheit, which Google follows. The benefits of free cooling, which utilizes ambient temperatures for cooling without heavy energy-consuming equipment, are discussed, along with the use of evaporative cooling and sea water cooling in Google's European data centers. The paragraph concludes with the importance of minimizing power conversion stages and using efficient power supplies to save energy, and it underscores the cost-effectiveness and applicability of these best practices to data centers of all sizes.

Keywords

💡Data Centers

Data Centers are facilities that house a large number of servers and networking equipment, crucial for delivering web services to users. In the video's context, they are central to Google's operations and are designed to be energy and resource-efficient. The script mentions that Google has innovated in the design and construction of these facilities to minimize their environmental impact.

💡Energy Efficiency

Energy Efficiency refers to the measure of how well a device, system, or process uses energy to deliver services. The video emphasizes Google's efforts to reduce energy consumption in their data centers, which is a key aspect of their sustainability practices. The script provides examples of how Google has achieved this, such as optimizing power usage effectiveness (PUE).

💡Power Usage Effectiveness (PUE)

PUE is a metric that indicates the efficiency of a data center's power usage, calculated as the ratio of total facility energy to IT equipment energy. A lower PUE indicates more efficient power and cooling delivery to IT equipment. The script explains that Google aims to reduce PUE to as close to 1.0 as possible, which represents ideal efficiency.

💡ICT Sector

ICT stands for Information and Communication Technology. The script mentions that the ICT sector, which includes devices like mobile phones and infrastructure like cell phone towers, contributes about 2% to global greenhouse gas emissions. Data centers are a significant part of this sector, and their energy efficiency is critical to reducing the sector's environmental impact.

💡Airflow Management

Airflow Management is the process of controlling the movement of air in a data center to ensure optimal cooling of IT equipment. The script discusses the importance of eliminating the mixing of hot and cold air for efficient cooling. Google uses techniques like CFD analysis and physical barriers to manage airflow effectively.

💡Hot and Cold Aisles

In data center design, hot aisles and cold aisles refer to the organization of racks to separate hot air exhaust from cold air intake, enhancing cooling efficiency. The script describes how Google uses curtains and sheet metal doors to create physical separation between hot and cold aisles, improving airflow and reducing energy use.

💡Blanking Panels

Blanking Panels are used in data centers to cover empty spaces in server racks, preventing the mixing of hot and cold air and improving cooling efficiency. The script mentions that Google uses these panels to 'weatherize' their data centers, likening the practice to sealing gaps in a house to improve energy efficiency.

💡Free Cooling

Free Cooling is a method of using outside ambient temperatures to cool a data center without the need for energy-intensive mechanical cooling systems like chillers. The script explains that Google utilizes free cooling at all of its data centers, which results in significant energy savings and efficiency gains.

💡Evaporative Cooling

Evaporative Cooling is a technique that uses the evaporation of water to cool air, which is then used to cool IT equipment in a data center. The script describes Google's use of evaporative cooling in Belgium and of sea water cooling in Finland, highlighting the water and energy savings achieved through these methods.

💡UPS System

UPS stands for Uninterruptible Power Supply, a system that provides emergency power to a data center in the event of a power outage. The script discusses the energy losses associated with traditional UPS systems due to multiple conversion stages between AC and DC power. Google has innovated by implementing battery on board trays to eliminate these conversion stages, increasing efficiency.

💡AC-DC Power Supply

AC-DC Power Supply refers to the conversion of alternating current (AC) to direct current (DC), which is required by most IT equipment. The script mentions that by ensuring the efficiency of AC-DC power supplies, significant energy savings can be achieved. Google saves over $30 per year per server by implementing these power efficiency measures.

Highlights

Google data centers are critical for delivering web services.

Data centers range from small closets with a few machines to large warehouse-scale buildings.

Google has innovated the design and construction of data centers to minimize energy, water, and resource use.

Google's data centers use half the energy of typical data centers.

The ICT sector is responsible for about 2% of global greenhouse gas emissions, with data centers contributing 15% of that.

Implementing best practices in energy efficiency can significantly improve data center performance.

Google's energy efficiency practices save millions of dollars annually.

Power Usage Effectiveness (PUE) is crucial for measuring data center efficiency, with Google achieving a PUE of 1.16.

Regular PUE measurements are important for accurate and meaningful results.

Airflow management, particularly eliminating the mixing of hot and cold air, is key to reducing energy load.

CFD analysis helps model airflow and make design improvements in data centers.

Simple retrofits, like meat locker curtains and sheet metal doors, can significantly enhance airflow management.

Raising cold aisle temperatures to recommended levels (up to 80°F) can save thousands of dollars in energy costs.

Free cooling, utilizing ambient temperatures, can yield substantial efficiency gains.

Eliminating unnecessary power conversion stages and using efficient components can reduce energy consumption.

Google's on-board battery approach eliminates multiple conversion steps, enhancing energy efficiency.

Efficient power supplies, such as those with Energy Star labels, can further reduce energy use.

Google saves over $30 per server annually through these efficiency measures.

Many efficiency practices are cost-effective within 12 months and applicable to both small and large data centers.

Transcripts

00:00

[Music]

00:08

Here at Google, data centers are very, very important to us. They are how we deliver all of our web services to all of our users. And "data center" can mean a variety of things, from a small closet filled with a couple of machines all the way to very large warehouse-scale buildings that are optimized for power use and IT computing and filled with thousands of servers. At Google, we've spent a lot of time innovating the way in which we design and build these facilities to minimize the amount of energy and water and other resources that these computing facilities use. In terms of the results of all the work that we've been doing over many, many years: we now use half the energy of the typical data center. To put things into perspective, the entire ICT sector, which includes mobile phones, computers, monitors, and cell phone towers, is roughly two percent of global greenhouse gas emissions. Of that two percent, the data center portion is responsible for about 15 percent.

01:08

There are design choices that you can make for energy efficiency that improve the performance of your data center, and these things are just best practices. Adhering well to best practices is how you can actually make the most improvements in terms of energy use. The results of these types of activities return Google millions of dollars in energy savings, so the results are significant. We've invited several members of our data center team here to explain some of these best practices to all of you.

01:42

The first step in managing the efficiency of your data center is to make sure you have the instrumentation in place to measure the PUE, or power usage effectiveness. PUE is the ratio of total facility energy to IT equipment energy within your data center. It's a measure of how effectively you deliver power and cooling to the IT equipment. In 2006, the typical PUE of an enterprise data center was 2.0, which means for every one watt of IT energy consumed, one watt of overhead was consumed by the facility to deliver the power and cooling. Reducing the overhead is really what you want; you want PUE to get as close to 1.0 as possible. Over the last 12 months our TTM PUE was 1.16. We've continuously measured that, and it has gone down nearly every quarter since we began reporting it back in 2008. Last quarter our lowest data center was 1.09.

02:34

Ideally you should measure it as fast as you can, as often as you can, every second or so; the more often you can measure it, the more meaningful the results will be. It's important that you measure PUE over the course of a year, annually or quarterly, to get meaningful results. If you just take snapshots in time, the information won't be realistic, and it won't really be an actual measure of how well your data center is operating. One way to make it easier to manage is to incorporate the PUE measurements into your building management system. We do this at all of our sites at Google. Without having easy access to this data, we wouldn't be able to operate our data centers as efficiently as we do.

03:19

Once you have the ability to measure and manage your PUE, the first step in terms of reducing your data center energy load is to focus on the management of the airflow. The most important thing here is to eliminate the mixing of the hot and the cold air, and there's no one right way to do this; containment can be achieved through many different approaches. One thing that we found very useful at Google is CFD analysis, to see where your hot spots are and how the airflow is actually going to be directed in your data center. By doing so you can model the way in which the air will flow, and it helps you make very simple design choices to improve the airflow in your data center. For example, in one of our computing and networking rooms (we call them CNRs), we did some thermal modeling to see exactly what the airflow was doing. Through that modeling we realized that the intake to our CRACs was too low, and that by simply piecing together some sheet metal we could create extensions that would dramatically improve the airflow into the CRACs. We also did a bunch of other retrofits. Here in this corporate data center at Google, we've implemented meat locker curtains that are very inexpensive and easy to install. These are hung from the overhead structure, and they separate the cold aisle, which is actually hot, from the hot aisle, which is actually hotter. We are set now to enter through the hot aisle containment door; we incorporated these simple, inexpensive sheet metal doors to separate the cold aisle from the hot aisle very tightly. Now over here, we've got the hot air from the racks coming up, going overhead through the return air plenum back to the CRAC units, to give you a nice high temperature differential across your CRACs.

05:05

A very important step is to blank off the rack space where you don't quite have all of your equipment populated, and it's very easy to do with these blanking panels. It's almost like weatherizing your house: make sure that you've got a nice tight environment. All told, we spent about twenty-five thousand dollars in parts, and those twenty-five thousand dollars saved us over sixty-five thousand dollars in energy costs yearly.

05:33

[Music]

05:37

Once you manage your airflow properly, the next step in data center efficiency is to increase the temperature of your cold aisle. It's long been believed by many data center operators that the data center has to be cold to keep all the equipment at a temperature that it will run safely at, and in fact that's just false. If you look at the recommended guidelines from ASHRAE, they recommend running all the way up to 80 degrees Fahrenheit, and at Google that's exactly what we do. We've got a small corporate data center here, about 200 kilowatts of load. Simply raising the temperature from 72 degrees to 80 degrees saves us thousands of dollars in energy costs every single year. What's nice about that is it also allows our employees to come to work in shorts.

06:20

[Music]

06:25

Whenever possible, we'd recommend people free cool. Free cooling means utilizing ambient temperatures outside of your data center to provide cooling without operating very heavy energy-consuming equipment like chillers. We use free cooling at all of our data centers, and you can see this in our publicly reported PUE data, where the PUE values go up in the summertime and down in the wintertime. This is just a reality of running our operations with free cooling, and it yields tremendous efficiency gains. In Europe we have two data centers that have no chillers whatsoever; we're able to take advantage of the local constraints and conditions. In Belgium we use evaporative towers without any chillers, given the ambient conditions. In Finland we use sea water cooling: sea water from the Bay of Finland cools the servers, and then we temper the water returning to the Bay of Finland so there are no temperature gradients returning to the bay. Evaporative cooling uses water on site, but what we found through our studies is that by using evaporative cooling in a very efficient fashion, we save water on the whole. For every gallon of water we use in the evaporative cooling plants, we eliminate the use of two gallons of water on the energy production side, so this translates into hundreds of millions of gallons per year in water savings. There's no one right way to deliver free cooling; the important point is that you should examine these opportunities and take advantage of them to eliminate, or reduce substantially, the mechanical cooling.

08:03

In a data center, you pull power in from the electrical grid and you convert it down to the voltages that are needed for all the components in the data center, and there are a lot of conversion stages in there. By minimizing those conversion stages you can save money and save energy, and by making each conversion stage more efficient, you can save energy as well. Traditionally, one of the big losses is the UPS system, the uninterruptible power supply. Typically there's a giant room of batteries. Batteries are DC voltage, and the power coming in to charge those batteries is AC, so you need to convert the AC down to DC with a rectifier in order to charge the batteries. Then, when the batteries are needed in a power event, you need to convert that back to AC with an inverter, and then the AC needs to be converted back down to DC for all the components in the data center. So you've got three conversion stages in there that are not necessary. What Google has done is put a battery on board the tray, eliminating those three conversion steps: you just have DC right into the server components. In a typical server configuration you have a server with an AC-DC power supply attached to it. By making sure that AC-DC power supply is efficient, you can save a lot of energy; things like Energy Star labels will point you to power supplies that are 90-plus percent efficient. Google is able to save over $30 per year per server by implementing all of these features.

09:21

There really are very simple, effective approaches that all of us can implement to reduce data center energy use, and most of them are cost effective within 12 months of operation. So a lot of these efficiency best practices should be adopted by just about everyone; they're applicable to small data centers or large data centers. It's simply a matter of following the five steps that we go through here to make sure that you're able to reduce your energy use.

09:50

[Music]


Related Tags

Data Centers, Google, Efficiency, Sustainability, Greenhouse Emissions, ICT Sector, Best Practices, Airflow Management, Free Cooling, Power Supply, Energy Savings