Google Data Center Efficiency Best Practices -- Full Video

Google Sustainability
23 May 2011 · 10:01

Summary

TL;DR: This video discusses how Google optimizes its data centers to minimize energy and resource consumption. Through innovations like measuring Power Usage Effectiveness (PUE), managing airflow, adjusting temperature settings, utilizing free cooling methods, and optimizing power distribution, Google has achieved significant energy savings. These best practices have allowed their data centers to use half the energy of typical ones, resulting in millions in savings. The video highlights that even small improvements, such as raising temperatures or reducing unnecessary power conversion steps, can lead to impactful efficiency gains for both large and small data centers.

Takeaways

  • 💡 Google innovates data center designs to reduce energy and resource usage significantly.
  • 💻 Data centers vary from small closets with few machines to large warehouses optimized for IT computing.
  • 🌱 Google uses 50% less energy than typical data centers due to years of innovation in efficiency.
  • ⚡ Power Usage Effectiveness (PUE) is critical for measuring data center efficiency, with Google achieving a PUE as low as 1.09.
  • ❄️ Managing airflow is key to energy efficiency; eliminating hot and cold air mixing improves cooling effectiveness.
  • 📈 Thermal modeling helps improve airflow, reducing energy consumption through small, simple changes like adjusting CRAC units.
  • 🛠️ Google's use of low-cost containment methods, like meat locker curtains and sheet metal doors, results in significant savings.
  • 🌡️ Raising the data center temperature from 72°F to 80°F saves thousands of dollars annually without compromising equipment safety.
  • 🌊 Google leverages free cooling techniques, such as sea water cooling and evaporative towers, to reduce mechanical cooling costs.
  • 🔋 Optimizing power distribution and reducing conversion stages increases energy efficiency, especially with UPS improvements.

Q & A

  • What are data centers used for at Google?

    -Data centers at Google are critical for delivering web services to users. They can range from small rooms with a few machines to large warehouse-scale facilities optimized for power use and IT computing.

  • What is PUE, and why is it important for data centers?

    -PUE, or Power Usage Effectiveness, is a ratio that measures how efficiently a data center uses energy. It compares total facility energy to IT equipment energy, helping to identify areas where overhead energy usage, such as cooling, can be reduced.

  • What was the typical PUE of an enterprise data center in 2006, and how does that compare to Google’s recent PUE?

    -In 2006, the typical PUE of an enterprise data center was 2.0, meaning that for every watt delivered to IT equipment, another watt was consumed as overhead; a 1 MW IT load would therefore draw about 2 MW in total. Google’s PUE has improved significantly, with a 12-month average of 1.16 and its lowest data center achieving 1.09.

  • Why is airflow management crucial in data center efficiency?

    -Airflow management prevents the mixing of hot and cold air, which helps maintain efficient cooling. Techniques like CFD analysis and simple design changes can dramatically improve airflow, resulting in reduced energy consumption.

  • What is free cooling, and how does Google implement it?

    -Free cooling refers to using ambient temperatures, such as outside air, to cool data centers without relying on energy-intensive equipment like chillers. Google uses free cooling at all its data centers and even utilizes sea water in Finland and evaporative towers in Belgium.

  • What are the benefits of raising the temperature in data centers, according to Google?

    -By raising the temperature in their data centers from 72°F to 80°F, Google saves thousands of dollars in energy costs annually while ensuring equipment runs safely at higher temperatures, aligning with ASHRAE guidelines.

  • How does Google minimize energy losses in power distribution?

    -Google minimizes energy losses by reducing the number of conversion stages in power distribution. By placing backup batteries directly on the server trays instead of relying on a central UPS, it eliminates the extra AC/DC/AC conversions, so power is converted to DC only once at the server.

  • What impact does Google’s approach to evaporative cooling have on water usage?

    -Although evaporative cooling consumes water on site, Google’s studies show it saves water overall: for every gallon evaporated in the cooling plant, roughly two gallons are avoided on the energy-production side, a net saving that adds up to hundreds of millions of gallons annually.

  • What is the significance of measuring PUE frequently, according to Google?

    -Frequent PUE measurements allow for more meaningful results, providing accurate data on a data center’s performance. Google recommends measuring PUE as often as possible, ideally every second, to better manage energy efficiency.

  • What are the key steps Google follows to reduce energy usage in data centers?

    -Google follows five key steps to reduce data center energy use: 1) Measure PUE, 2) Manage Airflow, 3) Adjust Thermostat, 4) Utilize Free Cooling, and 5) Optimize Power Distribution. These steps are cost-effective and applicable to both small and large data centers.

Outlines

00:00

🏢 Google Data Centers: Innovation and Efficiency

Erik Teetzel introduces Google's approach to data centers, which vary in size from small server rooms to large warehouse facilities optimized for power and computing. Google emphasizes energy efficiency and innovation, achieving significant reductions in energy and water use. These efforts have resulted in Google's data centers using half the energy of a typical data center. For context, the ICT sector accounts for roughly 2% of global greenhouse gas emissions, and data centers are responsible for about 15% of that share. Best practices in design and operation are key to improving energy efficiency, and Google's efforts have yielded millions of dollars in energy savings.

05:03

🔋 Measuring Power Usage Effectiveness (PUE)

Kevin Dolder explains that managing a data center's efficiency starts with measuring PUE, a ratio of total facility energy to IT equipment energy. A lower PUE indicates better energy efficiency. The typical PUE in 2006 was 2.0, but Google has driven its PUE down to 1.16 in the last year, with the lowest data center reaching 1.09. Accurate and frequent PUE measurements are crucial for meaningful results, and integrating them into a building management system is essential for optimal operation.
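
As a rough illustration of continuous measurement (not Google's actual tooling), the sketch below accumulates facility and IT energy from periodic power readings and reports a running PUE; the read_power_kw function and the one-second interval are assumptions for the example.

    import time

    def read_power_kw():
        """Placeholder for polling real meters, e.g. via a building
        management system. Returns (total_facility_kw, it_equipment_kw)."""
        return 1160.0, 1000.0  # assumed example readings

    facility_kwh = 0.0
    it_kwh = 0.0
    interval_s = 1.0  # sample as often as practical

    for _ in range(10):  # short demo loop; a real monitor runs continuously
        facility_kw, it_kw = read_power_kw()
        facility_kwh += facility_kw * interval_s / 3600.0
        it_kwh += it_kw * interval_s / 3600.0
        print(f"running PUE = {facility_kwh / it_kwh:.3f}")
        time.sleep(interval_s)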

💨 Optimizing Airflow for Energy Savings

Erik Teetzel discusses the importance of managing airflow in data centers to reduce energy consumption. Preventing the mixing of hot and cold air is vital, and Google uses CFD analysis to identify hotspots and direct airflow. Simple design changes, such as adjusting CRAC intake positions, have improved airflow efficiency. Google implemented cost-effective retrofits like meat locker curtains and sheet metal doors to separate cold and hot aisles, leading to improved airflow and significant energy savings.

🧊 Sealing and Temperature Control for Efficiency

Erik Teetzel outlines further steps to enhance data center efficiency, such as sealing unused rack space with blanking panels to create a tightly controlled environment. Raising the cold aisle temperature from 72°F to 80°F saves thousands of dollars in energy costs annually, without compromising equipment safety. Free cooling, using ambient outdoor temperatures to reduce the need for mechanical cooling, is another strategy Google employs at all its data centers, resulting in tremendous efficiency gains.

🌊 Free Cooling and Water Efficiency

Chris Malone highlights Google's use of free cooling and evaporative cooling to eliminate or reduce mechanical cooling. In Europe, two of Google's data centers operate without chillers, utilizing local conditions such as sea water cooling in Finland and evaporative towers in Belgium. These methods not only enhance energy efficiency but also save significant amounts of water. For every gallon used in evaporative cooling, two gallons are saved in energy production, leading to hundreds of millions of gallons saved annually.

⚡ Optimizing Power Distribution in Data Centers

Tracy Van Dyk discusses how Google minimizes power losses by reducing unnecessary conversion stages in data centers. Traditional setups involve multiple conversions between AC and DC power, which lead to inefficiencies. Google eliminated these by integrating batteries directly into server trays, simplifying the process. Ensuring efficient AC/DC power supplies, such as Energy Star-certified models, also helps reduce energy use. These innovations save Google over $30 per server annually.

🔑 Best Practices for Data Center Efficiency

Erik Teetzel summarizes Google's five key steps to data center energy efficiency: 1) Measure PUE, 2) Manage airflow, 3) Adjust thermostat, 4) Utilize free cooling, and 5) Optimize power distribution. These cost-effective approaches are applicable to data centers of all sizes, and many of them return savings within 12 months. By following these practices, significant reductions in energy consumption can be achieved, contributing to both environmental and financial benefits.

Keywords

💡Data Center

A data center is a facility used to house computer systems and related components such as storage systems and telecommunications. In the video, Google data centers are described as vital to delivering web services. They range from small server rooms to massive warehouses filled with thousands of servers, optimized for power use and IT computing.

💡PUE (Power Usage Effectiveness)

PUE is a metric used to measure the energy efficiency of a data center. It is calculated as the ratio of total facility energy to IT equipment energy. The video emphasizes the importance of minimizing overhead energy and getting PUE as close to 1.0 as possible, meaning minimal energy waste in power and cooling systems.
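
As a minimal sketch of the ratio itself, with made-up energy totals rather than real measurements:

    def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
        """Power Usage Effectiveness: total facility energy over IT energy."""
        return total_facility_kwh / it_equipment_kwh

    # Hypothetical yearly totals: 11.6 GWh for the whole facility,
    # of which 10 GWh reaches the IT equipment.
    print(pue(11_600_000, 10_000_000))  # 1.16, i.e. 16% overhead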

💡Energy Efficiency

Energy efficiency refers to minimizing the amount of energy consumed to perform a task. The video discusses various design choices and best practices used at Google data centers to reduce energy use, such as better airflow management, higher temperatures in cold aisles, and using free cooling techniques.

💡Airflow Management

Airflow management in a data center is the process of optimizing the flow of cold and hot air to improve cooling efficiency. The video highlights the importance of eliminating the mixing of hot and cold air and using CFD (Computational Fluid Dynamics) analysis to identify airflow issues, helping reduce energy costs.
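
One observable symptom of hot and cold air mixing is a small temperature rise between a CRAC unit's supply and return air. The check below is a generic illustration of that idea, not a tool from the video; the 15°F threshold is an assumed example value.

    def mixing_suspected(supply_temp_f: float, return_temp_f: float,
                         min_delta_f: float = 15.0) -> bool:
        """Flag a CRAC unit whose return air is barely warmer than its
        supply air, suggesting cold air is bypassing the IT load and
        mixing with the hot exhaust."""
        return (return_temp_f - supply_temp_f) < min_delta_f

    # Example: 68 F supply but only 75 F return air -> likely mixing.
    print(mixing_suspected(68.0, 75.0))  # True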

💡Free Cooling

Free cooling refers to utilizing ambient outside air or water to cool the data center without relying on energy-intensive equipment like chillers. In the video, Google's data centers in Belgium and Finland employ free cooling methods, including using sea water or evaporative towers, to significantly reduce energy consumption.
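
As a toy sketch of the economizer decision (not Google's control logic; the setpoint and approach margin are assumptions), a controller might bypass the chillers whenever ambient conditions can meet the required supply temperature:

    def use_free_cooling(ambient_temp_f: float, supply_setpoint_f: float,
                         approach_f: float = 5.0) -> bool:
        """Cool with outside air or water when it is comfortably colder
        than the required supply temperature; otherwise fall back to
        mechanical cooling."""
        return ambient_temp_f <= supply_setpoint_f - approach_f

    print(use_free_cooling(ambient_temp_f=55.0, supply_setpoint_f=80.0))  # True
    print(use_free_cooling(ambient_temp_f=78.0, supply_setpoint_f=80.0))  # False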

💡ASHRAE Guidelines

ASHRAE (American Society of Heating, Refrigerating, and Air-Conditioning Engineers) provides recommended temperature ranges for data center environments. The video explains how Google follows these guidelines, allowing the temperature in cold aisles to reach up to 80°F, reducing energy costs by eliminating the need for overly cool environments.

💡Conversion Stages

Conversion stages refer to the multiple steps in converting electrical power to the necessary voltages for data center equipment. The video outlines how Google reduces energy losses by minimizing unnecessary conversion stages and using more efficient DC power delivery, particularly with on-board batteries for servers.
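
Because stage efficiencies multiply, removing conversions compounds the benefit. A back-of-the-envelope comparison with assumed per-stage efficiencies (illustrative numbers, not measurements from the video):

    from math import prod

    def end_to_end_efficiency(stage_efficiencies):
        """Overall efficiency of a chain of power conversion stages."""
        return prod(stage_efficiencies)

    # Assumed example: a double-conversion UPS (AC->DC, DC->AC) plus the
    # server's AC->DC supply, each ~95% efficient...
    legacy = end_to_end_efficiency([0.95, 0.95, 0.95])
    # ...versus a single efficient AC->DC supply with an on-tray battery.
    on_tray = end_to_end_efficiency([0.95])

    print(f"legacy path:  {legacy:.1%}")   # ~85.7%
    print(f"on-tray path: {on_tray:.1%}")  # 95.0%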

💡UPS (Uninterruptible Power Supply)

A UPS system provides backup power during a power outage. In traditional systems, UPS introduces energy losses due to multiple AC/DC conversion stages. Google’s approach, as described in the video, is to integrate the battery directly with the server tray to minimize these losses, saving energy.

💡Containment

Containment refers to separating hot and cold aisles in a data center to prevent the mixing of cold air meant for cooling and hot air expelled by servers. The video details how Google uses various containment strategies, including meat locker curtains and sheet metal doors, to maintain optimal airflow and cooling efficiency.

💡Cost Savings

Cost savings refers to the reduction in operational expenses through energy efficiency improvements. The video provides examples of Google saving millions of dollars annually through initiatives like managing airflow, raising temperatures, and free cooling, making energy-saving measures highly cost-effective.

Highlights

Google designs its data centers to minimize energy, water, and resource use, resulting in the use of half the energy of a typical data center.

Data centers represent 15% of the 2% of global greenhouse gas emissions produced by the ICT sector, making efficiency critical.

Power Usage Effectiveness (PUE) measures total facility energy against IT equipment energy. Google's lowest PUE is 1.09, compared to the typical 2.0 in 2006.

Regularly measuring PUE, as often as every second, gives more meaningful data, helping data centers operate more efficiently.

Google uses CFD analysis and other methods to optimize airflow, reducing energy costs by improving how air moves in data centers.

Simple retrofits like sheet metal extensions and meat locker curtains can improve airflow; about $25,000 in parts saves Google over $65,000 in energy costs annually, a payback of under five months.

Google saves thousands of dollars annually by raising its cold-aisle temperature from 72°F to 80°F, countering the long-held belief that data centers must be kept cold.

Free cooling methods, such as using ambient outdoor temperatures, are employed at all Google data centers to reduce mechanical cooling needs.

Google’s data centers in Belgium and Finland use chiller-less cooling, relying on evaporative cooling and seawater cooling, reducing energy and water use.

Google's evaporative cooling systems save water by eliminating the need for two gallons of water on the energy production side for every gallon used.

Minimizing power conversion stages and improving power supply efficiency, for example by feeding DC from an on-tray battery straight into the server components, saves both energy and money.

On-board batteries eliminate three unnecessary power conversion stages in Google servers, enhancing efficiency and reducing energy loss.

By making power supplies more efficient, including using Energy Star-certified supplies, Google saves over $30 per server annually.

Many best practices in energy efficiency can deliver cost savings within 12 months, making them applicable for both small and large data centers.

Google recommends five steps for data center efficiency: Measure PUE, Manage Airflow, Adjust Thermostat, Utilize Free Cooling, and Optimize Power Distribution.

Transcripts

play00:08

ERIK TEETZEL: Here at Google, data centers are very

play00:10

important to us.

play00:11

They are how we deliver all of our web services

play00:13

to all of our users.

play00:15

A data center can mean a variety of things.

play00:17

It can mean a small closet filled with a couple of

play00:19

machines all the way to very large warehouse scale

play00:22

buildings that are optimized for power use and IT computing

play00:26

and filled with thousands of servers.

play00:28

At Google, we spend a lot of time innovating the way in

play00:31

which we design and build these facilities to minimize

play00:34

the amount of energy, and water, and other resources

play00:36

that these computing facilities use.

play00:39

In terms of the results of all of the work that we've been

play00:42

doing over many, many years, now we use half of the energy

play00:46

of the typical data center.

play00:48

To put things into perspective, the entire ICT

play00:51

sector, that includes mobile phones, computers, monitors,

play00:55

cell phone towers, represents roughly about 2% of global greenhouse

play00:58

gas emissions.

play00:59

Of that 2%, the data center portion is

play01:02

responsible for about 15%.

play01:08

There's design choices that you can make for energy

play01:10

efficiency that improve the performance

play01:13

of your data center.

play01:14

And these things are just best practices.

play01:16

And adhering well to best practices, that's how you can

play01:19

actually make the most improvement in

play01:21

terms of energy use.

play01:22

The results of these types of activities return

play01:26

Google millions of dollars in energy savings.

play01:28

So the results are significant.

play01:31

We've invited several members of our data center team

play01:34

to explain some of these best practices to all of you.

play01:42

KEVIN DOLDER: The first step in managing the efficiency of

play01:44

your data center is to make sure you have the

play01:45

instrumentation in place to measure the PUE, or power

play01:48

usage effectiveness.

play01:50

PUE is the ratio of total facility energy to IT

play01:53

equipment energy within your data center.

play01:55

It's a measure of how effectively you deliver power

play01:57

and cooling to the IT equipment.

play02:00

In 2006, the typical PUE of an enterprise

play02:03

data center was 2.0.

play02:04

Which means that for every one watt of IT energy consumed, one watt of

play02:08

overhead was consumed by the facility to deliver the power

play02:11

and cooling.

play02:12

ERIK TEETZEL: Reducing the overhead is

play02:13

really what you want.

play02:14

You want PUE to get to as close to 1.0 as possible.

play02:18

KEVIN DOLDER: Over the last 12 months, our TTM PUE was 1.16.

play02:23

We've continuously measured that and it's gone down nearly

play02:25

every quarter since we began reporting it back in 2008.

play02:29

Last quarter the lowest data center was 1.09.

play02:34

Ideally, you should measure PUE as fast as you can, as often

play02:36

as you can, every second or so.

play02:38

And the more often you can measure it, the more

play02:40

meaningful the results will be.

play02:43

It's important to measure PUE over the course of a year --

play02:47

annually or quarterly -- to get a meaningful result.

play02:49

If you just take snapshots in time the information won't be

play02:53

realistic and it won't really be an actual measure of how

play02:57

well your data center is operating.

play02:59

One way to make it easier to manage is to incorporate the

play03:01

PUE measurements into your building management system.

play03:04

We do this at all of our sites at Google.

play03:07

Without having easy access to this data we wouldn't be

play03:10

able to operate our data centers as

play03:12

efficiently as we do.

play03:19

ERIK TEETZEL: Once you have the ability to measure and

play03:21

manage your PUE, the first step in terms of reducing your

play03:24

data center energy load is to focus on the

play03:27

management of the air flow.

play03:28

The most important thing here is to eliminate the mixing of

play03:32

the hot and the cold air.

play03:33

And there's no one right way to do this.

play03:35

Containment can be achieved through many different

play03:37

approaches.

play03:38

One thing we found very useful at Google is CFD analysis to

play03:42

see where are your hot spots and how is your air flow going

play03:45

to actually be directed in your data center?

play03:48

By doing so, you can actually model the way in which air

play03:50

flow will go and it helps you make very simple design

play03:53

choices to improve the air flow in your data center.

play03:56

For example, in one of our computing and networking

play03:59

rooms, we call them CNRs, we actually did some thermal

play04:02

modeling to see exactly what air flow was doing.

play04:05

Through that modeling we realized that the intake to

play04:07

our CRACs was too low.

play04:08

And that by simply piecing together some sheet metal we

play04:11

could create extensions that would dramatically increase

play04:14

the air flow quality into the CRACs.

play04:17

We also did a bunch of other retrofits.

play04:20

KEN WONG: Here in this corporate data center at

play04:22

Google, we've implemented meat locker curtains that are very

play04:26

inexpensive and easy to install.

play04:28

These are hung from the overhead structure and they

play04:31

separate the cold aisle, which is actually hot, and the hot

play04:35

aisle, which is actually hotter.

play04:36

We are set now to enter the hot aisle containment door.

play04:40

And we incorporated these simple, inexpensive, sheet metal doors

play04:46

to separate very tightly the cold aisle from the hot aisle.

play04:52

Now over here, we've got the hot air from the racks

play04:56

coming up, going over head, up through the return

play04:58

air plenum

play04:59

back to the CRAC units to give you a nice high temperature

play05:02

differential across your CRAC units.

play05:05

A very important step is to seal the rack space where you

play05:12

don't quite have all of your equipment populated.

play05:14

And it's very easy to do with these blanking panels.

play05:17

It's almost like weatherizing your house to make sure that

play05:21

you've got a nice, tight environment.

play05:24

ERIK TEETZEL: All totalled, we spent about $25,000 in parts.

play05:27

And those $25,000 saved us over $65,000 in

play05:31

energy costs yearly.

play05:37

Once you manage your air flow properly, the next step in

play05:40

data center efficiency is to increase the temperature of

play05:43

your cold aisle.

play05:44

It's long been believed by many data center operators

play05:47

that the data center has to be cold to keep all the equipment

play05:50

at a temperature that it will run safely at.

play05:52

And in fact, that's just false.

play05:54

So if you look at recommended guidelines from ASHRAE, they

play05:57

recommend you running all the way up to 80 degrees

play05:59

Fahrenheit.

play06:01

And at Google, that's exactly what we do.

play06:03

We've got a small corporate data center here.

play06:05

It's about 200 kilowatts of load.

play06:08

Simply raising the temperature from 72 degrees to 80 degrees

play06:12

saves us thousands of dollars in energy costs

play06:14

every single year.

play06:15

What's nice about that is it also allows our employees to

play06:18

come to work in shorts.

play06:25

Whenever possible, we recommend people to free cool.

play06:29

Free cooling means utilizing ambient temperatures outside

play06:31

of your data center to be able to provide cooling without

play06:34

operating very heavy energy consuming

play06:36

equipment like chillers.

play06:38

CHRIS MALONE: We use free cooling at

play06:39

all of our data centers.

play06:41

And you can see this in our publicly recorded PUE data

play06:44

where the PUE values go up in the summertime and down in the

play06:47

wintertime.

play06:48

And this is just a reality of running our operations

play06:52

with free cooling.

play06:54

And it yields tremendous efficiency gains.

play06:56

In Europe, we have two data centers that have no chillers

play06:59

whatsoever.

play07:00

We're able to take advantage of the local constraints and

play07:04

conditions.

play07:05

In Belgium, we use evaporative towers without any chillers

play07:09

given the ambient conditions.

play07:11

In Finland, we use sea water cooling.

play07:13

Sea water from the Bay of Finland cools the servers.

play07:16

And then we temper the water returning to the Bay of

play07:19

Finland so there's no temperature gradient

play07:21

returning to the bay.

play07:23

Evaporative cooling uses water on site, but what we found

play07:26

through our studies is that by the use of evaporative cooling

play07:31

in a very efficient fashion, we save water on the whole.

play07:34

So for every gallon of water that we use in the evaporative

play07:37

cooling plants, we eliminate the use of two gallons of

play07:41

water on the energy production side.

play07:43

This translates into hundreds of millions of gallons per

play07:45

year in water savings.

play07:48

There's no one right way to deliver free cooling.

play07:51

The important point is that you should examine these

play07:54

opportunities and take advantage of them to eliminate

play07:57

or reduce substantially the mechanical cooling.

play08:03

TRACY VAN DYK: In the data center, you pull power in from

play08:06

the electrical grid and you convert it down to the

play08:08

voltages that are needed for all the

play08:10

components in the data center.

play08:11

And there's a lot of conversion stages in there.

play08:13

By minimizing those conversion stages, you can save

play08:16

money and save energy.

play08:17

Also by making each conversion stage more efficient you can

play08:21

save energy, as well.

play08:22

Traditionally, one of the biggest losses is UPS,

play08:26

Uninterruptible Power Supply.

play08:29

Typically, there's a giant room of batteries.

play08:31

The batteries are DC voltage.

play08:33

And the power coming in to charge those batteries is AC.

play08:36

And so you need to convert from AC down to DC with a

play08:40

rectifier in order to charge the batteries.

play08:41

And then when the batteries are needed in a power event,

play08:44

you need to convert that back to AC with an inverter.

play08:47

And then the AC needs to be converted back down to DC for

play08:50

all the components in the data center.

play08:51

So you've got three conversion stages in there

play08:53

that are not necessary.

play08:55

What Google has done is put a battery on board the tray.

play08:58

So you're eliminating those three conversion steps.

play09:00

You just have DC right into the server components.

play09:03

In a typical server configuration, you have a

play09:04

server with an AC/DC power supply attached to it.

play09:07

By making sure that AC/DC power supply is efficient, you

play09:10

can save a lot of energy.

play09:11

Things like Energy Star labels will point you to power

play09:13

supplies that are 90% plus efficient.

play09:15

Google is able to save over $30 per year per

play09:17

server by implementing all of these features.

play09:21

ERIK TEETZEL: There really are very simple, effective

play09:24

approaches that all of us can implement to reduce the data

play09:27

center energy use.

play09:28

And most of them are cost effective within

play09:30

12 months of operation.

play09:32

So a lot of efficiency best practices should be adopted by

play09:35

just about everyone.

play09:36

They're applicable to small data centers

play09:38

or large data centers.

play09:40

It's simply following the five steps that we go through here

play09:43

to make sure that you're able to reduce your energy use.

play09:47

1. Measure PUE
2. Manage Airflow
3. Adjust Thermostat
4. Utilize Free Cooling
5. Optimize Power Distribution

Related Tags
Data Centers, Energy Efficiency, Google Innovation, PUE Measurement, Airflow Management, Free Cooling, Power Optimization, Sustainability, Green Technology, Cost Savings