New Sustainability, New Power Handling Ideas are Part of Data Center Evolution

01 Dec, 2021

The rise of machine learning and artificial intelligence applications is raising energy efficiency concerns as data centers proliferate across the country

Data centers are increasingly important to standard business operations all across the world. Collecting more data, organizing it, and using it in ways that generate still more data are common practices in most industries today.

Now the issue is where to store it, how safe it is wherever it is stored, and how data centers are maintained and updated to handle growing data storage and use.

A data center snapshot

Take a look at the consumer demand for data storage and usage. According to the U.S. Office of Energy Efficiency and Renewable Energy (EERE), the vast majority of Americans own a computer or use the internet on a regular basis.

Approximately 85 percent of American households have a computer, 70 percent are connected to the internet, and 75 percent of American adults use social media.

Any computer connected to the internet uses data centers for storage. There are about 3 million data centers in the United States, or one data center for every 100 people.

Most data centers are small server rooms and closets found in a variety of buildings owned by small- and medium-sized businesses and organizations. Others are located in multi-tenant data centers, which are increasingly being used.

The larger data centers owned by the major cloud providers and national supercomputer centers make up less than 10 percent of the server market. But they are growing fast.

New data from Synergy Research Group, a market intelligence and analytics company for the networking and telecoms industry, shows that the total number of large data centers increased to 597 at the end of 2020, having more than doubled since the end of 2015, with the U.S. accounting for almost 40 percent of the major cloud and internet data center sites.

The companies with the broadest data center footprint are the leading cloud providers—Amazon, Microsoft, Google and IBM. Each has 60 or more data center locations, according to Synergy data.

These massive data centers around the country are impressive, and growing. Google operates 14 in the U.S. and continues to expand them. For example, in July of 2019, Google officially broke ground on a $600 million data center in Henderson, Nevada.

In 2019, Google finished construction on the first phase of two data centers in Loudoun County, Virginia, part of a $1.2 billion initial investment.

Virginia is a leading data center state. The northern Virginia data center market is the largest in the world, with over 2,100 megawatts of power supply. Northern Virginia has the largest inventory of data center space globally, accounting for approximately 60 percent of overall demand for data centers in North America, according to JLL, a real estate research and services company.

Amazon Web Services has more than 4 million square feet of data center space in northern Virginia, and was expected to add another 1.2 million square feet by the first quarter of 2021, according to Data Center Frontier, a data center reporting company.

From 2010 to 2019, Facebook invested more than $16 billion in U.S. data center construction and operations, which supported over 238,000 jobs, according to the Facebook website. The company opened three new data centers this year: the $1 billion, 2.5 million square foot data center in Huntsville, Alabama; the $800 million, 950,000 square foot data center in Mesa, Arizona; and the $1 billion, 2.4 million square foot data center in Eagle Mountain, Utah.

All three of these new Facebook data centers, plus 15 others built from 2013-2020, operate with 100 percent renewable energy, according to Facebook.

And these centers are economic engines wherever they are located. For example, in 2016, Google data centers in operation at the time generated $1.3 billion in economic activity, $750 million in labor income, and 11,000 jobs throughout the United States, according to a report on the economic impact of Google data centers. “The greatest value of landing a Google data center may come from seeding future economic growth and diversification in regions that need a boost,” the report stated.

Sustainability becomes a sharper focus

A 2016 report by the Berkeley National Laboratory about U.S. data center energy usage showed that these centers were projected to consume approximately 73 billion kilowatt-hours in 2020, representing 1.8 percent of total U.S. electricity consumption.
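The share implied by that projection is easy to sanity-check. As a rough sketch, assuming total U.S. electricity consumption in 2020 of about 4,000 billion kilowatt-hours (an assumption, not a figure from the article):

```python
# Rough arithmetic check of the Berkeley Lab projection. The total U.S.
# consumption figure (~4,000 billion kWh for 2020) is an assumption here.
data_center_use_kwh = 73e9      # projected 2020 data center use, kWh
total_us_use_kwh = 4.0e12       # approximate total U.S. consumption, kWh

share = data_center_use_kwh / total_us_use_kwh
print(f"Data center share: {share:.1%}")  # about 1.8 percent
```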

Between 2010 and 2018, data center computing grew by 500 percent, while data center energy use grew by only 6 percent, according to the U.S. Environmental Protection Agency (EPA), indicating that data center engineers are finding more energy efficient ways to build and operate the centers.

Still, data centers are one of the most energy-intensive building types, the EPA reports, consuming 10 to 50 times more energy per square foot than a typical office building.

The EERE reported that, in a typical data center, every kilowatt saved with IT equipment can potentially result in nearly 2 kilowatts saved in powering the data center. About half of the energy supplied to a typical data center is used by the cooling and power infrastructure.
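The cascade effect EERE describes follows from the overhead ratio: in a facility whose total draw is nearly twice its IT draw, trimming a kilowatt of IT load also avoids the cooling and power-delivery overhead that kilowatt would have incurred. A minimal sketch, assuming an illustrative PUE near 2:

```python
# Sketch of the EERE cascade effect: at a PUE near 2, each kilowatt trimmed
# from IT equipment avoids roughly one more kilowatt of cooling and power
# delivery overhead. The PUE value here is an illustrative assumption.
pue = 1.9                   # total facility power / IT equipment power
it_savings_kw = 1.0         # kilowatts saved at the IT equipment

facility_savings_kw = it_savings_kw * pue
print(f"Total facility savings: {facility_savings_kw:.1f} kW")  # ~1.9 kW
```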

Kevin Kent, senior data center operations manager for Critical Facilities Efficiency Solutions, a global data center industry energy efficiency and sustainability consultancy service, has worked in data centers for over 30 years, including at the Ohio State University Wexner Medical Center's data center.

He says that he was drawn to the mechanical/electrical side of data centers because that is where roughly 40 percent of a center's operating costs go. He worked to reduce the carbon footprint at the Wexner Medical Center facility.

That work snowballed into a global consultancy business that he now operates. “The work that I’ve been doing started out as manual optimization where I would just go in and basically baseline all the thermal attributes that were happening in the data center and all of the attributes associated with the electrical system, and, based on that data, would make recommendations on how to save money, cut costs, and lower power usage effectiveness (PUE),” Kent says. “The PUE really seems to be the one metric that data centers keep hanging on to that shows that they are effectively using their power. If they’re doing that, they are generally very efficient.”

PUE only measures the efficiency of the building infrastructure of a given data center, and indicates nothing about the efficiency of the data center equipment itself.

On the mechanical cooling side of the data center operations, he says that there are thousands of monitoring points within the cooling system, but that there is a disconnect with these cooling systems. “They’re using a lot of energy,” Kent says. “And we’re trying to keep our equipment cool so we don’t have any disruption of service. Servers, switches, and storage can shut down if they get too hot. Some data centers have disruption of service that impacts their revenue. But what’s different (at a medical center) is if we have disruption of service, it could mean loss of human life.”

With the cooling systems at the Wexner Medical Center, he says, there was not a good understanding of what was going on at the rack level. “They were supplying all of this nice cold air to specific areas, and they really didn’t know how much air was needed in these areas.”

There were hundreds of thousands of lines of data and algorithms in the center to go through manually and figure out what was useful and what wasn’t. “What is this data telling me? How can I use it to become more efficient? That just wasn’t possible to do manually,” Kent says. “But the machine learning and the artificial intelligence as applied to the center basically does data mining where it looks at all of this information. It learns how the environment is reacting, how the server, switches, storage are reacting. And it gives us recommendations and tells us this is how you can become more efficient.”
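The baselining step Kent describes, before any machine learning is applied, can be illustrated with a toy example: compare each rack's readings against a fleet-wide baseline and flag the areas that look overcooled. This is a hedged sketch with invented data and a simple statistical threshold, not a representation of his actual tooling:

```python
# Illustrative baselining sketch: flag racks running well below the fleet
# average, i.e. areas that may be getting more cold air than they need.
# The readings and the one-standard-deviation threshold are assumptions.
from statistics import mean, stdev

rack_temps_c = {"A1": 22.0, "A2": 21.5, "B1": 15.0, "B2": 22.5, "C1": 21.0}

baseline = mean(rack_temps_c.values())
spread = stdev(rack_temps_c.values())
overcooled = [r for r, t in rack_temps_c.items() if t < baseline - spread]
print(overcooled)  # ['B1']
```

In practice this kind of analysis runs over thousands of monitoring points and feeds the recommendations Kent mentions, rather than a five-rack dictionary.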

The machine learning and the AI applications that can be installed in data centers to help operators make appropriate cooling choices “removes this risk” of operational shutdowns, Kent says. “We have sort of flipped the model where if we can reduce the risk, we can increase the reliability and we can also increase the profit.”

The evolution of data center sustainability and efficiency has been about airflow management—separating or containing the supply air from the return air—and closing up every hole and gap. “Seal up windows, insulate, do everything you can to become more efficient,” Kent says. “But remember: You can’t manage what you do not measure. And so many of the facilities I go in, they just don’t know really where their power is going. The metering or the monitoring maybe is there, but they’re not paying attention to it. And when they really start to measure it, and get a good picture and a baseline of what’s going on, then it points us to the areas where we’re making ineffective use of the power. Then we can start to change it.”

Kent says that the cloud is “incredibly secure,” but there is an evolution coming. “It’s interesting that 10 years ago or so, everybody was saying ‘Go to the cloud, go to the cloud, go to the cloud.’ Last year, we’ve seen a sort of repatriation where people are trying to come back from the cloud. It seems like we’re settling into this hybrid where it’s, ‘Hey, we want to put all of our Microsoft Office, our email, and what we maybe consider lower tier applications in the cloud, and let’s keep our homegrown apps or our clinical apps or other apps that are more vital to our business. We feel more comfortable with them here.’”

The data center of tomorrow

Kent says the newest technology that will be sweeping across data centers is liquid cooling. “Full liquid immersion cooling is a little more difficult for an existing facility to transition to.”

Liquid cooling means submerging computer components, or full servers, into a thermally, but not electrically, conductive liquid coolant. IT hardware or servers cooled this way don’t require fans. It provides benefits like higher energy efficiency, a smaller footprint, lower total cost of ownership (TCO), and enhanced server reliability, all with little to no noise, and it helps cool data centers that use newer, hotter-running microchips for big data analytics, AI, and machine learning algorithms.

“A liquid cooling company can go into any data center, take any specific hardware server and, much like a radiator in a car or an automobile, cool your equipment that way. It’s incredibly efficient.”

If a data center facility has a PUE of 1.3 to 1.4, that’s an incredibly efficient operation. “Liquid cooling can take PUEs down to as low as 1.1 to 1.2,” Kent says. “In a 510 or 520 megawatt facility, that’s huge savings.”
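The scale of those savings can be sketched with the PUE figures Kent cites. Assuming a hypothetical 10 megawatt IT load (an illustrative figure, not one from the article), dropping PUE from 1.4 to 1.2 avoids a continuous 2 megawatts of overhead:

```python
# Illustrative savings from a PUE improvement. The 10 MW IT load is a
# hypothetical assumption; the PUE values are the ones quoted by Kent.
it_load_mw = 10.0
pue_before, pue_after = 1.4, 1.2    # before and after liquid cooling

saved_mw = it_load_mw * (pue_before - pue_after)
print(f"Overhead power avoided: {saved_mw:.1f} MW")  # 2.0 MW, continuous
```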

David Hodes

David Hodes is a freelance writer living in Washington, D.C. He can be reached at dhodes11@gmail.com.
