What Does CUE (Carbon Usage Effectiveness) Mean for Data Centers?

Nov 17, 2023
Introduction

 

 

The rapid pace of digital transformation across industries, together with the widespread adoption of new technologies such as 5G, artificial intelligence, and the Internet of Things, has driven explosive growth in the volume of data society generates. Data centers, which serve as the digital foundation for the information systems of these industries, have become indispensable critical infrastructure and play a crucial role in the development of the digital economy. However, data centers are also major consumers of energy and significant sources of carbon emissions. Addressing this requires effective ways to measure and reduce those emissions and to assess how sustainable a facility really is. That is where the Carbon Usage Effectiveness (CUE) metric comes in.

 

Define "CUE"(Carbon Usage Effectiveness)

 

The Green Grid introduced the Carbon Usage Effectiveness (CUE) metric in 2010 to quantify the greenhouse gas (GHG) emissions associated with each unit of IT energy consumption in a data center. The metric has since been incorporated into ISO/IEC 30134-8 for assessing the sustainability of data centers in terms of carbon emissions. CUE is analogous to a carbon intensity: it accounts for both Scope 1 and Scope 2 emissions but divides them by the IT load, much as Power Usage Effectiveness (PUE) divides total facility energy by IT energy. The metric provides an effective way to measure a data center's carbon footprint and evaluate its sustainability in terms of carbon emissions.

When a data center draws grid electricity, the carbon emissions used in the CUE calculation can be based on the emission factors published by the government for that region. When power is generated on site, actual emission data from local meters should ideally be used; failing that, emission and fuel-source data from the generator manufacturer can be used for the calculation.

 

The Formula of CUE

To calculate CUE, the formula is as follows:

CUE = Total CO2 emissions caused by the data center's energy use (kgCO2eq) / IT equipment energy consumption (kWh)

A lower CUE ratio signifies a lower carbon footprint, indicating higher carbon usage efficiency in data centers. The ideal CUE value is 0.0, indicating no carbon emissions during data center operations.

It's crucial to note that CUE varies significantly with the energy sources a data center relies on. Data centers powered by renewable energy generally have a lower CUE than those relying on fossil-fuel power, even at the same PUE.
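To make that relationship concrete, the short Python sketch below computes CUE for two facilities with identical PUE but different grid emission factors. It assumes all facility energy comes from the grid, in which case CUE reduces to the grid emission factor multiplied by PUE; the emission factors and energy figures are purely illustrative, not official values for any region.

```python
# Minimal sketch of a CUE calculation, assuming all facility energy is grid power.
# CUE = total CO2 emissions from data center energy (kgCO2eq)
#       / IT equipment energy consumption (kWh)
# With a single grid source this simplifies to: CUE = grid emission factor * PUE.

def cue_from_grid(it_energy_kwh: float, pue: float, grid_factor_kg_per_kwh: float) -> float:
    """Estimate CUE when the whole facility is powered from the grid."""
    total_energy_kwh = it_energy_kwh * pue                      # total facility energy
    total_emissions_kg = total_energy_kwh * grid_factor_kg_per_kwh
    return total_emissions_kg / it_energy_kwh                   # = grid_factor * pue

pue = 1.3
it_energy = 10_000_000  # 10 GWh of annual IT load (illustrative)

print(cue_from_grid(it_energy, pue, 0.58))  # coal-heavy grid   -> CUE ~ 0.75 kgCO2eq/kWh
print(cue_from_grid(it_energy, pue, 0.05))  # mostly renewables -> CUE ~ 0.07 kgCO2eq/kWh
```

The same IT load and the same PUE yield an order-of-magnitude difference in CUE purely because of the energy mix, which is why CUE is reported alongside PUE rather than as a replacement for it.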

 

What Is China Doing to Improve Data Center CUE?

 

In 2021, China's government work report introduced the goals of "carbon peaking" and "carbon neutrality." Carbon peaking means that carbon dioxide emissions reach a plateau by 2030 and gradually decline after the peak. Carbon neutrality means offsetting the carbon dioxide produced in the course of production through measures such as afforestation and energy conservation, achieving net-zero carbon dioxide emissions. In line with global efforts to combat climate change and China's "dual carbon" strategy, the data center industry is steadily improving its energy efficiency, increasing its use of renewable energy, and striving to reach carbon neutrality as early as possible.

 

As liquid cooling technology matures, methods such as immersion cooling, cold-plate liquid cooling, and spray liquid cooling are increasingly being applied in data centers. Beyond liquid cooling, data center cooling has diversified in recent years: emerging approaches such as indirect evaporative cooling and magnetic-levitation chillers offer new options, and combining multiple cooling methods has become common.

 

Uninterruptible Power Supply (UPS) efficiency has also become a significant battleground for data center power distribution vendors. Major domestic and international players, including Huawei, Vertiv, Kehua, ABB, and Schneider, all have product lines in this space, and efficiency above 97% is now considered table stakes for high-end UPS. Notably, at low load rates, data centers using modular UPS have achieved higher efficiency than those using high-voltage direct current (HVDC) power supplies. In the author's view, given current technology trends, high-frequency UPS is likely to become one of the optimal solutions for reducing energy consumption and increasing efficiency in data center power distribution.
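As a rough illustration of why UPS efficiency matters for CUE, the sketch below estimates the conversion losses of a UPS at two efficiency levels and the grid emissions attributable to those losses. It assumes the UPS sits in the IT power path and that its losses count toward facility overhead; the load figure and grid emission factor are hypothetical, not vendor or regional data.

```python
# Rough sketch: how UPS conversion efficiency feeds into facility overhead
# (and therefore into PUE and CUE). Illustrative numbers, not vendor data.

def ups_loss_kwh(it_energy_kwh: float, ups_efficiency: float) -> float:
    """Energy dissipated in the UPS while delivering it_energy_kwh to the IT load."""
    return it_energy_kwh * (1.0 / ups_efficiency - 1.0)

it_energy = 10_000_000     # 10 GWh annual IT load (illustrative)
grid_factor = 0.58         # hypothetical grid emission factor, kgCO2eq/kWh

for eff in (0.94, 0.97):
    loss = ups_loss_kwh(it_energy, eff)
    print(f"UPS efficiency {eff:.0%}: losses {loss / 1e6:.2f} GWh, "
          f"~{loss * grid_factor / 1e3:.0f} tCO2eq of associated emissions")
```

In this simplified model, moving from 94% to 97% efficiency roughly halves the UPS losses, which translates directly into lower facility energy and a lower CUE for the same IT load.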

 

During operation, IT equipment generates a large amount of waste heat. Using heat pump technology to recover and reuse this heat has already found many applications in data centers and has a promising future. Rough estimates put the total recoverable waste heat of data centers in northern China at approximately 10 GW, theoretically enough to supply heating for around 300 million square meters of building space. Numerous data centers in China, including Alibaba's Qiandao Lake Data Center, Tencent's Tianjin Data Center, China Telecom's Chongqing Cloud Computing Base, Wanguo Data's Beijing Data Center 3, and UCloud's Wulanchabu Cloud Computing Center, have already deployed heat recovery, providing heating both inside the facilities and for the surrounding areas.
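As a quick plausibility check on those figures (assuming the 10 GW refers to average recoverable thermal power and the 300 million square meters to heated floor area), the implied heating supply per square meter can be worked out as follows:

```python
# Back-of-the-envelope check of the heat recovery estimate above.
# Assumption: 10 GW is average recoverable thermal power,
# 300 million m2 is the heated floor area it would serve.
recoverable_heat_w = 10e9      # 10 GW of recoverable excess heat
heated_area_m2 = 300e6         # 300 million square meters of buildings

heating_intensity = recoverable_heat_w / heated_area_m2
print(f"Implied heating supply: {heating_intensity:.0f} W/m2")   # ~33 W/m2
```

Around 33 W per square meter is in the same range as typical design heating loads for buildings in northern China, so the estimate is at least internally consistent.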

 

Cooling accounts for a substantial share of a data center's overall energy consumption, often more than 20%. Deploying data centers underwater and using seawater to carry away the heat they generate could significantly reduce this energy use and help optimize various operational indicators.

 

In China, HIGHLANDER was the first to introduce the concept of Underwater Data Centers (UDC) and identified three major advantages of underwater data centers:

First, a UDC sits underwater in an enclosure filled with inert gas, which essentially eliminates the risk of fire.

Second, a UDC is concealed by its underwater location, making it very difficult to pinpoint from outside.

Third, continuous 24-hour monitoring of the UDC helps effectively prevent damage to and infiltration of the data center.

 

Underwater data centers therefore have unique advantages in both energy conservation and safety. However, they are heavily constrained by geography, since they must be built close to the sea. Given the current state of network infrastructure, the author believes underwater data centers are best suited to hot-data computing and warm-to-cold data storage, serving users with low-latency requirements, such as machine learning and video rendering workloads, particularly in coastal cities.

 

Conclusion

 

In summary, lessening the environmental impact of data centers requires a multi-pronged approach. Deploying energy-efficient equipment, using renewable energy sources, and improving resource utilization are all crucial strategies. By applying these solutions, data centers can sharply cut energy use, improve their operations, and demonstrate a genuine commitment to sustainability. As scrutiny of environmental responsibility grows and energy costs rise, reducing the environmental impact of data centers is no longer a choice but a necessity. By taking a comprehensive approach to sustainability, data centers can realize both economic and environmental benefits and serve as a model for other industries.