
Let’s Reduce Wasted Energy in Server Farms
by Alex Lidow on 02-05-2016 at 4:00 pm

With the growth in streaming video and the promises of 50 billion IoT gadgets making our lives oh-so-much better, there is an alarming demand for online computational horsepower and bandwidth.

Why alarming? In 2014, data centers in the United States consumed approximately 100 billion kilowatt hours (kWh) of energy. According to Sudeep Pasricha, an associate professor in the Department of Electrical and Computer Engineering at Colorado State University, “that’s almost twice the electricity needed to power the whole state of Colorado for a year.” Further, this growing and insatiable desire for digital content is actually polluting the environment: the massive data centers that house all this digital content on servers are now responsible for an astounding 2 percent of global greenhouse gas emissions, a share similar to that of today’s aviation industry.

Inefficient grid
To add insult to injury, the power needed to support this rapidly growing demand comes from an electrical grid that is wildly inefficient and is based on infrastructure created, in large part, more than a century ago. To put it simply, electricity goes through several conversion stages: from its origination at the power plant, through transmission and power stations, and finally through the semiconductor power converters that feed the server’s chips. Due to aging equipment and these repeated conversions, a significant amount of power is lost as it travels from the power plant to the computer chip that does the actual computing work.

Just how significant is this waste? It turns out that the power grid supplies 150W of power to meet the demands of a digital chip that may need only 100W. Moreover, the amount of wasted energy is even greater because every watt of power lost in conversion is turned into heat, and that heat must be removed from the server farm by expensive, energy-intensive air conditioning. It takes about 1W of air conditioning to remove 1W of power losses, effectively doubling the waste of this power conversion process. Not to mention the enormous amount of carbon dioxide emitted in generating the extra electricity those air conditioners consume to carry away all that wasted heat.
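The arithmetic above can be sketched in a few lines. This is a back-of-the-envelope check using only the figures quoted in the article (150W drawn from the grid for a 100W chip, and roughly 1W of cooling per 1W of conversion loss); the variable names are illustrative, not from any real power model.

```python
# Assumed values taken directly from the article's example.
power_drawn_w = 150   # power the grid must supply
power_used_w = 100    # power the digital chip actually needs

conversion_loss_w = power_drawn_w - power_used_w           # 50 W lost as heat
cooling_overhead_w = conversion_loss_w                     # ~1 W of A/C per 1 W of loss
total_overhead_w = conversion_loss_w + cooling_overhead_w  # waste doubles to 100 W

# Effective end-to-end efficiency once cooling is counted:
efficiency = power_used_w / (power_drawn_w + cooling_overhead_w)
print(f"Total overhead: {total_overhead_w} W")
print(f"Effective efficiency: {efficiency:.0%}")  # 100 W useful out of 200 W drawn = 50%
```

In other words, once cooling is included, only about half the energy drawn from the grid does useful computing work.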

In aggregate, the combined waste across the United States due to data center power conversion is enough to power over half of the state of Colorado.

Also Read: Submerging the Data Center

Limits of silicon
And if the inefficiencies and waste in the power grid aren’t enough, the power conversion process has been built around post-World War II silicon-based semiconductors, which have reached their theoretical power conversion performance limits. Consequently, these chips are responsible for additional power inefficiencies, at great financial and environmental cost.

However, new materials have emerged that can convert electricity more efficiently and at a lower cost. In short, superior crystal properties in these materials enable the elimination of the most wasteful final stages of conversion. It’s a dynamic similar to the evolution of air travel in the post-WWII era. Initially, air travel across the country required at least one stop for refueling. When jet-powered flight became commercially available, the increased fuel efficiency enabled not only non-stop coast-to-coast travel, but also significantly reduced the cost of the journey.

By eliminating the inefficiencies in this final stage of the server farm power architecture, we can realize a direct saving of 7 billion kWh per year. This doubles to about 14 billion kWh when the avoided air conditioning energy is added, roughly 14 percent of the total energy consumed by servers in the US alone. The cost savings are also significant: at an average cost of $0.12 per kWh, that’s a savings of about $1.7 billion annually, and that does not include the additional savings in system cost from needing fewer power converters and air conditioners.
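A quick sketch confirms these savings figures follow from the article’s own numbers (7 billion kWh of direct conversion savings, doubled by avoided air conditioning, priced at $0.12/kWh, against the ~100 billion kWh US data-center total cited earlier):

```python
# Figures quoted in the article.
direct_savings_kwh = 7e9            # conversion losses eliminated per year
total_savings_kwh = 2 * direct_savings_kwh  # add the matching A/C energy
price_per_kwh = 0.12                # average US electricity price, $/kWh
us_datacenter_kwh = 100e9           # 2014 US data-center consumption

annual_savings_usd = total_savings_kwh * price_per_kwh
share_of_total = total_savings_kwh / us_datacenter_kwh

print(f"${annual_savings_usd / 1e9:.2f} billion saved per year")  # ~ $1.68 billion
print(f"{share_of_total:.0%} of US data-center energy")           # 14%
```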

While the need for computing power is only likely to increase in the upcoming years, technologies are appearing that will help reduce waste and drive subsequent environmental and financial savings that benefit future generations of information gluttons the world over.

We wrote a book about this subject. You can find it at: http://epc-co.com/epc/Products/Publications/DC-DCConverterHandbook.aspx
