On Earth Day 2012, it’s good to take a step back and see how far we’ve come since the first Earth Day in 1970, which should give us the confidence we need to take on the big challenges ahead of us.

Data is Exploding

The cost of computing has declined enormously. In 1970, US$1,000 bought 10 calculations per second; in 2010, the same US$1,000 bought 10 billion. In 1970, there were approximately 31,744 terabytes (TB) of data on computerized records -- for perspective, today you can buy a 4 TB external hard drive. By 2010, that number had grown to 1.4 billion TB. By 2015, mobile devices alone are predicted to generate 6.3 exabytes of data -- that's 6.3 million TB. The total amount of data is expected to grow 50-fold over the next 8 years.

Data Centers Are Energy Intensive

It takes a lot of energy to provide the always-on computing we've come to expect and to support the growth of computing worldwide. A 25-megawatt data center uses as much electricity as 20,000 average US households.
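That household comparison is easy to sanity-check with back-of-the-envelope arithmetic. The sketch below assumes an average US household uses roughly 11,000 kWh per year -- that figure is an assumption for illustration, not from the article:

```python
# Rough sanity check: how many average US households does a 25 MW
# data center's annual electricity use correspond to?
# Assumption (illustrative): an average US household uses ~11,000 kWh/year.

DATA_CENTER_MW = 25
HOURS_PER_YEAR = 24 * 365
HOUSEHOLD_KWH_PER_YEAR = 11_000  # assumed average, for illustration only

# Convert MW to kW, then multiply by hours running (assumes constant load).
data_center_kwh = DATA_CENTER_MW * 1_000 * HOURS_PER_YEAR

households = data_center_kwh / HOUSEHOLD_KWH_PER_YEAR
print(round(households))  # on the order of 20,000 households
```

Under that assumption the result lands right around the 20,000-household figure quoted above.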

The US Environmental Protection Agency (EPA) estimates that servers and data centers are responsible for up to 1.5 percent of total US electricity consumption, or roughly 0.5 percent of US greenhouse gas emissions. And it will keep climbing if we do nothing to change it: as computing grows, so does global data center energy use -- upwards of 12 percent this year alone.

There is a substantial opportunity to improve efficiency in data centers.

In some cases, almost half of the total power load can come from cooling systems. And when data center computing assets are idle (running but not doing useful work) or underutilized, that potential is wasted.

Today’s chief information officer (CIO) has to consider energy management. A recent Forbes article by Dan Woods provides a checklist for CIOs to evaluate the maturity of their approach to energy management. Moving to shared data centers in the cloud is one strategy to consider.

Cloud Computing Can Help

Cloud computing makes use of economies of scale by running coordinated, actively managed clusters of low-cost computing hardware on behalf of multiple clients, taking advantage of the latest in CPU, rack, cooling and building-systems technology. These cloud tools are helping make the future brighter for all of us.

Infrastructure Allocation

Server, networking and storage usage is rarely stable. As a result, an organization’s infrastructure deployment needs to be sized to accommodate the peak load that will be experienced. In many situations this can be many times the average load.

Unlike individual companies, however, cloud providers can smooth the peaks of their various customers across shared infrastructure. Since every organization has a different peak profile -- peaks occur at different times -- the total cloud infrastructure required is less in aggregate than when each company provisions individually.
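The smoothing effect above can be sketched with a toy simulation. The load profiles below are made up for illustration -- three customers whose peaks fall at different hours of the day:

```python
# Illustrative only: three customers with hourly load profiles (arbitrary
# units) whose peaks occur at different times of day.

customer_loads = {
    "retail_site": [2, 2, 2, 3, 5, 9, 14, 9, 5, 3, 2, 2],    # midday peak
    "batch_jobs":  [12, 10, 8, 4, 2, 2, 2, 2, 2, 4, 8, 12],  # overnight peak
    "video_app":   [3, 2, 2, 2, 3, 4, 5, 7, 10, 13, 9, 5],   # evening peak
}

# Each company provisioning alone must size for its own peak load.
individual_provisioning = sum(max(load) for load in customer_loads.values())

# A shared provider sizes for the peak of the *combined* load instead.
combined = [sum(hour) for hour in zip(*customer_loads.values())]
shared_provisioning = max(combined)

print(individual_provisioning, shared_provisioning)  # prints 39 21
```

With these invented numbers, the shared provider needs capacity for a combined peak of 21 units, versus 39 units if each customer provisions for its own peak separately -- the same aggregate demand served with roughly half the infrastructure.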

Multitenancy

Dedicated servers cannot intelligently share their computing power. In a virtualized cloud environment, by contrast, a "central brain" of intelligent management services senses demand, provisions resources on demand, and consolidates load, making far more efficient use of resources.

Server Utilization Rates

Hosted data centers do a much better job of maximizing server resource utilization. In-house data centers are like loosely packed suitcases that are never full. By actively managing workloads and assigning CPU power exactly where and when it’s needed, the cloud packs that suitcase tight, so that each server operates at peak efficiency.

Data Center Efficiency

Technology infrastructure changes so fast that running a modern data center is like trying to service a car while driving it. When you use a shared cloud data center, you drive the car -- build applications for your business -- and leave the experts to worry about maintenance.

Cloud providers run data centers all day; they are specialists. Newer hardware runs faster and uses far less energy than older servers. Locating data centers in cooler climates and near renewable energy sources is another key way to save energy. Best-in-class providers build the most efficient buildings and constantly monitor and optimize their performance.

Ecosystem Investments

In the cloud, we’re always thinking about how to make mass computing more open, transparent and greener.

The Open Compute project shows what’s possible. In Prineville, Oregon, Facebook committed to using the best technology and environmental strategies available. As a result:

  • The data center consumes 52 percent less energy and 72 percent less water than other typical data centers.
  • The Prineville center is 38 percent more efficient and 24 percent cheaper to build than any of Facebook’s standard data centers.
  • The location of the data center is important. Taking advantage of renewable energy sources such as wind, solar and hydro, and of cooler climates, reduces the amount of power needed to cool the data center and eliminates carbon emissions from the electricity used.

The Future is in the Cloud

We’ve talked about data centers, but the entire computing industry has a role to play. The SMART 2020 report predicted that the information and communications technology (ICT) sector could facilitate emissions reductions of five times the sector’s own footprint -- up to 7.8 gigatonnes (Gt) of CO2 equivalent, or 15 percent of total expected worldwide emissions by 2020. But this will only happen if we take advantage of massive technological and spatial economies of scale, the latest in smart building design, and renewable energy. These are the kinds of investments cloud providers are making every day. That’s why we believe the future is green, and it’s in the cloud.


Rackspace® — [INFOGRAPHIC] How Cloud Computing is Saving the Earth
