Preview: Data Center 2.0 – The Sustainable Data Center

Data Center 2.0: The Sustainable Data Center is an in-depth look into the steps needed to transform modern-day data centers into sustainable entities.

To get an impression of the book you can read the prologue right here.

Prologue

In large parts of the world, computers, Internet services, mobile communication, and cloud computing have become a part of our daily lives, professional and private. Information and communication technology has invaded our lives and is recognized as a crucial enabler of economic and social activities across all sectors of our society. The opportunity to be connected anytime and anywhere, to communicate, interact, and exchange data, is changing the world.

During the last two decades, a digital information infrastructure has been created whose functioning is critical to our society: governmental and business processes and services depend on computers. Data centers, buildings that house computer servers along with network and storage systems, are a crucial part of this critical digital infrastructure. They are the physical manifestation of the digital economy and of the virtual and digital information infrastructure, where data is processed, stored, and transmitted.

A data center is a very peculiar and special place. It is the place where different worlds meet. It is a place where organizational (and individual) information needs and demands are translated into bits and bytes that are subsequently translated into electrons that are moved around the world. It is the place where the business, IT, and energy worlds come together. Jointly they form a jigsaw puzzle of stakeholders with different and sometimes conflicting interests and objectives that are hard to manage and to control.

Electricity is the foundation of all digital information processing and digital services that are mostly provided from data centers. The quality and availability of the data center stands or falls with the quality and availability of the power supply to the data center.

For data centers, the observation has been made that the annualized cost of power-related infrastructure has, in some cases, grown to equal the annualized capital cost of the IT equipment itself. Data centers have reached the point where the electricity costs of a server over its lifetime will equal or exceed the price of the hardware. Also, it is estimated that data centers are responsible for about 2% of total world electricity consumption.

It is therefore easy to understand why the topic of electricity usage of data centers is a subject of discussion.

Electricity is still mostly generated from fossil fuel-based primary energy resources such as coal, gas, and oil. But this carbon-constrained power sector is under pressure: resilience to a changing climate makes the decarbonization of these energy sources mandatory to ensure sustainability.

From different parts of society, the sustainability of data centers is being questioned: their energy efficiency and the indirect CO2 emissions caused by the consumption of carbon-based electricity are criticized.

The data center industry is working hard on these issues. According to the common view, it comes down to implementing technical measures. The idea is that more efficient power usage of servers, storage and network components, improved utilization, and better power and cooling management in data centers will solve the problems.

This idea can be questioned. Data centers are part of complex supply chains and have many stakeholders with differing perspectives, incomplete, contradictory, and changing requirements and complex interdependencies. In this situation there is no simple, clear definition of data center efficiency, and there is no simple right or optimal solution.

According to the Brundtland Commission of the United Nations, sustainability is “to meet the needs of the present without compromising the ability of future generations to meet their own needs.”

Given the fact that we are living in a world with limited resources and that the demand for digital infrastructure is growing exponentially, limits will be encountered. The limiting factor to future economic development is the availability and functioning of natural capital. Therefore, we need a new and better industrial model.

Creating sustainable data centers is not a technical problem but an economic problem to be solved.

A sustainable data center should be environmentally bearable, economically viable, and socially equitable.

This book takes a conceptual approach to the subject of data centers and sustainability. The proposition of the book is that we must fundamentally rethink the “data center equation” of “people, planet, profit” in order to become sustainable.

The scope of this search goes beyond the walls of the data center itself. Given the great potential of information technology to transform today’s society into one characterized by sustainability, what is the position of data centers?

The data center is the place where it all comes together: energy, IT, and societal demands and needs.

Sustainable data centers have great potential to help society optimize the use of resources and to eliminate or reduce waste of capital, human labor, and energy.

The idea is that a sustainable data center is based on economics, organization, people, and technology. This book offers multiple views on and aspects of sustainable data centers, to give readers a better understanding and to provoke thought on how to create sustainable data centers.

Creating a sustainable data center calls for a multidisciplinary approach and for different views and perspectives in order to obtain a good understanding of what is at stake.

The solution is, at the end of the day, a question of commitment.

Data Center 2.0 – The Sustainable Data Center (Update)

Data Center 2.0: The Sustainable Data Center is an in-depth look into the steps needed to transform modern-day data centers into sustainable entities. The book will be published at the beginning of the summer.

To get an impression see the following slide deck.

[Slide deck: Data Center 2.0 flyer, slides 1–9]

Data Center 2.0 – The Sustainable Data Center

I am currently busy with the final steps to get the forthcoming book ‘Data Center 2.0 – The Sustainable Data Center’ (ISBN 978-1499224689) published at the beginning of the summer.

Some quotes from the book:

“A data center is a very peculiar and special place. It is the place where different worlds meet each other. A place where organizational (and individual) information needs and demands are translated into bits and bytes that are subsequently translated into electrons that are moved around the world. It is the place where the business, IT and energy worlds come together. Jointly they form a jigsaw puzzle of stakeholders with different and sometimes conflicting interests and objectives that are hard to manage and to control.

Given the great potential of Information Technology to transform today’s society into one characterised by sustainability, what is the position of data centers?

[…]

The data center is the place where it all comes together: energy, IT and societal demands and needs.

[…]

A sustainable data center should be environmentally bearable, economically viable, and socially equitable. To become sustainable, the data center industry must free itself from the shackles of 19th-century ideas and concepts of production. They are too simple for our 21st-century world.

The combination of service-dominant logic and cradle-to-cradle makes it possible to create a sustainable data center industry.

Creating sustainable data centers is not a technical problem but an economic problem to be solved.”

The book takes a conceptual approach to the subject of data centers and sustainability. It offers multiple views on and aspects of sustainable data centers, to give readers a better understanding and to provoke thought on how to create sustainable data centers.

The book has already received endorsements from Paul-Francois Cattier, Global Senior Vice President Data Center of Schneider Electric, and John Post, Managing Director of Foundation Green IT Amsterdam region.

Table of contents

1 Prologue
2 Signs Of The Time
3 Data Centers, 21st Century Factories
4 Data Centers, A Critical Infrastructure
5 Data Centers And The IT Supply Chain
6 The Core Processes Of A Data Center
7 Externalities
8 A Look At Data Center Management
9 Data Center Analysis
10 Data Center Monitoring and Control
11 The Willingness To Change
12 On The Move: Data Center 1.5
13 IT Is Transforming Now!
14 Dominant Logic Under Pressure
15 Away From The Dominant Logic
16 A New Industrial Model
17 Data Center 2.0

Needed: a Six Sigma Datacenter

As usual there was a lot of discussion of cooling and energy efficiency at the yearly DatacenterDynamics conference in Amsterdam last week: finding point solutions to be efficient and/or creating redundancy to circumvent possible technical risks. But is this the way to go to optimise a complex IT supply chain?

In a lot of industries, statistical quality management methods are used to improve the quality of process outputs by identifying and removing the causes of defects (errors) and minimising variability in manufacturing and business processes. One of the more popular methods is Six Sigma, which utilises the DMAIC phases (Define, Measure, Analyse, Improve, Control) to improve processes.
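
To make the Measure phase concrete, here is a minimal Python sketch of the arithmetic behind the method: defects per million opportunities (DPMO) and the corresponding sigma level. The provisioning example and all numbers are hypothetical, purely for illustration.

    from statistics import NormalDist

    def dpmo(defects: int, units: int, opportunities_per_unit: int) -> float:
        """Defects per million opportunities, the core Six Sigma metric."""
        return defects / (units * opportunities_per_unit) * 1_000_000

    def sigma_level(dpmo_value: float) -> float:
        """Approximate sigma level, including the conventional 1.5-sigma shift."""
        return NormalDist().inv_cdf(1 - dpmo_value / 1_000_000) + 1.5

    # Hypothetical example: 120 failed provisioning requests out of 10,000,
    # each request having 3 opportunities for a defect.
    d = dpmo(defects=120, units=10_000, opportunities_per_unit=3)
    print(f"DPMO: {d:.0f}, sigma level: {sigma_level(d):.2f}")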

But when Eddie Desouza of Enlogic asked the audience (in one of the tracks at DatacenterDynamics) who was using the Six Sigma method to improve their datacenters, only three people out of a hundred raised their hands. Eddie Desouza was advocating the use of Six Sigma to improve the efficiency and quality of a datacenter. He made the observation that datacenters do apply substantial upfront reliability analysis and invest in costly redundant systems, but rarely commit to data-driven continuous improvement philosophies. In other words, they focus on fixing errors instead of optimising the chain by reducing unwanted variability and the associated costs of poor quality.

He also, rightly, emphasised that datacenter operators should use a system approach instead of a component approach in optimising the datacenter. The internal datacenter supply chain is only as strong as its weakest link, and there is also the risk of sub-optimisation.

An example of the necessity of using a system approach and industry methods like Six Sigma can be found in a blog post by Alex Benik about “the sorry state of server utilization”. He refers to some reports from the past five years:

• A McKinsey study in 2008 pegging data-center utilization at roughly 6 percent.

• A Gartner report from 2012 putting the industry-wide utilization rate at 12 percent.

• An Accenture paper sampling a small number of Amazon EC2 machines and finding 7 percent utilization over the course of a week.

• Charts and quotes from Google, which show three-month average utilization rates for 20,000-server clusters. A typical cluster spent most of its time running at between 20 and 40 percent of capacity; the highest-utilization cluster reached such heights (about 75 percent) only because it was doing batch work.

Or consider another source, the diagram below from the Green Grid:

Unused servers (c) The Green Grid

Why is this overlooked? Why isn’t there a debate about this weak link, this huge under-utilisation of servers and the huge waste of energy that results? Why focus on cooling, UPS, and so on, if we have this weak link in the datacenter?
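
Part of the answer may be that the waste is invisible unless you model it. Here is a minimal sketch assuming a simple linear server power model; the 400W peak draw and 50% idle fraction are assumptions for illustration, not figures from the reports above.

    def server_power(util: float, peak_w: float = 400.0, idle_frac: float = 0.5) -> float:
        """Linear power model: an idle server still draws idle_frac of peak power."""
        return peak_w * (idle_frac + (1 - idle_frac) * util)

    # Two fleets delivering the same useful work:
    # 100 servers at 10% utilisation vs 20 servers at 50% utilisation.
    sprawl = 100 * server_power(0.10)        # under-utilised fleet
    consolidated = 20 * server_power(0.50)   # same work, fewer servers
    print(f"Under-utilised fleet: {sprawl / 1000:.1f} kW")       # 22.0 kW
    print(f"Consolidated fleet:   {consolidated / 1000:.1f} kW") # 6.0 kW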

As shown in another blog post, saving 1 unit of power consumption in information processing saves about 98 units upstream in the power supply chain (that is, up to the power plant).
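
Such a source-to-chip multiplier is the product of the efficiencies of every stage in the chain, as the sketch below shows. The stage efficiencies here are illustrative assumptions and yield roughly 13:1; the 98:1 figure from the earlier post evidently assumes additional or less efficient stages.

    # Illustrative (assumed) efficiencies per stage of the power supply chain.
    stages = {
        "generation (thermal plant)":  0.35,
        "transmission & distribution": 0.92,
        "UPS and power distribution":  0.85,
        "cooling overhead (1/PUE)":    0.50,
        "server power supply":         0.80,
        "voltage regulators, fans":    0.70,
    }

    delivered = 1.0
    for efficiency in stages.values():
        delivered *= efficiency

    # Units of primary energy needed per unit consumed in processing.
    print(f"Upstream multiplier: {1 / delivered:.0f} : 1")  # ~13 : 1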

So it is all very well to discuss the energy efficiency of datacenter facility components, but what is that worth if this “sorry state of server utilisation” goes unnoticed and unaddressed? Eddie Desouza of Enlogic is right: datacenters need Six Sigma. It would help if datacenter operators embraced a system approach, focussing on the complete internal datacenter supply chain instead of on individual components, and using statistical quality management methods to improve efficiency and quality as in other industries.

Prepare for the Zettabyte Data Wave: right sizing and managing your data center

Recently two interesting and complementary reports from Cisco and Oracle became available with figures and trends associated with cloud computing and data centers. By the end of 2015, global datacenter IP traffic will reach 4.8 zettabytes (Cisco); a zettabyte is equal to 1 billion terabytes. The Oracle report reveals that many businesses appear to have been caught off guard by the boom in ‘Big Data’.

Along with greater computing capability, businesses have an increased demand for storing digital data, both in amount and in duration, due to new and existing applications and to regulations. To retrieve and transport the corresponding, exponentially rising amount of data, data transfers in both the (wired) Internet and wireless networks have been rising at the same speed. Data centers have become the backbone of the digital economy. They represent major investments for their owners and also cost a huge amount to run and maintain. They deserve to be well cared for.

According to the Cisco report, from 2000 to 2008 the Internet was dominated by peer-to-peer file-sharing traffic, which didn’t touch a datacenter. Since 2008 most Internet traffic has originated or terminated in a datacenter. It is estimated that global datacenter IP traffic will increase fourfold over the next 5 years, with a compound annual growth rate (CAGR) of 33 percent during the period 2011–2015. During this period the ratio between DC-to-User, DC-to-DC, and Internal-DC traffic will stay more or less the same; for 2015 the ratio is estimated at 17%/7%/76%.

DC traffic (c) Cisco
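
A quick sanity check of that arithmetic (a short Python fragment; the only input is the growth rate quoted above): a 33 percent CAGR compounds to roughly fourfold only over five years, which suggests the “fourfold” claim is measured from a 2010 baseline.

    # Compound a 33% annual growth rate over four and five years.
    cagr = 0.33
    for years in (4, 5):
        print(f"{years} years at {cagr:.0%} CAGR -> {(1 + cagr) ** years:.1f}x")
    # 4 years -> 3.1x, 5 years -> 4.2x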

In the same report Cisco states that from 2014 onward more than 50% of all workloads will be processed in the cloud, but that in 2015 traditional datacenters will still process 47% of the workload.

DC Workload (c) Cisco

According to Oracle’s second ‘Next Generation Data Centre Index’ report, businesses are reacting to this data tsunami with a short-term increase in outsourced data centre and cloud service use, while planning longer term to build their own in-house data centre facilities. Sustainability is also back on the agenda for 2012 as “businesses react either out of a need for a demonstrable green policy for governance reasons, or to reduce spiraling energy bills related to their IT use”.

Some highlights:

  • The proportion of data centre managers who see a copy of the energy bill has risen from 43.2 percent to 52.2 percent.
  • More than one third (36 percent) of data centre managers still have no visibility of energy usage, while almost 10 percent of respondents also doubt that anyone else sees a copy of the bill for data centre energy usage.
  • Virtualisation of IT hardware is gathering pace in the data centre but remains patchy, with only 12 percent of respondents having virtualised more than 70 percent of their IT estate, while 38 percent have virtualised less than 30 percent.
  • Worryingly, almost 39 percent still admit to second-guessing future workload requirements. However, the proportion that use advanced analytics or predictions based on historical usage has increased from 39 percent to 50 percent (a minimal sketch of such a prediction follows below).

Consolidation, Virtualisation, Utilisation (c) Oracle
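
As a minimal illustration of “predictions based on historical usage” rather than second-guessing (the sketch referenced in the list above), the following Python fragment fits an exponential trend to hypothetical monthly peak IT loads and projects a year ahead; the data and growth figures are invented for the example.

    import numpy as np

    # Hypothetical monthly peak IT load (kW) for the past 12 months.
    history = np.array([310, 318, 330, 336, 349, 360, 371, 385, 396, 412, 425, 440])
    months = np.arange(len(history))

    # Fit an exponential trend by regressing log(load) on time.
    slope, intercept = np.polyfit(months, np.log(history), 1)

    # Project 12 months ahead.
    future = np.exp(intercept + slope * np.arange(12, 24))
    print(f"Estimated monthly growth: {np.expm1(slope):.1%}")
    print(f"Projected peak load in 12 months: {future[-1]:.0f} kW")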

Combining the outcomes of the two reports, you can certainly question how well datacenters are prepared for the Zettabyte Data Wave that is coming.

One of the most significant challenges for the IT organisation was and is to coherently manage the quality attributes performance, availability, confidentiality, and integrity across the complete IT service stack. Energy usage as a quality attribute is a relatively new kid on the block. This ‘housing’ or site infrastructure attribute is composed of the power, cooling, and floor space sub-attributes. These attributes are not independent of each other: together they form a hidden threshold or hidden ceiling.

To paraphrase Greg Schulz, author of ‘The Green and Virtual Data Center’: “For a given data center these site infrastructure resources are constrained; therefore, together, these attributes form a certain threshold. If the demand for IT capacity reaches this threshold, further growth of the IT load is inhibited for technical (overheating, not enough power) and/or financial (excessive capital investment) reasons. In that case IT services are constrained and therefore business growth is inhibited, which causes economic penalties and lost opportunities.”
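
To make the hidden ceiling concrete, here is a small Python capacity check: the supportable IT load is capped by whichever site infrastructure attribute runs out first. All capacities and per-rack demands below are assumptions for illustration.

    # Assumed site capacities and per-rack demands.
    capacity = {"power_kw": 1200.0, "cooling_kw": 1000.0, "floor_racks": 250.0}
    per_rack = {"power_kw": 5.0, "cooling_kw": 5.5, "floor_racks": 1.0}

    # Maximum racks each attribute can support; the minimum is the
    # hidden ceiling, and the attribute that sets it is the weakest link.
    limits = {attr: capacity[attr] / per_rack[attr] for attr in capacity}
    binding = min(limits, key=limits.get)

    for attr, limit in limits.items():
        print(f"{attr}: supports {limit:.0f} racks")
    print(f"Hidden ceiling: {limits[binding]:.0f} racks, set by {binding}")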

Remember that 76% of the Zettabyte Data Wave will be internal datacenter traffic. To anticipate this coming Zettabyte Data Wave, you must manage your datacenter IT infrastructure AND site infrastructure in a coherent and consistent way. What does this Zettabyte Data Wave mean for your datacenter in terms of processing power, storage, and network capacity? What is the impact on power consumption and cooling capacity?

Do you have an appropriate datacenter architecture? And do you have appropriate tools (DCIM software) for integral datacenter management so that you don’t hit that hidden ceiling by surprise?

Iceland green data center initiatives

Data center operators want to cut energy usage and energy costs. The Nordic countries, with their cool temperatures, cool water, and electricity generation that is nearly entirely from renewable sources, are starting all kinds of initiatives to answer these needs. In an experimental phase there is tidal power generation at the Orkney Islands, and in Norway large projects have started to use old mines as data center locations.

Now in Iceland a third data center project has started, after Thor DC and greenqloud: Verne Global. To enable Verne Global to open for business by the end of this year, Colt will manufacture and ship a 500 m2 Modular Data Centre (MDC) from the UK to Iceland, where a total of 37 modules will be assembled and commissioned at the Verne Global Campus in Keflavik, Iceland.

Manufacturing and construction time for the first data center space should be less than four months. To follow the progress and build of the data center, Colt made an interactive portal where a countdown has already started. Jeff Monroe at Verne Global stated that the modular approach gives “… the opportunity to quickly scale capacity to address customer demand in a rapid timeframe.” Colt has customized its modular design for Iceland’s climate to ensure that free, fresh-air cooling is available 365 days a year.

Verne Global claims to reduce cooling costs by 80% or more. Also, by using electricity that is 100% sourced from geothermal and hydroelectric power plants, there is no carbon footprint. To give an idea of the impact: by transferring 8MW of critical load to a data centre here, you would save approximately 50,000 metric tons of CO2 annually. This is equivalent to savings of hundreds of thousands of pounds annually if you were to purchase carbon offsets on carbon exchanges.
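
The 50,000-ton figure can be roughly reconstructed with one assumption: the carbon intensity of the displaced grid, taken here as 0.71 metric tons of CO2 per MWh (typical of a fossil-heavy mix). Iceland’s geothermal/hydro supply is treated as zero-carbon, and cooling overhead is ignored in this sketch.

    critical_load_mw = 8.0
    hours_per_year = 8760
    annual_mwh = critical_load_mw * hours_per_year   # 70,080 MWh

    # Assumed carbon intensity of the displaced fossil-heavy grid.
    grid_t_co2_per_mwh = 0.71

    saved_t_co2 = annual_mwh * grid_t_co2_per_mwh
    print(f"Annual CO2 avoided: {saved_t_co2:,.0f} metric tons")  # ~50,000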

The data center facility is on the site of the former Keflavik Naval Air Station. Connectivity to the site is provided by redundant, high-capacity, multi-terabit-per-second connections including FARICE-1, DANICE and GREENLAND CONNECT.

The Icelanders are putting a lot of effort into building a data center industry in Iceland. It was only a year ago, the 21st of May to be precise, that the President of Iceland and Iceland’s Minister of Industry opened the first Icelandic data center, at Steinhella in Hafnarfjörður. Some interesting background information can be found in the report “Iceland: The Ultimate Location for Data Centers”, made by PricewaterhouseCoopers for the Invest in Iceland Agency, which is run by the Trade Council of Iceland and the Ministry of Industry.

Hopefully for them it will give a new boost to their North Atlantic economy, something dearly needed after the financial crisis struck Iceland last year.

Data centers are expected to consume 19% more energy

The world’s data centers are expected to consume 19% more energy in the next 12 months than they have in the past year, according to the results of a global industry census conducted by DatacenterDynamics (DCD). An interesting conclusion in the light of the new study Jonathan Koomey released on data center electricity use in 2010, a follow-up to the 2008 article “Worldwide electricity used in data centers”.

The 2007 EPA report to Congress on data centers (US EPA 2007) predicted a little less than a doubling in total data center electricity use from 2005 to 2010 if historical trends continued. Instead, in the U.S., the electricity used by data centers increased about 36 percent from 2005 to 2010, and worldwide electricity consumption by data centers increased about 56 percent over the same period.

With the DCD forecast of 19% energy growth in the next 12 months, it looks like we are back on track again.
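
The arithmetic behind “back on track”, as a short Python check: doubling over five years corresponds to about 15 percent per year, the observed 2005–2010 growth works out to 6–9 percent per year, and the DCD forecast of 19 percent for a single year exceeds even the doubling trend.

    def annualized(total_growth: float, years: int) -> float:
        """Convert growth over a period into an equivalent annual rate."""
        return (1 + total_growth) ** (1 / years) - 1

    print(f"Doubling in 5 years   : {annualized(1.00, 5):.1%}/yr")  # ~14.9%
    print(f"US 2005-2010 (+36%)   : {annualized(0.36, 5):.1%}/yr")  # ~6.3%
    print(f"World 2005-2010 (+56%): {annualized(0.56, 5):.1%}/yr")  # ~9.3%
    print("DCD forecast          : 19.0%/yr")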

Data centers currently consume about 31GW, the census concludes. The average total power per rack is about 4.05kW, with 58% of racks consuming up to 5kW per rack, 28% consuming from 5kW to 10kW per rack, and the rest consuming more than 10kW per rack.

Because energy demand is expected to rise so much, data center owners and operators are concerned about energy cost and availability. Analysis of the census data concluded that energy cost and availability is the number-one concern for them:

  • 44% believe that increased energy costs will have a significant impact on their data center operations in the next 12 months – this is the highest-ranked issue.
  • 29% are concerned about the significant impact of energy availability (or the lack of it).

Energy concerns (c) DatacenterDynamics Global Industry Census 2011

Data center monitoring is driven primarily by maintaining availability (56%), followed by reducing costs (31%) and reducing environmental impact (13%). According to DCD, energy efficiency is monitored continuously by only a minority of 42%, although an equivalent proportion monitor it less regularly. This pattern is repeated for carbon emissions and is consistent with the lower priority given to the environmental impact of the data center.

Energy monitoring (c) DatacenterDynamics Global Industry Census 2011

Notwithstanding these concerns, big data centers are still being built in areas (for example London and Amsterdam) where lack of power supply has been touted as a constraining issue for years.

For example in the London arena:

  • Telehouse West, opened last March, 7.5MW of new capacity.
  • Telecity Harbour Exchange, 6MW opening in 2 phases.

And in the Amsterdam arena:

  • Switch, 8,320 m2
  • Equinix AM3 (6,400 m2 in two phases)
  • Terremark, 2,800 m2 first phase (10,000 m2 additional)

How can we explain these activities if power is in such tight supply?