Data Center 2.0 – The Sustainable Data Center, Now Available!

Data Center 2.0 – The Sustainable Data Center is now available.

The book is showing up on Amazon and will soon start to pop up on the websites of other e-tailers.

Data Center 2.0 – The Sustainable Data Center is an in-depth look into the steps needed to transform modern-day data centers into sustainable entities.

See the press release:

Some nice endorsements were received:

“Data Center 2.0, is not so much about technology but about people, society and economic development. By helping readers understand that even if Data Centers, enabling the Digital economy, are contributing a lot to energy saving, they need to be sustainable themselves; Rien Dijkstra is on the right track. When explaining how to build sustainable Data Centers, through multi disciplinary approach, breaking the usual silos of the different expertise, Rien Dijkstra is proposing the change of behavior needed to build sustainable Data Centers. Definitely it is about people, not technology.” 

Paul-Francois Cattier, Global Senior Vice-President Data Center – Schneider Electric

“In Data Center 2.0 The Sustainable Data Center author Rien Dijkstra has gone several steps further in viewing the data center from the perspective of long term ownership and efficiency in combination with treating it as a system. It’s an excellent read with many sections that could be extracted and utilized in their own right. I highly recommend this read for IT leaders who are struggling with the questions of whether to add capacity (co-locate, buy, build, or lease) or how to create a stronger organizational ownership model for existing data center capacity. The questions get more complex every year and the risks more serious for the business. The fact that you’re making a business critical decision that must stand the test of technology and business change over 15 years is something you shouldn’t take lightly.” 

Mark Thiele, President and Founder Data Center Pulse

“Data centers used to be buildings to house computer servers along with network and storage systems, a physical manifestation of the Digital Economy. Internet of Things, the digitization of about everything in and around us, brings many profound changes. A data center is the place where it all comes together. Physical and digital life, fueled by energy and IT, economical and social demands and needs and not to forget sustainability considerations. Sustainable data centers have a great potential to help society to optimize the use of resources and to eliminate or reduce wastes of capital, human labor and energy. A data center in that sense is much more than just a building for servers. It has become a new business model. Data center 2.0 is a remarkable book that describes the steps and phases to facilitate and achieve this paradigm.” 

John Post, Managing Director – Foundation Green IT Amsterdam region

Preview Data Center 2.0 – The Sustainable Data Center

Data Center 2.0: The Sustainable Data Center is an in-depth look into the steps needed to transform modern-day data centers into sustainable entities.

To get an impression of the book you can read the prologue right here.

Prologue

In large parts of the world, computers, Internet services, mobile communication, and cloud computing have become part of our daily lives, professional and private. Information and communication technology has invaded our lives and is recognized as a crucial enabler of economic and social activities across all sectors of our society. The opportunity to be connected anytime, anywhere, to communicate, interact, and exchange data, is changing the world.

During the last two decades, a digital information infrastructure has been created whose functioning is critical to our society; governmental and business processes and services depend on computers. Data centers, buildings that house computer servers along with network and storage systems, are a crucial part of this critical digital infrastructure. They are the physical manifestation of the digital economy and of the virtual and digital information infrastructure, where data is processed, stored, and transmitted.

A data center is a very peculiar and special place. It is the place where different worlds meet each other. It is a place where organizational (and individual) information needs and demands are translated into bits and bytes that are subsequently translated into electrons that are moved around the world. It is the place where the business, IT, and energy worlds come together. Jointly they form a jigsaw puzzle of stakeholders with different and sometimes conflicting interests and objectives that are hard to manage and control.

Electricity is the foundation of all digital information processing and digital services that are mostly provided from data centers. The quality and availability of the data center stands or falls with the quality and availability of the power supply to the data center.

For data centers, the observation has been made that the annualized costs of power-related infrastructure have, in some cases, grown to equal the annualized capital costs of the IT equipment itself. Data centers have reached the point where the electricity costs of a server over its lifetime will equal or surpass the price of the hardware. It is also estimated that data centers are responsible for about 2% of total world electricity consumption.
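As a back-of-the-envelope illustration (the power draw, lifetime, tariff, and PUE figures below are assumptions for the sketch, not figures from the book):

```python
# Rough sketch: lifetime electricity cost of a server vs. its purchase price.
# All figures are illustrative assumptions, not data from the book.

def lifetime_energy_cost(avg_power_w, years, price_per_kwh, pue=2.0):
    """Electricity cost over the server's life, including the overhead
    of power and cooling infrastructure via a PUE multiplier."""
    hours = years * 365 * 24
    kwh = avg_power_w / 1000 * hours * pue
    return kwh * price_per_kwh

cost = lifetime_energy_cost(avg_power_w=300, years=4, price_per_kwh=0.12)
print(f"Lifetime electricity cost: ${cost:,.0f}")  # ≈ $2,523 -- comparable to the hardware price
```

With these assumed numbers, the four-year electricity bill indeed lands in the same range as the purchase price of a commodity server.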

It is therefore easy to understand why the topic of electricity usage of data centers is a subject of discussion.

Electricity is still mostly generated from fossil fuel-based primary energy resources such as coal, gas, and oil. But this carbon-constrained power sector is under pressure. Resilience to a changing climate makes the decarbonization of these energy sources mandatory to ensure sustainability.

From different parts of society, the sustainability of data centers is questioned. Energy efficiency and the indirect CO2 emissions caused by the consumption of carbon-based electricity are criticized.

The data center industry is working hard on these issues. According to the common view, it comes down to implementing technical measures. The idea is that more efficient power usage of servers, storage and network components, improved utilization, and better power and cooling management in data centers will solve the problems.

This idea can be questioned. Data centers are part of complex supply chains and have many stakeholders with differing perspectives; incomplete, contradictory, and changing requirements; and complex interdependencies. In this situation there is no simple, clear definition of data center efficiency, and there is no simple right or optimal solution.

According to the Brundtland Commission of the United Nations, sustainability is “to meet the needs of the present without compromising the ability of future generations to meet their own needs.”

Given that we are living in a world with limited resources and that the demand for digital infrastructure is growing exponentially, limits will be encountered. The limiting factor to future economic development is the availability and the functioning of natural capital. Therefore, we need a new and better industrial model.

Creating sustainable data centers is not a technical problem but an economic problem to be solved.

A sustainable data center should be environmentally viable, economically equitable, and socially bearable.

This book takes a conceptual approach to the subject of data centers and sustainability. The proposition of the book is that we must fundamentally rethink the “data center equation” of “people, planet, profit” in order to become sustainable.

The scope of this search goes beyond the walls of the data center itself. Given the great potential of information technology to transform today’s society into one characterized by sustainability, what is the position of data centers?

The data center is the place where it all comes together: energy, IT, and societal demands and needs.

Sustainable data centers have a great potential to help society to optimize the use of resources and to eliminate or reduce wastes of capital, human labor and energy.

The idea is that a sustainable data center is based on economics, organization, people, and technology. This book offers multiple views on and aspects of sustainable data centers to allow readers to gain a better understanding and to provoke thoughts on how to create sustainable data centers.

Creating a sustainable data center calls for a multidisciplinary approach and for different views and perspectives in order to obtain a good understanding of what is at stake.

The solution is, at the end of the day, a question of commitment.

Data Center 2.0 – The Sustainable Data Center

Currently busy with the final steps to get the forthcoming book ‘Data Center 2.0 – The Sustainable Data Center’ (ISBN 978-1499224689) published at the beginning of the summer.

Some quotes from the book:

“A data center is a very peculiar and special place. It is the place where different worlds meet each other. A place where organizational (and individual) information needs and demands are translated in bits and bytes that are subsequently translated in electrons that are moved around the world. It is the place where the business, IT and energy world come together. Jointly they form a jigsaw puzzle of stakeholders with different and sometimes conflicting interests and objectives that are hard to manage and to control.

Given the great potential of Information Technology to transform today’s society into one characterised by sustainability, what is the position of data centers?

……..

The data center is the place where it all comes together: energy, IT and societal demands and needs.

…….

A sustainable data center should be environmentally viable, economically equitable, and socially bearable. To become sustainable, the data center industry must free itself from the shackles of 19th-century ideas and concepts of production. They are too simple for our 21st-century world.

The combination of service-dominant logic and cradle-to-cradle makes it possible to create a sustainable data center industry.

Creating sustainable data centers is not a technical problem but an economic problem to be solved.”

The book takes a conceptual approach to the subject of data centers and sustainability. It offers multiple views on and aspects of sustainable data centers to allow readers to gain a better understanding and to provoke thoughts on how to create sustainable data centers.

The book has already received endorsements from Paul-Francois Cattier, Global Senior Vice President Data Center of Schneider Electric, and John Post, Managing Director of Foundation Green IT Amsterdam region.

Table of contents

1 Prologue
2 Signs Of The Time
3 Data Centers, 21st Century Factories
4 Data Centers A Critical Infrastructure
5 Data Centers And The IT Supply Chain
6 The Core Processes Of A Data Center
7 Externalities
8 A Look At Data Center Management
9 Data Center Analysis
10 Data Center Monitoring and Control
11 The Willingness To Change
12 On The Move: Data Center 1.5
13 IT Is Transforming Now!
14 Dominant Logic Under Pressure
15 Away From The Dominant Logic
16 A New Industrial Model
17 Data Center 2.0

VMworld Europe: Transform Now!


Transform Now!

Last week at VMworld Barcelona, attention was given not only to products and solutions across the three key layers of Infrastructure, Application Platform, and End-User technology, but also to how to transform the traditional IT department into one that is ready for the Cloud Era.

With a specific conference track, ‘IT Transformation’, VMware paid tribute to the fact that on-demand services based on automated, policy-based provisioning and deployment will change the current operational model of IT.

With the introduction of cloud computing we are witnessing the transition from an ‘artisan’ way of IT production to an industrial one. Handcrafted, dedicated IT infrastructure silos will be converted into industrially made, commoditized, and generic horizontal IT infrastructure platforms. A new operational model of IT is needed to make this transition possible. In several presentations, VMware staff explained that over the last year they had worked on an IT Transformation model that should support customers in making this move.

A three-step capability model was presented: starting as a traditional, reactive IT organization, according to VMware the IT organization has to change into a proactive service broker and finally gain a position as an innovative, strategic partner for the business. To make this happen, four work tracks or swim lanes were defined. The first covers the well-known Technology and Architecture activities, answering the question of which technology should be used to build a software-defined infrastructure. The others are about answering the question “How do I operate in this new world?”, addressing processes & control; people, culture & organization; and IT business management. VMware announced that its view on how to transform IT will soon be supported by improved or new products for automated provisioning and deployment and for monitoring the capacities, qualities, and costs of IT services. New consulting services, education offerings, and certifications around IT Transformation can also be expected.

VMware made it very clear that the only way to be successful in the new Cloud Era is by transforming your current IT organization. Paying attention only to technology is a dead-end street. This change will require a lot of effort.

Besides the technology costs, the organizational transition costs of cloud computing are something that should not be forgotten.


Prepare for the Zettabyte Data Wave: right sizing and managing your data center

Lately, two interesting and complementary reports from Cisco and Oracle became available, with figures and trends associated with cloud computing and data centers. By the end of 2015, global data center IP traffic will reach 4.8 zettabytes (Cisco); a zettabyte is equal to 1 billion terabytes. The Oracle report reveals that many businesses appear to have been caught off guard by the boom in ‘Big Data’.

Along with greater computing capability, businesses have an increased demand for storing digital data, both in amount and duration, due to new and existing applications and to regulations. To retrieve and transport the correspondingly, exponentially rising amount of data, data transfers in both the (wired) Internet and wireless networks have been rising at the same speed. Data centers have become the backbone of the digital economy. They represent major investments for their owners, and they also cost a huge amount to run and maintain. They deserve to be well cared for.

According to the Cisco report, from 2000 to 2008 the Internet was dominated by peer-to-peer file-sharing traffic. This traffic didn’t touch a data center. Since 2008, most Internet traffic has originated or terminated in a data center. It is estimated that global data center IP traffic will increase fourfold over the next 5 years, with a compound annual growth rate (CAGR) of 33 percent during the period 2011–2015. During this period the ratio between DC-to-User, DC-to-DC, and Internal-DC data traffic will stay more or less the same; for 2015 the ratio is estimated at 17%/7%/76%.
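Those two Cisco figures are easy to sanity-check against each other: 33 percent compound annual growth over five years does indeed come out at roughly fourfold:

```python
# Sanity check of the Cisco figures: 33% compound annual growth over
# the five-year 2011-2015 period gives roughly a fourfold increase.

def growth_factor(cagr, years):
    """Total growth multiple after compounding `cagr` for `years` years."""
    return (1 + cagr) ** years

factor = growth_factor(0.33, 5)
print(f"Growth over 5 years at 33% CAGR: {factor:.2f}x")  # ≈ 4.16x
```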

DC traffic (c) Cisco

In the same report, Cisco states that from 2014 onward more than 50% of all workloads will be processed in the cloud, though in 2015 traditional data centers will still process 47% of the workload.

DC Workload (c) Cisco

According to Oracle’s second ‘Next Generation Data Centre Index’ report businesses are reacting to this data tsunami with a short term increase in outsourced data centre and cloud service use, while planning longer term to build their own in-house data centre facilities. Sustainability is also back on the agenda for 2012 as “businesses react either out of a need for a demonstrable green policy for governance reasons, or to reduce spiraling energy bills related to their IT use”.

Some highlights:

  • The proportion of data centre managers who see a copy of the energy bill has risen from 43.2 percent to 52.2 percent.
  • More than one third (36 percent) of data centre managers still have no visibility of energy usage, while almost 10 percent of respondents also doubt that anyone else sees a copy of the bill for data centre energy usage.
  • Virtualisation of IT hardware is gathering pace in the data centre but remains patchy, with only 12 percent of respondents having virtualised more than 70 percent of their IT estate; 38 percent have virtualised less than 30 percent.
  • Worryingly, almost 39 percent still admit to second-guessing future workload requirements. However, the proportion that uses advanced analytics or predictions based on historical usage has increased from 39 percent to 50 percent.

Consolidation, Virtualisation, Utilisation (c) Oracle

Combining the outcomes of the two reports, you can certainly question how well data centers are prepared for the coming Zettabyte Data Wave.

One of the most significant challenges for the IT organisation was, and is, to coherently manage the quality attributes of performance, availability, confidentiality, and integrity for the complete IT service stack. Energy usage as a quality attribute is a relatively new kid on the block. This ‘housing’ or site infrastructure attribute is composed of the power, cooling, and floor space sub-attributes. These attributes are not independent of each other. Together they form a hidden threshold or hidden ceiling.

To paraphrase Greg Schulz, author of ‘The Green and Virtual Data Center’: “For a given data center these site infrastructure resources are constrained; therefore, together, these attributes form a certain threshold. If the demand for IT capacity reaches this threshold, further growth of the IT load is inhibited for technical (overheating, not enough power) and/or financial (excessive capital investment) reasons. In that case IT services are constrained and business growth is inhibited, which causes economic penalties and lost opportunities.”
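The hidden ceiling can be sketched as a simple headroom check; the capacity and demand figures below are made-up illustrations, not data from any report:

```python
# Sketch of the "hidden ceiling": IT growth is capped by whichever site
# infrastructure resource (power, cooling, floor space) runs out first.
# All capacity and demand figures are illustrative assumptions.

def headroom(capacity, demand):
    """Growth factor remaining before each resource hits its ceiling."""
    return {res: capacity[res] / demand[res] for res in capacity}

capacity = {"power_kw": 1000, "cooling_kw": 900, "floor_m2": 400}
demand   = {"power_kw": 700,  "cooling_kw": 800, "floor_m2": 250}

room = headroom(capacity, demand)
bottleneck = min(room, key=room.get)
print(f"Bottleneck: {bottleneck}, IT load can grow {room[bottleneck]:.2f}x")
```

In this made-up example cooling, not power or floor space, is the ceiling: the IT load can grow only about 12 percent before hitting it, even though plenty of power and space remain.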

Remember that 76% of the Zettabyte Data Wave will be internal data center traffic. To anticipate this coming wave, you must manage your data center IT infrastructure AND site infrastructure in a coherent and consistent way. What does this Zettabyte Data Wave mean for your data center in terms of processing power, storage, and network capacity? What is the impact on power consumption and cooling capacity?

Do you have an appropriate datacenter architecture? And do you have appropriate tools (DCIM software) for integral datacenter management so that you don’t hit that hidden ceiling by surprise?

Greener IT Can Form a Solid Base For a Low-Carbon Society

Precisely a year ago we launched the book Greening IT in print and online (free to download). And if I may say so, the book is still worth the effort of reading.

The book aims at promoting awareness of the potential of Greening IT, such as Smart Grid, Cloud Computing, Thin Clients and Greening Supply Chains. The chapter “Why Green IT is Hard – An Economic Perspective” is my contribution to this book. See Greening IT and read the following press release.

Press release Greening IT

Information Technology holds great potential for making society greener. Information Technology will, if we use it wisely, lead the way to resource efficiency, energy savings and greenhouse gas emission reductions – taking us to the Low-Carbon Society.

The IT sector itself, responsible for 2% of global greenhouse gas emissions, can get greener by focusing on energy efficiency and better technologies – we call this Green IT. Yet, IT also has the potential to reduce the remaining 98% of emissions from other sectors of the economy – by optimising resource use and saving energy etc. We call this the process of Greening IT. IT can provide the technological fixes we need to reduce a large amount of greenhouse gas emissions from other sectors of society and obtain a rapid stabilisation of global emissions. “There is no other sector where the opportunities for greenhouse gas emission reductions, through the services provided, hold such potential as the IT industry”, says Adrian Sobotta, president of the Greening IT Initiative, Founding Editor and author of the book.

In her foreword to the book, European Commissioner for Climate Action, Connie Hedegaard writes: “All sectors of the economy will need to contribute…, and it is clear that information and communication technologies (ICTs) have a key role to play. ICTs are increasingly recognised as important enablers of the low-carbon transition. They offer significant potential – much of it presently untapped – to mitigate our emissions. This book focuses on this fundamental role which ICTs play in the transition to a low-carbon society.”

The book aims at promoting awareness of the potential of Greening IT, such as Smart Grid, Cloud Computing and thin clients. It is the result of an internationally collaborative, non-profit making, Creative Commons-licensed effort – to promote greening IT.

“There is no single perfect solution; Green IT is not a silver bullet. But already today, we have a number of solutions ready to do their part of the work in greening society. And enough proven solutions and implementations for us to argue not only that IT has gone green, but also that IT is a potent enabler of greenhouse gas emission reductions”, says Adrian Sobotta.

It is clear that the messages in the book put a lot of faith into technologies. Yet, technologies will not stand alone in this immense task that lies before us. “Technology will take us only so far. Changing human behaviour and consumption patterns is the only real solution in the longer-term perspective”, continues Adrian Sobotta. IT may support this task, by confronting us with our real-time consumption – for instance through Smart Grid and Smart Meters – thereby forcing some of us to realise our impact.

But technologies, such as Green Information Technologies, are not going to disperse themselves. Before betting on new technologies, we need to establish long-term security of investments. And the only way to do this is to have an agreed long-term set of policy decisions that create the right incentives to promote the development we need.

Creating awareness with Infographics: Power impacts and solutions for Data Centers

A lot of people are pleased to use all kinds of digital services any place, anywhere, any time, but are not aware that to make this possible we are using vast data centers to provide these services.

Creating awareness, getting the message around that we must manage this energy consumption properly, is still an important job to do.

A picture is worth a thousand words. To illustrate this saying, ABB has prepared a very nice and readable infographic showing some of the major impacts and mitigating solutions of the power consumption of data centers. A five-minute read and you are up to date.

Let’s have more of these kinds of infographics to get the message around.

ABB Data Center Infographic

Asian Tigers still have something to learn about Green IT

In Asia, the larger data centres tend to be based in the most expensive cities, such as Tokyo, Hong Kong, Singapore or Shanghai. For almost ten years there has been impressive and continuous growth in data centers in the Asia-Pacific market. According to Chengyu Wu from Frost & Sullivan, this growth in Asia-Pac will continue at a CAGR (compound annual growth rate) of 14.6 percent (2009–2011). Demand for data centre hosting, Wu adds, currently exceeds supply. “In fact, over 80 percent of the major data centres in Asia-Pacific are running at close to 90 percent capacity and space is at a premium.”

While the outlook appears highly promising, data centre operators struggle with the high cost of operations, which has increased exponentially in recent times. According to Frost & Sullivan director Jayesh Easwaramony, “Power costs can often account for more than 50 percent of the overall operational expenditure (OPEX) of a data centre, while real estate pricing could also seriously inflate costs.”

Earlier this year, the ZDNet Asia IT Priorities 2010 survey showed that green initiatives scored lowest as an IT priority among Asian businesses. In a recent ZDNet Asia interview, Chris McPherson of Raritan stated that Asian companies are not yet seeing the full importance of implementing green technologies.

This is a strange thing, given the incentives of expensive data center locations, the enormous power costs, and, not to forget, the fact that power, cooling, and floor space together form a certain data center threshold and can therefore prohibit growth. The demand for IT capacity can’t go beyond this threshold because of power shortage, overheating, and/or lack of floor space. This creates the risk that demand for IT capacity suddenly can’t be fulfilled and growth comes to a grinding halt.

One way to “push” for green uptake, McPherson said, is to have governments either reduce subsidized power bills or increase subsidies for green energy. However, these incentives are currently slow and minimal. Analyst Chengyu Wu pointed out that discussions have centered mainly around concepts such as virtualization and utility computing emerging in the data center segment. McPherson agreed that virtualization is one way to help companies manage green costs, since fewer servers need to be deployed, which consequently brings about savings in real estate expenditure. McPherson emphasized employing information and management tools that help companies find out what is really happening at the device level and measure individual energy consumption, in order to make better decisions on reducing spending and maximizing savings. “The electricity bill is for the total cost of the data center, but to break that down as to what each component is costing you, it is only recently that tools are available to do so,” he said.

To use these tools, first a paradox must be solved. Because if you don’t see the issue, why spend money on tools? So who starts using them? Who is aware of the issue, who is responsible, who feels the pain and takes action? We need publications of real cases of reducing energy consumption and energy costs in the data center environment to get things started.

P.S. Are the Asian Tigers the only ones who still have something to learn about Green IT? I don’t think so …

Energy Elasticity: The Adaptive Data Center

Data centers are significant consumers of energy, and it is also increasingly clear that there is much room for improved energy usage. However, there does seem to be a preference for focusing on energy efficiency rather than energy elasticity. Energy elasticity is the degree to which energy consumption changes when the workload to be processed changes. For example, an IT infrastructure with a high degree of energy elasticity is one that consumes significantly less power when it is idle than when it is running at its maximum processing potential. Conversely, an IT infrastructure with a low degree of energy elasticity consumes almost the same amount of electricity whether it is in use or idle. We can use this simple equation:

Elasticity = (% change in energy usage / % change in workload)

If elasticity is greater than or equal to one, the curve is considered to be elastic. If it is less than one, the curve is said to be inelastic.
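In code, using before/after measurements to obtain the percentage changes (the workload and power numbers below are illustrative, not from the text):

```python
# Energy elasticity: ratio of the percentage change in energy use to the
# percentage change in workload. Values near 1 mean energy-proportional
# ("elastic") behaviour; values near 0 mean the equipment draws almost
# the same power regardless of load ("inelastic").

def energy_elasticity(workload_before, workload_after, energy_before, energy_after):
    pct_workload = (workload_after - workload_before) / workload_before
    pct_energy = (energy_after - energy_before) / energy_before
    return pct_energy / pct_workload

# Illustrative numbers: workload drops 90% (75 -> 7.5 units) while power
# only drops 5% (400 W -> 380 W) -- a highly inelastic server.
e = energy_elasticity(75, 7.5, 400, 380)
print(f"Elasticity: {e:.3f}")  # ≈ 0.056, far below 1 -> inelastic
```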

Given that it isn’t unusual for servers to operate under ten percent average utilization, and that most servers don’t have high energy elasticity (according to IDC, a server operating at 10% utilization still consumes the same power and cooling as a server operating at 75% utilization), it is worthwhile to focus more on energy elasticity. A picture can say more than words, and this energy elasticity issue is very well visualized in a presentation by Clemens Pfeiffer, CTO of Power Assure, at the NASA IT Summit 2010. As you can see, without optimization for energy elasticity, power consumption is indifferent to changes in application load.

Load Optimization (c)Power Assure

Servers

Barroso and Holzle of Google have made the case for energy-proportional (energy-elastic) computing, based on the observation that servers in data centers today operate at well below peak load levels on average. According to them, energy-efficiency characteristics are primarily the responsibility of component and system designers: “They should aim to develop machines that consume energy in proportion to the amount of work performed.” A popular technique for delivering some degree of energy-proportional behavior in servers right now is consolidation using virtualization. By abstracting your application from the hardware, you can shift things across a data center dynamically. These techniques:

  • utilize heterogeneity to select the most power-efficient servers at any given time
  • utilize live Virtual Machine (VM) migration to vary the number of active servers in response to workload variation
  • provide control over power consumption by allowing the number of active servers to be increased or decreased one at a time.
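A minimal sketch of the last point, varying the number of active servers one at a time with the workload (the capacity and power figures are illustrative assumptions, not from any vendor):

```python
import math

# Sketch of consolidation-based energy proportionality: power on only as
# many servers as the current workload needs, migrating VMs onto them.
# Capacity and power figures are illustrative assumptions.

SERVER_CAPACITY = 100   # workload units one server can handle
ACTIVE_POWER_W = 350    # draw of a powered-on server
CLUSTER_SIZE = 20

def active_servers_needed(workload):
    """Smallest number of servers that covers the workload (at least one)."""
    return min(CLUSTER_SIZE, max(1, math.ceil(workload / SERVER_CAPACITY)))

def cluster_power(workload):
    """Cluster draw when idle servers are powered off entirely."""
    return active_servers_needed(workload) * ACTIVE_POWER_W

for load in (50, 450, 1200):
    print(f"load={load}: {active_servers_needed(load)} servers, {cluster_power(load)} W")
```

Compared with keeping all 20 servers on (7,000 W in this sketch), the cluster's draw now rises and falls in steps with the load, which is exactly the elastic behaviour the bullet points describe.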

Although servers are the biggest consumers of energy, storage and network devices are consumers too. The EPA Report to Congress on Server and Data Center Energy Efficiency suggests that servers account on average for about 75 percent of total IT equipment energy use, storage devices for around 15 percent, and network equipment for around 10 percent. For storage and network devices, energy elasticity is also a relevant issue.

Storage

Organizations have an increased demand for storing digital data, both in amount and duration, due to new and existing applications and to regulations. As stated in research by Florida University and IBM, it is expected that storage energy consumption will continue to increase in the future as data volumes grow and disk performance and capacity scaling slow:

  • storage capacity per drive is increasing more slowly, which will force the acquisition of more drives to accommodate growing capacity requirements
  • performance improvements per drive have not and will not keep pace with capacity improvements.

Storage will therefore consume an increasing percentage of the energy used by the IT infrastructure. Of the data set being stored, only a small part is active. So it is the same story as for servers: on average, storage operates at well below peak load levels. A potential energy reduction of 40–75% by using an energy-proportional system is claimed. According to the same research, several storage energy saving techniques are available:

  • Consolidation: Aggregation of data into fewer storage devices whenever performance requirements permit.
  • Tiering/Migration: Placement/movement of data into storage devices that best fit its performance requirements
  • Write off-loading: Diversion of newly written data to enable spinning down disks for longer periods
  • Adaptive seek speeds: Allow trading off performance for power reduction by slowing the seek and waiting an additional rotational delay before servicing the I/O.
  • Workload shaping: Batching I/O requests to allow hard disks to enter low power modes for extended periods, or to allow workload mix optimizations .
  • Opportunistic spindown: Spinning down hard disks when idle for a given period.
  • Spindown/MAID: Keeping disks that hold unused data spun down most of the time.
  • Dedup/compression: Storing smaller amounts of data by deduplicating and compressing it.
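The 40-75% figure claimed above comes from the large fixed idle cost of today's drives. A minimal sketch of the arithmetic, with illustrative (not measured) power numbers, comparing a typical drive against an ideal energy-proportional one:

```python
# Sketch: why energy-proportional storage could save large fractions of
# energy at low utilization. The wattages are illustrative assumptions.
def drive_power(util, p_idle=7.0, p_peak=11.0):
    """Power draw (watts) of a typical disk: a large fixed idle cost
    plus a small utilization-dependent component."""
    return p_idle + (p_peak - p_idle) * util

def proportional_power(util, p_peak=11.0):
    """Ideal energy-proportional drive: power scales with utilization."""
    return p_peak * util

for util in (0.1, 0.3, 0.5):
    actual = drive_power(util)
    ideal = proportional_power(util)
    saving = 1 - ideal / actual
    print(f"util={util:.0%}: actual={actual:.1f}W "
          f"ideal={ideal:.1f}W saving={saving:.0%}")
```

At the low average utilizations described above, the fixed idle power dominates, which is where the claimed savings range comes from.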

Storage virtualization can also help, but component and system designers should aim to develop machines that consume energy in proportion to the amount of work performed. There is still a way to go to get energy elastic storage.

Network

According to a paper presented at the USENIX NSDI’10 conference, “today’s network elements are also not energy proportional: fixed overheads such as fans, switch chips, and transceivers waste power at low loads. Even though the traffic varies significantly with time, the rack and aggregation switches associated with these servers draw constant power.” And again the same recipe crops up: component and system designers should aim to develop machines that consume energy in proportion to the amount of work performed. In addition, as the paper explains, some kind of network optimizer must monitor traffic requirements, choose and adjust the network components to meet the energy, performance and fault tolerance requirements, and power down as many unneeded links and switches as possible. In this way, average savings of 25-40% of the network energy in data centers are claimed.
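The network optimizer idea can be sketched as a small capacity calculation: keep only as many links active as the offered traffic requires, plus spares for fault tolerance, and power the rest down. Link capacities and per-link wattages below are assumptions for illustration, not figures from the paper:

```python
import math

# Sketch of a network optimizer: activate the minimum number of links
# needed for the current traffic (plus redundancy), power down the rest.
def links_needed(demand_gbps, link_capacity_gbps=10.0, redundancy=1):
    """Minimum active links for the offered load, plus spare links."""
    return math.ceil(demand_gbps / link_capacity_gbps) + redundancy

def network_power(active_links, watts_per_link=5.0):
    """Only active links draw power; powered-down links draw none."""
    return active_links * watts_per_link

total_links = 16
for demand in (20, 60, 140):
    active = links_needed(demand)
    saving = 1 - network_power(active) / network_power(total_links)
    print(f"{demand} Gbps -> {active}/{total_links} links active, "
          f"{saving:.0%} network power saved")
```

At low traffic most links can be powered down, which is where the claimed 25-40% average savings come from; near peak traffic the savings shrink toward zero.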

Cooling

When making servers, storage and the network in data centers energy-proportional, we also need to take air-conditioning and cooling into account. Fluctuations in energy usage are equivalent to fluctuations in heat, and the question is whether air-conditioning can be zoned up and down quickly enough to cool the particular data center zones that see increased server, storage or network use. As Dave Craven of Spinwave Systems stated in a recent editorial article in the Processor: “Unfortunately, the mechanical systems used to cool and ventilate large data centers haven’t kept up with technological advances seen in the IT world.” Craven adds: “Many buildings where they are putting newer technology and processes are still being heated and cooled by processes designed 20 years ago.” Given that the PUE is driven by the cooling efficiency (see for example the Trendpoint white paper), cooling looks like the weak spot in creating an energy elastic data center.

Next step

The idea of ‘disabling’ critical infrastructure components in data centers has long been considered taboo. Any dynamic energy management system that attempts to achieve energy elasticity (proportionality) by powering off a subset of idle components must demonstrate that the active components can still meet the current offered load, that an inactive-to-active transition can happen rapidly, and that changing load in the immediate future can still be met. The power savings must be worthwhile, performance effects must be minimal, and fault tolerance must not be sacrificed.

Energy management has emerged as one of the most significant challenges faced by data center operators. Defining an energy management control knob to tune between energy efficiency, performance, and fault tolerance must come from a combination of improved components and improved component management. The data center is a dynamic, complex system with many interdependencies. Managing and orchestrating these kinds of systems asks for sophisticated mathematical models and software that uses algorithms to automatically make the necessary adjustments in the system.
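The requirements above (meet the current load, keep headroom for load changes, keep spares for fault tolerance, avoid constant toggling) can be sketched as a small controller. All parameters here are illustrative assumptions, not a prescribed design:

```python
import math

# Sketch of an energy management "control knob": decide how many servers
# to keep active for the offered load, with headroom for near-future load
# growth, spare capacity for fault tolerance, and hysteresis so that
# components are not power-cycled on every small load swing.
def target_active(load, per_server_capacity=100.0, headroom=1.25, spares=1):
    """Servers needed for the load plus headroom, plus spare servers."""
    return math.ceil(load * headroom / per_server_capacity) + spares

def next_active(current, load, hysteresis=1):
    """Only change the active set when the target drifts far enough."""
    target = target_active(load)
    if abs(target - current) > hysteresis:
        return target
    return current

# Example: a small load swing leaves the active set alone; a large
# swing triggers powering servers back on.
active = target_active(100)
active = next_active(active, 90)    # small dip: no change
active = next_active(active, 400)   # large rise: power servers on
print(active)
```

The hysteresis term is the simplest possible stand-in for the "rapid inactive-to-active transition" constraint; a real orchestrator would also model transition times and fault domains.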

Cutting the energy bill of data centers by millions

Location matters in the digital economy too, and by moving your workload you can save money: the location of your data center can make a difference in energy costs. “So can this be the start of moving your IT workload, on the virtualized infrastructure, to the place where the energy price is the lowest?” was how the blog entry “Power usage and money savings”, about the huge price differences in electricity, ended earlier this year.

Now some very interesting material has come up.

A recent study from researchers at MIT, Carnegie Mellon University, and the networking company Akamai suggests that such Internet businesses could reduce their energy use by as much as 40 percent by rerouting data to locations where electricity prices are lowest on a particular day.

Asfandyar Qureshi, a PhD student at MIT, first outlined the idea of a smart routing algorithm that would track electricity prices to reduce costs. This year, Qureshi and colleagues approached researchers at Akamai to obtain the real-world routing data needed to test the idea. The team then devised a routing scheme designed to take advantage of daily and hourly fluctuations in electricity costs.

“We introduce Power-Demand Routing (PDR), a technique that redistributes traffic between replicas with the express purpose of spatially redistributing the system’s power consumption, in order to reduce operating costs. Cost can be described in monetary terms or in terms of pollution. Within existing Internet services, each client request requires a meaningful amount of marginal energy at the server. Thus, by rerouting requests from a server at one geographic location to another, we can spatially shift the system’s marginal power consumption at Internet speeds,” as stated by Qureshi. Qureshi shows “how PDR can be used to reduce electric bills. He describes how to couple request routing policy to real-time price signals from wholesale electricity markets. In response to price-differentials, PDR skews client load across a system’s clusters and pushes server power-demand into the least expensive regions.” His conclusion is that existing systems can use PDR to cut their annual electric bills by millions of dollars.
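The core of the PDR idea quoted above can be sketched as a greedy allocation: skew client load toward the clusters in the cheapest electricity regions, within each cluster's capacity. The cluster names, prices and capacities below are made-up illustrations, not data from Qureshi's study:

```python
# Toy sketch of price-driven request routing in the spirit of PDR:
# fill the cheapest clusters first, subject to their capacity.
def route_by_price(total_requests, clusters):
    """clusters: {name: (price_usd_per_mwh, capacity_in_requests)}.
    Returns a routing plan {name: requests}."""
    plan = {}
    remaining = total_requests
    # Visit clusters from cheapest to most expensive electricity.
    for name, (price, capacity) in sorted(clusters.items(),
                                          key=lambda kv: kv[1][0]):
        take = min(remaining, capacity)
        plan[name] = take
        remaining -= take
    if remaining > 0:
        raise ValueError("offered load exceeds total capacity")
    return plan

clusters = {
    "virginia": (55.0, 4000),  # hypothetical regional prices/capacities
    "oregon":   (30.0, 3000),
    "texas":    (42.0, 3000),
}
print(route_by_price(8000, clusters))
```

A carbon cost function, as in Qureshi's thesis, would simply replace the price term in the sort key with an environmental cost per joule for each region.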

Setting a price for carbon emissions will have a huge effect on (IT) economics. It’s time to get the impact of a “carbon tax” on the radar screen of organizations, as explained in the blog entry IT Carbon Emission is a million business. In his thesis, ‘Power-Demand Routing in Massive Geo-Distributed Systems’, Qureshi also shows how PDR can be used to reduce carbon footprints: “Not all joules are created equal and in power pools like the grid the environmental impact per joule varies geographically and in time. We show how to construct carbon cost functions that can be used with PDR to dynamically push a system’s power-demand toward clean energy systems.”

But beware: we are not talking about sustainability here, we are talking about money. These ideas are not about energy or carbon savings but about energy and carbon cost savings.
