Mobility and Cloud Computing and the need for a new security concept

The modern way of doing business and service provisioning is based on openness and agility.
This puts the traditional security concept, in which an organization is positioned as an "information fortress", strongly under pressure. The traditional perimeter is vanishing, and sensitive information travels outside your organization in many ways on many different devices. Mobility and cloud services are the name of the game.

For the most part, business information is stored as unstructured data: information that either does not have a pre-defined data model or is not organized in a pre-defined manner, and that is usually stored in files instead of databases.

This context creates a strong need for other security concepts. Instead of securing a predefined and fixed "location", the focus must shift to the actual security of information that is mobile: information security at the file level. In other words, file protection and control beyond the perimeter of the enterprise, because the rule set travels with the file itself.
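As an illustration, a minimal sketch of what such a rule set travelling with a file could look like; the format and field names are hypothetical, not those of any specific product:

```python
# A minimal, hypothetical sketch of a protected file whose access rules
# travel with the file itself (not tied to any specific product).
import json

protected_file = {
    "payload": "<encrypted file contents>",
    "policy": {  # the rule set stored with the file
        "owner": "alice@example.com",
        "allow": [
            {"principal": "group:finance", "rights": ["read"]},
            {"principal": "bob@example.com", "rights": ["read", "edit"]},
        ],
        "expires": "2016-01-01",
    },
}

# Wherever the file travels, an enforcement client reads the embedded
# policy before unlocking the payload.
print(json.dumps(protected_file["policy"], indent=2))
```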

There are security products on the market that make it possible to achieve security at the file level. Applying this security concept will affect the current roles and activities related to security.

Still, it is common that a security officer, in cooperation with the system administrator, ensures safe work areas whose access is protected by firewalls, passwords, and malware scanners. The user organization is consulted, and its voice, the voice of the customer, is reflected in the general security policy, but the focus is on securing the IT infrastructure. The bottom line: security is treated as an IT issue.

By implementing a security concept based on information security at the file level, one comes into very close contact with the daily operational activities of the users and the organization. Security choices have a much more immediate and greater impact on work and activities than in a traditional security concept.

Who has access to which information, when, and where? Access can be managed well with the new file access management tools, but access management is not an isolated topic. Access is granted to individuals and groups, so the topic of identity management comes into view and must be well organized. Access management is part of the AAA triad: authentication (identity management), authorization (access management), and accounting. These are well-known concepts in the security world, but they were mainly applied at the level of the IT infrastructure. Now that the focus urgently needs to shift from securing the IT infrastructure to securing information, one comes into very close contact with the daily work of the organization. This includes authorization, as well as authentication and accounting.
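To make the AAA triad at the file level concrete, here is a minimal sketch; all names, tokens, and structures are hypothetical, not a real product API:

```python
# Illustrative sketch of the AAA triad applied to a file access request.
# All identities, tokens, and file names are hypothetical.

users = {"alice": "secret-token"}                     # authentication data
acl = {"report.docx": {"alice": {"read"}}}            # authorization data
audit_log = []                                        # accounting data

def request_file(user, token, filename, action):
    authenticated = users.get(user) == token                          # authentication
    authorized = action in acl.get(filename, {}).get(user, set())     # authorization
    audit_log.append((user, filename, action, authenticated and authorized))  # accounting
    return authenticated and authorized

print(request_file("alice", "secret-token", "report.docx", "read"))   # True
print(request_file("alice", "secret-token", "report.docx", "edit"))   # False
print(audit_log)                                                      # the audit trail
```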

In the traditional division of labor, a security officer defines a security policy based on input from the user organization and on rules and legislation, and then agrees with the IT organization on the mechanisms that will be used to operationalize this policy. Product selection, technical equipping, and daily operation are performed by the IT organization. Thus the questions of why, how, and with what are answered. In fact, the user organization is only involved in the first question and plays no role in the implementation. The security officer, likewise, is hardly involved in the daily operations of the IT organization and the user organization. It is a slow, top-down approach.
In terms of a responsibility assignment matrix (RACI), one could say that the security officer is accountable, the system administrator is responsible, and the user organization is consulted and informed.

The application of information security through access management at the file level puts this traditional division of labor under pressure. The fine granularity of access rights at the file level, combined with the dynamics of work processes, the needed agility, and shortening delivery times, conflicts with the traditional hierarchical top-down approach.

To solve this issue, a different way of working is required, in which the user organization is much more involved in information security, or better: takes the lead. After all, it affects their daily work. The security officer, the auditor, and the IT organization must be well aware of the daily work of the user organization. They should know what is going on in the workplace to ensure that access management is workable, produces the desired result, and meets the expectations of the user organization. Identity management and accounting at this micro level must also be taken into account.

To achieve information security by means of access management at the file level, it is advisable to take a closer look at the roles involved. These are the security officer, the auditor, the system administrator, and the "super user" or functional administrator of the access rules for the user organization.

• The security officer defines, on behalf of the user organization, the information security policy and the mechanisms (the solution approach regarding security organization, workflow, and technology) with which it should be realized.
• The auditor defines how accounting should take place (the control points, the information to capture in order to comply with laws and regulations, the audit trail) and executes audits.
• The system administrator operationalizes access management within the framework set by the security officer and the auditor, and takes care of the relationship (in terms of technology and execution) with the adjacent areas of identity management and monitoring (accounting).
• The "super user" or functional administrator of the user organization actually manages the access rules within the framework set by the security officer and the auditor.

To support the modern way of doing business and service provisioning, we need to create an agile security organization with a transparent separation of interests and responsibilities. Instead of a hierarchical security organization, a flat organization is needed in which the "super user" is accountable, the system administrator is responsible, the security officer and the auditor are consulted, and the end users are informed.
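The shift can be summarized by contrasting the two RACI assignments, as in this small illustrative sketch:

```python
# Traditional vs. agile RACI assignments for file-level security,
# as described above (R=responsible, A=accountable, C=consulted, I=informed).

traditional = {
    "security officer":      "A",
    "system administrator":  "R",
    "user organization":     "C, I",
}

agile = {
    "super user (functional administrator)": "A",
    "system administrator":                  "R",
    "security officer":                      "C",
    "auditor":                               "C",
    "end users":                             "I",
}

for model, matrix in (("traditional", traditional), ("agile", agile)):
    print(model)
    for role, raci in matrix.items():
        print(f"  {role}: {raci}")
```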

Cloud Computing, ownership matters

In the past fifteen years, many internal IT departments of enterprises evolved from artisan organizations that only assembled and provided customized, tailor-made products, into hybrid craft and mass production organizations that provide custom as well as standard products. Nowadays, however, these IT departments are confronted with external organizations that deliver standard services and products that can easily be adapted to the needs of the customer, based on the concept of "mass customization".
Instead of buying, owning, and managing all kinds of IT components yourself, the IT infrastructure is nowadays offered as a service by means of cloud computing.

There is a shift from a "goods-dominant logic" to a "service-dominant logic", where the focus moves away from tangibles and toward intangibles. This trend is supported by the use of virtualization technology for servers, storage, and network devices, and also for applications.

The cloud computing promise of lower costs, shorter time to market, and on-demand provisioning makes it very tempting for organizations to outsource their IT infrastructure and services.

But are we not forgetting something? One of the most important aspects of information processing is that an organization has the right controls over the use of applications, data, and infrastructure. Incomplete control can lead to all kinds of issues around business obligations and liabilities.

Control of these items is arranged by contracts, which are in fact an exchange of property rights. These property rights are a bit complicated because they have several dimensions:
• The right of possession
• The right of control
• The right of exclusion (access rights)
• The right of enjoyment (earn income from it)
• The right of disposition (buying or selling)

The consequence of these different dimensions is that different parties can hold partitions of the rights to particular elements of a resource. On top of this there is the issue of legislation. When we talk about ownership we have to be careful, because in legal systems ownership is based on tangible, physical objects. Of course we have legislation about intellectual property, trademarks, and so on, but when it comes to virtualized objects things become murky. Moreover, cloud computing is about delivering services (intangibles), not goods (tangibles).
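As a purely illustrative sketch of how the partitions of rights to a single cloud resource can end up with different parties (the partitioning shown is hypothetical, not a model contract):

```python
# Illustrative sketch: the "bundle of rights" to a single cloud resource,
# split across parties. The resource and the partitioning are hypothetical.

resource = "virtual server cluster"

bundle_of_rights = {
    "possession":  "cloud provider",   # physically holds the hardware
    "control":     "customer",         # decides how the resource is used
    "exclusion":   "customer",         # grants or denies access
    "enjoyment":   "cloud provider",   # earns income from the service
    "disposition": "cloud provider",   # may buy or sell the underlying asset
}

# A sourcing contract should make explicit who holds which partition,
# so the customer can verify it stays in control where it matters.
for right, holder in bundle_of_rights.items():
    print(f"{resource} - right of {right}: {holder}")
```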

The transition from a "goods-dominant logic" to a "service-dominant logic" is a mind shift in which the "bundle of rights" of property ownership still matters.

Signing cloud computing deals is not only about money and provisioning; it is also about control. When a cloud sourcing deal is made, the partitions of property rights should be grouped into appropriate bundles so that the customer stays in control.

DataCloud 2015 Awards


For the fifth year in a row I am part of the panel of judges for the DataCloud Awards. The winners of the DataCloud Awards 2015 will be announced in Monaco on 2 June 2015.

Read the brochure for more information about this event.

Over the past years, award winners have included companies such as Equinix, TelecityGroup, Interxion, Cohesive FT, iomart, Colt Technology Services, Cofely GDS Suez/akquinet GmbH, Claranet, 6Degrees, Prior1 GmbH, The Open Compute Foundation, Portugal Telecom, and NTT Communications, among many others.

Now in its 8th year, the prestigious Awards provide exceptional marketing differentiation for both winners and runners-up, and are recognized as the premier industry accolade across Europe and internationally.

With an extended panel of expert judges, the 2015 categories, including new Cloud Awards, are designed to reflect changes in markets and technologies, while retaining a focus on recognising best in class and excellence across the industry.

In 2015 the Awards will again provide new benchmarks of aspiration for the companies shortlisted and world class recognition for winners.

Aging power plant fleet and data centers

Thomson Reuters made a nice visualisation of Europe's aging nuclear reactors. Currently the EU operates 131 reactors, with an average age of 30 years.

[Figure: Europe's aging nuclear reactors (Thomson Reuters)]

It reminded me of a report I wrote in 2012 for Broadgroup about the power market and data centers in Europe. The quality and availability of a data center stand or fall with the quality and availability of its power supply, so the power market is something to watch closely.

Depending on the power technology used, power plants have different life cycles: coal-fired plants about 40 years, gas-fired 30 years, nuclear 40 years, hydro 80 years, and renewables an estimated 25 years. Based on these life cycle estimates, Europe has an aging power plant fleet. A report by RWE states that more than 70% of hard coal plants, more than 60% of lignite plants, and more than 50% of gas/oil plants are past the halfway point of their life cycle. For hard coal plants, the EU Large Combustion Plants Directive requires replacement of all these plants by 2030.

The expectation is that a nuclear reactor's lifetime is 40 years or more. The implication of a forty-year life expectancy is that in the next ten years (from 2012 onwards) forty nuclear power plants will be closed, or 30% of the current nuclear power plant fleet. This would mean decommissioning 30,207 MW of net capacity, or 24.5% of the nuclear power capacity.
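A quick sanity check of these figures, using only the numbers quoted above (the fleet size and total capacity printed below are derived from them, not sourced separately):

```python
# Quick sanity check of the decommissioning figures quoted above.

reactors_closing = 40          # plants reaching their 40-year EOL within ten years
fleet_share = 0.30             # quoted share of the current fleet
implied_fleet = reactors_closing / fleet_share           # ~133 reactors

capacity_closing_mw = 30_207   # net capacity to be decommissioned (MW)
capacity_share = 0.245         # quoted share of nuclear capacity
implied_total_mw = capacity_closing_mw / capacity_share  # ~123,000 MW

print(f"Implied fleet size: {implied_fleet:.0f} reactors")
print(f"Implied total nuclear capacity: {implied_total_mw:,.0f} MW")
```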

Given that the average age of the 130 units that have already been closed worldwide is about 22 years, the projected operational lifetime of 40 years or more appears rather optimistic.

The decommissioning of nuclear power plants has an impact on carbon policies and targets, and can create a power shortage and a rise in electricity prices if proper countermeasures are not taken.

[Figure: Number of nuclear power plants reaching EOL in a forty-year life expectancy scenario (Broadgroup 2012)]

A special case is Belgium. Two nuclear reactors were closed for a second time in March 2014 because of cracks in the steel reactor vessels. The reactors, Doel 3 and Tihange 2, will be restarted in the spring of 2015 at the earliest, and there is an increased chance that they will be closed forever.

In August another nuclear reactor, Doel 4, had to be shut down after major damage to its turbine caused by an oil leak. Electrabel said Doel 4 would stay offline at least until the end of the year, with the cause confirmed as sabotage.

As a result, just over 3 GW of power is offline, more than half of the nuclear power supply, while nuclear power contributes about 50% of the electricity produced domestically.

There is thus the possibility of blackouts this winter, and Belgium will have to boost interconnection capacity with neighbouring countries to prevent power shortages.

According to Minister of Energy Johan Vande Lanotte, the last electricity consumption record was set on 17 January 2013: "On such a cold winter day, we consume about 14,000 megawatts. With the current production we fall 1,000 short."

Much depends on the weather; according to Elia, Belgium's electricity transmission system operator, potential problems are to be expected from the end of October or early November.

See "Power market, power pricing and data centers in Europe" (Broadgroup, 2012) for more information about the electricity market and data centers in Europe.

Data Center 2.0 – The Sustainable Data Center and the use of Enterprise Architecture

Recently I had an interesting conversation with John Booth of Carbon3IT (who is also Chairman of DCA Energy Efficiency and Sustainability Steering Group) about my latest book on sustainable data centers. The discussion focussed on what the book is addressing and the level of abstraction that is being used.

Based on this discussion I made a short presentation with a different visualization of the idea behind the book and the informal use of an enterprise architecture framework when writing the book.
Although not stated explicitly, the book is loosely based on an Enterprise Architecture (EA) framework. This EA framework has four architecture domains:
  • Business architecture
  • Information architecture
  • Information systems (application) architecture
  • Infrastructure architecture
For each of these architecture domains, the EA framework distinguishes five levels of abstraction:
  • Why – Contextual: motivation, scope, constraints, strategies, and principles that apply to the different architecture domains
  • What – Conceptual: the vision of services (business services, information services, application services, and infrastructure services)
  • How – Logical: the mechanisms, components, and connections that provide the services
  • With What & Who – Physical: the physical implementation, deployment, and sourcing of all the components and people
  • When – Transformational: an integrated roadmap to make the desired transformation possible
The book focuses on the two top layers and scratches the surface of the logical layer; in other words, the strategic and tactical levels. It also spends some thoughts on transformation.
As stated, this architecture framework is used in an informal way: it does not define formal architecture deliverables for the different domains and levels of abstraction.
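For readers who prefer a compact view, the grid of domains and abstraction levels can be sketched as follows (an illustration of the framework described above, not a formal deliverable):

```python
# A compact sketch of the EA framework used informally in the book:
# four architecture domains, each viewed at five levels of abstraction.

domains = [
    "Business architecture",
    "Information architecture",
    "Information systems (application) architecture",
    "Infrastructure architecture",
]

levels = {
    "Why - Contextual":           "motivation, scope, constraints, strategies, principles",
    "What - Conceptual":          "the vision of services",
    "How - Logical":              "mechanisms, components, connections",
    "With What & Who - Physical": "implementation, deployment, sourcing",
    "When - Transformational":    "integrated roadmap for the transformation",
}

# The framework is the cross product of domains and levels; the book
# concentrates on the two top levels and touches the logical level.
for domain in domains:
    for level, concern in levels.items():
        print(f"{domain} | {level}: {concern}")
```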

But in the end, the use of an enterprise architecture framework is the way to go to design and "build" a sustainable data center in a coherent and consistent way.

[Presentation slides 1–7: Data Center 2.0 explained]

World Cup 2014: did you record a drop in network energy use?

Large events can have a serious impact on IT infrastructure. A great example of this is the current football World Cup 2014.

RIPE NCC, one of the five Regional Internet Registries (RIRs), does a great job of collecting network traffic data during the event. In collaboration with the European Internet Exchange Association (Euro-IX), they follow Internet Exchange Point (IXP) traffic during the championship.

Two examples from their research: the following graphs show the match-day traffic in red and the same day of the week in the preceding weeks as grey lines. The game times are indicated as grey rectangles in the background.

The Netherlands – Spain

The traffic volume during the Netherlands–Spain game differs by 256 terabytes from the same day the week before!


Traffic statistics at the Amsterdam Internet Exchange (AMS-IX).

 

Cameroon – Brazil


Traffic statistics at the PTT Metro IXP in Sao Paulo, Brazil.

These traffic drops make you wonder how much energy was saved during these games.
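One can only make a rough back-of-envelope guess. The sketch below assumes an illustrative network energy intensity of 0.05 kWh/GB; this coefficient is an assumption, not a sourced figure, and published estimates vary widely by year, network type, and methodology:

```python
# A rough back-of-envelope estimate, NOT a result from the RIPE NCC data:
# what might a 256 TB drop in transported traffic represent in electricity?

traffic_drop_tb = 256          # from the AMS-IX comparison above
assumed_kwh_per_gb = 0.05      # assumed coefficient, for illustration only

traffic_drop_gb = traffic_drop_tb * 1000
energy_saved_kwh = traffic_drop_gb * assumed_kwh_per_gb

print(f"~{energy_saved_kwh:,.0f} kWh")  # ~12,800 kWh under these assumptions
```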

More information about the network traffic can be found at RIPE NCC, and more will follow during the World Cup.

Data Center 2.0 – The Sustainable Data Center, Now Available!

Data Center 2.0 – The Sustainable Data Center is now available.

The book is showing up on Amazon and will soon start to pop up on the websites of other e-tailers.

Data Center 2.0 – The Sustainable Data Center is an in-depth look into the steps needed to transform modern-day data centers into sustainable entities.

See the press release:

Some nice endorsements were received:

“Data Center 2.0, is not so much about technology but about people, society and economic development. By helping readers understand that even if Data Centers, enabling the Digital economy, are contributing a lot to energy saving, they need to be sustainable themselves; Rien Dijkstra is on the right track. When explaining how to build sustainable Data Centers, through multi disciplinary approach, breaking the usual silos of the different expertise, Rien Dijkstra is proposing the change of behavior needed to build sustainable Data Centers. Definitely it is about people, not technology.” 

Paul-Francois Cattier, Global Senior Vice-President Data Center – Schneider Electric

“In Data Center 2.0 The Sustainable Data Center author Rien Dijkstra has gone several steps further in viewing the data center from the perspective of long term ownership and efficiency in combination with treating it as a system. It’s an excellent read with many sections that could be extracted and utilized in their own right. I highly recommend this read for IT leaders who are struggling with the questions of whether to add capacity (co-locate, buy, build, or lease) or how to create a stronger organizational ownership model for existing data center capacity. The questions get more complex every year and the risks more serious for the business. The fact that you’re making a business critical decision that must stand the test of technology and business change over 15 years is something you shouldn’t take lightly.” 

Mark Thiele, President and Founder Data Center Pulse

“Data centers used to be buildings to house computer servers along with network and storage systems, a physical manifestation of the Digital Economy. Internet of Things, the digitization of about everything in and around us, brings many profound changes. A data center is the place where it all comes together. Physical and digital life, fueled by energy and IT, economical and social demands and needs and not to forget sustainability considerations. Sustainable data centers have a great potential to help society to optimize the use of resources and to eliminate or reduce wastes of capital, human labor and energy. A data center in that sense is much more than just a building for servers. It has become a new business model. Data center 2.0 is a remarkable book that describes the steps and phases to facilitate and achieve this paradigm.” 

John Post, Managing Director – Foundation Green IT Amsterdam region