Cloud Standards Get Customer Push

by admin on August 9, 2011

The newly formed Open Data Center Alliance is
using an array of usage models to weld cloud-using customers into a force that
prevents vendor lock-in. At the same time, the group is promoting the secure
movement of virtual workloads from one provider to another. The organization is
made up of more than 200 members, including JPMorgan Chase, Lockheed Martin and
Marriott.

While there are other nascent cloud-user organizations forming, namely the CSCC
(Cloud Standards Customer Council), the ODCA (Open Data Center Alliance) in June
issued eight usage models that organizations can use today when specifying
baseline requirements for cloud projects.

The formation of both customer groups comes at a seminal moment for data center
design. Virtualization is driving compute, storage, networking, application and
desktop IT managers to drastically increase the efficient use of costly
resources. And the option to outsource some or all virtual workloads is a bell
that cannot be un-rung.

The emergence of the ODCA’s usage models is a recognition of the seismic
changes in data center operations. IT managers must now provide strategic
guidance to C-level managers and line-of-business leaders for incorporating the
changes being wrought by virtualization and cloud computing while avoiding
vendor lock-in.

The usage models from the ODCA can help in this effort. However, my analysis
shows that many of the guidelines can be immediately strengthened and made more
practical. For example, the ODCA Security Provider Assurance guide doesn’t
spell out exactly what level of law enforcement action is needed for the
provider to turn over your data. In a private data center, there are understood
procedures and boundaries on the execution of search warrants. In a hosted
environment, data protections from unwarranted law enforcement searches are
murky. Therefore, IT managers should demand very specific answers from
providers about the safeguards in place against the loss of data control when a
government agency comes knocking.

The Usage Models

Altogether, there are eight published usage models that fit into four general
categories: secure federation, automation, common management and policy, and
transparency.

Secure federation comprises the SM (Security Monitoring) and SPA (Security
Provider Assurance) models. The SM usage model depends heavily on work being
done at the Cloud Security Alliance and CloudAudit, both of which are made up
primarily of security service vendors. Among the more interesting usage
requirements is the daunting demand that the cloud provider supply “dedicated
capabilities with specific resources and reserved for specific customers.”

The SPA document has three stated purposes that are backed up
with a four-category, bronze-to-platinum rating system. The publication also
enables cloud consumers to compare security levels from one provider to another
and between internally and externally hosted clouds. The SPA
usage model should make it easier for cloud consumers to understand and select
among various levels of security offered by providers. As previously stated,
this usage model should be augmented to probe when a search warrant would
result in the loss of data control.
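
To make the tier comparison concrete, here is a minimal Python sketch of how a
cloud consumer might encode the SPA’s bronze-to-platinum ratings and screen
candidate providers against a required level. The provider names and tier
assignments are invented for illustration; only the four tiers come from the
SPA document.

```python
from enum import IntEnum

class AssuranceLevel(IntEnum):
    """SPA-style assurance tiers, ordered so they compare numerically."""
    BRONZE = 1
    SILVER = 2
    GOLD = 3
    PLATINUM = 4

def meets_requirement(offered: AssuranceLevel, required: AssuranceLevel) -> bool:
    """A provider qualifies when its offered tier is at or above the requirement."""
    return offered >= required

# Candidate providers and their tiers are hypothetical, for illustration only.
candidates = {
    "internal-cloud": AssuranceLevel.GOLD,
    "provider-a": AssuranceLevel.SILVER,
    "provider-b": AssuranceLevel.PLATINUM,
}
required = AssuranceLevel.GOLD
qualified = [name for name, tier in candidates.items()
             if meets_requirement(tier, required)]
print(qualified)  # ['internal-cloud', 'provider-b']
```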

The automation category encompasses the I/OC (I/O Control) and VMW (VM
Interoperability) models. The I/OC is a short document that addresses one of
the big problems raised by increased virtual-machine density in a cloud
environment: I/O contention. The I/OC aligns with work being done by NIST
(National Institute of Standards and Technology) and the DMTF (Distributed
Management Task Force) on I/O bottlenecks. To control for them, the publication
focuses on monitoring, SLA metrics, APIs, timeslice controls and I/O
reservations, with the expectation that these requirements could be met in
multi-vendor environments using non-proprietary protocols.
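
A rough sketch of what checking monitoring samples against such a reservation
might look like follows; the field names and figures are my own assumptions,
not definitions from the I/OC document.

```python
from dataclasses import dataclass

@dataclass
class IOReservation:
    """Hypothetical shape of an I/O reservation; the fields are illustrative."""
    guaranteed_iops: int  # floor the provider commits to per timeslice
    burst_iops: int       # ceiling available when there is no contention
    timeslice_ms: int     # granularity at which the scheduler enforces shares

def sla_violations(samples_iops: list, reservation: IOReservation) -> list:
    """Return indices of monitoring samples that fell below the guaranteed floor."""
    return [i for i, observed in enumerate(samples_iops)
            if observed < reservation.guaranteed_iops]

reservation = IOReservation(guaranteed_iops=500, burst_iops=2000, timeslice_ms=10)
samples = [740, 510, 480, 650, 420]  # per-timeslice observations from monitoring
print(sla_violations(samples, reservation))  # [2, 4]
```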

Furthering the case for automation, the ODCA is using work done by the DMTF on
the OVF (Open Virtualization Format) to press the case that a VM created on one
hypervisor platform and/or running in a cloud provider data center should be
able to move to another data center and/or to a different hypervisor platform.
Now that the ODCA is also pushing OVF, it’s clear this is the interoperable VM
format to implement. VMware and Citrix have been long-time proponents of OVF,
and shops that are using either of these platforms are likely well along in
understanding how to use OVF in transporting virtual machines between
platforms.
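
For shops new to the format, here is a small Python sketch that parses a
trimmed-down OVF descriptor and pulls out the packaged virtual system. A real
.ovf file also carries disk, network and virtual-hardware sections, which I
have omitted for brevity.

```python
import xml.etree.ElementTree as ET

# Namespace from the DMTF OVF 1.x specification.
NS = {"ovf": "http://schemas.dmtf.org/ovf/envelope/1"}

# A heavily trimmed OVF descriptor; real descriptors also include References,
# DiskSection, NetworkSection and VirtualHardwareSection.
descriptor = """<?xml version="1.0"?>
<Envelope xmlns="http://schemas.dmtf.org/ovf/envelope/1"
          xmlns:ovf="http://schemas.dmtf.org/ovf/envelope/1">
  <VirtualSystem ovf:id="web-frontend">
    <Info>A web server VM packaged for transport between platforms</Info>
  </VirtualSystem>
</Envelope>"""

root = ET.fromstring(descriptor)
for vs in root.findall("ovf:VirtualSystem", NS):
    # The ovf:id attribute names the packaged machine.
    print(vs.get("{http://schemas.dmtf.org/ovf/envelope/1}id"))  # web-frontend
```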

Common management and policy is defined by the RF (Regulatory Framework) usage
model.
The model does a decent job of setting out a way for cloud providers to step up
to the compliance plate while rightly insisting cloud consumers are ultimately
accountable for risks.

The RF focuses on an ongoing corporate compliance program for cloud
environments, which matches best practice guidelines that I’ve seen for private
data center operations. This means the practices that your organization already
follows aren’t that different when data and applications are moved to a shared,
cloud environment. What changes is that the cloud provider becomes the source
of the risk assessment and management data. Thus, IT managers would do well to
use the RF as a starting point for exploring an external cloud provider’s
ability to satisfy reporting and control requirements.
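
As a sketch of that starting point, the Python fragment below shows one way a
consumer might pull risk and control findings from a provider. The endpoint and
response fields are hypothetical; the RF describes reporting obligations, not a
concrete API.

```python
import json
from urllib.request import urlopen

# Hypothetical endpoint; the RF model does not define a URL or schema.
REPORT_URL = "https://provider.example.com/compliance/report"

def fetch_compliance_report(url: str = REPORT_URL) -> dict:
    """Pull the provider's latest risk-assessment data; the consumer
    remains accountable for acting on what it contains."""
    with urlopen(url) as resp:
        return json.load(resp)

def open_findings(report: dict) -> list:
    """Filter to control failures the consumer must track to closure.
    The 'findings' and 'status' fields are assumed, not specified by the RF."""
    return [f for f in report.get("findings", []) if f.get("status") != "closed"]
```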

The transparency category includes the SUoM (Standard Units of Measure), SC
(Service Catalog) and CF (Carbon Footprint) usage models. The SUoM outlines
definitions and measures that would make it easier for cloud consumers to
compare apples to apples when it comes to paying for services.
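
The arithmetic is simple, but a toy Python example shows why normalization
matters; the provider names, prices and unit ratings here are invented, not
drawn from the SUoM document.

```python
# Invented offers: each provider quotes a different-sized instance, so raw
# hourly prices are not directly comparable.
offers = [
    {"provider": "provider-a", "price_per_hour": 0.12, "compute_units": 1.0},
    {"provider": "provider-b", "price_per_hour": 0.30, "compute_units": 2.8},
]

for offer in offers:
    # Dividing by the standardized unit rating makes the comparison fair.
    unit_price = offer["price_per_hour"] / offer["compute_units"]
    print(f'{offer["provider"]}: ${unit_price:.4f} per unit-hour')
# provider-a: $0.1200 per unit-hour
# provider-b: $0.1071 per unit-hour  <- cheaper per standardized unit
```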

The Service Catalog usage model is by far the most detailed of the publications
and is the codification of the actual interaction between cloud providers and
cloud consumers. The SC is all about what an API
(application programming interface) will provide to large cloud consumers. “[A]
GUI is of relatively low importance to our target Cloud-Subscriber, who may
choose not to use it at all for service discovery and selection. Instead,
Cloud-Subscribers need a robust and detailed API…to
interrogate the Service Catalog.” With this gruff introduction, the SC usage
model dives into 10 pages of what, roughly, would be entailed in this “robust
and detailed API.”
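
To give a flavor of what interrogating a catalog programmatically might look
like, here is a brief Python sketch. The endpoint, query parameter and response
shape are my assumptions; the SC specifies requirements for such an API rather
than a concrete one.

```python
import json
from urllib.request import Request, urlopen

# Hypothetical catalog endpoint; the SC does not mandate a specific URL.
CATALOG_URL = "https://provider.example.com/catalog/v1/services"

def discover_services(min_cpus: int, region: str) -> list:
    """Interrogate the catalog directly -- no GUI involved -- and return
    offerings meeting the subscriber's baseline requirements. The 'region'
    parameter and 'cpus' field are assumed for illustration."""
    req = Request(f"{CATALOG_URL}?region={region}",
                  headers={"Accept": "application/json"})
    with urlopen(req) as resp:
        services = json.load(resp)
    return [s for s in services if s.get("cpus", 0) >= min_cpus]
```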

Besides being a refreshingly sober antidote to the often breezy discussion
surrounding cloud computing implementation, the SC is a decent primer about
just what cloud computing can offer. In detailing the service catalog
requirements, the SC touches on nearly every aspect of cloud operations and
talks quite frankly about what role IT can play in ensuring these services are
delivered efficiently.

The CF document describes how cloud providers can measure carbon output for
IT managers who want to make greener choices when selecting cloud services. On
the minus side, the Carbon Footprint model is candid in pointing out that the
SUoM measures don’t include the impact incurred when the data center was first
built. Nor is hardware decommissioning included when measuring carbon release.
In this sense, the Carbon Footprint takes a “we-had-to-start-somewhere”
approach.
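
For context, here is a back-of-the-envelope Python sketch of the kind of
operational estimate such a model implies. The PUE-based formula is a common
industry approximation, not necessarily the CF document’s exact method, and
like the CF model it leaves construction and decommissioning out of scope.

```python
def operational_carbon_kg(it_energy_kwh: float, pue: float,
                          grid_factor_kg_per_kwh: float) -> float:
    """Estimate operational CO2: IT energy scaled up by the facility's PUE
    (power usage effectiveness), then multiplied by the grid's emission
    factor. Embodied carbon from construction and decommissioning is
    deliberately out of scope, mirroring the CF model's own limits."""
    return it_energy_kwh * pue * grid_factor_kg_per_kwh

# Example: 100 kWh of IT load in a PUE-1.6 facility on a 0.5 kg-CO2/kWh grid.
print(operational_carbon_kg(100, 1.6, 0.5))  # 80.0 kg CO2
```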

Indeed, the ODCA had to start somewhere, and IT managers would do well to take a
look at the documentation set as part of any plan to move to the cloud.