Distributed Computing Industry
Weekly Newsletter


July 15, 2013
Volume XLIV, Issue 7


CLOUD COMPUTING WEST 2013 Speakers

The first wave of keynote speakers, panelists, and moderators for CLOUD COMPUTING WEST 2013 (CCW:2013) includes executives from industry leaders AT&T Mobility, Comcast, Dell, Microsoft, Netflix, Rackspace, and Sprint Nextel.

CCW:2013 is the Cloud Computing Association's (CCA) and Distributed Computing Industry Association's (DCIA) business strategy summit taking place October 27th-29th at The Cosmopolitan in Las Vegas, NV.

This year's themes are "Revolutionizing Entertainment & Media" and "The Impact of Mobile Cloud Computing & Big Data."

Rackspace's Cloud Products Program Manager Tom Hopkins will keynote on "Strawberry Coconut Cloud — You Choose the Flavor."

Other featured speakers include Microsoft's Platform Technology Evangelist Yung Chou, Trend Micro Director Dan Reis, and Sprint Nextel's Chief Cloud Strategist Jay Gleason.

ABI Research's Practice Director Sam Rosen will speak on "Consumer Transition to the Cloud: Service Provider & OTT Video, Gaming, and Music Services."

AT&T Mobility's Enterprise Architecture Manager Melody Yuhn will address "Mobile Storage Considerations."

Dell's Enterprise Cloud Evangelist Michael Elliott will discuss "Hybrid Clouds — The End State."

And Netflix's Architect and Principal Engineer Mikey Cohen will examine "Cloud Migration Considerations."

There's no question that advances in cloud computing are having enormous effects on the creation, storage, distribution, and consumption of diverse genres of content.

And most profound among these effects are those involving the increased proliferation of portable playback systems and the accompanying generation of unprecedented amounts of viewership, listenership, and usage information from audiences globally.

The ubiquity and widespread acceptance of user interfaces that reflect the dynamic interactivity exemplified by smartphone applications are rapidly replacing the flat linearity of traditional TV channel line-ups and changing expectations for a new generation of consumers.

Cloud-based information and entertainment of all kinds, accessible anywhere and at any time on every connected device, will become the new norm.

And perfect data related to consumer behaviors associated with discovering and consuming this content will displace metering and ratings technologies based solely on statistical sampling.

DCINFO readers are encouraged to get involved in CCA's and DCIA's CCW:2013 as exhibitors, sponsors, and speakers.

The CCA is handling exhibits and sponsorships. Please click here for more information.

The DCIA's role is to provide keynotes, panelists, and case-study presenters to participate in our comprehensive agenda of sessions in ENTERTAINMENT & MEDIA and MOBILE CLOUD & BIG DATA.

Please click here to apply to speak at CCW:2013.

Report from CEO Marty Lafferty

As a Member of the Digital Due Process (DDP) coalition, the DCIA this week signed on to the following letter to Members of the United States Senate.

"We write to express our concerns about a proposal that would grant federal regulatory agencies authority to require web-based e-mail service providers, cloud service providers, and other Internet companies to disclose the contents of sensitive and proprietary communications or documents that they store on behalf of their customers.

American consumers and businesses large and small are increasingly taking advantage of the efficiencies offered by web-based e-mail services and cloud-based storage and computing.

Cloud computing enables consumers and businesses to access their data anywhere and with many computing devices, facilitating collaboration and flexibility and providing cost-effective solutions.

American companies have been innovators in this field. Removing uncertainty about the level of legal protection afforded such information will encourage consumers and companies, including those outside the US, to utilize these services.

S. 607, the Leahy-Lee "Electronic Communications Privacy Act Amendments Act of 2013," which the Judiciary Committee approved in April, would make it clear that government agents must obtain a warrant (with appropriate emergency exceptions) if they want third party service providers to disclose content stored on behalf of their customers.

The undersigned support S. 607.

Many providers, concerned about the rights of their customers, are already requiring a warrant from law enforcement officials who seek access to content.

These providers point to US v. Warshak, a 2010 case in which the Sixth Circuit determined that the Fourth Amendment protects e-mail content stored by third parties. Moreover, the US Department of Justice (DoJ) has stated that it follows the warrant-for-content rule.

However, several regulatory agencies are resisting this reform.

Specifically, the Securities and Exchange Commission (SEC) has asked for an exception to the warrant requirement, allowing regulators to demand the content of customer documents and communications from third party providers.

That would work a sea change in the way that civil regulatory agencies conduct their investigations.

The sweeping change sought by the SEC would extend to all civil investigations conducted by the IRS, EPA, FCC, FEC, CFPB, and the whole panoply of regulatory agencies.

It would reverse current law and practice, under which these and other government agencies cannot gain access to more recent communications content from a third party service provider without a warrant.

We believe that it is best to preserve the traditional system.

In the traditional system, a regulatory agency serves a subpoena on the target of its investigation requiring that the target turn over documents that respond to the subpoena.

Under this approach, the target, or the target's lawyers, comb through all of the target's documents regardless of whether they are in a file cabinet, on an internal network, or stored with a third party.

The agencies have multiple tools to ensure that targets disclose all relevant but not privileged documents.

This process ensures full production, but protects against disclosure of the personal or proprietary records that don't meet the relevance test.

The SEC proposal would turn that process on its head.

If a civil regulatory agency could serve process on the target's communications service provider, the provider would be forced to turn over all of the information in the target's account, even if irrelevant to the subject of the investigation or legally privileged, since the service provider would be in no position to make a judgment about what was privileged or relevant.

Personal privacy would suffer, and the potential for government abuse would expand dramatically, because an individual or company whose records were sought would have no opportunity to object.

This would turn civil proceedings into fishing expeditions at a huge cost to individual privacy and the confidentiality of proprietary data.

Accordingly, we urge you to reject any proposal to give civil regulatory agencies a carve-out from the strong warrant protection of S. 607."

Joining the DCIA in signing this letter were sixty-six other organizations, including industry-leading companies Adobe, Dropbox, eBay, Facebook, Firehost, Foursquare, Google, Hedgehog Hosting, Hewlett Packard, Intel, LinkedIn, Microsoft, Oracle, Rackspace, Reddit, SAP, Sonic.net, Tumblr, Twitter, and Yahoo.

We urge American readers of DCINFO to contact your Senators and ask them to support S. 607 and reject the SEC proposal. Share wisely, and take care.

The Cloud Privacy Wars Are Coming

Excerpted from InfoWorld Report by David Linthicum

Germany's interior minister, Hans-Peter Friedrich — the country's top security official — cautioned privacy-conscious residents and organizations to steer clear of US-based service companies, according to the Associated Press.

As InfoWorld's Ted Samson has reported, "Friedrich is by no means the first EU politician to issue this type of warning, and as details continue to emerge about the US government's widespread surveillance programs, such warnings are certain to garner greater attention."

The blowback in Europe around NSA surveillance is no surprise. Privacy has always been a huge issue in Europe, as demonstrated by confrontations with Google, among others.

However, the real privacy wars in the cloud have yet to be fought, both in the United States and in Europe. This battle will likely occur in courtrooms and in government regulatory agencies.

The reality is that people who are working with cloud-based platforms won't stop using those platforms — but they will get much better at security and privacy.

With such improvements in security and privacy, law enforcement and government agencies won't have ready access to some data. That means legal battles will occur in many countries, with the use of remote data hosting services, such as cloud services, in the middle of those frays.

One result of businesses taking steps to ensure that their data won't be monitored by government agencies will be wider use of both encryption and physical restrictions on access.

However, if the government wants to see the data and obtains a court order (sometimes in secret), it will want access to that data.

To get that encrypted or restricted-access data in the cloud, the government will need an additional court order to gain access. That's when lawsuits will be filed and all hell will break loose.

Some people believe these issues can be avoided by not using public cloud providers. But that's naive. If the government wants your data and can justify its concerns to a judge, it will come after that data whether it's in your closet or in a cloud.

Welcome to the new world order.

AUS AG Gives Final Sign-Off on Private Data Stored in Cloud

Excerpted from Business Cloud News Report by Jonathan Brandon

Just over a month after the release of its official cloud computing policy, the Australian government announced a new set of risk assessment guidelines for all public sector organizations looking to store data in the cloud.

The Protective Security Framework builds on the National Cloud Computing Strategy announced in May this year, which was introduced to help promote the use of cloud services within the public sector. It sets out strict risk-assessment guidelines that must be followed by all public sector organizations looking to use cloud services.

The policy aims to help decision-makers in determining when to allow the use of offshoring or outsourcing on a case-by-case basis, and encourage "appropriate" use of cloud computing services.

The new Framework was introduced by Australia's Attorney-General, Mark Dreyfus, and the Minister Assisting for the Digital Economy, Kate Lundy, at a press conference Friday afternoon.

"I have paid special attention to the security of personal information, which people expect will be treated with the highest care by all organizations, but by government in particular," Dreyfus said.

The Framework ensures that information which doesn't require privacy protection can be stored and processed in outsourced environments after an agency-level risk assessment is completed. But privacy-protected information can only be stored in the cloud if suitable approvals are in place, which will require sign-off from the relevant portfolio Minister and the Attorney-General.

Additionally, classified information will only be allowed to be stored onshore, with the exception of Australian Embassies and other locations designated by "special agreements."

"Safeguards have been incorporated so that before personal information can be stored in the cloud, the approval of the Minister responsible for the information, and my own approval as Minister for privacy, must be given. This is to ensure that sufficient measures have been taken to mitigate potential risks to the security of that information," Dreyfus said.

"The safeguards we have put in place will ensure the Government can take advantage of cloud computing to reduce storage costs and improve efficiency while still ensuring the external storage and processing of data only occurs where the privacy of personal information can be properly protected," he said.

Cloud Seen as Answer to Big Data Needs

Excerpted from Federal Times Report by Nicole Johnson

Some agency managers are still reluctant to hand over their data to cloud computing providers, for fear of diminished security or potential hassles should they need to switch vendors.

But the government's growing appetite for collecting information is forcing agencies to consider the promised benefits of cloud computing, such as storing and managing their data for less money and the ability to access data from any device over the Internet.

The latter is a major tenet of the Obama administration's push to make more data accessible to the public, and to increase productivity by allowing employees to access systems and files remotely, or via the cloud.

"We see a lot of momentum behind these changes, but it's still in the early days," Mark Ryland, with Amazon Web Services' World Wide Public Sector, said of agencies using cloud computing to manage large amounts of varying and complex data. Agencies often refer to this as big data.

Amazon's recent $600 million cloud contract with the CIA, though interrupted by a protest, is proof of the momentum Ryland is seeing. Under the indefinite-delivery, indefinite-quantity contract with a four-year base period, Amazon was to provide a modified version of its public cloud, or cloud services available to the general public. However, the cloud hardware and software was required to be installed in a government facility and operated by Amazon.

But IBM protested the contract with the Government Accountability Office. GAO last month released a public version of its decision that agreed with some of IBM's complaints, including the CIA's failure to evaluate pricing for both vendors fairly.

In its decision, GAO recommended that the CIA reopen the competition and amend the solicitation to ensure bid proposals are evaluated fairly.

"The CIA and the intelligence community in general have said that they want to retain all of the data they collect," said Alex Rossino, a principal analyst at market research firm Deltek. "This means that storage will be an exponentially growing requirement for the CIA and a huge area of profit for industry partners."

The "$600 million was really just the beginning," Rossino said. "We are talking potentially doubling the contract to at least $1.2 billion" over a potential nine-year period.

Ryland said the CIA contract signals "that people looking to the future of information technology (IT) are seeing that the future lies in these large-scale commodity, computing architectures."

In other words, buying tons of servers and standing up massive data centers is not the way of the future for agencies. In fact, the government is shuttering hundreds of data centers and urging agencies to share common IT hardware and software applications. Agencies also have been instructed to consider cloud solutions as a first option for data storage and Internet hosting needs.

Historically, rarely accessed data have been stored on large tapes or other devices and shipped to storage facilities, said Joe Brown, president and cofounder of Accelera Solutions.

"Customers have a much more dynamic way to store, retrieve, and operate data in cloud environments," Brown said. "They don't have to call someone and say, 'Retrieve data, bring it to my office and let me reconstitute it on my server,'" or format the data to be analyzed.

"The ability to more quickly get to data sets ultimately allows you to perform analytics more quickly and test different data sets to get more useful, meaningful data out of them," Brown said.

Amazon Cloud Security Cleared by Federal Government

Excerpted from Midsize Insider Report by Marissa Tejada

Amazon's cloud security gained a nod of approval from the US government, according to a recent announcement by the online retailer. Through a new accreditation process with the federal government, Amazon can more easily provide IT services to various federal agencies looking to utilize the cloud.

The US government's new program, called FedRAMP, is enabling federal agencies to utilize Amazon cloud services through a division of Amazon, called Amazon Web Services or AWS.

Accreditation for the program means that AWS is able to provide cloud computing services to federal agencies for a total of three years. Amazon said that all AWS data centers in the United States have been accredited for approval under FedRAMP.

"This will cut the cost and time for agencies to deploy our systems," said Teresa Carlson, vice president of Worldwide Public Sector at AWS. "It cuts costs for AWS too."

Over the past few years, Amazon, which is the world's largest online retailer, has pushed sales for its IT services involving remote computing, IT storage and other services through AWS.

Smaller companies and startups had comprised much of the AWS business, but the firm is currently going after bigger companies and corporations that are interested in its array of IT services. With larger organizations, however, various security demands must be considered, as well as regulatory compliance rules, which are more stringent.

FedRAMP is a program the US government launched just last year in an effort to streamline and standardize cloud security and cloud services.

Prior to the inception of the FedRAMP program, vendors had a more difficult time approaching the federal government when it came to selling IT services. For example, vendors were required to win authorization on an individual project basis. That slowed down the selling process and in the end added on costs, making the services more expensive.

Now that the US government has approved AWS under the FedRAMP program, various IT services can be cleared for approval for a certain government agency just once.

Then, if those services are needed again there will be no need to seek authorization a second time. As a result AWS services can seamlessly be used multiple times by a federal agency.

AWS said the Department of Health and Human Services approved its first three-year clearance under the FedRAMP program. The approvals will apply not only to the Department of Health and Human Services but to all the divisions that operate under the department.

Divisions such as the US Food and Drug Administration, Centers for Disease Control and Prevention, and the National Institutes of Health also will be able to use various Amazon cloud computing services through AWS.

Multi-cloud usage on the rise in business - 07/05/2013

As cloud computing secures its place as a powerful business tool, the technology continues to evolve and grow. Companies are relying more on the cloud, meaning that the technology must be able to incorporate more to fulfill increasingly diverse needs.

A recent report found that cloud-based services are no longer bleeding-edge technology, and that three quarters of those surveyed are adopting cloud computing.

Channel Web said that integrating applications is becoming important for business, and that it's much simpler to do in the cloud. Applications in the cloud work and communicate more easily than on-premises alternatives.

The Rise of the Multi-Cloud in the Enterprise

Excerpted from CenterBeam Industry News Report

The technology industry is evolving and companies are adapting to the changes, according to Data Center Knowledge. The business world now demands a mix of services and the answer is proving to be a multi-cloud.

The report said that companies have various strategies for employing private, public or hybrid cloud computing services. But now multi-cloud services offer a multitude of options. Multi-clouds are becoming the favored option with 77 percent of respondents making this choice. In addition, if companies are already using a public cloud, they are more likely to have a multi-cloud strategy in place with 88 percent choosing multi-cloud.

There's a distinct middle ground between public and private clouds now that incorporates the public cloud's flexibility and the private cloud's security, said Data Center Knowledge.

Channel Web said that cloud providers need to develop new capabilities more quickly than ever before, especially since customers now want instant action in business dealings. Whether it's apps, services, or integration, the pace has quickened.

According to the report, as the cloud matures, businesses are more readily accepting of it. Currently, companies use the cloud mainly for testing, development, and customer-facing web apps. But now it's also being used for running internal web apps, mobile apps, and batch processing. In addition, the cloud provides many benefits, including better scalability, faster infrastructure access, and faster time to market for apps.

The report also noted that among those utilizing the cloud, more than 50 percent of respondents saw improvements such as increased app availability, better business continuity, increased IT efficiency, better app performance, and expanded geographic reach, proving that cloud computing, especially with the multi-cloud option, is an asset in today's business world.

Microsoft: Cloud Computing is the New Normal

Excerpted from Search IT Channel Report by Lynn Haber

On a mission to hammer home to partners that cloud is the new normal, Microsoft Chief Operating Officer Kevin Turner urged partners attending the Vision Keynote on Day 3 of the Microsoft Worldwide Partner Conference here on Wednesday to lean in and start transacting with Microsoft in the cloud.

"No other company has 22,000 partners worldwide as cloud partners," said Turner, thanking the early adopters. But clearly the company has a long way to go. "Microsoft wants all of its partners — meaning the other 630,000 partners — to also transact in the cloud," he added.

And this is the year the vendor wants to see the numbers explode; this is the time to double down on cloud, said Microsoft's Channel Chief, Jon Roskill.

What that means is that partners must have an answer for customers who inquire about the cloud. Partners must also be able to stitch together hybrid computing scenarios — on-premises and cloud-based — because "that's where we win," Roskill said.

The Corporate Vice President of the Worldwide Partner Group at Microsoft also pointed out that 70% of IT investment will still be on premises in 2016.

Even so, Microsoft cloud-oriented partners outperform their peers in gross profit, revenue per employee, new customer acquisition and growth, according to IDC's Darren Bibby, Vice President of Channels and Alliances Research.

To help partners down the Microsoft cloud computing path, Roskill shared five key insights from a new Microsoft-sponsored IDC report, dubbed "Successful Cloud Partners:"

1. Lead with cloud, close with hybrid. Use cloud as a door opener to talk to the customer, especially with new customers who want to have a conversation about the cloud.

2. Get to know the line-of-business leaders: the chief marketing officer, chief sales officer and COO. These executives hold the purse strings on a large portion of the IT budget. Today, 41% of IT budgets come from line-of-business functions, according to Roskill.

3. Optimize your team for cloud. Cloud-oriented partners get higher returns per employee because they optimize, or restructure, their team for the cloud. This can be done by reworking a partner's existing on-premises engagement model. In the end, companies that optimize their teams for cloud do so with fewer technical experts for more business with more customers. These companies are also able to offer customers fixed fees for project delivery. There is opportunity cost in retraining, but partners realized more value and more revenue per employee.

4. Partners must scale their business with their own intellectual property. Roskill referred to this insight as perhaps the most impactful in the IDC study. This is the value-add that partners will bring to the cloud conversation that they have with customers. And their IP will differentiate them and bring them higher margins, by offering broader services around the Microsoft stack, for example.

5. Take advantage of all the benefits of the Microsoft Partner Network. The average value of MPN membership is $320,000, according to the company, in the form of technical benefits, internal use rights, trainings and incentives. The IDC study found that the successful cloud-oriented partners used their MPN benefits 1.5 times more than traditional IT partners.

Melanie Gass, President of New York City-based CenterPoint Solution and a Microsoft Cloud Accelerate Partner, started a cloud division a year ago to expand her business and to help customers migrate to Microsoft cloud computing — to Office 365, SharePoint, InTune and Dynamics CRM.

"Some customers believe that it's a cost barrier to move to new technologies, but we show them that with cloud that isn't the case," she said.

CenterPoint is also utilizing Microsoft's Get2Modern, a joint campaign between Windows and Office to help Windows XP and Office 2003 customers migrate to Windows 8 Pro and the new Office 365 or Office on-premises.

Cloud Computing and Virtualization Are the Future

Excerpted from Host Review Report by Brent Johnson

Cloud computing and virtualization are two approaches to make more efficient use of hardware. The basic principle behind the cloud, which is similar to the old concept of on-demand computing, is that computing resources can be turned into products and delivered over the Internet.

A company or individual can then use these resources on a metered basis, paying only for the processing power and storage consumed. For many companies, utilizing cloud services may eventually shrink the need for in-house IT support staff, since the cloud provider manages and supports the infrastructure.
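The metered model is simple to reason about: the bill is just usage multiplied by unit rates. As a rough sketch (the rates and function below are hypothetical illustrations, not any provider's actual pricing), a monthly bill might be computed as:

```python
# Hypothetical metered-billing sketch. The rates here are illustrative
# only, not any actual cloud provider's pricing.

def monthly_cost(cpu_hours, storage_gb, rate_per_cpu_hour=0.05, rate_per_gb=0.10):
    """Bill = compute consumed + storage consumed, each at a flat unit rate."""
    return cpu_hours * rate_per_cpu_hour + storage_gb * rate_per_gb

# A small workload: one server-equivalent running all month (720 hours)
# plus 500 GB of storage.
bill = monthly_cost(cpu_hours=720, storage_gb=500)
print(f"${bill:.2f}")  # 720*0.05 + 500*0.10 = $86.00
```

The appeal for smaller organizations is that the second server, or the second month, costs nothing unless it is actually consumed.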

Virtualization technology allows you to take physical hardware and split it into multiple services or applications, all running simultaneously on one system. This lets organizations consolidate many servers onto a single hardware platform, saving substantial money.

There are many forms of virtualization, from hardware to storage to software. The options are endless, and more and more companies are creating specialized services that continue to enhance the capabilities available to you.

With the advent of virtualization and cloud computing comes less of a need for human IT staffing, as more end users leverage the Internet and more automation becomes possible. In fact, we are already at that point.

We are nearing a revolution in technology right before our very eyes. As an IT professional myself, I have embraced the fact that soon everything will be done offsite and virtually, which means you have to educate and train yourself on the new technology or you will be left behind.

Open source virtualization and cloud computing are also helping to spearhead the affordability of the cloud. Companies such as Rackspace, Citrix, and of course Red Hat all have their own flavors of open source virtualization technology that are, for the most part, free: you may only need to pay for a support subscription, while downloading and installing the cloud operating system costs nothing.

This may put pressure on market leader VMware to modify its virtualization costs so it doesn't lose market share. All in all, it is an exciting time if you are an IT professional or an organization getting ready to ramp up and upgrade; never before have there been more affordable and advanced solutions for putting your business online, secure, and remotely accessible.

We are in a transition period where technology is rapidly changing. Just 10 years ago we could not have imagined the type of virtualization we have available today; what will the next 10 years bring? I can't wait to see it.

Dropbox Expands its Reach

Excerpted from Information Week Report by Thomas Claburn

For home buyers, it's said, three things matter most: location, location and location. This oversimplification of real estate valuation holds true for computer file systems too: It matters where files reside.

Developers still have to specify file paths in their code to make file data accessible, to say nothing of the effect of distance on responsiveness when accessing a file over a network.

But the cloud computing metaphor argues for a world where location doesn't matter, at least as far as the consumer is concerned. File data simply exists somewhere in the cloud, to be accessed at the user's convenience.

Dropbox embraced this concept six years ago because it made sense from both a practical and a business perspective. Since then, the company has grown to 175 million users syncing more than 1 billion files daily.

In San Francisco, CA on Tuesday, at DBX, its first developer conference, Dropbox added a new Datastore API to its Dropbox Platform, to enable developers to sync more varied kinds of data across devices and platforms.

A follow-up to February's Sync API, which made it possible to sync files across multiple devices and platforms, the Datastore API provides support for syncing structured data, like contacts, to-do items and game state. Dropbox, in its FAQs, says the Datastore API is necessary because "apps need to store and sync more than just files."

In fact, such data exists in files, but those files are not normally exposed to end users. By providing developers with a way to make such structured data available across devices and apps, Dropbox is clouding the distinction between apps, devices and operating systems.

"When you use an app built with datastores your data will be up-to-date across all devices whether you're online or offline," Dropbox co-founders Drew Houston and Arash Ferdowsi wrote in a blog post.
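The core idea behind syncing structured data is per-record merging rather than whole-file replacement. The sketch below is not the actual Dropbox Datastore API; it is a minimal conceptual illustration, with hypothetical names, of how per-record revisions can reconcile offline edits from two devices using a last-write-wins rule:

```python
# Conceptual sketch of syncing structured records across devices.
# This is NOT the Dropbox Datastore API -- just an illustration of
# per-record merging with last-write-wins conflict resolution.

def merge_records(local, remote):
    """Merge two {record_id: (revision, value)} stores.

    For each record, the copy with the higher revision wins, so edits
    made offline on one device converge once both sides sync."""
    merged = dict(local)
    for rec_id, (rev, value) in remote.items():
        if rec_id not in merged or rev > merged[rec_id][0]:
            merged[rec_id] = (rev, value)
    return merged

# Two devices edited a to-do list while offline:
phone = {"task1": (2, "Buy milk (done)"), "task2": (1, "Call Bob")}
laptop = {"task1": (1, "Buy milk"), "task3": (1, "Book flight")}

synced = merge_records(phone, laptop)
# task1 keeps the phone's newer revision; task3 arrives from the laptop.
```

Merging at the record level, rather than at the file level, is what lets an app stay usable offline and converge later without one device's edits clobbering the other's.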

Dropbox also introduced Chooser for iOS and Android, a UI component that provides easy access to Dropbox files, and Saver, a UI component for saving files to Dropbox from the Web and mobile websites.

The major platform companies — Apple, Google and Microsoft — would prefer that their customers use their own storage services. So it is that Apple offers iCloud, Google offers Google Drive and Microsoft offers SkyDrive, each of which has its own APIs for integration with third-party apps.

Dropbox is likely to face an uphill battle convincing mobile game developers to rely on its Datastore API rather than APIs for Apple Game Center or Google Play Services, both of which support game state saving as well as other game-oriented functions.

Dropbox last year was the performance leader ahead of three competing services (Box, Google Drive, and Microsoft SkyDrive), according to a Pingdom survey. But this year it came in second (708 ms) in an overall speed comparison, behind Google Drive (549 ms).

10 Steps to Avoid Cloud Vendor Lock-In

Excerpted from ZDNet Report by Joe McKendrick

Along with security, one of the most difficult issues with cloud platforms is the risk of vendor lock-in. After assigning business processes and data to cloud service providers, it may prove messy and expensive to extricate yourself from the arrangement when it's time to make a change.

There are ways enterprises, as well as the industry in general, can address these lock-in issues. Solutions to potential vendor lock-in were recently surfaced in a new guide from The Open Group.

The guide, compiled by a team led by Kapil Bakshi and Mark Skilton, provides key pointers for enterprises seeking to develop independently functioning clouds, as well as recommendations to the industry on standards that need to be adopted or extended.

Here are 10 key problems and recommendations identified by The Open Group team for achieving cloud formations based on standards, rather than on vendor technology:

Platform-platform interfaces: A universally used standard for platform-platform interfaces — the Internet, HTTP and message envelope layers of web service APIs — "would provide a major part of the basis for real cloud interoperability," the guide states. The two styles of web service interface handled by platforms — WS-I and raw HTTP — each has strengths for specific applications. "Many small-scale integrations that originally used WS-I with SOAP and XML, because JSON was not mature enough at the time, are now moving towards raw HTTP and JavaScript Object Notation because it is better suited to their needs. However, for enterprise-level integrations, SOAP is still king."

Recommendations: Use WS-I as the service platform interoperability interface between cloud services. Use direct HTTP with JSON as the service platform interoperability interface. "The industry should identify best practice in use of direct HTTP and JSON, including means of authentication and access control (such as OAuth), and develop standard profiles for interoperability between service platforms using this approach."
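The "direct HTTP with JSON" style the guide describes can be illustrated with a short sketch using only the Python standard library. The endpoint URL and access token below are placeholders, and the request is only constructed, not sent; the Authorization header carries an OAuth 2.0-style bearer token, one of the access-control means the guide mentions.

```python
# Minimal sketch of a direct HTTP + JSON service call with bearer-token
# access control. URL and token are placeholders; the request is built
# but not sent, so the sketch runs without a network connection.
import json
import urllib.request

def build_json_request(url, payload, token):
    body = json.dumps(payload).encode("utf-8")
    req = urllib.request.Request(url, data=body, method="POST")
    req.add_header("Content-Type", "application/json")
    req.add_header("Accept", "application/json")
    req.add_header("Authorization", "Bearer " + token)  # OAuth 2.0 bearer scheme
    return req

req = build_json_request(
    "https://api.example.com/v1/orders",  # placeholder endpoint
    {"item": "widget", "quantity": 3},
    "example-access-token",               # placeholder OAuth access token
)
print(req.get_header("Authorization"))  # Bearer example-access-token
# urllib.request.urlopen(req) would send it; the JSON response body
# would then be decoded with json.loads.
```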

Application-platform interfaces: "Currently, there are a number of programming languages that might be used for the interface; there is no agreement on what functionality is needed; there are no commonly-accepted application-platform interface standards that cover the full range of functionality, however it might be agreed. There are, however, products, both commercial and open source, that implement parts of the functionality, such as Enterprise Service Buses (ESBs), and some vendor-independent interface standards for part of the functionality, such as the Java Message Service (JMS)."

Recommendations: "Enterprises should seek to use cloud platforms with vendor-independent programming interfaces." "PaaS vendors stating that they support .NET or J2EE should say which versions they support." "A language-independent specification of a standard cloud application platform interface should be defined." "Instantiations of this should then be developed for the most widely-used programming languages."

Service descriptions: The accepted standard for service descriptions, the Web Service Description Language (WSDL), has limitations, the guide says: "Its descriptions are machine-readable rather than human-friendly; it describes the functional characteristics of services, but does not cover non-functional characteristics such as quality of service and conditions of contract; it has no real ability to describe service data models; and it applies to services that use the WS-I approach, but not to services that use the direct HTTP approach." Bodies working to develop standards for service descriptions that address some of these limitations include the Web Application Description Language (WADL) authors, the Open Data Center Alliance (ODCA), the SLA@SOI Consortium, and the OASIS TOSCA Technical Committee.

Recommendations: Enterprises developing services should "produce clear human-readable descriptions of them, covering functional and non-functional characteristics." "Enterprises developing services using the WS-I approach should also produce WSDL descriptions of them." Service consumers should "insist on the availability of clear and stable human-readable descriptions and, for services using the WS-I approach, of WSDL descriptions." "The industry should work to establish best practice for human-readable service descriptions covering all service characteristics, building on the work of bodies currently active in this area." "The industry should work to establish standards for machine-readable service descriptions, including templates and component schemas." "These standards should cover all service characteristics and parallel the human-readable descriptions. They should include or be linked to descriptions of service data models, and be applicable to services that use the direct HTTP approach as well as to those that use the WS-I approach. WSDL forms a good starting point for such standards."

Service management interfaces: "Standardization of these interfaces will enable the development of cloud management systems as commercial off-the-shelf products," according to the guide. Initiatives already underway include the DMTF Cloud Infrastructure Management Interface (CIMI) and Virtualization Management (VMAN) standards, the OASIS Topology and Orchestration Specification for Cloud Applications (TOSCA), the Open Grid Forum Open Cloud Computing Interface (OCCI), and the SNIA Cloud Data Management Interface (CDMI). The OpenStack APIs may also provide de facto standards.

Recommendations: Ensure that "management interfaces follow emerging standards where possible." Look for services "whose management interfaces follow emerging standards." "The industry should support the ongoing cloud management standardization work, including the work in the DMTF, OASIS, OGF, and SNIA, and the OpenStack open source initiative."

Data models: "These do not cover the new 'NoSQL' paradigms that are increasingly being used in cloud computing," the guide states. "Also, the schema standards do not enable correspondences between different data models to be established, and this is crucial for interoperability. The semantic web standards and the Universal Data Element Framework (UDEF) can be used to define correspondence between data models, but their application is not widely understood, and they are little used."

Recommendations: Describe data models clearly, "using text and applicable schema standards. The descriptions should be computer-readable and have good human-readable documentation. A well documented XML schema would achieve this, for example, but just using XML probably would not." Look for clear data model descriptions. "The industry should establish best practice to describe correspondences between data models, should ensure that the standards in this area are fit for purpose, and should work to improve understanding of them."

Loose coupling: "Tightly-coupled components are difficult and expensive to integrate, particularly over the lifetime of a system that undergoes change (as most do)."

Recommendations: "Cloud application components should be loosely coupled with the application components that interact with them."
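The loose-coupling recommendation is often realized by coding against a provider-neutral interface. The minimal Python sketch below (all class and function names are hypothetical) shows an application that depends only on an abstract storage interface, so a concrete cloud provider could be swapped in without touching application code.

```python
# Loose coupling in miniature: the application depends on an abstract
# storage interface, not on any one provider's SDK. Names are hypothetical.
from abc import ABC, abstractmethod

class BlobStore(ABC):
    """Provider-neutral interface the application codes against."""

    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...

class InMemoryStore(BlobStore):
    """Stand-in provider; a cloud-backed class would implement the same API."""

    def __init__(self):
        self._blobs = {}

    def put(self, key, data):
        self._blobs[key] = data

    def get(self, key):
        return self._blobs[key]

def archive_report(store: BlobStore, name: str, text: str) -> None:
    # Application logic never sees which provider is behind the interface.
    store.put("reports/" + name, text.encode("utf-8"))

store = InMemoryStore()
archive_report(store, "q3.txt", "revenue up")
print(store.get("reports/q3.txt").decode("utf-8"))  # revenue up
```

Switching providers then means writing one new `BlobStore` subclass rather than editing every call site, which is the cost reduction the guide is pointing at.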

Service orientation: "Cloud offerings are packaged as services (IaaS, PaaS, SaaS). Cloud platform-platform interfaces, whether in the WS-I or raw HTTP style, assume client-server interaction. Service orientation encompasses and reinforces other principles — loose coupling, service descriptions, and described interfaces."

Recommendation: "Cloud applications should be service-oriented."

Marketplaces: "Use of marketplaces and app stores is growing, but there are as yet no standards or established good practice for their operation," according to the guide. "This means that product vendors must cater for the different requirements and practices of all the marketplaces in which their products appear, that customers must understand the different features of all the marketplaces that they use, and that marketplace operators are spending effort on unnecessary differentiation."

Recommendation: "Industry bodies should seek to identify the best practices for marketplace operation, with a view to defining standards and working with governments on any legislation that may be needed to underpin them."

Representational State Transfer (REST): "There is a need for robust and scalable services that are loosely-coupled and have stable interfaces that are easy to describe."

Recommendation: "Applications should be designed using the Representational State Transfer (REST) style, though without insisting on its full rigor."
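A rough Python sketch of the REST style follows, assuming a hypothetical task resource: stable URIs identify resources, a small uniform set of methods operates on them, and each request is handled statelessly. The hand-rolled routing is for illustration only, not a real framework.

```python
# REST-style sketch: resources addressed by stable URIs (/tasks/<id>),
# manipulated through a uniform interface (GET, PUT), with each request
# self-contained. The task resource and its fields are hypothetical.
import json

TASKS = {"1": {"title": "write report", "done": False}}  # resource state

def handle(method, path, body=None):
    """Dispatch a request against /tasks/<id> resource URIs."""
    parts = path.strip("/").split("/")
    if parts[0] != "tasks" or len(parts) != 2:
        return 404, ""
    task_id = parts[1]
    if method == "GET":
        task = TASKS.get(task_id)
        return (200, json.dumps(task)) if task else (404, "")
    if method == "PUT":
        # PUT is idempotent: repeating the request leaves the same state.
        TASKS[task_id] = json.loads(body)
        return 200, ""
    return 405, ""  # uniform interface: other methods are rejected

status, body = handle("GET", "/tasks/1")
print(status, json.loads(body)["done"])  # 200 False
handle("PUT", "/tasks/1", json.dumps({"title": "write report", "done": True}))
print(handle("GET", "/tasks/1"))
```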

Machine image formats: "The ability to load a machine image containing an application together with its application platform onto different cloud infrastructure services is a new form of portability that is made possible by cloud computing. A standard machine image format makes portability possible across different infrastructure service providers, as well as across infrastructure services of a single provider."

Recommendations: "The Open Virtualization Format (DMTF OVF) standard is designed to meet the need for a machine image format standard." Evaluate the OVF standard "and support it if feasible."
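A DMTF OVF descriptor is an XML envelope, so basic information can be read from one with standard library tooling alone. The fragment below is heavily simplified (it is not a complete, valid OVF package) and serves only to illustrate the general shape of the format.

```python
# Sketch of reading the virtual-system name and disk file references from
# a simplified OVF descriptor fragment, using only the standard library.
import xml.etree.ElementTree as ET

OVF_NS = "http://schemas.dmtf.org/ovf/envelope/1"  # OVF 1.x namespace

DESCRIPTOR = """\
<Envelope xmlns="{ns}" xmlns:ovf="{ns}">
  <References>
    <File ovf:id="disk1" ovf:href="appliance-disk1.vmdk"/>
  </References>
  <VirtualSystem ovf:id="web-appliance">
    <Info>A single virtual machine</Info>
  </VirtualSystem>
</Envelope>
""".format(ns=OVF_NS)

root = ET.fromstring(DESCRIPTOR)
system = root.find("{%s}VirtualSystem" % OVF_NS)
print(system.get("{%s}id" % OVF_NS))       # web-appliance
for f in root.iter("{%s}File" % OVF_NS):
    print(f.get("{%s}href" % OVF_NS))      # appliance-disk1.vmdk
```

Because the envelope is plain, namespaced XML, any provider's tooling can consume the same descriptor, which is the portability argument behind the recommendation.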

Hulu Sale Called Off

Excerpted from Variety Report by Todd Spangler

Hulu is no longer for sale — and its current owners claim their strategies for the Internet TV site are fully in sync.

On Friday, 21st Century Fox, NBCUniversal and The Walt Disney Co. jointly announced that they will maintain their respective ownership positions in Hulu and together provide a cash infusion of $750 million "to propel future growth."

While the move indicates Hulu's owners didn't like the final offers on the table, which they received last Friday, company insiders said the decision was more about settling on the right strategy for "the future of where our content was going to live" and noted that keeping Hulu was always one of the outcomes under consideration.

DirecTV, a combo bid from AT&T and Chernin Group, and Time Warner Cable had been the last three contenders. Hulu's owners were originally hoping to bring in $2 billion for the company, but bids had come in at less than $1 billion. It's not known what the final offers from DirecTV, AT&T-Chernin and TWC (which was seeking a partial ownership stake) were.

According to Disney and 21st Century Fox execs, they came to the conclusion to keep Hulu in the fold not because the bids were too low but rather because they feel it has the potential to become an even bigger player in subscription VOD and take on Netflix and others.

Disney chairman and CEO Bob Iger told reporters at the Allen & Co. conference in Sun Valley, Idaho, on Friday that the decision was not the result of evaluating the bids, calling them "good, solid offers."

"The future of Hulu is bright, and if the future of Hulu is bright then we should hold on to it," he told reporters.

In a prepared statement, Iger said, "Hulu has emerged as one of the most consumer-friendly, technologically innovative viewing platforms in the digital era. As its evolution continues, Disney and its partners are committing resources to enable Hulu to achieve its maximum potential."

Chase Carey, prexy and COO of 21st Century Fox, commented, "We had meaningful conversations with a number of potential partners and buyers, each with impressive plans and offers to match, but with 21st Century Fox and Disney fully aligned in our collective vision and goals for the business, we decided to continue to empower the Hulu team, in this fashion, to continue the incredible momentum they've built over the last few years."

News Corp. and NBC launched Hulu in 2008, as a response to the rise of YouTube and other online-video sites. The media companies were looking for a way to aggregate premium TV content in a way they could control and monetize. Disney came on board in 2009.

In 2011, Hulu's parents had looked to unload their joint venture, but similarly called that off after engaging in discussions with interested buyers reported to have included Google, Yahoo, Dish Network and Amazon.

Hulu now has more than 400 content partners and boasts more than 30 million monthly unique visitors, according to the company. In 2010, it launched Hulu Plus, an $8-per-month service that provides additional content not available on the free site, including past seasons of many TV series. Hulu Plus, which has more than 4 million subscribers, also provides access across multiple Internet-connected devices, including TVs and tablets.

DirecTV and Chernin with AT&T had been viewed as front-runners, but sources close to the Hulu talks had cautioned that 21st Century Fox, NBCU and Disney were just as close to pulling Hulu off the market as completing a sale. While a higher bid may have compelled the owners to proceed with a sale, the potential buyers had complained that the content-rights restrictions Hulu's parents were seeking reduced the site's value.

While Hulu's ownership picture is clear, myriad questions still surround its future strategic direction. Will the streaming service continue its hybrid model of free ad-supported viewing and monthly subscription fees? Disney and 21st Century Fox (which has split off from News Corp) have had different opinions on the matter, and it's not clear whether their keeping Hulu means there's a final resolution on that front.

Meanwhile, it's an open question as to whether their holding on to Hulu would preclude another investor coming in and joining them. Time Warner Cable was mentioned during the bidding process as potentially interested in taking a stake, as Providence Equity Partners did before exiting as a fourth partner once its investment turned a profit.

Since the second round of Hulu negotiations got underway earlier this year, Hulu has seen the departure of many high-level execs including CEO Jason Kilar, who had clashed with the company's owners over strategy. He was replaced on an interim basis by senior veep of content Andy Forssell; Hulu's owners are not currently searching for a new CEO but may revisit management plans.

Hulu had a fourth investor — Providence Equity Partners — whose 10% stake NBCU, Disney and 21st Century Fox bought out last October.

News of Hulu's aborted sale was almost a non-story for insiders milling about the Sun Valley confab Friday morning. Days of chatter at the event about how valuable a market had been created by rival service Netflix may have tipped the balance for execs at the congloms who were weighing the decision, according to a source familiar with the talks.

Data Storage that Could Outlast the Human Race

Excerpted from Slashdot Report

Nerval's Lobster writes "Just in case you haven't been keeping up with the latest in five-dimensional digital data storage using femtosecond-laser inscription, here's an update: it works.

A team of researchers at the University of Southampton has demonstrated a way to record and retrieve as much as 360 terabytes of digital data onto a single disk of quartz glass in a way that can withstand temperatures of up to 1,000°C and should keep the data stable and readable for up to a million years.

'It is thrilling to think that we have created the first document which will likely survive the human race,' said Peter Kazansky, professor of physical optoelectronics at the Univ. of Southampton's Optical Research Centre.

'This technology can secure the last evidence of civilization: all we've learnt will not be forgotten.'

Leaving aside the question of how many Twitter posts and Facebook updates really need to be preserved longer than the human species, the technology appears to have tremendous potential for low-cost, long-term, high-volume archiving of enormous databanks.

The quartz-glass technique relies on lasers pulsing one quadrillion times per second through a modulator that splits each pulse into 256 beams, generating a holographic image that is recorded on self-assembled nanostructures within a disk of fused-quartz glass.

The data are stored in a five-dimensional matrix—the size and directional orientation of each nanostructured dot becomes dimensions four and five, in addition to the usual X, Y and Z axes that describe physical location.
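As a back-of-the-envelope illustration of how the two extra dimensions add capacity (the counts of distinguishable sizes and orientations below are assumptions for the sake of arithmetic, not figures from the research): if each dot can take one of 4 sizes and one of 4 orientations, those two properties together carry 4 extra bits per dot on top of its X, Y, Z position.

```python
# Toy arithmetic for the "extra dimensions" idea: a dot with S sizes and
# O orientations encodes log2(S * O) bits beyond its position. The counts
# here are illustrative assumptions, not the actual recording parameters.
import math

SIZES = 4          # assumed number of distinguishable dot sizes
ORIENTATIONS = 4   # assumed number of distinguishable orientations

bits_per_dot = math.log2(SIZES * ORIENTATIONS)

def encode(nibble):
    """Map 4 bits of data onto a (size index, orientation index) pair."""
    return divmod(nibble, ORIENTATIONS)

def decode(size_idx, orient_idx):
    """Recover the 4 bits from a dot's measured size and orientation."""
    return size_idx * ORIENTATIONS + orient_idx

print(bits_per_dot)          # 4.0
print(decode(*encode(13)))   # 13
```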

Files are written in three layers of dots, separated by five micrometers within a disk of quartz glass nicknamed 'Superman memory crystal' by researchers. (Hitachi has also been researching something similar.)"

Coming Events of Interest

BUILD, BUY, OR RENT - July 16th Webinar at 11:00 AM ET. How should you implement your optimal cloud storage solution? This webinar will compare the three options, including an in-depth comparison of monthly Amazon pricing and OpenStack build-it-yourself costs, helping you optimize your plan for your business needs and growth.

Cloud Computing Summit - July 16th-17th in Bradenton, South Africa. Advance your awareness of the latest trends and innovations from the world of cloud computing. This year's ITWeb-sponsored event will focus on key advances relating to the infrastructure, operations, and available services through the global network.

NordiCloud 2013 - September 1st-3rd in Oslo, Norway. The Nordic Symposium on Cloud Computing & Internet Technologies (NordiCloud) aims at providing an industrial and scientific forum for enhancing collaboration between industry and academic communities from Nordic and Baltic countries in the area of Cloud Computing and Internet Technologies.

P2P 2013: IEEE International Conference on Peer-to-Peer Computing - September 9th-11th in Trento, Italy. The IEEE P2P Conference is a forum to present and discuss all aspects of mostly decentralized, large-scale distributed systems and applications. This forum furthers the state-of-the-art in the design and analysis of large-scale distributed applications and systems.

CLOUD COMPUTING WEST 2013 - October 27th-29th in Las Vegas, NV. Two major conference tracks will zero in on the latest advances in applying cloud-based solutions to all aspects of high-value entertainment content production, storage, and delivery; and the impact of mobile cloud computing and Big Data analytics in this space.

International CES - January 7th-10th in Las Vegas, NV.  The International CES is the global stage for innovation reaching across global markets, connecting the industry and enabling CE innovations to grow and thrive. The International CES is owned and produced by the Consumer Electronics Association (CEA), the preeminent trade association promoting growth in the $209 billion US consumer electronics industry.

CONNECTING TO THE CLOUD - January 8th in Las Vegas, NV. This DCIA Conference within CES will highlight the very latest advancements in cloud-based solutions that are now revolutionizing the consumer electronics (CE) sector. Special attention will be given to the impact on consumers, telecom industries, the media, and CE manufacturers of accessing and interacting with cloud-based services using connected devices.

CCISA 2013 – February 12th–14th in Turin, Italy. The second international special session on Cloud Computing and Infrastructure as a Service (IaaS) and its Applications, within the 22nd Euromicro International Conference on Parallel, Distributed, and Network-Based Processing.

Copyright 2008 Distributed Computing Industry Association
This page last updated July 21, 2013
Privacy Policy