Distributed Computing Industry
Weekly Newsletter

In This Issue

Partners & Sponsors

ABI Research

Aspera

Citrix

Oracle

Savvis

SoftServe

TransLattice

Vasco

Cloud News

CloudCoverTV

P2P Safety

Clouderati

eCLOUD

fCLOUD

gCLOUD

hCLOUD

mCLOUD

Industry News

Data Bank

Techno Features

Anti-Piracy

January 6, 2014
Volume XLVI, Issue 8


Visit the DCIA at the International CES Show

This week, the DCIA will be exhibiting at the 2014 International Consumer Electronics Show (CES), as well as presenting CONNECTING TO THE CLOUD (CTTC), a Conference within CES, at the Las Vegas Convention Center, Las Vegas, NV.

Please register now for CTTC at CES.

From Tuesday January 7th through Friday January 10th, visit the DCIA at Exhibit Booth 31800 on the Second Floor of the South Hall of the Convention Center, in the Allied Associations Pavilion. Personally discuss DCIA Membership Benefits and Options with professional staff members.

On Wednesday January 8th, attend the DCIA's CTTC in Conference Room N262 on the Second Floor of the North Hall of the Convention Center, as one of the CES Conference Tracks. Please click here for a preview of the Conference Brochure.

CTTC at CES will highlight the very latest advancements in cloud-based solutions that are now revolutionizing the consumer electronics (CE) sector or - as ABI Research's Sam Rosen dubbed the category at CLOUD COMPUTING WEST - the "cloud electronics (CE) sector."

fCLOUD Speakers for CLOUD COMPUTING EAST 2014

As previously announced, the DCIA & CCA will present CLOUD COMPUTING EAST (CCE:2014), a strategic summit for business leaders and software developers, in Washington, DC from Tuesday May 13th through Wednesday May 14th.

As one of three tracks within CCE:2014, fCLOUD (The Financial Services Cloud) will thoroughly examine the ways that financial transactions and currency exchange, domestic banking and insurance services, as well as efficient investment decision-making are all being impacted by cloud computing. It will also address liabilities and challenges that need to concern financial services organizations regarding cloud-based services.

Topics will include: international financial activities impacted by cloud computing; how banks and insurance companies are migrating to the cloud; private equity and hedge fund investor use of cloud computing; and more.

If you would like to speak at this major industry event, please contact the DCIA at your earliest convenience. We're interested in representatives of banks, insurance companies, investment firms, and solutions providers to discuss how cloud computing is serving every part of the financial services sector.

We're finalizing the conference agenda and speakers list to be included in the promotional materials and conference program now. Call or e-mail at your earliest convenience — THIS WEEK — for more information.

DCIA Member companies and DCINFO subscribers also received invitations during the last two weeks to speak on the gCLOUD (The Government Cloud) track and hCLOUD (The Healthcare Cloud) track.

Please contact the CCA for attractive exhibition and sponsorship opportunities.

Report from CEO Marty Lafferty

The DCIA is grateful for your support and welcomes your active participation in 2014, which promises to be the best year yet for the distributed computing industry.

During 2013, we experienced substantial increases in adoption, as the businesses of a growing array of cloud services providers (CSPs) — ranging from major established technology companies to new start-up firms — gathered momentum.

Analysts, researchers, and other industry observers now predict that cloud platform revenue will double over the next two years.

As 2014 gets underway, it's useful to examine lessons learned as well as celebrate the successes achieved during this period of increasingly dynamic growth.

Early in the year, we underscored advances in reliability and security at our CLOUD COMPUTING CONFERENCE at the 2013 NAB Show.

Later in the year, first at CLOUD COMPUTING EAST 2013 and then at CLOUD COMPUTING WEST 2013, with our partner organization Cloud Computing Association (CCA), we showcased the growing number of successful implementations of cloud-based services in the government, healthcare, financial services, and media/entertainment sectors, as well as acceleration in growth of mobile cloud solutions and big data.

And finally, at GOVERNMENT VIDEO IN THE CLOUD, we highlighted this important area of special interest.

An outstanding need we must address in 2014 is outage prevention. With massive numbers of consumers and the businesses of large enterprises relying on the cloud, as opposed to a small number of experimental end-users, unplanned downtime is unacceptable.

As an industry, we need to make 2014 the year that fail-proof redundancy, through a combination of enhanced architectures and improved practices, finally becomes a reality.

On the security front, 2013 revealed "virtual spying" as a fundamental and preventable problem. For virtual machines (VMs) co-located within the same server, code needs to be written to detect and block unauthorized access to software and data across partitions intended to ensure privacy.

Industry-wide, we need to initiate verification mechanisms that provide customers using VMs with total integrity assurance.

This is very important for commercial deployments that rely on multi-tenant, virtual-server hosts, where one physical server is used by multiple companies and customers.
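
As a purely illustrative building block (not a proposal for the industry-wide mechanism called for above), the Python sketch below checks that a tenant's virtual machine image still matches the cryptographic digest recorded when it was provisioned; the file path and provisioning workflow are hypothetical.

    import hashlib

    def sha256_of_image(path, chunk_size=1 << 20):
        """Stream a VM disk image from disk and return its SHA-256 hex digest."""
        digest = hashlib.sha256()
        with open(path, "rb") as image:
            for chunk in iter(lambda: image.read(chunk_size), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def image_is_unmodified(path, expected_digest):
        """True only if the image still matches the digest recorded at provisioning."""
        return sha256_of_image(path) == expected_digest

    # Hypothetical usage: the baseline digest would come from a provisioning record.
    # if not image_is_unmodified("/var/lib/images/tenant-a.qcow2", baseline_digest):
    #     raise RuntimeError("tenant-a image no longer matches its provisioning record")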

2014 will be the year of the hybrid cloud, combining the security of private clouds with the scalability of public cloud services. Hybrid cloud will offer affordable infrastructure to small-scale enterprises and custom solutions for big data analytics.

In 2014, we will support the development of new web-powered apps with platform independence as one of their prominent features.

Enterprise cloud end-users want cloud-based applications that can unify their information and apps through a single data model.

The Industrial Internet will come closer to reality, possibly with lab testing and field trials. Real-time industrial processes and more efficient smart data management will be its key attributes.

Industries will be able to use real-time data for improving their processes and use the information to take action. Cloud computing will play a key role in creating intelligent machines with central controllability.

And finally, billing practices need to be re-examined in 2014. As the cloud-computing sector matures, we owe it to our customers to eliminate the need for deciphering complex and inconsistent billing formats.

During 2013, we experienced multiple vendors offering different configurations and definitions of their virtual CPUs, for example. Suppliers not only offered various levels of service and volume discounts — a good thing — but they also defined combinations of memory, CPU, and storage differently from one another.

This made direct comparisons difficult and actually impeded commercial advancement.

Instead of requiring that prospective users gather, arrange into spreadsheets, and analyze reams of information, we can serve both the supply and the consumption sides better with common, plain-language terminology for services provided.
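
For illustration only, here is a minimal Python sketch of the kind of normalization prospective buyers currently do by hand in spreadsheets: it reduces two hypothetical, differently-bundled instance offers to a rough price per vCPU-hour. The vendors, bundles, and prices are invented for the example.

    from dataclasses import dataclass

    @dataclass
    class Offer:
        vendor: str
        vcpus: int             # vendor-defined virtual CPUs
        memory_gb: float       # bundled memory
        price_per_hour: float  # hourly list price, USD

    def price_per_vcpu_hour(offer):
        """Crude comparison metric: hourly list price divided by vCPU count."""
        return offer.price_per_hour / offer.vcpus

    # Hypothetical offers; real comparisons need each vendor's actual definitions.
    offers = [
        Offer("Vendor A", vcpus=2, memory_gb=7.5, price_per_hour=0.14),
        Offer("Vendor B", vcpus=4, memory_gb=8.0, price_per_hour=0.25),
    ]

    for offer in sorted(offers, key=price_per_vcpu_hour):
        print(f"{offer.vendor}: ${price_per_vcpu_hour(offer):.3f} per vCPU-hour")

Even this simple metric is only as good as its inputs, since vendors define a "vCPU" differently from one another, which is precisely why common, plain-language terminology is needed.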

Stay tuned and stay involved as a DCIA Member Company, working group participant, or delegate at our conferences, such as CONNECTING TO THE CLOUD and CLOUD COMPUTING EAST 2014.

There's much more to come. Share wisely, and take care.

CONNECTING TO THE CLOUD at CES

This Wednesday January 8th, the DCIA will present CONNECTING TO THE CLOUD (CTTC), a Conference within the 2014 International Consumer Electronics Show (CES) in N262 of the North Hall at the Las Vegas Convention Center, Las Vegas, NV.

This is the best way we know to preview coming developments in this vital topic area for 2014, gauge the impact of these advancements on your business, and network with industry leaders.

At 1:00 PM, an opening panel presented by Verizon Digital Media Services, featuring Jonathan Perelman, GM Video & VP Agency Strategy, BuzzFeed; Adam Ostrow, Chief Strategy Officer, Mashable; Michael Schneider, CEO, Mobile Roadie; and Jason Baptiste, CMO, Onswipe; and moderated by Dan Cryan, Senior Director of Digital Media, IHS / Screen Digest, will examine "Millennials, Online TV, and Gaming: Now and Tomorrow."

What are the implications of the digital revolution in the way Millennials discover, access, and consume video, music, and gaming content online? Hear it first-hand from young voices representing leading companies in the digital, social, and tech arenas.

At 1:45 PM, Robert Stevenson, Chief Business Officer & VP of Strategy, Gaikai, will further examine this issue in a keynote address entitled, "Tomorrow Is Now: What We Are Connecting to the Cloud."

At 2:00 PM in a solo presentation, Sam Rosen, Practice Director, TV & Video, Consumer Electronics, ABI Research, will address, "Where Are There Problems Connecting to the Cloud?"

Next, at 2:15 PM, Reza Rassool, Chief Technology Officer, Kwaai Oak, will further expose "Consumer Drawbacks of Cloud-Delivered Content: Availability, Reliability, Scalability Issues."

The follow-on panel at 2:30 PM with Andy Gottlieb, VP, Product Management, Aryaka; Larry Freedman, Partner, Edwards Wildman Palmer; David Hassoun, Owner & Partner, RealEyes Media; Jay Gleason, Cloud Solutions Manager, Sprint; and Grant Kirkwood, Co-Founder, Unitas Global, will discuss "The Impact on Telecommunications Industries of Cloud Computing."

Then two sessions, starting at 3:00 PM, will delve into "Telecommunications Industry Benefits of Cloud-Delivered Content: New Opportunities" with Doug Pasko, Principal Member of Technical Staff, Verizon Communications. And at 3:15 PM, "Telecommunications Industry Drawbacks of Cloud-Delivered Content: Infrastructure Challenges" with Allan McLennan, President & Chief Analyst, PADEM Group.

The next panel, at 3:30 PM, will address "The Impact on Entertainment Industries of Cloud Computing" with Jay Migliaccio, Director of Cloud Platforms & Services, Aspera; Mike King, Dir. of Mktg. for Cloud, Content & Media, DataDirect Networks; Venkat Uppuluri, VP of Marketing, Gaian Solutions; Mike West, Chief Technology Officer, GenosTV; George Dolbier, CTO, Social & Interactive Media, IBM; Kurt Kyle, Media Industry Principal, SAP America; and Adam Powers, VP of Media Technology & Solutions, V2Solutions.

Two solo presentations, at 4:15 PM with Les Ottolenghi, Global CIO, Las Vegas Sands Corporation, and, at 4:30 PM, Saul Berman, Partner & Vice President, IBM Global Business Services, will highlight "Entertainment Industry Benefits of Cloud Computing: Cost Savings & Efficiency" and "Entertainment Industry Drawbacks of Cloud Computing: Disruption & Security" respectively.

Additional sessions, starting at 4:45 PM, will introduce the subjects "Consumer Electronics Industry Benefits of Cloud-Based Services: New Revenue Streams" with Mikey Cohen, Architect & Principal Engineer, Netflix, and, at 5:00 PM, "Consumer Benefits of Cloud-Delivered Content: Ubiquity, Cost, Portability Improvements," with Joshua Danovitz, VP of Innovation, TiVo.

The closing panel at 5:15 PM will draw on all the preceding sessions to more deeply analyze "The Impact on the Consumer Electronics Industry of Cloud Computing" with Melody Yuhn, CTO, CSS Corp.; Michael Elliott, Enterprise Cloud Evangelist, Dell; David Frerichs, President, Media Tuners; Thierry Lehartel, VP, Product Management, Rovi; Russ Hertzberg, VP, Technology Solutions, SoftServe; Guido Ciburski, CEO, Telecontrol; and Scott Vouri, VP of Marketing, Western Digital.

Top program topics will include case studies on how cloud-based solutions are now being deployed for fixed and mobile CE products - successes and challenges; the effects on consumers of having access to services in the cloud anytime from anywhere - along with related social networking trends.

Also featured will be what broadband network operators and mobile Internet access providers are doing to help manage - and spur - the migration to interoperable cloud services.

Some in traditional entertainment industries find this technology overwhelmingly threatening and disruptive, while others see enormous new opportunities. The value proposition for CE manufacturers will also continue to shift substantially toward providing cloud-based value-adding services rather than conventional hardware features.

Please register now for CTTC at CES.

Cloud Computing: Not Nearly Hyped Enough

Excerpted from TechRepublic Report by Matt Asay

In the wake of big data, cloud seems like old news. Yet as Host Analytics CEO Dave Kellogg points out, we've barely scratched the surface on cloud adoption, even in established categories like software-as-a-service (SaaS).

Cloud, in other words, is not nearly hyped enough.

In 2013 the overall cloud pie grew to $131 billion, including such diverse segments as infrastructure-as-a-service (IaaS), advertising services, and SaaS, according to Gartner.

Yet big as that sounds, we're nowhere near saturation, as Kellogg highlights:

"IDC predicts that aggregate cloud spending will see amazing growth at a scale of 25%. Those are big numbers, but think about this: some 15 years after Salesforce.com was founded, its head pin category, sales force automation (SFA), is still only around 40% penetrated by the cloud. ERP is less than 10% in the cloud. EPM is less than 5% in the cloud."

Kellogg goes on to quote Bill Gates, who once stated that, "We always overestimate the change that will occur in the next two years and underestimate the change that will occur in the next ten."

It's clear that enterprise IT is moving to the cloud. What's less clear is just how long it will take.

If anything, the cloud is underhyped, and by quite a lot.

Just ask Forrester analyst Jeffrey Hammond, who notes that "hype" is in the eye of the beholder. Or, rather, the mouth of the hypester:

Cloud is over-hyped by vendors, but under-hyped by successful adopters who get the culture shift and competitive advantage it brings.

Adding to this, DataStax Co-Founder Matt Pfeil stresses that cloud "enables faster technology innovation" by letting developers focus on building applications, not infrastructure. In fact, cloud frees businesses from the shackles of old-school IT, where innovation waits on servers to be provisioned.

Or would, if only IT would get out of the way.

Too often, what the business wants and what IT delivers are two very different things, and cloud takes the blame. As George Reese, executive director of Cloud Management at Dell, indicates, "The business goes to IT and says, 'I want cloud.' They are thinking (cloud = AWS). But when IT hears 'I want cloud,' they are thinking (cloud = virtualization). The business wants 'on demand, self service.' IT delivers them virtualization. Thus, cloud fails both."

Enterprise IT, in other words, delivers cloud according to the tools it knows, from the vendors it knows, and the line of business suffers as a result. Cloud computing isn't yesterday's technology dressed up in cloudwashed language.

True cloud looks, feels, and smells a lot like Amazon Web Services, which is one reason I suspect AWS adoption is actually bigger than we think. Gartner analyst Lydia Leong pegs AWS's utilized compute capacity at five times that of the other 14 cloud providers in the Gartner Magic Quadrant combined.

That may be low.

An Amazonian cloud future?

So the real question going into 2014 is whether the cloud, as defined by the enterprise, simply means "Amazon." Today it seems plausible that the answer is a resounding "Yes."

But remember: we're in the earliest days of cloud computing, even in "established" markets like SaaS. For the vast majority of the cloud market, at least as detailed by analysts like IDC and Gartner, Amazon doesn't even play. Over time it may come to compete in SaaS and advertising, but this is doubtful.

But what doesn't seem to be in doubt is the enterprise's voracious appetite for more cloud services, be they business processes like salesforce automation delivered as cloud services or infrastructure made available as such. Each year we talk about security and other inhibitors to greater cloud adoption, and each year these fade in the face of the improved business and development agility the cloud affords. As such, we should expect 2014 to be the Year of the Cloud...just like 2013, 2012, 2011 and 2010 were before it.

Viewing Where the Internet Goes

Excerpted from NY Times Report by John Markoff

Will 2014 be the year that the Internet is reined in?

When Edward J. Snowden, the disaffected National Security Agency contract employee, purloined tens of thousands of classified documents from computers around the world, his actions — and their still-reverberating consequences — heightened international pressure to control the network that has increasingly become the world's stage.

At issue is the technical principle that is the basis for the Internet, its "any-to-any" connectivity. That capability has defined the technology ever since Vinton Cerf and Robert Kahn sequestered themselves in the conference room of a Palo Alto, CA, hotel in 1973, with the task of interconnecting computer networks for an elite group of scientists, engineers, and military personnel.

The two men wound up developing a simple and universal set of rules for exchanging digital information — the conventions of the modern Internet. Despite many technological changes, their work prevails.

But while the Internet's global capability to connect anyone with anything has affected every nook and cranny of modern life — with politics, education, espionage, war, civil liberties, entertainment, sex, science, finance, and manufacturing all transformed — its growth increasingly presents paradoxes.

It was, for example, the Internet's global reach that made classified documents available to Mr. Snowden — and made it so easy for him to distribute them to news organizations.

Yet the Internet also made possible widespread surveillance, a practice that alarmed Mr. Snowden and triggered his plan to steal and publicly release the information.

With the Snowden affair starkly highlighting the issues, the new year is likely to see renewed calls to change the way the Internet is governed. In particular, governments that do not favor the free flow of information, especially if it's through a system designed by Americans, would like to see the Internet regulated in a way that would "Balkanize" it by preventing access to certain websites.

The debate right now involves two international organizations, usually known by their acronyms, with different views: the Internet Corporation for Assigned Names and Numbers (Icann) and the International Telecommunication Union (ITU).

Icann, a nonprofit that oversees the Internet's basic functions, like the assignment of names to websites, was established in 1998 by the United States government to create an international forum for "governing" the Internet. The United States continues to favor this group.

The ITU, created in 1865 as the International Telegraph Union, is the United Nations telecommunications regulatory agency. Nations like Brazil, China and Russia have been pressing the United States to switch governance of the Internet to this organization.

Dr. Cerf, 70, and Dr. Kahn, 75, have taken slightly different positions on the matter. Dr. Cerf, who was Chairman of Icann from 2000 to 2007, has become known as an informal "Internet ambassador" and a strong proponent of an Internet that remains independent of state control. He has been one of the major supporters of the idea of "network neutrality" — the principle that Internet service providers should enable access to all content and applications, regardless of the source.

Dr. Kahn has made a determined effort to stay out of the network neutrality debate. Nevertheless, he has been more willing to work with the ITU, particularly in attempting to build support for a system, known as Digital Object Architecture, for tracking and authenticating all content distributed through the Internet.

Both men agreed to sit down, in separate interviews, to talk about their views on the Internet's future. The interviews were edited and condensed.

After serving as a program manager at the Pentagon's Defense Advanced Research Projects Agency, Vinton Cerf joined MCI Communications Corp., an early commercial Internet company that was purchased by Verizon in 2006, to lead the development of electronic mail systems for the Internet. In 2005, he became a vice president and "Internet evangelist" for Google. Last year he became the President of the Association for Computing Machinery, a leading international educational and scientific computing society.

An official with Darpa from 1972 to 1985, Robert Kahn created the Corporation for National Research Initiatives, based in Reston, Va., in 1986. There he has focused on managing and distributing all of the world's digital content — as a nonproprietary Google. He has cooperated with the ITU on the development of new network standards.

Please click here for the full interviews with Cerf and Kahn.

Hybrid Cloud: The Year of Adoption Is upon Us

Excerpted from ComputerWorld Report by Brandon Butler

If one word were to encapsulate the cloud computing market as we head into 2014, it could be hybrid.

It was the big buzzword for cloud computing vendors last year. VMware launched its vCloud Hybrid Service. Rackspace, Microsoft, HP and Joyent have been touting how the same software that runs their public cloud can be used to manage a company's own data center, creating a seamless management experience across both.

There's good reason for all this talk about hybrid. The cloud is still in its nascent stages, which means that most organizations are not yet ready to jump into outsourcing their entire IT operations to the public cloud, experts say. But many are intrigued by the advantages the cloud can bring, such as automated self service provisioning of virtual machines and storage. So if the public cloud isn't right for everything, but organizations still want some sort of cloud, it usually ends up being a hybrid deployment.

If you don't have a hybrid cloud now, research firm Gartner says you likely will in the future. The firm says that hybrid cloud is today where the private cloud market was three years ago. By 2017, Gartner predicts that half of mainstream enterprises will have a hybrid cloud, which it defines as a policy-based service provisioning platform that spans internal and external cloud resources. So what's holding the industry back?

People and processes are big stumbling blocks to the cloud.

"One of the most common issues is employees going around IT to get to the public cloud," says John Humphreys, vice president of sales at Egenera, a Boston-area consulting and IT management firm. This "shadow IT" issue surfaces because employees want the benefits the cloud provides: easy access to virtual machine or storage resources without having to wait for IT to spin them up.

Employees circumventing IT may sound discouraging for IT managers, but Humphreys says it proves that employees are looking for these types of services. IT shops, he says, have an opportunity to become an internal service provider for these employees. Instead of employees going around IT, they can use IT-approved resources to access cloud-based features. "The big question is, how can we make it easier for workers to go through us instead of going around us?" he says.

So if the demand is there, why hasn't adoption been as robust? It could be because the platforms offered by vendors are still maturing, says Bryan Cantrill, senior vice president of engineering at Joyent, which is one of the smaller (compared to Amazon Web Services, Microsoft and Google) but technically savvy IaaS providers on the market.

Platforms from various providers are still in their earliest stages. VMware just released its public/hybrid cloud platform last fall. Microsoft has had its platform out for longer, but Rackspace is still developing its private cloud platform based on OpenStack code, which continues to mature and evolve. Joyent's hybrid cloud offering is based on its SmartOS, which is an internally-developed operating system that runs its public cloud. SmartDataCenter is the name of Joyent's private cloud platform that uses SmartOS, which customers can run on their own premises.

"We see a surprising amount of hybrid cloud," Cantrill says. "When we sell private clouds, there is virtually always a public cloud component to it." The advantage of having your private and public cloud on the same platform, he says, is that over time the business can shift between the two. Applications should run on whichever platform is best suited for their needs, not just in whatever platform the IT shop has gotten around to supporting. If it's a highly dynamic app with unknown spikes, the public cloud is best. Highly secure and performance-intensive apps may be better in a private cloud. Having a hybrid cloud creates one platform for apps to run in either.

Hybrid cloud is the platform that will dominate the industry moving forward, says Vikrant Karnik, a senior vice president at Capgemini who oversees the system integrator's cloud consulting business. He works with large enterprise customers to plan and execute their cloud strategies and says that many of the big financial and pharmaceutical companies, for instance, will likely never be comfortable migrating their entire IT operations into the public cloud.

The largest companies in the country, which are also the ones with the largest IT budgets, are using the public cloud sparingly for development and testing or backup and recovery. They already have massive infrastructures that support their operations; they're not just going to throw those away. Because of that, the world will have to be a hybrid one, Karnik says. If these types of companies are going to use any public cloud, it will be as part of a hybrid cloud.

So then, how important is it to have a consistent platform between your public and private cloud and to be "all-in" with one vendor's cloud management platform? These big businesses already have mixed environments; they have dozens of vendors across their IT shops and haven't standardized on any specific vendor. So why would the cloud be any different?

Hybrid cloud is coming, and in many cases, it's already here. 2013 was the year vendors got their hybrid cloud strategies out in the open, and 2014 will be the year when customers start using them.

Mobile Devices, Cloud, Applications Drive Server Design Diversity

Excerpted from eWEEK Report by Jeff Burt

When Verizon began laying the groundwork several years ago for its new public cloud, company IT executives set several goals for the environment. They wanted consistent performance, high security and availability, and they didn't want customers to have to modify their applications in order to run them in the cloud.

They wanted few moving parts and no special hardware.

"We wanted very few actual hardware components," Paul Curtis, Chief Architect of Cloud Computing at Verizon, said during Advanced Micro Device's Developer Summit 2013 in November. "My boss said, 'I only have five fingers in my hand. Don't make me use them all.'"

There was to be "no honking router outside of this. … No special firewall, no special anything. It just doesn't scale."

What Verizon quickly settled on was SeaMicro, a small company that at the time was making small, highly energy-efficient microservers that could be used in very dense data center environments and are linked via the company's Freedom Fabric. SeaMicro has since become part of AMD, which bought the company in February 2012 for $344 million.

The Verizon Cloud Platform, which will compete with the likes of Amazon Web Services and Rackspace, is now in paid public beta, with the expectation of rapid expansion as 2014 unfolds. It also is a microcosm of some of the drivers that are fueling the changes in server and data center architecture, giving rise to new offerings from established system and component makers and new designs from smaller vendors.

These changes in the data center also are roiling the competitive waters, with longtime partners suddenly becoming competitors and dominant architectures seeing threats from new sources. Enterprise data centers and service providers are making new demands on their system vendors, and those vendors are working hard to meet those demands. Organizations are looking for smaller sizes, more energy efficiency, easier manageability and lower costs. They want systems that can run the growing range of new applications, from big data to video to analytics.

This has created a fundamental shift in what is driving server design. In the past, server makers would put out new systems with the newest chips and then give those systems to customers to let them decide the best way to use them. Now it's the customer that is in the driver's seat, according to Andrew Feldman, corporate vice president and general manager of AMD's Server Business Unit.

"OEMs have lost a lot of the power" over how systems are designed, Feldman, former CEO of SeaMicro, said. "We're now seeing radical new designs in servers."

It's really a story about numbers. It's about the skyrocketing numbers of people and devices that will continue to connect to the Internet over the next few years, and the massive amounts of data they will generate. (Cisco Systems forecasts that by 2017, there will be 3.6 billion Internet users and more than 19 billion network connections, including machine-to-machine (M2M) connections. By 2020, it is expected that there will be 50 billion devices connected to the Internet.)

It's about the number of organizations that are increasingly moving parts of their businesses to the cloud and the number of new applications—like big data and analytics—that enterprises are implementing. (IDC analysts expect global spending on public IT cloud services to hit $107 billion in 2017.)

And it's the growing number of major cloud service providers—like Facebook, Google, and Amazon—with large, hyperscale data center environments that are aggressively looking for new technologies that will help them run and expand their operations while saving money. And if they can't find those products on the market, they're increasingly willing to develop them themselves.

Please click here for the full report.

Cloud Competition in China Means Good News for Consumers

Excerpted from WantChinaTimes Report

December 2013 marked an important page in the history of China's cloud computing industry, first with IBM announcing a partnership with 21Vianet Group to introduce its top cloud computing basic structure into the nation, followed by Amazon introducing its public cloud computing service to the mainland.

China's cloud computing service providers also launched major promotion activities in December, led by Tianyi Cloud, followed by Ali Cloud in its December 18th promotion announcing overall price cuts of as much as 50%. Tencent Cloud then announced a one-week year-end promotion to give super discounts on its cloud computing services. The competition suggests a Warring States era of cloud computing has arrived, the report said.

Due to the restrictions related to laws and supervision, overseas public cloud computing service firms such as Microsoft, IBM, and Amazon have to cooperate with local companies to enter China, affecting the speed of their development in the nation. Microsoft on May 22nd this year first announced its cooperation with 21Vianet, followed by IBM's partnership with Capital Online on July 31st, and IBM's further deal with 21Vianet this month.

What users are most concerned with is not the technology but rather content issues such as filing and other management areas. To cope with supervision requests in China, overseas cloud computing leaders have to compromise in order to compete with their local counterparts and give up some of their expertise, casting doubt over their chances of success.

The entrance of Tianyi Cloud and Tencent Cloud over the past year has totally changed the nation's cloud computing market, formerly dominated by Ali Cloud, dividing the market among the three leaders and giving more competitive price benefits to end users. So far, Tianyi Cloud, a unit of the state-owned China Telecom, is leading the way in serving state-run enterprises because of their long-term partnership, leaving few opportunities for Ali Cloud and Tencent Cloud in this field. The market for individuals and small and medium businesses will be the major battleground and the most uncertain for the three market leaders in 2014. Since September, Tianyi has actively launched a series of promotions aimed at this sector of the market. Tianyi Cloud's advantage lies in its client service system and security rather than its product functionality.

Ali Cloud, to judge from its recent promotions, seems to be focusing on the large corporate client market to challenge Tianyi Cloud, rather than its previous focus on small and medium businesses. Tencent Cloud meanwhile is backed up by Tencent's unique social ecosystem and strong back-office systems.

Each of the three players has its own strengths, but what's for sure is that their intensified competition will benefit the consumer, the report said.

Government in the Cloud: Minimizing the Risks

Excerpted from Governing Report by John Kamensky

Many governments find the lure of cloud-based solutions -- the delivery of computing services such as email, data storage and online forms over the Internet -- to be highly compelling. Government leaders are finding they can lower their information-technology costs and expand services while improving performance and security.

But as with any form of government contracting, there are risks to be considered. Do governments lose control over their data? Do they risk losing access to it? Are they locked in to a single vendor? The key to success is writing and negotiating a strong contract. A recent report for the IBM Center for the Business of Government by researchers at the University of North Carolina at Chapel Hill analyzes contracting issues that state and local governments face when adopting cloud-computing services.

The study's authors, Shannon Tufts and Meredith Weiss, interviewed pioneering users of cloud services in local governments and state agencies in North Carolina to find out first-hand what works and what doesn't. Based on their research, Tufts and Weiss developed a 12-part checklist of issues that should be addressed whenever writing or negotiating a cloud-computing contract:

1. Pricing. "Pricing for cloud services," they write, "typically includes initial or upfront costs, maintenance and continuation costs, renewal costs, and volume commitments." Some contracts also include caps on the increases in costs permitted over time.

2. Infrastructure security. This encompasses "the supplier's responsibilities in the areas of information security, physical security, operations management, and audits and certifications."

3. Data assurances. In addition to determining responses to data breaches, Tufts and Weiss identify a number of related issues, including "ownership, access, disposition, storage location, and litigation holds."

4. Governing law. If there is a dispute, whose law will govern the case? Contracts should specify how and where any legal disputes will be settled, and should take into account different jurisdictions' laws. For example, the researchers note, North Carolina law "voids contract provisions that require disputes under contracts to be litigated outside of the state."

5. Service-level agreements. Cloud contracts should specify not only service-level parameters but also specific remedies and penalties for non-compliance.

6. Outsourced services. The contract "should require the vendor to inform the government of any outsourced functionality and its provider" while holding the primary vendor "directly responsible for all terms of the contract, regardless of outsourced functions."

7. Functionality. Cloud contracts should not only specify the functionality of the service being purchased but should require advance notice if a function is to be changed or deleted along with a notification period to allow time to switch vendors if necessary.

8. Disaster recovery. Contract language concerning disaster recovery and business continuity should specify processes and safeguards "to protect the contracting public entity's data and services in the event of system failures."

9. Mergers and acquisitions. What will happen if the vendor becomes involved in a corporate merger or buy-out? Contracts should "articulate the responsibilities and transferability of contracts or contract terms."

10. Compliance with laws. In addition to language related to warranties and liabilities, cloud contracts should ensure that the vendor will comply with laws and regulations "of import to the contracting entity."

11. Terms and conditions modifications. Noting that many cloud contracts incorporate terms and conditions that are posted online and could be changed by the vendor at any time, Tufts and Weiss recommend that "the active terms and conditions at the time of contract signature should be incorporated as an exhibit for future reference purposes."

12. Contract renewal and termination. "Since switching cloud vendors can be costly and involve significant planning," Tufts and Weiss write, "contract renewal and termination clauses are critically important." For example, "the contract should specify how data will be retrieved/returned upon termination by either party."

Based on their interviews, evaluation and analysis of various public-sector contracts, Tufts and Weiss identify three key lessons for those considering the transition to cloud computing:

First, IT professionals should not select cloud solutions without legal and procurement help. IT staff can evaluate a contract based on its technical merits but generally have limited knowledge of legal and procurement issues that public-sector officials must be aware of to minimize exposure to legal risk.

Second, agencies should negotiate their contracts rather than merely accepting the cloud solutions being offered via vendor-supplied master service agreements.

And to effectively negotiate a cloud contract, the public entity "has to be willing to seek alternative providers or solutions in the event that the government's contract terms cannot or will not be met," note the authors. The bottom line: All contracts involve some degree of risk.

Cloud Vendors Must Meet Federal Security Standards by June

Excerpted from Federal Times Report by Nicole Johnson

The June deadline is quickly approaching for cloud providers to prove their services meet federal security standards.

Meanwhile, agencies are being advised to inventory whether their cloud contractors have made the cut.

The deadline isn't that far off, considering it can take a company six months to complete the government's cloud security program, known as FedRAMP. Cloud services in use at federal agencies must meet FedRAMP security requirements by June 5.

"If agencies have cloud providers that have not been accredited they should contact my office and ask if they are in the pipeline," said Maria Roat with the General Services Administration. Roat, who serves as FedRAMP director, spoke last month at the Federal Cloud Computing Summit in Washington.

If those companies are not in the pipeline, agencies must decide whether they should work with cloud providers to get their services accredited through FedRAMP, Roat said. They can also have the company work directly with the FedRAMP office to get accredited.

It can take an agency about 4 ½ months to complete a FedRAMP review or six months for a company to undergo the process on its own, Roat said.

She suggested companies with a small federal footprint — one or two small agencies — consider working with those agencies directly to get FedRAMP approval for their products and services.

FedRAMP's 298 security controls are based on National Institute of Standards and Technology guidelines that govern how agencies should secure their information technology systems. NIST updated those guidelines last year. Roat said there are plans for cloud providers under FedRAMP to transition to the new standards, but that's largely dependent on where they are in the FedRAMP process.

The plan was to incorporate the new security standards into FedRAMP this month, but that likely won't happen until around March because NIST has not yet released test cases, Roat said.

Roat said her office also worked with the Defense Information Systems Agency on its efforts to establish additional requirements above FedRAMP standards.

DoD spent the past 18 months trying to address how it will move DoD mission and data into commercial clouds, said Doug Gardner, DISA's technical director for the Mission Assurance Executive.

"With unclassified and nonsensitive data, the basic controls that you get from somebody who has been through FedRAMP, for example, is really good enough," Gardner said. "We're only worried about integrity. The data is already releasable to the world."

Healthcare Cloud Computing Trends, Challenges, Opportunities

Excerpted from Sacramento Bee Report

Technologies in the healthcare IT industry are converging over time and far outpacing the legacy systems used by hospitals and healthcare providers. Legislation in various countries, such as the American Recovery and Reinvestment Act of 2009 (ARRA) in the US, is encouraging businesses in the healthcare industry to utilize certain applications of electronic records. Recently, cloud technology has started replacing these legacy systems, offering easier and faster access to this data depending on how it is stored, i.e., public, private, or hybrid.

Cloud computing offers significant benefits to the healthcare sector. Doctors' clinics, hospitals, and health centers require quick access to computing and large storage facilities that traditional settings do not provide. Moreover, healthcare data needs to be shared across various settings and geographies, which further burdens the healthcare provider and the patient, causing significant delays in treatment and loss of time.

Cloud caters to all these requirements, providing healthcare organizations an incredible opportunity to improve services to their customers, the patients, to share information more easily than ever before, and to improve operational efficiency at the same time. The flip side of this advantage is that healthcare data has specific requirements such as security, confidentiality, availability to authorized users, traceability of access, reversibility of data, and long-term preservation. Hence, cloud vendors need to account for all of these while conforming to regulations such as HIPAA and Meaningful Use.

All of the above factors are driving the healthcare cloud computing market to grow at a CAGR of 20.5% from 2012 to 2017. Although cloud computing offers significant advantages to HCOs and other stakeholders, it has its own set of restraints. Security of patient information, interoperability, and compliance with government regulations are some of the factors slowing down this market.

Cloud technology has been adopted only in certain regions of the world, with the majority share held by developed nations. The geographies studied include North America, Europe, Asia, and ROW. North America accounts for the lion's share of the cloud computing market, with the U.S. being the largest contributor to this region.

The healthcare cloud computing market is a fragmented one with no player occupying a share more than 5%. A few players in this market are CareCloud (US), Carestream Health, Inc. (US), Merge Healthcare, Inc. (US), GE Healthcare (UK), and Agfa Healthcare (Belgium).

New Computing Model for Quicker Advancements in Medical Research

Excerpted from The Almagest Report

With the promise of personalized and customized medicine, one extremely important tool for its success is the knowledge of a person's unique genetic profile.

This personalized knowledge of one's genetic profile has been facilitated by the advent of next-generation sequencing (NGS), where sequencing a genome, like the human genome, has gone from costing $95,000,000 to a mere $5,700. So, now the research problem is no longer how to collect this information, but how to compute and analyze it.

"Overall, DNA sequencers in the life sciences are able to generate a terabyte—or one trillion bytes—of data a minute. This accumulation means the size of DNA sequence databases will increase 10-fold every 18 months," said Wu Feng of the Department of Computer Science in the College of Engineering at Virginia Tech.

"In contrast, Moore's Law (named after Intel co-founder Gordon E. Moore) implies that a processor's capability to compute on such 'BIG DATA' increases by only two-fold every 24 months. Clearly, the rate at which data is being generated is far outstripping a processor's capability to compute on it. Hence the need exists for accessible large-scale computing with multiple processors … though the rate at which the number of processors needs to increase is doing so at an exponential rate," Feng added.

For the past two years, Feng has led a research team that has now created a new generation of efficient data management and analysis software for large-scale, data-intensive scientific applications in the cloud. Cloud computing is a term coined by computing geeks that in general describes a large number of connected computers located all over the world that can simultaneously run a program at a large scale. Feng announced his work in October at the O'Reilly Strata Conference + Hadoop World in New York City.

Please click here for the full report.

Cloud Computing 2014: Moving to a Zero-Trust Security Model

Excerpted from The Threat Vector Report

The leaking of classified documents detailing the data collection activities of the US National Security Agency earlier this year reignited some long-standing concerns about the vulnerability of enterprise data stored in the cloud.

But instead of scaring businesses away from using hosted services, as some experts predicted, the leaks about the NSA spy programs are driving some long overdue changes in enterprise and service provider security and privacy policies.

When Edward Snowden first began spilling details of the NSA's surveillance practices to selected reporters in June, industry analysts had expected that the revelations would put a severe crimp on plans for cloud deployment.

For instance, the Information Technology & Innovation Foundation in August said the leaks could cause U.S. cloud providers to lose 10% to 20% of the foreign market to overseas competitors — or up to $35 billion in potential sales through 2016.

Another industry group, the Cloud Security Alliance, predicted a similar backlash due to concerns by European companies that the U.S. government could access their data.

Six months later, the impact appears to be less severe than expected.

Despite some reports of slowing sales of cloud services by U.S. vendors to overseas companies, experts now expect that the Snowden leaks will have little effect on long-term sales. The business benefits of using cloud-based services continue to outweigh enterprise fears of government snooping.

At the same time though, the detailing of classified NSA spy programs has prompted an increased emphasis on cloud data security and protection that's expected to grow further in 2014.

The leaks hammered home just how little control companies have over data stored in the cloud, said Richard Stiennon, principal at consulting firm IT-Harvest. "There is a fundamental shift to a zero-trust model in the cloud." The disclosures showed enterprises that "there cannot be any chink in the trust chain from internal resources to the cloud and back."

Analysts say IT security officials are looking at several key areas, such as data encryption, key management and data ownership, regionalization, and the need for increased government transparency, to improve cloud security.

Encryption has gained a lot of attention since the Snowden leaks. Major service providers like Microsoft, Yahoo and Google set the tone by adding end-to-end encryption of data they host and manage for customers.

For instance, Google Cloud Storage now automatically encrypts all new data before it's written to disk. Such server-side encryption will soon be available for older data stored in Google clouds.

Since the NSA programs were disclosed, Microsoft has announced that it plans to ramp up encryption support for various services, including Outlook.com, Office 365, SkyDrive and Windows Azure.
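
Server-side encryption protects data at rest on the provider's disks, but a zero-trust posture often also calls for encrypting data on the client before upload, so the provider never holds plaintext or keys. The sketch below is a minimal Python illustration using the open-source "cryptography" package; it is not a description of any named provider's service.

    # Requires the third-party package: pip install cryptography
    from cryptography.fernet import Fernet

    def encrypt_for_upload(plaintext: bytes, key: bytes) -> bytes:
        """Encrypt locally so the provider stores only ciphertext."""
        return Fernet(key).encrypt(plaintext)

    def decrypt_after_download(ciphertext: bytes, key: bytes) -> bytes:
        return Fernet(key).decrypt(ciphertext)

    key = Fernet.generate_key()  # kept in a customer-controlled key manager, not with the provider
    blob = encrypt_for_upload(b"quarterly results", key)
    assert decrypt_after_download(blob, key) == b"quarterly results"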

Please click here for the full report.

Whistleblower Edward Snowden Is Tech Person of Year

Excerpted from USA Today Report by John Shinal

"They who would give up essential Liberty, to purchase a little temporary Safety, deserve neither Liberty nor Safety."— Benjamin Franklin, for the Pennsylvania Assembly, in its reply to the governor, 1755.

In the wake of the 9/11 terrorist attacks, the American people, through their elected representatives in Washington, chose to exchange a significant amount of freedom for safety.

But until a lone information-technology contractor named Edward Snowden leaked a trove of National Security Agency documents to the media this summer, we didn't know just how much we'd surrendered.

Now that we do, our nation can have a healthy debate — out in the open, as a democracy should debate — about how good a bargain we got in that exchange.

For facilitating that debate, at great risk to his own personal liberty, Snowden is this column's technology person of the year for 2013.

Please click here for the full report.

Coming Events of Interest

International CES - January 7th-10th in Las Vegas, NV.  The International CES is the global stage for innovation reaching across global markets, connecting the industry and enabling CE innovations to grow and thrive. The International CES is owned and produced by the Consumer Electronics Association (CEA), the preeminent trade association promoting growth in the $209 billion US consumer electronics industry.

CONNECTING TO THE CLOUD - January 8th in Las Vegas, NV. This DCIA Conference within CES will highlight the very latest advancements in cloud-based solutions that are now revolutionizing the consumer electronics (CE) sector. Special attention will be given to the impact on consumers, telecom industries, the media, and CE manufacturers of accessing and interacting with cloud-based services using connected devices.

CCISA 2014 – February 12th–14th in Turin, Italy. The second international special session on Cloud Computing and Infrastructure as a Service (IaaS) and its Applications within the 22nd Euromicro International Conference on Parallel, Distributed, and Network-Based Processing.

CLOSER 2014 - April 3rd-5th in Barcelona, Spain. The Fourth International Conference on Cloud Computing and Services Science (CLOSER 2014) sets out to explore the emerging area of cloud computing, inspired by recent advances in network technologies.

NAB Show - April 5th-10th in Las Vegas, NV. From broadcasting to broader-casting, NAB Show has evolved over the last eight decades to continually lead this ever-changing industry. From creation to consumption, NAB Show has proudly served as the incubator for excellence — helping to breathe life into content everywhere.

Media Management in the Cloud — April 8th-9th in Las Vegas, NV. This two-day conference provides a senior management overview of how cloud-based solutions positively impact each stage of the content distribution chain, including production, delivery, and storage.

CLOUD COMPUTING EAST 2014 - May 13th-14th in Washington, DC. Three major conference tracks will zero in on the latest advances in the application of cloud-based solutions in three key economic sectors: government, healthcare, and financial services.

Copyright 2008 Distributed Computing Industry Association
This page last updated January 19, 2014
Privacy Policy