Distributed Computing Industry
Weekly Newsletter

In This Issue

Partners & Sponsors

Amazon Web Services

Aspera

Dax

Equinix

YouSendit

Cloud News

CloudCoverTV

P2P Safety

Clouderati

gCLOUD

hCLOUD

fCLOUD

Industry News

Data Bank

Techno Features

Anti-Piracy

March 11, 2013
Volume XLIII, Issue 1


TV Is Changing Before Our Eyes

Excerpted from All Things D Report by David Pakman

It's finally happening. The Internet is taking over TV. It's just happening differently than many of us imagined. There are two major transformations under way.

The Rise of the Internet Distributors. Led by Netflix, the group of new distributors includes Amazon and Microsoft now, but maybe Apple and Google later. They are largely distributing traditional TV shows in a nontraditional way.

All the content is delivered over IP, usually as part of a paid subscription or per-episode electronic sell-through (EST). It's important to note that all of this content contains no advertising and is available entirely on demand. This content falls into the "non-substitutional" content bucket: to watch it, you don't need to be a cable TV subscriber.

The Rise of Alternative Content Producers. Thanks to YouTube's Channel strategy and investment in hundreds of content providers, new producers of content are emerging and offering nontraditional programming, usually in shorter form.

This content is marked by dramatically different production economics than traditional TV content, taking advantage of an expanded labor pool and low-cost cameras and computer editing. This alternative content is chipping away at long- and mid-tail viewership on traditional networks (the "filler" and "nice-to-see" buckets).

Both of these transformations are successful to date, and will only become more so. Rich Greenfield has a nice summary of why the TV industry suddenly loves Netflix.

The first transformation takes advantage of the massive pressure MVPDs place on traditional cable nets to not offer their programming direct to consumers. In this case, the requirement by the HBOs and AMCs of the world that you authenticate your existing cable subscription in order to watch their programming over IP successfully persuades the cord-nevers to simply avoid those networks' programming until the hit shows are offered through Netflix or EST.

Netflix, once again, looks like the hero. Those empty threats by Jeff Bewkes that he will never work with Netflix turned out to be, well, empty. The second transformation will take longer to fully prove out, but I believe it will happen. As more of our viewership takes place over IP, we lose our allegiance to networks as the point of distribution and allow new distributors to guide us toward content choice.

There is a third budding area of transformation, but I don't yet see evidence that a business exists: Trying to repackage cable TV bundles and sell them over IP.

Companies like Aereo and Nimble TV offer versions of this. I believe we live in a show-based world. Consumers aren't looking for networks (with the exception of ESPN and regional sports nets) so much as they are looking for shows. Shows delivered over IP allow for the slow unbundling of television.

One of the many challenges about this model for traditional broadcasters is that there is no advertising in this world. The traditional cable-net business model enjoys two great revenue streams — affiliate fees and ad dollars. In IP-delivered shows, there are no ads.

Who are the winners and losers in this model? Well, show creators continue to flourish. The new distributors enjoy great success. Of course, ISPs, who are often the same companies as the MVPDs, do fine in the ISP business, but I believe the decline in total cable subs will continue.

In a world where shows do not contain advertising, why do we need Nielsen? They have been a measurement standard for decades, largely because advertisers needed a third-party validator of viewership. You can see why they have a vested interest in insisting TV ad viewership is not on the decline (despite everyone's experience to the contrary).

I don't think cable nets are in immediate trouble. They enjoy a great business model now, and also get to reap EST or licensing benefits after the shows air. But the Netflix "House of Cards" effort shows that consumers will now expect to be able to watch shows whenever they want, and not be bothered by inconvenient broadcast schedules. The day is coming when the cable nets will have to respond.

For start-ups, one of the wide-open spaces seems to be in cross-provider discovery. Now that my shows are spread among Netflix, Amazon, YouTube and on my DVR, I would prefer one interface to reach them all. Companies like Dijit NextGuide, Peel, Squrl, and Telly are taking cracks at this important space.

Report from CEO Marty Lafferty

We're very excited to announce that Equinix has joined the expanding group of sponsors for our upcoming CLOUD COMPUTING CONFERENCE at the 2013 NAB Show, taking place April 8th and 9th at the Las Vegas Convention Center in Las Vegas, NV.

Equinix connects more than 4,000 companies directly to their customers and partners inside the world's most networked data centers.

Today, businesses leverage the Equinix interconnection platform in 31 strategic markets across the Americas, EMEA, and Asia-Pacific.

Previously announced sponsors for this year's CLOUD COMPUTING CONFERENCE include Amazon Web Services, Aspera, DAX, and YouSendIt.

Our 2013 event track will demonstrate the new ways cloud-based solutions are providing increased reliability and security, not only for commercial broadcasting and enterprise applications, but also for military and government implementations.

From collaboration during production, to post-production and formatting, to interim storage, delivery, and playback on fixed and mobile devices, to viewership measurement and big-data analytics, cloud computing is having an enormous impact on high-value multimedia distribution.

The 2013 Conference has been extended from one to two full days, reflecting the increased importance of, and growing interest in, its subject matter. We're also very pleased to announce new conference speakers.

DAY ONE will begin with an "Industry Update on Cloud Adoption."

How are cloud-based technologies currently being deployed throughout the audio/video (A/V) ecosystem? What file-based workflow strategies, products, and services are now working best?

After an introductory presentation by Amazon Web Services, a panel discussion with Dr. Frank Aycock, Appalachian State University; Jonathan Hurd, Altman Vilandrie; Rob Kay, Strategic Blue; and Patrick Lopez, Core Analysis, will thoroughly examine this emerging market segment.

Next, we'll discuss "Outstanding Issues: Reliability & Security." What remaining pitfalls cause producers and distributors to resist migrating to the cloud? How are liability, predictability, privacy, and safety considerations being addressed?

Speaker Shekhar Gupta, Motorola Mobility, will introduce the topic. And then a panel with Lawrence Freedman, Edwards Wildman Palmer; Tom Gonser, Docusign; Jason Shah, Mediafly; and John Schiela, Phoenix Marketing International, will follow up with further discussion.

Then "Cloud Solutions for Content Creation" will be our subject. How is cloud computing being used for collaboration and other pre-production functions? What do dailies-screening and editing in the cloud offer the content production process?

Speaker Patrick MacDonald King, DAX, will explore this area first. And then a panel with Sean Barger, Equilibrium; Morgan Fiumi, Sfera Studios; Rob Green, Abacast; and Robert Blackburn, Equinix, will continue our examination.

"Post-Production in the Cloud" will follow. What do cloud solutions bring to post-production functions such as animation and graphics generation? How are formatting, applying metadata, and transcoding improved by cloud computing?

Our DAY ONE Marquee Keynote Chris Launey of Disney will speak first.

Then a panel with Jim Duval, Telestream; Joe Foxton, MediaSilo; Jun Heider, RealEyes; and Bill Sewell, Wiredrive, will delve into this topic in more detail.

Next, we'll discuss "Cloud-Based Multimedia Storage." How are data centers and content delivery networks (CDNs) at the edge evolving? What do business-to-business (B2B) storage solutions and consumer "cloud media lockers" have in common?

Speaker Jean-Luc Chatelain, DataDirect Networks, will address the topic first. And then a panel with Bang Chang, XOR Media; Tom Gallivan, WD; Mike Wall, Amplidata; and Douglas Trumbull, Trumbull Ventures, will follow up with further discussion.

DAY ONE will end with "Content Delivery from the Cloud." How is cloud computing being used to enable distribution and playback on multiple fixed and mobile platforms? What does the cloud offer to improve the economics of "TV Everywhere?"

Speaker Chris Rittler, Deluxe Digital Distribution, will explore this area first. And then a panel with Scott Brown, Octoshape; Brian Campanotti, Front Porch Digital; Malik Khan, LTN Global Communications; and Mike West, GenosTV, will continue the examination.

DAY TWO will open with four cloud implementation case studies.

How was cloud computing used most successfully during 2012 in the multimedia content distribution chain? What lessons can be learned from these deployments that will benefit other industry players?

Case studies will be presented by Jason Suess, Microsoft; Michelle Munson, Aspera; Keith Goldberg, Fox Networks, and Ryan Korte, Level 3; and Baskar Subramanian, Amagi Media Labs. Then the presenters will join in a panel discussion.

Next, we'll look at "Changes in Cloud Computing." How is the cloud-computing industry changing in relation to content rights-holders? What new specialized functions-in-the-cloud, interoperability improvements, and standardization are coming this year?

First, David Cerf, Crossroads Systems; Margaret Dawson, Symform; Jeff Malkin, Encoding; and Venkat Uppuluri, Gaian Solutions, will join in a panel. And then Mark Davis, Scenios, will speak on this topic.

"A Future Vision of the Cloud" will explore what to expect next. What do the latest forecasts project about the ways that cloud-computing solutions will continue to impact the A/V ecosystem over the long term? How will the underlying businesses that are based on content production and distribution be affected?

Panelists Lindsey Dietz, ODCA; John Gildred, SyncTV; Mike Sax, ACT; and Sam Vasisht, Veveo, will join in the discussion.

"Military & Government Cloud Requirements" will follow. How do the needs of military branches and government agencies for securely managing multimedia assets differ from the private sector? What do these requirements have in common with commercial practices?

Michael Weintraub, Verizon, will speak first. Then Scott Campbell, SAP America; Fabian Gordon, Ignite Technologies; Linda Senigaglia, HERTZ NeverLost; and Alex Stein, Eccentex, will go into more depth.

Next, we'll explore "Unique Cloud-Based Solutions." What are cloud solutions providers currently developing to address specific considerations of the intelligence community (IC) in fulfilling its missions? How will these approaches evolve and change during 2013?

DAY TWO Marquee Keynote Saul Berman of IBM will address this area first.

Then David Bornstein, Akamai; Rajan Samtani, Industry Consultant; Ganesh Sankaran, PrimeFocus; and Dan Schnapp, Hughes Hubbard & Reed, will continue this examination.

Four relevant cloud case studies will follow.

How is cloud computing being used to help securely manage sensitive multimedia? What lessons can be learned from these deployments that will benefit military and government organizations?

Grant Kirkwood, Unitas Global; William Michael, NEC Corporation; Randy Kreiser, DataDirect Networks; and John Delay, Harris, will present case studies.

These presenters will then join in a panel discussion.

The Conference Closing will tie back to the commercial sector. How do those involved in multimedia production, storage, and distribution leverage cloud-based solutions to their fullest potential? What resources are available for comparing notes and staying current on the latest developments?

Our closing session speakers will be Steve Russell, Tata Communications, and Jeffrey Stansfield, Advantage Video Systems.

There are special discount codes for DCINFO readers to attend the NAB Show. The code for $100 off conference registration is EP35, and the code for FREE exhibit-only registration is EP04. Share wisely, and take care.

The US Must Protect Internet Freedom

Excerpted from Mashable Op-Ed by Gary Shapiro

Gary Shapiro is the President & CEO of the Consumer Electronics Association (CEA), the US trade organization representing more than 2,000 consumer electronics companies.

Since the United Nations' International Telecommunication Union (ITU) convened the World Conference on International Telecommunication (WCIT) in December, concern has been mounting among Internet freedom advocates over efforts to regulate the Internet.

These concerns came to a head at the US House of Representatives Communications and Technology Subcommittee's recent Global Internet hearing. Despite strong US support for a free and open Internet, many worry that the December event is just the beginning of an ongoing effort by the ITU to expand its power and regulate the web on a global scale.

In his opening statement before the hearing, Communications and Technology Subcommittee Chairman Greg Walden (R-OR) hit the right tone when he noted that, "Governments' traditional hands-off approach has enabled the Internet to grow at an astonishing pace and become perhaps the most powerful engine of social and economic freedom and job creation our world has ever known."

Americans understand that the Internet brings innovative products to market, lowers communication barriers, serves as a source for investment and entrepreneurship and promotes free expression. Sadly, some foreign governments see an open and free Internet as a threat. In fact, 89 nations signed an ITU treaty to give greater government control over cybersecurity and spam. The nations included Russia, Cambodia, Iran, China, Cuba, Egypt and Angola, none of which are known for promoting the open exchange of ideas. Their efforts, if successful, would close the door on freedom for millions around the world.

The treaty would grant the ITU policing powers to legitimize government inspections over Internet communications and to decide what should be censored. More broadly, government regulations could mean increased amounts of politicized engineering and a decrease in Internet growth. Entrepreneurs may be forced to seek bureaucratic permission to innovate and invest via the web. This should be an eye-opener for nations that, like America, thrive on an open and unregulated web.

Since the WCIT convened in Dubai, many have lashed out at the ITU, expressing their support for defunding the agency; Defundtheitu.org is a website dedicated to this effort. Supporters have also started a petition on the White House's website. Many of America's free-market allies, including Germany, France, Spain, and Finland, along with successful tech companies, have already taken steps to de-fund the ITU.

But the ITU's actions aren't the only ones that should raise concern. The US is susceptible too, as we saw last year when lawmakers attempted to pass the Stop Online Piracy Act (SOPA) and the Protect IP Act (PIPA). Supporters of Internet freedom acted overwhelmingly to stop SOPA and PIPA, and it's time for them to act again. Big tech industry names like Google and Verizon are urging US leaders to defend Internet freedom. As we saw with SOPA and PIPA, this is the time for all to get involved. You can contact your member of Congress or tell your lawmaker to support US policy efforts to promote an uncensored Internet.

As the world's technology leader, the US must lead the way in preserving a free and open Internet. The Internet is the lifeblood of our economy and helps individuals create businesses, trade goods and services, access low-cost banking resources, facilitate open communication, and more. America must provide the example and walk the walk — not just talk — about preserving Internet freedom.

I commend the Energy and Commerce Committee for its recent hearing reinforcing America's commitment to keeping the Internet open and free from government censorship and interference. I look forward to seeing Congress act to protect and promote a global Internet free from government control.

Google, UCSD Boost Cloud Computing Efficiency

Excerpted from ScienceBlog Report

Computer scientists at the University of California, San Diego (UCSD) and Google have developed a novel approach that allows the massive infrastructure powering cloud computing to run as much as 15 to 20 percent more efficiently. The model has already been applied at Google. Researchers presented their findings at the IEEE International Symposium on High Performance Computer Architecture, held February 23rd to 27th in China.

Computer scientists looked at a range of Google web services, including Gmail and search. They used a unique approach to develop their model. Their first step was to gather live data from Google's warehouse-scale computers as they were running in real time. Their second step was to conduct experiments with data in a controlled environment on an isolated server. The two-step approach was key, said Lingjia Tang and Jason Mars, faculty members in the Department of Computer Science and Engineering at the Jacobs School of Engineering at UCSD.

"These problems can seem easy to solve when looking at just one server," said Mars. "But solutions do not scale up when you're looking at hundreds of thousands of servers."

The work is one example of the research Mars and Tang are pursuing at the Clarity Lab at the Jacobs School, their newly formed research group. Clarity is an acronym for Cross-Layer Architecture and Runtimes.

"If we can bridge the current gap between hardware designs and the software stack and access this huge potential, it could improve the efficiency of web service companies and significantly reduce the energy footprint of these massive-scale data centers," Tang said.

Researchers sampled 65 K of data every day over a three-month span on one of Google's clusters of servers, which was running Gmail. When they analyzed that data, they found that the application was running significantly better when it accessed data located nearby on the server, rather than in remote locations. But they also knew that the data they gathered was noisy because of other processes and applications running on the servers at the same time. They used statistical tools to cut through the noise. But more experiments were needed.
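The report doesn't say which statistical tools the researchers used; as an illustrative sketch only (not Google's actual method), one simple way to cut through measurement noise from co-located processes is a rolling median, which is robust to outlier spikes:

```python
from statistics import median

def rolling_median(samples, window=5):
    """Smooth noisy measurements with a rolling median.

    The median is robust to outlier spikes, such as latency blips
    caused by other processes sharing the same server.
    """
    smoothed = []
    for i in range(len(samples)):
        lo = max(0, i - window + 1)
        smoothed.append(median(samples[lo:i + 1]))
    return smoothed

# A steady signal around 10 with occasional spikes from co-located work.
noisy = [10, 11, 10, 90, 10, 9, 10, 85, 11, 10]
print(rolling_median(noisy))  # spikes of 90 and 85 are suppressed
```

Here the two large spikes never appear in the smoothed series, so the underlying locality effect stays visible.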

Next, computer scientists went on to test their findings on one isolated server, where they could control the conditions in which the applications were running. During those experiments, they found that data location was important, but that competition for shared resources within a server, especially caches, also played a role.

"Where your data is versus where your apps are matters a lot," Mars said. "But it's not the only factor." Servers are equipped with multiple processors, which in turn can have multiple cores. Random-access memory is assigned to each processor, allowing data to be accessed quickly regardless of where it is stored. However, if an application running on a certain core is trying to access data from another core, the application is going to run more slowly. And this is where the researchers' model comes in.

"It's an issue of distance between execution and data," Tang said. Based on these results, computer scientists developed a novel metric, called the NUMA score, that can determine how well random-access memory is allocated in warehouse-scale computers. Optimizing the NUMA score can lead to 15 to 20 percent improvements in efficiency. Improvements in the use of shared resources could yield even bigger gains — a line of research Mars and Tang are pursuing in other work.
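The report doesn't give the exact definition of the NUMA score; a minimal sketch, assuming the score is simply the fraction of an application's memory accesses served from its local NUMA node, might look like this (the formula and trace format are hypothetical, not the UCSD/Google metric):

```python
def numa_score(accesses):
    """Hypothetical NUMA score: the fraction of memory accesses that
    hit the local NUMA node. Higher is better; 1.0 means all data
    lives next to the cores that use it. (Illustrative only -- the
    actual metric in the UCSD/Google work may be defined differently.)

    `accesses` is a list of (core_node, memory_node) pairs.
    """
    if not accesses:
        return 0.0
    local = sum(1 for core, mem in accesses if core == mem)
    return local / len(accesses)

# An app running on NUMA node 0 whose RAM is mostly allocated on node 0:
trace = [(0, 0), (0, 0), (0, 1), (0, 0)]
print(numa_score(trace))  # 0.75
```

Optimizing allocation would then mean migrating memory (or threads) so that more pairs match, pushing the score toward 1.0.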

Adobe Taps Social Media for Marketing Cloud Revamp

Excerpted from ZDNet Report by Rachel King

Call it the Pinterest effect. Like the consumerization of IT, all segments of business technology seem to be influenced by popular consumer tech trends these days, and a new example of that is the latest version of the Adobe Marketing Cloud.

Announced at the 2013 Adobe Summit in Utah today, the new user interface for the Marketing Cloud has a visual-heavy design intended to inspire, as well as encourage collaboration among teams and agencies.

Brad Rencher, Senior Vice President and General Manager of the digital marketing business unit at Adobe, described further that "when teams managing creative design, advertising, and analytics are all under the same roof, marketers need information that paints a full picture of their business, in one easy-to-access spot."

Acknowledging the ever-growing Pinterest platform as a base model, Adobe said the gist is to take visualization and sharing capabilities and apply them to the enterprise.

Also, like most consumer software and products seeping into the enterprise market, there shouldn't be much of a learning curve here. Users simply pin images to and curate boards around projects. From there, those "cards" show up on team members' news feeds, and co-workers can edit and comment on cards, too.

Those cards also generate more data. Every time cards are shared, that content is filtered into a metrics report for further analysis. Speaking of analytics, Adobe also introduced a new workflow touted to give marketers the chance to both identify and target "high-value" customers within minutes.

Essentially, the predictive workflow integrated into the latest version of Adobe Analytics takes advantage of big data, promising to sort through terabytes of data "quickly," identifying audiences based on shared characteristics and then score them based on how likely they are to convert to customers.
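Adobe has not published the model behind this workflow; as a hedged sketch of the general idea, a conversion propensity score can be as simple as a logistic function over weighted customer attributes (the feature names and weights below are hypothetical):

```python
import math

def conversion_score(features, weights, bias=0.0):
    """Toy propensity score: a logistic function over weighted
    customer features. (Illustrative only -- Adobe Analytics'
    actual model is not described in the report.)
    Returns a value in (0, 1); higher means more likely to convert.
    """
    z = bias + sum(weights[k] * v for k, v in features.items())
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical weights, as if learned from past conversions.
weights = {"visits_last_30d": 0.3, "cart_adds": 0.8, "email_clicks": 0.5}

engaged = {"visits_last_30d": 5, "cart_adds": 2, "email_clicks": 3}
casual  = {"visits_last_30d": 1, "cart_adds": 0, "email_clicks": 0}

print(conversion_score(engaged, weights) > conversion_score(casual, weights))  # True
```

Segmenting "high-value" customers then amounts to thresholding this score over the whole audience.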

That information can then be filtered to Adobe Target — another realm of the Adobe Marketing Cloud — to deliver optimized offers depending on which category or audience they fall into.

Finally, Adobe is making a major mobile push for all of its releases, with a heavy focus on tablets.

From the simplest user point-of-view, Adobe Marketing Cloud is being optimized for tablets with a touch-based, mobile-first interface, which was done in reflection of the growing mobile workforce in general.

But Adobe is particularly keen on strengthening its portfolio for capturing and analyzing mobile data through new social apps and advertising. This includes delivering optimized content through mobile-optimized websites thanks to a revised SDK.

Supporting content for iOS, Android, Windows 8, Windows Phone 8, OS X, and BlackBerry mobile apps, the SDK has been configured to automatically identify a consumer's device and then deliver real-time content specific to that device.
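The report doesn't detail how the SDK identifies devices; a common approach is matching on the User-Agent string, sketched here with hypothetical content variants (this is illustrative, not the Adobe SDK's mechanism):

```python
def pick_variant(user_agent):
    """Choose a device-appropriate content variant from the
    User-Agent string. (Illustrative sketch -- the actual Adobe
    SDK mechanism is not described in the report.)
    """
    ua = user_agent.lower()
    if "iphone" in ua or "ipad" in ua:
        return "ios"
    if "android" in ua:
        return "android"
    if "windows phone" in ua:
        return "windows-phone"
    return "desktop"

print(pick_variant("Mozilla/5.0 (iPhone; CPU iPhone OS 6_1 like Mac OS X)"))  # ios
print(pick_variant("Mozilla/5.0 (Windows NT 6.2; WOW64)"))                    # desktop
```

In practice a server would key its response templates on the returned variant, so each device class gets content formatted for it.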

The San Jose-based corporation is also aiming to conquer mobile analytics with help from industry partners.

For example, Adobe is integrating solutions from apps analytics and conversions specialists Distimo to discern app store data such as downloads, rankings and revenue to offer marketers with a more well-rounded perspective on the engagement process from initial download through monetization.

While most of the new mobile UI changes to the Marketing Cloud are available in beta, an integrated digital asset manager along with a campaign setup wizard are scheduled to follow in a few months.

The mobile analytics updates are also out now in beta with a full-fledged release planned for the second quarter.

IBM Bets on Open Cloud

Excerpted from Techworld Report by Rohan Pearce

IBM has thrown its weight behind open cloud standards, announcing that all future cloud services and software would be based on "open cloud architecture". As a first step the company today unveiled a new private cloud tool, SmartCloud Orchestrator, based on the open source OpenStack project.

"History has shown that open source and standards are hugely beneficial to end customers and are a major catalyst for innovation," IBM Senior Vice President of Software, Robert LeBlanc, said.

"Just as standards and open source revolutionized the web and Linux, they will also have a tremendous impact on cloud computing. IBM has been at the forefront of championing standards and open source for years and we are doing it again for cloud computing.

"The winner here will be customers, who will not find themselves locked into any one vendor -- but will be free to choose the best platform based on the best set of capabilities that meets their needs."

IBM joined the OpenStack Foundation as a top-tier Platinum member in April last year.

OpenStack is a collection of open source software for building public and private clouds. It can be used either by providers who want to deliver infrastructure as a service to customers or enterprises that want a private cloud for on-demand, self-service provisioning of compute services for departments.

The roots of the project, which launched mid-2010, lie in collaboration between NASA and Rackspace.

Why Tech Must Embrace Cloud Disruptiveness

Excerpted from Forbes Report by Maribel Lopez

Cloud computing and mobile are two technology trends that are transforming how businesses operate and how consumers live. The first generation of cloud computing focused on cost optimization. Lopez Research believes the second wave of cloud computing will help businesses save money while operating more efficiently.

Cloud computing will change where data and applications live and how they operate. To get a sense of the disruption and the opportunity that cloud computing provides, I interviewed John Considine, the Chief Technology Officer of Verizon's Terremark division. Previously, Considine worked for CloudSwitch, which was acquired by Verizon.

Considine said cloud computing is disrupting the entire value chain from the technology to the types of buyers. "Cloud computing companies, such as CloudSwitch, started by simplifying the cloud for enterprises by focusing on cloud security, advanced computing, and workload portability."

Workload portability is how an enterprise moves workloads to and from the cloud. He noted that in the early days, companies used cloud computing as a way to get services while bypassing IT. This was called shadow IT.

Chief marketing officers, research and development and other business units were using cloud computing to get their work done. The industry had postulated IT would catch on and lock down cloud computing.

"In 2012, shadow IT came out into the light. It has been accepted by businesses almost universally."

However, many CIOs still have certain concerns based on security and compliance regulations. Considine warned that CIOs must strive to build relevancy in the new world of cloud computing. He said "As they face this disruption, CIOs and IT leaders have to look at moving up the stack and spend less time on low level operational tasks."

"On one hand, things are getting easier from an architectural standpoint. In other ways, the architectures are getting more complicated. Instead of servicing the desktop, we have mobile devices and tablets. How do we facilitate connectivity and interaction across a broader set of interfaces?"

He stated that companies must consider how to blend technology across internal data centers, cloud computing and a mixture of wired and wireless networks. Considine believes this is an area where IT can excel by helping the business span these technologies.

Later in our interview, Considine described three ways that cloud computing disrupts the current IT infrastructure.

First, "Virtualization created a major disruption to hardware vendors." He said it gave businesses the opportunity to consolidate equipment, increase overall utilization, and decrease new unit purchases. This has curtailed hardware vendors' growth and will continue to do so.

According to Considine, the second area cloud computing has disrupted is management software. Cloud vendors are creating and operating the infrastructure, and these services frequently encompass management software that was traditionally sold to enterprises, changing the landscape for that market. He didn't predict the death of any players, but he said growth for existing management-software vendors could stagnate because these services may not be required in the new cloud architectures.

"It isn't as if these vendors will go away overnight, but we see any growth opportunity being usurped by cloud computing." Considine noted, "IT isn't the only purchaser of the infrastructure." As he said earlier, the line-of-business buyer is procuring storage and platform services without going through standard procurement channels, which disrupts how contracting works. Essentially, cloud computing is disrupting business at almost every layer.

"Mobile adds a fascinating dimension." He discussed how processing was being pushed out to the edge for distributed computing.

But while smartphones have faster processors, Considine said, "The computing demands may still outstrip the capability of the processing utility in cell phones and tablets. This pushes for additional forms of cloud computing to support the needs of applications."

"Data is important and has gravity. It attracts things." In traditional architecture all the information was created and consumed behind the firewall.

He discussed a major transition in which B2B, supply chain management, customer management, and other data is moving outside the data center in terms of creation and consumption. "The workforce isn't sitting in the same building, and consumers are outside of your data center."

He raised an excellent point when he asked "Why are we making a concentrated effort to concentrate things in your own data center versus distributing them closer to the web? This leads us back to cloud computing."

When asked what advice he had for business leaders, Considine said many business leaders are still searching for reasons to avoid cloud computing. "Companies should look for ways to take advantage of cloud computing. CIOs and CTOs must find out how their company is using it. Just about every company is using the public clouds in one format or another. It's important for IT to discover the problems and issues and deliver value by resolving these."

He said one of the best ways to stay ahead in cloud computing was to "Get involved in the public clouds, since they are leading the way."

What about the future?

"In the future, we'll talk more about PaaS. It got off to a bad start because the original versions changed how you did things to take advantage of the platform," said Considine.

The next major transformation will be in terms of how people are assembling their applications.

Today, businesses dedicate resources such as time, money and staff to handle the functions that don't offer high value. "From an app standpoint, you just need a database." You don't have to care which database it is as long as it can do the job.

Considine closed the conversation by saying that cloud computing can help a business "shift the operational burden from the company. It will offer faster time to market, higher quality and lower cost. In my book that is a winner every time."

I agree with John. Mobile and cloud are helping employees and consumers create and consume content anywhere. Business leaders must build technology strategies that enable this new behavior.

Cloud Benefits Are Both Financial & Operational

Excerpted from Formtek Blog Report by Dick Weisinger

Two primary benefits of cloud computing are financial and operational, says FreshlyTechy in a Seattle PI article.

Consider the financial benefits of the cloud. A survey by KPMG found that 70 percent of businesses say that the cloud has already brought them significant efficiencies and cost savings.

Rick Madan, Executive Director of Network Services for TEKsystems, said that "the total cost of ownership of software and hardware goes down for end users since they are removed from the business of buying, licensing, and maintaining associated assets.

"Furthermore, the revenue-and-service premise for cloud providers is built on a 'pay as you go' model, so as an end-using company, instead of getting bogged down in the sunk costs of servicing debt related to sometimes idle IT processes and resources, the cloud model allows you to pay only for those actually utilized by the business at any given moment."
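The pay-as-you-go arithmetic Madan describes can be made concrete with a small sketch. All figures below are hypothetical assumptions for illustration, not numbers from the article:

```python
# Hypothetical cost comparison: owned capacity vs. pay-as-you-go cloud.
# Every figure here is an illustrative assumption, not data from the article.

HOURS_PER_MONTH = 730

def owned_cost(peak_servers: int, monthly_cost_per_server: float) -> float:
    """Owned hardware is sized for peak load and billed whether idle or not."""
    return peak_servers * monthly_cost_per_server

def cloud_cost(server_hours_used: float, hourly_rate: float) -> float:
    """Pay-as-you-go bills only for the hours actually consumed."""
    return server_hours_used * hourly_rate

# A workload that peaks at 20 servers but averages only 6 in steady use.
owned = owned_cost(20, monthly_cost_per_server=400)
cloud = cloud_cost(6 * HOURS_PER_MONTH, hourly_rate=0.50)

print(f"owned: ${owned}/month, cloud: ${cloud:.0f}/month")
# → owned: $8000/month, cloud: $2190/month
```

The gap between the two numbers is exactly the "sometimes idle" capacity Madan refers to: the wider the spread between peak and average utilization, the more the usage-based model saves.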

There are operational benefits from the cloud, too. The introduction of cloud computing can be transformative to the operations side of the business. Tasks that focus on the lifecycle management of hardware and software often get offloaded from the business to the cloud provider.

Tasks like change control, versioning, upgrades and updates, release management, storage, process job management, and data center mirroring are either eliminated or greatly reduced.

Craig Ryman, IT director at AMP Capital Investors, called adoption of the cloud an operational "game changer" for their business. "It is critical that we realize maximum value from our investment. Beyond a focus on costs, we've deliberately taken a strategic, 'whole of portfolio' approach, recognizing that changes to our operating model will be required to achieve our long-term growth objectives."

Madan said "make no mistake, for IT departments and individual IT workers, it's no longer a question of 'will' cloud become part of the mix. It's just a question of to what extent."

The Disaster Mitigation Power of Cloud Computing

Excerpted from CloudTweaks Report by Abdul Salam

In a world where digital connectivity and online presence are a significant part of life and businesses alike, a disaster that disables online services is undoubtedly going to ruin some people's day, to say the least. Hurricane Katrina and others like it around the world have proven that no data center or server facility is safe from natural disasters and other freak accidents.

The effects of such events can at least be mitigated with cloud computing services and technology, making disaster mitigation and recovery easier.

Imagine a bank losing all of its customer data, including every digital backup, because of some calamity. It would certainly be a disaster for everyone, the bank most especially. Not to worry, though, for there are backups in the form of paper, age-old reliable paper; one can just imagine the bank's clerks trudging through tons of it and slaving in front of computers as they try to get everything back into electronic form, their cheeks slowly turning gaunt.

It is a highly unlikely event by today's standards for backing up data, but still a possible worst-case scenario.

The nature of cloud computing makes disaster recovery an extremely logical solution, and as a service it can be tremendously lucrative. Because cloud computing can allow one to offload data into "offshore" installations, anywhere and everywhere around the world with multiple backups, the chances that all of them can be wiped out in the same instance are pretty slim.

This can be accomplished with the old setup, one might say. Yes, it can, but not as well and not as economically as with cloud computing, especially if you consider the public cloud option.

Because of virtualization, service providers are able to provide the resources required for disaster mitigation and recovery for a small fraction of the price that it would cost an organization to setup its own. This is because hardware resources are shared by multiple customers and clients through virtualization.

This allows providers to accommodate more customers using the same amount of resources as non-cloud technology would. This affordability opens up the market not just to the leading companies but also to SMBs and startups, which brings more competition and hence more options and better service for customers.

The area of disaster mitigation and recovery is not limited to backup and storage but is also applicable to any other online service. Servers or data centers that become unavailable due to disaster can quickly be replicated in another location in a matter of a few hours.

This minimizes downtime for those that are providing services like online games and streaming services, and even those that rely on online transactions like e-stores and financial institutions. When it comes to disaster mitigation and recovery, nothing simply does it better and cheaper than cloud computing.

Cloud Haters: You Too Will Be Assimilated

Excerpted from ZDNet Report by Jason Perlow

You despise the idea of losing your individual computing power. You hate subscriber services. You don't trust our security model or feel we are reliable enough. You don't believe your connectivity will ever be good enough. It doesn't matter: Your distinctiveness will be added to our own. We are the cloud. Do not resist us.

I've been writing a lot of articles about the cloud lately, particularly as it relates to the future of personal computing. And what I've seen in the Talkbacks by average end-users as well as business types as to how they perceive the cloud has been fascinating.

It seems some of you find the cloud threatening. And that you'll move to the cloud kicking and screaming, holding your personal computer and your local data with your gritty nails dug into your laptops, external drives, and NAS appliances, tearing at them with whatever last lingering bit of life force you have left in you before you'll accept the inevitable.

Well, I've got news for you, Cloud Haters. The cloud is coming for you whether you like it or not. The cloud cannot be stopped. Your data and user experience will be assimilated.

We still don't understand why this frightens people. The cloud will be a better experience than you have now, and it will be less expensive in terms of asset expenditure and total cost of ownership.

But we at The Cloud Continuum are not entirely without compassion. Let's go down the list of your grievances and address your concerns. I mean, it's not like we have to, because we'll just end up owning your infrastructure anyway. But we are, if anything, attentive.

Grievance 1: I'll lose my individual computing power if I move to the cloud.

This could not be any further from the truth. If anything, you'll have more individual computing power from moving to the cloud, because your user experience will be backed up by a balls-to-the-wall datacenter with huge amounts of remote compute power as well as even remote GPU compute capability, if you look at the latest advancements in desktop as a service (DaaS) with technologies such as Microsoft RemoteFX, Citrix XenApp, and VMware Horizon View.

But don't blame the cloud for losing your localized computing power. Blame the technology industry and the overall desire to move to greener, more power efficient localized processing. Blame tablets and smart-phones and low-power SoCs and other inexpensive endpoint devices that will be the crux of the next-generation personal computing experience.

Powerful workstations, PC desktops, and even heavy-duty laptops are going to make way for thinner, lighter ultra-books and tablets, many of which will end up using ARM-based SoCs as opposed to the venerable x86 architecture.

While these systems will still have the capability to run localized applications, as time goes on and the operating systems from the usual sources for these devices evolve, the newest applications will be deployed from cloud-based app stores and use entirely new API sets, like Microsoft's WinRT and of course the APIs used by iOS and Android.

Sure, you'll still need access to legacy, CPU-intensive applications for things like content creation (think Photoshop, AutoCAD, video editing, and the like), but those will be deployed by ISVs as subscriber legacy apps or in private clouds by the enterprise.

Yes, there will be minimal edge cases that do need workstations, but they will be so few and far between as to amount to a rounding error in a Tier 1 PC manufacturer's yearly income, and private citizens won't be able to justify the expense of buying them for the sheer vanity of having a "local" machine when their cloud-enabled devices are a fraction of the cost.

Oh, and yes, we've heard the "I'm a hardcore gamer, I need a real PC" argument. No, really, you don't. And we don't care about you, either. Between smart-phones, tablets, and consoles, the "hardcore PC gamer" has been marginalized for years, and the game publishers willing to put resources into strictly PC games without re-purposing development assets for mobile and console are growing ever fewer and farther between.

Within 10 years, there will be no "hardcore gaming PCs" to buy, anyway. They'll be extinct.

Grievance 2: Subscriber services are going to increase my personal computing and application costs.

The move to a subscriber-based sales model for major ISV applications like Microsoft Office 365 seems to rub people the wrong way. There's no question this is an entirely new way of doing things, and that having to pay a yearly fee per seat rather than assume an upfront cost for a license that may be used for four or five years sounds more expensive.

The reality is that software-as-a-service (SaaS) and subscriber software services are actually less costly to both the end-user and the enterprise in the long run. Much of this has to do with the burden of maintenance and updates. It also has to do with the elimination of software piracy, which has artificially inflated the costs of software for at least two decades.

Things like Office suites and content creation suites like Adobe CS6 and stuff like Intuit Quickbooks Pro cost a lot of money to produce, because a tremendous amount of man hours go into their development. Traditionally, one might spend $200 to $400 on such a suite per PC, and then, in four years or so, upgrade for a lesser amount.

That's not accounting for things like academic or student discounts, which still exist in a subscriber model.

That's if you honor things like End-User License Agreements and you don't take that copy and install it on, say, 10 more PCs, or you never bought the software in the first place and are using pirated license keys.

If you're one of those people, then all I have to say is that you're just going to have to pay for your software like everyone else. Or try your hand at the open-source stuff like LibreOffice that was designed for folks who don't want to pay for software, and see if it works for you.

However, if you're a law-abiding citizen, and you've also budgeted for your own IT needs and understand that software is a cost of doing business, your costs are going to essentially remain the same or might even be cheaper.

Plus there's the added benefit that under the subscriber model, you will always be running the current version of the software, and you will always be at a current level of support as well. For small businesses as well as enterprises that live or die by their line of business applications, this is a very big deal.
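The four-year cost argument above is easy to check with the article's own figures. The subscription rate below is a hypothetical stand-in, since the article names no per-seat price:

```python
# Four-year cost per seat: perpetual license vs. subscription.
# Upfront and upgrade figures follow the article's $200-$400 range;
# the $100/year subscription fee is an assumed, illustrative price.

def perpetual_tco(upfront: float, upgrade: float) -> float:
    """One purchase now, plus one discounted upgrade within four years."""
    return upfront + upgrade

def subscription_tco(yearly_fee: float, years: int = 4) -> float:
    """A flat per-seat fee paid every year of the period."""
    return yearly_fee * years

print(perpetual_tco(300, 150))   # mid-range purchase plus a cheaper upgrade
print(subscription_tco(100))     # assumed $100/year plan over four years
```

Under these assumptions the totals land in the same neighborhood ($450 versus $400), which is the author's point: the subscription feels more expensive because it is paid continuously, not because the total is necessarily higher.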

Grievance 3: I don't trust the security or the integrity of the cloud.

OK, we know there have been a few notable security breaches at some big-name companies who have had some kind of cloud presence in the last few years.

But look, every time something like this has happened, it's been a learning experience, and the folks who run real, business-grade clouds that supply a specific quality of service (read as: they charge for this stuff and have to perform according to Service Level Agreements rather than provide free services in exchange for advertising eyeballs) tend not to be the ones that are susceptible to these problems.

By the way, the folks who I am talking about are not these fly-by-night data storage startups such as Dropbox and "gotta have it for free" cloud-based apps like Twitter that have had all kinds of security incidents. I'm talking about significant telecom carriers and hosting providers and strategic outsourcing vendors and software companies that build both public and private cloud offerings who have been running secure enterprise datacenters for many, many years.

These are the folks who will be coming out with all sorts of consumer, end-user cloud offerings, and the ones who you should be trusting your data with.

They will be the ones to invest in the best security technologies, to employ the highest-trained security professionals to ensure that the storage of, and network connectivity to, that data is isolated from other tenants and walled off from the outside world, and to put the most capital investment into their redundant infrastructure to ensure the integrity of your data and the continuity of your business.

Grievance 4: I don't think I'll ever have enough connectivity.

There's not a single article in which I mention the cloud that I don't get some comment that sounds like "I live in a van down by the river! In East Bumscrabble in Sub-Saharan Africa! My municipal government stinks in deploying broadband! And I live in a developing country where we only have GSM connectivity and 300 baud modems! The connectivity to the cloud will never be fast enough to where I live!"

Yeah, well, sucks to be you.

Look, nobody expects the cloud and broadband initiatives to deploy to every single person in every single country in an equal opportunity fashion. We know that governments drag their feet and infrastructure takes a while to build out. That stinks.

But the bottom line is that for the majority of folks, and for many types of application scenarios, cloud computing does not require a heck of a lot of bandwidth. For the types of things I have talked about, such as DaaS and access to remote applications via thin WAN-optimized protocols and web services, the cloud is actually made for reduced bandwidth scenarios, and is far, far less bandwidth intensive than something like video on demand or even CD-quality music streaming.

Once the data lives in the cloud, it doesn't need to leave or move to the cloud. Because it will simply be in the cloud. Indeed, there will be many types of clouds, and you'll have your choice of where to keep your information and will have the ability to federate the information and services from or migrate to and from other clouds. But pushing tons of data back and forth from the cloud once the majority of our computing existence is cloud? No.

This is a process that will take years. It will not happen overnight. We are still facing fundamental issues for things like what the heck we're gonna do to deal with increased video traffic like a national 4K rollout using IP-based delivery, and how to provision virtual and physical infrastructure at the largest scale according to increasing demand, but these problems will eventually be solved.

The bottom line is, you will be assimilated. Cloud hater or not. It is simply a matter of time.

Will you still resist the cloud and not go quietly into that good night?

Coming Events of Interest

2013 Symposium on Cloud and Services Computing - March 14th-15th in Tainan, Taiwan. The goal of SCC 2013 is to bring together researchers, developers, government sectors, and industrial vendors that are interested in cloud and services computing.

2013 NAB Show - April 4th-11th in Las Vegas, NV. Every industry employs audio and video to communicate, educate and entertain. They all come together at NAB Show for creative inspiration and next-generation technologies to help breathe new life into their content. NAB Show is a must-attend event if you want to future-proof your career and your business.

CLOUD COMPUTING CONFERENCE at NAB Show - April 8th-9th in Las Vegas, NV. New ways cloud-based solutions have accomplished better reliability and security for content distribution. From collaboration and post-production to storage, delivery, and analytics, decision makers responsible for accomplishing their content-related missions will find this a must-attend event.

Digital Hollywood Spring - April 29th-May 2nd in Marina Del Rey, CA. The premier entertainment and technology conference. The conference where everything you do, everything you say, everything you see means business.

CLOUD COMPUTING EAST 2013 - May 19th-21st in Boston, MA. CCE:2013 will focus on three major sectors, GOVERNMENT, HEALTHCARE, and FINANCIAL SERVICES, whose use of cloud-based technologies is revolutionizing business processes, increasing efficiency and streamlining costs.

P2P 2013: IEEE International Conference on Peer-to-Peer Computing - September 9th-11th in Trento, Italy. The IEEE P2P Conference is a forum to present and discuss all aspects of mostly decentralized, large-scale distributed systems and applications. This forum furthers the state-of-the-art in the design and analysis of large-scale distributed applications and systems.

CLOUD COMPUTING WEST 2013 — October 27th-29th in Las Vegas, NV. Three conference tracks will zero in on the latest advances in applying cloud-based solutions to all aspects of high-value entertainment content production, storage, and delivery; the impact of cloud services on broadband network management and economics; and evaluating and investing in cloud computing services providers.

Copyright 2008 Distributed Computing Industry Association
This page last updated March 17, 2013
Privacy Policy