Distributed Computing Industry
Weekly Newsletter


November 21, 2011
Volume XXXVII, Issue 5


Alcatel Promises Better Clouds for Carriers

Excerpted from PC World Report by Stephen Lawson

Alcatel-Lucent is developing a cloud computing platform for carriers that aims to take full advantage of their networks to deliver guaranteed performance.

Carriers can use the platform, called CloudBand, both to run their own software and to offer cloud computing services to enterprises. For internal purposes, the cloud can make it faster and cheaper to launch and operate services, and for subscribers it will offer more predictable performance than current clouds, according to Alcatel. Carriers will be able to sell cloud computing services with guaranteed availability and response times, the company says.

Service providers can already build their own cloud data centers and link them to their infrastructure, which can provide an edge in performance over using the open Internet, said Dor Skuler, Vice President of Cloud Solutions at Alcatel. But CloudBand goes beyond this with software that examines a wide range of conditions and user requirements to find the best settings for a given application at a given time.

Meeting a customer's service-level agreement (SLA) might require giving certain packets a higher priority, setting a higher quality of service, setting aside a portion of network bandwidth or using a server in a particular location. "There are so many choices, and you have limited resources," Skuler said. CloudBand can determine how those resources can best be used.
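To make that concrete, here is a minimal sketch in Python of SLA-aware placement. It only illustrates the kind of trade-off Skuler describes; the node fields, the SLA inputs, and the least-loaded tie-breaking rule are invented for illustration and are not Alcatel's actual Bell Labs algorithms.

    from dataclasses import dataclass

    @dataclass
    class Node:
        name: str
        latency_ms: float      # measured network latency to the customer
        load: float            # current utilization, 0.0 - 1.0
        bandwidth_mbps: float  # uncommitted bandwidth at this site

    def place(nodes, sla_latency_ms, needed_mbps):
        """Pick the least-loaded node that can still meet the SLA."""
        candidates = [n for n in nodes
                      if n.latency_ms <= sla_latency_ms
                      and n.bandwidth_mbps >= needed_mbps]
        if not candidates:
            return None  # SLA cannot be met; escalate or fall back
        return min(candidates, key=lambda n: n.load)

    nodes = [Node("edge-east", 12.0, 0.70, 400),
             Node("edge-west", 45.0, 0.20, 900),
             Node("core-dc",   80.0, 0.10, 2000)]
    print(place(nodes, sla_latency_ms=50, needed_mbps=500))
    # -> Node(name='edge-west', ...): the closest fit that is not overloaded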

At the heart of CloudBand is the CloudBand Management System, based on algorithms developed by Bell Labs, the former Lucent and AT&T research center. It takes into consideration factors such as server load, network congestion and latency to make sure service-level agreements are met, Skuler said.

The other component of CloudBand is the CloudBand Node, the set of computing, storage, virtualization and cloud management components at each cloud facility on the network. Alcatel will work with Hewlett-Packard, under a 10-year global partnership, to build sets of gear optimized for use with CloudBand. But carriers can implement CloudBand without buying that particular data center gear, Skuler said.

Alcatel envisions carriers taking advantage of their networks to set up CloudBand Nodes in locations near the edges of their networks. These would work in the same way as local content caches, in this case giving enterprises in that area quicker access to cloud resources, Skuler said.

Carriers will also be able to take advantage of public-cloud capacity where they don't have the resources to serve a customer, or where that cloud is less expensive, he added.

Using the same cloud it builds for subscriber services, a carrier can virtualize its own infrastructure for delivering standard services such as voice, video, and short message service (SMS), Skuler said. These types of offerings today are powered by dedicated hardware, making it expensive and time-consuming to roll out a new service. Deploying them on a cloud will make it faster and easier to start up and expand a service using virtualized resources, he said.

Alcatel plans to make CloudBand generally available to carriers in the first half of next year, following trial deployments starting early in the year. It's already using the technology to deliver its own hosted telecommunications services for carriers. Initially, CloudBand will depend on Alcatel network gear, but the company is working with groups such as the Alliance for Telecommunications Industry Solutions (ATIS) and the Internet Engineering Task Force (IETF) on standards that would let it extend to other vendors' equipment, Skuler said.

Report from CEO Marty Lafferty

The DCIA is proud to announce our partnership with Phoenix Marketing International (PMI) in sponsoring the upcoming CONTENT IN THE CLOUD Conference at CES.

Founded in 1999, and now led by a team of 130 senior professionals, PMI is one of the fastest growing marketing research firms in the United States. The firm has already been recognized with a Honomichl Top 25 ranking and is now considered a world-class market research and consulting firm.

PMI operates in offices located throughout the US serving global and domestic clients with expertise in both syndicated and custom research for B2C and B2B enterprises. PMI is probably best known for its highly consultative approach to research.

Of particular relevance to the CONTENT IN THE CLOUD conference, PMI has a specific focus on converged technology and media.

PMI's senior team has consulted with technology heavyweights including IBM, Microsoft, AT&T, Nokia, Verizon, Lucent, HP, and many more such firms. On the media side, PMI has worked with the major broadcast television networks and media companies.

Examples of PMI's syndicated offerings include the examination of consumer technology advertising success in the consumer market and B2B technology in both large and small business markets.

PMI's staff is largely composed of senior managers who have worked in the technology and media industries as product managers, designers, and marketing professionals. They understand how things work because they have "lived the life."

At the CONTENT IN THE CLOUD Conference within CES, PMI will provide highlights from its fresh market research to support each of the four major focal points of this event: the impact of cloud computing adoption for content distribution on 1) consumers, 2) telecom/broadband operators, 3) media and entertainment companies, and 4) manufacturers of CE devices.

For example, PMI has just completed a comprehensive market research study with teens, Generation X, Generation Y, and adults across America.

The results and insights gained from this research will be presented at the conference. This will provide valuable benchmarks on how consumers are adopting the latest technologies and how new trends are evolving in content consumption.

Here are a few of the themes that emerge in this analysis:

Why are tablets changing media consumption patterns, altering lifestyles, and accelerating cloud adoption?

How is content delivery and media rental changing as streaming and downloading services gain traction?

What are the hottest CE devices for teens, Gen X, Gen Y and adults?

Where will cloud-based content distribution be most effective in the short term, and how should service providers respond?

The DCIA and PMI believe that a "heavy dose" of fresh market research will add a new dimension to all the planned presentations and discussions at the conference.

In coming weeks, we will share some specific insights from the market research with you. We hope it will pique your interest and amplify your enthusiasm for the upcoming program where all the results will be aired and debated. Share wisely, and take care.

The Collaborative Consumption Revolution

Speaking of CONTENT IN THE CLOUD at CES, have you checked out CEA President and CEO Gary Shapiro's latest column at Forbes? This week, he addresses how Americans are slowly moving away from the belief that owning something is better than sharing.

"After all, sharing is a far more efficient method of resource distribution than owning, and perhaps we will all have 'more' in the end," Shapiro writes. What do you think?

TechCrunch features a video interview by Andrew Keen with Lauren Anderson of Collaborative Consumption, recorded at Fast Company's Innovation Uncensored event.

Everything, it seems, is becoming collaborative. From Airbnb to RentCycle to Zipcar, we are swapping our cars, our homes, even our clothes with each other. According to Anderson, this change might be as profound as the industrial revolution. It will result, she says, in a world driven by "reputational capital" in which the "We" of our collaborative age will replace the "Me" of the industrial age.

Keen adds, "While Anderson might be right, I'm not sure it's such a great thing for people like myself who aren't naturally participatory. Indeed, I find the whole idea of an always-on reputational economy a little creepy - especially since this may not be a world that is able to either forget or forgive. But Anderson isn't bothered by oddities like myself, insisting that 'everybody benefits' in this networked, sharing economy."

So is Anderson right - is this shift from the Me to the We as significant as the industrial revolution? And should we welcome this revolution with, so to speak, open arms?

Will Cloud Computing Make Everything (and Everyone) Work Harder?

Excerpted from NY Times Report by Quentin Hardy

What do the following have in common: Computers, limousines, empty beds, and stay-at-home moms?

The cloud keeps them busy.

The rest of us are next.

Virtualization of computer servers, a core element in the development of cloud computing, made it possible for a single server that was used 20% of the time to be used 80% or more. Software monitored workloads, spotted when a machine was free, and assigned it a workload that would keep it busy without distracting it from its original function.
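A toy version of that consolidation idea shows why the utilization numbers move so dramatically. The first-fit packing below, in Python, is purely illustrative; real hypervisor schedulers also weigh memory, I/O, and failover constraints.

    def consolidate(workloads, target=80):
        """First-fit packing of workloads (each in % of one host) onto hosts."""
        hosts = []  # summed utilization per host, in percent
        for w in workloads:
            for i, used in enumerate(hosts):
                if used + w <= target:
                    hosts[i] += w  # an existing host still has headroom
                    break
            else:
                hosts.append(w)  # no host had room; bring another online
        return hosts

    # Ten workloads that each keep a dedicated box about 20% busy...
    print(consolidate([20] * 10))  # -> [80, 80, 40]: three hosts, not ten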

Now, thanks to the cloud's ability to cheaply connect a lot of people and information over a broad array of devices, a similar use of spare resources is going on elsewhere in the economy.

A company called Uber connects limousines that are between jobs with people who want a ride on the spur of the moment - after a boozy dinner, say, or a late night at work. The service Airbnb turns people's spare rooms into a cheap alternative to hotels, sometimes with mixed results.

It is possible to see this trend as a function of the cloud itself, since announcing, finding, and occupying vacancies on the fly requires access from a lot of locations to common databases. It was always theoretically possible to rent out your room or pick up a fare in between jobs, but now it can be done cheaply and through a common system.

Another example of this trend is LiveOps, a cloud-based call center service. Companies like Pizza Hut, NationsHealth, and Kodak typically pay LiveOps's agents 25 cents a minute to do things like take orders, field customer problems, and sell products.

LiveOps farms the work out to 20,000 people who connect via the cloud from their homes. They sign on weekly for however many half-hour sessions they want, filling out what would otherwise be vacant time.

"The computers in the cloud find the right person among our agents, then routes the call to their house," says Marty Beard, LiveOps's chief executive. "They're mostly students, parents, veterans looking for work - people with a little extra time."

In addition to phone calls, he is gearing the company up to respond to people's Twitter and Facebook postings about the companies that have contracted with LiveOps.

Typically people work 25 hours to 30 hours a week, Mr. Beard says, gaining income by selling products, or getting licensed to sell insurance products. "If you're really good, you can make $45,000 or $50,000 a year," he says. "More likely, it's half that."

In effect, they are filling out their free time, like a limo on Uber or a server in a virtualized system, thanks to the cloud.

This kind of machine-made urgency to utilize everything, creating lower prices (and, for many, lower wages), will very likely find a lot more areas to attack. Much of our work may come to resemble an Uber or LiveOps model, as shared calendars and documents, along with location-aware devices, make work possible from more times and locations.

Ignite Technologies Is Citrix Ready for Video Delivery with XenDesktop

Ignite Technologies, the leader in providing enterprise content delivery solutions enabling customers to efficiently publish, deliver, and manage digital content, today announced that it has joined the Citrix Ready program and has completed the process that verifies its Enterprise Content Delivery platform is compatible with Citrix XenDesktop.

Enterprises leverage desktop virtualization solutions to optimize the delivery of desktop applications and data to users. XenDesktop allows these enterprises to centrally manage and deliver virtual desktops and applications to users across the enterprise.

Traditionally, enterprise networks face challenges in providing a high quality viewing experience for employees in use cases such as live video broadcasting, on-demand video, video-driven webcasts, training, social networks, and room-based conference suites.

As a Citrix Ready partner, Ignite provides an optimal solution for businesses leveraging video in a virtualized desktop infrastructure without additional infrastructure investment. The Ignite integration with XenDesktop ensures all rich media - live and on-demand - and other bandwidth-sensitive traffic, such as software downloads and patches, are delivered in a network-efficient manner using Ignite's managed, secure, peer-to-peer (P2P) Enterprise Content Delivery platform, independently from the XenDesktop communication channels.

"Ignite Technologies adds significant value and innovation to our customers who deploy video using XenDesktop and we welcome the company as a Citrix Ready partner," said Joe Keller, Vice President of Alliance and Community Marketing at Citrix Systems. "The growth of video usage within enterprises supports the need for partner solutions that can enhance the efficiency of delivery and viewing of video."

"We are excited to join the Citrix Ready partner ecosystem, and provide customers with a solution that will enhance the efficiency of video and large file delivery within a virtualized enterprise infrastructure," said Jim Janicki, President and CEO at Ignite Technologies.

Google Music: Your Great Music Locker in the Cloud

Excerpted from ZDNet Report by Steven Vaughan-Nichols

Is Google Music perfect? No, far from it. But, for the price, zero, it's great. Google Music lets you put all your music a cloud-touch away.

I've been using Google Music since it was in beta. At first, it didn't interest me that much. Yet another way to save my music to the cloud? How much good really was that?

Well, after using it for several months, and now that Google Music is open for everyone in the US to use, I'm here to tell you that Google Music has proven to be a great way for me to listen to my music wherever I am with whatever computing device I have at hand.

Why? Well, let's start with the basics.

Google Music enables you to store your music in the cloud. While Google will now let you buy music from the Android Market, it's really more of an online music storage locker than a competitor to Apple's iTunes Store.

Unlike other cloud music and storage services, Google doesn't give you a fixed amount of storage space. Instead, you can use it to store up to 20,000 songs. On the Google Music Web page, Google provides a counter to let you know how close you are to hitting your limit. At an estimated 5MB a song, that works out to about 100GB of storage. The cost? Not one red penny.

Free. I love the sound of that. I especially love the sound of free since I currently have 13,000-plus songs in my library.

While many other services offer cloud-based music storage libraries - Apple's iTunes Match, part of iCloud; Amazon's Cloud Drive and Cloud Player; and the pioneer of online music storage, MP3tunes - no one else offers so much storage without restrictions for free.

To file music into your Google Music library, you need to use Google's Music Manager. This program is available for Linux, Mac OS X, and Windows. You can use Music Manager to load files from iTunes, Windows Media Player, or directly from directories. You can load your entire music collection - up to the aforementioned 20,000 songs - or just certain playlists or directories.

You can upload your MP3-, AAC-, Ogg-, and FLAC-encoded songs into your Google Music library. You cannot, however, load files protected by Digital Rights Management (DRM), or ALAC (Apple Lossless), WAV, AIFF, or RA files. You can load Microsoft's WMA files if you use the Windows version of the Music Manager. Once there, FLAC, Ogg, and AAC files are transcoded into Google Music's default 320Kbps MP3 format.

How long it takes to upload your music depends on your Internet speed and the size of your music files. In my experience, with mostly 256Kbps-encoded songs and a 60Mbps Internet connection, I was uploading about 100 songs an hour. I just let Music Manager run in the background, and in a few days almost all my collection was up in the cloud. I couldn't upload all my music at first, though. The troublesome songs turned out to be almost entirely ones I'd purchased from iTunes when Apple still encrypted music with DRM. Once I replaced them with copy-protection-free files, I was able to place them in my library.
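Taken at face value, those figures square with each other; here is a quick back-of-the-envelope check in Python, using the article's estimates rather than exact values:

    songs, avg_mb = 20_000, 5
    print(songs * avg_mb / 1_000, "GB")    # -> 100.0 GB, the free allotment

    library, per_hour = 13_000, 100
    print(library / per_hour / 24, "days") # -> about 5.4 days to upload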

You can also upload music from multiple PCs. While I haven't tested this, Google states that "Google Music will automatically combine duplicate albums if the metadata is identical, so you can prevent duplicate albums from appearing." For my friends with messy music collections I can see how that would be really useful.

Unfortunately, unlike iTunes Match, Google Music requires you to upload every last bit of your tunes. It can't confirm that there's a good copy already available in the cloud and simply duplicate that song into your personal library.

You can play your music using a modern Web browser on any PC. Google Chrome, Firefox, Safari, and Internet Explorer 7 and higher are specifically supported, but I was able to play music on Opera as well. You also need to have Adobe Flash Player up and running. If you're not running Google Chrome, you'll also need to have JavaScript enabled.

On Android devices, you can listen to your music with the Google Music app. This requires Android 2.2 and above with OpenGL 2. If your smart-phone doesn't meet those requirements, you can still listen to your music via the Web browser once you have Adobe Flash installed. Curiously, you can also listen to your music on iPhones, iPads, and iPod Touches running iOS 4 or higher via their built-in Web browser, without Flash.

That's actually one of my favorite Google Music features. I don't need to have a particular device or even run a particular program. So long as I have something that runs a Web browser, I've got my music. Nice.

You can play your Google Music tracks on any number of PCs and up to eight Android devices. However, you can only listen to them on one device at a time. So, forget about the idea of 'broadcasting' to friends and family. I've listened to my music for months now on a wide variety of Linux, Windows, and Mac OS X PCs and, on the portable side, on my first-generation iPad and my Motorola Droid II smart-phone, and I have yet to run into any playback trouble.

The music quality, whether I was at home, using a coffee-house Wi-Fi signal, or on Verizon 3G, has always been good. Of course, there were times when I was away from any Internet connection. For those times, I used the Android application to download music to my smart-phone so I was never music-less.

So far, this is great, but Google Music is no Apple iTunes killer. It can't rip music from CDs. You'll still need a local program to get your music off physical media and into your computer.

The Google Music store is also nothing to write home about. The store has the basic features but not a lot of selection on its virtual shelves yet. While Google has signed distribution deals with three of the big four music companies, they couldn't come to an agreement with Warner Music Group. That means you won't be buying Led Zeppelin or Green Day from Google anytime soon. On the other hand, any music you do buy online from Google can easily be downloaded, without DRM and in 320Kbps MP3 format, to your PC with the Music Manager.

Another nice feature about the Google Music store is you can share any music you get via the store with your friends on Google+. People in your Google+ Circles can listen to entire songs or albums.

So, why do I like Google Music so much since it's certainly no iTunes killer and other programs offer similar services?

Well, I like it because 1) it's free; 2) it offers an incredible amount of music storage; and 3) it lets me listen to my music anywhere I go. It also doesn't hurt any that, unlike many of the smaller musical storage lockers, I'm reasonably sure Google will still be here next year.

Sure, Internet streaming music services like Spotify and Pandora are great in their own right, but if you want free access to your music whenever and wherever you want it on any device, you can't beat Google Music.

BitTorrent Remote for Android Now Available for Download

Excerpted from Redmond Pie Report by Ben Reid

Torrenting is one of the primary ways Internet users snatch their digital content. While most of the media available on the top public sites falls on the wrong side of the law in terms of copyright infringement, the actual act of downloading and seeding torrents is perfectly lawful and accounts for a high portion of bandwidth use.

There are many hundreds of private sites, in which individuals work more as a unit, often donating towards server costs and getting rewarded with ratio perks. The better ones can be rather difficult to get invited into, with others virtually impossible - but once you do get in, it can be like finding treasure in a cave.

Unfortunately, there hasn't as yet been a decent torrent app to grace our mobile devices. With many enjoying fast, unlimited 3G (and often 4G-LTE) connections, it would be useful if we could capitalize on them with a little on-the-fly torrenting.

Until that day comes, we have to make do with so-called middleman applications, which allow us to monitor the progress of our downloads and uploads from the comfort of our smart-phone devices. Most of these mediating apps are developed by third parties, but the latest - built for Android users - comes from BitTorrent itself, for the company's official torrent client.

BitTorrent Remote gives users the ability to manage, start/stop, and control torrent downloads on-the-go via their smart-phone. It also provides users with the option of viewing detailed info regarding each download, such as speed, file size, and ETA, so there's something there for the statisticians, too.

Users can add torrents to the app through web pages in browsers, and they will begin downloading to the linked computer.

Early ratings on the Android Market suggest that this app is indeed the real deal, with 12 of the 13 early adopters giving a five-star rating.

It's free of charge, and can be downloaded via the Android Market.

AT&T Launches Platform-as-a-Service for App Development

Excerpted from Telecompetitor Report

We've all done it at some point. You're at work, doing the same-old-same-old, and thinking certain processes could probably be made easier and simpler through technology. Now you can build a cloud-based app for that - without having to write a single line of code - and make it accessible from any device, all at the click of a mouse.

AT&T Platform-as-a-Service (PaaS) allows any business professional to build, develop, and deploy cloud-based business apps without the need for complex coding expertise, while also providing a robust platform for true developers.

"AT&T Platform as a Service is like rocket fuel for developing cloud apps," said Steve Caniano, Vice President, Hosting and Cloud Services, AT&T Business Solutions. "This is another step in our commitment to helping businesses deliver cloud-based solutions, and we are unique in our ability to surround the ease of use of Platform as a Service with the flexibility, reliability. and security of the AT&T global network."

With the announcement, AT&T becomes the only telecommunications service provider to offer truly enterprise-grade Platform as a Service capabilities in the US, pursuing a piece of the cloud computing market that independent research firm Forrester Research Inc. estimates will grow from $0.8 billion this year to $12.15 billion by 2018.

Key features of AT&T Platform as a Service include a complete cloud-based development and deployment platform; web tools and customizable templates for software development, including a library of 50 pre-built apps which can be customized or used as-is; a high-performance, redundant and scalable infrastructure to run online applications and databases; development tools to mobilize applications; integrated social networking features; 7x24x365 infrastructure monitoring, management, and support; and a monthly per-user fee for access to AT&T Platform as a Service applications.

Independent software vendors, corporate line-of-business leaders, and information technology departments are typical customers for PaaS capabilities. AT&T Platform as a Service offers many benefits to application developers.

Line-of-business managers can easily create, distribute, and manage new enterprise-grade applications across an entire user base without causing application slow-down or downtime. Independent software vendors can accelerate the time-to-market of their apps. And enterprise developers can consolidate many back-office applications onto a single, fully-managed, secure environment that allows them to be accessible by PCs or mobile devices.

"We've been able to radically accelerate our app development and delivery using AT&T Platform as a Service," said Juan Perez, CEO, ekeepo, an independent software company specializing in delivering cloud apps for businesses.

AT&T's Platform as a Service capability is integrated with AT&T's network-based cloud to offer a complete enterprise-grade package, allowing application developers to build business apps that can take advantage of the scale, performance, capacity, security and reliability of the AT&T global network.

Built with ease of use in mind, AT&T PaaS is based on LongJump's cloud technology platform, which has been recognized by Forrester Research in its evaluations of Platform as a Service providers. The AT&T solution is billed monthly on a per-user basis, and being cloud-based means it can scale up or down depending on demand - and be billed accordingly.

"Platforms for application development and deployment are usually either highly productive and narrow in scope, or challenging to use but able to address complex activities," said Stephen D. Hendrick, Group Vice President for application development and deployment research at IDC. "Vendors that can provide the best of both worlds, by combining comprehensive enterprise class application development capabilities, simplified management & lifecycle support, and a secure reliable network will find success in the market. AT&T appears well positioned to address these emerging Platform as a Service needs."

AT&T plans to make it easy for developers to use its application programming interfaces (APIs) through this offer, allowing them to further innovate with AT&T technologies.

Google, Microsoft, Intel, Verizon among New Cloud-Security Registry Members

Excerpted from Network World Report by Tim Greene

Google, Verizon, Intel, McAfee, and Microsoft are joining a voluntary program set up by the Cloud Security Alliance that provides public information about whether contributors comply with CSA-recommended cloud-security practices.

By reading reports submitted to CSA's Security Trust and Assurance Registry (STAR), potential customers of participating providers can more readily assess whether products and services meet their security needs.

To broaden participation, CSA is encouraging businesses to require that any cloud vendors they deal with submit reports to CSA STAR.

For example, eBay is requiring submissions from all cloud vendors it works with, says the company's CISO, Dave Cullinane. He says the information will help eBay protect its security and its customers' privacy. Similarly, Sallie Mae will look for cloud vendors to demonstrate their security via CSA STAR filings.

CSA STAR lets participants file self-assessment reports about whether they comply with CSA best practices. The registry will also list vendors whose governance, risk management, and compliance (GRC) wares take the CSA STAR reports into account when determining compliance. The idea is that customers will be able to extend GRC monitoring and assessment to their cloud providers, the CSA says.

Google, Microsoft, and Verizon will submit information about their services, and Intel and McAfee will file reports about security products.

CSA announced the keystone participants in its STAR program at CSA Congress 2011 in Orlando, FL, this week.

CSA also announced it is extending its scrutiny to cloud-based security service providers - businesses that offer security services from cloud platforms.

Customer concerns with security as a service include: systems might not be locked down properly; personnel might not be vetted thoroughly; data might leak among virtual machines within multi-tenant environments; and cloud-based security services might not meet compliance standards.

"When deploying Security as a Service in a highly regulated industry or environment," says the CSA's latest Guidance for Critical Areas of Focus in Cloud Computing, "agreement on the metrics defining the service level required to achieve regulatory objectives should be negotiated in parallel with the SLA documents defining service."

These cloud-based security services are wide-ranging and include identity and access management, data loss protection, Web and email security, encryption, and intrusion prevention, CSA says.


Why SOPA Would Hurt Start-Ups, Not Pirates

Excerpted from Inc. Report by Lindsay Blakely

It's not at all clear that the bill would even succeed in catching online piracy. But it could significantly harm innovation on the web.

What's currently being called the Stop Online Piracy Act (SOPA) may very well acquire a new name if it manages to come to a vote and pass: The Bill That Broke the Internet.

And tech start-ups - as well as innovation on the web in general - are going to be the real losers in the end, say SOPA's biggest critics. I caught up with Corynne McSherry, intellectual property director for the Electronic Frontier Foundation (EFF), an organization that's vehemently opposed to the bill, to get her take on why it puts online businesses and tech start-ups in the crosshairs.

Inc: Why do you see this bill as dangerous?

McSherry: I've seen some bad bills introduced in Congress, but this one takes the cake. What it essentially does is set up a system where any intellectual property rights holder - and that [term] is not defined well - can identify a portion of a site that is enabling, facilitating, or taking steps to avoid confirming [copyright] infringement, whatever that means. Then they can send notice to payment processors and ad networks alerting them that the site is doing bad stuff. The processors and networks have five days before they must cut off working with the website.

Inc: Then what?

McSherry: The website has five days to figure out how to respond before it's cut off. If you're YouTube, you have the resources, lawyers, and wherewithal to fight back. Now imagine you're an everyday start-up that's just getting off the ground. It doesn't want to spend its money on lawyers; it needs to spend its money developing the technology that's going to make it succeed.

Inc: You mentioned that the vagueness of the bill is particularly troubling.

McSherry: Sometimes people who don't have the IP rights in question tell a website to take something down because they don't like content. Or maybe the person is a rights holder but doesn't know the rules of fair use. So even if it's an illegitimate claim, the content comes down and usually doesn't get put back up for two weeks. This is an abuse of the takedown provisions in the current law. I have every reason to believe there would be more abuse with SOPA.

Inc: Could Congress rewrite this bill so that it protects against piracy and counterfeiting without hurting start-ups?

McSherry: I don't think this is a problem that is going to be fixed by legislation. It's not going to be addressed in courts. Past history should tell us this. For well over a decade the recording industry has been suing every file sharing site in existence. Has that stopped file sharing? Of course not.

Inc: So what's to be done about the issue of online piracy and counterfeiting?

McSherry: To be honest, I tend to think we have everything in place that we need. It may be that we all have to accept that just as there is shoplifting in the world, there will be a certain amount of piracy. It's the price of doing business.

If you're a music or a video fan, you have more access to creative content than ever before. If you're a creator - and we're really talking about big media here - you have more ways to get content out than ever before. That's what we should be focusing on and supporting, not damaging the YouTube of tomorrow. That would be a really bad trade-off.

The Essence of Cloud

Excerpted from E-Commerce Times Report by Dick Benton

The industry roils with definitions and explanations of cloud. These definitions come from product vendors cloud-washing their products, cloud providers positioning their cloud infrastructure, IT teams attempting to cloud-paint their efforts in virtualization, and even from consultants writing articles like this.

This article will examine the essence of cloud and will provide a solid working foundation against which progress toward the cloud can be objectively assessed.

Cloud computing is essentially a deployment model that sets a new paradigm for how services are selected, provided, and billed. The service consumers are typically computer-literate business units, application developers, and IT capacity planners.

Technologies that make cloud computing possible include: Internet access, a shared pool of virtualized resources, and the ability to support an elastic services pool that can be turned on and off depending on capacity demand (this is achieved through a combination of capacity planning and technology).

So, what makes the cloud deployment model different from other deployment models (i.e., the platform-oriented or service provider models)? Quite simply, the cloud is defined by the following three components: 1) self-selection of services, 2) automated provisioning of those services, and 3) billing for services ordered.

These deployment components are made possible by three key enabling technologies: 1) the web interface, 2) virtualized shared resources, and 3) the ability to support elastic demand.
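Here is a minimal sketch, in Python, of how those three components fit together in a self-service ordering flow. The catalog, the prices, and the provisioning stub are all invented for illustration; a real implementation would sit behind a web interface and drive actual orchestration.

    CATALOG = {"small-vm": 0.05, "large-vm": 0.20, "storage-gb": 0.001}  # $/hour

    ledger = []  # (consumer, item, qty, hourly_cost) - feeds the invoice

    def provision(item, qty):
        print(f"provisioned {qty} x {item}")  # stand-in for real orchestration

    def order(consumer, item, qty):
        """Self-selection, automated provisioning, and billing per order."""
        if item not in CATALOG:
            raise ValueError(f"not in service catalog: {item}")
        provision(item, qty)  # no approval chain, no ticket queue
        cost = CATALOG[item] * qty
        ledger.append((consumer, item, qty, cost))
        return cost

    order("marketing", "small-vm", 3)
    order("marketing", "storage-gb", 500)
    print(round(sum(c for *_, c in ledger), 2), "$/hour billed")  # -> 0.65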

Historically, there existed two deployment models: platform-oriented and service provider. The first generation was the legacy deployment model, where services were delivered on a specific platform and tied to a nominated piece of hardware. This first-generation model required each service request to pass through a chain of multiple approvals, and it inevitably resulted in considerable backlog. As a result, each application, and its associated infrastructure, needed individual care and attention.

Success with this model was achieved only with considerable administrative effort, and it came with extensive backlogs and little opportunity for consolidation or repeatability improvement. In this scenario, costs were known only at the grossest level of total expense and were allocated annually as a budget overhead. Additionally, the concepts of elasticity and a virtualized or shared environment did not yet exist, while Internet access to administrative functionality was rare or non-existent.

The second-generation deployment model was delivered by the ITIL-based service provider model. This was structured as tiers of service offered through a service catalog, with results delivered under the guarantees of a service level agreement. Within each tier of service, all infrastructure components had to meet quality-of-service needs, including performance, recoverability, and availability. As for pricing, this model allowed these functions to be performed at a determined unit cost per GB or per GHz.

The service provider model has significant advantages over the legacy application- or platform-oriented approach because it forces a rationalization to a small number of tiers (typically three to five) differentiated by business-aligned service attributes and delivered at a known and visible unit cost. This reduces the administrative burden on IT from managing information across multiple platforms to managing three to five tiers, while also encouraging cost-conscious decision making on the part of the consumer.

Alas, the provisioning effort with this model ranges from days to weeks because of the multiple handoffs required for the approval process. This process can be further complicated if there is a need to acquire additional new resources. The second-generation deployment model offered businesses the ability to consciously select services by cost but failed to provide a discrete invoice for those services. The service provider model implemented virtualized storage and servers as tiers of service; however, elasticity was only available through thin provisioning.

Administrative technology made reclaiming resources and returning them to an available pool too complex for most organizations. Elasticity was achieved to some extent through the use of thin provisioning, but it was still a difficult task to release and reclaim unused resources.

This takes us to our third and current deployment model - the cloud. Cloud computing is distinguished by three critical consumer-facing criteria.

First, the cloud deployment model assumes that the service consumer is competent to select the appropriate services and has the money to pay for them. Neither assumption holds true in the legacy first-generation deployment model.

In the second-generation service provider model, both these assumptions may be true, but more often the second-generation deployment model will include a significant "authorization" process because resources are finite and require consumption audits. Note that under the cloud deployment model, the service selection process may include some automated policy enforcement to replace the legacy authorization process.

The second criterion of the cloud deployment model is the concept of auto-provisioning. This at once eliminates both the overheads and delays of procedural approval and those of technical configuration, allowing for considerable labor cost savings.

Perhaps most importantly, consumer satisfaction soars as requested resources are provided almost immediately following the request or, at worst, the same day. Gone are the days of IT departments saying "no" or "that's going to be difficult." Now IT says "YES, and here's what it will cost." With the cloud, IT is no longer the denier of services but rather the enabler.

The third key element of the cloud deployment model is the requirement of formal billing (and collection) of monies. With first-generation deployments, platform and application costs (especially at the unit level) were generally unknown. Costs for services were often an annual transaction seen as overhead at budget time.

Under the second-generation deployment of the service provider model, costs were usually (but not always) visible at the time of service selection, and only in rare instances were they charged back at the service order level. Typically, second-generation models saw a similar approach to first-generation deployments with an annual budgetary cross charge of "overhead."

Although the cloud is a big step forward, this model also has interesting issues, particularly when it comes to internal or private implementations. Once a consumer realizes that storage can be ordered in any quantity, they may not only order a generous helping, but they might double or triple the order just in case someone else jumps in and uses up what is left, leaving nothing for others to use.

This is a perfect example of why billing or chargeback is mission-critical to the cloud deployment model and why it is critical for this billing/chargeback to occur at the individual consumer level. Without this component, all resources could disappear overnight from a private cloud, and in the public cloud, a nasty surprise may await the CFO when the monthly bill is received.
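Here is a minimal sketch of what that consumer-level chargeback might look like; the usage records and the per-GB rate below are invented for illustration:

    from collections import defaultdict

    RATE_PER_GB_MONTH = 0.10  # hypothetical internal rate, $ per GB-month
    usage = [("team-a", 500), ("team-b", 4000), ("team-a", 250)]  # (consumer, GB)

    bill = defaultdict(float)
    for consumer, gb in usage:
        bill[consumer] += gb * RATE_PER_GB_MONTH

    for consumer, amount in sorted(bill.items()):
        # A visible monthly figure per consumer is what deters over-ordering.
        print(f"{consumer}: ${amount:.2f}/month")  # team-a: $75.00, team-b: $400.00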

As IT struggles to do more with less, the server virtualization arena is often seen as an opportunity to consolidate servers and reduce the number of hard entities under management, while showing senior management their commitment to "state of the art" cloud computing.

It is not unusual for an IT team to define their virtualized environment as an internal (or private) cloud. Now, it is absolutely true that virtualization of platforms is an essential technology enabler of the cloud deployment model.

Without the ability to pool and share resources, the cloud model is challenging to the point of impossible to implement; however, virtualized platforms are not the "essence" of cloud computing, but rather the enabling technology.

The essence of the cloud is the ability of the infrastructure to offer the consumer an automated capability to self-select the service/resource they want, to have that service/resource made available almost immediately, and to have those services billed to the consumer through a classic invoicing function.

The enabling technologies that support the cloud deployment model include Internet access/capability for service selection and service delivery monitoring, virtualization of the compute and storage environment to support shared resources, and the capacity planning skills and technology to support elastic resource allocations.

By understanding the critical components of the cloud and its enabling technologies, we see how the deployment strategies used in the first- and second-generation models have changed. With this understanding, IT and consumers are better prepared with a more realistic expectation for the transition ahead.

Coming Events of Interest

2012 International Consumer Electronics Show (CES) - January 10th-13th in Las Vegas, NV. With more than four decades of success, the International CES reaches across global markets, connects the industry and enables CE innovations to grow and thrive. This is the world's largest consumer technology tradeshow. 

CONTENT IN THE CLOUD at CES - January 11th in Las Vegas, NV. Gain a deeper understanding of the impact of cloud-delivered content on specific segments and industries, including consumers, telecom, media, and CE manufacturers.

2012 NAB Show - April 14th-19th in Las Vegas, NV. From Broadcasting to Broader-casting, the NAB Show has evolved over the last eight decades to continually lead this ever-changing industry. From creation to consumption, the NAB Show has proudly served as the incubator for excellence helping to breathe life into content everywhere. 

CLOUD COMPUTING CONFERENCE at NAB - April 16th in Las Vegas, NV. Don't miss this full-day conference focusing on the impact of cloud computing solutions on all aspects of production, storage, and delivery of television programming and video.

Copyright 2008 Distributed Computing Industry Association