Distributed Computing Industry
Weekly Newsletter


June 28, 2010
Volume XXXI, Issue 4


Judge Dismisses $1 Billion Copyright Suit Against YouTube

Excerpted from Washington Post Report by Cecilia Kang

A federal judge dismissed Viacom's $1 billion copyright infringement lawsuit against YouTube, after a three-year battle that raised questions about how websites can use and share original content.

Judge Louis Stanton, of the US District Court for the Southern District of New York, ruled in favor of Google, which owns YouTube, saying a "safe harbor" in the Digital Millennium Copyright Act (DMCA) protected the search giant because the firm immediately took down videos owned by Viacom when the clips were discovered.

"When they received specific notice that a particular item infringed a copyright, they swiftly removed it," Stanton wrote in his summary judgment order released Wednesday. "It is uncontroverted that all the clips in suit are off the YouTube website, most having been removed in response to DMCA takedown notices."

Those actions protected Google from liability for copyright violations, the judge said. But Viacom said in a statement that it will appeal the case.

"We believe that this ruling by the lower court is fundamentally flawed and contrary to the language of the Digital Millennium Copyright Act," Viacom said in its statement. "The intent of Congress, and the views of the Supreme Court as expressed in its most recent decisions. We intend to seek to have these issues before the U.S. Court of Appeals for the Second Circuit as soon as possible.

Google wrote in a blog posting that the ruling was an "important victory, not just for us, but for the billions of people around the world who use the Web to communicate and share experiences with each other."

"The decision follows established judicial consensus that online services like YouTube are protected when they work cooperatively with copyright holders to help them manage their rights online," Google said in its blog posting.

Google and other Web companies such as Facebook, Yahoo, and IAC/Interactive sided with YouTube and filed friend-of-the-court briefs in the case, asking the judge to dismiss the lawsuit. Those firms had much at stake in the case, as their websites use and exchange a wide range of content, including news articles, videos, and photos.

But content publishers such as Viacom, which owns MTV and Paramount Pictures, have argued that they lose money on their programming when users exchange that content freely without advertising and other fees going to the production houses.

Public Knowledge, a public advocacy group, said the decision strikes a good balance for content and Web services companies.

"The burden to point out allegations of infringement is with the content provider, and the burden of taking down material lies with the service provider," said Sherwin Siy, Deputy Legal Director of Public Knowledge.

"Had Viacom won this case, that burden would have shifted dramatically. As the law now stands, prompt compliance with take-down notices shields an online service provider from liability."

EFF's Fred von Lohmann Heads to Google

Excerpted from CNET Report by Greg Sandoval

Fred von Lohmann, likely the technology sector's most recognized legal advocate, has called it quits as senior staff attorney for the Electronic Frontier Foundation (EFF). One of Grokster's lead attorneys in the landmark MGM v. Grokster case, von Lohmann confirmed he is leaving EFF to take a job as Google's senior copyright counsel.

If you're a fan of unimpeded innovation, the free distribution of content over the web, and Internet users' right to privacy, then you should take your hat off to von Lohmann. He has toiled to prevent tech start-ups accused of copyright violations from being stomped into jelly by mammoth entertainment conglomerates.

A supporter of the free flow of information, von Lohmann, 42, has spoken out on behalf of or offered legal advice to a score of companies, including YouTube, Veoh, TorrentSpy, LimeWire, IsoHunt, and RealNetworks.

In a 2007 CNET story about the growing number of file-sharing services forced to close down, von Lohmann sized up how technologists viewed the copyright clash between them and content creators and the stakes involved.

"Everybody forgets that when the VCR was first developed, most of the uses were infringing copyright," said von Lohmann, a past recipient of California's Lawyer of the Year award. "There was no Blockbuster video rental store or legitimate way to rent movies back then."

Later, the VCR would become the foundation of the multi-billion dollar home-video industry, an industry that wouldn't have existed if the studios had succeeded in killing the device. "It's vital to leave room for innovation," von Lohmann said. "You have to give technology a chance to develop into something."

In his backing of cases that typically faced long odds, von Lohmann's competence as a lawyer and reasonable demeanor have stood out in a copyright debate that grows more mean-spirited and acrimonious by the day.

Jonathan Zittrain, a Harvard law professor and Co-Director of Harvard's Berkman Center for Internet & Society, said von Lohmann reminds him of the fictional Dr. Seuss character, The Lorax, a defender of the environment.

It's like "'I am the Lorax and I speak for the trees,'" Zittrain said. "To me, Fred is somebody who has been in the trenches as a litigator and that means you must take views and stick with them to do battle. Yet, I don't know him as ideologically inflexible.

"It's rare to see somebody in the trenches that long who still has the flexibility to say, 'What is the right answer here?'" Zittrain continued. "That's why those who may have had interests implicated by EFF policies and positions may have had reason to fear him but not consider him a foe."

New Cloud Computing White Papers Available

DCINFO readers are encouraged to read these two new free white papers about cloud computing.

First, please download this Focus Brief to understand the top ten trends in cloud computing so your business can take advantage.

Cloud computing is just like the weather; it blows this way and that, and no one really knows exactly where it's going or what the cloud will cover. But looking into a murky crystal ball, one can see trends in cloud computing that deserve attention.

Cloud computing may be ill-defined, but it's here to stay and it's having important effects upon the competitive landscape. Businesses small and large should understand the cloud's impact and how it can be leveraged.

Second, please download Securing the Physical, Virtual, Cloud Continuum.

As cloud technology continues to evolve, there is a strong need for the evolution of security practices in the space.

Security and compliance concerns are holding back the adoption of external cloud computing options in most industries. Trust is a key concern, and an overhaul of security practices to encompass the physical, virtual, and cloud continuum is now needed like never before.

This free Nemertes Research white paper, sponsored by Juniper Networks, details the issues and provides recommendations for this growing movement.

Report from CEO Marty Lafferty

This week's summary judgment in favor of YouTube, in the three-year billion-dollar copyright infringement lawsuit brought against it by Viacom, underscores more than anything the growing urgency for the entertainment marketplace to adapt to the realities of the Internet.

The court basically found that YouTube is protected against such claims by the safe harbor of the Digital Millennium Copyright Act (DMCA). The decision follows established judicial consensus that online services are protected when they comply with copyright holders' takedown notices. But it does nothing to address the issue of lost value.

This is not meant to assign blame. The three private sector constituencies that have not yet solved the underlying problems here, in more than a decade, have no precedent upon which to base an easy fix: the entertainment industry, the telecommunications industry, and the software industry.

If this were as easy as recognizing that the VCR could be used for a new revenue stream, movie rentals, major players wouldn't find themselves in the increasingly divisive situation they are in today.

Viacom has already said it will appeal the decision, which may take years to reach a final conclusion given the entertainment conglomerate's own deep pockets and those of YouTube's parent company Google.

The entertainment industry will also lobby Congress to close what content providers will now portray as a gaping loophole in the current DMCA, which led Judge Louis Stanton to reach his conclusion.

But this first way of addressing the problem, litigation, won't solve it - as we should have learned from MGM v. Grokster, mass consumer lawsuits, and other related legal actions. And the second way, legislation, won't work either, as we should have learned from countless associated efforts in that realm.

The inadequacy of the DMCA to reflect the reality of today's broadband networks, Internet-based software applications, and digitized entertainment content cannot be corrected by redrafting it; any new bill intended to accomplish this would likely be outdated and unenforceable before its ink had dried.

To borrow a phrase adopted by FCC Chairman Julius Genachowski in the ongoing net neutrality debate, what we need is a "third way." In this case, that way is licensing.

A leading US Senator commented after the Congressional hearing at which then-MetaMachine CEO Sam Yagan announced that he was abandoning efforts to legitimize eDonkey after being rebuffed by the entertainment industry: the phenomenon of consumers sharing media content online is not going to go away. The way forward for all parties has to be one where this behavior is accepted and monetized.

Judge Stanton rightly concluded that Congress intended for the DMCA safe harbors to cover YouTube's consumer-based functionality:

"The tenor of the foregoing provisions is that the phrases 'actual knowledge that the material or an activity' is infringing, and 'facts or circumstances' indicating infringing activity, describe knowledge of specific and identifiable infringements of particular individual items.

Mere knowledge of prevalence of such activity in general is not enough. That is consistent with an area of the law devoted to protection of distinctive individual works, not of libraries.

To let knowledge of a generalized practice of infringement in the industry, or of a proclivity of users to post infringing materials, impose responsibility on service providers to discover which of their users' postings infringe a copyright would contravene the structure and operation of the DMCA."

And as he continued, this is quite logical:

"That makes sense, as the infringing works in suit may be a small fraction of millions of works posted by others on the service's platform, whose provider cannot by inspection determine whether the use has been licensed by the owner, or whether its posting is a 'fair use' of the material, or even whether its copyright owner or licensee objects to its posting.

The DMCA is explicit: it shall not be construed to condition 'safe harbor' protection on a service provider monitoring its service or affirmatively seeking facts indicating infringing activity."

And moreover, the DMCA as enacted is working as intended in this instance:

"Indeed, the present case shows that the DMCA notification regime works efficiently: when Viacom over a period of months accumulated some 100,000 videos and then sent one mass take-down notice on February 2, 2007, by the next business day YouTube had removed virtually all of them."

It can be predicted that the entertainment industry will now take issue with the fact that the burden falls on rights holders to identify infringing properties and to provide notice to service providers. This will be deemed not good enough given today's mass distribution capabilities on broadband networks.

Rather than engaging in a new round of whack-a-mole by trying to encompass general awareness of infringement as actionable for contributory or vicarious infringement, progress would be better served by finally devising a way to license the use of media in social networking and via file-sharing applications.

In many ways, super-distribution, where users themselves contribute to the expansion of the delivery channel for a given copyrighted work, should be accepted and even encouraged as a good thing rather than prevented.

The two problems that should be attacked in order to make this desirable are: 1) YouTube, at its inception, had no possible way to obtain the necessary but non-existent licenses to enable its users to redistribute material from MTV Networks and other Viacom media platforms; and 2) Viacom and YouTube had no means to monetize this redistribution in a way acceptable to all participants in the distribution value chain.

Marketplace solutions to these two challenges would go a long way in advancing online delivery of media and pre-empting the expected rush back to more litigation and legislation.

If you are a rights holder, you need to set terms-and-conditions for digital distribution of your works. These can reflect volume discounts for major resellers and a myriad of unique circumstances, but what is no longer acceptable as a practical matter is to have no license available at all.

If you are an Internet-based service that can facilitate the copying and/or transmission of copyrighted works, you need to provide means for monetizing this material for rights holders and you must obtain the relevant license.

The music industry (notwithstanding contrary rumblings from BPI and IFPI this week) should be commended for its exemplary work in this regard with YouTube. 

Major labels did establish blanket licensing for the use of music in videos submitted by YouTube users and provided information for a filtering mechanism YouTube designed to identify and address music included in uploaded videos. This now needs to be taken further for smaller players in the music marketplace, as well as expanded to video.

That's where the emphasis should be going forward. That's where resources should be deployed rather than wasting them on futile efforts to enforce outdated business models. Share wisely, and take care.

Fast Changing Consumer Behavior Forcing New Business Models

According to PricewaterhouseCoopers' (PwC) Global Entertainment and Media (E&M) Outlook: 2010-2014, global entertainment and media spending is expected to rise from $1.3 trillion to $1.7 trillion by 2014, growing at a compound annual growth rate (CAGR) of 5.0%.

The US E&M market is expected to grow at a 3.8% CAGR, reaching $517 billion in 2014, up from $428 billion in 2009. Fast-changing consumer behavior is expected to be the catalyst of change in the entertainment and media industry over the next five years.

Ken Sharkey, US Leader, Entertainment, Media & Communications practice, PwC, notes that, "The digital pace of change has proven to be even quicker than anticipated with consumers embracing new media and digital downloads at often-unexpected speeds. The continued fragmentation of the E&M sector will fuel greater experimentation by both established industry giants and niche players."

Digital services continue to be the primary growth engine, but traditional revenue streams are expected to remain significantly larger throughout the forecast period. The industry will need to embrace digital not as a competitor to traditional services, but as a complement. Digital spending in the US is expected to account for 26% of all E&M spending in 2014, up from 19% in 2009.

While there are signs of a rebound, advertising is unlikely to return to former levels. By 2014, US advertising spend is expected to still be 9% below its 2007 level. Overall US advertising is expected to increase at a 2.6% CAGR, from $159 billion in 2009 to $180 billion in 2014. In the US, Internet advertising is expected to surpass newspaper advertising spend in 2010.
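As a quick arithmetic check, the standard CAGR formula can be applied to the dollar figures quoted above. The sketch below is illustrative only; because the reported figures are rounded, the recomputed rates differ slightly from PwC's published ones.

# Minimal sketch: recompute compound annual growth rates (CAGR) from the
# rounded 2009 and 2014 spending figures cited in the PwC summary above.

def cagr(start, end, years):
    """Compound annual growth rate between a start and an end value."""
    return (end / start) ** (1 / years) - 1

figures = {
    "Global E&M spending ($ trillions)": (1.3, 1.7),
    "US E&M spending ($ billions)": (428, 517),
    "US advertising ($ billions)": (159, 180),
}

for label, (start, end) in figures.items():
    print(f"{label}: {cagr(start, end, 5):.1%} CAGR, 2009-2014")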

Advertising spending for Internet, television, radio, out-of-home, and videogames is expected to be larger in 2014 than in 2009, while consumer magazines, newspapers, directories and trade magazines are expected to be smaller.

These projections reflect the market fragmentation and consumer behavioral changes. The advertising industry is responding to consumers' shifting attention and migrating towards total marketing or total brand communication. Brands are changing their focus from advertising on a medium, to marketing through, and with, content.

Consumer feedback and usage provide the only reliable guide to the commercial viability of products and services, and the global consumer base is being used as a test-bed for new offerings and consumption models. PwC has identified three themes that are expected to emerge from changing consumer behavior, and the industry must anticipate and pre-empt the needs and wants of consumers:

1) Rising power of mobility and devices: Advances in technology are expected to see increasingly converged, multi-functional mobile devices come of age as a consumption platform by the end of 2011. By 2014, US mobile Internet access subscribers are projected to increase to 96.1 million, a 40% CAGR from 2009.

2) Growing dominance of Internet experience over all content consumption: Increasingly, the consumer has moved beyond thinking of the Internet as an end in itself, and expects all forms of media to embed the convenience, immediacy and interactivity of the Internet. People are already consuming magazines and newspapers on Internet-enabled tablets, and streaming personalized music services in preference to buying physical CDs.

3) Increasing engagement and readiness to pay for content, driven by improved consumption experiences and convenience: Consumers are more willing to pay for content when it is accompanied by convenience, flexibility in usage, personalization, and a differentiated experience that cannot be created elsewhere. Local relevance is also expected to enhance content providers' ability to charge.

Digital migration and consumer behavior changes have put extreme pressure on existing business models. The proliferation of platforms and rising consumer expectations mean companies can no longer "be everything."

"The industry must radically rethink its approach to monetizing content in capturing new revenue sources, from transactions or from participation with others operating in the evolving digital value chain," added Sharkey.

Former NBC TV Prez Named Pace Business Dean

Excerpted from Media Daily News Report

Neil Braun, former President of the NBC Television Network and CEO of Viacom Entertainment, has been appointed dean of Pace University's Lubin School of Business. He takes over from Joseph Baczko, former CEO of Blockbuster, who is returning to private industry.

Braun is currently CEO of The CarbonNeutral Company, which assists companies in analyzing and reducing their carbon footprints.

Pace is committed to teaching and practicing sustainability; it also has a strong environmental law program.

Everyone Wins When Cloud Computing Meets the Channel

Excerpted from GigaOM Report by Derrick Harris

Selling cloud computing - especially of the externally hosted variety - to established businesses is no easy feat. They understand the potential benefits, but they've just spent years and possibly large sums of money on virtualization efforts, and they have their own specific problems that aren't easily addressed by one-size-fits-all cloud offerings.

In order to remedy such sales obstacles, many cloud companies are turning to channel partners - systems integrators, resellers, telcos, and the like. As trusted faces for many businesses, these partners can offer personalized service that many cloud providers cannot.

One provider making no secret of its channel strategy is OpSource. The company said this week that it's appointed telco veteran Keao Caindec as Senior VP and CMO, with the hopes that he'll use his experience to make that channel a major distributor of OpSource's cloud services.

Indeed, the combination of telco experience serving enterprise IT with OpSource's enterprise-grade cloud-computing and software-as-a-service (SaaS) hosting platforms is a near-perfect match. And Caindec said OpSource expects that going forward, more than half its business will be driven by channel partners.

Even cloud vendors are getting into the channel game. In a recent conversation, Elastra's Stu Charlton said that the company is seeing a large degree of interest from systems integrators. The rationale is quite simple: they want a solution that lets them test customers' applications on the cheap, or that lets them port legacy applications to the cloud, and Elastra provides just such a solution with Cloud Server.

Recently, IT distribution giant Ingram Micro launched its Cloud Conduit program, which gives its channel customers the ability in turn to sell cloud services to their customers. Proving that the biggest names in cloud computing also see the value of channel partnerships, the three initial partners in Ingram Micro's program are Amazon Web Services, Rackspace Hosting, and Salesforce.com.

All in all, cloud computing channel partnerships appear to be a win-win-win arrangement for all parties involved. Resellers get in on a hot market without making huge investments. OpSource's Caindec mentioned one telco partner that believed it lost several million dollars in the last quarter because it lacked a cloud offering. Cloud providers increase the visibility and attractiveness of their offerings. And end-users get cheaper information technology (IT) without having to learn the intricacies of developing or managing applications to run in the cloud.

For more on this burgeoning cloud ecosystem, including who might end up losing as a result of it, read the full post here.

BitTorrent Client uTorrent 3.0 Alpha Released 

Excerpted from Ghacks Technology News Report

The BitTorrent client uTorrent made a huge leap forward this week with the release of the first public uTorrent 3.0 alpha build. This continues the release policy of the developers who have always offered access to both stable and development builds of the popular torrent downloader.

The latest stable version offered on the uTorrent homepage is 2.02; the latest official beta is 2.03.

uTorrent 3.0 Alpha replaces the latest uTorrent 2.x development release.

The release notes of uTorrent 3.0 Alpha include bug fixes, changes, and the ability to disable UDP trackers.

The latest alpha of uTorrent 3.0 is available for download at the uTorrent forum. Users who have worked with development releases before will receive an automatic update notification in the BitTorrent client, allowing them to install the new version directly.

BT Conferencing Selects Kontiki Video Platform

BT Conferencing, a leading global provider of audio, video, and web collaboration services, has announced an extension of its unified communications portfolio through the introduction of Kontiki's enterprise video solutions, including live video webcasting and video-on-demand (VoD) services.

Available immediately as part of BT Conferencing's offerings, the new service means that even the largest multi-national organizations can quickly and cost effectively send broadcast quality video content to their employees' PCs regardless of which network they are on. Specific uses include the following:

CEO all-hands town hall meetings: Live video webcasting that reaches all employees for company-wide events, without requiring any network infrastructure upgrade.

Turning telepresence into "global presence": Extends investments in telepresence as well as desktop video conferencing systems, driving up viewership by giving employees the ability to view the conference as it is happening or the flexibility to watch it on demand at a more convenient time.

Any live video broadcast to employees: For business unit executives, product managers, HR, or the training department who need to reach a large audience of employees using video.

Aaron McCormack, CEO of BT Conferencing, said, "Video is becoming a 'must have' tool for business communication programs, and the new BT solution offers a natural step for customers who are already using collaboration services and want to add or expand the use of live video webcasting or video on demand to their company-wide events."

BT's unified communications portfolio is designed to provide users with real-time access to all forms of communication and collaboration services from multiple devices.

McCormack continued, "With this latest offering, BT Conferencing becomes the only player in the market that can provide this complete unified communications portfolio. Our customers are able to communicate instantly and uniformly to their employees around the world with high-quality video. This includes small offices and home workers who are an increasing percentage of the workforce and are often unable to take part in town hall and all-hands meetings."

Kontiki's Enterprise Video Platform delivers business video every day to hundreds of thousands of global workers in many of the world's largest companies in the financial services, retail, technology, telecommunications, and manufacturing market sectors. Kontiki's patented software-as-a-service (SaaS) based Enterprise Content Delivery Networking (CDN) platform deploys in days and scales to all employees while enabling enterprise-grade management and control.

In addition to enabling live video webcasting and VoD, Kontiki's platform also leverages corporate investments in video content, making it available via a next-generation, consumer-style video community for searchable and user-generated content with familiar social networking capabilities.

Eric Armstrong, President & CEO for Kontiki, said the new service with BT Conferencing will deliver live video webcasting and VoD to all of its customers' employees simply and easily.

"Video is delivering real ROI in the enterprise and doesn't have to be complicated - it can be as easy as e-mail with no impact on existing network traffic. We are thrilled to help complete BT's leadership position within the unified communications category and leverage the growing demand for unified communications, enterprise video and social media."

Cloud Computing Software Offers Fast ROI

With the advent of large-scale data mining and text analytics software, The Internet Time Machine has broken new ground in trend analysis. By monitoring worldwide search volume and keywords on search engines, and then comparing them to supply, or "results" in worldwide search, the Internet Time Machine software is able to monitor what people are looking for yet can't find.

"We are always excited when the software is able to find a strong emerging trend with high search volume from around the world and little or no supply to satisfy that search volume." said Founder Curt Dalton. "The vast trend and analytics intelligence out there only looks at demand, like the 'Hot 100,' but we study supply as well. We are able to create a supply/demand curve to monitor what people are looking for and what is a new trend, yet also know that supply, or information out there on the subject, is not keeping up with demand."

The software, which runs on the new search engine, NowRelevant, recently showed an emerging trend with huge demand growth and very little supply.

"The most recent alert we got centered around Facebook social media games." said Dalton. "As Facebook has grown, so has the competition level to create the next Farmville or Mafia Wars. The real money making niche with little or no competition is creating a game guide, or cheat sheet, for these games. The most popular Farmville cheat sheet course online is making hundreds of thousands of dollars a month, and if you know what are the five fastest growing games on Facebook, you could be the first one to have a guide or course out there on it. We did a social media niche post and video for people where we showed viewers how we picked up this trend and how we figured out the fastest growing games on Facebook."

The forum section of the Internet Time Machine's website has been active with members collaborating on ideas and teaming up on projects to produce guides.

The second alert from this niche marketing software centered around the huge growth in demand for online training videos. With the current economic situation forcing so many people back to school for degrees and new training, the demand for certain trainings and information has skyrocketed.

"Since we study supply and demand, we are able to find the training people are looking for online but that no one is offering yet." said Ali Khan, CTO of The Internet Time Machine. "There are over 300 million broad based searches on the term 'pivotal response training video' worldwide and no one has a course or e-book out there on it." said Khan.

"Potty training regression videos or courses would do very well online right now as well. Millions of searches worldwide for potty training regression help and not much out there." The new post and video about the 20 Top Trainings People Are Looking For Online went live recently and his getting heavy online buzz.

"Response has been great since we went live to the public and we continue to grow the software and the new search engine out."

PC Hypervisors Virtually Change Everything

Excerpted from Virtual Strategy Magazine Report by Amrit Williams

As I was pondering the challenges of current desktop management, researching the latest and greatest from the desktop virtualization vendors and talking to a lot of large organizations, I was trying to find that one thing that explained the operational benefits of client-side virtualization. It really does come down to the need for standardization in our end-user computing platforms. Consolidation is the "killer app" for server/data center virtualization and standardization is the major benefit for client-side virtualization.

Computing environments have changed radically over the years, from static, tethered devices residing behind a well-protected perimeter that only required minimal protection and maintenance to complex, highly-distributed computing environments with a large population of remote, intermittently connected computing devices accessing not only centrally managed corporate resources, but also corporate resources managed by a third-party SaaS or cloud computing provider.

This evolution to distributed computing is occurring in parallel with the most hostile threat environment we have ever experienced. With law enforcement officials suggesting cybercrime is hitting its zenith and becoming more profitable than even the international black market for illicit drugs, nation-states opining on cyberwar and of course the prevalence of disclosure and identity theft, most of the country has been left numb to the dangers of online transactions.

There is no question that poorly managed systems are less secure, cost more, and lead to reactionary spending with different groups implementing disparate management and security tools, none of which are easily aligned with adjacent IT departments within the organization.

Gartner publishes research on the total cost of ownership (TCO) for PCs, in which they find that locked down and well-managed PCs can cost up to 42% less than unmanaged systems. They also found that somewhat managed and mismanaged PCs were only slightly less costly than unmanaged PCs due to the costs incurred from the management systems themselves.

The volume of technical support calls that result from deviations from IT standards or common operating environments is not only significant; these calls also consume a disproportionate amount of time to troubleshoot and resolve.

The reason desktop management is so costly is in large part due to a lack of standardization. The problem is that IT has not been able to maintain a common operating environment (COE) that enables them to effectively manage and secure their end-user population. For every application deployed or updated; for every patch release, AV data file update or system modification; for every downloaded widget and system reboot, there is some segment of the user population that experiences downtime, conflicts or other technical issues resulting from variability in their computing environment.

These challenges, coupled with the success of server virtualization, have driven a lot of attention toward desktop virtualization. The problem is that even though standardization is good, the majority of companies will find desktop virtualization an exceedingly difficult and unacceptably costly proposition.

With VDI (virtual desktop infrastructure), virtual desktop images are stored in a data center and provided to a client via the network. The virtual machines include the entire desktop stack, from operating system to applications to user preferences, and management is provided centrally through the backend infrastructure.

The promise is that VDI will replace the need for myriad systems management and security tools that are currently deployed. No more demands for traditional desktop management tools for OS deployment, patch management, anti-virus, personal firewalls, encryption, software distribution and so on. In fact, many are suggesting that we can return to thin client computing models.

Standardization has major implications for security management as well. In the latest Verizon Data Breach Investigations Report, the forensic analysis found that almost 80% of all data breaches came from an external source, and in 98% of those external breaches the attacker exploited poorly administered or misconfigured systems.

Before embarking on a desktop virtualization project, organizations will need to understand the impact:

Does your organization support remote, intermittently connected mobile computing devices?

Have you considered the cost of the backend VDI/HVD infrastructure (network, storage, hardware, etc.)?

Will the project require specialized FTEs in addition to current IT staff?

How will the organization manage and secure the virtualization infrastructure?

How will the organization maintain the health and security of roaming virtual desktops or desktops in use?

What are the licensing costs for the operating systems, applications, and other end-user components aside from the virtualization software itself?

In some select situations, VDI or server-hosted virtual desktops hold promise for improved efficiencies, lower management costs, and improved security, but their effectiveness is limited to those environments that can adopt thin-client computing models, do not require offline or mobile support, and can enforce draconian usage policies on their user population without the loss of personal computing power impacting productivity or end-user satisfaction.

VDI has additional problems as well. First is the inherent cost and complexity in simply implementing VDI. In many cases the backend requirements for storage, networking, connection brokers and management systems can be 4-10 times as expensive as traditional solutions.
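To make that cost argument concrete, here is a back-of-the-envelope sketch. The only figure taken from the paragraph above is the 4-10x backend range; the seat count and per-seat dollar amount are purely hypothetical placeholders, not vendor pricing.

# Back-of-the-envelope sketch with hypothetical numbers: compare per-seat cost of a
# traditionally managed desktop fleet against a VDI deployment whose backend
# (storage, networking, connection brokers, management) costs 4-10x as much.

SEATS = 1_000
TRADITIONAL_PER_SEAT = 300            # hypothetical annual management cost per PC

for backend_multiplier in (4, 10):    # low and high end of the quoted range
    traditional_total = SEATS * TRADITIONAL_PER_SEAT
    vdi_total = SEATS * TRADITIONAL_PER_SEAT * backend_multiplier
    print(f"backend multiplier {backend_multiplier:>2}x: "
          f"traditional ${traditional_total:,} vs VDI ${vdi_total:,} per year")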

Second, the reality is that regardless of the marketing hype, media frenzy, and vendor misinformation, these systems still require real-time systems management and security solutions. Centralizing the desktop image does not magically protect it from viruses, intrusion attempts, system compromises, or operational failures.

Third is that even if one could efficiently and with limited investment implement virtual desktops, the user population would still be unable to work offline or in a disconnected fashion while operating under the same integrity and protection provided while tethered within the corporate network. Additionally, most users would never allow themselves to be deprived of personal computing power, so a thin-client model would only work in those situations in which the user population required little more than access to a single or small set of corporate applications, and the devices themselves had 'always-on' static network connectivity.

Fourth, and most importantly, is that VDI now introduces a single point of compromise. An attacker only needs to attack the central data center servers to bring down the entire end-user population.

As cumbersome and inappropriate as VDI models may be, there are alternatives that provide the benefits of desktop virtualization while maintaining the integrity of distributed computing models. A PC hypervisor is a software layer between the operating system and the PC hardware that allows hardware resources (CPU, RAM, Disk, etc) to be shared between multiple execution environments.

PCs have very different requirements than servers; therefore, PC hypervisors have very specific attributes that are not available in server-based hypervisors, such as Hyper-V or ESX. Most important of these is the ability to support device pass-through, which is essentially the ability for operating systems to have direct access to hardware and peripherals without use of emulation or para-virtualization.

To illustrate the difference, let's look at video card support. In the server environment there is no requirement for high-end video processing, such as 3D modeling, since this is generally done at the client. Server-based hypervisors use emulation or para-virtualization, which emulates the hardware itself but cannot take advantage of the video card's GPU (graphics processing unit). When it comes to end-user computing, however, there is a requirement to take advantage of the computer's hardware, such as high-end video cards.

PC hypervisors provide another very important benefit, and that is the abstraction of management outside of the OS. Information security and operational management of end-user computing devices is becoming a more challenging and untenable problem day by day. The reality is that we continue to build on top of inherently and fundamentally weak computing foundations. We need an alternative to the current computing paradigm and we need it to support the growing demands of personal computing power and mobile computing.

The real issue is not just the computing paradigm but the reliance on the OS itself, which is the root of all Internet evil. IT has more tools deployed to manage and secure the operating system. All these security and systems management tools rely on the integrity of the operating system. The majority of commercial operating systems are inherently insecure and carry a lot of legacy baggage. Operational failures and compromise render traditional management tools useless.

The evolution of computing from a centralized, tethered model highly reliant on perimeter security and data center management to highly distributed, complex, and globally interconnected networks supporting remote intermittently connected mobile computing devices will make VDI models extremely unpalatable for most organizations.

Although real-world desktop virtualization deployments have not met the market hyperbole, PC hypervisors offer a radical change to desktop management: they provide desktop standardization, support for distributed computing, and the ability to abstract management outside of the OS itself, while delivering all the purported benefits of desktop virtualization without the infrastructural costs and management headaches.

Libox Offers Unlimited Media Sharing

Excerpted from Notebooks Report by Carter Sprunger

Libox, launching today as a public beta, offers users the ability to share an unlimited number of files, with an unlimited number of people. Users can share pictures, songs, or even high-definition (HD) video without limits.

Libox can be used via an application compatible with Mac and PC, its web-based interface, or on a mobile device using the mobile version of the website, making Libox a versatile service that anyone can use.

Content is initially streamed from the computer that uploaded the file. Basically, Libox uses other computers as its servers, reminiscent of a file-sharing network like LimeWire.

With this unique networking, Libox is able to offer its unlimited file-sharing services for free, while still keeping user information private.

Usually anything free has its downfalls; in this case, however, all HD video is preserved in the highest quality possible.

This new unlimited service could put pressure on other new sharing services, such as Dropbox, which only offers 2GB for free.

It will be interesting to see if Libox faces litigation as has been the case for file-sharing networks like LimeWire and Kazaa. For more information, visit Libox's website.

LimeWire Readies Cloud-Based Subscription Music Service

Excerpted from FindTechNews Report by Jaikumar Vijayan

File-sharing software vendor LimeWire will launch a subscription-based music service for consumers.

The service is scheduled to go live later this year and will allow users to download and stream music to laptops, smart-phones, and other mobile devices for a monthly fee.

Spokeswoman Tiffany Guamaccia said that what the company is launching is not just a licensed version of LimeWire, as some have speculated, but a completely new service that it has been working hard on for some time now.

"Essentially, the new music service will be an ecosystem comprised of a desktop media player, mobile applications, and a web-based music experience for downloading and streaming," Guamaccia said.

The new service will have several cloud integration features, including one that allows iTunes content to be synced to the cloud. The subscription service will give users "complete and instant" access to music on their desktops, on their mobile devices, and in the cloud.

The desktop media player will have "robust" music discovery features and will be capable of dynamically generating playlists based on user preferences and recommendations, Guamaccia said. The media player will also include other discovery features, such as community editorial ratings and reviews.

News of LimeWire's planned service comes even as the company is coming under tremendous pressure from the music industry over copyright infringement issues.

Just last week, eight music publishers sued the company for enabling what they claimed was massive copyright infringement. The lawsuit was filed as a related case to another, previous lawsuit filed by the Recording Industry Association of America (RIAA) over the same issues.

Paramount COO: 3 Strikes Won't Stop Infringement 

Excerpted from NewTeeVee Report by Janko Roettgers

Paramount Pictures Chief Operating Officer (COO) Frederick Huntsberry told the audience of the Cinema Expo in Amsterdam this week that file sharing isn't the biggest threat for Hollywood anymore.

Instead of downloading movies via The Pirate Bay (TPB) and other file-sharing sites, users simply go to one-click hoster sites (or cyberlockers, as Huntsberry likes to call them) like Megaupload to get their latest blockbuster fix.

The Hollywood Reporter quoted Huntsberry with the following assessment: "Cyberlockers now represent the preferred method by which consumers are enjoying pirated content."

Of course, such a shift in consumer behavior also has implications for the fight against copyright infringement. For a long time, the music industry focused on suing individual file sharers, while Hollywood has been hunting down uploaders and movie release groups as well as fighting file-sharing site administrators.

Entertainment industry executives have in the past also pressed for so-called three-strikes graduated response programs, which would force Internet service providers (ISPs) to disconnect file sharers from the Internet after three offenses, but Huntsberry seems to believe now that this won't really stop movie infringement.

German movie industry magazine Blickpunkt Film reports that Huntsberry called such measures ineffective, pointing out that there's simply no way to identify individual infringers who download movies from sites like Megaupload.

He suggested ISPs should simply block these sites completely, and lawmakers should amend copyright laws to make such measures mandatory. There was simply no alternative to blocking websites to fight copyright infringement in light of the growing popularity of one-click-hosters, Huntsberry said.

The Paramount COO also blamed advertising agencies and major brands for supporting these types of sites, which sometimes show ads for companies like Kentucky Fried Chicken and Netflix. He estimated that Megaupload alone could make anywhere from $30 million to $260 million through ads and subscription fees.

Coming Events of Interest

Distributed Computing & Grid Technologies - June 28th - July 3rd in Dubna, Russia. This fourth international conference on the subject, also known as GRID2010, will be held by the Laboratory of Information Technologies at the Joint Institute for Nuclear Research and is focused on the use of grid technologies in various areas of science, education, industry, and business.

NY Games Conference - September 21st in New York, NY. The most influential decision-makers in the digital media industry gather to network, do deals, and share ideas about the future of games and connected entertainment. Now in its 3rd year, this show features lively debate on timely cutting-edge business topics.

Digital Content Monetization 2010 - October 4th-7th in New York, NY. DCM 2010 is a rights-holder focused event exploring how media and entertainment owners can develop sustainable digital content monetization strategies.

Digital Music Forum West - October 6th-7th in Los Angeles, CA. Over 300 of the most influential decision-makers in the music industry gather in Los Angeles each year for this incredible 2-day deal-makers forum to network, do deals, and share ideas about the business.

Digital Hollywood Fall - October 18th-21st in Santa Monica, CA. Digital Hollywood is the premier entertainment and technology conference in the country, covering the convergence of entertainment, the web, television, and technology.

Copyright 2008 Distributed Computing Industry Association
This page last updated July 5, 2010