Distributed Computing Industry
Weekly Newsletter

In This Issue

P2P Safety

P2PTV Guide

P2P Networking

Industry News

Data Bank

Techno Features

Anti-Piracy

November 23, 2009
Volume XXVIII, Issue 9


Sign-Up Now for the P2P MEDIA SUMMIT and Save

The P2P MEDIA SUMMIT at CES is coming January 6th to the Las Vegas Convention Center. There's no better way to get a head start on what to expect in the new year from the very forefront of online service development than by attending in person.

This seminal industry event features keynotes from top peer-to-peer (P2P), social networking, and cloud-computing software companies; tracks on policy, technology and marketing; and panel discussions covering content distribution and solutions development. Enterprise deployment as well as consumer adoption trends will be covered.

A key theme will be the use of P2P and cloud computing for games. DCINFO readers can now view Abacast's archival video of the DCIA's first special event on this topic here.

Early-bird rates end on December 1st. Register now for the 2010 International Consumer Electronics Show (CES), sign up for this Partner Program, and save hundreds of dollars.

Solid State Networks Releases New Game Publishing Software

Solid State Networks, the leading developer of P2P-based game publishing software, this week released CURRENT 3.0, which is now available to game developers free of charge for commercial use.

CURRENT 3.0 provides a unified platform for downloading, installing, patching, and launching games. Built upon Solid State's industry-proven delivery and patching technologies, CURRENT 3.0 enables game developers to provide players with a quality user experience during the process of acquiring games and updates.

CURRENT 3.0 also supports seamless upgrading to CURRENTpro and DIRECT, which offer a variety of features to further enhance the game acquisition experience, increase operational efficiencies, and improve player conversion and retention.

"With the release of CURRENT 3.0, we're helping game developers provide a superior experience to their players when it comes to acquiring games and updates - and we're doing it free-of-charge," said Rick Buonincontri, Founder & CEO of Solid State Networks.

"We believe that a positive user experience is the key to successful digital game distribution and to new revenue from games. We're confident that developers who implement our game publishing software will find that they are able to promote and monetize their games in compelling new ways in addition to providing a great experience for their gamers."

Since early 2007, Solid State Networks has steadily gained recognition within the gaming industry as a highly innovative company with reliable technology and versatile game delivery and patching solutions for companies such as Funcom, Riot Games, Wizards of the Coast, Abandon Interactive, Vogster Entertainment, and others.

P4P Remodels P2P for Even Greater Efficiency

Excerpted from MIT Technology Review by Erika Jonietz

File sharing, arguably the best-known application of P2P technology, has unfortunately become synonymous with copyright infringement and heavy bandwidth consumption on the Internet. But now, Internet service providers (ISPs) and content companies are taking advantage of new technology designed to speed the delivery of authorized content through P2P networks.

Meanwhile, standards bodies are working to codify this breakthrough technology into the Internet's basic protocols.

Rather than sending files to users from a central server, P2P networks distribute pieces of a file among thousands of computers and help users find and download this data directly from one another. This is a highly efficient way to distribute data, resistant to the bottlenecks that can plague centralized distribution systems.

P2P traffic is growing in volume. In June, Cisco estimated that P2P transferred 3.3 exabytes (3.3 billion billion bytes) of data per month.

While a PhD student at Yale University in 2006, Haiyong Xie came up with the idea of a provider-portal-for-P2P (P4P) as a way to ease the strain placed on networking companies. This system reduces the bandwidth needed for P2P by having ISPs share specially encoded information about their networks with special "trackers" - servers that are used to locate files for downloading. P2P software providers can then make their traffic more efficient by connecting computers that are closest together and reducing the amount of data shared among different ISPs.
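
To make that mechanism concrete, here is a minimal Python sketch of how a P4P-aware tracker might rank candidate peers. The partition IDs (PIDs), prefixes, and cost values are invented for illustration; in a real deployment the ISP's P4P portal would supply them.

```python
# Hypothetical sketch of P4P-style peer selection. The PID assignments and
# inter-PID costs are invented; a real deployment would obtain them from
# the ISP's P4P portal.

from typing import Dict, List, Tuple

# ISP-supplied map from IP prefix to partition ID (PID), a zone of the network.
PID_BY_PREFIX: Dict[str, str] = {
    "10.1.": "pid-east",
    "10.2.": "pid-west",
    "192.0.2.": "other-isp",
}

# ISP-supplied relative cost of traffic between PIDs (lower is cheaper).
PID_COST: Dict[Tuple[str, str], int] = {
    ("pid-east", "pid-east"): 1,
    ("pid-west", "pid-west"): 1,
    ("pid-east", "pid-west"): 5,
    ("pid-west", "pid-east"): 5,
}
DEFAULT_COST = 20  # crossing into another ISP: the most expensive case

def pid_of(ip: str) -> str:
    """Map an IP address to its PID via the longest matching prefix."""
    matches = [p for p in PID_BY_PREFIX if ip.startswith(p)]
    return PID_BY_PREFIX[max(matches, key=len)] if matches else "unknown"

def rank_peers(requester_ip: str, candidates: List[str]) -> List[str]:
    """Order candidate peers so the cheapest (most local) come first."""
    src = pid_of(requester_ip)
    return sorted(candidates,
                  key=lambda ip: PID_COST.get((src, pid_of(ip)), DEFAULT_COST))

# A P4P-aware tracker would return the top of this ranking to the requester.
print(rank_peers("10.1.0.7", ["192.0.2.9", "10.2.3.4", "10.1.8.8"]))
# -> ['10.1.8.8', '10.2.3.4', '192.0.2.9']
```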

During its meetings last week in Japan, the Internet Engineering Task Force (IETF), which develops Internet standards, continued work on building P4P into standard Internet protocols. Xie believes that those efforts will take one or two more years to come to fruition. In the meantime, he says, many P2P application makers and Internet carriers are already implementing their own versions of P4P.

Pando Networks, which facilitates Internet content delivery, was the first company to adopt P4P techniques. In collaboration with Yale, Pando worked with Verizon, Telefonica, AT&T, and Comcast to run two sets of P4P tests last year; the results showed that P4P could speed up download times for P2P users by 30 to 100%, while also reducing bandwidth costs for ISPs.

Since then, Verizon and Telefonica have both implemented versions of P4P within their networks, though the network maps may not be available in all regions or to every P2P provider. Several other ISPs are considering implementing P4P, Xie says; Comcast, for instance, publicly stated its interest in the technology following last fall's trial.

Robert Levitan, Pando's CEO, says that the company used the expertise it gained through those trials to develop algorithms that automatically derive network maps, based on information gathered from software installed on individual users' machines (more than 30 million computers have Pando's Media Booster software installed).

Pando uses the maps to help route content more quickly to those same computers. Its clients include Nexon America, one of the largest free-to-play online video-game companies, and NBC Universal, which uses P4P to deliver full-length HD shows over the Internet.

Indeed, Xie says, as multimedia becomes more and more dominant on the Internet, demand for P4P implementations will grow, particularly from ISPs seeking to lower the amount of money they must spend on new fiber and inter-ISP data transmissions. Video and audio streaming from sites such as YouTube and Hulu already account for almost 27% of global Internet traffic, according to a report by network-management systems vendor Sandvine.

Cisco predicts that by 2013, video alone will account for over 60% of all consumer Internet traffic. With this kind of increase in high-bandwidth traffic, Levitan says, "we're not going to be able to have the Internet we all want" without P4P, or a similar technology, to help scale the physical networks at a reasonable cost.

Xie and Levitan see two main difficulties for the continued growth of P4P. The first is P2P's association with file-sharing infringement of software, music, and video. ISPs want to make sure that working with P2P companies to improve their service won't make them liable for unauthorized file sharing.

But Levitan is optimistic that increasing numbers of authorized commercial uses for P2P technology will help reform its image. For example, Internet telephony service Skype relies on P2P connections, as does Activision Blizzard, maker of the popular online game "World of Warcraft." CNN began using Octoshape's P2P technology to boost its delivery of live streaming video earlier this year, and the PGA, NBA, and NASCAR all use it to support live webcasts of sporting events.

The other potential problem is perhaps trickier: even though P4P benefits both consumers and ISPs, because it treats P2P traffic differently from other data flowing over the Internet, it could technically violate the Federal Communications Commission's (FCC) proposed net neutrality regulations. In fact, one of Xie's original motivations in developing the P4P protocol was to help carriers avoid having to limit P2P traffic for cost reasons. He admits that P4P would seem to violate the letter of net neutrality, if not the spirit, by "helping" P2P applications preferentially.

"I don't have a good, clear answer to those concerns," Xie says. Still, he and other P4P proponents remain optimistic that the technology's advantages will win the day.

Levitan thinks that the benefits such companies are seeing will allow P4P to move forward. "On a technology basis, and even from a policy basis, I think the FCC could see - wow - this could really help networks, and maybe it changes the network neutrality debate," Levitan says - because there wouldn't be a scarcity of network capacity anymore.

Report from CEO Marty Lafferty

The question I've been asked most often this week has been, "What's your position on the bill introduced Tuesday by fourteen-term Congressman Edolphus Towns (D-NY), Chairman of the US House of Representatives Committee on Oversight and Government Reform, regarding the use of file-sharing software on government computers?"

HR 4098 "The Secure Federal File Sharing Act" would direct the Office of Management and Budget (OMB) to issue guidance generally prohibiting the use of file-sharing software by government employees and contractors on federal networks and to establish procedures for agencies to seek permission to use the software, a process that would be decided on a case-by-case basis.

Exceptions would be granted in instances where file-sharing programs are necessary for an agency's business operations; the completion of a particular task or project that supports an agency's mission; collaboration among federal, state or local agencies; or to advance law enforcement investigations.

The stated purpose of this measure would be to help protect classified and sensitive government information from inadvertent distribution.

A common-sense approach here, rather than singling out file-sharing software, might be to require workers handling such data to do so only on computers not connected in any way to the public Internet, so that other more prevalent types of unintentional exposure, such as through online hacking or e-mail error, would also be prevented. For collaborative classified projects, workers would connect their machines with others only via closed, protected networks.

Similar advice, incidentally, could be given to consumers to protect personal or sensitive files: First, disconnect your computer from the Internet. Then perform tasks involving this data. When you finish, save the files to a removable USB stick (or comparable storage device) and delete all copies from the computer's hard drive before reconnecting to the Internet. For those who can afford two computers, simply use one for online recreational purposes, the other for working with confidential information offline.

Our written testimony submitted for the Committee's hearing on this topic in July reflected our proactive support for the stated purpose of HR 4098 through ongoing self-regulatory work being undertaken by the industry's Inadvertent Sharing Protection Working Group (ISPG).

The work of the ISPG is focused on eliminating this possible source of user-error with default settings, affirmative steps, consumer communications, and substantial changes in functionality.

By way of background, within weeks of the Committee's initial hearing on this subject in 2007, the DCIA established the ISPG. Over several months, we recruited participants among leading file-sharing software and other tech-sector companies and engaged with Federal Trade Commission (FTC) staff to address issues associated with unintended publishing of confidential data by file sharers.

This effort began by providing demonstrations for FTC staff of how current file-sharing programs work in terms of users uploading material for distribution. It continued through a collaborative process involving private sector and regulatory participants to develop a program of voluntary best practices - for file-sharing-software developers - to protect users against inadvertently sharing personal or sensitive data.

This program was announced in July 2008. It defined terms relevant to HR 4098, such as "recursive sharing," "sensitive file types," and "user-originated files." It then outlined seven steps required for compliance: 1) default settings, 2) file-sharing controls, 3) shared-folder configurations, 4) user-error protections, 5) sensitive-file-type restrictions, 6) file-sharing status communications, and 7) developer principles. The principles address feature disablement, uninstallation, new-version upgrades, and file-sharing settings.

In August 2008, the DCIA announced that compliance monitoring would begin in December to allow developers time to integrate required elements of the ISPG program into their planned upgrades and new releases. Compliance monitoring resulted in reports from top brands that use file-sharing and related software for downloading, live streaming, open-environment sharing, and corporate intranet deployments, and for both user-generated and institutional content.

Specifically, seven leading representative software program distributors submitted detailed reports to FTC staff in February 2009. In March, the DCIA prepared and submitted a summary. Since then, an iterative process of compliance reviews and company responses, including additional software changes, has taken place, and is ongoing today. By any objective measure, the ISPG has made real and substantive progress and continues to do so.

While we agreed with the Chairman's conclusion at the close of this summer's hearing - government employees and contractors should not use recreational file-sharing applications on computers intended for national security work - we continue to have concerns about lawmaker involvement in this area.

The Chairman indicated that his staff tested a leading file-sharing software program over the weekend prior to the hearing. This application featured new default settings, in compliance with ISPG principles, which protect users from inadvertently sharing their files.

But the Chairman nevertheless condemned this software because staffers could still use it to find confidential data - shared by others - even though the staffers themselves were protected from sharing their own files. 

This extremely important distinction has been and still is totally missed by Members of Congress.

There has been exponentially increasing digitization and online transfer of all kinds of information, along with a proliferation of new web-based applications that use common networks to create multiple data seed-points for global distribution. Concurrently, there has been widespread institutional laxity among government agencies and private-sector entities with respect to safeguarding their sensitive data, resulting in its dissemination in error to hundreds, thousands, and in some cases millions of networked devices, as well as its addition to an ever-expanding list of diverse search platforms.

The challenge of deleting the massive amount of information that is mistakenly in distribution on the Internet is indeed a very disturbing problem. It is of a different order of magnitude, however, from protecting users against accidentally uploading it in the first place as the ISPG is doing.

To seriously blame file-sharing software for the former problem - the continued availability of material already mistakenly in distribution - which occurred a number of times during this summer's hearing and remained a primary impetus for HR 4098, as we have previously noted, is like chastising NASA, after it has complied with an order not to launch a spacecraft, because airplanes and satellites continue to fly through the sky.

This will not change even if HR 4098 is fully implemented. A voluminous amount of inadvertently disclosed material, copies of which are now stored on many computers, will still be available to users in response to relevant searches.

Nothing in HR 4098 as contemplated will mitigate the massive availability of such sensitive information online. And until our industry's powerful opponents learn how to embrace and harness file-sharing software, they will continue their richly-financed lobbying efforts to ensure that at some time in the not-too-distant future these same lawmakers or their successors will hysterically blame file-sharing software for this problem all over again.

More fact-based discussion based on a deeper understanding of the relevant issues and technologies, and above all, honest collaboration among affected parties aimed at real solutions, are needed here. Share wisely, and take care.

Ignite Upgrades Secure P2P Content Delivery Solution

Ignite Technologies, the leader in providing secure P2P-based content delivery solutions enabling its customers to efficiently publish, deliver, and manage digital assets, this week released version 7.10 of the Ignite Content Delivery Solution (ICDS), supporting live streaming video and on-demand downloads.

The new release further underscores Ignite's ability to deliver any content, anywhere, at any time with new features enriching the use of both live and pre-recorded content. Ignite v7.10 establishes ICDS as a comprehensive delivery platform that supports live streaming video, on-demand download or instant view, and targeted push delivery. These delivery methods are demanded by today's enterprises to securely and efficiently deliver video and rich media for corporate communications, e-learning, and sales.

"To successfully leverage video in the enterprise, large organizations need a comprehensive delivery platform that can integrate with their new and existing corporate communications technologies and devices," said Melissa Webster, Program Vice President, Content & Digital Media technologies at IDC. "As our research shows, more than three quarters of large organizations are delivering video-on-demand (VOD) to employee desktops today, and by the end of 2010, that will rise to more than 90%, with 80% also delivering live streaming video to the desktop."

In v7.10, Ignite enhanced its enterprise streaming solution to include the ability for viewers of live events to submit questions to the presenter directly from the end-user interface. Moderators or event producers can view submitted questions in real-time, along with contextual information about the employee who submitted the question, such as geographic location, department, and other information imported from the company's user directory.

The moderator can filter questions using custom criteria; prioritize and edit them; and submit notes concerning the questions to assist the presenter. Presenters can view the questions forwarded by the moderator and choose which questions to address. Following the live event, all the questions are archived to be accessed later in reports, and answers can be pushed to users or made available on demand.
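
As a rough illustration of this moderation workflow (not Ignite's actual API; all names and fields here are invented), a question queue might look like the following sketch:

```python
# Hypothetical sketch of a moderated live-event question queue; field and
# method names are invented for illustration and are not Ignite's API.

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Question:
    text: str
    employee: str
    location: str       # contextual info imported from the user directory
    department: str
    priority: int = 0   # assigned by the moderator
    note: str = ""      # moderator's note to the presenter
    forwarded: bool = False

class ModeratorQueue:
    def __init__(self) -> None:
        self.questions: List[Question] = []

    def submit(self, question: Question) -> None:
        self.questions.append(question)

    def filtered(self, department: Optional[str] = None) -> List[Question]:
        """Apply a custom criterion, e.g., one department's questions."""
        return [q for q in self.questions
                if department is None or q.department == department]

    def forward(self, question: Question, priority: int, note: str = "") -> None:
        """Prioritize and annotate a question, then pass it to the presenter."""
        question.priority, question.note, question.forwarded = priority, note, True

    def for_presenter(self) -> List[Question]:
        """Forwarded questions, highest priority first."""
        return sorted((q for q in self.questions if q.forwarded),
                      key=lambda q: -q.priority)
```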

"In today's environment, immediate feedback after town hall meetings or quarterly updates is critical in helping to gauge the reactions and concerns of employees," said Brian Jensen, Managing Director of Global Communications for Cushman & Wakefield. "Our executives want employees to be able to submit questions during the live event providing a more engaging, face-to-face experience for the employees."

In v7.10, Ignite also introduces a "clientless" capability that greatly extends the reach of Ignite's audience by enabling content to be viewed from devices without the Ignite native P2P client application. For example, when a user requests to view content from a corporate website, if the Ignite client is detected, the user has the option either to download and view the high-quality version or to immediately view a lower-quality version online. This viewing option is ideal for enterprises' partners, customers, or employees who often need access to video or rich content from non-enterprise-standard computers, such as Macintosh and Linux machines, home and personal computers, or mobile devices.

Companies can also leverage this new feature to offer a short preview or trailer of the content to enable the user to determine if they want to download the high-quality version.
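
The delivery decision described above can be pictured with a short, hypothetical sketch; the option labels and asset paths are invented, and Ignite's real logic is certainly more involved:

```python
# Hypothetical sketch of the "clientless" fallback decision; the option
# labels and asset paths are invented for illustration.

from typing import List

def delivery_options(client_detected: bool, preview_only: bool = False) -> List[str]:
    """Return the viewing options a portal might offer for one video asset."""
    if preview_only:
        # Short trailer so the user can decide whether to fetch the full file.
        return ["stream:/previews/asset123-trailer"]
    options = []
    if client_detected:
        # Native P2P client present: offer the high-quality download as well.
        options.append("p2p-download:/assets/asset123-hd")
    # Always offer immediate, lower-quality in-browser viewing.
    options.append("stream:/assets/asset123-sd")
    return options

print(delivery_options(client_detected=True))
# -> ['p2p-download:/assets/asset123-hd', 'stream:/assets/asset123-sd']
```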

"The 'YouTube-like' experience, where immediate viewing of the video online from a company's Internet or intranet portals, Microsoft's SharePoint sites or other media portals, further expands on Ignite's ability to deliver rich media corporate communications anywhere, at any time to any device," said Vasu Avadhanula, Vice President of Product Management at Ignite.

Contractor Gets Serious About Data Security

Excerpted from Infosecurity Magazine Report

Lockheed Martin has formed an information security alliance with a collection of technology providers that will focus on self-healing systems to help solve information security problems.

Lockheed Martin formed the information security group along with APC by Schneider Electric, CA, Cisco, Dell, EMC Corporation and its RSA security division, HP, Intel, Juniper Networks, McAfee, Microsoft, NetApp, Symantec and VMware.

The companies will carry out information security test scenarios in simulated customer environments, along with systems integration pilots. Other activities include improving early threat detection. Many activities will take place at the NexGen Cyber Innovation and Technology Center, which opened at the same time as the alliance was announced.

The information security center is a 25,000 square-foot design and collaboration facility at Lockheed Martin's headquarters. It includes distributed computing and virtualization capabilities to simulate networks under attack.

The defense industry could do with better information security, as illustrated by the theft of electronic details pertaining to the Joint Strike Fighter project last April. Lockheed Martin was the prime contractor on the project, which was worth around $300 billion. Reports emerged that sensitive records concerning the fighter program had been stolen, possibly by hackers operating from overseas.

CloudShield Partners for Secure Cloud Computing

CloudShield Technologies, a leading provider of global infrastructure security and service management solutions, this week announced a strategic relationship with Sensory Networks, the leader in pattern matching and software acceleration technology, to deliver a high performance, scalable security platform for cloud computing applications.

Cloud computing has exploded in recent months, with analyst firm Gartner recently predicting revenue from cloud computing services to exceed $56 billion in 2009. Despite an overall 5% decline in IT industry spending this year, cloud computing grew 21% year-over-year from 2008. The total cloud market is expected to reach $150.1 billion by 2013.

Along with its partners, CloudShield aims to provide network infrastructure systems, infrastructure software and policy-driven software solutions that enable customers, clouds, and providers to offer the requisite service control, security, and transport functionality required in cloud computing. At the heart of the offering are solutions that optimize performance, enable network services, and protect infrastructure while driving down costs and increasing revenues for managed services by clouds and providers.

"This relationship further demonstrates our commitment to protecting the valuable network infrastructure of the world's leading service providers and national governments," said Peder Jungck, CTO & Founder of CloudShield. "With the rise in new technologies such as cloud computing, we're adapting our offerings to keep our customers protected despite a changing IT landscape. This partnership brings two technology leaders together to create a scalable platform that customers can deploy across a broad range of secure network installations."

"Call of Duty" Generates Record $550 Million in First 5 Days

Excerpted from Digital Media Wire Report by Mark Hefflinger

Videogame publisher Activision Blizzard announced on Wednesday that its Call of Duty: Modern Warfare 2, which features a P2P-based multiplayer component, has become "the biggest entertainment launch in history," racking up $550 million during its first five days on sale.

The previous five-day sales record for a videogame was set last summer by "Grand Theft Auto IV," which sold 6 million units and made $500 million; the largest five-day opening worldwide box office gross is held by "Harry Potter and the Half-Blood Prince" ($394 million). 

Activision added that more than 5.2 million P2P multiplayer hours were logged playing the game on Microsoft's Xbox Live service during its first day of availability. "In just five days of sell-through, 'Call of Duty: Modern Warfare 2' has become the largest entertainment launch in history and a pop-culture phenomenon," said Activision Blizzard CEO Robert Kotick. 

"The title's success redefines entertainment as millions of consumers have chosen to play 'Modern Warfare 2' at unprecedented levels rather than engage in other forms of media."

P2P Comes to the Aid of Audiovisual Search

Excerpted from ICT Results Report

Current methods of searching audiovisual content can be a hit-and-miss affair. Manually tagging online media content is time-consuming and costly. But new "query by example" methods, built on P2P architectures, could provide the way forward for such data-intensive content searches, say European researchers.

A team of researchers has turned to P2P technology, in which data is distributed and shared directly among computers, to power potent yet data-intensive audiovisual search technology. The technique, known as query by example, uses content, rather than text, to search for similar content, providing more accurate search results and reducing or even eliminating the need for pictures, videos, and audio recordings to be laboriously annotated manually. However, effectively implementing content-based search on a large scale requires a fundamentally different approach from the text-based search technology running on the centralized systems of the likes of Google, Yahoo, and MSN.

"Because we're dealing with images, video, and audio, content-based search is very data-intensive. Comparing two images is not a problem, but comparing hundreds of thousands of images is not practical using a centralized system," said Yosi Mass, an expert on audiovisual search technology at IBM Research in Haifa, Israel. "A P2P architecture offers a scalable solution by distributing the data across different peers in a network and ensuring there is no central point of failure."

Currently, when you search for photos on Flickr or videos on YouTube, for example, the keywords you type are compared against the metadata tags that the person who uploaded the content manually added. By contrast, in a content-based search, you upload a picture or video (or part of it) and software automatically analyzes and compares it against other content analyzed previously.

Working in the EU-funded SAPIR project, Mass led a team of researchers in developing a powerful content-based search system implemented on the back of a P2P architecture. The software they developed automatically analyzes a photo, video, or audio recording, extracts certain features to identify it, and uses these unique descriptors to search for similar content stored across different peers, such as computers or databases, on a network.

"In the case of a photograph, five different features are used, such as the color distribution, texture, and the number of horizontal, vertical, and diagonal edges that appear in it," Mass explains.

In the case of videos, different frames are captured and analyzed much like a photograph to build up a unique descriptor. Audio is converted into text using speech-to-text software, while music is analyzed by its melody. The extracted features are represented in standard formats such as XML, MPEG7, MPEG21, MXF, and PMETA, allowing complex queries across multiple media types.
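
To illustrate the kind of descriptors Mass describes, here is a minimal sketch using Pillow and NumPy that computes a color histogram and simple edge statistics. SAPIR's actual extractors produce standardized MPEG-7 features, not this exact code:

```python
# Minimal sketch of simple image descriptors of the kind described above
# (color distribution plus edge counts), using Pillow and NumPy.

import numpy as np
from PIL import Image

def extract_features(path: str) -> np.ndarray:
    img = np.asarray(Image.open(path).convert("RGB"), dtype=float)

    # Color distribution: an 8-bin histogram per RGB channel, normalized.
    hist = [np.histogram(img[..., c], bins=8, range=(0, 255))[0]
            for c in range(3)]
    color = np.concatenate(hist) / img[..., 0].size

    # Edge content: mean absolute differences across columns, rows, diagonals.
    gray = img.mean(axis=2)
    edges = np.array([
        np.abs(np.diff(gray, axis=1)).mean(),          # vertical edges
        np.abs(np.diff(gray, axis=0)).mean(),          # horizontal edges
        np.abs(gray[1:, 1:] - gray[:-1, :-1]).mean(),  # diagonal edges
    ]) / 255.0

    return np.concatenate([color, edges])  # the image's descriptor vector

def distance(a: np.ndarray, b: np.ndarray) -> float:
    """Smaller Euclidean distance means more similar content."""
    return float(np.linalg.norm(a - b))
```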

Processing and data transmission demands are kept in check by ensuring that searches target specific groups of peers on the network.

"When someone initiates a search, the system will analyze his or her content and compare it to other content across specific peers rather than across the entire network. For example, if an image has a lot of red in it, the system will search the subset of peers that host a lot of images in which the dominant color is red," Mass notes. "This helps ensure the search is faster and more accurate."

In the network, each peer - be it a home user's personal computer or a media group's database - can be both a consumer and a producer of content. All peers push data for indexing by the P2P network, making it searchable.

To further enhance the search capabilities, the SAPIR team developed software that compares a newly uploaded image to similar images and then automatically tags it with keywords based on the most popular descriptions for the similar images in the database. This automated tagging technique, based on metadata generated by the "wisdom of the crowd," is being further researched by IBM and may find its way into commercial applications, Mass says. It could, for example, automatically and accurately tag photos uploaded to Flickr from a mobile phone, eliminating the need for users to battle a small screen and keypad in order to do so manually.
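
A minimal sketch of that crowd-based tagging step, reusing the descriptor idea from the earlier sketch (the neighbor count and tag limits are arbitrary illustrative choices, not SAPIR's):

```python
# Sketch of "wisdom of the crowd" auto-tagging: find the k most similar
# already-tagged images, then adopt their most popular tags.

from collections import Counter
from typing import Dict, List
import numpy as np

def auto_tag(new_descriptor: np.ndarray,
             descriptors: Dict[str, np.ndarray],  # image id -> descriptor
             tags: Dict[str, List[str]],          # image id -> its tags
             k: int = 5, top: int = 3) -> List[str]:
    # The k nearest neighbors by descriptor distance.
    nearest = sorted(descriptors,
                     key=lambda i: np.linalg.norm(new_descriptor - descriptors[i]))[:k]
    # The most popular tags among those neighbors become the new image's tags.
    counts = Counter(tag for i in nearest for tag in tags[i])
    return [tag for tag, _ in counts.most_common(top)]
```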

Mass sees additional applications in security and surveillance, by incorporating face recognition and identification into the image and video analysis system, as well as, of course, for media companies looking for a better way to organize and retrieve content from large audio, video, and image collections.

"IBM and the other project partners are looking at a variety of uses for the technology," Mass notes.

Project partners Telefonica and Telenor are also looking to use the audiovisual search commercially.

One scenario envisaged by the SAPIR researchers is that of a tourist visiting a European city. She could, for example, take a photo of a historic monument with her mobile phone, upload it to the network and use it to search for similar content. The city's municipal authorities and local content providers, meanwhile, could also act as peers, providing search functionality and distributing content to visitors. Combined with GPS location data, user preferences and data from social networking applications, the SAPIR system could constitute the basis for an innovative, content-based tourist information platform.

The SAPIR project received funding from the ICT strand of the EU's Sixth Framework Program for research.

Kontiki Delivers P2P-Based "YouTube for the Enterprise"

Kontiki, the leading provider of peer-to-peer television (P2PTV) enterprise video solutions, this week debuted the Kontiki VideoCenter (KVC) enterprise employee video portal, defining a new class of employee video community that allows companies to align their organization around a corporate vision using communication that connects and inspires every employee.

The P2P-based KVC delivers on the promise of "YouTube for the Enterprise" by engaging all employees with social media and user-generated content (UGC) - while overcoming the challenges of securing and controlling this powerful communication tool in a way that is consistent with company policies and culture.

KVC offers familiar and rich social media capabilities such as easy upload of UGC; simple content discovery via search and channels, as well as syndication, ratings and rankings, comments, and tagging; dynamic channel building that fosters communities; and support for live video, on-demand video, video-conferencing, video webcasting, and video podcasts to create a single video destination.

In terms of security, KVC lets companies "flip the switch" on or off for as few or as many social media elements as they are comfortable with; provides secure role-based access controls for publishing and viewing to govern "who can see what and who can share what" on an individual, group, business unit, or company-wide basis; and delivers robust reporting.

Thanks to its reliance on P2P, KVC overcomes network bandwidth limitations and the "last mile" branch-office problem by leveraging Kontiki's fully integrated Enterprise Content Delivery Network (ECDN) - with no impact on existing business network traffic and no costly hardware upgrades.

Also this week, Kontiki announced that it will provide its P2P-based enterprise video services to United Technologies Corporation (UTC) reaching more than 100,000 employee PCs around the world. UTC is the 18th largest US manufacturer, 37th largest corporation, and the 61st largest publicly held manufacturer in the world.

Kontiki President Eric Armstrong says Kontiki is uniquely placed to deliver a solution to such a large company as UTC, which does business worldwide and generates more than $50 billion in revenue each year.

"High-quality video as a means of communication is engaging, personal and has a bigger impact than traditional forms of corporate communication," Armstrong said. "It's difficult for CEOs to communicate a vision, and effectively lead employees with only e-mail or voice-mail in their communications toolset. Video is a critical component for effective leadership, and we are thrilled to be solving this problem for UTC."

Kontiki has successfully delivered hundreds of millions of high-quality P2P videos to desktops around the world for some of the world's largest companies in the financial services, retail, technology, telecommunications, and manufacturing sectors, including Charles Schwab, Coca-Cola, Sephora, and Wachovia.

Record Labels Pressure P2P Streaming Service Spotify

Excerpted from Softpedia Report by Lucian Parfeni

The online music business has proven a tough nut to crack again and again. Pioneering free service iMeem is more or less dead in its current form and in the process of being acquired by MySpace.

Countless others have failed to provide a free music streaming service that actually generates revenue, and it looks like Europe's P2P-music-streaming darling Spotify won't be able to deliver on the promise in the US either.

This has been known for a while, as the service has pushed back plans to launch in the US after concerns that it won't be able to offer a free service like it does in Europe. And now the major record labels are "concerned" that the free model just doesn't work and it would be unwise for Spotify to launch the service in the US, as the Financial Times reports.

There's just one small glitch in the labels' rhetoric, though. While it's true that free services have failed, the reason they have failed has everything to do with the ridiculous license fees these services have been forced to pay to the very same labels that are now "unconvinced" the model works. It doesn't get much more hypocritical than that.

The real problem here is that the labels are preventing Spotify from offering US users the same service which millions of Europeans have fallen in love with. It's true that Spotify has been struggling to drive up revenue figures and that ad-revenue alone can't cover its costs, which, again, are mostly due to licensing fees, not the actual costs of the service.

The company is trying to please the labels, which are pushing for a subscription model in the US - meaning that, first, Americans will have to wait longer for the service, and second, they may only get a severely crippled version, if they get one at all.

The Pirate Bay Goes More Distributed

Excerpted from TechDirt Report by Mike Masnick

So this is interesting. The folks at The Pirate Bay (TPB) have shut down the site's tracker for good and switched entirely to a decentralized system called a distributed hash table (DHT).

As others are noting, this is quite a milestone, but I actually wonder if it will also have legal implications. Basically, using such a distributed system takes TPB even further out of the equation in terms of its role in the sharing of content, and in theory could impact the ruling against TPB.
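
For readers unfamiliar with how a DHT replaces a tracker: BitTorrent's mainline DHT is Kademlia-based, where node IDs and torrent infohashes share one 160-bit ID space and "closeness" is XOR distance. The following toy sketch shows the core lookup idea; real implementations add routing tables, iterative queries, and much more:

```python
# Toy sketch of a trackerless DHT lookup. In a Kademlia-style DHT, the
# nodes "closest" to an infohash (by XOR distance) track its peers.

import hashlib

def dht_id(name: str) -> int:
    """Derive a 160-bit identifier, as used for both nodes and infohashes."""
    return int.from_bytes(hashlib.sha1(name.encode()).digest(), "big")

def closest_nodes(target: int, known_nodes: list, k: int = 3) -> list:
    """The k known nodes whose IDs are XOR-closest to the target."""
    return sorted(known_nodes, key=lambda n: dht_id(n) ^ target)[:k]

# To find peers for a torrent, a client queries the nodes closest to the
# torrent's infohash - no central tracker is consulted at any point.
infohash = dht_id("some-torrent")
print(closest_nodes(infohash, [f"node-{i}" for i in range(10)]))
```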

Of course, the entertainment industry will say it doesn't matter, and the courts (which don't seem to understand these things very well) might not realize the difference, but it is meaningful in terms of how involved TPB actually is in the activity that's happening. But even if this makes no difference in how the courts view TPB (as expected), it does show the inevitable trend of these things: they become ever more decentralized and harder to shut down.

When the RIAA shut down Napster, what came out of it was even more decentralized and harder to stop. Now the same thing is happening with the attempted shut down of TPB.

Even if you don't like what sites like TPB do, at some point you have to wonder: what good does it do to keep shutting down these offerings when all it does is drive people to the "next" offering that's even more difficult to stop?

At some point, someone is going to get the message that you can't stop this stuff. So why not figure out a way to use it to your advantage?

Ofcom Talks to Spook Firm on File-Sharing Snoop Plan

Excerpted from The Register Report by Chris Williams

The UK's Ofcom has held talks over a monitoring system that would peer inside file-sharing traffic in an attempt to determine the level of copyright infringement, in preparation for new laws designed to protect the music, film, and software industries.

The Digital Economy Bill, published by Lord Mandelson on Friday, requires the communications regulator to measure how file sharers who exchange copyrighted material respond to a regime of warning letters.

If the overall level of infringement is not cut by 70% in a year, further provisions will be triggered, compelling Internet service providers (ISPs) to impose speed restrictions after warnings. Internet access will be suspended for the most persistent infringers.

Detica, a BAE subsidiary specializing in large-volume data gathering and processing, is aiming for a central role in implementing the plan. The well-connected firm has developed CView, a deep packet inspection (DPI) product that looks at the actual content of file-sharing traffic to try to determine whether it is copyrighted and, if so, whether or not it is licensed, before calculating the overall level of infringement on a network.

The cost of such a system could be shared between ISPs and rights holders, Detica suggested in its September submission to the Department for Business, Innovation and Skills' (BIS) consultation on copyright infringement via file-sharing networks.

It explained that CView "applies high volume, advanced analytics on anonymized ISP traffic data, and aggregates this information into a single measure of the total volume of copyright infringement." By examining the content of communications, it would measure copyright infringement via newsgroups, as well as via BitTorrent and other P2P protocols.
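
Detica has not disclosed how CView classifies individual flows (a point returned to below), so the following is a purely hypothetical illustration, with invented data, of how per-flow classifications could be aggregated into the "single measure" the submission describes:

```python
# Purely hypothetical illustration of aggregating anonymized per-flow
# classifications into one infringement measure. Nothing here reflects
# how CView actually classifies traffic; Detica has not disclosed that.

flows = [
    # (protocol, bytes, classification) - all values invented
    ("bittorrent", 700_000_000, "unlicensed"),
    ("bittorrent", 200_000_000, "licensed"),
    ("newsgroup",  100_000_000, "unlicensed"),
    ("http",       900_000_000, "not-copyrighted"),
]

unlicensed = sum(b for _, b, c in flows if c == "unlicensed")
total = sum(b for _, b, _ in flows)
print(f"Infringing share of traffic: {unlicensed / total:.1%}")
# A regulator could baseline this figure before warning letters go out,
# then re-measure to see whether the 70% reduction target is being met.
```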

"Detica would like to explore with BIS and Ofcom how CView could be used to baseline the level of unauthorized file-sharing activity - ahead of the proposed notification process - and thus measure the impact this remedial action has on file sharing," it added.

Under the government's proposals, it's envisaged that rights-holder organizations such as the BPI and FACT will harvest the IP addresses of infringers from BitTorrent swarms. Detica suggested that a drop in the number of IP addresses collected would not be an accurate enough measure of the impact of the subsequent warning letters.

Ofcom told The Register it had met the firm to learn more about CView, in expectation of the Digital Economy Bill becoming law. "I can confirm we have met with Detica and other stakeholders," a spokeswoman for the regulator said.

In its submission to BIS, Detica said it planned to test its system with a UK ISP soon. "CView is targeted to move into beta trial with a UK ISP during the autumn of 2009," it said.

BT, which in common with most ISPs uses other Detica products, declined to confirm or deny whether it is the ISP referred to in the submission. "BT is not using or trialing CView," it said. When pressed, a spokesman added, "You'll have to ask Detica what they meant... It's not our document."

It's understood that Sky and Virgin Media, the two other ISPs which are most active in development of DPI-based "network intelligence", are not the trial firm referred to by Detica's consultation response.

Today Detica said it was unable to discuss details of the system beyond the document, but said the beta trial had not yet started.

The Register was particularly interested in details of how CView determines whether file sharing traffic is authorized or not, which is not explained in the consultation submission.

It's also unclear whether Detica envisages Ofcom measuring unauthorized file sharing across all ISPs, or by sampling a single network.

While the emphasis in the consultation response is on gauging the overall level of unauthorized file sharing - more of interest to the government and regulators than ISPs - Detica also claims CView could help ISPs operate a "carrot and stick" approach to rein in infringers and profit. It is clear the firm aims for a central role in future commercial deals between ISPs and the record industry.

"We have also been in active discussion with a number of the major music labels, the BPI and the UK Music and Performing Rights Society," it said.

A spokesman for the BPI confirmed it had a meeting about CView and found the technology "interesting."

The licensing of ISPs by rights holders is an obvious new business for Detica to target. With its close links to intelligence agencies and law enforcement, it is also closely involved in the Home Office's controversial Interception Modernization Program, which aims to use DPI to capture and store details of every communication that takes place online. There is no suggestion of a direct link between that project and CView, however.

Monitoring unauthorized file sharers' behavior would mean they could be targeted with "tailored products and services" - such as licensed music services - and hit with "tailored remedial actions" such as bandwidth restrictions, expected to be part of the enforcement system the government imposes on ISPs.

Such targeted applications, dependent on linking ISP subscribers to surveillance of their Internet activity, are likely to prove the most controversial aspect of products like CView. Aiming to head off concerns, Detica said it would operate on an anonymous basis with no data stored.

"CView does not, and cannot, identify individual Internet users," it told the government.

It appears CView would however classify users, making targeting them for licensed music or film services possible. The approach seems similar to Phorm's targeted advertising system and relies on the same foundation of DPI technology.

Further announcements about CView, perhaps concerning the trial, are expected from Detica soon.

Suing File Sharers Is Like Terrorism

Excerpted from The Inquirer Report by Ed Berridge

A top attorney for Viacom told a gathering of Yale law students that suing file sharers in federal courts "felt like terrorism."

Michael Fricklas, the company's General Counsel, said he's a huge fan of fair use, doesn't want to take down YouTube mash-ups, and has no plans to start suing file sharers in federal courts. But he still loves digital rights management (DRM) and "three-strikes" laws.

He told the Yale Law School audience that suing end-users for online copyright infringement was "expensive and painful and it felt like bullying."

Fricklas said the way it came across to the public when some college student went up against "very expensive lawyers and unlimited resources" was very bad and "felt like terrorism."

According to Ars Technica, Fricklas said that customers "need to be treated with respect."

Coming Events of Interest

Future of Film Summit - December 8th in Santa Monica, CA. This inaugural event brings together top executives, creators and professionals from major and independent movie studios, film distributors, talent agencies, law firms, financiers and digital media companies for high-level discussions and debate.

P2P MEDIA SUMMIT at CES - January 6th in Las Vegas, NV. The DCIA's seminal industry event, featuring keynotes from top P2P, social networking, and cloud computing software companies; tracks on policy, technology, and marketing; panel discussions covering content distribution and solutions development.

2010 International CES - January 6th-10th in Las Vegas, NV. The industry's largest educational forum to help companies expand their businesses and understand new technology. More than 200 conference sessions and over 300 expert speakers are featured at International CES.

MIDEM & MidemNet - January 23rd-27th in Cannes, France. MIDEM is where music professionals from across the industry meet face-to-face to do business, analyze trends, and build partnerships. MIDEM brings together music leaders looking for concrete solutions and insights. MidemNet's renowned digital business conference program is now included free with your MIDEM registration.

P2P MARKET CONFERENCE - March 9th in New York, NY. Strategies to fulfill the multi-billion dollar revenue potential of the P2P and cloud computing channel for the distribution of entertainment content. Case studies of sponsorships, cross-promotion, interactive advertising, and exciting new hybrid business models.

Media Summit New York - March 10th-11th in New York, NY. MSNY is the premier international conference on media, broadband, advertising, television, cable & satellite, mobile, publishing, radio, magazines, news & print media, and marketing.

Copyright 2008 Distributed Computing Industry Association