Distributed Computing Industry
Weekly Newsletter


August 11, 2008
Volume XXIII, Issue 2


Pando Networks Receives 2008 Innovator's Award

The Distributed Computing Industry Association (DCIA) announced Monday that Pando Networks is the recipient of the 2008 DCIA Innovator's Award. The award was presented at a conference luncheon ceremony to Pando Networks' CEO Robert Levitan at the inaugural P2P MEDIA SUMMIT Silicon Valley.

The DCIA Innovator's Award is presented annually to the company that best epitomizes the overall advancement of distributed computing technologies for commercial purposes.

Pando Networks, through its leadership role in the P4P Working Group (P4PWG), has engaged Internet service providers (ISPs) in the deployment of peer-to-peer (P2P) related technologies. Pando has also been a leader in educating content owners about the benefits of distributed computing.

With the development of the Pando Content Delivery Suite, commercial content owners everywhere can now seamlessly add secure peer-assisted delivery to their existing content delivery networks (CDNs).

Pando Networks has exemplified innovation in the advancement of P2P technologies, underscoring the importance of distributed computing and the vital role it will play in media delivery going forward.

Doug Pasko, Verizon Senior Technologist and Co-Chair of the P4P Working Group, said, "Pando Networks has been instrumental in bringing together the ISP and P2P communities to help address critical issues impacting the evolution of the Internet."

"Pando defied the rhetoric by seeking a cooperative industry-based solution which now offers great promise for improving the quality of user experience while simultaneously addressing issues of network resource utilization. Our co-chaired working group should serve to remind the industry that we are all seeking the same thing; viable solutions to provide the best possible experience for our customers."

The CDN of the Future Will Include P2P

Excerpted from Contentinople Report by Ryan Lawler

Pando Networks' CEO Robert Levitan says the dirty little secret about online video is that "the business model is really bad." He attributes this to today's delivery model where, he says, "the more video you deliver, the more money you lose."

In a rallying-the-troops speech Monday at the first-ever DCIA P2P MEDIA SUMMIT Silicon Valley in San Jose, CA, Levitan said that P2P technologies would be necessary to enable wide-scale delivery of video content.

While he says CDNs like Akamai have developed very sophisticated edge networks in order to speed the delivery of rich media, they're not well prepared for the massive growth of high-definition (HD) video to come. "You need to go beyond the edge and go right to the desktop."

That's where P2P comes in. According to Levitan, "The CDN of the future is one that can deliver high-quality media; it can scale; it can lower costs, and it has to use some P2P protocols to make that happen."

One thing P2P companies have to do, he says, is "get ISPs friendly with P2P delivery that is actually good for their networks." Despite progress that has been made with projects like the DCIA P4P Working Group (P4PWG), Levitan says there's still work to do.

But that's only one part of the equation. "For legitimate P2P adoption to really take off, we need the content owners," he says.

Despite the usage of P2P by companies like NBC Universal and the British Broadcasting Corporation (BBC) in their delivery plans, other content owners have been slow to adopt P2P as a delivery technology.

"Content owners want someone else to take the first step," Levitan says. Once one company takes the plunge, he predicts, there will be a rush towards P2P adoption.

Until one takes that first step and proves that P2P can be used in a successful video delivery model, however, Levitan says technology companies must keep moving P2P forward by working towards technology standardization and market development.

Report from CEO Marty Lafferty

Thanks to all who participated in the very stimulating P2P MEDIA SUMMIT Silicon Valley in San Jose, CA this week, as well as the highly productive P4P Working Group (P4PWG) meeting held in conjunction with that event.

Congratulations, too, to CEA and Digital Hollywood for the very informative Building Blocks conference.

Presentations from the following speakers are now online here: Kontiki's Eric Armstrong, mBit's Chunyan See, Ignite Technologies' Fabian Gordon, Packet Exchange's Chuck Stormon, Octoshape's Stephen Alstrup, Leaf Networks' Jeff Capone, Lime Wire's Brian Dick, TVU Networks' Dan Lofgren, Bingham's Joshua Wattles, Motorola's John Waclawsky, and Microsoft's Ravi Rao. Abacast is in the process of editing videos from the conference, and we will alert DCINFO readers when they are posted.

We are especially grateful to the exceptional speakers who participated in our Next Generation P2P panel at Building Blocks. These included Kristin Rolla, Director of Business and Content, AOL Television and Moviefone; Stephen Alstrup, CEO, Octoshape; Daniel Leon, Head of Strategic Partnerships, Hiro-Media; Bill Wishon, Director of Marketing, Kontiki; Chunyan See, CEO, mBit; and Scott Sahidi, CEO of ioko's North America operation.

Kristin discussed her current work with movie studios while also looking forward to the next steps towards providing faster, more convenient content systems, which involve P2P. Sampling of content needs to be driven by the lowest-cost distribution solutions, while purchased content needs to be supported with the highest-quality options.

End users' and content suppliers' needs are different and must be addressed differently. We must not let bad-apple copyright infringers spoil things for the whole bunch of technology innovators. It is important to stay in front of a trusted audience while outstanding distribution issues are worked out. A major content company should invest in a P2P company, such as one of the other panelists, to help accelerate progress. More marketplace experimentation is called for.

Stephen defined Octoshape as a content delivery network (CDN) service that uses P2P. Reducing costs while also delivering the highest quality are its mantras for success. Octoshape builds in redundancy, or multiple fail-over capacity, to achieve higher reliability than competitors. Historically, unlicensed P2P file sharing programs and content owners have been like fast-moving cars being pursued by police on foot, and the solution needs to involve putting police into the cars.

The costs of online distribution are becoming too great, as are the problems with scaling and security, as high-definition (HD) content is entering the mix, and this is where P2P can successfully compete. ISPs and P2Ps are now collaborating through the P4P Working Group (P4PWG) to optimize efficiency and localization. As P2P streaming takes costs below five cents per gigabyte, there will be tremendous uptake.

Daniel explained his Israel-based company's innovative offering as the download of advertising into P2P networks with digital rights management (DRM) that protects content and commercials so that material can be safely distributed without the threat of stripped ads. The industry is at an early stage of development, but already it's clear that only 2% of people will pay for content, and therefore the focus should be on developing ad-supported P2P distribution.

Marketers and advertisers need to learn to trust that content can be securely distributed and monetized in the manner that Hiro-Media supports. Trying to impose regulatory constraints that underestimate the intelligence of the marketplace will be doomed to failure. Instead, realism is needed along with a focus on learning new ways to make money. Each day that a content owner lets pass without exploiting commercial P2P is a day that company loses money. Interested parties should join the DCIA.

Bill described Kontiki's mission as building commercial P2P software that is the best at powering on-demand high-quality video services in the entertainment and enterprise markets. P2P is increasingly being regarded rationally as a technological tool, without the stigma that had been associated with earlier file-sharing applications because of their abuse for copyright infringement. Understanding that behavior is key to developing viable solutions to it.

The Internet is not without a moral code, but it is clear that technology is outpacing both old business models and attempts at legislative fixes. True collaboration and real cooperation among affected parties are called for to make progress. Education of content rights holders is extremely important at this stage along with fresh approaches and new thinking with regards to monetization models. ISPs can now see enormous savings through P4P.

Chunyan shared that Singapore-based mBit is a provider of a P2P search and file-sharing service for the mobile market. It is possible for cell phones to become the de facto transmission device for user-generated content (UGC) and also to be involved in very large video file delivery. Modestly priced mobile P2P subscription services can provide attractive new revenue streams for wireless operators. In China, for example, there are 600 million cellular customers, a very large market opportunity for such an offering.

The major US telecommunications laws date to 1996 and are seriously outmoded at this point. In Asia, so far the governments have not attempted premature intervention. Copyright holders are coming to realize that it is inevitable that P2P will be used for content distribution. By working with network operators through P4PWG, efficiency can be optimized even in wireless implementations. Premium and viral content sales can generate new revenue.

Scott outlined ioko's market position as a 300-employee, $100 million firm that leverages P2P for content delivery around the globe and helps its customers address such questions as client versus client-less, DRM versus DRM-less, etc. The industry is at an in-between stage, with costs for CDN services coming down while demand for increasingly large rich media files is going up. There will be a volume inflection point for peer-assisted hybrid-P2P CDN services and the market will ultimately catch up with the technology.

The advantages of P2P in enabling delivery of large amounts of content will prevail. Technological change will continue to outpace the law's ability to control it. P2P will come to be viewed as a weapon in the content delivery arsenal and its role in the overall distribution platform will become clearer. The technology sector needs to continue to reduce burdens on end-users by improving interoperability and efficiency.

Audience Q&A centered on recently announced experiments with metered Internet usage and how incentives will have to be aligned for both subscribers, whose behavior should be encouraged to use network resources as efficiently as possible - which will mean substantial use of P2P - and carriers, whose return on investment (ROI) must be rationalized for expanding and upgrading infrastructure - which will also mean substantial use of P2P. Share wisely, and take care.

P4P Test Results Improve, Execs Say

Excerpted from Contentinople Report by Ryan Lawler

Joint trials with Internet service providers (ISPs) aimed at driving efficiencies in their networks via P2P technologies are working even better than expected, according to industry executives.

At the inaugural DCIA P2P MEDIA SUMMIT Silicon Valley in San Jose, CA Monday, execs said the second round of service provider trials in the P4P Working Group (P4PWG) came off even better than the first.

The first trial, held in February with the participation of Verizon Communications and Telefonica Group, showed a 50% increase in P2P efficiency when the file-sharing protocol was made network-aware and drew from local sources.

In an interview with the Associated Press in March, Verizon senior technologist Doug Pasko said that with traditional P2P, only 6.3% of all data comes from other Verizon customers. In the P4P trial, Pasko said this number improved to 58%.

A second trial was held in June, which included AT&T and Comcast. Laird Popkin, CTO of Pando Networks and Co-Founder of the P4P Working Group, said that he couldn't give details without the ISPs' approval, but that the results of the second trial were very good.

"So far, the ISPs love the idea of reducing the cost of P2P on their infrastructure," Popkin says.

According to Popkin, the second trial was also effective in showing improved peer efficiencies in asymmetrical traffic environments, where downstream bandwidth is much higher than upstream bandwidth.

"P4P gave much better performance to customers without having any impact on uplink speed, so we're pretty comfortable with cable, fiber, or DSL," Popkin says.

While the first two rounds focused on the transmission of on-demand content using P2P protocols, the next trial will focus on live, P2P-based streaming, according to Abacast CEO Mike King.

Live streaming via P2P provides different challenges, due to the way live streams are downloaded. Unlike on-demand P2P downloads, which can pull bits from peers wherever they are and aren't as constrained by the sequence in which those bits are downloaded and assembled, live P2P technologies depend on peers being close to the end-user and bits being assembled in order.

"There's never been a live P2P technology that randomly selected peers," King says. "We also can't receive bits from the end of the file. It's very much time-sequenced, and timing is very important."

The live test will be important, particularly as CDNs look towards P2P as a way to boost their live-streaming capabilities.

Last month, Velocix and CDNetworks announced partnerships with P2P technology firms that will allow them to offer live streaming via P2P.

Kontiki Presents at IEEE International Conference

Kontiki, the leading provider of managed peer-assisted delivery for high-quality video and digital content, participated in a discussion of the current research, development, and deployment status of P2P applications at the seventeenth annual International Conference on Computer Communications and Networks on August 6th in St. Thomas, VI.

Kontiki's Vice President of Corporate Development and Strategy, Harvey Benedict, joined P2P industry leaders and Internet service providers (ISPs) to discuss the growing popularity of P2P systems and technological requirements in providing a more scalable environment, while reducing pressure on ISPs.

Attendees gained a better understanding of the challenges and opportunities of P2P content delivery and learned about growth possibilities and new P2P research.

IETF: Find More P2P Bandwidth, Use Sparingly

Excerpted from Ars Technica Report by Iljitsch van Beijnum

During its 72nd meeting this week in Dublin, the Internet Engineering Task Force (IETF) held two "birds-of-a-feather" (BoF) sessions about bandwidth for P2P applications. The first was about finding more bandwidth for P2P apps, the second about how such apps can use less of it.

The topic for the Application-Layer Traffic Optimization (ALTO) BoF was to come up with a way for ISPs to signal which network paths P2P applications such as BitTorrent should use as their first choice. BitTorrent typically sets up several dozen connections to BitTorrent users elsewhere. It then tries to download over each connection, but it only uploads data over about four connections at a time. If BitTorrent manages to set up connections to destinations reachable over high-bandwidth paths when starting a download, it can reach much higher download speeds more quickly.
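As a rough illustration of how such an "oracle" might be consulted, the Python sketch below sorts candidate peers by an ISP-supplied cost table. Since ALTO was only at the BoF stage, the data shapes, scores, and function names here are assumptions for illustration, not anything from a specification.

```python
# Illustrative sketch only: the ALTO work described above was still at the
# BoF stage, so everything here is an assumption, not a spec.
import ipaddress

# An ISP-run "oracle" might map network prefixes to a cost score,
# where lower means a cheaper or less congested path.
ORACLE_PREFERENCES = {
    "203.0.113.0/24": 0,   # same ISP, same metro: best choice
    "198.51.100.0/24": 1,  # same ISP, different region
    "0.0.0.0/0": 9,        # everything else: most expensive default
}

def preference(peer_ip: str) -> int:
    """Return the oracle's cost score for a peer (lowest is best)."""
    addr = ipaddress.ip_address(peer_ip)
    score = max(ORACLE_PREFERENCES.values())
    for prefix, cost in ORACLE_PREFERENCES.items():
        if addr in ipaddress.ip_network(prefix):
            score = min(score, cost)
    return score

def rank_peers(candidates):
    """Order candidate peers so preferred paths are tried first."""
    return sorted(candidates, key=preference)

print(rank_peers(["198.51.100.7", "192.0.2.44", "203.0.113.9"]))
# -> ['203.0.113.9', '198.51.100.7', '192.0.2.44']
```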

On the surface, an "oracle" server that knows which paths have the best bandwidth might seem to be a bad thing for ISPs like Comcast that have been trying to control P2P bandwidth. But that's not necessarily the case: the all-knowing oracle could also be equipped to guide P2P traffic to avoid congested parts of the network and use cheap or under-utilized network paths instead.

The idea still has to face a host of open questions, so it's not a given that the "oracle" service will ever see the light of day (or the inside of the tubes).

A couple of days later, the Techniques for Advanced Networking Applications (TANA) BoF discussed implementing a scavenger service for P2P traffic. Under normal circumstances, packets flowing over the Internet receive a best-effort service: the routers do what they can to deliver the packets.

There has been plenty of work in the IETF to allow for better than best-effort service, but the idea behind the "scavenger service" is different. If you can give certain packets a worse than best-effort service, they won't get in the way of packets sent by other applications. So it's no problem to fill up network links with packets that are marked with a code requesting the scavenger service - if there's not enough bandwidth, the scavenger packets are the first to be thrown away.

Unfortunately, most routers aren't set up to support this scavenger service, and even if they were, using the service requires complex settings in the few applications that support it, like Azureus. Solution: rather than let the routers throw out scavenger packets as needed to keep the tubes from getting clogged, implement a scavenger service in TCP's congestion control.

Normal TCP behavior is to increase its transmission speed relatively slowly, and then back off quickly when packets are lost, presumably because the network couldn't handle the rate at which TCP was sending. The scavenger service is implemented by simply ramping up more slowly and backing off even more quickly than with regular congestion control. There seemed to be the prerequisite "rough consensus" in the BoF that this idea has some merit.
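The toy simulation below illustrates the principle (it is not any standardized algorithm, and the constants are invented): a scavenger flow grows its congestion window more slowly and shrinks it more sharply than ordinary TCP, so it yields whenever the link is contended.

```python
# A toy model of the idea, not a standardized algorithm; constants are invented.

def next_cwnd(cwnd: float, packet_lost: bool, scavenger: bool) -> float:
    """Update a congestion window once per round trip."""
    if packet_lost:
        # Standard TCP halves its window on loss; the scavenger backs off harder.
        return max(1.0, cwnd * (0.25 if scavenger else 0.5))
    # Standard TCP adds one segment per round trip in congestion avoidance;
    # the scavenger ramps up at a fraction of that rate.
    return cwnd + (0.25 if scavenger else 1.0)

# Simulate 30 round trips with a packet loss every tenth one.
for flavor in (False, True):
    cwnd = 1.0
    for rtt in range(1, 31):
        cwnd = next_cwnd(cwnd, packet_lost=(rtt % 10 == 0), scavenger=flavor)
    print("scavenger" if flavor else "standard ", round(cwnd, 2))
```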

Unfortunately, there is a slight complication. TCP congestion control is implemented in the kernel of an operating system. Changing this code is always tricky, but not a huge deal if you have the kernel source, as with Linux, BSD, and even Mac OS, but it gets much harder in the case of Windows. Microsoft has implemented a number of changes to TCP in Vista, but it seems unlikely it would push out an additional TCP congestion control algorithm in an update - this would probably be a feature in Windows 7, if that. And it's unlikely many Mac users will be compiling their own kernel just so their BitTorrent downloads are friendlier on the network.

The alternative would be to use the only other protocol that gets through firewalls and NAT devices reliably: UDP. In the BoF, some argued that this is the only workable choice, while others expressed disgust at re-implementing TCP over UDP just because it's inconvenient to wait for the OS vendors.

Getting a BoF group approved is the first hurdle to jump in having a new working group chartered in the IETF. Now that the BoFs have taken place, it's up to the IETF's leadership to determine whether this work should find a permanent home.

P2P Traffic Coming to Forefront in 3G World

Excerpted from Fierce Broadband Wireless Report by Lynette Luna

It's interesting to see the issues mobile operators will face as 3G services proliferate in the US. Mainly, customers are beginning to do what they want with their broadband connections, even when doing so violates operators' restrictive user agreements.

One example is an iPhone application that allows Apple's iPhone users to share EDGE or 3G connections with other devices to create a portable WiFi hotspot of sorts. The app briefly appeared on Apple's App Store but was pulled minutes later, then restored the next day. Such an application violates user agreements.

3G operators will continue to face growing pressure over these issues as customers increasingly want to use these connections like they would their home connection, especially with devices that have user interfaces like the iPhone.

In fact, a new study released in late July by market research firm Pioneer Consulting says user-generated content (UGC) looms as a source of serious discontent for mobile operators.

Since a significant percentage of multimedia content on mobile devices is either user-generated or just stored on the device, Pioneer says that a growing number of subscribers are now exploiting alternative technologies like Bluetooth, WiFi, and WiMAX to effectively bypass operator networks when sharing their content with friends, family, and social networking contacts.

The study suggests subscribers who circumvent the traditional content value chain could rob operators of as much as $16.4 billion in potential annual revenues by 2012, more than a quarter of the projected total revenue for the year in question.

The report argues that operators must re-evaluate the relevance of the traditional client-server content delivery architecture in an environment where a growing chunk of media originates from the device.

Moreover, they must come to terms with the inevitable bandwidth bottleneck between the base station and handset brought on by an oversubscribed air interface.

Most important, the study says carriers must embrace P2P sharing within their networks.

Operators can't simply add more bandwidth. They need to add more spectrum and increase the capacity of their networks with more efficient technology. Does that mean everyone really has no choice but to accept these limitations on their wireless broadband?

Customers certainly won't accept them voluntarily. Will IP-based networks such as Long Term Evolution (LTE) technology be the answer?

In the meantime, if operators don't embrace new services like P2P, they will lose out on revenue and anger customers along the way.

Deciphering the Term "Cloud Computing"

Excerpted from TechRepublic Report by John Sheesley

"Cloud computing" is a bit of a catch-all term that can mean different things to different people. It's very conceptual in nature. Some vendors use the term interchangeably with the term "distributed computing."

Others substitute it for "utility computing" or "hosted computing." Still others use it when they mean "software as a service" (SaaS).

When you boil it down, "cloud computing" is really a mix of all those phrases. What's key is to understand what a particular vendor means by it.

The concept of "distributed computing" has been around for some time now. The idea is pretty simple: You take a bunch of computers and link them together for a specific use. All the computers share the same data, harnessing the power of their collective CPU cycles and storage space. You wind up with one giant virtual supercomputer or a massive network that's more than the sum of its parts.

You probably first heard of distributed computing through the SETI@home project, an effort of volunteers who have linked their computers together in the search for extraterrestrial intelligence. Each computer downloads a slice of data that's been gathered from the Arecibo Observatory and runs an algorithm that searches through the data for patterns that would indicate ET communications.
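As a toy illustration of that work-unit pattern (the "analysis" below is a stand-in for, not a rendition of, SETI@home's actual signal processing):

```python
# A toy version of the volunteer-computing pattern; the "signal detection"
# is a stand-in, not SETI@home's actual analysis.

def split_into_work_units(samples, unit_size):
    """The central server slices raw telescope data into independent chunks."""
    return [samples[i:i + unit_size] for i in range(0, len(samples), unit_size)]

def analyze(unit):
    """Each volunteer machine scans its slice for a 'pattern' (here, simply an
    unusually strong sample) and reports any hits back to the server."""
    return [value for value in unit if value > 0.9]

raw_data = [0.1, 0.95, 0.3, 0.2, 0.99, 0.4]  # pretend telescope samples
units = split_into_work_units(raw_data, unit_size=2)
hits = [hit for unit in units for hit in analyze(unit)]
print(hits)  # [0.95, 0.99]
```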

The key word in the term "utility computing" is the word "utility." Vendors use the term to put you in the frame of mind of a public utility like the phone company or the electric company. The amount of computer resources you use is metered, and you're charged for the usage. The more you use, the more you pay.

Most likely you're using a group of computers in a utility computing solution, either as a group of CPUs working together in a form of distributed computing or as a massive storage solution. However, it's also possible to have a single computer, such as a mainframe, at a vendor's co-location that you have access to.
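To make the metering idea concrete, here is a toy bill calculation; the rates are invented for illustration and belong to no actual vendor.

```python
# Toy utility-computing bill; the rates are invented for illustration.
RATE_PER_CPU_HOUR = 0.10  # dollars
RATE_PER_GB_MONTH = 0.15  # dollars

def monthly_bill(cpu_hours: float, gb_stored: float) -> float:
    """Metered charging: pay only for what you use, like a power bill."""
    return cpu_hours * RATE_PER_CPU_HOUR + gb_stored * RATE_PER_GB_MONTH

print(f"${monthly_bill(cpu_hours=720, gb_stored=50):.2f}")  # $79.50
```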

In networking, we sometimes use the term "cloud" to refer to any wide-area networking scenario such as a Frame Relay cloud. When used in the term "cloud computing," the word "cloud" refers specifically to the Internet. It's one of those cases where marketing folk have co-opted technical jargon for their own nefarious purposes.

"Cloud computing" is used by many different companies including Amazon, IBM, Salesforce, Sun, and Google, just to name a few. Interestingly enough, however, Dell has recently said that it owns the term. Chances are that such a trademark wouldn't hold up in court. Even so, it's interesting to see what an important term "cloud computing" has become - big enough that a major company wants to claim it as its own.

So what does it mean when you put the concept all together? "Cloud computing" boils down to little more than a way for you to take some of the work you're doing today on systems you run in-house and do it elsewhere.

When a hardware vendor like Sun or Dell uses the phrase, they'll mean it in more of the "utility computing" sense, where you're renting the services you need from an off-site provider.

Companies like Salesforce, Google, and Amazon lean more toward the "SaaS" meaning of the phrase, where you're running services that they provide on their equipment. IBM, for one, uses the term to mean whatever it takes to get your business. Either you can run your services on their machines, or they'll work with you to create custom apps that they also host.

Because so many companies use the term differently, it can be hard to keep up with its meaning. Here are some resources that discuss "cloud computing": IBM Introduces Ready-to-Use Cloud Computing, Computing Heads for the Clouds, How Cloud & Utility Computing Are Different, Dell Cloud Computing Solutions, Amazon Web Services @ Amazon.com, Is Google's Plan Realistic or Do They Have Their Heads in the Cloud?, Putting the Desktop in the Clouds, Cloud Computing: A Look at the Myths, How Much Is a Unit of Cloud Computing?, and Enterprise Cloud Computing Gathers Steam.

CBS Interactive News has a video with Dan Farber that covers some of the issues around "cloud computing."

When a vendor starts throwing terms like "cloud computing" at you, try not to let your eyes glaze over. Depending on what the vendor's main line of business is, they could mean several things with the term.

Just remember what "cloud computing" can mean to you - taking some of the systems that you're using now and handing them over to someone else. This can be a good thing from a management and budgetary standpoint, but it can be a bad thing from a security and reliability standpoint.

When you take all of it together, you can drill through the fog of the marketing meaning of "cloud computing" to see if the vendor is selling nothing but air.

File-Sharing Networks Return with Legit Music

Excerpted from Knowledge@Wharton Report

After the US Supreme Court declared in 2005 that Internet file-sharing sites Grokster and StreamCast had unlawfully aided their customers' efforts to share infringing copies of copyrighted music and video files, many commentators predicted the demise of businesses that depended on online file-sharing.

But the technology that Napster, the pioneer of music file-sharing, Grokster, and StreamCast unleashed has returned, supported by a business plan that respects copyright laws. Three years after the high court's ruling, several start-ups have found ways to make P2P file sharing lawful and perhaps profitable.

"Although the early use of P2P networks was for digital copyright infringement, P2P networks are increasingly being used for legitimate content distribution, including music, video, and software," writes Kartik Hosanagar, a Wharton Professor of Operations and Information Management, in a recent paper on P2P business models.

"For example, Grooveshark, rVibe, We7, and iMesh are firms that use P2P networks to distribute music to users. The music is licensed from the music labels, the files are distributed from users' machines and the P2P firms provide software and billing. A number of technologies have also emerged to prevent unauthorized redistribution in P2P networks. As a result, distribution of digital products through P2P networks is likely to become more prevalent."

Still, Hosanagar says that these new commercial outfits need to tweak their business plans. He predicts that they will make more money if they are savvier about pricing content and about paying their customers to share it.

In "Dynamic Referrals in P2P Media Distribution," Hosanagar and two co-authors at the University of Washington, Yong Tan and Peng Han, create a mathematical model of the ebb and flow of supply and demand on a P2P network.

Their model suggests that file-sharing firms should often pay high fees to users who provide content to other users - sometimes higher than the retail price of the file itself - and that they should vary these referral fees and their prices according to the demand for particular files.

The three scholars have also recently completed a related paper on the topic titled, "Diffusion Models for P2P Content Distribution: On the Impact of Decentralized, Constrained Supply."

To appreciate their arguments, it helps to understand how P2P file sharing works.

Each computer in a P2P network acts as both a store and a customer. Users of these networks can provide content by making the files on their computers available to other users of the network. In return, they can use the network to reach into other customers' computers and copy the files that have been made available.

Participants in a P2P network can share any kind of content, though music has proved most popular. The P2P networks provide software that enables their customers to organize their content, search the network, and swap files with each other. These firms often charge a fee for each file acquired through the network, and use that revenue to pay royalties to content creators and referral fees to users who share their files.

An obstacle to the growth of these networks, the paper's authors write, is that many customers are eager to copy files but reluctant to make their own files available to others, a practice called free riding. The free riders may have files that other users would like to have but can't find.

rVibe and Grooveshark try to induce free riders to share files by paying referral fees for making the files available to other customers. But Hosanagar says that they may not be paying enough. "rVibe pays 5 cents for a 99-cent track. Grooveshark has recently changed its policy, but I think it was originally 10 cents a song. Our conclusion is that you want to offer really high payments early on and that payments shouldn't be fixed. We found in many cases that the referral payment could be higher than the price. For example, you might initially pay $1.50 for a 99-cent file."

Why would a company pay more for a product than the eventual sale price? Is that not a recipe for bankruptcy? Not at all, Hosanagar says. It's a logical response to the law of supply and demand. "The initial high payment brings in a lot of distributors, and it gets them to share files when there's scarcity."

As a result, he says, the firm ensures that its customers can get the files they want and don't shop elsewhere or resort to unauthorized file sharing. By encouraging early distribution, the arrangement also feeds the buzz that any media company seeks: as more and more people hear a song or see a video, they may recommend it to friends who then may buy it, too. Those users, in turn, distribute the file, scarcity abates, and the firm can gradually reduce the referral fee.

Retail prices for content on a P2P network should work similarly, Hosanagar says. That is, they should be flexible and reflect demand. Firms would then have two pricing options. If they want to create buzz, they may initially set a low price for files to encourage people to buy them. Or they may decide to price based on scarcity, charging more early on to capture maximum revenue from the zealots who'll snap up anything new from a favorite artist.

Later, once that hardcore demand has been sated, they could charge less. "The optimal strategy seems to be to price the product low at first and pay the majority of the amount collected to the P2P distributor," Hosanagar says.

"So initially your profits are lowest. Over time, you increase your price and reduce your referral payments." On the Internet, firms can easily implement flexible prices. "In the past, it was difficult to do customized pricing," notes Hosanagar. "The music industry has generally been one that was slow to adapt."

Anyone who has bought a song from the popular iTunes store might wonder why P2P firms don't forgo all of these hassles and organize themselves like Apple, with tightly policed central servers. Apple controls the content on its website and sets the terms of use, including its famed fixed price of 99 cents for any song.

Its customers can then count on songs always being available. "It's extremely expensive to distribute media centrally," Hosanagar notes. "An Apple has the capital and expertise to manage that, but not everyone does." In contrast, P2P "allows a firm to efficiently distribute media at a relatively low cost," Hosanagar and his co-authors note. "Further, the distribution infrastructure automatically scales as new consumers join the network."

What's more, organizing a company as an enabler of P2P sharing potentially provides a wider menu of offerings for customers. "Apple is going to find it impractical to negotiate deals with a lot of independent, unknown artists, while with a Grooveshark or rVibe, the independent artist signs up for the service, creates an account and uploads his songs," says Hosanagar.

That allows the artist's fans, no matter how small a group they are, to download songs and pay whatever price the artist and Grooveshark agree to, with Grooveshark taking a small percentage of each sale.

Likewise, that obscure musician will have a tough time arranging a distribution deal with iTunes. "Apple isn't going to want to negotiate contracts with a bunch of unknowns," Hosanagar points out.

The most ardent advocates of P2P networking argue that it will eventually displace Apple because consumers will tire of Apple's inflexible pricing, and content providers will rebel against its stranglehold on online distribution.

"Personally, I don't see the P2P model displacing iTunes," Hosanagar says. "But I do think it's here to stay. I recollect a statement that I read in early 2000 where somebody said P2P was a solution in search of a problem.

That's moot now," with many people around the world sharing files on legal P2P networks.

Still, the question remains: can sharing become a booming business like iTunes? Certainly many consumers want to swap files online, but it's not as clear that they're willing to pay to do so. The brief history of file sharing is littered with failed ventures.

For practical purposes, Napster, the brainchild of a Northeastern University freshman, created the niche. The service allowed anyone to swap - or in the opinion of the major music labels, steal - music files online. If you wanted a song by the White Stripes and someone on the network shared it, you could download and play it on your PC.

Napster kept, on a central server, an index of all the music available on its network, which made searching easy. Music labels sued, arguing that Napster was abetting the infringement of copyrighted music files. A federal court agreed and Napster was eventually liquidated.

When Grokster and StreamCast entered the market, they provided file-sharing software but didn't create central indexes. Their software allowed users to search each other's computers, seeking files that they wanted. When the major labels sued them, they argued that they couldn't control what people did with their software and, without central indexes, didn't even know. The US Supreme Court didn't buy their arguments and ruled that they, too, had abetted infringement.

But file sharing lived on. Solid estimates of the extent of unauthorized file sharing are scarce, but some range as high as one billion songs a year. rVibe and Grooveshark are trying to persuade at least some of these folks to purchase downloads by combining appeals to guilt and greed.

Grooveshark's website stresses that musicians go hungry if people don't pay for their music, and of course it pays those referral fees.

Rock musician Peter Gabriel has endorsed another approach. He's an investor in a P2P file-sharing firm called We7. Its customers can download free songs with short ads at the beginning.

"The revenue generated from these advertisements goes to artists, labels, and other rights owners," We7 explains on its website. "You get music for free, and the artist gets fairly paid." The ad disappears after four weeks. Or customers can elect to purchase a file outright and skip the ad. Users can also share files.

Which of these models will triumph? Hosanagar isn't sure.

"I can think of three or four outcomes we might see. There might be free content that's used to stimulate demand for the other things, like concerts and T-shirts. There might be free ad-supported content. Or there might be a model where you buy the songs, but it will not be the rigid pricing model that we see today. Or lastly, it might be a model where payment is on a per-play basis rather than a per-purchase basis."

Music Industry Should Embrace File-Sharing Services

Excerpted from Financial Times Report by Andrew Edgecliffe-Johnson

The music industry should embrace unauthorized file-sharing websites, according to a study of Radiohead's last album release that found huge numbers of people downloaded it in unlicensed formats even though the band allowed fans to pay little or nothing for it.

"Rights-holders should be aware that these non-traditional venues are stubbornly entrenched, incredibly popular and will never go away," said Eric Garland, co-author of the study, which concluded there was strong brand loyalty to controversial "torrent" and P2P services.

Radiohead's release of "In Rainbows" on a pay-what-you-want basis last October generated enormous traffic to the band's own website and intense speculation about how much fans had paid.

Garland urged record companies to study the outcome and accept that file-sharing sites were here to stay. "It's time to stop swimming against the tide of what people want," he said.

The study by the MCPS-PRS Alliance, which represents music rights holders, and BigChampagne, an online media measurement company, found that authorized downloads of "In Rainbows" were far exceeded by unlicensed torrent downloads of the album.

Almost 400,000 torrent downloads were made on the first day and 2.3 million in the 25 days following the album's release, compared with a full week's peak of just 158,000 for the next most popular album of the period.

"The expectation among rights-holders is that, in order to create a success story, you must reduce the rate of copyright infringement - we've found that is not the case," said Mr. Garland, chief executive of Big Champagne, who highlighted the benefits that Radiohead received from the album's popularity, including strong ticket sales for its concerts this year.

The findings could add impetus to rights-holders' efforts to license digital services that are at present beyond their reach, following the pattern of the MCPS-PRS Alliance's recent move to license YouTube, the Google-owned online video-sharing site.

"Developing new ways and finding new places to get something as opposed to nothing" was important, said Will Page, MCPS-PRS chief economist and co-author of the report.

Those new places could be P2P services or Internet service providers (ISPs), he added.

Record companies should ask themselves: "What are the costs and benefits of control versus the costs and benefits of scale?" said Mr. Page.

He also challenged the assumption that no other band could achieve the same benefits, saying Radiohead's experiment had reduced the marginal cost and risk for those following their lead.

He described the launch of "In Rainbows" as "stunt marketing at its best."

Vatata Launches P2P Set-Top Box Solutions

Vatata, a P2P technology solutions provider, has launched its P2P set-top box (STB) solutions. These can support secure, closed P2P networks for STB devices and also connect directly to public P2P networks, based on STB manufacturer specifications.

They can download media content and play it directly on television sets. The solutions include two components: front-end embedded software and a back-end system. The front-end is a set of multi-protocol, highly efficient P2P embedded software programs, including BitTorrent, eMule, Gnutella, and Vatata. The STB can access these P2P networks freely, download media content and play it, including both video-on-demand (VOD) and live programs.

Because most of these STB devices will be in homes and behind home gateways, NAT traversal performance will be more important than for PC desktop applications. Vatata extends public P2P protocols, not only to share resources among STB devices, but also to share resources between STB devices and public P2P networks.

This open approach, making P2P-STB-based networks a part of the public P2P network infrastructure, will help make the public networks more stable, while supporting P2P-STB devices' continued access to content resources, improving the ecosystem.

The front-end embedded software has been tested on many STB hardware platforms, including TI, Intel, NXP, and RMI, and runs well on all of them. The back-end P2P-STB management system performs public content access and analysis and includes a search engine for the public P2P networks, a real-time health-checking system for network links, a public P2P network transmission optimization system, an intelligent media-format identification system, and a user self-service portal system.

The search engine builds a database of public content links. Based on this database, the real-time health-checking system scans links in real time and selects those that are optimal, using them to speed up content downloading. Optimization of the public P2P networks involves two aspects: using technology such as P4P to optimize the network topology, and establishing cache-content super nodes for connected links.

Next, the intelligent media-format identification system filters the links database for formats suitable for specific STB devices. Users can use the self-service portal system to manage their STBs or customize their personalized channel menus.
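For illustration only, here is a sketch of what the health-check-and-select step might look like in practice; the field names, thresholds, and ranking criteria are assumptions, not Vatata's implementation.

```python
# Hypothetical sketch; field names, thresholds, and ranking criteria are
# assumptions, not Vatata's implementation.
from dataclasses import dataclass

@dataclass
class ContentLink:
    url: str
    alive: bool   # did the latest real-time health check succeed?
    peers: int    # how many sources currently serve this link
    kbps: float   # measured transfer rate

def choose_optimal(links, want=3):
    """Keep only healthy links, then prefer well-seeded, fast ones."""
    healthy = [l for l in links if l.alive and l.peers > 0]
    healthy.sort(key=lambda l: (l.peers, l.kbps), reverse=True)
    return healthy[:want]

links = [
    ContentLink("p2p://a", alive=True,  peers=12, kbps=480.0),
    ContentLink("p2p://b", alive=False, peers=30, kbps=900.0),  # dead link
    ContentLink("p2p://c", alive=True,  peers=4,  kbps=220.0),
]
print([l.url for l in choose_optimal(links)])  # ['p2p://a', 'p2p://c']
```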

Digicorp Youth Media Distributes Branded Content

Digicorp has entered into a content agreement with Footprint Worldwide, a branded entertainment and content company in Los Angeles, Beijing, and Shanghai that has a strategic partnership with sports marketing pioneer James Warsaw. This agreement serves as an important step in advancing Digicorp's China youth marketing strategy.

In June, Digicorp entered into a 20-plus-10-year cooperation agreement for exclusive advertising and media rights with China Youth Net, which positions Digicorp to market to China's 45 million upwardly mobile student elite, with direct P2P access online, on campus, and via mobile.

Through its wholly owned subsidiary Youth Media, Digicorp plans to offer advertisers and corporations direct and centralized access to this highly desirable but hard-to-reach demographic through a dedicated campus peer-to-peer television (P2PTV) network, campus events, and advanced mobile marketing.

Under the agreement with Footprint, Youth Media will receive both Chinese and Western music, television, sports, and other entertainment content for distribution through Youth Media's P2PTV network, as well as brand-sponsored live events to be held at college campuses throughout China.

Digicorp, Footprint, and James Warsaw will work to increase the visibility of such leagues as the NBA, NFL, MLB, tennis, and Major League Soccer among China's vast student population. Warsaw is the former head of one of the world's leading licensed sports marketing firms, Sports Specialties, which Nike acquired in 1993.

"We see Footprint and the relationship with Jim Warsaw adding great depth to our content offering, particularly in the area of branded on-campus events," commented Digicorp CEO Jay Rifkin. "This is part of our strategy to position Youth Media as a one-stop shop in China where brands can get multiple and targeted impressions with a single media buy: online, on campus, and on mobile."

"This deal with Digicorp gives us an immediate opportunity to bring the biggest stars in sports and international touring acts directly to the most sought-after demographic in China," commented Footprint President & Co-Founder Jackie Subeck. "Youth Media offers the most comprehensive and effective solution to break through China's youth market and build a loyal following."

BitTorrent Realigns to Support DNA & SDK

While layoffs are always a difficult course of action, BitTorrent has reduced its staff to better align its resources around its core content delivery infrastructure business. The company remains focused on generating the most value for its partners and customers to drive long-term success.

BitTorrent reduced its team by less than 20%, and those impacted were distributed across the organization rather than concentrated in a single department. The layoffs were unrelated to any ongoing discussions to divest a portion of the company's business.

The company has been involved in strategic discussions with potential partners who are interested in the BitTorrent online store. These discussions continue.

With the explosive growth of online video, P2P technology will continue to be an integral part of the Internet infrastructure as it enables the most efficient distribution of large files.

Not only is BitTorrent the global leader in the P2P space with the largest client footprint, it is also working closely with the world's leading ISPs, including Comcast, to implement solutions that will provide the best P2P user experience to accommodate all network topologies.

BitTorrent is seeing healthy demand for its Delivery Network Accelerator (DNA) service and its Software Development Kit (SDK), which brings rich Internet media to the TV.

It is working with many online video, gaming, software, and hardware companies to integrate BitTorrent technology. As such, its top priority is to deliver the most valuable and efficient solutions to the BitTorrent community and technology marketplace.

Switzerland Network Testing Tool

Developed by the Electronic Frontier Foundation (EFF), Switzerland is an open source software tool for testing the integrity of data communications over networks, Internet service providers (ISPs), and firewalls. It will spot IP packets that are forged or modified in transit between clients, inform you, and give you copies of the modified packets.

You can download the latest release of Switzerland here. Before you run it, be sure to check out the notes about privacy, security, and firewalls. Switzerland is currently in alpha release as a command line tool.

In other words, right now it is aimed at relatively sophisticated users. However, because it's an open source effort, EFF anticipates making it easier to use over time.

Switzerland is designed to detect the modification or injection of packets of data traveling over IP networks, including those introduced by anti-P2P tools from various vendors.

The software uses a semi-P2P, server-and-many-clients architecture. Whenever the clients send packets to each other, the server will attempt to determine if any of them were dropped, forged, or modified (if you're interested in how it does that, you can read the design document here).
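The sketch below illustrates the digest-comparison idea in a simplified form, assuming both endpoints report hashes of the packets they sent and received to the server; it is not EFF's actual wire protocol.

```python
# Simplified model of digest comparison; not EFF's actual wire protocol.
import hashlib

def digest(packet: bytes) -> str:
    return hashlib.sha256(packet).hexdigest()

def compare(sent_packets, received_packets):
    """Return digests seen only on the sending side (dropped or rewritten)
    and only on the receiving side (injected or rewritten)."""
    sent = {digest(p) for p in sent_packets}
    received = {digest(p) for p in received_packets}
    return sent - received, received - sent

sent = [b"hello", b"legit data"]
received = [b"hello", b"tampered data"]  # a middlebox rewrote one packet
missing, unexpected = compare(sent, received)
print(len(missing), len(unexpected))     # 1 1 -> one packet was modified
```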

Switzerland is a much more sophisticated successor to the pcapdiff software that EFF released last year. It automates many of the things that had to be done by hand with the earlier code.

One advantage this architecture has over other network testing tools is that it can spot arbitrary kinds of packet modifications in any protocol - it doesn't assume that the interference comes in the form of TCP reset packets or web-page modifications, and it isn't limited to BitTorrent or any other specific application. 

In the future, EFF expects it to offer a good platform for collecting statistics on bandwidth, bidirectional latency, jitter, and other traffic performance characteristics that might be signs of prioritization of some applications over others.

French P2P Movie Downloads Outnumber Ticket Sales

Excerpted from Digital Media Wire Report by Mark Hefflinger

French citizens downloaded 13.5 million movies from file-sharing sites during May, while French movie theaters sold 12.2 million tickets, Reuters reported, citing a study by French anti-piracy group Alpa. 

"This is a major phenomenon that could endanger the cinema and audiovisual industry," Alpa representative Frederic Delacroix told French newspaper Le Figaro. 

The study also found that an average of 10 million unauthorized movies are downloaded in France each month, about 66% of them American films and 19% French.

Central Michigan University Files Complaint

Excerpted from The Morning Sun Report by Lisa Satayut

Central Michigan University (CMU) has filed a complaint with the Michigan Department of Labor and Economic Growth (DLEG) against the Recording Industry Association of America (RIAA) for its role in the investigation of personal student information.

"We want to make sure we are protecting our students in every way we can," CMU spokeswoman Heather Smith said.

A number of CMU students are being pursued by a contractor hired by the RIAA for alleged copyright violations for downloading music. The contractor has been collecting the IP addresses of computers on CMU's network, Smith said.

The most recent subpoenas were sent to the university in March and May and had more than 20 names listed in each subpoena. "We answered those and then provided the names linked to the IP address," Smith said. She said the university had no choice.

"We have to give out those names, but it is not being done lawfully," she said of the investigative tactics of obtaining the subpoenas. The complaint filed by CMU's Assistant General Counsel, Mary Roy is requesting that the DLEG issue a cease-and-desist letter.

"They are not licensed to privately investigate. We want to cover all of our bases and make sure student rights are protected," Smith said.

In the complaint, Roy states that the investigation of CMU students has sought to determine the identity, conduct, and acts of the students; their involvement or responsibility for the alleged copyright infringement; and to secure evidence to be used against the students before a court in litigation involving the alleged acts of infringement.

"All of these activities would clearly constitute the activities of a 'private investigator' under the Private Detective Licensing Act (PDLA) and as such would require a license to engage in such activities," Roy stated.

On May 28th, the Office of the General Counsel at CMU received a letter from the RIAA that alleged 21 students had engaged in copyright infringement.

This practice has been challenged in federal court in at least eight states, including Michigan.

Another University Pushes Back

Excerpted from Digital Music News Report

Tufts University is now pushing back against the RIAA, the latest in a string of university challenges. In response to a subpoena demanding identities tied to various IP addresses, the Boston-based school noted that it would be impossible to properly identify the infringing students using the current network infrastructure.

The university indicated that its systems for matching an IP address to a specific user could be updated, though within the current structure, exact identifications remain difficult.

"We recognize the inherent limitations of the network data retention system that we are currently using, and are actively looking at possible adjustments," university Vice President Mary Jeka told a federal judge in a recent letter.

An earlier court decision actually addresses this very issue, and calls for network administrators to submit all possible suspects identified. But given the potentially large number of individuals identified by Tufts - up to seventeen in one case - the school has expressed some reservations about releasing the names to the court.

"We believe that it would be unfair to identify all possible individuals meeting the plaintiffs' criteria, given the low likelihood of identifying the guilty party," Jeka said.

Coming Events of Interest

International Broadcasting Convention - September 11th-16th in Amsterdam, Holland. IBC is committed to providing the world's best event for everyone involved in the creation, management, and delivery of content for the entertainment industry. Uniquely, the key executives and committees who control the convention are drawn from the industry, bringing with them experience and expertise in all aspects of the field.

Streaming Media West - September 23rd-25th in San Jose, CA. The only show that covers both the business of online video and the technology of P2PTV, streaming, downloading, webcasting, Internet TV, IPTV, and mobile video. Covering both corporate and consumer business, technology, and content issues in the enterprise, advertising, media and entertainment, broadcast, and education markets. The DCIA will conduct a P2P session.

P2P MEDIA SUMMIT LV - January 7th in Las Vegas, NV. This is the DCIA's must-attend event for everyone interested in monetizing content using P2P and related technologies. Keynotes, panels, and workshops on the latest breakthroughs. This DCIA flagship event is a Conference within CES - the Consumer Electronics Show.

Copyright 2008 Distributed Computing Industry Association