March 31, 2008
Volume XXI, Issue 7
BitTorrent & Comcast Agree to Collaborate
DCIA Member company and P4P Working Group (P4PWG) participant BitTorrent, the company behind the industry leading peer-to-peer (P2P) protocol, and Comcast, the largest US cable ISP and also an active P4PWG participant, have agreed to work together and with other P2P and ISP companies to more effectively address issues associated with rich media content and network capacity management.
The groundbreaking announcement from these respective industry leaders comes two weeks after DCIA Member companies and P4PWG co-founders Verizon Communications and Pando Networks announced initial results of the first P4PWG field tests, demonstrating dramatic improvements in P2P efficiency for rich media distribution.
Comcast will migrate by year-end 2008 to a capacity management technique that is protocol agnostic. "We will rapidly reconfigure our network management systems to a technique that is more appropriate for today's emerging Internet trends, and will refine, adjust, and publish that technique based on feedback and initial trial results," said Tony Werner, Comcast Cable's Chief Technology Officer (CTO).
"Recognizing that the web is richer and more bandwidth intensive than it has been historically, we are pleased that Comcast understands these changing traffic patterns and wants to collaborate with us to migrate to techniques that will be more transparent," said Eric Klinker, BitTorrent's Chief Technology Officer (CTO).
"Earlier this year, Comcast announced its plans for the aggressive deployment of wideband Internet services using the DOCSIS 3.0 standard, which we project will be available in up to 20% of Comcast's households by the end of this year," said John Schanz, Comcast Cable's Executive Vice President of National Engineering and Technical Operations. "Additionally, we plan to more than double the upstream capacity of our residential Internet service in several key markets by year-end 2008."
BitTorrent and Comcast will also step up their work with other P2P companies and ISPs to more efficiently distribute rich media content through their active participation in the P4PWG.
P4PWG active participants currently include Abacast, AHT-International, Alcatel-Lucent, AT&T, Bezeq International, BitTorrent, CableLabs, Comcast, Cisco Systems, GridNetworks, Joost, Juniper Networks, LimeWire, Manatt, Microsoft, Nokia, Orange, Oversi, Pando Networks, PeerApp, RawFlow, Rinera Networks, Solid State Networks, Telecom Italia, Telefonica Group, Thomson, University of Toronto, University of Washington, Velocix, VeriSign, Verizon, Vuze, and Yale University. There are also a similar number of P4PWG observers.
"In the spirit of openness and fostering innovative solutions, BitTorrent will enhance our client applications to optimize them for a new broadband network architecture and will publish these optimizations in open forums and standards bodies for all application developers to benefit from," said Ashwin Navin, President & Co-Founder of BitTorrent.
"P2P technology has matured as an enabler for licensed content distribution, so we need to have an architecture that can support it with techniques that work over all networks," said Werner.
BitTorrent and Comcast believe these technical issues can be worked out through private business discussions without the need for government intervention.
"BitTorrent and Comcast can serve consumers best by working together along with other P2P companies and ISPs to jointly develop more efficient networks and applications. This should prove to be a productive partnership that will provide consumers with a better Internet experience," said Doug Walker, CEO of BitTorrent.
"We appreciate the recent dialogue that we have had with BitTorrent and the progress that we have made in addressing our respective concerns. Working together, we can deliver a truly superior experience to all of our customers," said Steve Burke, President of Comcast Cable.
Report from CEO Marty Lafferty
The DCIA generally believes in the fundamental wisdom of relying on the marketplace rather than governmental intervention to advance commercial development of the peer-to-peer (P2P) distribution channel.
Advancements are proceeding too rapidly and the underlying technological considerations are too complex for this area to be an attractive candidate for effective tactical regulatory action.
We appreciate the support of Kyle McSlarrow, President & CEO of the National Cable & Telecommunications Association (NCTA), who applauded both BitTorrent and Comcast "for their willingness to work together on immediate solutions to challenges like traffic management, and for their leadership in working with others to look for opportunities that will continue to ensure a vibrant Internet."
As Kyle also noted, "Government interference in the development of this market could easily foreclose or otherwise prevent the emergence of efforts such as this one, and it could never anticipate the kinds of consumer-responsive approaches that further improve and enhance the user experience, including efforts to respect the rights of copyright owners and to fight piracy."
We are also pleased that Federal Communications Commission (FCC) Chairman Kevin Martin's reaction to this week's very promising announcement from BitTorrent and Comcast, coming on the heels of the related announcement of extremely positive P4P Working Group (P4PWG) field test results by Verizon and Pando, includes the commitment to remain vigilant in ensuring that consumers are enabled to access content of their choice from applications and services of their choice on the Internet.
We need to see to it that the promise of this bi-lateral agreement extends through the ongoing work of the P4PWG. Other current and future P2P companies need to be able to continue to develop their innovative applications and services without the fear of being blocked or throttled; and other current and future ISPs need to be able to continue to expand their capacity and capabilities without the fear of being prevented or discouraged.
More than anything, we need to provide certainty that P2P software and related offerings will consistently work over broadband networks without being subject to anti-competitive interference. Full functionality must be provisioned equitably to all P2P services and other web-based offerings without prejudice. An open and non-discriminatory Internet is critical to both consumers and innovators.
We agree most of all with Commissioner McDowell who said, "It is precisely this kind of private sector solution that has been the bedrock of Internet governance since its inception. Government mandates cannot possibly contemplate the myriad complexities and nuances of the Internet marketplace. The private sector is the best forum to resolve such disputes. Today's announcement obviates the need for any further government intrusion into this matter."
We also support Commissioner Tate's favoring of "competition and market forces rather than government regulation across all platforms especially in this dynamic, highly-technical marketplace" and her looking forward to "even more collaborative, industry-based solutions, which are often the most effective and efficient means of resolving complex, technical network disputes."
As Commissioner Adelstein noted, the FCC hearing on April 17th at Stanford University will afford the opportunity to learn more details about the BitTorrent and Comcast collaboration, and to continue encouraging broadband providers to listen to the chorus of consumer calls for open and neutral broadband Internet access. And as Commissioner Copps added, this process, which has brought together network operators, edge content providers, and consumers, will help clarify rules of the road to benefit American consumers and provide much-needed certainty to both ISPs and P2P entrepreneurs.
There's no question that public pressure played a role in this week's developments and, in our view, provided additional evidence that the FCC's open petitioning process helps the marketplace to work.
The bottom line here is that the FCC continues to have the responsibility to protect the rights of consumers against discriminatory practices, and the encouraging progress marked by BitTorrent and Comcast's announcement does not eradicate that. It is incumbent on all of us in relevant positions in the private sector, rather, to build on the promise of that agreement, as well as on the positive initial field test results from the P4PWG.
It is up to us now to ensure that opportunities for greater efficiency and benefits of ongoing innovation apply to all P2P companies and all ISPs in an open and inclusive way. Share wisely, and take care.
How ISPs Learned to Stop Worrying and Love P2P
Excerpted from Web World Report by Brad Reed
One month after Comcast aggressively defended its targeted use of TCP reset packets to delay or stop BitTorrent uploads at an FCC hearing, the company has reversed course and says it will stop targeting individual P2P protocols when managing network traffic.
Comcast's reversal came as a welcome development for BitTorrent, which had argued against the ISP's techniques at last month's FCC hearing on broadband network management practices.
Ashwin Navin, BitTorrent's President & Co-Founder, says his company has been negotiating with Comcast for more than two years on traffic-management issues, and that the recent media attention to Comcast's traffic-management practices served as "a catalyst" to announce the two companies' collaboration.
In return for Comcast's cooperation, he says that BitTorrent will work with other companies to develop more-efficient P2P technology that will place less of a burden on network architecture.
"We are particularly enthusiastic about Comcast's commitment to make their network management protocol agnostic - neutral to all applications - as well as their efforts to upgrade broadband speeds for both downstream and upstream traffic," Navin says.
"We will optimize our application to take advantage of their network upgrades and share those techniques with the broader Internet community."
Marty Lafferty, CEO of the Distributed Computing Industry Association (DCIA), says this newly announced collaboration between Comcast and BitTorrent was inevitable given the ever-increasing consumer and business demand for bandwidth and the potential of P2P protocols to deliver large files rapidly over the web.
"We're at the point right now where the advantages of P2P technology are so enormous that commercial development has to go forward," says Lafferty, whose organization sponsors the P4P Working Group (P4PWG) that is working with ISPs and P2P companies to optimize P2P content delivery.
"P4P is not an individual technology, but rather a set of practices that will enable ISPs to ensure the most-efficient possible delivery of payloads for their customers."
Typically, P2P technology such as BitTorrent distributes large data files by breaking them up into small pieces and retrieving them from multiple sources. After all the pieces are received, the file is reassembled as a whole.
While this method of file sharing is much faster and more efficient than relying upon one centralized server, it can cause traffic-management problems for ISPs because P2P protocols are mainly designed to download large chunks of data from sources wherever they can be found, and without particular regard to network efficiency.
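To make the piece-based model concrete, here is a minimal Python sketch of splitting a file into pieces and then verifying and reassembling them once every piece has arrived. The piece size and hashing details are illustrative only, not those of any particular client.

```python
import hashlib

PIECE_SIZE = 256 * 1024  # illustrative; real clients take the piece size from the torrent metadata

def split_into_pieces(data: bytes) -> list[bytes]:
    """Break a file into fixed-size pieces that can be fetched from different peers."""
    return [data[i:i + PIECE_SIZE] for i in range(0, len(data), PIECE_SIZE)]

def reassemble(pieces: dict[int, bytes], piece_hashes: list[str]) -> bytes:
    """Reassemble a file once every piece has arrived, checking each piece against its expected hash."""
    assert len(pieces) == len(piece_hashes), "download is not yet complete"
    out = bytearray()
    for index, expected in enumerate(piece_hashes):
        piece = pieces[index]
        if hashlib.sha1(piece).hexdigest() != expected:
            raise ValueError(f"piece {index} failed verification; re-request it from another peer")
        out.extend(piece)
    return bytes(out)
```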
These problems have led some ISPs to use controversial methods to either slow or stop P2P traffic on their networks. Last year, for instance, the Associated Press reported that Comcast had been employing technology that is activated when a user attempts to share a complete file with another user through P2P technology such as BitTorrent and Gnutella.
As the user is uploading the file, Comcast sends a message through TCP RST packets to both the uploader and the downloader telling them that there has been an error within the network and that a new connection must be established. Because the message does not appear to come directly from Comcast, many critics have accused the company of sending forged or spoofed packets that they say are deceptive to consumers.
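One heuristic researchers have used to check whether such resets were injected by equipment in the middle of the path, rather than sent by the real endpoint, is to compare the TTL of a reset against the TTLs seen on the rest of the flow. The sketch below assumes the scapy packet library and privileges to sniff traffic; it is illustrative only and is not the method any particular investigation used.

```python
from scapy.all import sniff, IP, TCP

first_ttl = {}  # (src, sport, dst, dport) -> TTL of the first packet seen on that flow

def flag_suspicious_resets(pkt):
    if IP not in pkt or TCP not in pkt:
        return
    flow = (pkt[IP].src, pkt[TCP].sport, pkt[IP].dst, pkt[TCP].dport)
    baseline = first_ttl.setdefault(flow, pkt[IP].ttl)
    is_reset = bool(int(pkt[TCP].flags) & 0x04)  # 0x04 is the RST bit
    if is_reset and abs(pkt[IP].ttl - baseline) > 5:
        print(f"RST on {flow} with TTL {pkt[IP].ttl}; the flow's first packet had TTL {baseline}")

# Requires root/administrator privileges; watches live TCP traffic without storing packets.
sniff(filter="tcp", prn=flag_suspicious_resets, store=False)
```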
But as Comcast's reversal this week shows, such traffic-management practices are increasingly going out of fashion, and many ISPs already have a policy of not targeting individual protocols when they manage traffic. Jeffrey Sopha, Manager of Network Development for wireless technology at Sprint Nextel, says that Sprint doesn't believe that it is in a position to "police the Internet" and that his company works to add capacity during peak times rather than slow targeted applications.
"We recognize that wireless is following the trends that wireline has followed for P2P traffic," he says. "So we take measurements in real time at various points throughout the network and we determine when it's the appropriate time to add more radio capacity, firewall capacity, and so forth."
At last month's FCC hearing, Tom Tauke, Verizon's Executive Vice President for Public Affairs, Policy and Communications, also said that his company did not use TCP RST packets to manage P2P traffic "because of the capacity of the network that we're currently deploying."
However, he also noted that the rapid growth of high bandwidth-consuming applications made it very difficult for ISPs to determine just how much to invest in building out capacity. To mitigate these problems, many ISPs and P2P vendors have started to look at ways to make P2P architecture more sensitive to network needs.
Some of the most high-profile ideas have come from the DCIA's P4PWG, which already includes major players such as Verizon, AT&T, Comcast, BitTorrent, and Pando.
Last week, the group announced that it had successfully tested experimental P2P software, developed with researchers at Yale University, which it says could eliminate many of the headaches that P2P systems have traditionally caused ISPs.
Rather than taking data from wherever it's available, the new system actively directs file sharing among multiple users and puts far less strain on network capacity. Haiyong Xie, a researcher who helped develop the software while a Ph.D. student at Yale, says that the protocol uses an ISP's topology map to make suggestions for which cloud of P2P clients should peer with other clouds of clients.
"Suppose we have clients trying to peer with one another in three different cities - New York, Boston, and Washington, DC," he says. "After analyzing the data provided by the network topology map, then the iTracker may tell the pTracker that it would be optimal for the New York users to peer with the DC users 90% of the time, and with the Boston users 10% of the time."
And this is only one of the projects that P4P members have been working on.
Content delivery network vendor Velocix, for example, recently unveiled a hybrid-P2P protocol for live video streaming that relies both upon traditional peers and upon content that has been cached on Velocix's CDN. Thus, while the system relies primarily on multiple cache servers to deliver video streaming, it can also accelerate content delivery by adding P2P sharing.
Broadband media delivery company PeerApp, meanwhile, has developed a protocol to cache P2P content at the edge of the network that the company says allows ISPs to more easily manage their traffic by generating additional bandwidth during peak hours. Thus, P2P users can download their files from cache servers during peak hours rather than relying exclusively on peers.
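The common thread in both approaches is cache-first source selection at peak times, with ordinary peers filling in the rest. A minimal sketch of that idea follows; the peak window, cache lookup, and peer handling are assumptions for illustration, not either vendor's actual design.

```python
from datetime import datetime

PEAK_HOURS = range(17, 23)  # assumed evening peak window, purely illustrative

def choose_sources(content_id: str, cache_servers: dict[str, set[str]], peers: list[str]) -> list[str]:
    """Prefer in-network caches that hold the content during peak hours; otherwise lead with peers."""
    caches_with_content = [c for c, held in cache_servers.items() if content_id in held]
    if caches_with_content and datetime.now().hour in PEAK_HOURS:
        # Edge caches carry the bulk of the load at peak, with a few peers kept to accelerate delivery.
        return caches_with_content + peers[:2]
    return peers + caches_with_content

print(choose_sources("episode-42", {"cache-a": {"episode-42"}}, ["peer-1", "peer-2", "peer-3"]))
```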
Gartner analyst Mike McGuire says that such innovation will be a terrific asset for the growth of P2P technology, as it will make more ISPs willing to tolerate high volumes of file sharing on their networks. He also says that as P2P technologies continue to develop, more ISPs will look at them as important content-distribution tools rather than threats to their networks.
"Five years ago, it was very easy for ISPs to say, 'We have to crack down on piracy' and leave it at that," he says of ISPs' past attitudes toward P2P. "At the time, we were telling them to not assume that all P2P architectures are the enemies of their businesses. These technologies aren't going away and you can't sue them out of existence."
Irwin Lazar, an analyst at Nemertes Research, notes that if P2P protocols can overcome their reputation as tools for piracy and make inroads as legitimate content delivery applications, then more ISPs will be willing to invest in optimizing them for their networks.
"The music industry fought unauthorized file sharing for so long, but they eventually let Apple roll-out iTunes, which showed that people would pay for music online if it was offered at a reasonable price and was relatively easy to download," he says. "If P2P systems can do the same thing for, say, HD video, then ISPs are going to have to use some kind of P2P storage system."
Comcast Outlines New Capacity Management Plan
Excerpted from Telecommunications Reports Daily Report by Lynn Staunton
Comcast is still conducting trials of the new protocol-agnostic capacity management technique that it plans to deploy by year-end, but essentially the plan is to assign a lower priority to communications involving customers that are consuming the lion's share of capacity, according to Tony Werner, Chief Technology Officer (CTO) for Comcast Cable.
Mr. Werner fleshed out the announcement Comcast made this week of its plans to migrate to the new network management approach that will focus on high-bandwidth users, and away from its previous focus on high-bandwidth applications - such as P2P file sharing.
That announcement included plans to work with P2P application developer BitTorrent to resolve network management issues, which are currently a hot-button item in the broader network neutrality debate.
When a particular network element experiences congestion, between 0.5% and 2% of customers can be consuming 50% of the capacity, Mr. Werner said, with the remaining 98% or more of customers limited to the other half of the network capacity at that point.
The new approach will be to assign a lower priority to the "power users" so that they are using only about 25% until the network congestion has passed, he explained. After the prioritization is applied, the network would use normal transmission control protocol (TCP) techniques for discarding packets as necessary, he added.
Based on testing so far, he continued, the expectation is the impact of that prioritization "will be fairly momentary and not very noticeable even to those customers that are power users."
The technique will be applied "network element by network element," so that if a user is causing congestion on traffic headed "upstream," the prioritization would be applied on uploads, and if a user is causing "downstream" congestion, the prioritization would be applied to downloads, he said.
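In rough terms, the approach amounts to flagging the heaviest users on a congested element and letting lower priority plus ordinary TCP loss recovery do the rest. The sketch below is a simplified illustration; the 1% share threshold stands in for the 0.5% to 2% "power user" population Werner describes, and the priority marking itself is left abstract.

```python
def lower_priority_users(usage_bps: dict[str, float], congested: bool,
                         share_threshold: float = 0.01) -> set[str]:
    """During congestion on a network element, flag users consuming more than
    share_threshold of that element's current traffic. Their packets are marked
    lower priority until congestion passes; normal TCP behavior then backs those
    flows off first as packets are dropped."""
    if not congested:
        return set()
    total = sum(usage_bps.values()) or 1.0
    return {user for user, bps in usage_bps.items() if bps / total > share_threshold}

# Example: three of a thousand users consuming about half the element's traffic get deprioritized.
usage = {f"user-{i}": 1.0 for i in range(997)}
usage.update({"heavy-1": 350.0, "heavy-2": 330.0, "heavy-3": 320.0})
print(lower_priority_users(usage, congested=True))
```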
Regarding a statement by FCC Chairman Martin that suggested Comcast should stop its previous traffic management practices now, rather than wait until the new approach is implemented, Mr. Werner said that if Comcast immediately shut off the application-focused traffic management, and if other network operators in the US did so as well, "a lot of services would suffer," such as Skype and other voice applications, certain gaming apps, and even web surfing "where you want snappy performance."
The US Chamber of Commerce issued a statement calling the Comcast-BitTorrent announcement proof "that the marketplace is working and that net neutrality regulations are not needed." William Kovacs, the group's Vice President, Environment, Technology and Regulatory Affairs, said, "There is no market failure that justifies government intrusion into the dynamic broadband market."
David Sohn, Senior Policy Counsel at the Center for Democracy & Technology (CDT) and Director of CDT's Project on Intellectual Property and Technology, called the Comcast-BitTorrent announcement "a welcome development."
He said the new congestion management approach described in Comcast's announcement "sounds a lot like what CDT suggested in its comments to the FCC in the agency's Wireline Competition docket 07-52 broadband industry practices proceeding - namely, that network management practices involving any form of traffic degradation should be evenly applied, rather than singling out specific services or applications."
Mr. Sohn emphasized, however, that CDT "would like to see general adoption of the concept that congestion management should be (i) agnostic as to content or protocol and (ii) transparent to Internet users and applications developers."
He added, "Endorsement of these principles by other carriers could be an important next step, as could a decision by the FCC to incorporate these principles into its August 2005 broadband Policy Statement."
Velocix Competes with Joost in Live Streaming P2PTV
Excerpted from Beta News Report by Tim Conneally
UK-based Velocix (formerly CacheLogic) has begun offering live content with a hybrid P2P streaming client.
Live streaming presents a challenge to content providers that is not present in on-demand delivery. Live streams face a concentrated audience all attempting to access the video feed simultaneously, putting a significantly larger strain upon resources than if the same number of viewers spread that demand over a protracted time frame.
Industry leading peer-to-peer television (P2PTV) service Joost recently tried its hand at live streaming video in a partnership with CBS to broadcast NCAA "March Madness" basketball games. As Matt Zelesko, Senior Vice President of Engineering Operations at Joost, had anticipated, connections to the CBS feeds were reported to have dropped at inopportune moments.
Velocix offers several approaches to live content delivery, one of which is a BitTorrent-compatible hybrid P2P client which turns each viewer into a sort of "peer host" for additional viewers. Streams are viewable in Adobe Flash or in Windows Media.
The company has partnered with the BBC and leading P2PTV service Babelgum as well as providing services for Bollywood.tv, and Chic.tv.
Though the company doesn't expect millions of consumers to install the Velocix browser plug-in yet, it believes the content partners will draw attention to the service. Its partnership with Connexia could provide Velocix an outlet similar to Joost's "March Madness" effort, as Connexia is connected with 2007 World Cup Champion football club AC Milan.
CableLabs Researcher Joins P2PTV Leader Joost
Excerpted from MultiChannel News Report by Todd Spangler
Jason Gaedtke, a one-time Internet engineer at Comcast who joined CableLabs as Chief Scientist in November, has been recruited to be Chief Technology Officer (CTO) of industry leading P2PTV service Joost.
Joost's previous CTO, Dirk-Willem van Gulik, joined the BBC's Future Media and Technology Group as Chief Technical Architect earlier this year.
At CableLabs, Gaedtke led Internet-related research projects on P2P architectures and services, "semantic web" incubation and metadata management, and online gaming platforms. Gaedtke represented CableLabs in the P4P Working Group (P4PWG).
Previously, Gaedtke was Chief Architect and Fellow for Comcast Interactive Media. In January 2007, Gaedtke - speaking at the Society of Cable Telecommunications Engineers' (SCTE) emerging technologies conference - remarked that Joost is the high-water mark for Internet video and that "they've proved it can work and it's certainly a competitive threat."
Joost uses a P2P architecture to distribute video content among its users' computers, using proprietary client software. The content, which includes full episodes of some shows, is available on-demand. Joost is also currently experimenting with offering live, P2P streaming video of the NCAA "March Madness" tournament.
The company has content-distribution agreements with Viacom and CBS (which are investors) as well as with PBS, Warner Bros. Television Group, Major League Baseball, and Turner Broadcasting System.
In January, Joost announced it hired Matt Zelesko, who previously was Vice President of Engineering for Comcast Interactive Media, where he led the team behind the online video platform Fancast, as its Senior Vice President of Engineering Operations.
And last summer Joost hired Mike Volpi as CEO from Cisco Systems, where he was Senior Vice President and General Manager for Cisco's Routing and Service Provider Technology Group, which included Scientific Atlanta.
Joost announced a $45 million round of funding last year from CBS, Viacom, Index Ventures, Sequoia Capital and Chinese multibillionaire Li Ka-Shing.
The startup was founded in January 2006 by Janus Friis and Niklas Zennstrom, who sold their Internet-phone venture Skype to eBay. Joost has offices in New York, London, Luxembourg, and The Netherlands.
P4P Intends to be Far-Reaching Global Standard
Excerpted from Online Reporter Report
From the ISPs' point of view, P2P traffic can appear exceptionally daunting. If they choose to block it, as some have accused almost all of the major US ISPs of doing, their networks become ghost networks, with virtually no traffic in sight. But if they embrace it, their networks become fast-moving, frantic places where suppliers have to sprint just to keep the network running.
So what's it to be? Well, Verizon appears at least to be considering a middle road: one where, instead of working against P2P or simply absorbing its traffic costs, it offers protocols that help it cooperate with P2P networks to deliver entertainment by giving them a better understanding of the conditions of the network their traffic is traveling over. That really IS open.
The initiative began last July and operates under the auspices of a Distributed Computing Industry Association (DCIA) working group called P4P, which stands for "Proactive Network Provider Participation for P2P."
The two founding members and chairs come from Pando Networks and Verizon Communications. Pando is one of the new breed of P2P companies generating revenue from legal P2P file delivery.
This is really a club for ISPs and P2P suppliers in which they can work out their differences, and it is so much more of a positive approach than whining about network traffic and investing purely in "traffic shaping."
The working group reports that software is already being tested which can improve download speeds by between 200% and 600%, purely by offering up a set of network APIs that let a P2P application know which parts of a network are busy and using that information to intelligently decide which P2P nodes should be uploading in support of a file or stream delivery.
The approach is not rocket science, and a CompSci grad student given the problem could have come up with the same answer, but how the question is phrased is what is interesting.
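As a sketch of what "deciding which nodes should be uploading" can mean in practice, the snippet below ranks candidate uploaders by a per-peer busyness figure that, in the P4P model, would come from the provider's interface. The function name and data shapes are hypothetical illustrations, not the working group's actual APIs.

```python
def rank_upload_candidates(candidates: list[str], link_busyness: dict[str, float]) -> list[str]:
    """Prefer peers reachable over links the ISP reports as least busy.

    link_busyness maps a candidate peer to a 0.0-1.0 utilization figure; in the P4P model this
    would be derived from the provider's published network information, here it is just a dict."""
    return sorted(candidates, key=lambda peer: link_busyness.get(peer, 0.5))

# Peers on lightly loaded links are asked to upload first.
print(rank_upload_candidates(["peer-a", "peer-b", "peer-c"], {"peer-a": 0.9, "peer-b": 0.2}))
```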
If the question was, "How do we get traffic zinging around the Internet, for nothing, without the help of the ISP and despite its best efforts to stop us?" then that definitively is the wrong question.
If he were asked, "You have a network and multiple copies of large files distributed around that network, how do you build a rapid file delivery mechanism?" then naturally you reach the DCIA answer.
It is the history of ISPs and P2P suppliers being at each other's throats for so long, that makes it hard to see how this might ever have come about.
In fact what needed to happen was that the livelihood of ISPs needed to be threatened, where the average customer was expecting more and more from the ISP, while the average monthly price for ISP service went down and down, and traffic on their networks went up and up, forcing more and more investment.
At that point, P2P traffic is taken as a fact of life, not something that the ISP looks to the US Supreme Court to make illegal.
ISPs cannot block all P2P activity because applications such as VeriSign's Kontiki P2P client - now used to deliver millions of hours of TV services around the world for respectable broadcasters - along with Skype, Joost, and Babelgum, are not breaking any laws.
Even Kazaa and BitTorrent may now be carrying more legal than illegal traffic, or if not yet, they should lean that way over time.
If we look beyond this simple set of proposals we see more and more what might be done. By bringing ISPs and P2P suppliers closer, perhaps the handshakes for this type of co-operative routing might also include some form of legitimate traffic audit.
So we perhaps reach a point where if P2P traffic from your software passes some kind of "threshold" test of mostly sending legitimate files (something that deep packet inspection [DPI] might still be needed for) then the APIs to sense the condition of the network are open to your client software, and it is pushed higher up the food chain in terms of the priority attached to the traffic.
If mostly unauthorized copyrighted material appears to be traveling across the network, then perhaps that API co-operation is refused by the network nodes and the resulting traffic packets will be treated as low priority.
That would create an underclass and upper-class of P2P clients, each with a signature that would trigger the various treatments by ISPs.
Now that all sounds fine and dandy, except that much of this traffic is encrypted, one P2P client can be made to emulate any other, and a client can re-establish itself on different ports once it is identified and slowed. So there would be technical hurdles, but we believe there would remain one class of P2P players that works with the ISPs and another that does not.
What that creates politically is an accelerated acceptance of P2P activity for the average ISP customer.
Regardless, it is encouraging to know that Verizon at least is looking to a future when the FCC might make it illegal to indiscriminately block or slow P2P traffic, and instead is thinking about how to make the Internet turn into one huge TV set, sending Gigabyte class video files to every home.
Because so little of Verizon's revenue currently relies on video delivery compared to, say, Comcast, its primary competitors, the major cable operators, may not feel able to embrace this approach, and that could accelerate a drift toward using the RBOCs as ISPs rather than the more expensive and more restrictive cablecos.
In the end we would expect that protocols and APIs that come out of this work will have to be a standard, and one that EVERY ISP, regardless of what their main business is (cable or telco) will have to offer.
Either customers will begin to leave in droves as word gets out that P2P goes faster on other networks, or the networks that don't wish to cooperate will be left fighting a day-by-day war to keep P2P at bay while still suffering the traffic consequences of badly saturated networks.
Comcast is now supporting this new P4P activity, despite the accusations about its excessive "traffic shaping" activities.
There is a sense that it is the ISPs who need the technical help to make this happen, rather than the P2P software suppliers begging for favors, because the P2P players seem to be winning the technology war here.
Perhaps it is the ISPs that need the P2P players' help, not the other way around.
Bell Canada Chokes Third-Party Traffic
Excerpted from The Register Report by Cade Metz
On March 14th, Bell Canada began throttling P2P traffic on pipes it rents to third-party ISPs, and it neglected to tell them. The mega-Canadian telco has been throttling P2P traffic on its own network since October, but this is a different matter.
One of those third-party ISPs is TekSavvy, a small family-owned company that prides itself on providing customers with Internet service that's never throttled. When Bell Canada started throttling TekSavvy traffic, an astute TekSavvy customer realized his BitTorrent client was acting funny and alerted the rest of the world with a post to DSLReports.
This TekSavvy customer had once received Internet access straight from Bell Canada. He switched to TekSavvy because he didn't like Bell toying with his P2P traffic. But then he noticed Bell was still toying with his P2P traffic.
"Recently, my BT download has been limited to 30k," he wrote. "No matter whether I am opening 1 torrent or 10 torrents at a time, my total download never goes over 30k. Before I decided to change to TekSavvy from Bell, I was able to do 50k with Bell's throttle. I heard everybody saying how TekSavvy won't throttle your P2P bandwidth. It worked great for me at the beginning, but I think that is all history now."
It is history. At least for the moment. After several other customers complained about the throttling, TekSavvy CEO Rocky Gaudrault confronted Bell, and Bell fessed up. Gradually.
"Last Thursday, March 20th, we first had discussions with Bell management, and unofficially, they said some load balancing might be going on," Rocky said. "Then on Tuesday afternoon, they officially told us they were throttling our client base."
"They're taking traffic and instead of passing it directly to us, they're moving it to some sort of aggregation point where it gets throttled."
Except that Bell Canada doesn't like the word throttling. It prefers "optimizing." "We recently extended our policies of optimizing our network by balancing the load to include our wholesale networks as well," Bell Canada spokesman Jason Laszlo told us.
"Increased congestion is affecting networks of Internet carriers across North America, including Bell," he continued. "And like a lot of other carriers, we're seeking to better balance Internet traffic during the peak usage periods so that all of our customers can receive the optimal level of service they deserve and rightly demand."
In other words, the company claims that P2P traffic on wholesale networks is affecting performance on its own ISP. "It's all one network," he said.
According to Laszlo, users will notice a dip in both download and upload speeds on P2P applications such as BitTorrent. "Customers of all kinds can continue to use P2P services. What they'll notice is that they won't work as fast during peak periods, such as late afternoon and evening." The company plans to extend the practice to all wholesale networks in Ontario and Quebec on April 7th.
After a little encouragement, Laszlo did admit that the company started "optimizing" without telling its wholesale networks. But he also said this isn't a problem. "There are very clear provisions in our contracts with the wholesale networks that allow us to manage our networks appropriately."
TekSavvy's Rocky Gaudrault doesn't see it that way. "The policy they're referring to deals with copper. Not data," he told us. "It depends on how long this goes on. If they say, 'We're over subscribing the network currently, and we expect to have everything fixed by this date', then fine. But if this becomes a permanent solution it is no longer a maintenance issue or a quality issue. It is a policing issue. They have given themselves the right to control data that's not theirs."
Is Bell Canada attempting to ensure that TekSavvy is just as unattractive as its own ISP? Gaudrault won't answer. Yet.
"It could be viewed in a variety of ways at this point. Again, if it's only a temporary solution, that's one thing. But if it's forever, then they're telling us what kind of clients they want us to have. And that's not their right either."
Gaudrault is now exploring alternative means of providing bandwidth to his customers. "We've been working with Bell for seven of the last ten years," he said. "But this is a slap in the face."
Help Vuze to Fight Throttling
Excerpted from TorrentFreak Report
ISPs have been throttling BitTorrent traffic for years now, but only recently has this turned into a political issue. The BitTorrent client Vuze has now developed a plug-in through which consumers can help distinguish the good from the bad ISPs.
Last November Vuze petitioned the Federal Communications Commission (FCC), resulting in an FCC hearing which was held a month ago. One of the issues raised there was that there is little data available on the scope of BitTorrent throttling, a gap Vuze now plans to fill.
"We decided there was something important consumers can do to help elevate the debate," said Jay Monahan, General Counsel at Vuze. "We created a simple software 'plug-in' that works with the Vuze application to gather information about potential interference with the user's Internet traffic."
The main purpose of the plug-in is to gather factual data about which ISPs are throttling BitTorrent, and to what extent. Already there is an ever growing list of Bad ISPs available at the Vuze wiki, but the data from the plug-in will make their case even stronger.
When the first ISPs started to throttle BitTorrent traffic, Vuze was one of the first BitTorrent clients to introduce a counter-measure, namely, protocol header encryption. However, this was only the beginning of an ongoing cat-and-mouse game between ISPs and BitTorrent client developers.
Monahan guarantees that the gathered data will be treated anonymously. "Sharing this data with us does not involve disclosure of anyone's personally identifiable information (PII). We will aggregate the data and may talk about it or disclose it publicly, but no data about any specific user will be disclosed as part of this effort."
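A sketch of the kind of aggregate such a plug-in might produce, consistent with Monahan's description: per-ISP counts of interrupted transfers, with nothing user-specific retained. The structure below is a hypothetical illustration, not the Vuze plug-in's actual code.

```python
from collections import Counter

interruptions_by_isp = Counter()  # ISP name -> number of transfers cut off mid-stream

def record_session(isp_name: str, reset_mid_transfer: bool) -> None:
    """Count an interrupted transfer against the ISP it occurred on; nothing else is kept."""
    if reset_mid_transfer:
        interruptions_by_isp[isp_name] += 1

def anonymized_report() -> dict[str, int]:
    """Aggregate counts only; contains no personally identifiable information."""
    return dict(interruptions_by_isp)

record_session("ExampleISP", reset_mid_transfer=True)
print(anonymized_report())
```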
Internet Evolution Researches P2P Filters
TechWeb's Internet Evolution, a Web 2.0 site dedicated to investigating the future of the Internet, this week unveiled a landmark research report on P2P filters, which outlines the functions essential to ensuring the continued expansion of the Internet and concludes that most of the current generation of filters are ineffective at meeting today's challenges.
Produced by the European Advanced Networking Test Center AG (EANTC), the report is entitled Peer-to-Peer Filters: Ready for Internet Prime Time? Although the research targeted more than two dozen product vendors, including established players and market leaders, only two - US-based Arbor/Ellacoya and German-based Ipoque - agreed to release their results publicly.
P2P filtering products provide two critical functions: they assist ISPs that primarily care about network capacity, and they support entertainment companies that would like to prevent the unauthorized exchange of copyrighted content.
Both the Arbor/Ellacoya E30 and Ipoque PRX-5G devices showed excellent performance and very good P2P detection and regulation capabilities.
However, neither solution achieved perfect detection across the entire range of popular P2P protocols. According to the research, the effective solution will be a new generation of filters more finely tuned to the specifics of Internet P2P file-sharing traffic.
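To make the detection problem concrete: an unencrypted BitTorrent session opens with a fixed 20-byte handshake that the simplest filters can match directly, and that signature disappears as soon as clients apply protocol header encryption. A minimal sketch of that check follows, as an illustration of the general technique rather than any tested vendor's implementation.

```python
BITTORRENT_HANDSHAKE_PREFIX = b"\x13BitTorrent protocol"  # 1-byte length (19) + protocol name

def looks_like_bittorrent(first_payload_bytes: bytes) -> bool:
    """Match the plaintext BitTorrent handshake at the start of a TCP payload.

    This is the simplest signature a filter can use; it fails once the client applies
    protocol header encryption, which is why deeper or statistical methods are needed."""
    return first_payload_bytes.startswith(BITTORRENT_HANDSHAKE_PREFIX)

print(looks_like_bittorrent(b"\x13BitTorrent protocol" + b"\x00" * 8))  # True
```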
"It's quite clear that most vendors are still in an early phase of product deployment and their products have limited scale and functions. Based on the response to the test, both ISPs and the music industry will have to wait awhile before the carrier-class P2P tools that can meet their needs are widely available," said Stephen Saunders, Internet Evolution creator.
"Internet Evolution's objective is to bring awareness to key issues that are guiding the future of the Internet. We will tackle other critical issues through in-depth and comprehensive analysis of the Internet."
"ISPs often use trial-and-error methods to determine the efficiency of P2P filters today; our goal was to eliminate uncertainty in this area," said Carsten Rossenhoevel, Managing Director of EANTC.
"We discovered significant disparities between vendors' marketing collateral and what their products are actually capable of. None of the devices we tested were able to satisfy the media industry's demands to block individual copyright infringements. Three vendors vetoed publication because of problems with their results."
Jim Griffin Leads WMG Collective Licensing Effort
Excerpted from Portfolio Report
Edgar Bronfman Jr.'s Warner Music Group (WMG) has tapped industry veteran Jim Griffin to spearhead a controversial plan to bundle a monthly fee into consumers' Internet-service bills for unlimited access to music.
The plan - the boldest move yet to keep the wounded entertainment industry giants afloat - is simple: consumers will pay a monthly fee, bundled into an Internet-service bill in exchange for unfettered access to a database of all known music.
Bronfman's decision to hire Griffin, a respected industry critic, demonstrates the desperation of the record industry. In almost a decade, it has shrunk from a $15 billion business to a $10 billion one. Compact disc sales are plummeting as online music downloads skyrocket.
"Today, it has become purely voluntary to pay for music," Griffin told Portfolio in an exclusive sit-down this week. "If I tell you to go listen to this band, you could pay, or you might not. It's pretty much up to you. So the music business has become a big tip-jar."
Nothing provokes sheer terror in the record industry more than the rise of P2P file-sharing networks. For years, digital-music seers have argued the rise of such networks has made copyright law obsolete and free music distribution universal.
Bronfman has asked Griffin, formerly Geffen Music's digital chief, to develop a model that would create a pool of money from user fees to be distributed to artists and copyright holders.
Warner has given Griffin a three-year contract to form a new organization to spearhead the plan. Griffin says he hopes to move beyond the years of acrimonious record-industry litigation against unlicensed file-swappers, college students in particular.
"We're still clinging to the vine of music as a product," Griffin says, calling the industry's plight "Tarzan" economics.
"But we're swinging toward the vine of music as a service. We need to get ready to let go and grab the next vine, which is a pool of money and a fair way to split it up, rather than controlling the quantity and destiny of sound recordings."
In the last year, the Recording Industry Association of America (RIAA), the industry group that represents the major labels, has sent 5,400 threatening letters to students at more than 150 schools, and reached settlements with more than 2,300 of them. It has filed formal lawsuits against 2,465 others, who did not respond.
"I don't think we should be suing students and I don't think we should be suing people in their homes," says Griffin. "We want to monetize the anarchy of the Internet."
Griffin says Warner Music is "totally committed to this." The fundamental issue, he says, is whether music consumers will buy songs and albums individually, or whether they will subscribe monthly to access a "universal" database of songs.
Will Tanous, Warner Music's communications chief, said Griffin's initiative is part of Warner's "ongoing effort to explore new business models in the music industry."
In recent weeks, major music industry players have signaled their interest in the "music as a service" model. Sony BMG Music Entertainment is said to be developing an online music subscription service that would give users unlimited access to its catalog. Apple is reportedly negotiating with the major record labels to offer consumers free access to the entire iTunes library in exchange for paying a premium for Apple hardware.
Warner's plan would have consumers pay an additional fee - maybe $5 a month - bundled into their monthly Internet-access bill in exchange for the right to freely download, upload, copy, and share music without restrictions.
Griffin says those fees could create a pool as large as $20 billion annually to pay artists and copyright holders. Eventually, advertising could subsidize the entire system, so that users who don't want to receive ads could pay the fee, and those who don't mind advertising wouldn't pay a dime.
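As back-of-the-envelope arithmetic only (the article does not state what subscriber base or mix of fees and advertising underlies the $20 billion figure), a $5 monthly fee implies on the order of 333 million paying subscribers to reach that pool:

```python
# Purely illustrative arithmetic on the figures quoted above.
monthly_fee = 5.00            # dollars per subscriber per month
target_pool = 20_000_000_000  # dollars per year

implied_subscribers = target_pool / (monthly_fee * 12)
print(f"{implied_subscribers:,.0f} paying subscribers")  # about 333 million
```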
"Ideally, music will feel free," says Griffin. "Even if you pay a flat fee for it, at the moment you use it there are no financial considerations. It's already been paid for." While few of the plan's details have emerged, critics have begun their attacks.
David Barrett, Engineering Manager for P2P Networks at web content-delivery giant Akamai, says he's opposed to it on principle.
Griffin's plan, he says, is tantamount to extortion, because it forces everyone to join. "It's too late to charge people for what they're already getting for free," says Barrett.
"This is just taxation of a basic, universal service that already exists, for the benefit a distant power that actively harasses the people being taxed without offering them any meaningful representation."
Griffin, who in 1994 was part of the team that made Aerosmith's "Head First" the first song available on the Internet, goes to great pains to emphasize that the collective licensing plan is not "his" plan.
"This isn't my idea," says Griffin. "While I would gladly take the credit, blanket licensing has over 150 years of history behind it."
"Collective licensing is what people do when they lose control, or when control is no longer practical or efficient," Griffin says. "A pool of money and a fair way to split it up replaces control."
Griffin was quick to point out that the $5 figure is arbitrary. "We negotiate in every place," Griffin says. "Clearly $5 per month would be an insane number in China or India. If you could get a nickel a month you could grow the business tenfold in those countries. In another country that had a high GDP, a nickel per month would be ridiculously cheap. So you negotiate. Fair is whatever you agree upon."
Griffin says Bronfman and Michael Nash, the company's digital-strategy chief, brought him into Warner to create an organization to negotiate collective licensing deals.
But Griffin's ambitions extend far beyond just Warner Music. "We're building an as yet unnamed company inside Warner that is not intended to be solely owned by Warner," Griffin says.
"We hope all of the rights holders will come in and take ownership with us, and Warner will not control it. Our goal is to create a collective society for the digital age."
Meanwhile, critics have already attacked the plan as a kind of mandatory "culture tax."
"Jim will vehemently deny the 'tax' label," says Akamai's Barrett. "But it's a tax nonetheless. It'll be a government-approved cartel that collects money from virtually everyone - often without their knowledge - and failure to pay their tax will ultimately result in people with guns coming to your door.
"Jim's proposal does nothing but direct money to the very people that tried to prevent this future from coming to be," Barrett adds, "while further legitimizing the terror being waged in the courtrooms against their members."
Griffin dismisses such criticism. "I understand what David is thinking, but I assure you, we have no such interest in government running this or having any part of it," he says.
Griffin says that in just the few weeks since Warner began working on this plan, the company has been approached by Internet service providers (ISPs) "who want to discharge their risk."
"But more important than the risk for an ISP is the marketing," Griffin says, drawing a comparison to Starbucks' marketing of "fair trade" coffee.
"ISPs want to distinguish themselves with marketing," Griffin says. "You can only imagine that an ISP that marketed a 'fair trade' network connection would see a marketing advantage."
Gerd Leonhard, a respected music-industry consultant who has advised Sony/BMG, which recently announced plans for a flat-rate-subscription model for digital music, rejects Barrett's argument that the monthly fee amounts to a tax.
"This is not a tax," says Leonhard. "It's bundled into another charge."
"People should not be too harsh on Jim for trying to get the ball rolling," says Leonhard.
"At this point, 96% of the population is guilty of some sort of infringement, whether they're streaming or downloading or sharing. "What we have here is the widespread use of technology that declares all of the population to be unlawful."
Grooveshark Incentivizes Music File Sharing
Excerpted from the Gainesville Sun Report by Ankita Rao
The music industry is in turmoil, and a group of 20-somethings in a spacious office in downtown Gainesville work from afternoon to midnight hoping their start-up company can save it.
Young people aren't buying music anymore; instead they are downloading it, Josh Greenberg said. The cost is a dwindling revenue picture for record labels, music producers, and artists.
Two young entrepreneurs decided that it isn't immorality driving the younger population to download music, but rather convenience. Unauthorized downloading programs make it easy to listen to good tunes. That's where Grooveshark comes into play.
"Our overall vision is to produce something that is actually better than free," said Greenberg, the Chief Technology Officer (CTO) and Co-Founder of Grooveshark.
The idea behind Grooveshark is that, unlike iTunes, the profit from buying songs online will be divided among the company, the consumer, and the entities holding official rights to the music.
With companies facing lawsuits for supporting unlicensed downloads and the music industry in "chaos," Grooveshark offers a situation where everybody wins, Greenberg said.
The Grooveshark system works on the basis of a P2P music-sharing system, Greenberg said. Users can upload their personal music library to the website.
To legitimize the process, the user must buy a song if he/she wants to add it to his/her collection, he said. Part of the profit is then deposited into the account of the person who uploaded the song to Grooveshark.
In this way, the sharing of an unauthorized music system is paired with a lawful purchase with the added benefit of compensation, Greenberg said. He emphasizes that it is not a get-rich-quick scheme for users, but a way to balance out the buying and selling of music.
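A toy sketch of that split follows; the percentages are hypothetical placeholders, since the article gives no actual terms for how a purchase is divided among rights holders, the company, and the uploader.

```python
def split_purchase(price: float, rights_share: float = 0.70,
                   company_share: float = 0.20, uploader_share: float = 0.10) -> dict[str, float]:
    """Divide one purchase among the rights holders, the company, and the uploading user."""
    assert abs(rights_share + company_share + uploader_share - 1.0) < 1e-9
    return {
        "rights_holders": round(price * rights_share, 2),
        "company": round(price * company_share, 2),
        "uploader": round(price * uploader_share, 2),
    }

print(split_purchase(0.99))
```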
The company is applying user feedback to create a new version of Grooveshark known as "phase two."
Sam Tarantino, 21, is the CEO of Grooveshark. He founded the company with Greenberg two years ago. Tarantino said he came up with the idea when he passed a store that bought, sold, and traded CDs. He wanted to replicate the idea on the Internet.
Tarantino and Greenberg teamed up at a University of Florida (UF) entrepreneurship meeting and came up with a plan of action, Greenberg said.
"We are taking the same concept that makes unauthorized networks successful and implementing them in a lawful way," he said.
Two years later, the venture continues to have momentum, Tarantino said. The company is working to land contracts with four major record labels and several investors.
"I'm pretty confident we're going to close a deal soon," Tarantino said. "These are deals no one imagined would come to us."
Convincing the music industry that they aren't just "a bunch of kids" has proven difficult. He builds trust by having his professional contacts recommend Grooveshark to top executives.
Attending conferences in France and California helped Tarantino understand the inner workings of the music industry, he said.
He found a professional mentor based in Boulder, CO to guide him. The real estate investor gave Tarantino business tips and helped him build a stronger investor base.
"Standing up above all that noise is difficult," Tarantino said. "You have to push."
The entire Gainesville-based Grooveshark team of about 40 people makes the company effective, he said.
The group consists mostly of UF students. They work in areas such as marketing, public relations, web design, and engineering, Greenberg said.
"Everyone has their own strengths," he added. "We are always changing, growing, and hiring."
Their youth might make music executives nervous, but the Grooveshark team is in touch with their consumer base.
"We are the market, we know what we want," he said.
UF student Austin Heerschap, 21, said he uses Grooveshark every day. The layout of the website, song selection, and financial benefits have made music downloading simple, he said.
The music collection available often includes obscure songs that can't be found easily through other companies, he said.
The company filters the music so that every song has great quality, Heerschap said. Unlike iTunes, the consumer can listen to the entire song for free before deciding whether to buy it.
Heerschap said he earned money from users downloading his music. He believes using a legitimate method is also beneficial.
"You don't have to worry about corrupted files," he said.
Grooveshark continues to perfect its system, Greenberg said. It takes into account all of the comments on its website and urges users to involve themselves in blogs on the site.
Tarantino said he hopes Grooveshark will circulate through UF just like Facebook did at Harvard before expanding across the world.
"It's in everyone's nature to say, 'You can't do this,'" he said. "The world will see."
Coming Events of Interest
AdMonsters Leadership Forum - April 22nd at the Digital Sandbox, New York, NY. The forum brings together senior members of the online ad operations community for a day of workshops, member-led presentations, and peer-certified best practice recommendations. This is truly a meeting of the minds for those leading operations online. David Clark, EVP of Joost, will keynote.
P2P MEDIA SUMMIT LA - May 5th in Los Angeles, CA. The third annual P2P MEDIA SUMMIT LA. The DCIA's flagship event featuring keynotes from industry-leading P2P and social network operators; tracks on policy, technology and marketing; panel discussions covering content distribution and solutions development; valuable workshops; networking opportunities; and more.
Digital Hollywood Spring - May 6th-8th in Los Angeles, CA. With many new sessions and feature events, DHS has become the premier digital entertainment conference and exposition. DCIA Member companies will exhibit and speak on a number of panels.
Advertising 2.0 New York - June 4th-5th in New York, NY. A new kind of event being developed as a partnership of Advertising Age and Digital Hollywood. The DCIA is fully supporting this important inaugural effort and encourages DCINFO readers to plan now to attend.
P2P MEDIA SUMMIT SV - August 4th in San Jose, CA. The first-ever P2P MEDIA SUMMIT in Silicon Valley. Featuring keynotes from industry-leading P2P and social network operators; tracks on policy, technology and marketing; panel discussions covering content distribution and solutions development; valuable workshops; networking opportunities; and more.
Building Blocks 2008 - August 5th-7th in San Jose, CA. The premier event for transforming entertainment, consumer electronics, social media & web application technologies & the global communications network: TV, cable, telco, consumer electronics, mobile, broadband, search, games and the digital home.
International Broadcasting Convention - September 11th-16th in Amsterdam, Holland. IBC is committed to providing the world's best event for everyone involved in the creation, management, and delivery of content for the entertainment industry. Uniquely, the key executives and committees who control the convention are drawn from the industry, bringing with them experience and expertise in all aspects.