Distributed Computing Industry
Weekly Newsletter

July 30, 2007
Volume 18, Issue 8


Joost Reaches 1 Million User Mark

Excerpted from E-Consultancy Report

Peer-to-peer television (P2PTV) service Joost has been used by more than a million beta testers so far, Co-Founder Niklas Zennstrom told reporters at a press event in Estonia this week.

Clearly, the buzz around the online TV platform has helped push the user numbers up to such impressive levels. Zennstrom also said that Joost plans a full launch by the end of the year. Founded by Zennstrom and fellow Skype Co-Founder Janus Friis, Joost launched a limited beta version in December, and has since increased the number of available invitations.

Joost is mainly funded by its co-founders, though it raised $45 million in funding from Index Ventures, Sequoia Capital, and others in May, all of whom have a minority stake in the company.

The picture quality on Joost is superior to most other online video services, with Babelgum being the only real P2PTV competitor so far.

Joost, having launched earlier, has a broader range of content, and has also managed to get some major advertisers on board for pre-roll and overlay ads. Zennstrom told the press that Joost has signed up around thirty tier-one advertisers.

Nokia Embraces P2P File Sharing

Excerpted from ZeroPaid.com Report

Finnish mobile phone giant Nokia has purchased the media-sharing service Twango as part of its new strategy aimed at bringing peer-to-peer (P2P) file sharing to its mobile phone users.

Beginning sometime in the first half of next year, Nokia plans to have the P2P technology behind Twango incorporated in all of its handsets, according to Kari Tuutti, Communications Manager at Nokia Multimedia.

"Social networking and media sharing are important parts of Nokia’s future," he said, and "there could easily be further acquisitions."

Twango, which first went online in October 2006, provides subscribers with a monthly bandwidth of up to 250MB that can be used to upload photos, videos, music, and other personal media. Non-subscribers are able to browse the content for free.

Nokia will now be able to offer people an easy way to share content through their desktop and mobile devices.

"The Twango acquisition is a concrete step towards our consumer Internet services vision of providing seamless access to information, entertainment, and social networks – at any time, anywhere, from any connected device, in any way that you choose. We have the most complete suite of connected multimedia experiences including music, navigation, games, and – with the Twango acquisition – photos, videos, and a variety of document types," said Anssi Vanjoki, Executive Vice President and General Manager, Multimedia, Nokia.

"When you combine a Nokia N-series multimedia computer that is always on, always connected, and always with you, together with a rich media sharing destination like Twango, people will have exciting new ways to create and enjoy rich media experiences in real time."

With the announcement that the uTorrent mUI BETA for mobile phones has just been released, this news from Nokia adds an extra bit of excitement for file-sharing fans with mobile phones.

Though slightly different in terms of content and usage, it still points to an exciting trend of increased media availability and selection while on the go.

Report from CEO Marty Lafferty

We again commend the US House of Representatives Committee on Oversight and Government Reform for conducting its Hearing on Inadvertent File Sharing over Peer-to-Peer Networks this week, and look forward to following up on critical action items coming out of this session that are important to continuing industry development.

P2P is now at the vanguard of Internet-based innovation with dramatic progress on many fronts. Our industry needs to excel in adapting technological advancements and business practices to ensure that the greatest value and safety accrue to consumers. While much work has been done in this area, it is fitting to re-examine the issues addressed during the hearing to determine what more needs to be done and take appropriate action.

The two-and-a-half hour session was presided over by Committee Chairman Henry Waxman. Ranking Member Tom Davis and Congressmen Cannon, Hodes, Issa, and Tierney made introductory remarks. Representatives Clay, Cooper, Cummings, Watson, Welch, and Yarmouth joined for questioning of the panelists.

Witnesses included the US Patent and Trademark Office’s (PTO) Tom Sydnor, the US Federal Trade Commission’s (FTC) Mary Engle, the US Department of Transportation’s (DOT) Daniel Mintz, Dartmouth College’s Eric Johnson, LimeWire’s Mark Gorton, Tiversa’s Bob Boback, and retired General Wesley Clark.

Opening comments from Committee Members generally acknowledged that P2P has enormous value, especially in the distribution of large amounts of data in a highly efficient manner, but decried the fact that sensitive government information is widely available online, and expressed the need for steps to be taken to lock it down. In addition, an anecdotal example was given of a Congressional constituent’s personal data (a 2003 tax return) currently being available to download via a leading P2P software program.

Chairman Waxman said, "The purpose of this hearing is not to shut down P2P networks or bash P2P technology. P2P networks have the potential to deliver innovative and lawful applications that will enhance business and academic endeavors, reduce transaction costs, and increase available bandwidth across the country."

Tom Sydnor underscored the seriousness of the problem of inadvertent sharing of sensitive data, and strongly suggested that file-sharing programs could be improved to ensure that users don’t make such potentially catastrophic mistakes. In answering questions, he acknowledged that this problem cannot be legislated away, but that self-regulation to date has not been entirely effective.

Mary Engle noted that the P2P industry has made steady improvements in this area. She also described two instances of related FTC legal action against non-P2P website operators that were misrepresenting and exploiting P2P software firms as well as deceiving the public. She reaffirmed that P2P is a neutral technology, but given the substantial risks that the PTO has now raised about software design, said that the FTC would look into this matter further. She indicated that the industry is innovating on its own to address the protection of intellectual property (IP), which includes curtailment of copyright infringement.

Daniel Mintz explained that, in part in response to an incident of a contractor inadvertently sharing DOT data, the department is working on usage policies and migrating to a program under which teleworkers will use encrypted laptops. DOT’s efforts to discourage usage of consumer P2P file-sharing programs on machines intended for government work will include training and auditing as part of DOT’s response to the Federal Information Security Management Act (FISMA).

Eric Johnson discussed his work with Fortune 500 companies to secure their data and noted the trend of "digital wind" whereby files associated, perhaps by metadata, with popularly traded media files, gain wide distribution online. He said that improved education is required among P2P users.

Mark Gorton acknowledged the core problem that the hearing was called to address, and noted that LimeWire is working on improvements to make its software more intuitive for neophyte users to prevent inadvertent sharing of data. He appreciated the fact that virtually all participants at the hearing cited beneficial uses of P2P and voiced his intention to help ensure that these positive attributes would stand at least as tall as the problems being addressed.

Mark reaffirmed that LimeWire has an obligation to help users secure data – and already has default settings to do so – and voiced frustration over users circumventing such safeguards. He outlined two areas of concern: 1) protecting users from inadvertently sharing directories and folders of sensitive or classified information – where industry self-regulation and coordination with government agencies should be able to address the problem; and 2) protecting content owners from copyright infringement – where ISPs need to become involved as part of the solution for efforts to be effective. Mark also indicated that network filtering would be effective nationwide, even against non US-based P2P clients, since US ISP plants and the traffic transmitted over them are all domestically located.

He estimated that LimeWire has a user base of 50 million on a monthly cume basis, and from January 2005 to the present time its market share of music searches has grown from just over 20% to nearly 75%. Mark said that LimeWire management’s motivation is to make sharing files easy, but that LimeWire also needs to do a better job of preventing inadvertent sharing of data not intended for distribution.

Bob Boback provided examples from Tiversa’s investigatory technology revealing both the availability of private data and the patterns of searches for such material on P2P networks. Tiversa’s analysis of traffic from 200 open P2P applications (many of which are not US-based) by file-type (as opposed to amount of traffic represented) revealed that 38% of files being traded are MP3s (music) and 19% are MPEGs (video). He also noted that sensitive data from multiple foreign (as well as US) governments is openly available online.

In an anecdotal example of the successful apprehension of a perpetrator who gathered online information of a suspiciously threatening nature related to family members of a government official, Bob cited how the individual’s use of P2P enhanced the initial criminal investigation, with law enforcement using a "what-other-files-do-you-have" query. He expressed concern that involving ISPs in filtering could slow down performance of P2Ps to the annoyance of users, and said that the optimal solution would be to prevent data "at rest" from being entered into redistribution.

Bob said that the entertainment industry needs to work with the P2P industry on business models for licensed distribution rather than trying to eliminate P2P. He noted that Hollywood’s past attempts to eliminate P2P technology through legislative measures and a Supreme Court ruling have failed. Bob suggested a way for P2P to support "long-tail" marketing for the movie industry: of 14,000 motion pictures now released annually, only 100 are profitable with existing theatrical, home video, and television distribution. Given P2P’s unique low-cost distribution characteristics, perhaps P2P could be developed as a channel to generate revenue for the other 13,900 titles.

Retired General Wesley Clark, now an advisor and equity stakeholder in Tiversa, stated that the scope of risk to government data is unacceptable. He expressed the view that use of distributed computing technologies needs to be regulated and users warned of risks. He acknowledged that it won’t be effective to order software developers to make changes, and that government agencies need to implement regimens including improving policies for use along with educating users, monitoring compliance, and auditing instances of abuse. He said there needs to be clear separation of computers to be used for government work and computers to be used for personal purposes, and agreed with Daniel that there needs to be some defensive monitoring and auditing.

To put into perspective some of the concerns raised with respect to P2P file-sharing networks, it may be useful to make a comparison to commercial search engines. For example, while the phrase "Pentagon Secret Backbone Network" will generally return results to a query on open P2P software applications, it will also yield 28,000 results on MSN, 103,000 on Yahoo, and 182,000 on Google; and the same can be said of "security clearances SCI," which will yield 13,000 results on MSN, 94,000 on Yahoo, and 1,910,000 – yes, nearly 2 million – on Google.

Chairman Waxman concluded by restating that P2P holds tremendous potential while also posing significant risks, and we must work together to ensure that progress is made in the right direction. The DCIA believes that the hearing was very worthwhile, and we have reiterated our offer to work with appropriate government agencies to follow up on the issues raised. We will keep DCINFO readers posted on related developments. Share wisely, and take care.

P4P Working Group Kicks Off

The DCIA-sponsored P4P Working Group (P4PWG) held its inaugural working meeting in New York, NY this week with fourteen attendees, primarily broadband ISPs and P2P software distributors. Co-Chairs Doug Pasko of Verizon Communications and Laird Popkin of Pando Networks led a productive planning session centering on the P4PWG’s initial work product, timeline, required resources, and success measurement. Principal P4P researcher Haiyong Xie of Yale University was present at the meeting, which was hosted by Aydin Caginalp of Manatt.

P4P has been defined by the working group as a set of business practices and integrated network topology awareness models designed to optimize ISP network resources and enable P2P-based content payload acceleration.

The P4PWG believes that ISPs will benefit from bandwidth savings and P2P distributors from faster content delivery if P2P traffic can be directed to use network resources more efficiently. The basic idea is to collaborate in learning how to use the least possible amount of network capacity to deliver content to users in the shortest time.
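
To make the basic idea concrete, the sketch below shows one simple form of topology-aware peer selection: an ISP-published map assigns each peer to a network partition, and the client fills its peer list from its own partition before reaching across partitions. The partition labels, addresses, and the select_peers helper are hypothetical illustrations of the general P4P concept, not the P4PWG's actual specification or interfaces.

```python
# Illustrative sketch of topology-aware peer selection (hypothetical, not the
# P4PWG specification). An ISP-published map assigns peers to partitions;
# the client fills its peer list from nearby partitions first.

from collections import defaultdict

# Hypothetical ISP topology map: peer address -> partition identifier.
ISP_TOPOLOGY = {
    "10.0.1.5": "metro-east",
    "10.0.1.9": "metro-east",
    "10.0.7.2": "metro-west",
    "192.0.2.44": "external",
}

def select_peers(candidates, my_pid, max_peers=4):
    """Prefer peers in the caller's own partition, then fall back to others."""
    by_pid = defaultdict(list)
    for addr in candidates:
        by_pid[ISP_TOPOLOGY.get(addr, "external")].append(addr)

    chosen = by_pid[my_pid][:max_peers]          # same-partition peers first
    for pid, peers in by_pid.items():            # then fill from elsewhere
        if pid == my_pid:
            continue
        for addr in peers:
            if len(chosen) >= max_peers:
                return chosen
            chosen.append(addr)
    return chosen

if __name__ == "__main__":
    print(select_peers(list(ISP_TOPOLOGY), my_pid="metro-east", max_peers=3))
```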

In addition to file-downloading P2Ps, the P4PWG is now reaching out to include live-streaming P2Ps as well and to expand the types of ISPs represented. The P4PWG intends to amplify ongoing efforts in this area, for example by improving upon deductive mapping techniques with ISP topological data. The group is focused on an initial deliverable that will be beneficial for general use both by broadband carriers and by P2P software distributors.

The P4PWG plans to meet next in mid-August. For more information, please call 410-476-7965 or e-mail P4PWG@dcia.info.

Computers and Privacy Breaches

Excerpted from Mondaq News Report by George Takach

Several recent high-profile privacy breaches have begun to focus attention on the important issues that result from the disclosure of personal information of consumers or employees in an unauthorized manner through loss or theft. Who has the liability when data is inadvertently disclosed? What steps should be taken to minimize damage? And how can future privacy breach incidents be prevented or better managed?

These are all good questions. But before turning to them, consider some of the broader trends that are behind our current vulnerability to privacy breaches. It’s no coincidence that the volume and severity of these incidents are increasing. To understand why, reflect for a moment on the history of computing and networking over the past 40 years, particularly from the perspective of the challenges posed to computer security and data privacy by the principal phases of computing technology over this period.

During the first wave of computerization in the 1960s and 1970s, each organization’s IT system consisted of one (or more) centralized mainframe computer(s) – aka the ‘big iron’, which was operated in the bowels of the company by a handful of people. The mainframe stood alone and wasn’t connected to other computers at the company, let alone to computers at other companies.

Computer security in such an environment was fairly straightforward. So long as the small team running the computer was honest, very little harm could come to the computer or the data residing on it. This computing environment did not raise many privacy-breach challenges, at least from a security perspective.

Ever since the appearance of the mainframe computer, engineers have been hard at work trying to replace it with smaller, more versatile computing machines. By the early 1980s, so-called mid-range computers had found favor in company IT strategies. These computers also had strings of dumb terminals (called ‘dumb’ because they did not do processing themselves but could at least access the mid-range computer that did the heavy lifting) attached to them. Lo and behold, these terminals found their way onto the desks of secretaries at the company. And so began the inexorable democratization of computing.

Mid-range computers and their concomitant dumb terminals showed companies the huge promise of distributed computing. Many new applications began to be used by the non-IT staff of the company (or other organizations, such as government departments) using these powerful new hardware machines. Of course, from a privacy and security perspective, this new computing configuration meant that more people in the organization had access to sensitive customer and employee data. One bad-apple employee now had the potential to access a myriad of company data.

By the mid-1980s, computer democratization was picking up pace with the advent of the personal computer (PC). Before long, there was a computer on every desk in an organization. And these weren’t merely dumb terminals; smaller and more powerful microchips allowed them to process data themselves, though they were also connected to hub computers called servers.

Moreover, the PC revolution wasn’t confined to the office. Soon, these powerful but fairly compact devices insinuated themselves into the home. Floppy disks containing large gobs of data began traveling between the PC at the office and the one at home. Not surprisingly, the first serious incidents of data loss were reported as floppies were inadvertently mislaid, or worse, stolen.

In the mid-1990s, of course, everything changed with the coming of the Internet. PCs, servers, and even mainframes could now all be networked, both within proprietary, closed systems and, increasingly, across non-proprietary open ones. For the first time, computers became data-communication devices as well as data-processing machines.

Computer crime has been with us since the beginning of the computer revolution and the Internet gave rise to a whole new type of computer criminal, the so-called hacker, and a whole new ease by which to penetrate remote computer systems. In a word, the Internet made information more vulnerable.

In the last 10 years, computing devices have become smaller, more powerful and cheaper. The PC begat the laptop, which in turn (along with the cell-phone) gave birth to the personal digital assistant (PDA).

The microprocessor, however, is no longer used only in standalone computers. Rather, together with digitally-based sensors, microprocessors are being implanted into huge numbers of machines and objects and even people. And what makes all this computing power even more compelling is that the sensors and chips can send their data to host computers through the ether without having to be tethered by wires. Consider a few of the state-of-the-art applications.

A car insurance company has unveiled an insurance product that provides much more granular pricing based on very detailed, real-time car usage patterns, which are tracked and processed by computers. So, if you drive down a highway on a Sunday, you pay a lot less insurance for that trip than you would for a drive downtown during a weekday rush hour.

This is a good example of what more and more miniature microprocessors can do: They can tell us, in real time, what is going on around them. Other current new applications include a school that is putting wireless homing devices on young children so that the school never loses track of them. Similarly, a uniform maker is putting chips into firefighters’ suits so that their positions can be determined at all times while they are fighting a large blaze.

A range of digitally driven, wireless-connected medical devices is also hitting the market. Small chips with long-lasting power devices are being implanted just under the skin of various patients to facilitate monitoring of various vital statistics or collect more nuanced data. Essentially, these are tiny radio frequency identification tags that let doctors monitor their patients from afar.

These digital implant technologies will not long be restricted to the health community. Indeed, one European bar embeds such chips in patrons’ arms to assist with identity verification and payment. Previously the stuff of James Bond movies, these digital, wireless implant devices will grow into a huge business in a matter of years.

These technology trends and the business models generated by them have profound implications. In a nutshell, all the examples touched on above involve the collection of huge amounts of data, largely of the sensitive, personal variety. And this data is then transmitted hither and yon, over a variety of networks and by means of various technologies. While all of this activity brings significant benefits, there is of course one inevitable downside.

With so much data being collected, stored, processed, and transmitted, it’s merely a question of time before some data leaks out, notwithstanding the implementation of ‘best practices’ procedures for security and privacy.

Mobile Working Erodes Security

Excerpted from Secure Computing Report

Nearly two-thirds of IT security professionals still rely on passwords to protect their corporate networks, despite growing numbers of people working remotely, new figures show.

While two-factor authentication is now considered a more secure solution, only 15 percent of respondents said they used tokens to protect the remote access of mobile workers, according to the latest SafeNet survey. Furthermore, only 8 percent used smart cards and 3 percent employed biometric measures.

"Organizations trying to reap the benefits of mobile working without adopting adequate security technology and processes are sitting on a security time bomb," warned Gary Clark, Vice President of EMEA at SafeNet. "Passwords have not been sufficient security for years now. They are too easily compromised. Employing layers of security, such as tokens and smart cards and granular authorization, where network access is dependent on the worker’s location and position, is critical," he added.
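
For readers unfamiliar with how such tokens work, the sketch below implements an HMAC-based one-time password in the style of RFC 4226 (HOTP) using only the Python standard library: the token and the server derive matching short-lived codes from a shared secret and a moving counter. The secret, counter values, and six-digit length are illustrative defaults, not a description of any particular vendor's product.

```python
# Minimal HMAC-based one-time password (HOTP, RFC 4226 style) sketch showing
# how a token and a server can agree on short-lived codes from a shared
# secret plus a moving counter. Secret and counter values are illustrative.

import hashlib
import hmac
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """Derive a one-time code from a shared secret and a counter."""
    msg = struct.pack(">Q", counter)                  # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                        # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

if __name__ == "__main__":
    shared_secret = b"example-shared-secret"          # provisioned onto the token
    for counter in range(3):                          # token and server advance together
        print(counter, hotp(shared_secret, counter))
```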

Guarding Credit Card Information

Excerpted from Boston Channel Report

If you can hardly get through a day without "charging it," then make sure you’re smart about security.

There are at least three things consumers can look for to feel certain that the company with which they are doing business online is taking credit card security seriously. "You’ve got to remember, it’s a financial transaction, and you have to think about it," said Michael Maggio, CEO of Boston’s Newbury Networks.

First, online shoppers should look for little symbols on a company’s website that resemble a lock, or say VeriSign. It’s a sign that your personal information is secure. You should also make sure the address in the web browser where you’re entering personal information says "https." The "s" stands for secure.
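
The "look for https" advice can also be checked programmatically. The hypothetical helper below, built on Python's standard library, confirms that a URL uses the HTTPS scheme and that the server presents a certificate the local trust store accepts; it is a minimal sketch of the idea, and a valid certificate alone does not prove a merchant is reputable.

```python
# Minimal sketch: confirm a URL uses HTTPS and that the server presents a
# certificate trusted by the local certificate store. Illustrative only;
# a valid certificate does not by itself prove a merchant is reputable.

import socket
import ssl
from urllib.parse import urlparse

def uses_trusted_https(url: str, timeout: float = 5.0) -> bool:
    parsed = urlparse(url)
    if parsed.scheme != "https":
        return False                                   # the "s" is missing
    context = ssl.create_default_context()             # verifies cert and hostname
    try:
        with socket.create_connection((parsed.hostname, parsed.port or 443),
                                      timeout=timeout) as sock:
            with context.wrap_socket(sock, server_hostname=parsed.hostname):
                return True                            # handshake and validation succeeded
    except (ssl.SSLError, OSError):
        return False

if __name__ == "__main__":
    print(uses_trusted_https("https://www.example.com/checkout"))
```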

Second, if you get an unsolicited e-mail asking you to purchase something, "go back to that site, not through the link in the email, but type it in yourself," Maggio said. "Only do business with people you’re comfortable with."

Third, it’s a good thing if companies ask you to enter that 3- or 4-digit code on the back of your credit card. If someone has stolen your credit card number, chances are they wouldn’t have that code.

Certainly nothing is foolproof, but knowing these three tips could keep you from becoming one of the 10 million Americans whose personal information is stolen each year.

From Russia with Malice

Excerpted from The Age Report by Nick Miller

If it weren’t true, it would be the script for the next Bond movie.

The mission: to eliminate a man. Codename: "flyman". Elite hacker. Suspected head of the so-called "Russian Business Network," a hotbed of cyber-fraud, criminal obscenity, and malicious "bot-nets" that wreak havoc across the Internet from its St. Petersburg base.

"We don’t know who he is," admits Rick Howard, Director of Intelligence at Internet security company VeriSign. "We don’t know if it’s a hierarchical organization or a loose confederation of similar groups. But it’s organized.

"They are making millions of dollars a year. They are not greedy — they take a few dollars here and there and move on to the next victim. And we think their main guy has connections to the Russian government, and is protected by them."

Mr. Howard sips his macchiato — percolated, not stirred — as he reveals the details of VeriSign’s latest research into cyber-fraud.

The RBN manages networks of phishing sites and trojan programs, designed to steal banking passwords. The targets are individuals but the ultimate victims are the banks, who still compensate their customers for cyber-fraud losses.

Mr. Howard is head of the "iDefense" labs at VeriSign, hired by "three-letter agencies" and financial institutions in the US, Canada, and Australia to test their online defenses.

Controversially, it pays "cash for vulnerabilities" — funding the search for security holes in software such as Microsoft’s Windows or Oracle databases.

It also tracks the bad guys. Internet viruses and other malware used to be the realm of the amateur. But now it is a profession, Mr. Howard says.

"Their code targets a particular banking system," he says. "They get an intricate knowledge of how that system works and then write code that goes against it."

Once a user’s banking details are won, money is siphoned out and the computer is enslaved to a "botnet" used for spam distribution, or in mass denial-of-service (DoS) attacks against a corporate network or web server.

Late last month VeriSign published a 50-page report into the activities of the RBN, which it describes as "a criminal Internet service provider (ISP)".

It’s not quite SPECTRE, but VeriSign believes RBN owns a 155 megabit-per-second fiber-optic link from Russia to London to quickly process its transactions.

"Flyman", its head, is rumored to be the nephew of a powerful St. Petersburg politician. RBN’s sites are vipers’ nests of criminally obscene content servers, phishing sites designed to fool visitors into handing over their banking details, and repositories of trojan code and other malware.

"VeriSign iDefense believes that RBN is a for-hire service catering to large-scale criminal operations," the report says. "Some of these criminals, who may also belong to the RBN circle, are taking advantage of the services provided by the organization they created."

Phishing attacks on Westpac, National Australia Bank (NAB), and Commonwealth Bank have all been traced to the RBN sites.

In October 2006, NAB took active measures against RBN’s Rock Phish project, the report says. RBN botnets launched a major distributed denial of service (DDoS) attack against the bank, rendering its homepage inaccessible for three days.

However, Mr. Howard says Australian banks are generally well protected. "They are world class," he says. "When we talk to your financial sector they are articulate and know what they’re talking about. They always have the best questions."

He is in town to talk to banks and government bodies about the latest online threats — and to predict future threats.

China is a worry. "Traditionally their attacks are patriotic," Mr. Howard says. "Last year a Chinese group launched a significant attack on US Government offices." The attack used a vulnerability in Microsoft Office to steal millions of unclassified documents.

"This looks like espionage," Mr. Howard says. "But they were 12 guys, hackers for hire. They have the capability to branch out into monetary cyber-crime and they are probably going to."

VeriSign is also preparing a report on "disruptive" new technology such as online worlds and Internet-enabled mobile phones, which could radically change the battleground for the bad guys, and the people who fight them.

But that’s another movie.

Data Protection and Retention

The latest government and industry regulations require IT management to place increased emphasis on data protection and retention while safeguarding user privacy. These changes give organizations a new opportunity to re-examine their storage and backup strategies to improve overall IT operations.

For example, a critical factor with any regulatory compliance storage strategy is the recognition that very large amounts of data do not reside in central facilities. In fact, due to user habits and the decentralization of most businesses, large volumes of data that must now be protected sit on servers in remote offices and on user desktop and laptop computers, all of which are often not included in systematic backups.

As a result, more companies are using remote data protection services as an alternative to traditional backup. Not only does this approach help meet compliance requirements, but it can also help with business continuity and recovery efforts in the event of a disaster or computer crash.

Arsenal Digital has created Rising Against the Winds of Regulatory Compliance, a business brief that discusses the role of remote data protection services in safeguarding all of your data and ensuring your company meets new data protection requirements. The paper covers the attributes a regulatory compliance storage strategy must embrace, including reliability, security, centralization, scalability, accessibility, and service quality.

The Consumer Data Tug of War

Excerpted from CRM Buyer Report by Andrew Burger

Opposing tensions are much in evidence in the legal wrangle over individual privacy rights and security in the digital environment. The dizzying pace of technological innovation continues as individuals and organizations public and private struggle to come to grips with its implications and ramifications.

There’s a clear trend towards strengthening personal privacy rights, as evinced by a slew of federal and state legislation, such as the Federal Information Security Management Act (FISMA) and California’s SB 1386.

On the other hand, there is clearly concern that an individual’s right to privacy is being steadily eroded by governments responsible for protecting citizens and by multinational corporations, whose tremendous power and influence over political and legislative processes has become a given.

The list of recent legislation dealing with protecting individuals’ data and privacy rights is a long one. Prominent among them at the federal level are FISMA, the Gramm-Leach-Bliley Act (GLBA), the Health Insurance Portability and Accountability Act (HIPAA), and the Sarbanes-Oxley Act (SOX). Added to that list is legislation concerning customer and individual privacy rights on the books in a majority of US states, as well as privacy and security regulations.

"California’s SB 1386 and the equivalent 30-plus state laws in other states are the laws with the most specific teeth around the protection of personally identifiable information (PII)," Ram Krishnan, Senior Vice President of Products and Marketing at GuardianEdge Technologies, told CRM Buyer.

These and other pieces of legislation, along with related self-imposed regulations, cut horizontally across and vertically down economic and industry sectors.

"Many of the verticals that have seen activity in data protection are all influenced to some extent by data protection legislation. Government, financial services, and healthcare are very good examples and all of these have been affected by the state laws like SB 1386," Krishnan continued.

The growth of P2P file sharing and other distributed web services has also heightened concerns about privacy rights incursions and abuses. The US House of Representatives’ Committee on Oversight and Government Reform this week held a Congressional Hearing aimed at exploring potential privacy and security concerns associated with the use of P2P file-sharing programs.

"With respect to safeguarding private information, current leading P2P software requires users to take multiple affirmative steps in order to share files that may include personal data," Marty Lafferty, CEO of the Distributed Computing Industry Association (DCIA), told CRM Buyer.

"P2P software suppliers have also affirmed their commitment to further reduce risks and competitively enhance both the safety and value of the user experience on behalf of their consumers and the public at large."

The DCIA is also willing to contribute to the dialog taking place with the Patent and Trademark Office (PTO), which issued a report on P2P networks and privacy concerns in March and more recently has been corresponding with the House committee, which in turn has been communicating with two leading US-based P2P software developer/distributors regarding consumer disclosures, default settings, recursive sharing, uninstallation procedures, and other topics, he added.

"Because of both the technical complexity and relatively fast-moving innovation in this area, a federally mandated and closely monitored private sector initiative – rather than even the best intentioned legislative measure – will produce the most beneficial effect to the public and to government agencies whose sensitive and confidential information must be protected as a matter of national security," Lafferty concluded.

Feds Should Clean up Their Own Act

Excerpted from ZDNet Report by George Ou

Every once in a while you’ll get a political hearing on Capitol Hill where elected government officials will grandstand and politicize issues that should have nothing to do with politics.

This time it’s Government Reform Committee Chairman Henry Waxman who says he is considering new laws against P2P software citing the possibility that P2P software may compromise national security and be used by organized crime. The problem is that Mr. Waxman hasn’t a clue what he’s talking about and this new round of political grandstanding is absurd.

The federal government should clean up its own security act because year after year it gets failing or near-failing grades. Mr. Waxman is slamming LimeWire for producing software that may circumvent federal government security, but the real question is why federal government IT departments are allowing federal employees to install LimeWire or any other piece of software on government computers.

The mere fact that government employees have administrative access to install software on their computers, let alone on computers with access to sensitive information, is absurd. If you can’t even keep employees from installing LimeWire, you’re sure as hell not going to prevent them from installing rootkits, which are infinitely more destructive.

Why pick on LimeWire? Sandy Berger stole secret documents from the National Archives by shoving the documents into his socks – so will Congress propose a new law against socks? Will Congressman Waxman call the CEO of Fruit of the Loom to a hearing and grill him about the dangers of socks? If we’re afraid that federal employees will use P2P software to divulge national secrets, shouldn’t we be afraid they’ll use the fax machine too?

Shouldn’t we be more worried about the type of employees we place into sensitive positions? The onus is on the government or any organization to lock down its infrastructure from the physical layer to the application layer to the people working for it.

File Sharing Not a National Security Threat

Excerpted from Washington Post Report by Sam Diaz

The makers of P2P file-sharing software such as LimeWire are no strangers to controversy. Hollywood has been battling file sharing over the Internet for years as a way to curb music and video infringement. But now, Congress is back in the debate, alleging that P2P software can pose a "national security threat."

It appears that sensitive or classified documents – military orders, terrorist threat assessments, accounting documents, tax returns, medical records and more – could fall into the wrong hands if government employees who install file-sharing software on their computers aren’t careful about which files and folders they share. According to CNET, members of the Government Reform Committee told LimeWire Chairman Mark Gorton at a hearing on Tuesday that his company also might be exposed to legal liability if someone’s income tax returns ended up on the Internet for anyone to see because the file-sharing software put them out there.

Here are a few questions to consider: Why are government employees installing file-sharing software on government-issued computers where these files are stored? Isn’t that against government policy and regulation? I’m not allowed to install P2P software on my work computer. Are you?

If these are their own personal computers, then why would sensitive or classified information be on them in the first place? Why is sensitive or classified information being stored locally on any computer that could leave the confines of a secured office? Have we learned nothing from the data breaches that stem from laptop thefts?

In TechDirt this morning, Mike Masnick was a bit harsher. He wonders why file-sharing system providers should take the blame for the stupidity of government employees – and politicians. He singles out Rep. Jim Cooper (D-TN) who reportedly blasted Gorton during the hearing and told him, "You seem to lack imagination about how your product can be deliberately misused by evildoers against this country."

Masnick’s response: "That’s laughably wrong. The misuse isn’t by so-called ‘evildoers.’ It’s by government employees who are disobeying policy and stupidly revealing confidential documents by misusing the software. This is yet another case where politicians want to regulate a technology they don’t understand."

Universities Win Fight over Anti-P2P Proposal

Excerpted from CNET News Report by Declan McCullagh

Senate Majority Leader Harry Reid has withdrawn anti-file sharing legislation that had drawn yowls of protest from universities this week.

Reid, without explanation, on Monday nixed his amendment that would have required colleges and universities – in exchange for federal funding – to use technology to "prevent the illegal downloading or P2P distribution of intellectual property."

Instead, Reid replaced it with a diluted version merely instructing higher education institutions to advise their students not to commit copyright infringement and tell students what actions they’re taking to prevent "unauthorized distribution of copyrighted material" through campus networks.

The revised version was tacked onto the Higher Education Reauthorization Act on Tuesday, a Reid spokesman said in a telephone conversation; the Senate then approved the bill by a 95-0 vote.

The original version, which had more teeth, alarmed lobbyists for universities, which tend to be delighted to accept federal largesse but rather dislike the government placing conditions on the cash.

Even worse, in their opinion, must have been the additional requirement (also now deleted) that the Department of Education annually identify the 25 colleges and universities receiving the "highest number of written" complaints from copyright owners.

Educause, a group that represents universities and related organizations, sent out an "URGENT CALL TO ACTION" on Friday that called Reid’s original amendment "yet another attempt by the federal government to dictate the day-to-day operations of colleges and universities." It urged recipients to phone Congress immediately "and tell them how much higher education opposes this amendment."

It’s unclear why the Senator yanked his original anti-P2P amendment on Monday evening, but the most obvious explanation is that the last-minute pressure worked.

The final version amends existing federal law that already deluges students with piles of paperwork they never read on topics like faculty listings, special facilities for the handicapped, accreditation information, graduation rate statistics, campus crime reports, and so on.

Now the copyright infringement information will be added to the stack. ("Information required by this section shall be produced and be made readily available upon request, through appropriate publications, mailings, and electronic media, to an enrolled student and to any prospective student.")

We should note that by "final," in true Washington fashion, we don’t actually mean final. It may be final at the moment, but because the broader bill includes controversial components like $17 billion more on taxpayer-subsidized student loans and debt forgiveness, it may not necessarily become law.

The House of Representatives has approved a different version of the proposed legislation, and the Bush administration has said either version could amount to an unacceptable increase in spending.

Still, the Motion Picture Association of America (MPAA) seems to have decided that even the diluted final amendment is better than the current state of the law, and put a good face on the outcome. In a press release on Tuesday afternoon, the MPAA called the Senate vote a "major step" to combat infringement on campus, and included its estimate that movie piracy among students accounts for "more than half a billion dollars loss to the US industry annually."

BitTorrent Support for AllPeers

Excerpted from Ars Technica Report by Eric Bangeman

Although AllPeers has used the BitTorrent protocol to share files, "What we haven’t had was standard BitTorrent functionality," AllPeers Co-Founder & CTO Matthew Gertner told Ars Technica. "So we decided to add that. Having it within Firefox makes it easier for non-technical people to use."

Gertner reiterated that, unlike BitTorrent, the AllPeers network is completely private. "All sharing is done between authorized peers," said Gertner. He believes that makes it an optimal solution for sharing user-produced HD video, photos, and other content between a select group of people without running into drawbacks inherent in using other third-party platforms for sharing content.

AllPeers plans to make money in a couple of ways. The first is an always-on feature that uses Amazon S3 to store shared content for a limited time. "With a P2P network, the disadvantage is that people go offline at times," Gertner explained. "We offer the capacity to upload shares to Amazon S3 and make them available to people via e-mail invitation." AllPeers plans to begin instituting a payment model for that service.
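
As a rough illustration of that hybrid pattern (serve from peers when they are online, fall back to hosted storage when they are not), the sketch below uploads a shared file to an Amazon S3 bucket with the boto3 library and produces a time-limited download link of the kind that could go into an e-mail invitation. The bucket name, key, and expiry are placeholders, and this is not AllPeers' actual implementation.

```python
# Hypothetical sketch of the "always-on" fallback pattern: when no peer is
# online to serve a shared file, push it to hosted storage (Amazon S3 here)
# and hand out a time-limited link. Bucket and key names are placeholders.

import boto3

def publish_fallback_copy(path: str, bucket: str, key: str,
                          expires_seconds: int = 7 * 24 * 3600) -> str:
    """Upload a shared file to S3 and return a temporary download URL."""
    s3 = boto3.client("s3")
    s3.upload_file(path, bucket, key)                  # store the shared file
    return s3.generate_presigned_url(                  # link for the invitation e-mail
        "get_object",
        Params={"Bucket": bucket, "Key": key},
        ExpiresIn=expires_seconds,
    )

if __name__ == "__main__":
    url = publish_fallback_copy("holiday_video.mp4",
                                bucket="example-share-bucket",
                                key="shares/holiday_video.mp4")
    print("Send this link to invited friends:", url)
```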

Further on, AllPeers wants to get into media content sales like many other P2P services.

When asked about AllPeers’ potential as a P2P darknet, Gertner said that his company doesn’t encourage customers to share copyrighted content and that, longer-term, his vision is to provide a viable alternative for content providers. Gertner is also quick to point out that AllPeers respects users’ privacy. "It’s none of our business to look at any of the videos or photos we’re sharing," Gertner said. "People can feel secure about their privacy on AllPeers. Besides, if we started to spy on people, they would just move to another network."

For now, AllPeers will continue to hitch its wagon to Firefox. Gertner believes AllPeers – itself now available under an open-source license – can help drive adoption of Firefox, usage of which has moved past the 40 percent mark in some European countries.

"We’re seeing lots of usage in Europe," Gertner said. Longer-term, AllPeers plans to use XULrunner, a runtime for XUL-based applications that allows developers to easily code cross-platform applications entirely with XUL. Moving to XULrunner would allow AllPeers to be distributed as a standalone app as well as a Firefox plug-in.

Babelgum Could be MySpace for P2PTV

Excerpted from Daily IPTV Report by David Cotriss

With the popularity of sites like YouTube and MySpace, it’s no surprise that Babel Networks is ushering in a new era of online video sharing with its Joost-like service. The offering provides professionally produced content via channels that learn user preferences combined with the lean-back experience of standard TV at near-TV resolution.

Babelgum founder Silvio Scaglia put up $13 million of his own money and raised another $292 million to fund the start-up. The company was recently featured in our list of top industry movers and shakers. Here, we interview him about the company’s offerings and plans for the future.

DailyIPTV: Describe your peer-to-peer television (P2PTV) service and how it differs from competitors like Joost.

Silvio Scaglia: Babelgum is creating a global personal media platform distinct from the national mass media offered by traditional broadcasters and new IP-based providers. Babelgum is available to anyone in the world with a PC and a broadband connection and via the open Internet. That gives us the opportunity to address worldwide markets, and while our initial focus is on English-language material, our plans include expansion into other languages, with the next logical versions being for Chinese- and Spanish-speaking markets.

What is most important about Babelgum is that it does not seek to replicate the mass media that is already available. We aim to satisfy individual passions and interests with a vast library of content that provides the kind of choice that simply isn’t possible with terrestrial, cable, or satellite. Niche markets require niche content, and we expect 80 percent of content available on Babelgum to be in that category with the remainder having some degree of mass appeal.

If, for instance, you are passionate about extreme sports, we expect to provide a full library of on-demand and relevant content rather than one or two pieces. Our users can completely personalize their experience using Babelgum’s Smart Channels facility to define their content preferences and create personal channels. The platform then learns from users’ implicit and explicit choices to push relevant material and to rank it according to user popularity instead of editorial policy. It is a truly democratic environment in which both the user and content owner benefit.
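
The mechanism Scaglia describes, blending a user's declared interests with observed popularity, can be illustrated with a small scoring sketch. The weights, tags, and sample clips below are hypothetical and merely stand in for whatever Babelgum's Smart Channels actually do.

```python
# Illustrative ranking sketch: score items by how well their tags match a
# user's declared interests, blended with overall popularity. Weights and
# sample data are hypothetical, not Babelgum's actual algorithm.

from dataclasses import dataclass

@dataclass
class Clip:
    title: str
    tags: set
    views: int                     # proxy for popularity across all users

def rank(clips, interests, preference_weight=0.7):
    max_views = max(c.views for c in clips) or 1

    def score(clip):
        match = len(clip.tags & interests) / max(len(interests), 1)
        popularity = clip.views / max_views
        return preference_weight * match + (1 - preference_weight) * popularity

    return sorted(clips, key=score, reverse=True)

if __name__ == "__main__":
    library = [
        Clip("Big-wave surfing",  {"extreme sports", "surfing"}, 12000),
        Clip("City council news", {"news", "local"},             90000),
        Clip("Backyard BMX",      {"extreme sports", "bmx"},      4000),
    ]
    for clip in rank(library, interests={"extreme sports", "surfing"}):
        print(clip.title)
```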

DailyIPTV: Your company just announced 30 new content deals. Can you elaborate a bit on that?

Silvio Scaglia: The 30 content deals we announced have already been supplemented by others, and they are just the start. We are open to all providers of content subject to the content being of professional standard and meeting standards of taste and decency. We are not interested in user-generated content since this is well catered for elsewhere, but we are interested in talking to filmmakers with assets applicable to the widest range of niche and mainstream markets.

At the moment, we are in the beta stage of development with around 2,000 hours of content available. That figure grows daily and will expand to around 50,000 to 100,000 hours prior to our commercial launch toward the end of the first quarter of 2008. During our development stage, content owners benefit from a minimum-revenue guarantee and will share in advertising revenue at commercial launch.

DailyIPTV: Your service faces stiff competition from the likes of Joost and dozens of other P2PTV services. How are you dealing with that?

Silvio Scaglia: At the moment, there is no competition in this market at all. There are a number of players launching a variety of different models, but at the moment, we are all building a market rather than seeking to divide it up. That’s a healthy environment that encourages innovation and enterprise, and I welcome it. What counts at the moment is having a viable strategy, the passion, entrepreneurial spirit, commitment, and funding available to address a global market. These are all elements that Babelgum possesses.

DailyIPTV: We’ve been seeing fewer people watching regular TV and more moving online. To what degree do you think this will occur and evolve in the next few years? Will P2PTV bring them back to the TV screen?

Silvio Scaglia: The Internet is a naturally interactive platform, and it should be no surprise, given the content available, that particularly younger audiences are spending more and more of their leisure time surfing the web. What the Internet has lacked until now is the secure, full-screen video capabilities and professional content of traditional TV. The combination of full interactivity, high-quality video, and potentially unlimited choice creates an entirely new and compelling proposition that delivers the best of the active sit-forward experience of the Internet with the passive lean-back experience of television. At Babelgum, we believe this combination of lean-forward and lean-back is the natural successor to the capacity and interaction restrictions inherent in existing broadcasting and, as a result, the market has dramatic growth potential.

DailyIPTV: Where do you believe the P2PTV market will be five years from now? How many screens will it reach?

Silvio Scaglia: There are no technical restrictions to delivering unlimited libraries of content via the open Internet on a global basis, and the existing audience, just in English-speaking markets, already exceeds 300 million people with a PC and a broadband Internet connection. That market alone is growing exponentially, and significantly greater numbers are possible from developing broadband markets such as China. Added to that is the potential to connect directly to televisions either via IP set-top boxes, integrated TVs, or by wired or wireless connections from the PC. All approaches will soon be available to the mainstream consumer market. Taking those factors into account, it is easy to see the market growing to many hundreds of millions in a relatively short space of time.

Can P2PTV Replace Cable & Satellite TV

Excerpted from NewTeeVee Report by Om Malik

Theoretically yes! Especially now that ISPs like AT&T and Verizon have started to talk about broadband connections that zap data back and forth at speeds in excess of 100 megabits per second. As the speeds increase, the ability to call up niche content will theoretically be faster than a blink of an eye. From Azureus to Zattoo – the future is full of video, says The Modesto Bee.

"We’ll move from a world of 300 channels to 3 million" predicted Michael Liebhold, a Senior Researcher at the Institute for the Future, a Palo Alto think tank.

While it is easy to get carried away in the euphoria offered by broadband nirvana, I would like to temper such unbridled enthusiasm. Between 300 channels and 3 million stand such pesky problems as infrastructure investment and whether video content made for a handful of people can actually make money.

Fears that consumers will abandon cable in droves to watch online videos may be overblown, said Bruce Leichtman, head of Leichtman Research Group in Durham, NH.

"Television already works pretty darn well," he said. "P2PTV will augment and complement television. To think it will replace TV is where people are getting carried away."

I concur — maybe not 3 million, but 3,000 would be more like it. After all, who wouldn’t want a dedicated channel of "Monk" episodes? Or, for that matter, of BBC do-it-yourself shows?

P2P Streaming & P2PTV Workshop

P2P file sharing has become increasingly popular, accounting for as much as 70 percent of Internet traffic by some estimates. Recently, we have been witnessing the emergence of a new class of popular P2P applications, namely, P2P audio and video streaming and peer-to-peer television (P2PTV).

While traditional P2P file distribution applications target elastic data transfers, P2P streaming and P2PTV focus on the efficient delivery of audio and video content under tight timing requirements. Though still in their infancy, both live and on-demand P2P streaming and P2PTV have the potential to change the way we watch TV, providing ubiquitous access to a vast number of channels, personalizing the TV experience, and enabling roaming TV services.

To date, a number of architectures have been suggested, using either the tree-based push approach (e.g., Narada and SplitStream) or the mesh-based pull approach (e.g., CoolStreaming). Further improvements are possible by taking advantage of advanced source and channel coding techniques such as layered coding, multiple description codes, fountain codes, and network coding.
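
To make the mesh-based pull idea concrete, the sketch below shows a simplified chunk scheduler of the kind such systems use: among the chunks a peer is still missing and that some neighbor advertises, it requests the one closest to its playback deadline, preferring rarer chunks on ties. The data structures are invented for illustration and do not reproduce any of the systems named above.

```python
# Simplified mesh-based pull scheduler: pick the missing chunk with the most
# urgent playback deadline, preferring rarer chunks on ties. Invented data
# structures for illustration; not any named system's actual algorithm.

def next_request(missing, neighbor_buffers, playback_position):
    """Return (chunk_id, neighbor) for the next pull request, or None."""
    candidates = []
    for chunk in missing:
        holders = [n for n, chunks in neighbor_buffers.items() if chunk in chunks]
        if not holders:
            continue                                  # nobody can serve it yet
        urgency = chunk - playback_position           # smaller = needed sooner
        candidates.append((urgency, len(holders), chunk, holders))
    if not candidates:
        return None
    urgency, rarity, chunk, holders = min(candidates)  # most urgent, then rarest
    return chunk, holders[0]

if __name__ == "__main__":
    neighbor_buffers = {
        "peer-a": {101, 102, 105},
        "peer-b": {103, 104, 105},
    }
    print(next_request(missing={103, 105}, neighbor_buffers=neighbor_buffers,
                       playback_position=100))
```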

Given the initial success of P2P live streaming and P2PTV, questions still remain about how to extend the existing P2P systems to support advanced applications with more stringent requirements such as video-on-demand (VOD) services. Furthermore, with incipient deployment of P2PTV systems, there are still interesting open research challenges on how P2P media streaming can complement P2PTV systems, providing advanced VOD features, and enabling access to a larger selection of content.

The DCIA and Microsoft Research encourage you to participate in a day-long pre-Sigcomm 2007 P2P Streaming & P2PTV Workshop on Friday August 31st in Kyoto, Japan. Please contact michiko@dcia.info for more information.

The aim of this workshop is to gather a group of experts presenting early work in this area and to generate discussion around the technical challenges and future of P2P streaming and P2PTV systems.

Internet via Power Lines Set to Launch

A plan to offer consumers Internet access through their home’s power lines may soon come to fruition for residents of Grand Ledge, MI.

City residents may soon benefit from an agreement between Michigan’s Consumers Energy and the Los Angeles, CA company Utility.net to test the technology in regions where Internet service is typically not available through cable or phone lines.

Grand Ledge fits the description.

"We are pleased to see our existing infrastructure being utilized to provide additional choices and options for broadband Internet access," Consumers Energy Project Manager Gerry Wyse said. "Utility.net intends to bring broadband to communities in central Michigan that have few or no broadband provider choices today."

The new Internet service, which uses radio frequencies to send data, could be offered on a nationwide scale if the tests prove successful.

Ultramercial Chosen by JiWire for Wi-Fi

JiWire, provider of the world’s leading mobile broadband advertising network, announced an alliance with Microsoft to deliver cutting-edge advertising to users of municipal Wi-Fi networks, in cities including Portland, OR and Oakland County, MI. The Portland network currently has over 13,000 monthly users, and is owned and operated by MetroFi, which works with Microsoft/MSN to deliver locally relevant content and advertising on the network. The Oakland County network, run by MichTel Communications, will be the nation’s largest municipal Wi-Fi deployment, covering 910 square miles and 1.2 million people once fully deployed.

Where agreed with the network owner, advertisements delivered as a result of the JiWire-Microsoft partnership may include Ultramercial ad units, which are patented interactive advertisements that Wi-Fi network users opt-in to view in exchange for free network access; advertisements on screens displayed before and after a user logs into the Wi-Fi network; and advertising that appears while a Wi-Fi network user is browsing the Internet. Many of the advertising campaigns delivered on the municipal networks will be from advertisers participating in JiWire’s mobile broadband advertising network.

Pando Wins Top Private Company Award

Pando Networks, a P2P content distribution technology developer whose users download and share large media files, has been chosen by AlwaysOn as one of the AO100 Top Private Companies for 2007. The fifth-annual elite AO100 list was compiled by AlwaysOn’s editorial panel. To be eligible for the list, companies had to be peer-nominated. AlwaysOn received more than 1,000 nominations from venture investors, investment bankers, and other industry experts.

Pando and the other AO100 Top Private Companies for 2007 are to be honored at the AlwaysOn Stanford Summit from July 31st – August 2nd at Stanford University. The Stanford Summit is an executive gathering highlighting significant economic, political, and commercial trends affecting global technology industries. The AO100 list identifies the most promising global technology entrepreneurial opportunities and investments.

Pando’s patent-pending P2P platform includes a free 3MB desktop application that enables consumers to subscribe to and download full-screen HD Internet TV, videos, audio, photos, and games and easily share those files – or their own – via e-mail, IM, or the web. Pando is available in Windows, Mac, and Linux versions. More than 10 million people have downloaded Pando since its launch 14 months ago.

"Distributing HD video over the Internet is happening now and will only accelerate," said Pando CEO Robert Levitan. "Pando’s rapid growth combined with this AlwaysOn Top 100 award signals that our products are going to play a critical role in delivering increasingly larger media files that current Internet technologies cannot support."

BlueMaze Entertainment Virtual CD Scores

BlueMaze Entertainment’s patent-pending Virtual-CD (VCD) is an emerging standard in online music and new media promotion. The VCD is a digital representation of a physical CD – including artwork, jewel case, and liner notes. The proprietary application is based on a VCD/DVD concept, in which consumers are able to listen to music, view photos and video, and read about featured artists or songs, all in a self-contained platform.

Additionally, consumers can share music with friends via the "Send VCD to A Friend" feature, enabling a buzz-generating effect.

Universal Music Group’s (UMG) Machete Music is using three versions of BlueMaze’s VCD to boost recording artist Notch’s online presence in preparation for the release of his album "Raised by the People."

The different versions of the VCD vary track orders and artwork, emphasizing the artist’s diverse musical background and influences. The first is R&B/Hip Hop, the second Caribbean/Dancehall, and the third heavy on the Latin influence.

This is all part of a new digital landscape and evolving marketing mix, which is becoming increasingly dependent on P2P to drive early buzz and sales. Beyond quicker turn times than traditional CD manufacturing, the benefits that the VCD offers include flexibility, unlimited distribution potential, and an integrated viral marketing capacity.

According to Mitch Towbin, Co-Founder and Director of Marketing for BlueMaze, "It’s obvious that this is where music marketing is heading… with valuable Internet-based statistics and focused targeting supporting a more analytical approach. Ultimately marketing execution and delivery will be both hyper-local and customized in order to be effective."

The VCD application is currently being used to showcase physical products online, tease up-and-coming releases, virally enhance marketing and promotional efforts, revitalize previous albums and back-catalogs, and to drive traffic to online properties, retail stores, and events.

RIAA Admits its Legal Campaign is Useless

Excerpted from The Inquirer Report by Nick Farrell

The Recording Industry Association of America (RIAA) has admitted that its lawsuit campaign against people it calls pirates is not the answer to the problem.

According to the RIAA’s own figures, more than 7.8 million US households downloaded unauthorized music in March 2007, compared with 6.9 million households in April 2003, when the litigation campaign began.

In an interview, Jonathan Lamy, a spokesman for the RIAA said that litigation generated more heat, friction, and headlines.

He thought it was better to follow an aggressive licensing strategy and offer legal alternatives. Lamy said that this was a better way to win over fans – although it would continue with its legal strategy anyway.

John Palfrey, a Clinical Professor of Law at Harvard Law School and Executive Director of the Berkman Center for Internet and Society said that litigation has not made a meaningful dent in how much copyright infringement goes on among American young people.

He said that it represents a signal that the recording industry is out of step.

Coming Events of Interest

  • Edinburgh Television Festival – August 24th-26th in Edinburgh, Scotland. Janus Friis, Co-Founder of P2PTV service Joost, will deliver the inaugural Futureview Lecture at this year’s festival. The aim of this year’s event is to assemble a cast list from the hottest shows, the most exciting new technologies, and the biggest TV controversies of the year.

  • International Broadcasting Convention (IBC) – September 6th-11th in Amsterdam, Holland. IBC is committed to providing the world’s best event for everyone involved in the creation, management, and delivery of content for the entertainment industry, including DCIA Members. Run by the industry for the industry, convention organizers are drawn from participating companies.

  • Broadcasting 2017 – September 27th in London, England. What do the leading players in television and new media across Europe predict for the future of broadcasting? Find out during "Broadcasting 2017" at BAFTA. Join industry giants such as Silvio Scaglia, Founder & Chairman of Babelgum, for a day of stimulating sessions.

  • PT/EXPO COMM – October 23rd-27th at the China International Exhibition Center in Beijing, China. The largest telecommunications/IT industry event in the world’s fastest growing telecom sector. PT/EXPO COMM offers DCIA participants from all over the world a high profile promotional platform in a sales environment that is rich in capital investment.

  • P2P Advertising Upfront – Sponsored by the DCIA October 26th in New York, NY and October 29th in Los Angeles, CA in conjunction with Digital Hollywood Fall. The industry’s first bicoastal marketplace focused on the unique global advertising, sponsorship, and cross-promotional opportunities available in the steadily growing universe of open and closed P2P, file-sharing, P2PTV, and social networks, as well as peer-assisted content delivery networks (CDNs).

  • P2P MEDIA SUMMIT LV – January 6th in Las Vegas, NV. This is the DCIA’s must-attend event for everyone interested in monetizing content using P2P and related technologies. Keynotes, panels, and workshops on the latest breakthroughs. The Conference will take place in N260 in the North Hall of the Las Vegas Convention Center and the Conference Luncheon in N262-264. This DCIA flagship event is a Conference within CES – the Consumer Electronics Show.

Copyright 2008 Distributed Computing Industry Association