Distributed Computing Industry
Weekly Newsletter

In This Issue

P2P Safety

P2PTV Guide

P2P Networking

Industry News

Data Bank

Techno Features

Anti-Piracy

April 12, 2010
Volume XXX, Issue 5


Court of Appeals Overturns FCC Order

Excerpted from Dow Lohnes Broadband Update Report

This week, the United States Court of Appeals for the District of Columbia Circuit released a decision overturning the US Federal Communications Commission's (FCC) August 2008 order holding that Comcast had violated the 2005 Internet Policy Statement by blocking customer access to BitTorrent, a file-sharing service.

The court's unanimous decision concludes that the FCC "failed to tie its assertion of ancillary authority over Comcast's Internet service to any 'statutorily mandated responsibility.'"

It therefore vacated the FCC's 2008 order. The court's decision rests on the basic principle that the FCC "was not delegated unrestrained authority" under the Communications Act.

The FCC's authority to enforce its 2005 Internet Policy Statement or to regulate Internet service providers' (ISP) network management practices or other activities will depend upon the FCC directly linking its actions to a specific "statutory delegation of regulatory authority."

The text of the decision is available from the court's website here.

Please click here for the full report on this important court decision by Dow Lohnes Members Jim Burger and J. G. Harrington, including an overview; the court's analysis, which describes the legal test, Brand X, Congressional policy, and specific statutory provisions; other arguments; implications; and next steps.

Jim Burger will be a featured speaker on the Policy Track at the P2P & CLOUD MEDIA SUMMIT on May 6th in Santa Monica, CA.

Comcast Wins Web Traffic Fight Against FCC

Excerpted from Reuters Report by Jeremy Pelofsky

A US appeals court dealt a setback to the FCC's authority to oversee the Internet, tossing out an agency ruling intended to force Comcast to change the way it managed its broadband network.

For years, the FCC, Internet providers, and public interest groups have squared off over potential regulations for governing access and management of high-speed Internet service, often described as the "Net Neutrality" debate.

The decision issued Tuesday will likely have major implications for future regulation of Internet access in the United States and consequences for the FCC and its Democratic Chairman, Julius Genachowski, who has made broadband his flagship issue.

The FCC in 2008, in response to customer complaints, cited Comcast for blocking users from some peer-to-peer (P2P) applications - often used to distribute large files such as television shows and movies - and ordered the company to stop.

While Comcast had said it would change its network management practices to ensure all Internet traffic was treated essentially the same, it asked an appeals court to review whether the FCC had the authority to impose such requirements.

The US Court of Appeals for the District of Columbia Circuit sided with Comcast and said that the FCC failed to show that it had the necessary authority to impose such restrictions on the provider's network operations.

"It relies principally on several Congressional statements of policy, but under Supreme Court and DC Circuit case law statements of policy, by themselves, do not create 'statutorily mandated responsibilities,'" the three-judge panel said.

"The Commission also relies on various provisions of the Communications Act that do create such responsibilities, but for a variety of substantive and procedural reasons those provisions cannot support its exercise of ancillary authority over Comcast's network management practices," they said.

The FCC, which has argued it has broad authority, last month unveiled an ambitious plan to upgrade Internet access for all Americans and shift spectrum from television broadcasters to support the huge demand for smart-phones and other wireless devices.

The ruling could have major consequences and will likely set off a flurry of lobbying at the FCC by Internet access and content providers like Google, Verizon, and AT&T.

An FCC spokeswoman said that the agency was still committed to pushing for unfettered Internet service. "It will rest these policies - all of which will be designed to foster innovation and investment while protecting and empowering consumers - on a solid legal foundation," said FCC spokeswoman Jen Howard.

The agency could ask the full appeals court to reconsider the decision or seek review by the US Supreme Court.

The FCC could also seek help from Congress, where lawmakers could rewrite the laws to provide the agency more explicit authority, or it could also try to rewrite its own rules to address the issue.

A Comcast spokeswoman said the company remained committed to the FCC's principles for an open Internet, and had filed the suit to "clear our name and reputation."

"We will continue to work constructively with this FCC as it determines how best to increase broadband adoption and preserve an open and vibrant Internet," said spokeswoman Sena Fitzmaurice.

The decision could free broadband providers from numerous requirements in the short term; however, the FCC could try to reclassify the service into a different category that would permit the agency to apply more regulations, one analyst said.

"Although today's decision is an immediate victory for broadband providers, they may have won the battle only to face a larger war," said Stifel Nicolaus analyst Rebecca Arbogast.

The case was Comcast Corp. v. Federal Communications Commission, US Court of Appeals for the District of Columbia Circuit, No. 08-1291.

Report from CEO Marty Lafferty

This week's widely discussed ruling by an Appeals Court, overturning a 2008 FCC order that involved prominent distributed computing industry participants, once again underscores the challenges faced by traditional government regulatory agencies and lawmakers in keeping pace with technology innovation in the Internet age.

Serious difficulties in trying to control various aspects of as large, complex, and ever-changing a phenomenon as the Internet point to the need for fresh thinking about what the roles of government institutions should be, and how they should exert the most beneficial influence on broadband networks and the applications that rely on them.

The DCIA believes that the Internet has represented the greatest positive example of human progress over the past several decades - with enormous unrealized potential still ahead - while its many remarkable advances arguably, and perhaps ironically, have been the least influenced in helpful ways by regulatory activity.

Unfortunately, the cumbersome, slow-moving, at times overbearing, and by definition centralized authority characteristic of long-established federal institutions is fundamentally at odds with the more-and-more efficient, rapidly evolving, highly empowering, and increasingly decentralized and competitive nature of web-based applications and broadband networks themselves.

Given the events of this week, which are manifestations of a larger ongoing conflict over the concept of "net neutrality," we urge officials to consider implementing an alternative - and unquestionably non-traditional - new-age tool to address Internet-related issues that arise posing potential problems to their ultimate constituents, consumers.

From the perspective of a global trade organization like the DCIA, the closest analogous example of this tool is the "working group." The basic process for implementing a working-group-like approach for government purposes could be summarized as follows: 1) carefully identifying the relevant parties to Internet-related disputes; 2) empowering them to work out optimal technological and business-practices solutions; and 3) holding them accountable for doing so.

The threat of bringing to bear old-school, blunt-instrument, draconian, innovation-stifling legal measures and regulations if the parties fail to achieve acceptable results under such a regime would provide ample incentive for the parties to produce.

Since 2007, in parallel with the FCC's attempt to control network management practices that was the subject of this week's judicial decision, the DCIA has sponsored and facilitated the P4P Working Group (P4PWG), to which both Comcast and BitTorrent have contributed substantially and in which both currently hold leadership roles.

By way of background, between 2000 and 2007 peer-to-peer (P2P) traffic grew from virtually non-existent to as much as 50-65% of downstream traffic and 70-80% of upstream traffic in many locales.

2007 marked a turning point for the emerging P2P industry, with P2P beginning to become part of the content delivery infrastructure in large-scale deployments, and content owners increasingly indicating a preference for integrated P2P and CDN solutions. Major content and CDN players started to select P2P technology partners to enhance their service offerings.

The P4PWG, which owes its existence to Verizon Communications, Pando Networks, and Yale University, was established to formulate an approach to P2P network traffic management as a joint optimization problem.

The objective of certain participating ISPs, for example, was to minimize network resource utilization by P2P services. The objective of certain participating P2P software firms, conversely, was to maximize throughput. The joint objective of both ISPs and P2P software developers was to protect and improve their customers' experience.

P4P itself was defined as a set of business practices and integrated network topology awareness models designed to optimize ISP network resources and enable P2P-based content payload acceleration.
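
To make the joint optimization concrete, here is a minimal, hypothetical sketch of locality-aware peer selection in the P4P spirit. In the published P4P research, an ISP-operated "iTracker" exposes network costs ("p-distances") between groups of hosts (PIDs), and a P2P tracker uses them when assembling peer lists. The function names, parameters, and cost table below are illustrative assumptions, not the P4PWG specification:

    # Sketch of P4P-style locality-aware peer selection (Python).
    # Illustrative only: the names and policy are assumptions. The idea:
    # rank candidate peers by the ISP-published cost ("p-distance") from
    # our network partition (PID) to theirs, prefer cheap ones, and keep
    # a random share of distant peers for swarm robustness.

    import random

    def select_peers(candidates, pdistance, my_pid, n=50, random_share=0.2):
        """candidates: list of (peer_id, pid); pdistance: {(pid_a, pid_b): cost}."""
        n_random = int(n * random_share)  # diversity quota
        n_local = n - n_random

        # Cheapest-first by ISP-reported cost; unknown pairs rank last.
        ranked = sorted(candidates,
                        key=lambda c: pdistance.get((my_pid, c[1]), float("inf")))
        chosen = ranked[:n_local]

        # Fill the remainder with random picks from the rest of the swarm.
        rest = ranked[n_local:]
        chosen += random.sample(rest, min(n_random, len(rest)))
        return [peer_id for peer_id, _ in chosen]

    # Example: staying inside "pid1" costs 1; crossing to "pid2" costs 10.
    costs = {("pid1", "pid1"): 1, ("pid1", "pid2"): 10}
    swarm = [("peer%d" % i, "pid1" if i % 2 else "pid2") for i in range(200)]
    print(select_peers(swarm, costs, "pid1", n=10))

Favoring low-cost peers is what reduces the ISP's backbone and transit utilization, while the throughput objective is served because nearby peers also tend to be faster; the random share keeps the swarm well connected.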

There are currently more than 50 active participating companies in the P4PWG representing ISPs, P2P software distributors, researchers, and service-and-support companies. In addition, there are now approximately 50 observers, representing vendors, cable multiple system operators (MSOs), content providers, and other interested parties. The P4PWG, like the Internet itself, has a global footprint, with broad worldwide representation.

From 2007 through 2009, the P4PWG was very active, first with US-based and then with overseas field trials, sub-group expansion, standards-setting activities, and related steps demonstrating substantial progress. Tellingly, the migration of P4PWG field work to non-US locales during this period coincided with increasing uncertainty regarding an expansion of US government intervention in this space.

At its highest level, the P4PWG represents the opportunity for partnerships among ISPs and P2P networks to address a potential problem from a private sector perspective. Real-world complexities of market forces, technological capabilities, financial considerations, etc. are built into the process.

Our point here is that this effort could have been even more productive to date with alignment and support from governmental authorities, rather than having to contend with the uncertain and divisive course they chose to take, which now, for the time being at least, has been reversed.

P4P was successfully field-tested by US-based companies AT&T, Comcast, and Verizon Communications, for example, working with Pando Networks, which uses the BitTorrent protocol, and Yale University; and key results of these trials were published.

One lesson from these trials was that it would not be inappropriate for ISPs to receive reasonable compensation from content providers using P2P for the services and delivery enhancements that ISPs may offer them through capabilities like P4P. Alternative, flexible financial arrangements may assist ISPs by providing the appropriate financial incentives to add significant capacity for such services in better alignment with traffic demands.

As noted, our P4PWG experience also demonstrated that uncertainty associated with the current US rulemaking process in this area caused US-based industry participants to temporarily reduce their active involvement in this important and effective process, which was addressing key areas of broadband network resource utilization and related P2P software functionality under the auspices of a voluntary private sector initiative.

The accomplishments of US firms contributing to the P4PWG from 2007 through 2008 far exceeded their successes in 2009. US-based field trials, sub-group expansion, standards-setting, and related activities demonstrating genuine productivity essentially moved offshore in 2009 given uncertainty about US regulatory intentions.

In 2010, P4P is continuing to provide the way to solve a pending bandwidth crisis before it becomes a serious threat and offering a means to collaboratively and cooperatively address future capacity concerns. New commercial products are being introduced that were motivated by the P4PWG's work; and the pace of progress is now expected to accelerate.

For the first time, there is the potential to have carrier-grade P2P with P4P, which in turn will open opportunities for innovative new services, based on the certainty that the fastest path from point A to point B on a network is via P4P-enhanced P2P.

Benefits to consumers include faster downloads, higher quality of service (QoS), and potential assurances of not being subject to service interruptions or degradation. In short, P4P enables content delivery that is more efficient for both the consumer and the network operator compared to alternative architectures.

Aren't these competitive market-based reasons for P4P to be successful clearly superior to regulatory edict?

The DCIA recognizes that, given the inherent dynamism and rapid growth of the Internet, flexibility is a critical component of network management. Therefore, lawmakers and government agencies should avoid adopting strict network management rules that could preclude new opportunities for collaboration and new business models between ISPs and application providers. They should adopt a working-group-like process as their tool of choice for Internet-related concerns.

It is through such initiatives that we will help to continually improve the experience of end-users accessing the applications and content of their choice over the Internet. Share wisely, and take care.

Ruling Raises Questions on FCC Regulation

Excerpted from Washington Post Report by Cecilia Kang and Frank Ahrens

At first glance, Tuesday's federal court ruling on Comcast looked like a clean win for the cable giant and for competitors including Time Warner and AT&T. The court, after all, ruled that Comcast could regulate high-speed Internet traffic over its own system and that a company that wanted to push its content through Comcast's pipelines could not.

But the ruling might be only the beginning of a long campaign between Internet service providers (ISPs) and companies such as Skype, Google, and Microsoft. The outcome is far from certain.

At issue is the wonky-sounding phrase "net neutrality." In 2008, the Federal Communications Commission (FCC) told Comcast and other big high-speed Internet companies that they must treat content that flows through their pipelines equally, whether it's digitally lightweight e-mail or hefty movie files, by pushing it all through at the same speed.

Comcast complained that certain kinds of Internet traffic are so heavy that they slow down the entire system. Essentially, Comcast wanted to be able to enforce speed limits on its information highway, moving the big, traffic-clogging Internet traffic into a slower lane. Comcast sued the FCC, and Tuesday, the US Court of Appeals for the DC Circuit sided with Comcast.

The immediate impact is on the FCC. The agency said Wednesday that the cybersecurity, privacy, and consumer-protection policies it had wanted to pursue under its net-neutrality authority could now be in jeopardy. Now the FCC must decide whether it wants to appeal or try to work around the ruling.

Experts wonder what the court's ruling might mean for the pending $30 billion merger of Comcast and NBC-Universal. Net-neutrality advocates fear that NBC's online content, such as episodes of "30 Rock," would be waved into the fast lane on Comcast's pipes, while content from rival companies - say, videos on the Google-owned YouTube - would get slowed down.

Comcast says that it manages congestion on its Internet network only by volume, not by the type of content. In other words, Comcast sees only a line of slow-moving trucks. It does not manage traffic based on what's inside the trucks.

The head of the big cable companies' trade group called Tuesday's ruling a victory, at least for now.

"While in the short run it's clearly a reaffirmation of status quo, which is good news, it raises uncertainty in terms of a regulatory or legislative response," said Kyle McSlarrow, chief executive of the National Cable & Telecommunications Association (NCTA).

The ruling frees the big cable companies from the threat of net-neutrality rules, which they say would significantly hamper their ability to manage traffic on their own networks and prioritize certain applications, such as those that block spam.

"Some public interest groups and non-profits have pointed to some kerfuffles over network owners interfering with traffic that was sent by a reproductive rights group and a charity; in both cases the broadband providers cleared things up quickly, but those are other examples of areas of concern," said Rebecca Arbogast, head of research at Stifel Nicholaus.

"The broadband providers make the argument that on the other side, if net-neutrality rules are adopted, it may interfere with their ability to prioritize traffic in order to provide Kindle- and Garmin-type services," she said, referring to the Amazon e-book and the Global Positioning System (GPS) navigation system.

But the FCC could work around the Tuesday ruling with a vote of the five FCC commissioners. Currently, ISPs fall under a lightly regulated area of the FCC. It would take only a 3-to-2 vote to move high-speed Internet into one of the FCC's more heavily regulated areas, where the agency could set tough rules on companies such as Comcast.

The FCC said Wednesday that the ruling would hamper key portions of its national broadband plan, such as its goal to bring high-speed connections to rural and low-income areas.

On the other side of this Goliath vs. Goliath tale are Google and other companies that want a freer information superhighway.

The search giant has pushed for the FCC to impose conditions on an auction for airwaves that would require that a wireless network built from that spectrum be open to any device and any Internet application. The company hasn't weighed in on how the FCC should proceed after the court decision.

Comcast took on the FCC after the agency said the cable giant violated open-Internet guidelines in 2007 by throttling BitTorrent, a popular online file-sharing service that not coincidentally allows users to download unlicensed movies and television shows.

Wednesday, Christopher Libertelli, Director of Government and Regulatory Affairs for Skype, worried that the decision leaves such P2P services with no advocate among federal agencies.

"What I can say is that there is no place to go after this decision if a P2P application is degraded on a network," he said.

FCC Mulls Neutrality Options in Light of Legal Defeat

Excerpted from Online Examiner Report by Wendy Davis

Still reeling from this week's court ruling, which said that the Federal Communications Commission (FCC) has no authority to enforce "net neutrality" principles, the FCC is already warning that it might not be able to carry out aspects of the national broadband plan.

General Counsel Austin Schlick says in a blog post that the decision might have rendered the FCC powerless to execute a host of recommendations, including ones that are aimed at protecting consumers.

In the decision, a federal appellate court vacated an FCC order sanctioning Comcast for throttling P2P traffic on the grounds that the FCC has no authority to enforce neutrality principles. This means that the FCC can't prevent Internet service providers (ISPs) from blocking visits to sites like Hulu or Google, should ISPs decide to do so.

The FCC hasn't yet announced its next move, but advocates are pushing the agency to reclassify broadband as a "Title II" telecommunications service, in which case ISPs would be required to follow common-carrier principles.

In 2002, the FCC under the Bush administration classified broadband as an information service. That move, part of a deregulation initiative, was upheld by the US Supreme Court in 2005.

Any attempt by the FCC to reverse course will certainly be met with opposition in court as well as political opposition. Even before this week's decision, telecoms and cable companies urged the FCC to "keep this Pandora's Box of Title II classification nailed shut." They also warned that a reclassification of broadband would "plunge the industry into years of litigation and regulatory chaos."

Nonetheless, some consumer advocates say that the FCC can legally change its mind and restore broadband's former classification as a telecommunications service as long as it has a "reasoned basis" for doing so.

Advocates say that a Supreme Court decision issued last year in a profanity case gives the FCC solid legal footing for reclassifying broadband.

In that matter, Fox and other TV broadcasters sued the FCC for ruling in 2006 that the broadcast of "fleeting expletives" - swearing by celebrities on live TV - was indecent. The broadcasters argued that the FCC had taken the opposite position prior to 2004, and that its about-face on the issue was unfair.

But the Supreme Court rejected that argument, holding that the FCC was free to issue new policies that contradicted prior ones as long as it has a good reason for doing so.

US Online Ad Market Showing Signs of Recovery

Excerpted from Digital Media Wire Report by Mark Hefflinger

The US Internet advertising market is showing "signs of an emergent recovery," posting a record $6.3 billion in revenues during the fourth quarter of 2009, according to a report from the Interactive Advertising Bureau (IAB) and PricewaterhouseCoopers.

"The record $6.3 billion spent on Internet advertising in the fourth quarter of 2009, while certainly aided by seasonal demand, is a strong indication that the worst of the economic impact on Internet advertising is over and that the seeds of growth have been planted," said PwC's David Silverman. 

While the fourth quarter showed a positive trend, overall US online ad revenues for 2009 were $22.7 billion, a 3.4% decline from 2008. Search ads accounted for 47% of this total, or nearly $10.7 billion - up slightly from 2008 - while display ads totaled $8 billion, up 4% from 2008.

Digital video ads saw a 39% increase in revenues during 2009. The report also found that, looking at PwC data from 2005 through 2009 across ad-supported TV, radio, newspapers, magazines and the Internet, the Internet's share of combined ad revenue grew from 8% to 17%.

Are All Video Streams Created Equal?

Excerpted from Video Insider Report by Ashkan Karbasfrooshan

"Animal Farm" would have included the quote, "All streams are equal, but some streams are more equal than others," had it been written in the twenty-first century by an online advertising executive.

Last week, BrightRoll CEO Tod Sacerdoti touched on the syndicated video black market, which he said was "dominated by codependent and unscrupulous video syndication firms, ad networks and publishers." He's right, but there's more to it than that.

With video consumption growing quickly and marketers embracing the medium, videos are being streamed in two ways:

Stream A - The YouTube variety: a viewer clicks on a link and is taken to a page where the video loads automatically. Marketers run pre-roll, overlay or companion ads. The video is the main content on the page.

Stream B - The in-banner contextual variety: the video sits in a 300x250 or 300x600 unit next to an article (the main content on the page). The video tends to auto-play with the audio off and there might be a companion display, overlay or pre-roll advertisement.

The explosion in social networking created a glut of ad inventory, sending ad rates plummeting. Conversely, online magazines and newspapers have some of the most valuable real estate online, with brand marketers running alongside premium content. The problem is that most of this is display banner inventory, as many of these properties fail to generate much video inventory.

Earlier, Sacerdoti forecast that the top ten video properties would be aggregators, not content creators. Indeed, with search engines failing to index video content, the largest content-producing properties will remain text-centric, and the only video destinations that will really scale will be the aggregators. YouTube will remain #1, Hulu will be a strong #2, and the rest are vying for the third spot and fighting against obsolescence.

Admittedly, those who can build a destination will attempt to do so, and those who cannot will turn to distribution.

But audiences consume content by type (articles vs. video) and not category (auto vs. business). Visitors who frequent a newspaper website generally want to read articles; those who watch videos go to YouTube.

Consequently, when a magazine or newspaper adds a link off its main page to a given video, few people will click through and watch that video. (The exception might be a site like CNN or ESPN, whose DNA is in fact video.) The result is that even the traditional media companies that produce video content will fail to generate the ROI needed to continue the endeavor. Before long, they will turn to video content producers to offer video content on their properties.

There are, of course, drawbacks to focusing on building a property alone. Once you include frequency capping and geo-targeting, you realize that the true commercial value of a property is not as large as you think that it is.

Moreover, by spreading out a campaign across multiple web properties in one distribution network, you can obtain a far bigger share of voice across the web (so long as it's a form of syndication that avoids the "black market" tactics Sacerdoti refers to).

Up to now, we've been talking about the value to a marketer. But what about the content producer, who is also building a brand and an audience? The greater the audience, the greater the value to advertisers; the greater the brand, the greater the value of the content. This doesn't mean that advertisers should pay the same price for Stream A as for Stream B, of course; it just means that writing off Stream B entirely leaves a lot of value on the table for both marketers and producers.

The media business involves production, publishing, and distribution. Video is no different; however, production is a commodity and expensive; publishing (i.e., building a destination) is a huge challenge with video; and distribution is fragmented.

I have found that investors would pay roughly (depending on the many things that drive M&A deals): 1x revenues or 2x earnings for production companies; 2-3x revenues or 3-10x earnings for publishing companies; and 4-10x revenues or 5-30x earnings for distribution companies.

This changes over time, but if you can build a "white hat" syndication company and have a defensible IP then you will be sitting pretty.

Business Class Video

A new research study published by Aberdeen Group compares companies that have achieved a demonstrable return on investment (ROI) with those that struggle to measure ROI. By comparing these approaches, the report differentiates video deployments with measurable goals and payback from video technologies, including P2PTV, that are merely decorative or experimental.

Top organizations have aligned P2P-based video with quantitative value by adding management and archiving capabilities and by tracking hard metrics that translate the qualitative characteristics of video into measurable goals.

"Business Class Video: Defining the Standard of Business Value" analyzed over 160 organizations with video solutions and found that the top 20% of organizations had an average ROI on their video investments of 186%, which translates to a payback period of less than seven months. In comparison, the bottom 30% of respondents were completely unable to measure any quantitative value from their video investments.

"The enterprise approach to video has evolved from synchronous and scheduled video usage to asynchronous, on-demand, and centrally managed video assets," says Hyoun Park, Research Analyst, Aberdeen Group. "To ensure that employees and operational departments gain value from video, companies must schedule, archive, store, and measure video used throughout the enterprise."

Companies that have aligned video with departmental pressures have focused on needs such as: optimizing communications broadcasts and educational materials for the learning preferences of multiple workforce generations; marketing demands to improve customer and audience interactions; and the challenge of creating and developing new products with a geographically dispersed research and development team.

A complimentary copy of this report is available thanks in part to underwriters Ignite Technologies and Kontiki. To obtain a copy, visit the following link: Business Class Video: Defining the Standard of Business Value.

Adobe P2P: The Tech Behind Chatroulette

Excerpted from Streaming Media Report by Troy Dreier

From international newspapers to television shows, plenty has been said about what goes on in Chatroulette - some of which is funny, some of which is bizarre, and some of which is, well, perverted.

But what about the technology behind the site, which allows anyone with a webcam to randomly and anonymously interact with a never-ending series of people?

While it looks like an incredibly simple site, Chatroulette wasn't built in a day. It took months of hard and persistent work. More surprising, though, is that it was built singlehandedly by a 17-year-old Russian kid still in high school.

That kid is Andrey Ternovsky, and he's now on the West Coast talking with potential investors and enjoying his trip abroad. While he won't discuss the possibility of selling the site, he's happy to talk about how he created it.

Chatroulette was born out of Ternovsky's curiosity and some experiments with controlling webcams remotely. A friend suggested that it would be cool to video chat with random people, and when Ternovsky couldn't find a site like that he decided to create one.

Coding wasn't Ternovsky's strong suit, and he couldn't afford to hire anyone to create the site for him, so he started by learning Java and ActionScript. He thought about building his site on Flash Media Server, but found it too complicated and "too damn expensive."

Open source Red5 was also too complicated. He wanted something that would work out of the box. The answer was a combination of Flash Player 10's P2P ability and the Wowza Media Server.

Unfortunately, Ternovsky didn't have the money for Wowza, either, so he cheated a little and used a pirated version with a cracked serial number. Once Wowza was running, it almost immediately crashed. Even though he wasn't running a licensed copy, Ternovsky decided to ask the company for support. He was sure they would check his license, but, to his surprise, they didn't.

Help came from a guy named Charlie, whom Ternovsky thought worked in Wowza support. He was actually Charlie Good, the company's Co-Founder and CTO. Good created a patch for Ternovsky, which resolved one problem, but then another cropped up. And another. And another. The constant connecting and disconnecting on Chatroulette, as well as the thousands of new streams created every moment, put Wowza through some extreme tests. In all, Good created four patches, and the two exchanged between 200 and 300 e-mails by the time the site was running smoothly.

"His application exercises our software in new and different ways," says Good.

When a user tries to make a connection with Chatroulette, the site first tries to make a P2P connection using Flash 10. If it doesn't succeed in three seconds, it relies on Wowza. The two systems are used about equally, since many users don't meet Flash 10's technical requirements. Ternovsky is now giving notes to Adobe's developers to help them improve the new feature.
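
The connection flow Ternovsky describes is a classic fast-path-with-timeout fallback. The following minimal sketch shows the same pattern in Python rather than the site's ActionScript; try_p2p() and connect_relay() are hypothetical placeholders rather than real Chatroulette or Wowza APIs, and only the three-second deadline comes from the interview:

    # Sketch of Chatroulette-style connection fallback: attempt a direct
    # P2P session first, and fall back to a server relay if the attempt
    # doesn't complete within three seconds. Placeholders throughout.

    import concurrent.futures
    import time

    P2P_TIMEOUT_SECONDS = 3.0  # the three-second window from the article

    def try_p2p(peer_addr):
        """Placeholder for a direct Flash-10-style peer connection.
        Simulates a peer whose NAT blocks the attempt and never answers
        in time."""
        time.sleep(5)
        return "p2p:" + peer_addr

    def connect_relay(server_addr, peer_id):
        """Placeholder for opening a server-relayed stream instead."""
        return "relay:%s/%s" % (server_addr, peer_id)

    def open_session(peer_addr, server_addr, peer_id):
        pool = concurrent.futures.ThreadPoolExecutor(max_workers=1)
        attempt = pool.submit(try_p2p, peer_addr)
        try:
            # Fast path: direct P2P, when both ends meet the requirements.
            return attempt.result(timeout=P2P_TIMEOUT_SECONDS)
        except concurrent.futures.TimeoutError:
            # Slow path: relay all media through the streaming server.
            return connect_relay(server_addr, peer_id)
        finally:
            pool.shutdown(wait=False)

    print(open_session("198.51.100.7", "relay.example.net", "peer-42"))
    # Prints the relay session after about three seconds; the stalled P2P
    # attempt is abandoned and its thread finishes on its own shortly after.

Splitting traffic this way keeps the server bill down, since P2P sessions cost the operator nothing to carry, without stranding the many users whose players or NATs can't complete a direct connection.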

Ternovsky did make good on that pirated copy, by the way, and purchased a Wowza license once his site was a viral success. He offered the company back pay, but Good's team decided the promotion they'd get was worth more.

"We realize that a lit bit of piracy is actually okay," says Good, since it allows for some experimentation.

Even though Good's fifth connection on Chatroulette showed him a man committing a disgusting act (no, not that one; something worse), he believed early on in Ternovsky's work.

"I told my wife, I think this guy's got something," says Good.

Chatroulette now gets 1 million unique users every day, with the US, France, Germany, the UK, Canada, Turkey, and China providing the most users. Only one country has never tried the site, says Ternovsky: Chad.

In the future, Ternovsky would like to add games to the site, such as chess, to give people another way to interact. He's also working on a full-screen version, although he's unsure how much bandwidth it will take. While the next version of the site is still undecided, after his current round of meetings in California, it seems unlikely that Ternovsky will be working on those changes alone.

Veoh Assets Sold to 2Peer 

Excerpted from San Diego Union Tribune Report by Wade Roush

The remaining assets of San Diego's peer-to-peer television (P2PTV) service Veoh Networks, which shut down in February after years of litigation with Universal Music Group (UMG), have been sold to Los Angeles-based social video startup 2Peer, according to a report today in VentureWire.

Veoh had raised some $70 million from Boston-based Spark Capital as well as Goldman Sachs, Time Warner, Intel Capital, and former Disney CEO Michael Eisner.

Abacast Announces Key Sales Talent Hire

Abacast, a provider of streaming solutions for the online radio and video industries, this week announced the addition of Major Accounts Manager Michael Dalfonzo.

Previously, Michael served as Vice President of Sales for Spacial Audio, where he was instrumental in establishing and building the inside sales team and developing its major new accounts. Michael has held nearly every position in a radio station, including Station Manager, Program Director, Music Director, and on-air talent. Previous to his work at Spacial Audio, he served as Director of Industry Affairs at RCS, where he was the lead Product Evangelist and managed sales for all of the major broadcast groups.

Previous to RCS, Michael was Vice President/Director of New Technology at the Seattle-based Research Group. While there, he developed and brought to market "Virtual Radio Programming" which became the model for several other companies' hub and spoke program distribution systems. In addition to his many years of major market radio and software experience, Michael owned and operated Radio Plus Broadcast Consultants, Inc. where he did pioneering work in highly targeted direct mail and database marketing.

"I am delighted to join the great team at Abacast," said Michael. "I believe their end-to-end radio streaming platform is unique in the industry, and I look forward to helping radio stations expand their online streaming and increase their profits in the rapidly expanding digital marketplace."

"Michael brings more than a 40-year career in radio to Abacast," says Jim Kott, VP of Sales and Marketing for Abacast. "We are very excited to have such a knowledgeable and proven performer joining our team, and we're looking forward to utilizing his expertise in helping serve our online radio customers."

Tribler Evolves Its Decentralized BitTorrent Ecosystem 

Excerpted from TorrentFreak Report

Researchers from the Tribler P2P team at Delft University of Technology in the Netherlands have been working on their next-generation BitTorrent client for a few years now.

The project has been awarded millions of euros in funding from the European Union (EU). With this money, the researchers have been developing a new BitTorrent ecosystem where torrent sites are no longer needed.

To achieve this goal the search functionality of the new Tribler client is fully distributed, meaning that the torrents come from within the network of peers and not from a torrent site or a central server. This could potentially make BitTorrent indexers such as The Pirate Bay (TPB) and isoHunt obsolete.

The downside of this type of search is that it is impossible to remove or moderate spam and fake files. In order to solve this problem the Tribler team has implemented a SwarmRank feature that ranks torrents based on their trustworthiness and speed.

"SwarmRank is inspired by Google's PageRank algorithm which is used to keep Google search results neat, tidy and relevant," Dr. Johan Pouwelse, lead researcher at the Tribler P2P team, told TorrentFreak.

Another new feature based on the trust idea implemented in the latest Tribler release is a reputation score for downloaders called BarterCast. With this feature the Tribler team hopes to achieve higher download speeds for users who share the most.

To reward seeding, the Tribler client doesn't rely on sharing ratios the way most private BitTorrent trackers do. Instead, every peer that shares valid pieces of a file simply becomes more trustworthy. The more users upload, the more their reputation scores increase and the higher their download speeds will be.
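
As a toy illustration of that incentive mechanism, the sketch below tallies bytes exchanged with other peers and grants upload slots to net contributors first. The names and scoring formula are assumptions for illustration; Tribler's actual BarterCast works on a gossiped exchange graph rather than local tallies alone:

    # Toy BarterCast-style reputation: track net contribution per peer
    # and unchoke (grant upload bandwidth to) the best contributors.
    # Illustrative assumptions only; not Tribler's implementation.

    import math
    from collections import defaultdict

    class Reputation:
        def __init__(self):
            # peer_id -> [bytes they uploaded, bytes they downloaded]
            self.tally = defaultdict(lambda: [0, 0])

        def record(self, peer_id, uploaded, downloaded):
            t = self.tally[peer_id]
            t[0] += uploaded
            t[1] += downloaded

        def score(self, peer_id):
            up, down = self.tally[peer_id]
            net = up - down
            # Logarithmic dampening so one huge transfer can't dominate.
            return math.copysign(math.log1p(abs(net)), net)

        def pick_unchoked(self, peers, slots=4):
            """Grant upload slots to the highest-reputation peers."""
            return sorted(peers, key=self.score, reverse=True)[:slots]

    rep = Reputation()
    rep.record("alice", uploaded=5_000_000, downloaded=1_000_000)
    rep.record("bob", uploaded=100_000, downloaded=4_000_000)
    rep.record("carol", uploaded=2_000_000, downloaded=2_000_000)
    print(rep.pick_unchoked(["alice", "bob", "carol"], slots=2))
    # -> ['alice', 'carol']: net contributors get the bandwidth first

The dampening matters in a gossip setting, where a single exaggerated report shouldn't be able to buy a peer unlimited priority.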

With the new SwarmRank and BarterCast features, Tribler has taken a step forward in preventing spam from filling up its decentralized BitTorrent environment. "By adding reputations for both swarms and peers, we have a new tool against spam and pollution in BitTorrent, while we can reward seeding," Dr. Pouwelse told TorrentFreak.

The Tribler team is working on some fascinating stuff. Thus far, however, the client still has a very small market share, which is not ideal for a decentralized system like this. That may change quickly. The good thing is that the client is entirely open source, so other developers can take advantage of the research if needed.

Federal CIO Vivek Kundra on Cloud Computing 

Excerpted from GCN Report

The following prepared remarks by Federal Chief Information Officer (CIO) Vivek Kundra were released by Brookings on April 7th, where Kundra delivered a keynote address on "The Economic Gains of Cloud Computing:"

Today I'd like to talk about how the government is leveraging cloud computing to deliver results for the American people. The economic gains, the environmental benefits, and the ability to provision services on demand are key factors in the government's shift to cloud computing.

There was a time when every household, town, farm or village had its own water well. Today, shared public utilities give us access to clean water by simply turning on the tap. Cloud computing works in much the same way. However, instead of water coming from a tap, users access computing power from a pool of shared resources. Just like the tap in your kitchen, cloud computing services can be turned on or off as needed, and when the tap isn't on, not only can the water be used by someone else, but you aren't paying for resources that you don't use.

Cloud computing is a new model for delivering computing resources - such as networks, servers, storage, or software applications.

On September 15, 2009, I announced the Federal Government's Cloud Computing Initiative at NASA's Ames Research Center, in the heart of Silicon Valley. This region is home to some of the world's most influential and respected high-tech companies, universities, and research institutions, and is a leading source of technological innovation. It is essential that the government tap into this innovation and be open to adopting new technologies.

The Federal Government is the world's largest purchaser of information technology. We spend over $76 billion annually on more than 10,000 systems in support of more than 300 million Americans. Yet our technology infrastructure is fragmented and inefficient. Over the past decade, the number of Federal data centers has grown from 432 to more than 1,100. This growth in redundant infrastructure investments is costly, inefficient, unsustainable and has a significant impact on energy consumption. In 2006, Federal servers and data centers consumed over 6 billion kWh of electricity and without a fundamental shift in how we deploy technology it could exceed 12 billion kWh by 2011. For far too long, the Federal departments and agencies have operated vertically - creating silos that underutilize skilled workers and vital funds, while producing unimpressive results for the American people.

The government must spend less taxpayer money on redundant infrastructure and more time on technologies that improve the lives of the American people. Think about our everyday lives. You can launch your own website in minutes. A small business owner can manage payroll online. A grandmother can share pictures of her grandchildren with family across the world. But why is there such a gap between the public and private sectors when it comes to technology? It wasn't that long ago when Federal employees had better technology at their desks than in their homes.

As the world's largest consumer of information technology and as stewards of taxpayer dollars, the Federal Government has a duty to be a leader in pioneering the use of new technologies that are more efficient and economical.

Many organizations in the private sector are using cloud computing technologies to realize tremendous savings and streamline their operations. For example, NASDAQ is using the cloud to give customers and regulators who ask about past trading actions a snapshot of market conditions at the time of the trade. NASDAQ accomplishes this by slicing and dicing terabytes of historical data in seconds.

Starbucks used cloud-based tools to launch an online community in just one month that has generated thousands of ideas from customers and employees on how to improve Starbucks.

In the government it can take years to procure, configure and deploy technology solutions. By using cloud services, the Federal Government will gain access to powerful technology resources faster and at lower costs. This frees us to focus on mission-critical tasks instead of purchasing, configuring and maintaining redundant infrastructure.

Please click here for the full report and click here for the associated slide presentation.

The Digital Economy Bill's Impact on Consumers

Excerpted from The Linc Report by Daniel Ionescu

The entertainment industry has been trying for years to eradicate online infringement by attacking websites associated with distribution of unlicensed content. But these sites keep on reappearing in different shapes or under different names, so the next step was to crack down on the users themselves.

Seen as a clampdown on unauthorized file downloaders, the UK's controversial Digital Economy Bill generated little debate in Parliament on Wednesday night. It passed by 189 votes to 47 in a slightly watered-down version, with a few parts stripped out. You can read the whole 76-page bill here.

Nevertheless, the Digital Economy Bill will have serious consequences for unlicensed downloads of copyrighted material such as music, films, or games, which incidentally are very popular among students.

Torrent sites like The Pirate Bay (TPB) and file-sharing applications like LimeWire are commonly known sources to fill iPods with music, watch the latest film releases, or download computer games.

Here are some of the consequences of the Digital Economy Bill. These will likely affect those who live in private-rented accommodation and have their broadband supplied independently.

"Copyright owners" must be provided with details (not identities) of infringers, which implies that Internet service providers (ISPs) such as BT, Virgin, or TalkTalk must provide these details upon anonymous monitoring of your Internet use.

ISPs will be required to block users' access to sites that offer unauthorized downloads, in this case hundreds of Pirate Bay-like sites. This is a measure many ISPs are likely to take as a first step towards enforcing the upcoming legislation.

ISPs will have to send warning letters to users who consistently download copyrighted material. Ignoring the warnings might result in slower "throttled" broadband speeds for the recipients of the letters, or even temporary suspension of Internet access.

Cafes, bars, hotels, and other places that offer free WiFi connections are expected to run into trouble with the new rules brought by the Digital Economy Bill. If customers use these connections to download infringing materials, businesses could be forced to shut the services down.

There are, though, several ways to anonymize Internet use and avoid ISP monitoring. They range from a bit of computer wizardry to purchasing access to a virtual private network (VPN).

One such example is the iPredator VPN, created by the people behind TPB. At $20 for three months, the service swaps the IP address users get from their ISPs for an anonymous one, offering a safe, encrypted connection between the computer and the Internet. Using proxy services is another method.

Debate on the Digital Economy Bill is unlikely to stop here though. Many are expressing their dissatisfaction with the bill on Twitter, where #debill is now a trending topic.

ISP Vows to Challenge New UK Law

Excerpted from Digital Media Wire Report by Mark Hefflinger

Following the approval of the Digital Economy Bill in Britain, which would implement a "three-strikes" policy under which repeat file-swappers see their Internet connections suspended, at least one ISP - TalkTalk - said it would defy any order to disconnect its customers.

In a blog post, TalkTalk's Andrew Heaney stated that the company won't hand over customer data to copyright holders "unless we are served with a court order," adding that, "if we are instructed to disconnect an account due to alleged copyright infringement we will refuse to do so and tell the rights-holders we'll see them in court." 

The controversial Digital Economy Bill will see copyright holders and ISPs collaborating to target unauthorized file-sharing, with ISPs forwarding warning letters from copyright holders to subscribers suspected of copyright infringement. While the bill as passed does not include language regarding permanent disconnection of a subscriber's connection, after repeated warnings the subscriber's connection could be slowed or temporarily suspended.

Coming Events of Interest

Cloud Expo - April 19th-21st in New York, NY. Co-located with the 8th International Virtualization Conference & Expo at the Jacob Javits Convention Center in New York City, with more than 5,000 delegates and over 100 sponsors and exhibitors participating in the conference.

LA Games Conference - April 29th in Los Angeles, CA. Over 300 of the most influential decision-makers in the games industry gather for the LA Games Conference to network, do deals, and share ideas about the future of console, PC, online, and mobile games. The LA Games Conference - now in its 4th year - features a lively and fun debate on timely cutting-edge business topics.

Digital Hollywood Spring - May 3rd-6th in Santa Monica, CA. Digital Hollywood Spring (DHS) is the premier entertainment and technology conference in the country covering the convergence of entertainment, the web, television, and technology.

P2P & CLOUD MEDIA SUMMIT - May 6th in Santa Monica, CA. The DCIA presents its fifth annual seminal industry event as a conference within DHS, with the subject matter now expanded for the first time to include cloud computing, the most advanced and rapidly growing distributed computing technology.

Cloud Computing for Government Conference - June 7th-9th in Washington, DC. Learn how to cut costs and create a more efficient, scalable and secure IT infrastructure. In addition, learn how to develop a cloud computing strategy, along with helpful tools, tips, and techniques to get started. Hear practical advice, firsthand, from leading experts including the NASA Ames Research Center, US Department of Energy, Silicon Valley Education Foundation, and many more. Mention "DCIA" to receive a $200 registration discount.

Broadband Policy Summit VI - June 10th-11th in Washington, DC. The most comprehensive, in-depth update about the implementation of the FCC's National Broadband Plan. No other forum provides the detailed coverage, expert insight and networking opportunities you'll receive at Broadband Policy Summit VI. The expanded program includes top-notch faculty who will address the most pressing broadband issues in six panel discussions, two debates and four keynote addresses.

Digital Media Conference East - June 25th in McLean, VA. The Washington Post calls this Digital Media Wire flagship event "a confab of powerful communicators and content providers in the region." This conference explores the current state of digital media and the directions in which the industry is heading.

Copyright 2008 Distributed Computing Industry Association
This page last updated April 18, 2010
Privacy Policy