Distributed Computing Industry
Weekly Newsletter

In This Issue

P2P Safety

P2P Leaders

P2PTV Guide

P2P Networking

Industry News

Data Bank

Techno Features

Anti-Piracy

July 12, 2010
Volume XXXI, Issue 6


Pew Research Finding: The Internet Is Good

Excerpted from Social Graf Report by Erik Sass

The Internet is good, according to a survey of 895 technology stakeholders, pundits, and other experts by the Pew Research Center's Internet & American Life Project and Elon University's Imagining the Internet Center, which inquired about the effects of e-mail, social networks, and other digital media.

Specifically, a large majority of respondents (85%) agreed that, "In 2020, when I look at the big picture and consider my personal friendships, marriage, and other relationships, I will see that the Internet has mostly been a positive force on my social world. And this will only grow more true in the future."

The findings of the survey aren't really terribly surprising, for a couple of reasons. First of all, many of the respondents are deeply involved in the Internet for business, advocacy, or punditry: the group of respondents includes Craig Newmark, Founder of Craigslist; Jeff Jarvis, Associate Professor and Director of the Interactive Journalism Program at the City University of New York's Graduate School of Journalism; Clay Shirky, an author who writes about the Internet and teaches new media at NYU's graduate Interactive Telecommunications Program; Esther Dyson, an investor in tech start-ups who also writes about the Internet; and Nicholas Carr, a writer whose publications include The Big Switch: Rewiring the World from Edison to Google.

What's more surprising is the 14% of respondents who agreed with the opposite statement, specifically, that "In 2020, when I look at the big picture and consider my personal friendships, marriage and other relationships, I will see that the internet has mostly been a negative force on my social world. And this will only grow more true in the future."

I actually agree with some of the complaints attributed to the negative respondents by Pew: for example, it's certainly possible that "time spent online robs time from important face-to-face relationships," and that "the Internet fosters mostly shallow relationships."

But these are merely two possibilities among many. As I have argued before, these kinds of negative outcomes are more reflective of individual choices and personal characteristics than any inherent quality of the technology itself; furthermore, it is foolish - and even dangerous - to confuse a morally neutral technology with the uses some people make of it. This is dangerous primarily because it allows people to blame the technology rather than dealing with their own shortcomings: If you find yourself spending less time on important face-to-face relationships than you would like, maybe you should put down the friggin' BlackBerry. If you think you are stuck in shallow relationships, maybe you could try making them deeper. If you realize you don't know how, maybe you are realizing something about yourself.

The same is true of other complaints. For example, the negative cohort agreed that "the Internet allows people to silo themselves, limiting their exposure to new ideas." But if someone with access to the Web - the biggest, freest exchange of information in human history - ends up living in their own personal echo chamber, then that must be the result of their own decisions (and they almost certainly would have ended up in the same ideological cubbyhole without the web).

Likewise, another complaint was that "the Internet is being used to engender intolerance." Once again, it's not like the Internet invented ignorance and stupidity, and this association may ignore the Internet's potential for monitoring, policing, and combating intolerance. Indeed, I would encourage hate groups of all stripes to maintain Web sites and public message boards, and while they're at it they might want to provide their e-mail addresses and passwords to the FBI as well.

Not all the complaints were psychosocial. More straightforward drawbacks cited by the "negative 14%" include "the act of leveraging the Internet to engage in social connection exposes private information." This one is obviously the real (huge) issue, given recent controversies surrounding new social media initiatives from Facebook and Google, and it must be dealt with if social media is to thrive in the future, and prove the doubters wrong.

Cloud Computing Set for Phenomenal Growth

Excerpted from PC Magazine Report by Chris Fernando

In a bid to capitalize on the $150 billion addressable global market opportunity, organizations are increasingly looking to invest in the cloud as it enables companies to unleash their potential for innovation through greater intelligence, creativity, flexibility, and efficiency, all at reduced cost.

Driven by innovations in software, hardware and network capacity, the emergence of cloud computing has led to an industry shift as more applications are being delivered over the web and through the browser. As cloud computing continues to gain momentum, organizations are set to embrace the full potential of all types of new cloud applications.

"It is clear that cloud computing represents the early stages of a big shift in the information technology (IT) sector as it offers enormous potential in terms of reduced costs and operational flexibility," said Helal Saeed Almarri, CEO, Dubai World Trade Centre.

Fawwaz Qadan, Director of Enterprise Storage, Servers and Networking (ESSN), HP Middle East, said, "Today, cloud computing gives businesses more control and flexibility over the technology they deploy and the way they deploy it. It helps companies reduce costs and focus resources on gaining strategic advantage. While deployment strategies differ, it is critical that an organization's infrastructure is managed as a utility made up of secure, scalable, and standards-based building blocks of integrated IT resources from storage to servers and network management tools."

Jesper Frederiksen, Vice President EMEA, Symantec Hosted Services added, "Cloud computing and Software as a Service (SaaS) are gaining significant traction as important delivery models for IT services with many potential business and financial benefits. Because cloud computing is still relatively new, organizations may still be coming to terms with this delivery model and how best to take advantage of it."

Ad Industry Optimism Reaches High Point - Especially for Digital

Excerpted from Media Daily News Report by Joe Mandese

Advertiser optimism toward their media spending, which bottomed out a year ago, continues to rise and is now at the highest relative point since a well-regarded research company began tracking it three years ago. Nearly a third (32%) of ad executives now expect to increase their ad spending over the next 12 months, marking the greatest percentage since Advertiser Perceptions Inc. (API) began asking that question in the spring of 2007.

Conversely, only 22% said they plan to decrease their ad spending, marking a positive 10 percentage point difference between those planning to increase or decrease their advertising budgets, which is the basis of API's advertiser optimism index.

That compares with a negative five percentage point index in the spring of 2009, when API executives say the advertising recession effectively "bottomed out," and signals that Madison Avenue is firmly on the path to a sustainable recovery.

The findings correlate with recent upgrades major industry analysts have made to their overall ad spending forecasts for the U.S. and global advertising marketplaces over the next 12 months, and raise questions about whether the ad industry is a "leading" indicator for a general economic recovery, or whether it is a "lagging" indicator of more systemic macroeconomic problems.

"The answer is pretty simple," said Brian Wieser, the global director of forecasting at Interpublic's Magna unit, who recently issued his most optimistic outlook for the US and global advertising economies since the global economic recession began. "It's tightly related, but it's concurrent - not leading, not lagging."

Wieser, who is more or less Madison Avenue's de facto chief economist, says that the "best fit" on the role of ad spending as an economic indicator is measured as correlation.

"Correlations between changes in virtually any economic variable and total advertising revenues are always concurrent (comparing same quarter to same quarter or same year to same year)," he said. "The correlations are much lower when you look at it on a lagging or leading basis, whether you look one quarter or two quarters ahead or behind."

Ken Pearl, one of the partners of API, said he still considers advertising to be a lagging economic indicator, but he noted that "advertiser optimism" is a leading indicator of advertising spending, and therefore is a key perceptual metric for Madison Avenue to follow.

"Advertiser optimism leads ad spending. Based on that, I think that advertisers are feeling better about the economy, or at least, more comfortable about the reality of economic uncertainty, which in either case will positively affect ad spending," he explained.

Pearl noted that the optimistic momentum has been building since last year, and that for at least the near term, "We're moving in the right direction."

If that's true, it should be good news for all the media, because the sentiment among ad executives is improving for every single medium, even supposedly business-model-challenged ones such as newspapers, magazines, and radio.

While magazines and national newspapers continue to be the only media yielding a net negative difference (minus 10 percentage points and minus 32 percentage points, respectively), they both mark substantial improvements from a year ago (when they were minus 26 percentage points and minus 46 percentage points, respectively).

Broadcast TV has flattened out to zero percentage point difference, following three consecutive years of erosion in advertiser confidence, and all other media are ascending, especially digital media (plus 60 percentage points), and particularly mobile (plus 62 percentage points).

The current sentiment reflects the responses of 1,412 ad executives - both marketers and agency media buyers - to interviews completed online in April/May 2010, and is being published here for the first time.

Report from CEO Marty Lafferty

The DCIA commends the accomplishments of the P4P Working Group (P4PWG) and the related work-in-progress of Kai Lee, Aijun Wang, and Kaiyu Zhou, who this week submitted their Internet-Draft concerning an ALTO and DECADE Service Trial within China Telecom to the Internet Engineering Task Force (IETF).

Their document reports the experience of China Telecom, the largest broadband network operator in China with more than 60 million fixed-line subscribers, in a recent four-month trial of the current application-layer traffic optimization (ALTO) service and P2P caching involving 7 million users in a single province.

The bottom line was that these innovative technological approaches significantly improved the capability of this Internet service provider (ISP) to affect the distribution of P2P traffic.

The results of an immediately prior, more limited trial of P4P, the predecessor to ALTO, were published by Comcast. Based on China Telecom's follow-on test, the impact of ALTO at large scale should extend these benefits, regardless of the number of content files/streams or the number of users.

Xunlei, a leading P2P service in China, which supports both download P2P and live P2P streaming to more than 20 million users daily, was the participating P2P service in the China Telecom trial.

Pando Networks served as the P2P service in the preceding Comcast trial. Caching was not used in the Comcast-Pando trial, which focused on specific test content to a trial group of 57,000 users.

During the latest trial, China Telecom provided ALTO and P2P caching servers, and monitored traffic load on its local backbone. Xunlei contributed its client and users.

The ALTO server notified ALTO software embedded in Xunlei trackers of network topology and cost maps, with the client responding quickly to optimize traffic. All Xunlei-delivered content was measured. Data was collected from both the Xunlei P2P client and the ISP's network management system (NMS).
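
To make the mechanism concrete, the sketch below shows one way a tracker might rank candidate peers once it holds this kind of information: a network map grouping address prefixes into provider-defined IDs (PIDs) and a cost map expressing the ISP's preference between PIDs. The PID names, prefixes, and cost values here are invented for illustration - none of them come from the trial:

    import ipaddress

    # Hypothetical ALTO-style maps. The PIDs, prefixes, and costs below
    # are invented for illustration; they are not values from the trial.
    NETWORK_MAP = {
        "pid-province-a": ["219.128.0.0/12"],
        "pid-province-b": ["218.2.0.0/16"],
        "pid-external":   ["0.0.0.0/0"],
    }
    COST_MAP = {
        ("pid-province-a", "pid-province-a"): 1,   # intra-province: preferred
        ("pid-province-a", "pid-province-b"): 10,  # inter-province: costly
        ("pid-province-a", "pid-external"):   50,  # off-net: most costly
    }

    def pid_of(ip):
        """Map an address to its PID; the most specific prefix wins."""
        addr = ipaddress.ip_address(ip)
        best, best_len = "pid-external", -1
        for pid, prefixes in NETWORK_MAP.items():
            for prefix in prefixes:
                net = ipaddress.ip_network(prefix)
                if addr in net and net.prefixlen > best_len:
                    best, best_len = pid, net.prefixlen
        return best

    def rank_peers(client_ip, candidate_ips):
        """Sort candidate peers by the ISP-published cost from the client's PID."""
        src = pid_of(client_ip)
        return sorted(candidate_ips,
                      key=lambda ip: COST_MAP.get((src, pid_of(ip)), 100))

    # A tracker would return the lowest-cost candidates to the client first,
    # converting inter-province transfers into intra-province ones.
    print(rank_peers("219.130.1.5", ["218.2.4.4", "8.8.8.8", "219.130.9.9"]))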

Special issues addressed during the trial included traffic level variances (e.g., higher overall traffic on weekends).

The test process had two parts: 1) measuring the impact of the ALTO server on traffic localization and P2P user experience; and 2) evaluating the benefits of introducing P2P caching to improve user download speeds and reduce bandwidth consumption.

Two modes were tested: one locally optimized using ALTO information and the other representing the normal Xunlei peer selection and traffic control rules absent the ALTO information.

Results included converting a very impressive 60% of what had been inter-province traffic to intra-province traffic with ALTO. The P2P caching system decreased bandwidth use by a remarkable 55% while average download speed increased 6%.

Conclusions from this trial include that the ALTO mechanism is very effective in optimizing traffic flow and, when used in conjunction with P2P caching, decoupled application data enroute (DECADE), or another service performance enhancement mechanism, also improves user performance.

The P4PWG encourages qualified interested parties to get involved in its activities, either as participants or observers.

Its mission is to work jointly and cooperatively with leading ISPs, P2P software distributors, and technology researchers to ascertain appropriate and voluntary best practices for the use of P4P mechanisms to accelerate distribution of content and optimize utilization of ISP network resources in order to provide the best possible performance to end-user customers.

Its top objectives are: 1) Provide ISPs with the ability to optimize utilization of network resources while enhancing service levels for P2P traffic. 2) Provide P2P software distributors with the ability to accelerate content delivery while enhancing efficient usage of ISP bandwidth. 3) Provide researchers who are developing P4P mechanisms with the support to advance and the ability to publish their work. 4) Determine, validate, and encourage the adoption of methods for ISPs and P2P software distributors to work together to enable and support consumer service improvements as P2P adoption and resultant traffic evolves, while protecting the intellectual property (IP) of participating entities. 5) Establish appropriate and voluntary best practices for the deployment of P4P mechanisms to meet the above identified objectives in a way that can be sustained by all of the necessary participants.

For more information, please contact P4PWG@dcia.info. We also urge you to download and read Global Knowledge's Ten Security Concerns for Cloud Computing. Share wisely, and take care.

Managing Network Services via the Cloud

Excerpted from CTO Edge Report by Mike Vizard

Managing distributed networks has always been a pain in the proverbial node. But with the advent of cloud computing, new approaches that promise to relieve many of the headaches of managing distributed computing are on the way.

Pareto Networks, for instance, is rolling out a new cloud computing service that makes it easier to centrally monitor and manage a distributed network. The service works by deploying either a physical or virtual appliance at the remote location, which is in turn linked back to the cloud computing service provided by Pareto.

Through this service, information technology (IT) organizations can control and provision the network services that are being delivered to various branch offices. According to Pareto Networks CEO Matthew Palmer, the company will make available an application programming interface (API) that will make it easier to integrate a variety of third-party products and services into the Pareto Networks service.
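
Since the API has not been published, any concrete example is necessarily hypothetical; the sketch below only illustrates the general shape of centrally pushing a service profile to a branch appliance. The endpoint, payload, and provision_branch() helper are invented stand-ins, not Pareto's actual interface:

    import json
    import urllib.request

    # Hypothetical controller endpoint; Pareto's real API was unpublished
    # at press time, so this URL and payload are illustrative only.
    CONTROLLER = "https://controller.example.com/api/v1"

    def provision_branch(branch_id, services):
        """Push a service profile to the appliance at one branch office."""
        req = urllib.request.Request(
            "%s/branches/%s/services" % (CONTROLLER, branch_id),
            data=json.dumps(services).encode("utf-8"),
            headers={"Content-Type": "application/json"},
            method="PUT",
        )
        with urllib.request.urlopen(req) as resp:
            return resp.status

    # One centrally pushed configuration replaces an on-site visit.
    provision_branch("branch-042", {"qos": "voice-priority", "vpn": True})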

Pareto Networks is really taking a concept that has been applied in the security sector and applying it more broadly to networking services in general.

Whether the network link to the remote office is established via Ethernet, Wi-Fi, or 3G/4G networks is immaterial, said Palmer. What matters, he added, is that IT organizations will have a much simpler way to remotely deliver network services in a way that allows them to guarantee quality of service without having to dispatch IT people every time a network service needs to be updated.

Cyber Terrorism and M2M: A Few Thoughts

Excerpted from M2M Report by Carl Ford

Scott Snyder wrote in his book The New World of Wireless: How to Compete in the 4G Revolution some scenario-planning strategies for companies investing in the wireless future.

At a high level, it was the question of how much you could trust the network versus how much you had to rely on the network. Reading about some of the problems we are facing in The Economist, I came away feeling that machine-to-machine (M2M) might be the home of the next generation of development on the Internet.

A lot of our friends are in the clouds these days. They want to make the network the home of everything you own. From a return-on-investment (ROI) perspective, this is pretty reasonable. You need the network anyway, so who cares where things are located, right?

And if you have networks that have bandwidth - which is the promise of 4G and even the wireline world - then the cloud is a good move.

However, reading the stories about countries shutting down the networks and the level of sophistication of hacking these days, the use of a successful cloud solution will also draw the interest of hackers.

A honey pot in the clouds is just as sweet as a honey pot on-premises. However, the attack on the application is not what I am thinking about; I am thinking about the survivability of the network itself.

I expect we could achieve a level of security by utilizing peer-to-peer (P2P) strategies and delivering a self-organizing network that avoids much of the traditional infrastructure. Much of this activity is discussed at the Distributed Computing Industry Association (DCIA) during its various meetings and in its working groups.

In some ways, this is easy for M2M-like applications, since they have often been small, self-organized networks. Expanding these principles to overall self-management across the Internet should become a strategic asset that can create, in effect, a cloud-like service - minus the cloud's data center.

As I try to describe it, the best analogy I can come up with is munitions development. Munitions makers normally treat managing the risk of an accident as one of their goals in pursuing economies of scale.

In the cloud, we have seen the benefit of the economies of scale. With M2M we should see the benefits of increased distributed processing.

Cloud Computing - Helpful Clarification

Excerpted from The Next Web Report by James Hicks

"Cloud computing" is perhaps the most misunderstood and overused term in technology today. We all want to understand the concept behind the phrase and gain its supposed benefits, but is it really the best move for everyone?

Working in the technology space and designing cloud hosting solutions for large organizations, I tend to have my own thoughts and feelings about the cloud computing phenomenon. The best way I have found to explain the promise of the cloud is this one short sentence: "Access your data anytime and anywhere."

When you look at that definition with an objective frame of mind, what does it really mean? Essentially I'm stating that you can have access to your core data, whatever its format, from any Internet-connected device regardless of your physical location. Who cares if you're not at your office in California or New York? Who cares if you're using your neighbor's laptop? It just doesn't matter. That is a powerful promise, and it has brought many sheep to the fold; perhaps too many too quickly.

Now, as with all things new there is compromise in this equation; cloud computing is no silver bullet. The challenge that you must take into account is that when you migrate your digital files to the cloud you lose full control over your operating infrastructure. You still own the data of course, after all it's your intellectual property (IP), but when you establish a relationship with a cloud hosting provider you can no longer walk by and watch the lights blink on your server.

This is too much of a change for many individuals and organizations, especially when sensitive data is in question. Everyone who works in cloud computing knows how difficult it was, even only a few years ago, to get clients to understand (let alone trust) that an external firm could effectively manage their data backups. If the client didn't have a physical DLT tape or CD-ROM of its archived data sitting in a fire safe in its office, a managed back-up solution was nearly impossible to push through to its CIO. Numerous high-profile cloud failures from firms as large as Microsoft have not helped.

But as the industry has matured, those fears have eased. Improved uptime, better back-up procedures, increased general knowledge, and better access to education on cloud systems have all worked together to slowly change and soften the perspective of even the most hardcore old-school mind. We will see more and more adoption of cloud computing in the near future. That seems to be assured given the growth in both consumer- and enterprise-oriented cloud computing companies around the world.

If you think about it, a high percentage of Internet users today are already utilizing aspects of cloud computing in their normal daily workflow. For example, if you use browser-based e-mail or any of the inexpensive online file storage utilities that are available (Dropbox, Box.net, etc.), you depend on the cloud. These services have seen significant increases in usage and reliance at both the individual and enterprise levels. To put this in perspective, online banking was new and taboo in recent memory; now people trust the cloud with their personal financial information every day.

My recommendation to anyone interested in the best elements of cloud computing while understanding its risks and difficulties is this: climb into this pool slowly. Don't move all of your core data and functions outside of your grasp with one swift motion. Ease into the cloud, become comfortable with the subtle changes that will come to your daily workflow before you make any more adaptations. At the same time, keep this in your mind at all times: the cloud is the future of much of the internet, and you will use it eventually. The only question is how to do so in the best, most effective way possible.

Project Canvas Revolutionizes How We Will Watch TV

Excerpted from ITProPortal Report by Desire Athow

If you thought the BBC iPlayer was big, think again: Project Canvas could well radically change the way we consume video content by allowing users to seamlessly switch between video on demand and classic linear TV consumption.

The BBC Trust has already given its green light to Project Canvas, and detailed specifications about the platform should be in the public domain by the end of the month.

Like Freeview, Canvas is expected to be an open platform and will go beyond BBC's iPlayer. Indeed, some are already saying that Canvas is the next logical evolution of iPlayer with many more channels and significantly more features.

One example provided by Canvas Chief Technology Officer (CTO) Anthony Rose is that of an on-demand mash-up where you're able to pull data in real-time about a player on the field in a World Cup match.

The idea of embedding uniform resource identifiers (URIs) within videos is not new, but the ubiquity of cheap broadband combined with the availability of incredibly vast data warehouses like Wikipedia or Google brings it far closer to reality.

The fact that the service will be known as YouView, a portmanteau of YouTube and Freeview, illustrates this case perfectly. By blending the two approaches, Canvas expects to deliver the best of both worlds: a traditional, curated approach and the wilderness of the Internet.

The main obstacle, however, remains bandwidth, or the lack of it; the more popular Canvas becomes, the more bandwidth will have to be allocated to it, and this could have a serious impact on quality of service.

Rose, however, says that Canvas could help service providers reduce the cost of operating IPTV by making IP multicast smarter and allowing more linear channels to be delivered over the Internet.

This reminds us of P2P-Next, the open-source, authorized P2P-based platform that began in February 2008 with the backing of 21 partners across Europe - including the BBC and the European Union (EU). Could P2P-Next be part of Canvas and solve the bandwidth conundrum?

Streaming Music and Cloud Services Gear-Up

Excerpted from The Guardian Report by Chris Salmon

Now that P2P music streaming services Spotify and We7 have taken streaming to the masses, the next shift in our listening habits is expected to be towards a new breed of "cloud services."

These allow users to upload their digital music library from their computers to a website, then access it from any computer or mobile device. Both iTunes and Google are strongly rumored to be launching services in the coming months. Until then, you can give this latest technological new dawn a whirl with mSpot.

Sign up for a free mSpot account and you can easily sync part or all of your PC or Mac's music collection with the site - although it can take around 90 seconds for each song to upload. Once your music is there, access your account through any computer's browser, and you can play, search, and make playlists. In the US, there's already a free Android mobile app, which allows you to play your mSpot library on the go, with an iPhone app expected to follow.

With mSpot, free users aren't subjected to any adverts. We7 and Spotify use their ad revenue to pay their hefty streaming royalty bills. But mSpot argues that if you already own an MP3, you have the right to play it on your own devices, without the need for further royalties. So, you can upload up to 2GB of music to mSpot for free.
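
To put those figures in perspective, here is a quick back-of-the-envelope calculation. The 2GB allowance and the roughly 90-second upload time come from the article; the average file size is an assumption (a typical MP3):

    # Rough capacity and sync-time estimate for mSpot's free tier.
    AVG_SONG_MB = 4          # assumption: a typical ~4 MB MP3 file
    FREE_CAP_GB = 2          # free allowance cited above
    SECONDS_PER_SONG = 90    # per-song upload time cited above

    songs = FREE_CAP_GB * 1024 // AVG_SONG_MB        # about 512 songs
    hours = songs * SECONDS_PER_SONG / 3600.0        # about 12.8 hours
    print("~%d songs fit; a full first sync takes ~%.1f hours" % (songs, hours))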

It's an impressive service, but it seems likely that mSpot's outlook on royalties will be challenged by the big four labels before very long. Plus, the launch of those higher-profile cloud music rivals is looming.

The big question, though, is whether any service built solely on music that a user "owns" can compete against the streaming sites that allow people to hear whatever they like, without having to purchase it first.

Apple, of course, would prefer people to keep buying music from its iTunes store. But even they have been offering some impressive free streaming this week. The shows taking place at this month's iTunes Festival in London are being webcast live via both MySpace and the free iTunes Live iPhone/iPad app.

So far, Scissor Sisters, Tony Bennett, and N-Dubz have appeared. On-demand highlights should soon begin to appear at iTunes Festival, where you can also check the full line-up.

Cloud Adoption Surges Ahead

According to a survey conducted during Cisco Live, 71% of organizations have implemented some form of cloud computing, despite an unclear understanding as to the actual definition of the technology. From the exhibition floor, Network Instruments polled 184 network engineers, managers, and directors and found the following:

There is already widespread cloud adoption. Of the 71% who have adopted cloud computing solutions, half deployed some form of private cloud; 46% implemented some form of software-as-a-service (SaaS), such as SalesForce.com or Google Apps; 32% utilize infrastructure-as-a-service (IaaS), such as Amazon Elastic Compute Cloud; and a smaller number, 16%, rely on some form of platform-as-a-service (PaaS), such as Microsoft Azure and SalesForce.com's Force.com.

The meaning of the cloud is debatable. The term "cloud computing" meant different things to respondents. To the largest group (46%), it meant any IT services accessed via the public Internet.

For other respondents, the term referred to computer resources and storage that can be accessed on-demand (34%). A smaller number of respondents stated cloud computing pertained to the outsourcing of hosting and management of computing resources to third-party providers (30%).

Real gains are being realized. The survey asked those who had implemented cloud computing to discuss how performance had changed after implementation. 64% reported that application availability improved. The second area of improvement reported was a reduction in the costs of managing IT infrastructure (48%).

There are still technology trouble spots. While several respondents indicated their organizations saw definite gains from the technology, others observed that network performance stayed the same or declined. 65% indicated that the security of corporate data declined or remained the same, compared to 35% who saw security improvements. With regard to troubleshooting performance problems, 61% reported no change or faced increased difficulty in detecting and solving problems.

"With proper planning and tools to ensure visibility from the user to the cloud provider, Cisco Live attendees are successfully deploying cloud services," said Brad Reinboldt, Product Marketing Manager at Network Instruments. "I was a bit surprised by the number of companies lacking tools to detect and troubleshoot cloud performance issues, as they risk running into significant problems that jeopardize any cost savings they may have initially gained."

US Congressman Proposes Internet Sales Tax

Excerpted from Telecom TV Report by Martyn Warwick

One of the reasons why online shopping is so popular in the US is that purchases usually come free of state sales taxes. However, that happy state of affairs may soon cease - if Congressman Bill Delahunt (D-MA) has anything to do with it.

It's said that only two things in human experience are certainties - death and taxes. And in the US, the mills of the dreaded (even feared) Internal Revenue Service (IRS) grind so small and remorselessly that it does indeed sometimes attempt to pursue people beyond the grave. 

State tax authorities are often second in line knocking on the coffin. Currently, Americans buying over the Internet from out-of-state shopping sites don't have to pay state sales taxes, which vary enormously across the country. 

So, if you live in California and buy a book from a website run by a vendor based in New York, you don't pay sales tax either in the east or in the west. Some analysts identify this as a major factor in maintaining the economic health of the nation but, as always and everywhere, tax officials and state governments see revenues slipping through their fingers and, for at least the past ten years, have been consistently lobbying for the introduction of a regime to tax Internet purchases.

Unsurprisingly then, as the tax monster once again rears its ugly head, Congressman Bill's bill has the full and vocal backing of the National Conference of State Legislatures (NCSL). The politician says that states could be better off to the tune of $23 billion a year spread among them if an Internet sales tax is introduced.

The proposal also has the unequivocal support of many of the biggest US "bricks and mortar" retailers such as Wal-Mart and Costco, which are required to charge and collect sales taxes at their shops across the US. On the other side of the divide is a loose coalition of online and mail-order retailers (such as Amazon and eBay) that are pressing for no sales taxes on web transactions. 

For example, speaking last week, Tod Cohen, eBay's Vice President for Government Relations said, "At a time when unemployment rates are high and small businesses across the country are closing shop, Congress should protect small Internet retailers and the consumers they serve from yet another Internet tax scheme." 

As is the norm in the US, a bill comes with a wordy title attached, so Mr. Delahunt's proposed legislation is called the "Main Street Fairness Act." His bill is seconded and supported by several other Democratic politicians, but no Republicans have signed on as co-sponsors.

The final draft of the proposed legislation is very complex - as you'd expect from politicians trying to come to grips with the vagaries of the Internet. 

Want another example? Well, how about Stephen Conroy in Australia? There's a government minister who doesn't have the vaguest idea about how the web works but wants to legislate to censor it all the same.

Mediation in Thomas-Rasset Case Fails

Excerpted from Ars Technica Report by Nate Anderson

Minnesota's top federal judge, Michael Davis, certainly seems like a man who wants the infamous Jammie Thomas-Rasset file-sharing case on his docket to just go away. And the recording industry, which has prosecuted Thomas-Rasset through one name change, two trials, and three years, appears to be under the distinct impression that it's getting picked on.

Thomas-Rasset was the first file sharer in the US to take her copyright infringement case all the way to a federal trial, where she was found liable for $222,000 in damages. After the trial ended, Judge Davis tossed the verdict and granted Thomas-Rasset a new trial on the grounds that one of his jury instructions was flawed.

That second trial again found Thomas-Rasset liable, and jurors upped the damages to a shocking $1.92 million for the 24 songs at issue in the case. This time, Davis ruled the amount "monstrous" and slashed it to $54,000. The RIAA could take that amount or it could choose a third trial, limited to the issue of damages.

It chose a third trial. But instead of letting the case play out, Davis in June 2010 ordered the parties to meet with a Minneapolis arbiter to hash out their differences.

This would not necessarily be unusual - federal judges demand settlement talks all the time - except for the fact that Davis had already tried the same tactic several times. Both sides had failed to settle before going to trial. In the run-up to the first trial in 2007, Davis ordered them to try again, though he later rescinded that order.

Before the second trial, Davis demanded another settlement conference; after a half day of mediated talks in 2009, this broke down.

After the second trial, the parties again talked voluntarily and could reach no agreement. According to both sides, they were "stymied by their substantially divergent views on the law and on this case."

So when Davis ordered both sides into mediation again last month, lawyers on both sides must have practiced their eye-rolling skills. What was the point? But Davis also noted something specific and unusual in his June 18th order: the arbiter would be paid $400 per hour, and "the fees incurred for the settlement proceedings shall be paid by Plaintiff." That is, by the record labels.

Predictably, the talks broke down. In a joint motion filed with the court Monday, both sides agreed that nothing will be gained by proceeding further with the mediation, and both were irritated at having to go through the process. "The appointment of the Special Master for settlement purposes can only be done with the consent of the parties and after the parties have been provided notice and an opportunity to be heard," they told the judge. "In this instance, the parties neither consented nor were provided an opportunity to be heard."

But the recording industry was even more upset by the issue of payment.

"The Plaintiffs, on their own, also object to that portion of the June 18, 2010 Order that obligates them to pay the Special Master's fees. Plaintiffs brought this case alleging that they were the victims of Defendant's copyright infringement. Twice, Plaintiffs have obtained verdicts by juries that Defendant willfully infringed their rights. Twice, the Court has set aside those verdicts and the case is now set for a retrial on the question of damages alone.

The Defendant is an adjudged, willful infringer of Plaintiffs' copyrights and, while Plaintiffs strongly subscribe to the Court's desire to settle this case, Plaintiffs believe that the financial burdens associated with the appointment of a Special Master for purposes of pursuing a settlement should not be placed upon them.

The perception that Plaintiffs have greater resources to shoulder those financial burdens should not automatically dictate that they should bear those costs, especially given that they are the prevailing parties. Indeed, pursuant to 17 U.S.C. Section 505, Plaintiffs have the right to obtain costs from Defendant, including any costs associated with a Special Master. As such, Plaintiffs do not believe that they should bear the burden of compensating a Special Master."

Judge Davis certainly isn't on Thomas-Rasset's "side" here; indeed, when slashing the second trial award, he trashed Thomas-Rasset for the moment when she "lied on the witness stand by denying responsibility for her infringing acts and, instead, blamed others, including her children, for her actions."

But he certainly doesn't intend to let a huge damage award escape his courtroom. When reducing the $1.92 million award to $54,000, Davis arrived at this amount by awarding triple the $750 minimum for statutory damages. This amount is still "significant and harsh," he noted, but it's a "higher award than the Court might have chosen to impose in its sole discretion."
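
A quick calculation shows how those numbers line up (the 24 works and the $750 statutory minimum are from the case as reported above):

    # Judge Davis's reduced award: treble the statutory minimum per work.
    works = 24                  # songs at issue in the case
    statutory_minimum = 750     # minimum statutory damages per work
    print(works * statutory_minimum * 3)  # 54000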

After multiple settlement talks, two trials, and two judicial decisions to set the verdicts aside, Judge Davis still hasn't rid himself of the troublesome case. Come October 4, 2010, Jammie Thomas-Rasset and the RIAA lawyers will again appear in his 15th floor Minneapolis courtroom for a third trial on damages.

Time Warner Cable Refuses to Cooperate with Mass Lawsuit

Excerpted from Paste Magazine Report by Jennifer Ross

Like the Recording Industry Association of America (RIAA), Viacom, and Warner Bros. Entertainment UK, a handful of independent film producers have become adamant in their efforts to track down copyright violators.

According to Wired, the US Copyright Group began tracking down BitTorrent users who had downloaded select indie films, including "Steam Experiment," "Far Cry," "Uncross the Stars," "Gray Man," and "Call of the Wild 3D," in March.

Throughout the process, the films' producers have sought assistance from Time Warner Cable, requesting that the company sift through its records to match IP addresses with users who have downloaded the films.

Refusing to comply, Time Warner has argued that it would take three months for its staff to reveal the identities of the thousands of users the filmmakers are attempting to sue.

Although major record companies have attempted to sue copyright violators by launching similar campaigns, they have targeted violators individually over long periods of time rather than suing large groups simultaneously.

Time Warner Cable representatives claim they would be unable to respond to requests from law enforcement in a timely manner while also identifying copyright violators for the filmmakers.

As the company's executives stated in a court filing, "TWC has a six-month retention period for its IP lookup logs, and by the time TWC could turn to law enforcement requests, many of these requests could not be answered."

Time Warner is currently arguing its case against the mass subpoenas in a District of Columbia federal court.

Lawsuits & Takedowns Have No Effect on BitTorrent Traffic Excerpted from DailyTech Report by Jason Mick

In May, the US Copyright Group's "pay-or-else" suit over torrent downloads of the movie "The Hurt Locker" slammed over 5,000 individuals. One might have expected downloads of the movie to drop.

However, they have actually been going quite strong; the film was downloaded 200,000 times in June, with 23% of the downloads coming from the US. Some observers believe that the movie's producers may actually be content with the unauthorized downloads: despite potentially losing millions in sales, the production is likely still logging IP addresses and will be able to recoup millions in threat payouts. For that reason, the film's producers have made no effort to remove the film from popular torrent sites.

Elsewhere, torrent sites are clearly being targeted for takedown. Since fleeing to overseas hosting in 2005 in the wake of the LokiTorrent and EliteTorrents suits, torrent hosts have openly defied anything media enforcement groups like the MPAA and RIAA can throw at them.

However, torrent downloads are actually continuing to increase, with the efforts against them seemingly having little effect, either on the downloads or the sites that host them.

The Pirate Bay (TPB), perhaps the best-known site, is still very much in action and, according to some sources, turning a small profit. Threats, police raids, civil actions, ISP-ordered takedowns, and even jail sentences for the Swedish admins who ran the site have ultimately offered no relief to the media industry. The site is still up and running, complete with copyrighted material.

Similarly, market-leading Usenet indexer Newzbin - after its recent defeat in a Netherlands court in a free-speech case regarding infringement - is up and running again. After a brief takedown, the site has returned to the same URL, with dozens of movie listings being added daily. The site's admins, who have invested over $40,000 in the site, even boasted about plans to profit from it.

The kind of defiance shown by TPB and Newzbin increasingly seems to be the prevailing sentiment in the hard-core infringer community. And the general public seems to be becoming increasingly brazen in its infringement as well.

Frustrated media enforcement groups are generally turning to two solutions: crafting mass threat schemes like "The Hurt Locker's," or spending money lobbying the government for harsher punishments. Both solutions are problematic for the industry groups.

The problem with settlement schemes is that law firms demand a big cut (in "The Hurt Locker" case, reportedly 70% of the settlements). And the legislative effort is no better, as it risks mass public outrage if measures such as jailing file sharers or curtailing free speech about copyright infringement are passed.

Swedish Pirate Party Wants To Be The Pirate Bay's ISP

Excerpted from Tech Eye Report by John Daly

The Swedish Pirate Party says it has started giving bandwidth to embattled torrent tracker The Pirate Bay (TPB).

Not only that, but various Pirate Party candidates running for parliament in the Swedish elections on September 19th want to host the BitTorrent search engine from inside Parliament itself, should they get elected, as part of their political mandate.

The party claims such actions would be covered by the Swedish constitution, as it apparently states MPs can neither be prosecuted nor sued for doing things which are part of their political mandate. All the Pirate Party has to do in Sweden is win a seat in parliament and receive approval from the site's owners.

The Pirate Party further claims such a step would have symbolic value for "Information security, fundamental freedom of expression, the future of Sweden as an industrial nation, and Sweden's reputation as leading the way into the future."

It also says politicians from other parties do not see a connection between "file-sharing culture and future industry skills," stating that The Pirate Bay "is a global icon for freedom of speech, next generation of jobs, and future industries."

The Pirate Party would love to do away with laws concerning immaterial goods; however, some reckon that no one would aspire to be a film-maker, musician, or author anymore if there were no living to be made from art.

Coming Events of Interest

Intellectual Property Breakfast Club - July 13th in Washington, DC. Negotiators for the United States government on intellectual property and innovation have just returned from international negotiations in Switzerland on the Anti-Counterfeiting Trade Agreement (ACTA). A member of the United States Trade Representative and other experts will speak on the controversial treaty.

Music Ally Event: Cloud Models - July 14th in London, UK. With an ever-growing list of services, it's clear: music is moving into the cloud. Is ownership over? How will rights-owners get paid? What are the threats and opportunities for labels, publishers, collecting societies, managers, and artists? Don't miss this timely seminar at Deloitte Atrium.

Managed File Transfer (MFT) Offerings - July 20th online. MFT is one service area that smart information technology (IT) organizations are increasingly turning to as the businesses they support become more complex and dispersed, data volumes increase, and existing infrastructures are pushed to their limits. Join this lively eSeminar.

NY Games Conference - September 21st in New York, NY. The most influential decision-makers in the digital media industry gather to network, do deals, and share ideas about the future of games and connected entertainment. Now in its 3rd year, this show features lively debate on timely cutting-edge business topics.

M2M Evolution Conference - October 4th-6th in Los Angeles, CA. Machine-to-machine (M2M) embraces the any-to-any strategy of the Internet today. "M2M: Transformers on the Net" showcases the solutions, and examines the data strategies and technological requirements that enterprises and carriers need to capitalize on a market segment that is estimated to grow to $300 billion in the year ahead.

Digital Content Monetization 2010 - October 4th-7th in New York, NY. DCM 2010 is a rights-holder focused event exploring how media and entertainment owners can develop sustainable digital content monetization strategies.

Digital Music Forum West - October 6th-7th in Los Angeles, CA. Over 300 of the most influential decision-makers in the music industry gather in Los Angeles each year for this incredible 2-day deal-makers forum to network, do deals, and share ideas about the business.

Digital Hollywood Fall - October 18th-21st in Santa Monica, CA. Digital Hollywood Fall is the premier entertainment and technology conference in the country, covering the convergence of entertainment, the web, television, and technology.

P2P Streaming Workshop - October 29th in Firenze, Italy. ACM Multimedia presents this workshop on advanced video streaming techniques for P2P networks and social networking. The focus will be on novel contributions on all aspects of P2P-based video coding, streaming, and content distribution, which is informed by social networks.

Fifth International Conference on P2P, Parallel, Grid, Cloud, and Internet Computing - November 4th-6th in Fukuoka, Japan. The aim of this conference is to present innovative research results, methods, and development techniques from both theoretical and practical perspectives related to P2P, grid, cloud, and Internet computing. A number of workshops will take place.

Copyright 2008 Distributed Computing Industry Association