Distributed Computing Industry
Weekly Newsletter

In This Issue

P2P Safety

P2PTV Guide

P2P Networking

Industry News

Data Bank

Techno Features

Anti-Piracy

May 3, 2010
Volume XXX, Issue 8


GenosTV Debuts at Digital Hollywood Spring

ShambroWest Corporation cordially invites DCINFO readers to join its senior management team at an exclusive dinner in Los Angeles this Wednesday evening May 5th to preview GenosTV for the Digital Hollywood community.

This gala media event to introduce the revolutionary Genos broadband television service will begin at 6:15 PM at Loews Santa Monica Beach Hotel. Attendance is limited, so RSVP at your earliest convenience to 410-476-7964 or genos@dcia.info.

GenosTV will be delivered over broadband Internet connections to its subscribers' televisions using proprietary hardware and software platforms to maintain the security of the content. This unprecedented online service will provide its subscribers worldwide with access to cable, network, and other media content from every geographic region and in every language around the world.

Using a feature called TvME, GenosTV subscribers will also be able to create and distribute their own television channels, sharing them with a small circle of friends, with broader interest-based groups, or with the entire GenosTV community.

With CloudDVR, the Genos network will provide secure online storage of up to 180 days of programming for every channel on the service, making a multi-million-hour DVR available to members with active subscriptions to the content.

CloudDVR will also store non-channel content, including material created by subscribers and movies in multiple languages from both mainstream and independent producers, along with other free and video-on-demand (VOD) content.

The Genos network is currently seeking and evaluating technology partners with hardware, software, and networking solutions. GenosTV is seeking to license content from providers in every language and every geography around the world.

Genos is a subsidiary of the ShambroWest Corporation, with offices in Las Vegas and Amsterdam. It was founded by Rob Shambro, serial entrepreneur and founder of SAVVIS, StreamSearch, and ILabs; Mike West, subject-matter expert in consumer electronics and former technical leader at IBM; and Kevin Bacchus, a creator of the Microsoft Xbox.

ShambroWest Corporation is being incubated by Sonnenschein, Nath, and Rosenthal.

Rovi Music Database Selected by Wenner Media to Power RollingStone

Rovi Corporation this week announced that the Rovi music database will help power the new RollingStone.com website from Wenner Media, whose flagship Rolling Stone brand has been covering music and pop culture for more than four decades.

The Rovi music database was selected by Wenner Media to provide RollingStone.com's more than five million unique users with comprehensive metadata about their favorite albums and artists, ranging from rock to jazz to country and everything in between.

The Rovi music database will serve as the taxonomy of music metadata for RollingStone's vast directory of artists and will supplement that directory with album titles, album art, and information about tracks, labels, and release dates. Rolling Stone's use of the Rovi music database includes identification tags to simplify cross-promotional and sales activities with other Rovi-powered online stores and portals.

"The RollingStone.com site serves as a one-stop destination for music lovers to quickly and easily find the very latest news, reviews, downloads, and videos on their favorite artists, and help them discover new ones," said Daniel Mandell, Director of Business Development for Wenner Media.

"By complementing the classic Rolling Stone music editorial, the Rovi music database enables us to give RollingStone.com members a streamlined experience that allows them to dive deeper into artist and album data across a wide variety of genres."

The Rovi music database provides multiple forms of data to support a variety of devices and services across online services, service providers, and consumer electronics platforms, such as websites, social networking sites, mobile, PC, and embedded applications. It is part of a vast Rovi data library that includes information on music, movies, games, books, television, sports, and awards, as well as cover art, reviews, celebrity biographies, rich media, and more.

Rovi Corporation's Michael Papish will keynote at the upcoming P2P & CLOUD MEDIA SUMMIT on Thursday at Digital Hollywood Spring.

Report from CEO Marty Lafferty

We hope to see you in Santa Monica this week at the DCIA's first-ever conference about the impact of peer-to-peer (P2P) and cloud computing on the distribution of entertainment content. This promises to be our most timely and strategically significant conference to date.

The P2P & CLOUD MEDIA SUMMIT will be held in conjunction with Digital Hollywood Spring this Thursday May 6th starting with a continental breakfast at 9:00 AM at Loews Santa Monica Beach Hotel. This is the fifth annual DCIA conference being held in conjunction with Digital Hollywood.

The P2P & CLOUD MEDIA SUMMIT will explore current policy, technology, and content issues as well as next-generation business opportunities related to P2P and cloud-based commercial offerings. A special session during the conference luncheon, sponsored by Alcatel-Lucent, will feature a candid discussion on how to do business with leading media firms.

KEYNOTES will include Alcatel-Lucent's Buck Peterson, General Manager (GM) of AppGlide; BitTorrent's Claude Tolbert, Vice President of Business Development; Cisco Systems' Geng Lin, Chief Technology Officer (CTO) of the Cisco-IBM Alliance; Giraffic's Assaf Benjamin, Vice President of Marketing and Business Development; HD Cloud's Nicholas Butterworth, Chief Executive Officer (CEO); KPMG's Mark Lundin, Partner; and Rovi Corporation's Michael Papish, Product Development Director.

The special CONFERENCE LUNCHEON session will frankly address the dos and don'ts of approaching major entertainment companies with new technology solutions. What are the absolute musts for succeeding in such an endeavor? What are the most serious pitfalls to avoid?

Participants will include Loeb & Loeb's Larry Kenswil, Of Counsel (formerly Universal Music Group); Pepperdine University School of Law's John Malcolm, Distinguished Practitioner in Residence (formerly MPAA); Priority Digital Media's Amy Friedlander-Hoffman, President (formerly AT&T); Starz's John Penney, EVP of Strategy & Business Development (formerly HBO); TAG Strategic's Ted Cohen, Managing Partner (formerly EMI Music); and Ubiquity Broadcasting's Steve Jacobs, President (formerly SONY).

The POLICY TRACK, sponsored by FTI Consulting, will take a global perspective on changing rules for P2P and cloud computing and answer questions such as: What are the key laws and regulations that P2P and cloud computing software developers and distributors need to observe in various jurisdictions? What changes are taking place in the regulatory environment affecting P2P and cloud-computing technologies? What will be the impact of recent lawmaking actions and court rulings? What else has to happen from a legal and policy standpoint to foster investment and commercial development of P2P and cloud computing?

Panelists will include Consulting, Legal, Mediation & Strategy Services' Matt Neco, Principal; Digital Media Analyst Jason Roks; Dow Lohnes' Jim Burger, Member; FTI Consulting's Roger Scadron, Managing Partner; Hughes Hubbard & Reed's Dan Schnapp, Partner; MasurLaw's Steve Masur, Managing Director; Morrison & Foerster's Melody Torbati, Of Counsel; and St. Edwards University's Gregg Perry, Assistant Professor.

The TECHNOLOGY TRACK will zero in on how P2P and cloud computing are affecting the evolving distribution chain and answer questions such as: What is the current landscape for P2P and cloud-based content distribution? What trends are emerging among participants in the distribution chain and in consumer usage? What impact do advances in digital rights management (DRM), compression, caching, content acceleration, swarming, streaming, and other distributed computing technologies have?

Panelists will include Asankya's Norman Henderson, VP of Business Development; Joyent's Steve Tuck, Director of Sales; PacketExchange's Chuck Stormon, VP, Strategic Accounts & Alliances; Sivoo's Rich Moreno, Principal; Verimatrix's Neerav Shah, VP of Business Development; and Yummy Interactive's Christopher Hennebery, VP of Software Distribution.

The CONTENT TRACK will focus on how to balance monetization and anti-piracy efforts to maximize profitability and will answer questions such as: What business models show the greatest promise for P2P and cloud-based content delivery? What changes are needed to more effectively harness file-sharing and related technologies? What content-security solutions are now in development that will optimize P2P, cloud computing, and hybrid peer-assisted deployments for the benefit of all participants in the distribution chain?

Panelists will include BayTSP's Lawrence Low, VP of Product Management & Strategy; BUZZMedia's Anthony Batt, Founder; Copyright Clearance Center's Chris Kenneally, Director of Author Relations; Free Speech Coalition's Diane Duke, Executive Director; Game-Based Marketing's Gabe Zichermann, Author; and Independent Producer Melike Amjarv.

The NEXT GENERATION P2P & CLOUD PANEL will discuss the prospects for "content in the cloud," including music, TV, and film. Many distributed computing solutions are on the way, from live streaming to HD content downloading, with associated business models ranging from ad-supported, to subscription, to paid download. This session will go into the practical applications of P2P and cloud computing in the marketplace.

Panelists will include Aleric's Vincent Hsieh, CEO; Ascent Media's Mick Bass, VP of Alliance Management; Flycell's Adrian Rubio, SVP of Corporate Development; Grab Networks' Marcien Jenckes, President of Media and Content; Panvidea's Doug Heise, VP of Marketing; RedThorne Media's Ian Donahue, Co-Founder & Director of Business Development; and TVU Networks' Jim O'Brien, Senior Advisor.

If you inadvertently missed P2P & CLOUD MEDIA SUMMIT's money-saving early registration rates, please call 410-476-7964 as soon as possible. For more information, please visit www.dcia.info/activities.

For sponsor packages, please contact Karen Kaplowitz, DCIA Member Services, at 888-890-4240. Share wisely, and take care.

Verizon Envisions Providing "Everything via the Cloud"

Excerpted from CIOL Report by Sudhakaran

Verizon Business is looking at the possibility of providing everything via the cloud. "It is not just software or security as a service (SaaS) that we are looking at, but to provide everything via the cloud," said Joe Crawford, Executive Director of IT Solutions at Verizon Business, this week.

Crawford also talked about the immense potential of cloud solutions spanning the length and breadth of the IT infrastructure of a company. To strengthen this cloud reach, Verizon has teamed up with IBM and McAfee among others to offer more potential services in computing and security.

"Computing as a service (CaaS) answers all enterprise needs," Crawford said, adding, "It helps a company to perform globally."

Verizon says APAC, especially India, will benefit from moving in this direction. What the company has in mind is a "Europe India gateway," where India can play a major role in Verizon's business plans, added Blair Crump, Group President, Worldwide Sales and Service at Verizon Business.

"We have the largest network in the world; we plan to grow more business from international markets beyond the US," said Crump. He also emphasized that APAC will benefit more from the expansion plans of Verizon, adding, "Verizon is a multinational company that, like its customers, is moving to new markets around the globe. This year our country-specific focus will be India and Japan."

As of now, Verizon Business has over 6,000 people working in the APAC region. When Crump was asked why the focus is on Asia, he said, "Asia is the fastest growing economy, so it is the top area for us in 2010." He also said APAC is the region where the customer base is growing, and the focus is mainly on finance, retail, network, energy, and healthcare.

As part of the expansion in APAC, Verizon recently opened its second data center in Hong Kong to meet the demand of its customers. This year, the company is planning to invest $16.8 billion to $17.2 billion to build, operate, and integrate its networking and computing platforms.

"We continue to expand our presence globally to better serve our multinational customers and to accelerate the delivery of next-generation cloud services," said Yali Liu, Director of Network Planning for Verizon in Asia Pacific.

She also said Verizon is adding more and more new technologies to make the cloud more cost-effective and reliable. "We have seven undersea cables in the Pacific Ocean and in all major cities we have multiple nodes so that customers face no issue even if there is a cable breach," she said.

She also said the company is looking at expanding into more locations, with strong demand coming from Africa. "We are in 15 countries in APAC, and India and China are leading among them," added Liu. She also said Verizon is closely exploring the possibility of Indonesia, and a new data center in India is also under review.

Talking about the India expansion plans, Andrew Dobbins, Vice President, Asia Pacific, said the company has 33 new services in India. "Last year, we hired 100 people and this year also we will hire over 100 in India. Last year we opened a center in Pune and this year we opened one in Chennai," he added.

He also observed that India would be leading the world in the usage of servers. "India is going to be the hub that connects the East and the West," he asserted.

Despite the hype created around the "cloud," many people remain apprehensive, feeling that their data would not be safe and secure in a third party's hands. To overcome that barrier, Verizon is planning a major campaign this year.

Akamai Launches Sustainability Initiative

Akamai Technologies, the leader in powering video, dynamic transactions, and enterprise applications online, this week unveiled a new sustainability initiative dedicated to helping customers run more carbon-efficient web infrastructure and reduce their energy consumption.

Akamai's sustainability initiative includes a focus on enhancing the carbon efficiency of its global delivery network of over 61,000 servers. The initiative is designed to help Akamai improve the hardware and code efficiency within its network, and reduce carbon emissions and energy consumption, all while responding to customer demand for carbon measurement and reporting. 

The objective of the initiative is to quantify the energy reductions that Akamai realizes in network carbon efficiency and further develop Akamai's focus on becoming more energy efficient. "This is a unique opportunity for Akamai to revolutionize sustainability, specifically carbon management and network efficiency, across the global Internet landscape," said Paul Sagan, President & CEO of Akamai.

"For more than a decade, we have been delivering cloud computing optimization solutions that reduce the energy consumption of our customers' web and data center infrastructure. We built a massive, shared network of distributed computing power, running on efficient hardware and code, with a higher utilization compared with traditional data center infrastructure, and made it available as a managed service. Moving forward, and in collaboration with our customers, we plan to use the information we collect to identify and implement additional best practices." 

Akamai will report on the progress of its sustainability initiative. In 2009, Akamai's drive to improve its network carbon efficiency resulted in an 86 million-pound reduction in CO2 emissions - a 32% reduction from the amount of CO2 emitted in 2008, which is equivalent to removing 7,400 US autos from the road.
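
As a rough sanity check on those figures (our arithmetic, not Akamai's): if the 86 million-pound reduction represents 32% of the 2008 amount, the implied 2008 baseline is about 269 million pounds, leaving roughly 183 million pounds in 2009.

    # Back-of-the-envelope check of the reported figures (our calculation, not Akamai's).
    reduction_lbs = 86e6         # reported CO2 reduction in 2009, in pounds
    reduction_share = 0.32       # reported as 32% of the 2008 amount

    baseline_2008 = reduction_lbs / reduction_share   # about 269 million pounds
    emissions_2009 = baseline_2008 - reduction_lbs    # about 183 million pounds
    print(f"2008: {baseline_2008 / 1e6:.0f}M lbs, 2009: {emissions_2009 / 1e6:.0f}M lbs")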

Today, Akamai is helping customers estimate their own environmental carbon footprint related to Akamai services. Akamai is also designing an application on its customer-facing portal to provide customers with ongoing, monthly visibility of their carbon footprint on the Akamai platform, across 11 geographic regions around the world. In addition to network carbon efficiency improvements, Akamai is pursuing reductions in its energy, water, and materials consumption within its business operations. 

For its work on building carbon-efficient IT solutions, Akamai has recently been recognized by The Uptime Institute, a leading independent think tank and research body serving the global data center industry. Receiving an 'honorable mention,' Akamai will be recognized at this year's annual Green Enterprise IT (GEIT) Awards. The GEIT Awards showcase organizations that are pioneering energy-efficiency improvements in their IT and data center operations, or that provide technology that can significantly reduce energy consumption.

Spotify Adds Social Features to Let Users Share Playlists 

Excerpted from California Chronicle Report

P2P music streaming service Spotify has made the most significant changes to its service since it launched, with the addition of social features to let users connect with friends and share music.

The changes pit Spotify against other music-based social platforms including MySpace and mFlow, the recently launched social music service that pays users a commission on music sales resulting directly from a recommendation they've made.

Paul Brown, Spotify's SVP of Strategic Partnerships, said the services will increase the time users spend on Spotify, creating a more engaged audience for advertisers.

"That will clearly have a knock-on effect for advertisers and make our proposition to agencies and brands more attractive," he said.

The social functionality and music management tools will let users set up a profile and add other users into a People sidebar, either by searching for them on Spotify or integrating their friends via Facebook Connect.

They can also publish their profile and let other users follow their music taste by subscribing to their playlists or seeing the top six artists and tracks they've been listening to.

Spotify has created an inbox which lets people send tracks to each other, and a friend feed that shows all the activity of users' friends.

Susan Clarke, Account Director at MEC Interaction - which handled a campaign for the Fiat 500 that used Spotify to get people to recommend songs for a branded playlist - said the changes will increase the attraction of Spotify to brands.

"Our campaign for Fiat led to it being the most popular playlist out of around 30m and we ended up with almost seven days' worth of music that people suggested," said Clarke. "The more social you can make a campaign on Spotify, the more it resonates with the users."

Dave Chase, head of music partnerships at Mindshare's brand partnership arm BrandAmp, said, "Advertisers understand Facebook so the deal is a good one because they can run campaigns that cross both platforms, safe in the knowledge they're reaching users during their consumer journey. Spotify still needs to lower its ad rates if it's to benefit."

Spotify's moves come as music streaming service We7 announced music played on its service was now paid for in full by advertising.

Steve Purdham, CEO of We7, said, "There has also been a lot of cynicism about whether ad-funded services work and there are a lot in the market. But we've shown that it can work, with music rights owners and artists getting the right rate."

BioTorrents: A BitTorrent Tracker Site for Scientists

Excerpted from Zeropaid Report by Jared Moya

Researchers at the University of California, Davis have decided to create a BitTorrent tracker site dedicated not to the transitory pleasures of film, music, or TV, but rather to the nobler pursuit of education.

BioTorrents is a "file-sharing service for scientific data," created explicitly to allow researchers to share large amounts of data faster and more reliably.

"The transfer of scientific data has emerged as a significant challenge, as datasets continue to grow in size and demand for open access sharing increases. Current methods for file transfer do not scale well for large files and can cause long transfer times," reads an abstract describing the site. "In this study we present BioTorrents, a website that allows open access sharing of scientific data and uses the popular BitTorrent P2P file-sharing technology. BioTorrents allows files to be transferred rapidly due to the sharing of bandwidth across multiple institutions and provides more reliable file transfers due to the built-in error checking of the file sharing technology."

The researchers specifically single out the shortcomings of transferring data via HTTP or FTP: in both cases there exists only a single source of the data, and large amounts of bandwidth are required to ensure "adequate" download speeds. Relying on a single server also makes the data susceptible to loss of access if that server malfunctions.

From the site: BioTorrents allows scientists to rapidly share their results, datasets, and software using the popular BitTorrent file-sharing technology. All data is open access, and unauthorized file sharing is not allowed on BioTorrents.

We encourage researchers and organizations to share their data on BioTorrents as an alternative to hosting files through FTP or HTTP for the following reasons: Using BioTorrents can allow researchers to download large datasets much faster; BioTorrents can act as a central listing of results, datasets, and software that can be browsed and searched; and data can be located on several servers allowing decentralization and availability of the data if one server becomes disabled.

With BitTorrent, as most reading this already know, data can be stored and accessed by a virtually unlimited number of people simultaneously, unfazed by the bandwidth shortcomings of a single host. It combines the upload bandwidth of connected peers to serve all download requests, and data remains accessible so long as a single seeder makes it available.
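
For readers unfamiliar with the mechanics, the "built-in error checking" works at the piece level: a dataset is split into fixed-size pieces, a hash of each piece is published in the .torrent metadata, and every piece received from a peer is verified against that hash before it is accepted. The sketch below, using only the Python standard library, illustrates that idea; it is a simplified illustration, not BioTorrents' or any BitTorrent client's actual code.

    # Simplified sketch of BitTorrent-style piece hashing and verification -
    # the "built-in error checking" referred to above. Real clients store
    # SHA-1 piece hashes in bencoded .torrent metadata; piece size varies.
    import hashlib

    PIECE_SIZE = 256 * 1024  # 256 KiB pieces, for illustration

    def piece_hashes(data: bytes, piece_size: int = PIECE_SIZE) -> list:
        """Split the dataset into pieces and return each piece's SHA-1 digest."""
        return [hashlib.sha1(data[i:i + piece_size]).digest()
                for i in range(0, len(data), piece_size)]

    def verify_piece(piece: bytes, expected_digest: bytes) -> bool:
        """Accept a piece received from a peer only if its hash matches."""
        return hashlib.sha1(piece).digest() == expected_digest

    if __name__ == "__main__":
        dataset = b"example scientific dataset " * 100_000
        hashes = piece_hashes(dataset)              # published with the torrent
        good_piece = dataset[:PIECE_SIZE]           # as received from a peer
        bad_piece = b"corrupted" + good_piece[9:]   # a damaged transfer
        print(verify_piece(good_piece, hashes[0]))  # True
        print(verify_piece(bad_piece, hashes[0]))   # False

Because every piece is checked independently, a corrupted block is simply discarded and re-requested from another peer, which is what makes transfers drawn from many unreliable sources dependable.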

They also recommend that large scientific institutions and data repositories follow suit and make their "larger datasets" available on BioTorrents. They point out that in one month alone, "NCBI users downloaded the 1000 Genomes (8981 GB)" some 100,000 times.

"If BitTorrent technology was implemented for these datasets then the data supplier would benefit from decreased bandwidth use, while researchers downloading the data, especially those not on the same continent as the data supplier, enjoy faster transfer times," they add.

Better still, they note that small groups of individual researchers could use BioTorrents as their primary method for publishing scientific data. Researchers could instantly make their data, software, and analyses available "without the requirement of an official submission process or accompanying manuscript."

"This form of data publishing allows open and rapid access to information that would expedite science, especially for time-sensitive events such as the recent outbreaks of influenza H1N1[19] or severe acute respiratory syndrome (SARS)[20]," they continue. "No matter what the circumstance, BioTorrents provides a useful resource for advancing the sharing of open scientific information."

Haihaisoft Releases Live 3.0 P2P Broadcast Solution

Haihaisoft is one of only a few companies in the world able to provide H.264 high-definition (HD) P2P solutions, which can save over 98% in bandwidth costs.

Haihaisoft Live P2P technology dramatically reduces the bandwidth needed to deliver HD video. It solves the serious problem that live video broadcasting requires too much bandwidth, and helps cut bandwidth costs by up to 99%. 

At the same time, Haihaisoft also launched its proprietary P2P set-top box (STB). Haihaisoft Live technology can be used for IPTV, and Haihaisoft P2P users can watch TV programs smoothly through the STB.

The principle behind Haihaisoft Live is as follows: each streaming-media user (client) acts as a node. Depending on its network and equipment capacity, a client connects to one or several other users to share data. Some clients play on-demand streaming content entirely from other users' nodes, with no connection to the streaming media server at all, while others pull content from both other users and the server.
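
To make that hybrid arrangement concrete, here is a minimal sketch of peer-or-server source selection. It is our illustration of the principle described above, not Haihaisoft's code, and every name in it (Peer, pick_source, ORIGIN_SERVER) is hypothetical.

    # Minimal sketch of hybrid P2P streaming source selection: fetch a chunk
    # from a peer that already holds it when possible, otherwise fall back to
    # the origin streaming server. Illustrative only; not Haihaisoft's code.
    from dataclasses import dataclass, field

    ORIGIN_SERVER = "origin.example.com"  # hypothetical fallback server

    @dataclass
    class Peer:
        address: str
        chunks: set = field(default_factory=set)  # indices of chunks this peer holds

    def pick_source(chunk_index: int, peers: list) -> str:
        """Return the address to fetch a chunk from: a peer if any has it,
        otherwise the origin server."""
        for peer in peers:
            if chunk_index in peer.chunks:
                return peer.address
        return ORIGIN_SERVER

    if __name__ == "__main__":
        peers = [Peer("10.0.0.2", {0, 1, 2}), Peer("10.0.0.3", {3, 4})]
        for i in range(6):
            print(f"chunk {i} -> {pick_source(i, peers)}")

A viewer whose peers hold every chunk never touches the server at all, which is where the claimed bandwidth savings come from; a viewer with no useful peers behaves like an ordinary streaming client.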

Gygan May Be Next Big Thing in File Sharing

Excerpted from Business Week Report by Gabriel Gralla

Gygan is one of the newest additions to the file-sharing battleground. With so many choices already available - BitTorrent, Usenet, and RapidShare among others - can Gygan really offer anything new? There's nothing groundbreaking, but it does fill a niche: Gygan (free up to a monthly download limit) combines some of the best features of RapidShare and Usenet, but unlike the two it runs on an easy-to-use standalone client.

Like RapidShare, Gygan relies on users uploading content to its service, which is stored in a private sharing network on Usenet. Note that this is not P2P - Gygan provides indirect access to Usenet through its own servers.

This allows incredibly fast downloads and uploads (maxing out almost any connection), but it's also why you have to start paying after 1 GB of free downloads. The first 1 GB of downloads each month (4 GB for the first month) is free, but if you want more you'll have to buy a monthly package at $0.33-$0.50 per GB.

On the other hand, unlimited uploads are free for everyone - and you can even make money through Gygan's Reward Program if enough people download your files.

If you're used to free services like BitTorrent, $0.50 per GB might seem expensive, but you're paying for convenience and speed. You can start downloads from a simple URL as you would at RapidShare, or from a .gyg file, which contains the download information like a .torrent.

But Gygan also offers an in-client search feature, so you don't have to waste time scouring the web for the right link (Gygan also allows you to share "private" files which do not show up in search results, and are only accessible by the download URL).

And although Gygan's backend is Usenet, you would never know it; downloading is as simple as point and click, with all the complications of Usenet hidden from the user.

Despite being released to the public only a few months ago, Gygan has a decent collection of content, which should only continue to grow as more users join.

Already, Gygan is a viable file-sharing service. If you're willing to pay for fast, easy downloads, then give Gygan a try. It's not going to overtake BitTorrent any time soon, but it may be the right fit for you.

Beyond Oblivion Secures Investment

Excerpted from Music Week Report

US digital music start-up Beyond Oblivion has made headlines by announcing that Rupert Murdoch's News Corp has taken a stake in the company.

On its website, Beyond Oblivion says it is "a music service that combines the stickiness of a social network with unlimited life-of-device access to the largest music library on Earth, within a vast ecosystem where content owners are paid per-play no matter if the original music file was ripped, bootlegged or legally or illegally downloaded".

According to the Financial Times, which carries an interview with the company's founder Adam Kidron, this means the onus for paying for music is shifted away from consumers and towards device manufacturers and broadband providers.

This approach - effectively monetizing unauthorized file sharing - is what QTRAX originally set out to do. It also mirrors the approach of Nokia's Comes With Music service, whereby consumers buying a particular handset have access to "free", legal music.

In the FT Kidron predicts his company will pay out more than $100 million in royalties to music companies within five years. More controversially, he says that an ultimate payout of $10 billion is "completely within reach".

News Corp has reportedly invested an undisclosed sum for a minority stake in Beyond.

What Really Is Cloud Computing?

Excerpted from Information Week Report by Paul McDougall

Numerous enterprise IT organizations claim to be implementing private cloud data center architectures from which end-users can tap information and applications on-demand, from internally managed servers.

But does that really fit the definition of cloud computing, an IT model that, some argue, necessarily adds a third-party, external service provider to the equation?

A private cloud "is simply what IT should have been doing for the past 20 years," said BitCurrent founder Alistair Croll, Monday at an Interop Las Vegas session called, "Private Clouds Are Just Another Name For IT Done Right."

Croll argued that merely implementing certain technologies associated with the cloud - virtualization, scripting and automation, single sign-on, and the like - does not in and of itself add up to cloud computing.

"All those things are good, but they don't make you a cloud," said Croll, who added that owning a corporate jet does not make a company an airline.

Some vendors, particularly those pitching "cloud" offerings to in-house IT departments, begged to differ.

John Stetic, a director of product management at Novell, argued that there's little difference between an external or internal cloud from the perspective of the end user business unit. "It's simply another shared services model," said Stetic.

For their part, third-party providers of cloud services insist that cloud computing, by definition, implies the existence of an outside vendor that specializes in providing IT services from over the wall, usually under a metered pricing model.

"There are technical aspects and business aspects (of cloud computing)," said Steve Riley, senior technical program manager at Amazon Web Services. "You can duplicate the technical aspects, but not the business aspects," said Riley.

"And in-house, you can't reach infinite scale," Riley added. Of course, a decade ago the IT industry embraced terms like ASP, utility computing, and on-demand computing to describe services very similar to those offered by today's "cloud" vendors.

So the better question may be this: Does cloud computing, whether external or internal, really represent anything new?

The Mysterious and Scary BitTorrent Monitoring Site

Excerpted from TorrentFreak Report

While many BitTorrent users operate their clients without a second thought, many are well aware that everything they do has the potential to be monitored by someone, somewhere. The data available in BitTorrent swarms is necessarily public - if it weren't, no one would be able to share anything with anyone.

The open nature of this amazing file-sharing system certainly has its benefits, but for many its greatest strength is also its greatest weakness. Organizations like the IFPI, RIAA, MPAA and others have spent a great deal of money over the years monitoring BitTorrent and other file-sharing networks. But what if that same feature was available to anyone right now via any browser?

That appears to be one of the functions behind a new and slightly unsettling website. After clicking past the title page, one is confronted by a message about the user's IP address and location, which is derived from a standard traceroute (we used a commercially available VPN for our tests), but it is the note at the bottom that provokes the most interest.

Gulp. Apparently this interface provides the ability to monitor BitTorrent swarms (we don't know and couldn't find out which ones) for IP addresses on the subnet of the accessing user's IP address and to show torrents that have been shared at some point.
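
As an aside, filtering a log of observed swarm addresses down to a visitor's subnet is technically trivial, which is part of what makes the page unsettling. The sketch below shows the general idea using Python's standard library; it is purely illustrative and has no connection to the site's actual implementation.

    # Illustrative only - not the monitoring site's code. Given addresses
    # logged from BitTorrent swarms, keep those on the same /24 subnet as a
    # visitor's IP address, the kind of per-subnet report described above.
    import ipaddress

    def same_subnet(logged_ips, visitor_ip, prefix=24):
        """Return logged addresses that fall inside the visitor's /prefix network."""
        network = ipaddress.ip_network(f"{visitor_ip}/{prefix}", strict=False)
        return [ip for ip in logged_ips if ipaddress.ip_address(ip) in network]

    if __name__ == "__main__":
        swarm_log = ["203.0.113.7", "203.0.113.99", "198.51.100.4"]
        print(same_subnet(swarm_log, "203.0.113.25"))
        # -> ['203.0.113.7', '203.0.113.99']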

After jumping onto a few legal torrents tracked by public trackers, we used the interface to try to find our test IP address in the reports but failed to locate it. There could conceivably be some sort of time delay, but we were simply unable to confirm the exact mechanism of operation or, indeed, whether the results are 'real' at all.

However, if the results are real (and they do look very convincing), then there is an even more worrying feature. Not only is it possible to search for torrents being shared on the user's IP address, but also any IP address of the user's choosing by simply reforming the end of the tracking URL.

But it doesn't stop there. At the bottom of one of the pages is a link to the 'Auditor Console.'

This CLI-type affair accepts a few common commands. Typing 'ls' brings up a list of available directories, while the 'cd [directory name here]' command allows access to them.

One of the folders provides monitoring of a few select IRC channels while others appear to be non-functioning. Others contain lots of documents about monitoring and surveillance including wire-tapping requests for certain ISPs.

Having looked around this site and done quite a bit of research trying to find out who is behind it, TorrentFreak found some rather interesting links back to several individuals, which lead us to go "Hmmmm."

But enough of the chit-chat - you should try this for yourselves. We'd also like to see if you find what we found (hint: it's not as scary as it looks). Have fun, and feel free to e-mail us at tips@torrentfreak.com with anything interesting you may find, or go ahead and write about your discoveries in the comments.

LaLa Shutter Fails To Send Shudders Through Tech World

Excerpted from Online Media Daily Report by Gavin O'Malley

LaLa - the cloud-based music-streaming service that Apple bought late last year - is going bye-bye at the end of May. 

In its place, industry watchers are predicting that Apple will launch a cloud-based iTunes.com - the mere prospect of which has serious implications for the digital music business.

Citing unnamed sources, a Wall Street Journal report published in late January said Apple was planning to launch a web-based version of iTunes as soon as June. "Tentatively called iTunes.com, the service would allow customers to buy music without going through the specialized iTunes program on computers and iPhones," it reported. 

"Apple's decision to close LaLa isn't much of a surprise, as LaLa never found much of a foothold as a standalone music service," according to CNet

"The real prize for Apple was the company's streaming technology ... Apparently, Apple is considering a plan to offer iTunes users the ability to store digital copies of their music and videos on the company's servers and then be given the ability to access their media via any web-enabled device." 

And it's about time, writes PaidContent. "Against both the web and subscription rise, iTunes' a la carte reliance looks archaic and one-dimensional, tooled for a market that's plateaued." Earlier this year, former MP3.com head Michael Robertson laid out the broad implications of Apple's would-be cloud-based media strategy. In short, he wrote, "Their upcoming plans ... are positioning Apple to lead the digital music industry into a new era." 

Upon the news of LaLa's demise, TechCrunch asked: "Will Apple be the first company to turn online music subscription services into a sizable business?" 

Perhaps, but Apple's path to subscription-based success isn't without its obstacles. For one, "LaLa's music streaming license was reportedly non-transferable, should the company be acquired," notes GigaOM. "Any new agreement with the major labels could involve a messy renegotiation in which Apple would make new concessions to the labels, as it did last year when variable pricing, DRM and bundling formats were in play."

Battle Lines Drawn as Net Neutrality Comments Roll In 

Excerpted from PC Magazine Report by Chloe Albanesius

Monday marked the final day for people interested in the Federal Communications Commission's (FCC) net neutrality proceeding to submit public comments on the issue, and one thing's for sure - the FCC is not at a loss for opinions.

While some groups wholeheartedly supported the commission's efforts, others urged the FCC to back off, focus on other subjects, or cut certain things out of the pending rules.

The Internet Innovation Alliance (IIA) - which includes AT&T and Level 3 Communications among its membership - urged the FCC to scrap plans for a net neutrality proceeding and focus instead on its national broadband plan.

"At this decisive moment in the American broadband revolution, we should strive to give greater certainty to telecom investors and innovators and rally support to the collective push for universal access and adoption, rather than advancing divisive regulations that increase investors' uncertainty," the IIA said. "Because marketplace realities do not necessitate new Internet rules at this time, we urge the FCC to shelve such regulations and devote all of its energies toward advancing the national broadband plan."

AT&T also submitted separate comments. Jim Cicconi, Senior Executive Vice President for External and Legislative Affairs at AT&T, accused supporters of manufacturing a threat and failing to identify any real problem that would make net neutrality rules necessary.

"Over and over again, we hear supporters cite the single instance where the FCC felt compelled to take action, namely the Comcast-BitTorrent case," Cicconi said. "But one example does not a compelling case make. Indeed, thanks to the DC Circuit, the Comcast case ironically now stands for the opposite proposition - namely, that government must have compelling reasons if it's going to substitute its judgment for that of the free market, and when it acts it must do so only with clear legislative authority."

The net neutrality rules would also apply to the wireless industry, something CTIA, the wireless industry association, is against. In a telephone interview this week, CTIA President and CEO Steve Largent said there is also no evidence that the wireless industry needs regulating.

"No one's been able to give a specific example of harms regarding net neutrality with the wireless industry; the entire industry," Largent said. "This is really a thriving and innovative industry that continues to work."

The net neutrality proceeding at the FCC dates back to 2007, when Comcast was accused of blocking access to BitTorrent. Under former Chairman Kevin Martin, the FCC opened an investigation and handed down an enforcement action against Comcast in August 2008, citing the FCC's Internet Policy Principles.

The enforcement action required Comcast to be more transparent about its network management policies, a condition with which Comcast complied. Comcast appealed the ruling, however, because it said the FCC did not have the authority to hand down such an action. The FCC based its ruling on Internet Policy Principles - not actual FCC rules - in making its judgment, and it was therefore invalid, Comcast argued.

While that appeal made its way through the courts, new FCC Chairman Julius Genachowski proposed formal net neutrality rules that would ban Internet service providers (ISPs) from discriminating against specific applications or programs - and make the Internet Policy Principles official FCC rules. ISPs would be able to use reasonable network management, but could not block programs like BitTorrent.

Most commenters agreed with the principle of net neutrality - the idea that everyone should have equal access to the Internet. Disagreements emerged, however, over whether the government should get involved or whether the Internet industry should self-regulate.

The FCC's net neutrality proceeding was moving along, with a deadline for final comments of April 8, when the court of appeals ruled in Comcast's favor on April 6. The FCC had no right to regulate Comcast's network management, the court said, prompting many to question whether the FCC would be able to move forward on net neutrality and its national broadband plan. In the wake of that decision, the deadline for comments on net neutrality was extended to April 26.

Free Press, which filed the original complaint against Comcast, issued its support for the net neutrality proceeding Monday.

"American consumers need the FCC to act as their champion and to stand up against the powerful phone and cable companies that would rather sacrifice economic growth and the common good in a short-sighted attempt to protect their old business models," Free Press research director S. Derek Turner said in a statement. "It's clear that violations of the open Internet are ongoing and kept secret from consumers. If the FCC fails to establish basic rules of the road, we can expect much more of the same from broadband providers."

Meanwhile, the Computer & Communications Industry Association (CCIA) - which includes Google, Microsoft, Yahoo, AMD, Facebook, and eBay in its membership - said that consolidation among wireline, wireless, cable, and satellite companies "makes rules to preserve Internet freedom and openness more critical than ever."

"The failure of the previous Commission to maintain enforceable, basic rights to nondiscriminatory Internet access should not mean consumers must simply give that up and allow their online experience to be hijacked by financial deals between Internet access providers and Web sites," Ed Black, CCIA president and CEO, said in a statement. "The FCC must restore its original role of public interest watchdog, looking out for consumer and small business broadband Internet connection rights."

CTIA's Largent, however, said his organization is trying to show the FCC commissioners "that there's no evidence of harm, that we're producing great results and that wireless really is a different technology." He added, "Net neutrality rules could have the ability to kind of mess up what we're doing, and we don't want that to happen, and I don't think they want that to happen."

"If anything is going to happen on net neutrality, it needs to emanate from Congress and not the FCC, and so we're real busy in the process of educating both the commission and members of the Hill and the House and the Senate."

In the wake of the Comcast decision, the FCC is reportedly considering a rather technical solution that would essentially allow it to classify Internet service as a telecom service, not an information service.

The National Association of Manufacturers (NAM) held a joint call with the US Chamber of Commerce and trade group TechAmerica on Monday, urging the FCC to re-think that option.

The FCC needs to reflect and "should refrain from moving forward absent clear congressional authority," according to NAM's Marc-Anthony Signorino. The FCC should focus on "fostering innovation and job growth," he said.

File Sharers Are Young, Male, and Good-Looking

Excerpted from NewTeeVee Report by Janko Roettgers

Looks like online infringement isn't just for zit-faced teenagers anymore: the act of downloading videos, music, e-books and other goodies is most popular with male users between the ages of 20 and 29, according to a new survey by Germany's GfK Panel Services that was commissioned by IFPI.

One quarter of all males in this age bracket use file-sharing networks to download content, followed by 17% of males aged 30-to-39.

Gender issues aside, about 7% of all Germans admit to using file-sharing networks to download content, which comes to a total of about 4.5 million Teutonic downloaders.

Teenagers download just slightly more than the average user, with 9% stating that they're downloaders.

Also remarkable: 7% of users aged 40-to-49 are active users of file-sharing networks. All of that sharing happens despite many users being fairly certain that at least part of their behavior is violating copyright laws.

87% of all consumers questioned stated that it's likely not lawful to share files through social networks, and 94% believe it to be unlawful to offer files for download via BitTorrent. GfK registered similarly high percentages for the perceived unlawfulness of offering files for download via blogs, newsgroups, one-click host sites and other types of file sharing services. Of course, that doesn't mean that simply accessing a file shared through a one-click host carries the same type of stigma.

One should also note that there is likely to be a high number of users afraid to state that they're sharing, even if asked by a market research company. Not only are Germany's major labels still pursuing lawsuits against file sharers, unlike their counterparts in the US, but individual rights holders have been partnering with a number of companies that specialize in hunting down file sharers for profit, resulting in an avalanche of lawsuits that has by now reached the hundreds of thousands.

However, all of these lawsuits haven't helped to turn around the fate of Germany's entertainment industry, with music sales declining 3.3% last year.

These losses are now impacting anti-piracy market research as well. Germany's music industry used to release its annual "Brennerstudie" (burner report) for free. This year, it made only the above-mentioned demographic numbers publicly available and instead decided to sell the full report for $200.

Fair Use Generates $4.7 Trillion for US Economy 

Excerpted from Slashdot Blog Report by Hugh Pickens

"The Hill spotlights a study released by the Computer & Communications Industry Association (CCIA), which concludes that companies relying on fair use generate $4.7 trillion in revenue to the US economy every year.

The report claims that fair use - an exception to copyright law that allows limited use of copyrighted materials - is crucial to innovation. It adds that employment in fair use industries grew from 16.9 million in 2002 to 17.5 million in 2007, and that one out of eight US workers is employed by a company benefiting from protections provided by fair use.

Rep. Zoe Lofgren (D-CA) says the reasonable fair use of content needs to be preserved; otherwise, content owners will control access to movies, music, and art that will no longer be available for schools, research, or web browsing. Lofgren tied the copyright issue with the question of net neutrality.

Without net neutrality 'content owners will completely control and lock down content. We're going to be sorry characters when we actually don't see fair use rights on the web,' says Lofgren. 'If we allow our freedom to be taken for commercial purposes, we will have some explaining to do to our founding fathers and those who died for our freedom.'"

Coming Events of Interest

Digital Hollywood Spring - May 3rd-6th in Santa Monica, CA. Digital Hollywood Spring (DHS) is the premier entertainment and technology conference in the country covering the convergence of entertainment, the web, television, and technology.

GenosTV Gala Media Event - May 5th in Santa Monica, CA. ShambroWest Corporation invites all P2P & CLOUD MEDIA SUMMIT delegates to an exclusive dinner previewing the revolutionary GenosTV for the Digital Hollywood community.

P2P & CLOUD MEDIA SUMMIT - May 6th in Santa Monica, CA. The DCIA presents its fifth annual seminal industry event as a conference within DHS, with the subject matter now expanded for the first time to include cloud computing, the most advanced and rapidly growing distributed computing technology.

Broadband Policy Summit VI - June 10th-11th in Washington, DC. The most comprehensive, in-depth update about the implementation of the FCC's National Broadband Plan. No other forum provides the detailed coverage, expert insight and networking opportunities you'll receive at Broadband Policy Summit VI. The expanded program includes top-notch faculty who will address the most pressing broadband issues in six panel discussions, two debates and four keynote addresses.

Digital Media Conference East - June 25th in McLean, VA. The Washington Post calls this Digital Media Wire flagship event "a confab of powerful communicators and content providers in the region." This conference explores the current state of digital media and the directions in which the industry is heading.

Digital Content Monetization 2010 - October 5th-6th in New York, NY. DCM 2010 is a rights-holder focused event exploring how media and entertainment owners can develop sustainable digital content monetization strategies.

Copyright 2008 Distributed Computing Industry Association