Distributed Computing Industry
Weekly Newsletter

August 2, 2010
Volume XXXI, Issue 9


DreamWorks Signs Cloud Computing Deal

Excerpted from Channel Register Report by Chris Mellor

DreamWorks SKG has signed a multi-year deal with Cerelink for cloud computing access.

Instead of rendering movies like "How To Train Your Dragon" on thousands of its own computer cores, DreamWorks will use elastic compute resources housed in Cerelink's supercomputing-class facility at the New Mexico Applications Center (NMCAC).

"Elastic" cloud computing allows clients like DreamWorks SKG to dynamically adjust technical capacity to meet their real-time business needs.

Cerelink is a high performance cloud computing (cloud HPC) provider to the motion picture industry. It provides private clouds for rendering and other content creation and management applications, based on a combination of data-center space, scalable high performance computing, and networking, delivered as Infrastructure-as-a-Service (IaaS) and Platform-as-a-Service (PaaS).

The Cerelink facilities include access to several thousand square feet of secure data-center space located in Rio Rancho, NM. That space is fed by redundant electrical power grids. It has access to LambdaRail, the 12,000-mile US coast-to-coast fast broadband network, and to the Encanto supercomputer.

Encanto offers a theoretical peak speed of 172 teraflops from its Altix ICE 8200 cluster, with 133 teraflops sustained operation. The ICE 8200 consists of 1,792 nodes (14,336 cores) of quad-core Xeon 3.0 GHz processors housed in 28 racks.
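
As a rough sanity check, the quoted peak figure is consistent with the node and core counts given above, assuming 4 double-precision floating-point operations per core per cycle, which is typical for Xeons of that generation. A minimal sketch of the arithmetic, in Python:

    # Back-of-the-envelope check of Encanto's quoted peak figure.
    # Assumes 4 double-precision FLOPs per core per cycle (an assumption
    # typical of 3.0 GHz quad-core Xeons of that era); the 133-teraflop
    # sustained figure comes from benchmarks, not from this arithmetic.
    nodes = 1792
    cores_per_node = 8            # two quad-core Xeons per node
    clock_hz = 3.0e9              # 3.0 GHz
    flops_per_cycle = 4           # assumed per-core throughput

    cores = nodes * cores_per_node                 # 14,336 cores
    peak = cores * clock_hz * flops_per_cycle      # operations per second
    print(f"{cores} cores -> {peak / 1e12:.0f} teraflops peak")
    # prints: 14336 cores -> 172 teraflops peak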

Cerelink's private cloud computing service was used by DreamWorks Animation to render parts of "Shrek Forever After" and "How to Train Your Dragon" this year. Cerelink itself was founded by a group of ex-Intel managers in 2005.

James Ellington, its CEO, said, "We forecast growing our technical capacity by 20 times by the end of 2011 - this will create one of the largest cloud computing arrays for motion picture production in the world."

This represents a threat to suppliers of in-house HPC compute and storage facilities to the movie rendering industry, such as BlueArc, DataDirect, Dell, HP, Isilon, and NetApp. If their customers start hiring rendering and animation HPC capacity from service suppliers such as Cerelink, then there will be less demand for in-house solutions.

It is some distance from the movie mecca of Hollywood to New Mexico, raising the question of why movie moguls should do their rendering and animation there.

The State offers film production incentives, such as a 25% tax rebate, for projects done in the state. Also, Cerelink and the NMCAC, in collaboration with the University of New Mexico and the New Mexico Department of Information Technology, operate an ultra-high-speed network connection between New Mexico and Hollywood.

Cerelink says that, because of this link, Hollywood should hire its elastic compute cloud in New Mexico to do rendering and animation work more cheaply than in California.

uTorrent Adds Web Interface Support for iPad

Excerpted from PadGadget Report

If you happen to be a peer-to-peer (P2P) addict, the makers of the popular BitTorrent client uTorrent have just renovated their client's web interface: the app now lets you remotely access your ongoing transfers directly from your iPad (as well as iPhone and Android).

Using the feature is easy: all you need to do is set up a username and password in your uTorrent client (under Options > Preferences > Web), and then go to http://web.utorrent.com with your iPad.

Note that you will need the latest development build of the client to play with the web interface (3.0 alpha), which is only available for Windows at this stage - Mac users will have to be more patient to play with the feature.

Report from CEO Marty Lafferty

While representatives of major Internet service providers (ISPs) and consumer advocacy groups met privately with US Federal Communications Commission (FCC) officials this week to discuss potential government action regarding open Internet principles, Congressman John Dingell (D-MI) and others continued to express criticism of the Commission's unilateral attempt to regulate broadband.

Since the DCIA's inception seven years ago, there has been enormous expansion in both broadband deployment and the entirely new phenomenon of entrepreneurs creating businesses that contribute substantially to our economy while relying exclusively on the Internet for their existence.

We must all work together now to protect these new web-based models at the same time as we continue to incentivize ISPs to invest in network build-outs and further development of broadband services. These two growth engines go hand-in-hand. Clearly one cannot thrive without the other, and successful collaboration in this arena is of very great benefit to consumers.

As Amazon's Paul Misener concluded last week in his CNET piece, "Despite the continuing polarized debate, all three major groups of stakeholders - network operators, consumers, and content providers - would be better off with clear, balanced rules that prohibit harmful discrimination among content but also allow network operators to provide performance enhancement on equal terms, so long as it does not degrade the performance of other content. This would be a win-win-win solution, without compromise."

And as AT&T's Jim Cicconi blogged late last year, based in part on input from a number of additional stakeholders, it should be possible to reach consensus for a path forward to reach the middle ground reflecting such a triple win.

Among the topics discussed behind closed doors at the Commission this week were the transparency of broadband Internet service performance, network management practices, the treatment of specialized services, and prohibitions on blocking lawful content.

Of great concern to many distributed computing industry participants and observers is the implication that this last topic signals the commencement of a problematic new regime that would mandate the monitoring of online content to determine whether particular instances of usage were authorized, and then filtering it accordingly. In our view, this is an area that first needs to be addressed with new business models for content replication and distribution that take advantage of new technologies, rather than by acting prematurely in an attempt to limit or curtail them.

Nevertheless, the DCIA generally lauds efforts to develop an alternative to the FCC's proposal to reclassify broadband from an information service to a telecommunications service.

It has been two months since Congressman Dingell initially requested an explanation from FCC Chairman Genachowski regarding the FCC's so-called "third-way" proposal for regulating the Internet. After sending a second letter last week, he finally received a response, to which the Democratic Congressman replied in part as follows:

"Unfortunately, the paucity of substantive responses to my aforementioned questions in your recent letter has served only to substantiate my fear that the Commission's proposed path with respect to the regulation of broadband is based on unsound reasoning and an incomplete record.

I worry that hurried action by the Commission to complete a rulemaking or issue a declaratory ruling concerning the classification of broadband Internet access services in the absence of a clear statutory mandate from the Congress will result in poor policy and protracted litigation, which itself will confound the Congress' and the Commission's efforts to encourage further investment in broadband infrastructure, create new jobs, and stimulate broadband adoption as we seek to implement network neutrality rules.

With this in mind, I reiterate my suggestion that the Commission abandon the classification effort it has set in motion and instead seek the authority it requires by asking the Congress to enact a statute that clearly delegates such authority. In this way, the Congress and Commission may ensure the steadfast legal foundation for an open Internet."

The Congressman's view is echoed by Kyle McSlarrow, President and CEO of the National Cable & Telecommunications Association (NCTA), who said his group sees "great danger in attempting to shoehorn modern broadband services into a depression-era regulatory regime without serious collateral effects to investment, employment and innovation."

There truly does not seem to be a substantive legal basis for Chairman Genachowski's proposal to increase his Commission's regulatory authority over the Internet unilaterally. And Congressman Dingell is currently among 282 Members of Congress - including 73 other House Democrats - who are opposed to this approach.

The consensus among lawmakers seems to be growing that legislation is required to better delineate FCC authority with regard to the Internet. Chairman Genachowski would be well served to acknowledge this, and defer to Congress to decide whether or not to pass laws that would clarify for the Commission a framework for the authority it has so aggressively sought to command.

The clearest path for Congress to take would be to revise or amend the Telecommunications Act of 1996 to ensure that the regulations governing broadband networks not only are fair and reasonable for the present, but also do not constrain untapped potential for the future in terms of continued growth and innovation by being bound to an outdated infrastructure.

As it stands, the Commission's reclassification proposal would at best create significant regulatory uncertainty. The final round of public comments on the FCC's proposal is due August 15th. We urge DCINFO readers to get involved with the process and voice your opinion at the agency and on Capitol Hill. Share wisely and take care.

Full Length Shows: Shifting the Dynamics of Online Video Viewing

Excerpted from Video Insider Report by Ian Blaine

The popularity of user-generated video remains one of the most profound phenomena of the digital age - but not necessarily of the digital economy. Consumers are demanding, and media companies are providing, access to more premium video content online than ever before. After several years of experimentation, it's clear that the real money in online video remains where it always has been historically - with professionally produced content created by leading media and entertainment companies.

While it's very easy to become enamored with the latest or coolest technological innovation, from the iPad to Internet-connected TV, when it comes to video, they are all just a means to an end. People are interested in watching quality, premium shows online -- plain and simple.

One of the most telling signs of this dynamic is the sustained growth in the number of people watching full-length television shows. According to a recent report issued by eMarketer, 33% of online adults in the United States, or 58.9 million people, will watch full-length television shows online this year on a monthly basis. Among adults who already watch video regularly online, the rate is 50%. In 2011, eMarketer forecasts the rate among these groups will rise to 39% and 56%, respectively.

Online video isn't just about clips anymore, and the business of online video certainly isn't. As a CEO whose company is fortunate to help manage online video for numerous programming companies and television service providers, I see this firsthand as well. Of the billions of premium video views we manage annually, we're seeing average viewing time increase across the board. And, I am more excited than ever about where things are headed.

Like most web enthusiasts, I want as much video online and accessible as possible. As an industry veteran, I also recognize that it takes money to produce all the shows that we love, and that there are fundamental differences in terms of costs and revenue opportunities for how and where these shows are distributed. TV is still the dominant target for ad and subscription dollars, and so far broadband video is not generating the same financial results due to lower ad loads, fractured audiences, etc. I look forward to the day when technology and financial models evolve so that the kind of screen on which the video plays doesn't matter as much.

But, as a technologist, I also know the journey behind the scenes will be a bumpy one. In fact, I can hardly remember a time when the technology camps were more fragmented. The way video is presented to consumers is fragmented, with technologies like Flash, HTML5, or the upcoming WebM all diverging. The devices and operating systems on which video is presented to consumers are fragmenting as well, with mobile devices, game consoles and connected TVs powered by competitive offerings from the likes of Apple, Google, and Microsoft all enriching the market but by the same token complicating the media distribution and consumption life cycle. And, these are just two areas of fragmentation from a laundry list of issues confronting the industry.

While this fragmentation provides consumers and media companies with many choices, it can easily dampen the rate at which premium video is enjoyed more ubiquitously by consumers. Fortunately, there are some underlying technological systems that can help mask some of this complexity and let consumers enjoy long form video in a user-friendly way. And, most technologists that I know haven't forgotten that our products are just the means to the end for consumers -- who simply want to enjoy great content. We all have our work cut out for us, but the payoff for consumers and media companies will be worth it.

In the end, recent reports from both eMarketer and Nielsen show that people are clamoring to watch more premium shows on television and online. With one third of online adults watching full-length TV shows, it's clear that premium, long-form video has earned its vaunted new status in the online video landscape.

Cloud Computing in Plain English

Excerpted from DABCC Report

Around ten years ago, the application service provider (ASP) model was being touted as the next great wave of technology. The ASP model was trying to do software-as-a-service (SaaS), but it didn't ever really live up to the hype.

So here we are a decade later. What has changed? Technology has grown up, and hardware is dramatically more powerful than it was ten years ago; the ASP model simply didn't scale or make economic sense back then.

We now have improved bandwidth, more powerful hardware, and the secret sauce: server virtualization. Virtualization means we can slice up hugely powerful hardware into digestible chunks and sell it off to multiple customers. The concept of multi-tenancy (more than one company sharing the same computer) was not achievable without server virtualization. Multi-tenancy is key to the large cloud models.

There is a lot of noise around cloud - this-as-a-service, that-as-a-service, even X-as-a-service (XaaS)! What does this all mean, and how is this so different from existing outsourcing models?

The National Institute of Standards and Technology (NIST) has released a set of documents around the topic of Cloud Computing. I waded through this material and decided to put an interpretation around it. NIST breaks cloud computing down into five essential characteristics; three service models; and four deployment models.

Wikipedia breaks it down into nine "key features" plus the same service and deployment models, so exactly what is required to define something as cloud still looks like it is solidifying, but at least there seems to be consensus on the service and deployment models. Additionally, the "essential characteristics" mentioned here are largely part of one or more of the "key features".

The Five Essential Characteristics of Cloud Computing are those things that are required of a service to make it qualify as true "Cloud Computing". In other words, if it doesn't do this, it isn't cloud computing.

1. On-demand self-service. The ability for the client to provision resources for itself on demand, such as storage or server capacity.

2. Broad network access. Here we are talking about the proliferation of devices at the end-point that can receive the service.

3. Resource pooling. Shared access to computing power spread across multiple geographic locations - multi-tenancy.

4. Rapid elasticity. Scale up or back on demand. Burst capability.

5. Measured service. Resources provided as a controlled, metered service to the client (see the sketch below).
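
To make "rapid elasticity" and "measured service" concrete, here is a minimal, purely illustrative autoscaling-and-metering loop in Python. The CloudProvider class and its methods are hypothetical stand-ins, not any real vendor's API.

    # Illustrative only: a toy policy showing rapid elasticity (scale up
    # or back on demand) and measured service (pay for what you used).
    # CloudProvider is a hypothetical stand-in for a real IaaS API.

    class CloudProvider:
        def __init__(self):
            self.servers = 1
            self.billed_server_hours = 0.0

        def scale_to(self, count):              # rapid elasticity
            self.servers = max(1, count)

        def meter(self, hours=1.0):             # measured service
            self.billed_server_hours += self.servers * hours


    def autoscale(provider, target_load_per_server, observed_load):
        # On-demand self-service: the client sizes capacity to the load
        # it observes, then is billed only for the capacity it held.
        needed = max(1, round(observed_load / target_load_per_server))
        provider.scale_to(needed)
        provider.meter()
        return needed


    provider = CloudProvider()
    for load in [120, 480, 900, 150]:            # requests/sec, hour by hour
        n = autoscale(provider, target_load_per_server=100, observed_load=load)
        print(f"load={load:4d} req/s -> {n} servers")
    print(f"billed: {provider.billed_server_hours} server-hours")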

Three Service Models

1. Software-as-a-service (SaaS). The provider hosts the applications and gives clients access to them.

2. Platform-as-a-service (PaaS). The provider presents an operating system layer and possibly an application toolset; the customer can then add customizations and/or applications.

3. Infrastructure-as-a-service (IaaS). The provider presents low level resources to the client, who can then build out operating systems and application layers.

Four Deployment Models

The four deployment models describe where the services sit in relation to the customer and the customer community, including physical and security boundaries.

By adopting common standards, loads could be ported among clouds. Bear in mind the concept of "bursting," analogous to bursting in network quality of service (QoS) terms: in periods of high load, I could move some of my load to a community or public cloud.

1. Private cloud. The customer or provider creates cloud infrastructure for the private use of the customer.

2. Community cloud. Similar to a private cloud, the community model creates a cloud for use by a defined community of customers only.

This could, for example, apply to multiple subsidiaries sharing some cloud services.

3. Public cloud. The customer accesses cloud services that are made broadly available by the provider.

4. Hybrid cloud. A hybrid cloud comprises two or more of the other models. Commonality between the clouds means that loads can be moved and/or spread among them.

After studying the "Essential Characteristics," I realized that a lot of technology being sold as "Cloud" isn't really cloud - a lot of things are components of cloud but cloud, according to these requirements, is a lot more than "stuff you can get over the Internet".

This is a very high-level outline of the topic of cloud computing, but if Gartner, IDC, etc. are to be believed, then cloud is looming large on the horizon and we would all do well to have some basic idea of what it is.

Adobe AIR 2.5 Brings P2P Flash Video Chat to Android Devices

Excerpted from Tech Pinger Report

One of the latest solutions brought into the wild by Adobe is the demo software called "FlashTime," which was built on the upcoming AIR 2.5, and which came to light as a P2P video chat client aimed at taking full advantage of the camera included in one's mobile phone.

The solution was recently demonstrated on smart-phones running under Google's Android operating system and seems promising.

Mark Doherty, the Platform Evangelist for Mobile and Devices at Adobe, has been testing out the new features that came with the Adobe AIR 2.5 beta, including an Android app, built with AIR 2.5, that allows Android smart-phone users to make P2P video calls.

Regarding details of the P2P video calling app, he said that the code will be made open-source once it is stable, probably within the next week or so. It's currently still a prototype, so don't expect to see such an application appearing on your Android device anytime soon.

Top Ten Cloud Computing Venture Capital Firms

Excerpted from CloudTweaks Report

NEA has been helping to build great companies for more than 30 years. Its committed capital has grown to $11 billion and NEA has funded more than 650 companies in the Information Technology (IT), Energy Technology, and Healthcare sectors.

Norwest Venture Partners (NVP) has actively partnered with entrepreneurs to build and grow successful businesses for almost 50 years. The firm manages more than $3.7 billion in capital, has funded over 450 companies since inception and has demonstrated an exemplary track record producing premier investment returns during differing capital market environments.

US Venture Partners (USVP) has helped build great companies for three decades. Since its inception in 1981, USVP has invested over $2.7 billion in about 450 companies. Throughout, USVP's partners have worked diligently and consistently with early-stage companies, many of which have become industry leaders.

Ignition Partners is a venture capital firm dedicated to helping the best entrepreneurs seize opportunity. From turning their early idea into a business, to hiring the right team, providing the right industry and functional insight and connections, to growing the business strategically, globally, financially, to realizing the best ultimate outcome, Ignition is ready to go the distance. Ignition invests in emerging and future leaders in communications, Internet, software, and services across business and consumer targets.

Sequoia Capital in the US caters to the founders and management teams who have selected it as their business partner. Sequoia has learned that the only way to help develop a fabulous company is one step at a time. This only happens if the company makes wonderful products or delivers a service that thrills large numbers of customers. If that occurs, then founders, management, and employees of these companies prosper. It is only then that the investor deserves to be rewarded. It has to happen in that order. There are no shortcuts.

First Round Capital is an early-stage venture capital firm. As a seed-stage investor, it often provides a company's first outside capital - and typically invests alongside angel investors. Its typical initial investment in a company is around $500,000 - but it has gone both higher and lower. First Round is not afraid of investing in pre-revenue companies, and understands the challenges of launching a new product. That's why it likes to take an active role in the companies it invests in.

Mission Ventures helps build successful enterprises in Southern California and creates superior returns on investment for its investors. This is accomplished by investing in the most promising early-stage companies in high growth, emerging markets, and providing significant assistance to those companies as they develop.

DAG Ventures is a venture capital partnership investing in and helping outstanding entrepreneurs create leading, long-term companies across a range of markets. With roots from the 1980's in cable TV, infrastructure, media, and wireless industries, the partnership today is privileged to work with world-class entrepreneurs as they build tomorrow's leaders in the information technology, energy, and life science sectors. DAG Ventures invests in companies with proven technology, from the prototype stage onward.

Hummer Winblad Venture Partners was founded in 1989 as the first venture capital fund to invest exclusively in software companies. Through its history, it has had the opportunity to invest in the pioneers and leaders of several generations of software applications, architectures, delivery methods and business models. It has helped entrepreneurs build companies in desktop software, embedded systems, client-server, distributed network computing, Internet, software-as-a-service (SaaS), and cloud computing.

Shasta Ventures was formed expressly to help entrepreneurs build great companies; its primary objective is to provide outstanding service to the companies in its portfolio. That means Shasta has the time to work with early-stage companies because it serves on a limited number of boards. And it means Shasta cares about the companies it invests in - not only the businesses, but the people as well.

BayTSP Appoints Stuart Rosove as CEO

BayTSP's Board of Directors has appointed Stuart Rosove as CEO of the company, succeeding founder Mark Ishikawa, who will continue to serve as a Director.

BayTSP is the global leader in providing the media and entertainment industries with the most comprehensive commercial search and discovery services for analyzing the impact of digital media online, delivering the best anti-piracy measures, and enabling effective marketing decisions through insightful business intelligence.

"The company appreciates Mark's work and contributions made over the last ten years and recognizes that his entrepreneurial drive and spirit put BayTSP at the forefront of the industry," said Larry Hootnick, Chairman. The Board was unanimous in its decision to recruit a new leader to take the company to the next level.

"Stuart's background and record of success in digital media and security made him a natural candidate," said Hootnick. "The Board supports his vision for strengthening the company's core competence and expanding to new, but related markets domestically and internationally while maintaining our reputation of excellence in customer service."

Rosove brings more than 15 years of senior management experience in technology infrastructure and security, digital media and intellectual property licensing. Prior to joining BayTSP, Rosove was Vice President, Media and Entertainment with Digimarc, where he was responsible for intellectual property (IP) licensing, market development, and strategic partnership development. In that role, he worked closely with Digimarc licensees, and various stakeholders in the digital media delivery ecosystem to promote the adoption of digital watermarking for managing, protecting and enhancing digital content. Rosove was also responsible for TVAura, the joint venture between Digimarc and the Nielsen Company which was announced in 2009.

Prior to Digimarc, Rosove was President and CEO of Activated Content Corp., a leading supplier of digital watermarking solutions to the music industry. Rosove was also CEO of AudioTrack Watermark Solutions and founder and CEO of Sequel Technology Corp. He has held senior management positions at Delrina and QNX Software. Rosove has completed the Executive Program at Stanford University Graduate School of Business, holds a Bachelor's Degree in Journalism from Carleton University in Ottawa, Canada and a Bachelor's Degree in English from the University of Manitoba in Canada.

New Cloud Computing Market Report

A new report indicates that the market for cloud computing services will reach $222.5 billion by 2015, fueled by end-users modernizing their networking infrastructure, further proliferation of the Internet and the tumultuous economy. 

Those factors combine to create a perfect storm in which companies will upgrade their networks to cut costs and boost performance, the report indicates. Solution providers stand to gain from the predicted cloud computing services market explosion, as a large chunk of the $222.5 billion will move through that channel. 

Analysts noted that the cloud computing services charge will be led by marquee cloud vendors including Amazon Web Services, Google, IBM, Microsoft, Rackspace, Salesforce.com, and many others.

This Cloud Computing Services report analyzes the global market for cloud computing services by the following service segments: Application Services, Business Process Services, and Infrastructure Services (Application Infrastructure and System Infrastructure). Annual estimates and forecasts are provided for the period 2006 through 2015.

The Cloud Computing Services report profiles 92 companies including 3tera, Akamai Technologies, Amazon Web Services, Dell, ENKI, Flexiant, Google, Hewlett-Packard Development Company, IBM Corporation, Joyent, Layered Technologies, Microsoft Corporation, Netsuite, Novell, OpSource, Oracle Corporation, Rackspace Hosting, Red Hat, Salesforce.com, Skytap, and Terremark Worldwide. 

Market data and analytics are derived from primary and secondary research. Company profiles are mostly extracted from URL research and select reported online sources.

Passware Leverages Distributed Computing for Password Recovery

Passware, a provider of password recovery, decryption, and electronic evidence discovery software for corporations, law enforcement organizations, government agencies, and private investigators, announces Passware Kit 10.1. This is the first commercially available software that cracks passwords for the most difficult-to-decrypt file types - RAR archives and TrueCrypt hard disks - using advanced acceleration methods provided by graphics processing unit (GPU) cards.

Such GPU cards from NVIDIA are available in most laptops and computers. By using just a single card, Passware Kit can accelerate password recovery for the strongest encryption used in RAR archives, TrueCrypt volumes, and MS Office 2010/2007 documents. Passware supports multiple cards simultaneously, which further reduces password recovery time.

Users can now dramatically reduce the time-consuming process of cracking strong passwords by leveraging the advanced acceleration methods offered by GPU cards and the power of hardware-accelerated distributed password recovery. The current version of Passware Kit builds upon its ability to accelerate distributed password recovery using both GPUs and Tableau TACC1441 hardware, exploiting the computing power of multiple computers for superior performance.
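
Conceptually, distributed password recovery is an embarrassingly parallel search: a coordinator splits the candidate space across machines (or GPU cards), and each worker tests its slice independently. The sketch below is a generic illustration of that idea only, not Passware's implementation; the hash check and candidate generator are hypothetical stand-ins for the real archive and volume formats.

    # Generic illustration of distributed password recovery. The candidate
    # space is partitioned by first character, and each worker searches its
    # slice in parallel. This is NOT Passware's implementation; the SHA-256
    # check below is a hypothetical stand-in for testing a RAR or TrueCrypt key.
    import hashlib
    import itertools
    import string
    from multiprocessing import Pool

    TARGET = hashlib.sha256(b"zzq").hexdigest()   # the "unknown" password is zzq
    ALPHABET = string.ascii_lowercase
    LENGTH = 3

    def check(candidate):
        return hashlib.sha256(candidate.encode()).hexdigest() == TARGET

    def search_slice(first_char):
        # One worker's share: every candidate starting with first_char.
        for rest in itertools.product(ALPHABET, repeat=LENGTH - 1):
            candidate = first_char + "".join(rest)
            if check(candidate):
                return candidate
        return None

    if __name__ == "__main__":
        # In a real deployment each slice would go to a separate machine
        # or GPU card; here a local process pool plays that role.
        with Pool() as pool:
            for found in pool.imap_unordered(search_slice, ALPHABET):
                if found:
                    print("recovered:", found)
                    pool.terminate()
                    break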

The new product enhancements result in record password recovery speed, when compared to competitive products. The time to recover passwords for strong RAR 3 archives is reduced 10 times, with the speed of more than 2,500 passwords per second with just a single FERMI card by NVIDIA. Recovery of passwords for TrueCrypt disks occurs 45 times faster than before, without using GPU technology.

"With the ability for distributed computing to harness the computing power of multiple workstations to work on a single password or key problem, we address the inherent challenge of cracking strong passwords and encryption," said Dmitry Sumin, president of Passware, Inc. "The ability to utilize GPU cards accelerates password recovery in record time, which gives law enforcement and government officials the added benefit of dramatically reducing the time-consuming process of cracking strong passwords."

Passware Kit 10.1 uses all types of GPU cards, including FERMI - the latest GPU-based supercomputing technology by NVIDIA. It also supports multi-core CPUs and Tableau TACC hardware accelerators. With Distributed Password Recovery, all the hardware available over the network is used effectively to recover a single password. The Distributed Password Recovery can be applied to over 40 different file types, including ones with strong encryption, such as MS Office documents, Zip and RAR archives, and TrueCrypt containers.

Passware Kit 10.1 is available now from Passware and an expanded network of resellers in the US, Europe, and Asia, with Australia, New Zealand, China, and South Korea recently added. The manufacturer's suggested price for the Forensic edition starts at $795 with one year of free updates.

Cloud Working Group Developing Standard APIs

Excerpted from Information Week Report by Charles Babcock

The standards body formerly known as the Desktop Management Task Force, now the Distributed Management Task Force (DMTF), is trying to cut through the varied and conflicting terms used in cloud computing to supply both a common vocabulary and a set of public application programming interfaces (APIs) that could be used by many cloud vendors to supply standard cloud services.

In two documents issued Monday, "Architecture for Managing Clouds" and "Use Cases and Interactions for Managing Clouds," the DMTF lays out what it has concluded are the essential functions for cloud computing and the language that can be used to describe them.

The documents were produced by a unit of DMTF known as the Open Cloud Standards Incubator, formed in April 2009, and will serve as the groundwork for the next step: drafting APIs for infrastructure-as-a-service (IaaS) through a newly appointed Cloud Management Workgroup.

"If we come up with a good API, all the cloud suppliers would be able to implement it," said Winston Bumpus, President of the DMTF. The specification would actually cover a set of APIs, with an interface for each phase of cloud operation, such as one for handling the submission of an external workload, loading it into a virtual machine, starting the virtual machine, storing its results, and terminating it.

In the future, predicted Bumpus, cloud users will face an array of different suppliers that can all be used in the same way, without stopping to reconfigure workloads or rework applications. It will be possible to move from cloud to cloud, invoking a set of standard APIs.
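
Purely as an illustration of how such a standard set might look to a client, the sketch below walks a workload through the phases Bumpus describes as REST calls. The base URL, endpoints, and payload fields are hypothetical; they are not the DMTF specification, which the new work group has yet to draft.

    # Hypothetical sketch of a REST-style workload lifecycle, following the
    # phases described above. Endpoints, fields, and the base URL are invented
    # for illustration and are not the DMTF's (still unwritten) APIs.
    import requests  # third-party HTTP client

    BASE = "https://cloud.example.com/api"        # hypothetical provider

    def run_workload(image_url):
        # 1. Submit the external workload (e.g., a virtual appliance image).
        workload = requests.post(f"{BASE}/workloads", json={"image": image_url}).json()
        # 2. Load it into a virtual machine.
        vm = requests.post(f"{BASE}/machines", json={"workload": workload["id"]}).json()
        # 3. Start the virtual machine.
        requests.post(f"{BASE}/machines/{vm['id']}/start")
        # 4. Retrieve and store its results.
        results = requests.get(f"{BASE}/machines/{vm['id']}/results").json()
        # 5. Terminate the machine and release its resources.
        requests.delete(f"{BASE}/machines/{vm['id']}")
        return results

    # results = run_workload("https://example.com/images/render-job.ova")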

Today, pretty much the opposite is the case. VMware is busy providing cloud software to new suppliers who will run VMware virtual machines, while Citrix Systems is allied with Microsoft in producing a virtual environment that runs both Citrix XenServer and Microsoft Hyper-V virtual machines, each using varying APIs to admit and handle the workloads involved. "We're very optimistic we're going to solve this problem," said Bumpus in an interview. He is director of standards at VMware as well as President of the DMTF.

Such a grand plan lacks one backer, however, and that's Amazon Web Services, supplier of the market-leading EC2 infrastructure-as-a-service cloud. "No, they are not participating. I have reached out to them," Bumpus said. But he doesn't expect an API set from DMTF will have to go out and compete head to head with Amazon's own APIs. If the working group's APIs are well drafted and widely followed, the pressure will build on Amazon to support them.

The DMTF's incubator documents establish, for example, that a service catalogue will be basic to the operation of each cloud in the future, and that security will be resolved through communications with a security manager server.

Bumpus said participating companies have already submitted recommended APIs for many functions. VMware submitted the VMware Cloud API last September. In November, Fujitsu, a recent entrant in cloud computing, submitted its own API set. HP submitted APIs in January, Telefonica in March and Oracle in July.

Working with these varying API implementations will be difficult, but Bumpus points out that all are based on the Representational State Transfer (REST) approach to HTTP-based web services, and all try to accomplish similar goals. "There are differences, but not meaningful differences" that can't be resolved in committee, he said.

The resulting API set may one day allow the establishment of more public -- and private -- clouds, with users having a much firmer idea of how to interact with them. "I see a change in how we view and do computing," said Bumpus. Government requirements will help drive suppliers toward a common standard, enabling "new compatibilities that we find hard to imagine today," he said.

Judge Rules that Circumventing DRM Is Not Unlawful

Excerpted from Download Squad Report by Sebastian Anthony

In what will surely become a landmark case - or at least a massive thorn in the MPAA's and RIAA's sides - a judge has ruled that bypassing digital rights management (DRM) via hacking, reverse engineering, or any other means is not in itself unlawful. The ruling held that General Electric, in using hacked security dongles to repair some uninterruptible power supplies produced by another company, did not violate the Digital Millennium Copyright Act (DMCA).

Why? Because the end goal was legal. If the hacked dongles had been used for the forces of evil, the story would be different.

While this doesn't sound immediately applicable to DRM-protected software, music, and movies, bear in mind that the DMCA is the foundation for every spurious copyright claim made by RIAA, MPAA, and the myriad of other digital rights groups.

In essence, this ruling means that you're free to break DRM on media that you own. No longer is it unlawful to rip your own DVDs or crippled audio CDs onto your hard disk. I think there might also be some implication for the DRM used on contemporary games like "Assassin's Creed 2."

In case you were wondering, this doesn't make copyright infringement lawful. It just means that bypassing DRM to reach a legal goal - i.e., fair use of things you own - is now protected by common law.

Comprehending Cloud Computing

Excerpted from Business Line Report by Avijit Gupta

A CIO of a large manufacturing organization asked me recently, "What is this 'cloud computing' thing you people keep talking about? Is it something 'out there' somewhere on the Internet? What does it mean to me or for that matter to my organization? Is it outsourcing of application hosting?"

I responded by saying that cloud computing is frequently taken to be a term that simply renames common technologies and techniques that we have come to know in information technology (IT) all along. At times, it is interpreted to mean data-center hosting that permits near real-time, policy-based control of computing resources, or it may be interpreted to mean only data-center hosting rather than being understood as a significant shift in Internet applications. And she quipped, "You seem to be talking more Cirrus than Stratus!"

Thinking that she was perhaps getting a bit confused, which was not the intention, I explained it slightly differently by saying that the 'cloud' is a collection of Internet-based or private-network services, providing users with scalable, abstracted information technology capabilities, including software, development platforms, virtualized servers, and storage.

It can include different business, consumer, and personal applications with distinct characteristics. For ease of understanding, let us consider that you are a company located in a big shared office complex, where you pay the operational expenses based on the space and utilities that you use. In a cloud computing environment, payment is similarly variable, based on resource or infrastructure usage - something like a 'pay-as-you-go' model. That's one key characteristic.
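
As a toy illustration of that pay-as-you-go idea, the short Python sketch below computes a monthly bill from metered usage. The rates are invented for the example and are not tied to any real provider's pricing.

    # Toy pay-as-you-go bill: the charge follows metered usage rather than a
    # fixed capacity. All rates here are hypothetical.
    RATES = {
        "server_hours": 0.10,   # $ per server-hour
        "storage_gb":   0.05,   # $ per GB-month stored
        "bandwidth_gb": 0.08,   # $ per GB transferred
    }

    def monthly_bill(usage):
        return sum(usage[item] * rate for item, rate in RATES.items())

    quiet = {"server_hours": 720,  "storage_gb": 200, "bandwidth_gb": 50}
    busy  = {"server_hours": 4320, "storage_gb": 220, "bandwidth_gb": 900}

    print(f"quiet month: ${monthly_bill(quiet):.2f}")   # $86.00
    print(f"busy month:  ${monthly_bill(busy):.2f}")    # $515.00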

Another very notable characteristic of cloud computing is scalability. Just as you can rent more space in a building or get more electrical power based on your needs, the cloud computing environment offers 'immediate scalability' depending on usage. Cost, therefore, can be scaled up or down with relatively few penalties. Also, just as in a shared office building you would share space and facilities with many companies while keeping your own privacy inside your offices, a cloud computing environment allows numerous enterprises to subscribe to the computing capabilities while retaining privacy and security.

And I explained further, major cloud computing categories can include software-as-a-service (SaaS), which is a model of software deployment, whereby a provider licenses an application - Oracle, for example - to customers for use as a service on demand. Software vendors may host the application on their own infrastructure or download the application to the consumer device, disabling it after use or after the on-demand contract expires.

The second category is platform-as-a-service (PaaS), which is delivery of a computing platform and solution together as a service. This model facilitates the deployment of applications without the cost and complexity of buying and managing the underlying hardware and software.

As I paused, I got a volley of questions from her almost instantly: "So far so good, but are you saying there are no challenges? Does it mean that, thanks to cloud computing, our information no longer has to be stored on our computers because now it is in the cloud? Until now we had to be specialists in security to know what an antivirus was and how to install it on a computer. Now, with cloud computing, do we no longer have to worry about anything, because we can consider that our information is protected by someone else? Does it mean that lack of security would cease to be an issue?"

I knew questions like these would be coming from her at any time and, in anticipation, had already thought of mentioning some of the key operational and governance issues associated with the cloud computing environment.

I went about explaining that, as an organization, one would need to carefully consider issues such as control over and privacy of data - for example, who owns the data and how it is used - as well as how security is enabled, including audit trails, and how legal and compliance-related matters are handled.

Even though the information might be stored in the cloud, you still have to connect to it via a computer. And whatever computer you access it from may have security issues that need to be addressed. One can obtain relevant documentation on controls assurance from service organizations, and so on.

Recent research reports estimate that global revenues from cloud computing are expected to grow significantly in the near future. If security, operational, and governance issues are addressed adequately, cloud computing can offer significant benefits to your organization.

This cloud definitely has a silver lining.

Motion to Freeze LimeWire Assets Denied

Excerpted from Digital Music News Report

The RIAA will not be freezing LimeWire assets after all, according to a judge's order shared exclusively with Digital Music News. 

The order denies an RIAA motion but does not reverse the original decision against LimeWire. If nothing else, it offers needed breathing room during the damages phase. "While we believe this is a positive development in the case and one that certainly benefits our global user base, it doesn't alter our long-stated strategy," a LimeWire executive told Digital Music News. 

The company is currently developing its cloud-focused re-launch, an ambitious "plan B" being built under serious legal stress. "We will continue to work hard to garner the confidence and support of the music industry and other key stakeholders in the development of a new service that pushes the boundaries in digital music discovery and consumption."

The RIAA declined comment. Court documents may be available ahead of the weekend.

Back to Square One for US Version of Spotify

Excerpted from Erictric Report by Eric Calouro

Spotify is a popular P2P streaming music service in Europe. It's convenient, quick, and satisfying. We should know, as we've had access to the service since mid-2009 (and I've been using it every day, mind you).

But for the rest of us in the United States, Spotify is still not available - and it should have been available late last year, according to initial time frames put out by the company.

Billboard is reporting that Spotify negotiations with record labels in the United States have completely broken down, and the company is reportedly "back to square one." Certainly not good news, but the company is still said to be aiming for a late 2010/early 2011 launch stateside.

Of course, that launch is entirely dependent on the ongoing negotiations - and as of this moment, things are seemingly looking grim. More information as it becomes available.

Anti-ACTA Movement Gaining Momentum 

Excerpted from Media Activism Report

The Internet Freedom Movement is gaining momentum as the Anti-Counterfeiting Trade Agreement (ACTA) threatens the privacy of Internet users across the world.

ACTA is a proposed "treaty" for policing Internet service providers (ISPs) across the world in order to curb Internet copyright infringement. This would establish a new governing body separate from the World Trade Organization (WTO) and other similar organizations that would force ISPs to monitor and report on the activity of their users. This is wrong for many reasons:

1. It makes it more difficult to distribute software online: Without file sharing and P2P technologies like BitTorrent, distributing large amounts of free software becomes much harder and more expensive. BitTorrent is a grassroots protocol that allows everyone to contribute to legally distributing free software.

2. It will make it harder for users of free operating systems, such as those that are Linux-based, to play media: Consumers would no longer be allowed to buy media without digital rights management (DRM) - and DRM'd media cannot be played with free software.

3. It allows your devices to be confiscated without explanation or due process as part of random security checks. Devices merely suspected of hosting infringing material would be taken away and searched. Not only is this an invasion of privacy, it means that bringing an iPod or a laptop through airport security could cost hours of your time. This could extend the time required for the average check-in by up to 70%, assuming that just 1-in-10 people have a media device that gets searched.

4. ACTA requires that existing ISPs no longer host free software that has even a possibility of including copyrighted media; this would substantially affect many sites that offer free software or host software projects such as SourceForge and Download.com. Much of the software we use on our computers is free. Are we to pay the price because of a misleading treaty?

5. Part of the ACTA treaty states that imports of generic medicines are to be restricted and controlled. This could have disastrous effects that cost lives, especially in developing countries. The majority of pharmaceutical companies do not supply medicine to third-world countries, because more profit can be made by selling drugs to richer countries. Most medicine received in third-world countries is imported by concerned people and charity groups. If ACTA is approved, these medicines could be confiscated for "security" reasons. Should people die because of inaction and unwillingness to act against ACTA?

6. If ACTA is implemented, privacy on the Internet is no longer a given. ISPs will be forced to monitor what websites you visit and what you type, search, and do. People have a basic right to privacy that this treaty clearly ignores. Are you willing to pay this price?

7. ACTA gives governments and ISPs the right to block websites deemed "unsuitable." There are no clear guidelines as to what is deemed suitable or unsuitable. Do you honestly believe that this power will not be abused for political and economic gain? Certain countries have already banned Facebook, YouTube, and BlogSpot. Does this seem like national security?

8. This treaty will not prevent infringement. This is an ineffective and ridiculous agreement that will only harm the common people, not eliminate infringing material. Again, in markets where websites and P2P networks are blocked, some of the most rampant infringement industries in the world exist. Why? Because infringement will always find another medium. Only we will suffer.

Protests are already being discussed for the dates of October 29th and November 5th, but we need more participants to make these days memorable. If you are interested in participating or need more information on ACTA, please visit here for more details.

Coming Events of Interest

NY Games Conference - September 21st in New York, NY. The most influential decision-makers in the digital media industry gather to network, do deals, and share ideas about the future of games and connected entertainment. Now in its 3rd year, this show features lively debate on timely cutting-edge business topics.

M2M Evolution Conference - October 4th-6th in Los Angeles, CA. Machine-to-machine (M2M) embraces the any-to-any strategy of the Internet today. "M2M: Transformers on the Net" showcases the solutions, and examines the data strategies and technological requirements that enterprises and carriers need to capitalize on a market segment that is estimated to grow to $300 Billion in the year ahead.

Digital Content Monetization 2010 - October 4th-7th in New York, NY. DCM 2010 is a rights-holder focused event exploring how media and entertainment owners can develop sustainable digital content monetization strategies.

Digital Music Forum West - October 6th-7th in Los Angeles, CA. Over 300 of the most influential decision-makers in the music industry gather in Los Angeles each year for this incredible 2-day deal-makers forum to network, do deals, and share ideas about the business.

Digital Hollywood Fall - October 18th-21st in Santa Monica, CA. Digital Hollywood Fall is the premier entertainment and technology conference in the country, covering the convergence of entertainment, the web, television, and technology.

P2P Streaming Workshop - October 29th in Firenze, Italy. ACM Multimedia presents this workshop on advanced video streaming techniques for P2P networks and social networking. The focus will be on novel contributions on all aspects of P2P-based video coding, streaming, and content distribution, which is informed by social networks.

Fifth International Conference on P2P, Parallel, Grid, Cloud, and Internet Computing - November 4th-6th in Fukuoka, Japan. The aim of this conference is to present innovative research results, methods, and development techniques from both theoretical and practical perspectives related to P2P, grid, cloud, and Internet computing. A number of workshops will take place.

Copyright 2008 Distributed Computing Industry Association