Distributed Computing Industry
Weekly Newsletter


August 1, 2011
Volume XXXVI, Issue 1


36% Growth Is Expected for Net-Connected TVs in Next 5 Years 

Excerpted from Media Daily News Report by Wayne Friedman

New bells and whistles on Internet-connected televisions aren't going to waste, according to a new survey. Turns out they are being used regularly.

Over 60% of Internet-connected TV households use TV apps at least once per week, according to Scottsdale, AZ-based In-Stat. These new TVs allow consumers to connect to Netflix, YouTube, Facebook, and more.

"As expected, Netflix and YouTube currently dominate the TV application space," says Keith Nissen, research director at In-Stat. "But as Netflix competitors become more numerous and as applications are optimized for the big screen, TV apps will become part of the mainstream TV viewing experience."

Right now, In-Stat says, 22% of US TV households already own an HDTV with integrated TV apps. Connected TVs with integrated TV applications will grow by an average of 36% over the next five years.

Still, the survey says TV apps are not the primary reason for purchasing connected TVs - and that use of these new TV apps doesn't lead to more purchasing of other video content, especially among Netflix customers.

In-Stat says consumers now favor a mix of traditional pay-TV and online video services; the share favoring both rose to 30% in 2010 from 18% previously.

In regard to DVR use, the research suggests playback of DVR programming does not lead to increased use of free video-on-demand services from a TV programming service.

US Adults Love Video-Sharing Sites, Pew Says

Excerpted from PC World Report by Jeff Bertolucci

Video-sharing sites may have caught on first with the kiddies, but adults have taken to video services such as YouTube and Vimeo in a big way. More than 70% of online adults in the US report watching clips on video-sharing sites, a 5% increase from last year, and a sizable 38% jump from 5 years ago, according to a new survey by the Pew Internet Project.

Video-sharing sites are particularly popular among parents. More than 80% of parents in the survey reported visiting these sites, compared with just over 60% of non-parents.

Interestingly, parental use of video sites is up nine points from Pew's May 2010 survey, while non-parental use is down two points. A possible explanation is that parents with children living at home are younger than non-parents, and "use of video-sharing sites is linked to younger users," Pew explains.

The rising popularity of video-sharing sites is being driven in part by the explosion of user-contributed content on YouTube. This trend may encourage site visits by contributors' friends and others who forward links to popular amateur videos.

The latest statistics from YouTube show that users upload 48 hours of content every minute to the site, Pew reports.

The popularity of video-sharing is growing across geographically and racially diverse segments of the US adult population. Rural Internet users, for instance, are now just as likely as their urban and suburban counterparts to frequent video sites.

Nearly 80% of online non-whites, including African-Americans, Hispanics and other groups, report using video-sharing sites as well--that's 12% higher than in April 2009, and 41% higher than in 2006, Pew reports.

Report from CEO Marty Lafferty

We commend Octoshape on the completion of its EBU-commissioned whitepaper, which addresses the main drivers behind the online video explosion, the challenges, and the solutions.

Octoshape compares the expansion of video usage on the Internet to a hundredfold increase in automobile traffic on highways over five years.

It then describes traditional streaming technologies and the challenges they face in sustaining high-quality video experiences, and how those quality problems can be addressed, including by means of a suite of new technologies.

Video is now migrating to the Internet so rapidly, thanks to broadband users having access to capacity supporting both standard and high definition (SD and HD) formats, that the network may become fully congested if heavy infrastructure investments do not continue and innovative solutions are not deployed.

174 million US Internet users watched online video in March 2011 for an average of 14.8 hours per viewer. The total US Internet audience viewed more than 5.7 billion sessions during a single month.

But for Internet video streaming to become truly successful from a business perspective, it must support mass TV-scale audiences, at a quality level meeting living-room viewing expectations, and at a cost enabling profitable business models.

Viewers are purchasing larger screens, and increasingly seeking HD experiences, exacerbating issues such as slow start times and buffering. Streaming in HD - and soon 3D - requires at least five times the bandwidth of SD, which poses even greater challenges and expense.

As the whitepaper demonstrates in detail, Octoshape's Multicasting and the use of inexpensive cloud resources solve both the scale and cost problems by delivering the needed quality of distribution with Instant On - fast video response from click to play; Stability - no buffering or interruption in the stream; TV grade uptime - always on; Sustained High Bitrate - no adjusting the video quality; and Fast Channel Change - rapid switching time between program sources.

Having a sustained high-quality stream will also extend viewer engagement time, reduce the need for support, and enable subscription services.

Octoshape outlines how packet-loss issues create problems with throughput and stream quality for traditional TCP-based streaming solutions, particularly in such implementations as mobile networks, large scale streaming, and global distribution.

If the driver of a car used the TCP approach, he or she would increase speed until crashing into another car (congestion problems) or hitting a speed bump (a small packet loss). In either case, the driver would take his/her foot off the accelerator then slowly speed up again until the next incident.

Clearly, if everyone drove like this, the pace of traffic would slow to a crawl. To maximize throughput on highways and on the Internet, traffic must flow at a constant speed until a real problem occurs, then the speed must be reduced until the problem has been resolved.
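
To make the car analogy concrete, here is a minimal Python sketch (purely illustrative, not Octoshape's code; the function and parameter names are hypothetical) contrasting TCP-style additive-increase/multiplicative-decrease behavior with a sender that simply holds a steady rate just below the pipe's capacity:

    # Illustrative only: TCP-style "speed up until you hit something" rate control
    # (additive increase, multiplicative decrease) vs. a constant-rate sender.

    def aimd_average_rate(capacity_mbps=10.0, rounds=1000):
        """Sender adds 1 Mbit/s per round and halves its rate after congestion."""
        rate, delivered = 1.0, 0.0
        for _ in range(rounds):
            delivered += min(rate, capacity_mbps)   # what actually gets through
            rate = rate / 2 if rate > capacity_mbps else rate + 1.0
        return delivered / rounds

    def steady_average_rate(capacity_mbps=10.0, headroom=0.9):
        """Sender holds a constant rate just under the pipe's capacity."""
        return capacity_mbps * headroom

    if __name__ == "__main__":
        print(f"AIMD sender:   {aimd_average_rate():.2f} Mbit/s on average")
        print(f"Steady sender: {steady_average_rate():.2f} Mbit/s on average")

Run over many rounds, the sawtooth sender averages well below the pipe's capacity, while the steady sender stays near it - the same point the whitepaper makes about traffic flow.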

Traditional content delivery networks (CDNs) combat this packet-loss problem by moving streaming servers close to the edge. Said another way, "Distance equals quality." The paradox is that even having servers close does not "fix" the issue. There are still many routers and networks between the edge server and the consumer.

For this reason, the industry has begun moving to adaptive bitrate technologies, which trade reduced video quality for less buffering. This is a work-around, not a fix.

Instead, to make best-effort networks perform like provisioned networks, Octoshape has developed an innovative streaming transport protocol that is resilient to packet loss. The technology makes efficient use of Internet capacity and can deliver consistent, sustained high video quality along with smooth traffic flows. This technology, called Infinite HD, breaks the relationship between distance and quality.

Octoshape's ability to sustain high quality over best-effort networks is founded in the core algorithms employed in its transport protocol, using a unique resilient-coding scheme inside a UDP transport.

This scheme enables the stream to survive packet loss without retransmission or the data overhead of forward error-correction (FEC) schemes; allows data to be pulled from multiple sources simultaneously; and detects and sustains the maximum achievable bitrate as an inherent part of the flow-control design.

Octoshape deploys software on or near a standard encoder called the Octoshape Broadcaster. There is also an Octoshape App on the receiving device. In between the Octoshape Broadcaster and the App is the OctoshapeCloud. The OctoshapeCloud consists of a mix of resources including a suite of Multicast technologies and multiple cloud infrastructures (e.g., Amazon EC2). The Octoshape server software exists in the OctoshapeCloud.

In the OctoshapeCloud, the stream is broken up into many unique streamlets and replicated.

As a result, any source can drop out, any packet can be lost, a data center can go offline, or a fiber can be cut, and the consumer will not see a frame drop in the video.
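
To illustrate the streamlet idea in the simplest possible terms, here is a hypothetical Python sketch (not Octoshape's actual coding scheme, which the whitepaper describes only at a high level): a stream is broken into streamlets, each replicated across several sources, and the receiver reassembles the stream from whichever sources remain live:

    # Conceptual sketch only; names and data structures are invented for illustration.
    from itertools import cycle

    def replicate(streamlets, sources, copies=3):
        """Assign each streamlet to `copies` distinct sources, round-robin."""
        placement = {src: set() for src in sources}
        ring = cycle(sources)
        for streamlet in streamlets:
            for _ in range(copies):
                placement[next(ring)].add(streamlet)
        return placement

    def assemble(streamlets, placement, live_sources):
        """Recover the stream from whichever sources are still up."""
        return [s for s in streamlets
                if any(s in placement[src] for src in live_sources)]

    if __name__ == "__main__":
        streamlets = [f"chunk-{i}" for i in range(12)]
        sources = ["dc-us", "dc-eu", "dc-asia", "ec2-a", "ec2-b"]
        placement = replicate(streamlets, sources)
        live = [src for src in sources if src != "dc-eu"]   # one data center offline
        print(assemble(streamlets, placement, live) == streamlets)   # True

Because every streamlet lives on more than one source, losing any single source or data center still leaves the full stream recoverable, which is the property described above.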

The same resiliency created from the OctoshapeCloud to the App is built into the connection between the broadcaster and the OctoshapeCloud. This pathway in the "First Mile" is perhaps the most important.

The UDP-resilient stream also facilitates a smooth and deterministic flow of traffic, whereby a 1 Mbit/s feed looks to the network like a smooth 1 Mbit/s flow of data. This avoids the congestion problems that TCP's traffic spikes create and makes it possible to calculate how many streams can fit through a pipe.

These core innovations have made way for dramatic architectural improvements, enabling Internet distribution methods that previously faced serious challenges. Two of these innovations are Octoshape's Multicast Suite of technologies and the use of cloud infrastructures.

Octoshape's suite of three Multicast technologies - Native Source-Specific Multicast, AMT (Automatic IP Multicast without explicit Tunnels), and Octoshape Simulated Multicast - provides the magnification effect on top of the Cloudmass infrastructure.

Octoshape has addressed and solved the quality problems for streaming on the open Internet, rather than working around the problem by adapting to the ever-diminishing capacity available. Adaptive bitrate technologies simply substitute low video quality for buffering, failing to meet the audience's expectations of a living-room experience.

By solving for quality, Octoshape has enabled streaming in high quality over long distances, using a wide set of cost-efficient resources such as Multicast and cloud infrastructures.

This increases scalability and reduces the cost of distribution to meet the exploding demand for Internet video. Furthermore, the solution enables global federated content distribution, Multicast-enabled enterprise streaming, and reduced traffic in all peering points.

We strongly urge DCINFO readers to download and thoroughly peruse Octoshape's compelling whitepaper. Share wisely, and take care.

Enough with the Cloud Already - Long Live the Cloud 

Excerpted from Online Spin Report by Jason Heller

Allow me to demystify a monumental shift that is underway, hidden behind the buzzword of "the cloud."

For those who don't care about the minor details (most people), for practical purposes the cloud is just the Internet. You use the cloud every day by accessing various websites and web-based services and don't even realize it - because the nomenclature is unimportant. The benefit is what matters, and the experience should be seamless. You don't really care about the exact broadcast technology that allows your favorite TV program to appear on your TV screen, you're just happy that it's there.

The cloud as a delivery platform is changing the consumption dynamics and economics of the entertainment industry. It's as exciting to be a consumer as it is a marketer today!

So why does the cloud matter?

Consumers' demand for digital content, anywhere, on demand, and on any (or every) device has spawned the need to provide this access in an efficient and reliable manner. Advances in technology and the willingness and cooperation of the entertainment industry to provide a wider range of access have converged at a major milestone in history. Just ask Blockbuster Video. The reality of on-demand access to a wide array of content across all screens and devices is here. Well, mostly here.

Today's generation may not have experienced a TV without a remote control, or a rotary telephone. But the next generation will find accessing content from anywhere other than the cloud to be a foreign or archaic concept. It's that big a deal.

The future of TV is the web.

One of the downsides of being an early adopter is that you tend to forget that your media and gadget-based experiences are not yet representative of the average consumer's.

Like all early adopters, I have seen the future. Netflix, Hulu Plus, YouTube and Vudu are just another set of channels on my Samsung-connected TV. But they are so much more than that, providing seamless experiences across multiple devices from the living room to smartphones, tablets, gaming consoles and beyond. It's only a matter of time before original programming comes to these new digital channels. That said, there are still licensing issues and a brave new world where these new services coexist among traditional networks and MSOs - who happen to flex a lot more muscle and have a lot to lose. One thing's for sure - consumers want on-demand content, all the time, everywhere. The people have spoken.

While the TV and movie industries are still evolving, many of us are already at the point where we can't live without Pandora. In addition to hours of entertainment, the discovery of new music and ease of purchase is fueling the growth of a new entertainment ecosystem. Amazon, Spotify, Google music, Apple's iCloud and others are banking on the ecosystem being big enough to support multiple entrants with varied offerings. The game is changing quickly. As marketers, it's vital to keep your eyes on the players and understand these new environments. In some cases new media opportunities are born, while in others they are cannibalized.

I'm a big fan of streamlined experiences. For the in-home experience, the single device model seems to be the most streamlined, but it is probably the least adopted currently. While some brands like Samsung have developed their own platforms, the GoogleTV model, a proven and familiar operating system adapted for the connected TV, is positioned to be the most logical one over time. Connected TV operating systems will surely beat out third party devices like AppleTV, Boxee, and Roku in the long term. Of course that means a real Apple TV is imminent. However, the popularity of gaming consoles may keep them in the running for some time.

Bigger is better.

Cloud computing requires massive infrastructure. The need for reliability, security and competitive pricing has thus far limited this space to those who can handle serious scale. Amazon, Google, Microsoft, AT&T, and a growing second tier --including Salesforce and the larger ISPs -- are leading the way. An alternate option is developing a private cloud. But ultimately the logistics don't really matter unless your job includes selecting a cloud provider.

The downside, however, is obvious. The bigger you are, the harder you fall. When Amazon's cloud service crashed earlier this year, thousands of businesses, including FourSquare, Quora, Hootsuite, and most notably Sony's PlayStation Network, with its 75 million gamers, went offline, resulting in substantial lost revenue and disappointing consumer experiences.

While this shift to streaming, cloud-based entertainment is significant, it's no immediate threat to the status quo. The economic model still favors traditional broadcast distribution. TV consumption is actually on the rise. However, as adoption of connected TVs and cloud-based streaming entertainment proliferates, we'll need to figure out where the blurry line gets drawn on the practical definition of TV consumption.

Time to Recall Essential Truths of Online Video

Excerpted from Video Insider Report by Neil Perry

These are indeed amazing and volatile times for online video. Every day there's a new study showcasing the accelerating growth of online video; the never-ending demand for video content by brands; and a prevailing gold-rush mentality by media companies (both new and old) to get their piece of the pie.

While the old saying that a rising tide lifts all boats might be true, all of us in the online video sector should periodically take a deep breath to ensure that we're not losing track of some important fundamental truths about online video in the midst of all this excitement and activity.

Truth #1 - Quality still matters: There's currently an understandable obsession in the marketplace around developing online video content quickly and repurposing it in as many ways as possible, whether on YouTube, Facebook, microsites or elsewhere. However, quality content is critical to the success of your brand. It always has been, and on that point nothing has changed.

While the tone and style of the creative work you fill your Facebook and YouTube pages with may be significantly different from what gets put on prime-time TV, we can't lose sight of the fact that every piece of creative you put out there reflects on your brand, and in today's economy no one can risk releasing a bad image of the brand they've worked so hard to develop and position.

Truth #2 - There's still room and necessity for the "old way" of doing business, too: Creative briefs, strategic input from agencies, and media planning still matter and in many ways are more important now than ever.

Those of us on the front lines of video production should spend an inordinate amount of time on fine tuning the creative briefs for each of our creative assignments to ensure that the videos we produce are not only of the highest quality, but dead on target for the expressed needs of the brand. As part of this we rely heavily upon ad agencies for solid input and direction, as agency media planning helps our creative teams understand the exact target demo we need to address in our videos.

Truth #3 - You never get a second chance to make a first impression: The folks at Head & Shoulders had it right a couple of decades ago with their TV campaign and it is just as relevant to us today. For many newer brands, online video will be their introduction to the masses, so making a good first impression is as important as ever.

A ton of new companies are making their first foray into video advertising. These are often smaller brands that haven't had the luxury of affording expensive TV spots, but now thanks to vehicles like crowdsourcing they have the opportunity to dip their toes in the water with quality videos that tell their brand story. Again, now's not the time to put a video out there that misses the mark or tells an incomplete story.

Truth #4 - Going viral for the sake of going viral is rarely going to work. I've written about this in the past here in MediaPost. Developing a video strictly with the hope of getting your brand a couple million views on YouTube is a risky proposition at best. That's not to say you shouldn't be out there trying to stretch your brand's persona to the masses.

I recommend to brands and agencies that they look more to creative and unique representations of their brand that will resonate with the audience, and that are quite different from what they are used to seeing from the brand on TV or online. Push the envelope. Encourage creative expression for your brand. Let loose the reins of control and tradition, and let the creative community embrace your brand as they see it. You'll be pleasantly surprised with the results. These are exciting times for all of us in online video.

If we all take a periodic breather to calibrate what we do with some core truths, the good times stand a greater chance of having genuine staying power.

Amazon AWS Cloud Computing Unit Drives Analysts Crazy

Excerpted from Seeking Alpha Report by Paolo Gorgo

Get out your crystal balls: join several analysts in today's guessing game of assigning Amazon's AWS cloud computing unit a revenue run rate.

Amazon Web Services (AWS) provides companies with an infrastructure web services platform in the cloud. The company started offering these services in 2006, but only recently have analysts started considering the unit worth a closer look.

Back in 2010, Amazon's Chief Executive Jeff Bezos said, at the company's shareholder meeting, that AWS had the potential to be as big as the firm's retail business - giving investors a good reason for some added diligence into its financials.

Unfortunately, Amazon doesn't disclose exact numbers on its AWS revenue stream. These services are included in the "other" category, with miscellaneous marketing and promotional agreements, other seller sites and co-branded credit card agreements. In Q1 2011, this category delivered about $311 million in sales (+65% Y/Y), mostly in North America (see Amazon's latest 10Q, page 18).

In 2010, UBS estimated that AWS would represent roughly $500 million in revenue in 2010, about $750 million in 2011, and approximately $2.5 billion in 2014.

Cowen & Co.'s Jim Friedland sees AWS being a more than $4 billion business by 2016.

Citigroup's Mark Mahaney recently noted that "on each day, AWS adds enough server capacity to run the equivalent of what Amazon needed when it was a mere $3 billion company in 2000".

According to an article published today by Reuters, Citigroup believes that Amazon AWS services may be reaching a psychological inflection point:

"While still very small for Amazon (likely about $750 million revenue run rate), given the size of the market opportunity and Amazon's strong competitive positioning, we believe that this could soon be a $1 billion revenue segment," Mahaney wrote in a note to investors this week.

Amazon is heavily investing in data center infrastructure. Capital expenditures in Q1 2011 were about $300 million, including investments in technology infrastructure necessary to support AWS. Getting more metrics on this vertical will also be key to fully understanding its business model. Here is a related comment taken from today's Reuters article:

Citigroup's Mahaney said AWS gross margins may be up to four times higher than Amazon's overall margins.

"We'll be listening on the earnings call for any details on new traction for this segment," the analyst wrote in a recent note to investors.

An indirect way to try to understand AWS's growth rate is to look at some data the company does disclose, related to the number of "objects" stored on the service.

At the end of the second quarter, S3 held more than 449 billion objects and processed up to 290,000 requests per second for them at peak times, according to an Amazon blog post reporting these data.

On June 22, Werner Vogels, Amazon's CTO, while giving his presentation at the GigaOm Structure 2011 conference, talked about AWS hosting "339 billion objects, more than doubling the volume from the same time last year, when S3 stored 150 billion objects." While it is impossible to say to what exact date this number refers, the feeling we get is that Amazon's AWS unit keeps growing at rates that don't seem to be declining at all.
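
As a rough sanity check on those figures (using only the numbers cited above, with the caveat that the exact reporting dates are approximate), the implied growth rate can be computed directly:

    # Back-of-the-envelope arithmetic on the S3 object counts quoted in the article.
    objects_mid_2010 = 150e9      # "same time last year"
    objects_june_2011 = 339e9     # Vogels, GigaOm Structure 2011
    objects_end_q2_2011 = 449e9   # end of the second quarter

    yoy = (objects_june_2011 / objects_mid_2010 - 1) * 100
    print(f"Year-over-year growth in stored objects: {yoy:.0f}%")   # roughly 126%

    q2 = (objects_end_q2_2011 / objects_mid_2010 - 1) * 100
    print(f"Growth from mid-2010 to end of Q2 2011: {q2:.0f}%")     # roughly 199%

Whatever the exact dates, the object count more than doubled in about a year, which supports the article's point that AWS growth shows no sign of slowing.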

While still an "ancillary" business for Amazon, AWS will be an interesting unit to watch in the future, even if it may be hard to consider Amazon a proper way to invest in a cloud computing player. We can only reiterate our wish that the company will soon start to break down this vertical in its financial reporting, in order to give investors more clarity into its business model, including costs, CapEx, margins, and growth potential.

Ignite Technologies iPad App Now Available on App Store

Ignite Technologies, the leader in providing secure and scalable Enterprise Content Delivery Solutions that enable customers to efficiently publish, deliver, and manage digital assets, today announced the launch of its new App, MediaPlace, now available in the App Store. The new App expands the range of devices on which enterprise companies using Ignite's Content Delivery Solution can consume content.

MediaPlace makes it easy for employees with iPads to access the corporate video portal away from the office. Critical corporate messaging can be accessed at any time along with the ability to collaborate with co-workers through social features such as commenting and rating of corporate videos. Employees can set preferences for subscription to specific publishers and topics depending on their interest and create playlists of favorite videos for easy reference.

"Our customers are experiencing the surge of iPads and other tablet use within the enterprise," said Jim Janicki, President and Chief Executive Officer at Ignite Technologies. "MediaPlace extends Ignite's Content Delivery Solution to these devices providing secure, easy access to their corporate video portal."

MediaPlace also enables users to download content, creating a personal library for offline viewing.

Features enable users to rate and see average ratings for all videos, comment on videos and view other employees' comments, subscribe to publishers and topics, create a personal playlist, create a 'My Library' of downloaded content, and view high-quality videos online or offline.

To experience MediaPlace on an iPad, proceed to the Apple App Store from your iPad and download the free MediaPlace App. Use the company code 'WebTrial' when prompted along with your e-mail address.

MediaPlace for Apple iPad is a free app and is immediately available. MediaPlace for Android will be available early Q4, 2011. For more information about Ignite's Content Delivery Solution, go to www.ignitetech.com.

Translattice Shakes Up Distributed Computing 

Excerpted from All Things Digital Report by Arik Hesseldahl

One of the basic assumptions about cloud computing is that service outages are bad. An application that goes down for one reason or another is an expensive problem when it happens. But it's also an expensive problem for which to plan ahead, usually involving buying a lot of redundant hardware and software that kicks in when the primary systems fail. It's not an attractive notion, but then again neither is downtime.

Most of the time, database applications run in one central location. Sometimes there are legal requirements about maintaining data within national borders, or corporate policies about keeping data on company-owned hardware. The reasons can vary. Organizations have put a lot of attention on fault-tolerant hardware, redundant network connections, and recovery processes. But applications themselves get short shrift.

A new company called Translattice, backed by $9.5 million in funding from DCM, an early-stage venture capital firm, aims to change that with a new architecture that distributes applications. Make your application resilient, the thinking goes, and you needn't spend quite so much on fault-tolerant hardware and extra network connections that will otherwise sit idle until they're needed.

I talked last week with Translattice CEO Frank Huerta and Michael Lyle, its chief technical officer, about the company's new architecture and its plans to shake things up in cloud computing.

Frank, when you think of cloud computing and data centers, you tend to think that there's already a lot of redundancy built into the infrastructure, and yet there are still lots of outages. What's going on?

Huerta: One of the main problems we're addressing is the complexities in the infrastructure. United Airlines went down recently, and USAir. You're continuing to see more and more problems in the infrastructure. And the reason for that is that it's starting to hit the wall in terms of what it can deliver.

So what does Translattice do to solve that?

Huerta: Translattice is about the deployment of enterprise class applications, like CRM and ERP applications in globally distributed environments, including the cloud. Everything else to this point has been monolithic. This is a different paradigm, and we think it opens up a lot of other advantages. We've built this platform for cloud and traditional applications. The components are all identical and all aware of each other so the system is aware of where the data is at all times. And by policy you can control where it is and how much redundancy you want. But they all work like they're operating from one central database, when in fact they're distributed around wherever you have a presence.

So how do you do it?

Huerta: One thing is that we've solved the distributed relational database problem. This was an unsolved problem in IT for the past 25 years, so it's a major technical accomplishment. We've taken all the key components in the data center - the storage, the database, the app server, load balancing - and we've built it into a machine we call a Translattice Node. And that Node is a rack mountable box with commodity hardware inside, and it can be run as a physical appliance, or it can be run as a virtual instance in the cloud like on Amazon. And this is the platform on which you run your applications.

How is it different from the traditional set-up?

Huerta: When you turn it on you get this re-mapping of what you can do with your applications. If you need additional computing resources in a certain location, you just add boxes there. The infrastructure nodes now share information amongst each other. Your performance is better, because we move data closer to where you're going to be using it. If you move from New York to Germany, the system automatically sees where you're logging in from and moves the data you use closer to you, so you get local performance. In many ways it's like what Akamai has done with Web content. They cache Web information so that when you visit a Web site you get served with a cache from a location that's closer to you. But this is a generation more advanced. We do the same thing but with dynamic application data in real time. You also get better control of the data and can control where it can and can't go by policy. And then you get much better resilience. You can set policies concerning how much resilience you want in the system by saying how much you want your data copied and whether or not you want it replicated on multiple continents.
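
To illustrate the data-locality idea Huerta describes - place data in the nearest location the policy allows, and re-home it when the user's login location changes - here is a generic Python sketch; it is not Translattice's API, and the region names and coordinates are invented for the example:

    # Generic illustration of policy-constrained, locality-aware data placement.
    REGION_COORDS = {"us-east": (40, -74), "eu-central": (50, 9), "apac": (1, 104)}

    def distance(a, b):
        """Crude planar distance; good enough to rank candidate regions."""
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

    def choose_region(login_coords, allowed_regions):
        """Pick the closest region that the data-residency policy permits."""
        return min(allowed_regions,
                   key=lambda r: distance(REGION_COORDS[r], login_coords))

    if __name__ == "__main__":
        policy = ["us-east", "eu-central"]            # data may not leave these regions
        print(choose_region((40.7, -74.0), policy))   # login from New York -> us-east
        print(choose_region((52.5, 13.4), policy))    # login from Berlin   -> eu-central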

What kind of customers do you have?

Huerta: We have a few beta customers and we're just in the process of getting our first paying customer, which we can't announce just yet, and we're setting up pilots with large financial companies and with governments. Financials and governments seem to be early adopters of this kind of technology because they can't afford for things to go down.

We've seen how the federal government in the U.S. plans on cutting back the number of data centers it operates, and that it's turning more to the cloud to save on operational costs. Is this likely to fit into that strategy?

Huerta: This would be one way for the government to make its infrastructure more efficient, sure. And certainly as it moves more stuff to the cloud, this is a strong platform for running legacy applications in the cloud while still keeping them within its own infrastructure.

Yet you're distributing the data, and that idea is sometimes anathema to financials and governments who are usually the biggest sticklers when it comes to moving data across national boundaries. How do you get around that?

Lyle: We're working with financial firms that have been forced to deploy five copies of their banking systems around the world - both for performance, because you need the data close to where it's being worked on, and because they're not allowed to have customer data cross national boundaries. That means they don't have a minute-by-minute view of the business, and they have to run a big settlement process at the end of the day. They end up not being able to offer the same products to customers in every country. And just running five copies of all that infrastructure is expensive. Our ability to de-centralize the system, and keep it as one big cohesive application processing platform while at the same time complying with all the business rules about where data is stored, really could revolutionize the way that banks are doing business.

Top 5 BitTorrent Clients for Windows

Excerpted from Zeropaid Report by Jared Moya

Long ago BitTorrent surpassed direct connect-style downloading to become the preferred method of file sharing because of its speed and ability to share large files, especially video. As such, over the years the popularity of certain BitTorrent clients has ebbed and flowed depending on user tastes.

So for those unfamiliar with which to use, or for those regular users perhaps looking to see how their favorite BitTorrent client stacks up against the rest, I've compiled a list of the top 5 BitTorrent clients to see which is the right one for you.

1. uTorrent

uTorrent is, in my opinion, the best of the bunch. It's simple, easy to use, and sports a low memory footprint.

Features include Streaming: Watch videos within seconds with progressive downloads - no need to wait. Especially great for previewing a file before committing to the full download; Remote Access: Start, stop, and monitor torrent downloads on the go. Access your client from any Web browser, or download our Android app; Ratings and Comments: Leverages the collective wisdom of the community to ensure the quality and security of downloaded torrents; Drag-and-Drop Sending: Easily send massive personal files - e.g. home movies, cell phone videos and hi-res photos. Select a file on your computer, drag it into the uTorrent "Drop files to send" box and a Web link is yours to share; and Portable Mode: Run your uTorrent client directly from a USB key and take it with you anywhere.

uTorrent also sports the App Studio. Launched last November, the App Studio enables one-click downloads of content and features right inside uTorrent. It offers downloads of music, movies, and books, as well as social media apps like TorrentTweet and antivirus apps like BitDefender's VirusGuard.

Moreover, it's built for speed and you can leave it running. If you're looking for a guide on how to set up and use uTorrent, we have one here.

2. Vuze

Formerly Azureus, Vuze was the first BitTorrent client to offer a wide variety of features and plug-ins. The downside is its relatively high memory usage, but for those for whom this isn't an issue, Vuze offers a far more compelling BitTorrent experience.

Features include: Vuze Meta Search, which aggregates results from a variety of top sites; automatic adaptation to optimize for your network; full-screen HD (1080p) viewing; playback of virtually any type of video file - AVI, XVID, Quicktime, and more; offline playback (on planes, trains, automobiles); drag-and-drop content to play back on the device of your choice: iPhone, iPod, iPad, Xbox 360, Playstation 3, PSP, and TiVo; Vuze Remote, which lets you control your Vuze client from any computer or smartphone with a web browser; and RSS Feed support.

Moreover, Vuze is the BitTorrent client to choose if you want a more robust downloading experience.

3. BitTorrent (mainline)

The official BitTorrent client has grown by leaps and bounds over the last few years, offering new features and options that set it apart from the rest. It's all part of "Project Chrysalis," its effort to achieve the "next generation" of the BitTorrent Mainline client.

Features include RSS feed support; Download and upload scheduling; Transfer caps to avoid ISP overusage fees; Add Torrent from URL; Intelligent: BitTorrent auto-adjusts bandwidth usage based upon your network and the Internet; Plug-n-Play; Advanced: BitTorrent leverages uTP, the latest BitTorrent protocol; BitTorrent maximizes the use of network bandwidth while reducing congestion & it doesn't interfere with your other surfing; and Low memory footprint.

The BitTorrent Mainline client also supports the same App Studio mentioned above for uTorrent. The App Studio lets you add new features, skin your client, and more.

4. BitTornado

Its popularity has slowly waned over the years, but it still enjoys a loyal following. It doesn't feature the fancy bells and whistles of others like uTorrent and Vuze, but it's fast, reliable, and easy to use.

Features include: Upload/download speed throttling; Option of Disabling and Setting Priority of Files in any torrent; Detailed information about connections to other peers; UPnP Port Forwarding (Universal Plug and Play); IPv6 support (OS support required); PE/MSE support; and Quick resume.

The only real downside to BitTornado is that it's a little bit too "lightweight" in my opinion. I love programs that use minimal resources, but memory has become cheap enough these days that unless you're running an old tower with 512k there's no reason to choose BitTornado.

5. BitComet

BitComet also still enjoys a loyal following, and offers search features far different from the others'. BitComet lets you browse some 14 tracker sites for content, including Demonoid and BTJunkie, with minimal configuration required.

Features include: HTTP/FTP download; preview while downloading - previews of AVI, RMVB, WMV, and other video files are available during the download process; Magnet URI support - start a BitTorrent download without a .torrent file, using the DHT network; and disabling or setting the priority of files in a torrent - files can be skipped, or set to higher or lower priority, allowing you to select which files finish first.

Downsides? Ads. The program sports annoying in-client ads as well as taskbar ad popups that wholly ruin the sanctity of P2P.

Shares of Atrinsic Are Poised to Triple

Excerpted from Seeking Alpha Report by Mark Gomes

Pandora's IPO has spurred one of the hotter debates on Wall Street. On one side, the Internet music business has caught fire. Bulls focus on Pandora's market-share dominance and exploding customer base. Bears counter that the market's barriers to entry are low. They also contend that the royalties Pandora pays to record labels will limit the company's profitability, making it hard to justify its multi-billion dollar market cap.

No matter which side you're on, there are some hard facts telling investors that shares of Atrinsic are poised to triple (in fact, with proper execution, the facts show they could rise by 10x or more):

1. ATRN is the parent of Kazaa, which was one of the original file-sharing programs (along with Morpheus and the iconic Napster). Kazaa comes with an enviable pedigree -- some of its early owners went on to create Skype.

In 2007, after lawsuits shuttered the file-sharing companies, Kazaa paid $100 million for on-demand licensing rights with the four major record labels.

With those rights, Kazaa began offering legitimate Internet music services. It's important to note that its on-demand licenses are required for users to choose the specific songs they want to hear.

Only six companies in the world hold these licenses and only three of the six are associated with public companies -- Rhapsody (which is owned by RealNetworks and Viacom), Napster (which is owned by Best Buy), and ATRN's Kazaa.

Pandora's music rights are more common and limited in scope. Thus, despite Pandora's runaway lead in the market, holding the rare on-demand licenses may be the only way for competitors to gain an advantage (or for Pandora to extend its lead). Indeed, the market seems headed toward an end-game where 1) Pandora will acquire one of the on-demand license holders and 2) the remaining on-demand license holders will develop Pandora-like functionality. The eventual market winner will likely come from this group.

In this regard, Kazaa has already made some major moves to become more like Pandora. Thus, ATRN has been thrust into the right place at the right time.

Writing software to stream music is relatively easy. But entering the Internet music market against Pandora is not as simple as many believe. With the explosive popularity of Internet-based music, major record labels have become resistant to handing out new on-demand licenses to anyone. Just ask Apple, Amazon, and Google - they've tried for a long time with no luck.

If these giants want a legitimate and significant foothold in the marketplace, they may need to acquire one of the existing on-demand license holders. However, Rhapsody and Napster's owners are unlikely to sell at an attractive price. As for the private vendors, only two appear ripe for acquisition, but their VCs surely know the value of their licensing rights (I'll discuss the third shortly). Thus, ATRN may be the only underpriced asset left in the market.

Kazaa paid $100 million for those hard-to-get licensing rights. However, this value does not show up on ATRN's balance sheet. As a result, ATRN's market cap has drifted well below those levels. At present the company is only valued at around $20 million. A $100 million valuation for its music licenses (which now appear unobtainable) implies that ATRN's shares (which have recently traded in the $2 - $6 range) should be worth more than $15. Forget being poised to triple based on this metric; shares of ATRN could be poised to quintuple.
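
The arithmetic behind that claim is straightforward; the sketch below simply restates the article's own figures (the $3 share price is an assumed point inside the recent trading range, and the implied share count is not independently verified):

    # Reconstructing the article's rough valuation math.
    market_cap = 20e6         # ATRN's approximate market capitalization
    license_value = 100e6     # what Kazaa paid for its on-demand licenses
    recent_price = 3.00       # an assumed price inside the recent $2 - $6 range

    multiple = license_value / market_cap      # 5x re-rating if licenses were valued at cost
    implied_price = recent_price * multiple    # about $15 per share

    print(f"{multiple:.0f}x -> roughly ${implied_price:.0f} per share")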

Please click here for the rest of this report.

Cloud Trends: The Power of Distributed Computing for Accelerating Processes 

Excerpted from TMC Net Report by Mae Kowalke

At its core, the cloud trend is really about distributed computing: making a pool of computing resources accessible on-demand in an easily managed and convenient manner. Looked at this way, clouds can be used for many different functions, not just consumers and small businesses accessing applications over the Internet.

For example, with the right technology companies can create private clouds that let an entire network's resources be leveraged to greatly accelerate processes. This is exactly the niche where Xoreax brings value.

The company offers a simple agent that can be installed to create private clouds to harness a network's computing power, without needing to make changes to source code.

Xoreax Grid Engine (XGE) technology is highly configurable; users can control how much CPU power on each computer is used, and set associated limits (e.g., if more than 20 percent of a particular machine's processing power is being used locally, it is not tapped by the network).
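
As a generic sketch of that scheduling rule (not Xoreax's actual configuration or API; the names below are hypothetical), the idea is simply to skip any machine whose local CPU usage exceeds the configured threshold:

    # Only borrow machines whose local CPU usage is below the limit (20% in the example above).
    def eligible_agents(agents, busy_threshold=0.20):
        """`agents` maps hostname -> current local CPU utilization (0.0 to 1.0)."""
        return [host for host, busy in agents.items() if busy < busy_threshold]

    if __name__ == "__main__":
        pool = {"dev-01": 0.05, "dev-02": 0.35, "build-07": 0.10, "qa-03": 0.55}
        print(eligible_agents(pool))   # ['dev-01', 'build-07']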

According to Dori Exterman, CTO, Xoreax's technology is used by more than 20 percent of Fortune 100 companies. XGE forms the backbone of IncrediBuild, a product that accelerates Visual Studio builds, making them up to 30 times faster.

The technology is also used in the financial, healthcare, medical research, gaming, and energy industries for all kinds of processes that take a lot of time.

"It's really useful for developers who sometimes need to wait 2-3 hours a build to complete, and now only need to take a short coffee break," Exterman said during a TMCnet video interview at Cloud Expo 2011.

Exterman acknowledged that there are a lot of players in the field once known as 'grid computing' and now more commonly referred to as 'high performance computing' or HPC for short.

"What differentiates us from competitors is the element that you don't need to change your source code," he emphasized during the video interview. "Other solutions require you to change your architecture, which takes a lot of time. The only thing our customers need to do is write a small XML file with names of the processes they want to distribute, and everything else is done for them."

Depending on the application, some optimization of network speeds may also be in order.

"We are mainly focused on applications that require intensive CPU power but have few I/Os," Exterman said. "For example, simulations that require processing many numbers. Some simulations take weeks to complete if you run them only a on a local machine. When you can distribute them to hundreds of computer, it makes things much faster."

For more discussion about distributed computing, including predictions about development of hybrid solutions for private and public clouds, watch the full video interview.

Open Cloud Initiative Launches to Drive Open Standards in Cloud Computing

Today the Open Cloud Initiative (OCI), a non-profit organization established to advocate open standards in cloud computing, announced its official launch at the OSCON 2011 Open Source Convention.

Its purpose is to provide a legal framework within which the greater cloud computing community of users and providers can reach consensus on a set of requirements for Open Cloud, as described in the Open Cloud Principles (OCP) document, and then apply those requirements to cloud computing products and services, again by way of community consensus. 

The Open Cloud Initiative (OCI) has launched its official website at http://www.opencloudinitiative.org/ and commenced a 30-day final comment period on the Open Cloud Principles (OCP), which are designed to ensure user freedoms without impeding the ability of providers to do business. They are focused on interoperability, avoiding barriers to entry or exit, ensuring technological neutrality and forbidding discrimination. 

They define the specific requirements for Open Standards and mandate their use for formats and interfaces, calling for "multiple full, faithful and interoperable implementations", at least one of which must be Open Source. Full text of the Principles can be found at http://www.opencloudinitiative.org/principles

"The primary purpose of the Open Cloud Initiative (OCI) is to define "Open Cloud" by way of community consensus and advocate for universal adoption of Open Standard formats and interfaces" said Sam Johnston, founder and president. "Inspired by the Open Source Initiative (OSI), we aim to find a balance between protecting important user freedoms and enabling providers to build successful businesses." 

The Open Cloud Initiative (OCI) is governed by a Board of Directors comprising leaders from the cloud computing and Open Source industries, including Rick Clark, Marc Fleischmann, Sam Johnston, Shanley Kane, Noirin Plunkett, Evan Prodromou, Sam Ramji, Thomas Uhl, John Mark Walker and Simon Wardley. The Open Cloud Initiative (OCI) is being founded as a California public benefit corporation (non-profit) and intends to obtain federal tax exemption by way of 501(c)(3) educational and scientific charity status in due course. For more information, including the Open Cloud Initiative (OCI) Articles of Association, Bylaws, Open Cloud Principles (OCP), or to participate in the community, please visit http://www.opencloudinitiative.org.

Coming Events of Interest

TransmitCHINA Talks - September 14th-16th at the Great Wall of China. International leaders, thinkers, innovators, and creators will have an exclusive opportunity to hear a cross-section of preeminent thought leaders from some of the world's most innovative organizations in the digital and creative content ecosystem.

NY Games Conference - September 21st-22nd in New York, NY. The most influential decision-makers in the digital media industry gather at this event, now in its third year, to network, do deals, and share ideas about the future of games and connected entertainment. Lively debate on timely cutting-edge business topics.

Digital Music Forum West - October 5th-6th in Los Angeles, CA. Top music, technology, and policy leaders come together for high-level discussions and debate, intimate meetings, and unrivaled networking about the future of digital music. Digital Music Forum is known worldwide.

Digital Hollywood Fall - October 17th-20th in Marina del Rey, CA. Digital Hollywood (DH), the premier entertainment and technology conference in the country, once again welcomes the Variety Summit, which has been co-located with the past three DH events.

Future of Film Summit - November 7th-8th in Los Angeles, CA. An exclusive group of industry thought-leaders discuss the current state of the industry, and how film and transmedia deals will be struck in the coming years. This is a unique opportunity for creatives, producers, buyers, and film financiers.

Streaming Media West - November 8th-9th in Los Angeles, CA. Attended by more than 2,500 executives last year, SMW covers the entire online video ecosystem from content creation and management, to monetization and distribution. The number-one place to come see, learn, and discuss what is taking place with all forms of online video business models and technology.

World Telecom Summit 2011 - November 9th-11th in Singapore. The 2011 program will focus on topics that demonstrate innovation across the telecommunications industry, both on a commercial and technical level, to improve profitability and quality of next generation technologies and customer experiences.

Future of Television - November 17th-18th in New York, NY. Top television and digital media industry executives discuss the increasing importance of digital media for the future of the television industry. Topics include viewer trends and programming for non-traditional platforms including online video, VoD, HD, IPTV, broadband, and mobile.

Copyright 2008 Distributed Computing Industry Association