Distributed Computing Industry
Weekly Newsletter


August 6, 2012
Volume XL, Issue 6


Unicorn Media Sponsors CLOUD COMPUTING WEST 2012

The DCIA and CCA proudly announce that Unicorn Media has signed on as a sponsor of the CLOUD COMPUTING WEST 2012 (CCW:2012) business leadership summit taking place November 8th-9th in Santa Monica, CA.

Unicorn Media is the leading provider of Internet video solutions that enable companies to maximize IP video profitability. Its patented technology, Unicorn Once, allows customers to ingest video content one time and deliver it to every Internet-connected device via a single URL.

Content owners can monetize their content on any device by dynamically inserting targeted ads and analyzing content and ad performance in real-time on every platform, allowing for on-the-fly changes to maximize profitability.

Built for scale, Unicorn Media was founded by digital content distribution infrastructure experts and has revolutionized media distribution by creating a comprehensive solution with unprecedented ease-of-use. Headquartered in Tempe, AZ, Unicorn Media has offices in Los Angeles, San Francisco, New York, and Chicago. Unicorn Media is a privately held and funded company.

CCW:2012 will feature three co-located conferences focusing on the impact of cloud-based solutions in the industry's fastest-moving and most strategically important areas: entertainment, broadband, and venture financing.

Unicorn Media will participate in a panel discussion at the Entertainment Content Delivery conference within CCW:2012.

CCW:2012 registration enables delegates to participate in any session of the three conferences being presented at CCW:2012 — ENTERTAINMENT CONTENT DELIVERY, NETWORK INFRASTRUCTURE, and INVESTING IN THE CLOUD.

At the end of the first full day of co-located conferences, attendees will be transported from the conference hotel in Santa Monica to Marina del Rey Harbor, where they will board a yacht for a sunset cruise and networking reception.

So register today to attend CCW:2012 and don't forget to add the Sunset Cruise to your conference registration. Registration to attend CCW:2012 includes access to all sessions, central exhibit hall with networking functions, luncheon, refreshment breaks, and conference materials. Early-bird registrations save $200.

Cybersecurity Bill Is Blocked in Senate by GOP Filibuster

Excerpted from NY Times Report by Michael Schmidt

A cybersecurity bill that had been one of the Obama administration's top national security priorities was blocked by a Republican filibuster in the Senate on Thursday, severely limiting its prospects this year.

The Senate voted 52 to 46 to cut off debate, falling short of the 60 needed to force a final vote on the measure, which had bipartisan support but ran into a fight over what amendments to the legislation could be proposed.

Soon after the vote, the White House released a statement calling the outcome "a profound disappointment."

"The politics of obstructionism, driven by special interest groups seeking to avoid accountability, prevented Congress from passing legislation to better protect our nation from potentially catastrophic cyberattacks," the statement said.

The bill's most vocal opponents were a group of Republican Senators led by John McCain of Arizona, who took the side of the US Chamber of Commerce and steadfastly opposed the legislation, arguing that it would be too burdensome for corporations.

The bill would have established optional standards for the computer systems that oversee the country's critical infrastructure, like power grids, dams, and transportation.

In the hopes of winning over Mr. McCain and the other Republicans, the bill had been significantly watered down in recent weeks by its sponsors, led by Senator Joseph Lieberman, who made the standards optional. Original versions of the bill said the standards would be mandatory and gave the government the power to enforce them.

Mr. Lieberman, the independent from Connecticut who is Chairman of the Homeland Security and Government Affairs Committee, and the bill's other sponsors, including the committee's ranking member, Senator Susan Collins, Republican of Maine, had worked for the past several years to pass cybersecurity legislation.

At a meeting last week, Mr. Lieberman got into an argument with Mr. McCain, his closest ally and friend in the Senate, about his opposition to the bill. Mr. Lieberman questioned why Mr. McCain was doing the bidding of the US Chamber of Commerce and asked what Mr. McCain would say if the nation was crippled by a cyberattack.

Mr. McCain angrily said his reputation on national security issues was unquestionable.

The Obama administration had tried to sell members of Congress on the need for the legislation through closed-door briefings from high-ranking national security officials and pleas from officials who had served in President George W. Bush's administration about the looming threat of a catastrophic cyberattack.

After the vote, Ms. Collins said it was a "shameful day" and expressed disappointment with her fellow Senators who lacked "a sense of urgency" about a looming cyberattack.

"We often hear from the members on both sides of the aisle, but particularly Republican members, that we need to be listening more to generals on the ground," Ms. Collins said. "But listen to the generals who had responsibility in this area" who told members of Congress "over and over again" that the nation was not prepared for a cyberattack.

"I cannot think of another area where the threat is greater and we are less prepared," she said.

The Senate Republican leader, Mitch McConnell of Kentucky, said that, "No one doubts the need to strengthen our cyberdefenses."

"We all recognize the problem, that's really not the issue here," Mr. McConnell said.

"It's the matter that the majority leader has tried to steamroll a bill," Mr. McConnell said, referring to Senator Harry Reid, Democrat of Nevada.

Despite threats of a veto from President Obama, the House passed its own cybersecurity bill in April, which called for more information sharing between national security and intelligence agencies and businesses.

The bill called for the government to provide businesses with classified information about cyberthreats and gave companies the option of sharing information about cyberthreats with the government.

White House officials said the President opposed that bill because it called for too much information sharing between the government and businesses, which could have led to violations of Americans' civil liberties.

Report from CEO Marty Lafferty

Early this week, the Distributed Computing Industry Association (DCIA) signed on to the following letter to US Senate Majority Leader Harry Reid and Minority Leader Mitch McConnell to help advance better protection of consumer privacy online.

With the failure later in the week of the inadequately vetted CISPA in the Senate, ECPA reform, which remains a priority for us, may now proceed in other ways:

"As the Senate considers cybersecurity legislation, we urge you to make in order and to support an amendment that Chairman Leahy has introduced that would update a key privacy law that is critical to business, government investigators, and ordinary citizens.

Chairman Leahy's amendment #2580 addresses the Electronic Communications Privacy Act (ECPA), a law that Chairman Leahy himself wrote and guided through the Senate in 1986.

ECPA was a forward-looking statute when enacted. However, technology has advanced dramatically since 1986, and ECPA has been outpaced.

As a result, ECPA is a patchwork of confusing standards that have been interpreted inconsistently by the courts, creating uncertainty for service providers, for law enforcement agencies, and for the hundreds of millions of Americans who use mobile phones and the Internet.

Moreover, the Sixth Circuit Court of Appeals has held that a provision of ECPA is unconstitutional because it allows the government to compel a service provider to disclose the content of private communications without a warrant.

Chairman Leahy's amendment would make it clear that, except in emergencies, or under other existing exceptions, the government must use a warrant in order to compel a service provider to disclose the content of e-mails, texts, or other private material stored by the service provider on behalf of its users.

Chairman Leahy's amendment would create a more level playing field for technology.

It would cure the constitutional defect identified by the Sixth Circuit. It would provide clarity and certainty to law enforcement agencies at all levels, to business and entrepreneurs, and to individuals who rely on online services to create, communicate, and store personal and proprietary data.

These protections for content are consistent with an ECPA reform principle advanced by the Digital Due Process coalition, a broad-based coalition of companies, privacy groups, think tanks, and academics.

For Internet and communications companies competing in a global marketplace, and for citizens who have woven these technologies into their daily lives, as well as for government agencies that rely on electronic evidence, the protections for content in the Leahy amendment would represent an important step forward for privacy protection and legal clarity.

While the signatories to this letter have very diverse views on the cybersecurity legislation, and some take no position on the legislation, we urge you to make the Leahy amendment #2580 in order and to support it when offered."

Other signatories included Adobe, American Booksellers Foundation for Free Expression, Americans for Tax Reform, Association for Competitive Technology, American Library Association, Association of Research Libraries, Bill of Rights Defense Committee, Business Software Alliance, CAUCE North America, Center for Democracy & Technology, Center for Financial Privacy and Human Rights, Center for National Security Studies, Citizens Against Government Waste, Competitive Enterprise Institute, Computer and Communications Industry Association, The Constitution Project, Data Foundry, eBay, EDUCAUSE, Engine Advocacy, FreedomWorks, Liberty Coalition, Newspaper Association of America, Microsoft, Neustar, Personal, Salesforce.com, Sonic.net, SpiderOak, Symantec, TechFreedom, TechAmerica, TRUSTe, and the US Policy Council of the Association for Computing Machinery.

Share wisely, and take care.

Mid-Year Report Card on Business Video

Excerpted from Ignite Technologies Blog by Kimberlee Lueders

Now that we're past the mid-point of the year, it's report card time for the old crystal ball.

At the beginning of this year, I used this space to make a few predictions about trends impacting the streaming sector in 2012. The goal was to avoid gushing about obvious issues, such as the growing role of social media and mobile video, in the world of webcasting. Instead, I tried to glimpse ahead at some likely 2012 venues that held the potential for changing the way we perceive (and use) streaming video.

Let's take a look at the streaming lessons learned so far in 2012 in the venues we discussed at the beginning of the year:

The 2012 presidential election: My crystal ball projected that the presidential election would spark additional creativity in marketing political candidates online, establishing templates for how large corporations could better leverage online video for marketing. 

So Far… The volume of online videos posted by the campaigns continues to swell. Production values for the videos continue to be high, as well. But truly creative uses of video in this realm are sparse. The biggest lesson for corporate marketers here is that the online venue gives more people than ever a video megaphone, making it more challenging for companies to consolidate or control any branding message they seek to put into the marketplace.

London Summer Olympics: From the perspective of early 2012, the rogue video blogger appeared to be a viable threat to the economics of Olympics broadcasting. Time-zone challenges would drive interested viewers online to watch anything they wanted, captured and transmitted from the competition by fans using smart-phones and other video devices.

So Far… NBC, the US television network with Olympics coverage rights, is producing more than 5,000 hours of Olympic event coverage, making it broadly available online. Such saturation largely renders the idea of rogue amateur video moot. More than enough professionally produced content is available to satisfy even the most ardent Olympics fan. 

The real lesson from Olympics streaming lies in understanding the impact of video ubiquity on viewership. The Olympic fortnight will tell us whether real-time access to virtually any Olympic event via desktop, tablet device, and smart-phone will splinter viewership and destroy ratings or spark greater interest in Olympic storylines that will fuel higher ratings for core Olympic telecasts on the broadcast network.

Your Trip to the Mailbox: Troubles at the US Postal Service prompted me to project that companies would accelerate experiments in leveraging online video in marketing campaigns, preparing them for the day when sending traditional junk mail would no longer be an economically viable option.

So Far… Online video marketing experimentation continues in full force, but any full transition in corporate marketing will take years before dethroning the marketing king that is junk mail. In this area, expect online video to spark an evolution rather than a revolution.

Your midday conference call: At the beginning of the year, small online group meetings featuring video communications appeared poised to become more commonplace than ever.

So Far… Nothing appears to be derailing broader corporate adoption of video into small group meetings online. The quality of vendor offerings continues to rise and the willingness of organizations — and individual executives — to experiment with more forms of online video communications is as strong as ever. At the same time, providers of conferencing services are growing more aggressive in offering expanded solutions that enable large-scale one-to-many streaming webcasts on a more cost-effective basis. The net effect is more extensive adoption of online video in a range of business communications applications.

On balance, the venues projected at the beginning of 2012 to have significant impact on the streaming industry have delivered on the promise of change — even if the actual changes seen sometimes have been different from those predicted.

I guarantee that streaming will foster even more market change in the second half of the year. But we'll all just have to watch together to see how that unfolds without the benefit of prognostication. I'm taking a break from the crystal-ball gazing business — at least until the calendar turns to 2013 and lures me into taking another peek into the future of streaming.

Streaming Olympics Pays Off for NBC

Excerpted from Media Daily News Report by Wayne Friedman

NBC has been posting big digital video usage results from the London Olympics versus the Beijing Games four years ago. The network has seen nearly a 200% rise in total video streamed to 75 million, with over a 300% rise in live streams to 34 million.

NBC has been pulling in an average of 31.5 million unique viewers on laptop/computer use versus 29.1 million uniques at the Beijing Games.

NBC previously said it earned some $60 million in advertising sales for its digital platform efforts.

Mobile users accessing the NBCOlympics Web site have nearly doubled from the Beijing Games - to 5.2 million from 2.8 million. The NBC Olympics Live Extra app - which specifically brings live, re-air, and highlighted video - has posted 7.0 million users.

NBC has been implementing the first major use of so-called "TV Everywhere" authentication efforts, giving free Olympic coverage access to US TV consumers who have cable, satellite, or telco TV monthly service. This comes to more than 90% of all 115 million US TV homes. NBC says 6.2 million phone/tablet devices have been verified for the games.

Some of the best individual digital live video results come from traditional TV's most popular Olympic sports - swimming and gymnastics.

NBC says five Olympic events so far have surpassed 1 million live streams; Tuesday's women's gymnastics team gold medal final pulled nearly 1.5 million. On Thursday night, two swimming events pulled over 1 million - one when Michael Phelps defeated the field, including Ryan Lochte, to win the gold in the men's 200 individual medley. That event hit 1.2 million streams. All-around US gold medal gymnastics winner Gabby Douglas grabbed almost 1.1 million streams.

Monday's men's gymnastics team gold medal final hit 1.6 million streams; and Tuesday's swimming gold medal final scored 1 million. NBC says four of the five events (yesterday's swimming final being the exception) were streamed only to cable, satellite, and telco customers who verified their accounts.

IPC Selects Octoshape to Stream London 2012 Paralympic Games

Octoshape announced this week that the International Paralympic Committee's (IPC) website will feature a groundbreaking new video player for the London 2012 Paralympic Games that uses a combination of technologies to integrate live footage and results in a single unified and synchronized view.

Developed by the IPC's worldwide partner Atos and featuring Octoshape's innovative technology, the new Sport Media Application in Real Time (SMART) Player is the first of its kind and is set to revolutionize online streaming.

Viewers using the new SMART player will enjoy the signature features of Infinite HD-M powered experiences with significantly improved video quality and advanced features like Digital Video Recording (DVR), allowing viewers to pause and rewind live sporting action.

The linear video delivery for the London 2012 Paralympic Games will be delivered to consumers via Octoshape's recently announced Infinite HD-M Federated Multicast Broadband TV platform. This technology enables the quality, scale, and economics of traditional broadcast technologies over the public Internet. Telco and cable operators that are part of the Infinite HD-M Federated network receive signals via native IP Multicast in a way that allows them to easily manage large volumes of traffic without needing to upgrade their Internet capacity.

During London 2012 the IPC will live stream more than 780 hours of sporting action via five channels, two of which - streaming swimming and wheelchair basketball - will benefit from the SMART Player. As with previous Games, the IPC's online action will be sponsored by worldwide partners Samsung and Visa.

"For London 2012 we are striving to provide the best possible video experience," said Craig Spence, Director of Media and Communications for the IPC. "We chose Octoshape to ensure an exceptional consumer video experience for the 780 hours of live coverage we will be streaming."

"We are very excited to power this next generation experience for such an important initiative," said Michael Koehn Milland, CEO of Octoshape. "Our technology will enable audiences from around the world to view the London 2012 Paralympic Games in the highest quality and smoothest video playback."

DDN Debuts Comprehensive Management Suite to Simplify Big Data Infrastructure

DataDirect Networks (DDN), the leader in massively scalable storage, today unveiled DirectMon, a robust centralized management solution for DDN's award-winning storage, file system, and In-Storage Processing technology.

With a unified interface designed to handle all aspects of big-data storage infrastructure administration, DDN's DirectMon minimizes administrator overhead and provides a comprehensive framework for both real-time and predictive systems management and tuning of SAN, NAS and parallel file storage environments.

"DDN's DirectMon takes the complexity out of managing even the world's largest big-data environments, and enables our customers to deploy simple infrastructure capable of scaling up for deep storage capacity and scaling out for added storage capacity and performance," said Jean Luc Chatelain, Executive Vice President of Strategy and Technology at DDN.

"DirectMon increases our customers' agility as they continue to resolve critical growth and challenges in the big-data era."

DirectMon is immediately available for DDN's SFA platforms, including the 6620, 10K and 12K series of products. Additionally, DirectMon will immediately support DDN's GRIDScaler file system and IBM GPFS environments.

This is the first phase of a release strategy that will, over time, address the broader range of DDN's file storage products as well as the company's analytics infrastructure direction.

"Around the world, we are seeing an extraordinary amount of investment being made to store, manage, and process the massive data sets of the big-data era," said Jeff Boles, Senior Analyst and Director of Validation Services, Taneja Group.

"DDN was one of the first vendors to realize that Big Data is equal parts management and storage. Our research shows that big-data customers are consistently challenged with data management as their Big Data initiatives grow."

"DirectMon will undoubtedly be a key tool for Big Data customers as they face inevitable future growth, and this goes hand-in-hand with DDN's company DNA: DDN storage systems have long enabled new customers to enter into big-data initiatives when they wouldn't have been able to otherwise for reasons of storage complexity," he added.

Google Moves to Rival Amazon in Cloud Computing

Excerpted from Terracloud Report

Google is making its next move on the chessboard of cloud computing, and on the heels of recent outages of Amazon's cloud-based services due to lightning storms, it stands a chance at orchestrating a good, fierce rivalry with the online giant. Coupled with Google's recent announcement of its Kindle Fire-rivaling tablet, the Nexus 7, at competitive prices, this really throws down the gauntlet.

The Google Compute Engine infrastructure utilizes a cloud stack, meaning that a number of different services are offered via different venues for cloud-based computing: Google Apps, the Google App Engine, and Google Drive. While Amazon has focused on offering infrastructure, this approach to cloud computing emphasizes specific functions.

Google isn't the only company doing this, of course — Microsoft's Azure Cloud presents a similar menu of cloud-based services, as do Oracle, VMware, and others. But Google hopes to take them off the board with old-fashioned elbow grease: it claims that the Google Compute Engine can simply outperform any of its rivals, based on the kind of technology that powers its search engine.

Amazon offers EC2 at commodity prices, but Google hopes to take it down with dollars as well. The claim from Google is that its Compute Engine infrastructure will offer "50 percent more compute per dollar." There's no doubt that Amazon has benefited from being the only real player in the game for the past several years. If this claim holds, the price of cloud computing is going to drop dramatically.

Google also has the advantage of its branding and resources. It's a tech-based company — unlike Amazon, which is best known for selling everything from music to beach balls. Amazon has a head start, but Google has firepower.

Suffice it to say, Google is hoping to learn from the failures of Amazon's EC2. It had better. The latest outage of the Amazon Elastic Compute Cloud took out Netflix, Instagram, and Pinterest, and the outages have already cost Amazon customers. With new players in the game, the standards for cloud computing are going to rise as quickly as prices drop.

But how does Google plan to avoid these failures? Its tech savvy, of course. It's trying to design its cloud-based services so that they work as part of a global state (rather than being bound by region). To that end, it's working on a new piece of technology called Spanner. Spanner is described as "a storage and computation center that spans all of our data centers."

So how does this cloud computing showdown look so far? Amazon has a head start. And it does offer a lot: templates for cloud-based services, storage, databases, content delivery, and identity management.

Google's promises are striking some as a little thin in comparison to Amazon's offerings. It has integration with the highly popular Google Docs, but its Compute Engine Infrastructure doesn't offer quite as much as EC2 yet. With Google's history of innovation behind it, many have faith that it has yet to reveal all the plans up its sleeve. (There's enough at stake here that Google, obviously, isn't about to make this technology open source. Not yet, anyway.)

But there's no question that however Google compares now, the appearance of rivals to Amazon is a game changer. We'll see how things shape up in the next few months.

Rackspace Open Cloud Takes on Amazon AWS

Excerpted from Network Computing Report by Mike Fratto

What Is It? Open Cloud is a suite of offerings based on the OpenStack cloud environment, which runs on the open source XenServer.

Open Cloud includes the following new services: Cloud Servers, a server virtualization offering; Control Panel, a customer portal for managing all Rackspace products; and Cloud Databases, a MySQL 5.1 database service.

Open Cloud also includes the following services, which are in limited preview release and should be available by the fourth quarter: Cloud Monitoring; Cloud Networks, which is based on OpenStack's Quantum module; and Cloud Block Storage, based on Swift.

Three existing services — a content delivery network (CDN) called Cloud Files; Cloud Backup; and Cloud Load Balancers — round out Open Cloud.

Cloud computing platforms tend to lock customers into proprietary file formats and service APIs. Since there are few standards governing cloud computing, relying on an open source project like OpenStack means customers face fewer lock-in issues from their cloud service providers.

Additional benefits of Open Cloud include no lock-in with cloud servers; support for hybrid cloud usage models, which allows users to run public and private clouds across multiple cloud providers; and high-performance MySQL instances with cloud databases.

The instances are partitioned from other customers and metered like a utility based on CPU and RAM usage over time.

Amazon Web Services Boosts Database Cloud Computing

Excerpted from ZDNet Report by Charlie Osborne

A branch of Amazon, Amazon Web Services (AWS), has announced new features for customers using high-performance databases through cloud computing.

The launch of Amazon Elastic Block Store (Amazon EBS) Provisioned IOPS (input/output operations per second) introduces a new EBS volume type designed to improve the manageability of I/O-intensive tasks.

Consistent, reliable volumes that can deliver rapid response times are crucial for a number of database-driven tasks — especially in the cloud — and Amazon's Provisioned IOPS aims to fill this crucial role in the market.

The Provisioned IOPS EBS volume type allows customers to specify volume size and volume performance. The volumes have also been designed to allow customers to test, develop and deploy their applications with specific performance levels - through the management console, EBS volumes can be provisioned with required storage and IOPS before being attached to their Amazon EC2 instance.

Amazon EBS currently supports up to 1,000 IOPS per volume, although higher limits are in the pipeline.

Multiple EBS volumes can be added to a single EC2 instance. Customers can now also launch selected Amazon EC2 instance types as EBS-optimized instances — with dedicated throughput options of 500 to 1,000 megabits per second.
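To make the provisioning workflow concrete, the sketch below assembles the parameters such a volume-creation request would carry and enforces the 1,000-IOPS per-volume cap described above. The parameter names mirror the EC2 CreateVolume API, and the "io1" volume-type identifier is an assumption here — treat this as an illustrative sketch rather than working AWS client code.

```python
# Sketch: validating and assembling a Provisioned IOPS volume request
# before handing it to an EC2 client. The 1,000-IOPS cap is the
# per-volume limit cited in the article.

MAX_PIOPS_PER_VOLUME = 1000

def build_piops_volume_request(size_gb, iops, availability_zone):
    """Return CreateVolume-style parameters for a Provisioned IOPS
    volume, rejecting requests that exceed current limits."""
    if iops > MAX_PIOPS_PER_VOLUME:
        raise ValueError(
            f"Requested {iops} IOPS exceeds the "
            f"{MAX_PIOPS_PER_VOLUME} per-volume limit")
    if size_gb < 1:
        raise ValueError("Volume size must be at least 1 GB")
    return {
        "Size": size_gb,                      # volume size in GB
        "Iops": iops,                         # provisioned I/O ops per second
        "VolumeType": "io1",                  # assumed Provisioned IOPS type name
        "AvailabilityZone": availability_zone,
    }

request = build_piops_volume_request(100, 1000, "us-east-1a")
print(request["VolumeType"], request["Iops"])
```

Once the parameters pass validation, they would be submitted through the management console or API and the volume attached to an EC2 instance, as the article describes.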

Jeremy Przygode, CEO at Stratalux, a cloud solutions firm, said:

"A common request we see from both our large and small customers is to support high performance database applications. Throughput consistency is critical for these workloads. Based on positive results in our early testing, the combination of EBS Provisioned IOPS and EBS-Optimized instances will enable our customers to consistently scale their database applications to thousands of IOPS, enabling us to increase the number of I/O intensive workloads we support."

High-performance and reliability are not only a concern for the enterprise; they are also necessary for research programs, large-scale database deployment, and scientific projects. Amazon EBS is currently in use at NASA's Jet Propulsion Laboratory, where the new EBS Provisioned IOPS capability was prototyped to cope with the laboratory's computing and cloud demands.

The Amazon EBS Provisioned IOPS volumes are currently available in specific areas; including Virginia, California, Oregon, Ireland, Singapore, and Japan. Additional region launches are planned in the coming months.

AWS currently serves customers in 190 countries, and provides an infrastructure platform in the cloud aimed at enterprise, government, and start-ups.

Top Cloud Services for File Sharing and Syncing

Excerpted from PC World Report by Paul Lilly

With the help of cloud services, you can edit your documents on any Internet-connected device, and keep them up-to-date. We looked at Box, Dropbox, MediaFire, SkyDrive, and SugarSync to determine which one is the best.

The cloud delivers convenience, and nothing is more convenient than synchronizing files stored on multiple computers and accessing those files from any PC, smart-phone, or tablet with Internet access. We tested the top five syncing services.

Box. Anyone can register an account with Box and begin using it for free, but to take advantage of its robust collaboration and security features, you must open a paid Business or Enterprise account starting at $15 per month, per user (minimum of three users). Paying unlocks a truckload of enhancements, including Google Apps integration and other tools that business users will find practical. The user-admin console, for example, lets an IT administrator add users and manage their settings in bulk.

Personal accounts of up to 5GB are free; if you need more space, Box offers 25GB for $10 per month and 50GB for $20 per month — that's the least bang for the buck among the five services in this category. With a Personal account, you can share your files with other people, with or without giving them editing privileges, and you can restrict sharing to collaborators only. Box also provides the option of restricting file previews or downloads, but you're not allowed to set passwords or automatic expiration dates unless you have a paid account.

Dropbox. Simplicity is one of Dropbox's greatest strengths. Install the service on your PC, and it plops a virtual folder on your desktop. The folder acts just as any other folder does, except that it automatically uploads and syncs the files that you put in it to your online account. Changes upload in real time, so you need never worry about working with an outdated file.

On a free account, you get only 2GB of storage. If you want more, you have to pony up for a paid account; prices range from $10 per month for 100GB to $50 per month for 500GB. Pestering your family and friends to open accounts will earn you a 500MB bonus per referral, up to an additional 16GB.

One great feature: Dropbox keeps a history of file changes, so you can roll back to a previous version at any time. And the tech-savvy can come up with a million and one creative ways to use Dropbox.

For example, you might integrate it with a BitTorrent client so that you can download torrent files remotely. First, set your BitTorrent client on your home PC to monitor a folder on your Dropbox account and to automatically open any .torrent file copied to it. Then, while you're at work or traveling, use your remote PC to copy the .torrent file to Dropbox, and your home PC will begin downloading that file the next time Dropbox syncs.
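The watch-folder workflow above can be sketched in a few lines of Python. This is a minimal illustration, assuming ~/Dropbox/torrents is a synced folder and that your BitTorrent client accepts a .torrent path on its command line; the "transmission-gtk" command is a placeholder, and many clients offer a built-in "watch directory" setting that makes a script like this unnecessary.

```python
import subprocess
import time
from pathlib import Path

WATCH_DIR = Path.home() / "Dropbox" / "torrents"  # assumed synced folder
POLL_SECONDS = 60

def poll_once(watch_dir, seen, launch):
    """Hand every not-yet-seen .torrent file in watch_dir to launch()."""
    new = [p for p in sorted(watch_dir.glob("*.torrent")) if p not in seen]
    for path in new:
        launch(path)
        seen.add(path)
    return new

def watch():
    """Poll forever, opening each new .torrent file with the local client."""
    seen = set()
    while True:
        poll_once(WATCH_DIR, seen,
                  lambda p: subprocess.Popen(["transmission-gtk", str(p)]))
        time.sleep(POLL_SECONDS)
```

Copy a .torrent file into the synced folder from any device, and the next poll on the home PC picks it up and starts the download.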

On the downside, when you share a folder, you can't set a password or give some people permission to edit files while withholding permission from others. You also can't upload files to your Dropbox account via email. If neither of those limitations is a deal breaker for you, Dropbox is a strong contender.

MediaFire. Unlimited storage and downloads sounds enticing—until you realize that MediaFire has little else to offer, at least to free users. The biggest deal breaker for free users is that files vanish after 30 days. (The $9-per-month Pro and $49-per-month Business accounts dispense with the disappearing act and hold on to files "forever.")

The list of negatives is long. You can't place restrictions on shared files, no mobile apps are available, files aren't encrypted in transit or in storage, and MediaFire doesn't keep a history of changes. The final nail in the coffin: Users with a free account can't upload files bigger than 200MB.

SkyDrive. Are you planning to subscribe to Microsoft's Office 365 or buy Office 2013 when the new suites are available later this year? If so, SkyDrive is the file-sharing service for you. To use it, you must have a Windows Live account, and so must any colleagues you authorize to edit files (merely viewing shared documents does not require an account). SkyDrive allots 7GB of storage for free accounts, and you get 20GB more with either version of the Office suite. Even without that commitment, upgrades of 20GB to 100GB cost just $10 to $50 per year, not per month. That's an incredible value.

Unfortunately, Microsoft has been paring down its service. SkyDrive's free storage quota, for example, was once 25GB (existing customers were grandfathered into the original cap if they were using more than 4GB as of April 1, 2012, or if they took advantage of a Microsoft loyalty offer, which has since expired).

The company also zapped a feature that enabled users to publish their photos to SkyDrive through email. The iOS apps pick up the slack here (although the absence of Android support is annoying), but why take away a useful feature that's already built?

SugarSync. As sweet as its name, SugarSync is like Dropbox with extra toppings. Rather than limiting file syncing to one virtual folder, SugarSync lets you sync any folder on your PC, including your Desktop folder. Obsessive-compulsive types will love SugarSync File Manager's ability to organize scattered files and folders from numerous synced devices into a single handy window on your desktop. You can also open a file stored on a remote computer, edit it, and save it back to that computer without consuming permanent storage space on the computer you're using.

Road warriors will appreciate SugarSync's support for all the major mobile platforms, including BlackBerry and Symbian. You'll even find a mobile app built for the Kindle Fire. And you'll rest easy knowing that your top-secret recipes and revealing photos are securely encrypted in transit and in storage.

The tools for sharing files with other people are equally snazzy, though not as full featured as what you get with Box's Business or Enterprise accounts. SugarSync lets you share folders either as albums that anyone can view and download from (but not upload to), or as synced folders that require a SugarSync account. If you choose the latter, you can set permissions and passwords.

The Winner: Thanks to its rich selection of features, SugarSync takes the prize as the tastiest file-sharing service around, and it happens to boast the best iPad app, too.

Cloud Computing: 10 Ways It Will Change by 2020

Excerpted from ZDNet Report by Jack Clark

Right now we are in the early days of cloud computing, with many organizations taking their first, tentative steps. But by 2020 cloud is going to be a major — and permanent — part of the enterprise computing infrastructure.

Eight years from now we are likely to see low-power processors crunching many workloads in the cloud, housed in highly automated datacenters and supporting massively federated, scalable software architecture.

What form will cloud computing take in the year 2020?

Analyst group Forrester expects the global cloud computing market will grow from $35 billion in 2011 to around $150 billion by 2020 as it becomes key to many organizations' IT infrastructures.

Alongside this increase in demand from enterprise, there will be development in the technologies that support clouds, with rapid increases in processing power making cloud projects even cheaper, while technologies currently limited to supercomputing will make it into the mainstream.

And of course, by 2020 a generational shift will have occurred in organizations: a new generation of CIOs who have grown up using cloud-based tools will be in charge, making them far more willing to adopt cloud on an enterprise scale.

With all these developments in mind, here are 10 ways in which the cloud of 2020 will look radically different to the way it does today, as gleaned from the experts I've spoken to.

1. Software floats away from hardware

John Manley, Director of HP's Automated Infrastructure Lab, argues that software will become divorced from hardware, with more and more technologies consumed as a service: "Cloud computing is the final means by which computing becomes invisible," he says.

As a result, by 2020, if you were to ask a CIO to draw a map of their infrastructure, they would not be able to, says David Merrill, chief economist at Hitachi Data Systems. "He will be able to say 'here are my partner providers'," he says, but he will not be able to draw a diagram of his infrastructure.

This will be because it will be in a "highly abstracted space", where software is written in such a way that it goes through several filters before it interacts with hardware. This means that front-end applications, or applications built on top of a platform-as-a-service (PaaS), will be hardware agnostic.

2. Modular software

To take advantage of the huge armadas of hardware available via clouds, individual software applications are set to get larger and more complex as they are written to take advantage of scale.

With the growth in the size and complexity of individual programs, the software development process will place an emphasis on modular software — as in, large applications with components that can be modified without shutting down the program.
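The modular idea can be illustrated with a toy example (mine, not from the article): if components live behind a registry rather than being referenced directly, a component can be replaced at runtime without stopping the program that calls it.

```python
class ComponentRegistry:
    """Maps component names to callables; lookup happens on every call,
    so re-registering a name hot-swaps the implementation."""

    def __init__(self):
        self._components = {}

    def register(self, name, impl):
        self._components[name] = impl  # overwriting an entry is a live upgrade

    def call(self, name, *args, **kwargs):
        return self._components[name](*args, **kwargs)

registry = ComponentRegistry()
registry.register("greet", lambda who: f"Hello, {who}")
# A new version is deployed later, with no restart of the caller:
registry.register("greet", lambda who: f"Hi there, {who}")
```

Real modular systems add versioning, isolation, and graceful draining of in-flight requests, but the indirection shown here is the core of the pattern.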

As a consequence, cloud applications will require a new programming mindset, especially as they interact with multiple clouds.

"Software has to be thought about differently," HP's Manley says, arguing that the management of federated services will be one of the main 2020 challenges. This is because applications are not only going to be based in the cloud, but will hook into other clouds and various on-premise applications as well.

In other words, different parts of applications will "float around" in and out of service providers. Assuring good service-level agreements for these complex software packages will be a challenge, Manley says.

3. Social software

Along with the modular shift, software could take on traits currently found in social-media applications like Facebook, says Merrill. Programs could form automatic, if fleeting, associations with bits of hardware and software according to their needs.

"It will be a social-media evolution," Merrill says. "You will have an infrastructure. It'll look like a cloud, but we will engineer these things so that a database will 'like' a server, or will 'like' a storage array."

In other words, the infrastructure and software of a datacenter will mould itself around the task required, rather than the other way around. Developers will no longer need to worry about provisioning storage, a server and a switch, Merrill says: all of this will happen automatically.

4. Commodity hardware rules

By 2020 the transition to low-cost hardware will be in full swing as schemes such as the Open Compute Project find their way out of the datacenters of Facebook and Amazon Web Services and into facilities operated by other, smaller companies as well. "Servers and storage devices will look like replaceable sleds," says Frank Frankovsky, Facebook's VP of Hardware Design and Supply Chain, and Chairman of the Open Compute Project.

By breaking infrastructure down into its basic components, replacements and upgrades can be done quickly, he says. The companies best placed to use this form of commoditized infrastructure are large businesses that operate huge datacenters. "I would say that between now and 2020, the fastest-growing sector of the market is going to be cloud service providers," Frankovsky says.

5. Low-power processors and cheaper clouds

We're around a year away from lower-power ARM chips coming to market with 64-bit capability. Once that happens, uptake should accelerate: enterprise software will be developed for the RISC chips, allowing companies to use the power-thrifty processors in their datacenters and cut their electricity bills by an order of magnitude.

HP has created a pilot server platform — Redstone — as part of its Project Moonshot scheme to try to get ARM kit to its customers, while Dell has been selling custom ARM-based servers to huge cloud customers via its Data Center Solutions group for years.

By 2020 it's likely that low-power chips will be everywhere. And it won't just be ARM — Intel, aware of the threat, is working hard on driving down the power used by its Atom chips, though most efforts in this area are targeted at mobile devices rather than servers. Facebook thinks ARM adoption is going to start in storage equipment, then broaden to servers.

"I really do think it's going to have a dramatic impact on the amount of useful work, per dollar, you can get done," Frankovsky says. This should help cloud providers, such as Amazon Web Services, cut their electricity bills. Moreover, if they are caught in a price war with competitors, they are more likely to pass on at least a chunk of the savings to developers, in the form of price reductions.

6. Faster interconnects

The twinned needs of massively distributed applications and a rise in the core count of high-end processors will converge to bring super-fast interconnects into the datacenter.

Joseph Reger, chief technology officer of Fujitsu Technology Solutions, predicts that by 2020 we can expect communications in the datacenter to be "running at a speed in the low hundreds of gigabits per second".

Reger says he expects that there will be a "very rapid commodification" of high-end interconnect technologies, leading to a very cheap, very high-performance interconnect. This will let information be passed around datacenters at a greater rate than before, and at a lower cost, letting companies create larger applications that circulate more data through their hardware (known in the industry as 'chatty' apps), potentially allowing developers to build more intelligent, automated and complex programs.

7. Datacenters become ecosystems

Cloud datacenters will "become much like a breathing and living organism with different states", Reger says. The twinned technologies of abstracted software and commodified hardware should combine to make datacenters function much more like ecosystems, with an over-arching system ruling equipment via software, with hardware controlled from a single point, but growing and shrinking according to workloads.

Automation of basic tasks, such as patching and updating equipment, will mean the datacenter "will become more like a biological system," he says, in the sense that changes and corrections are automatically made.

8. Clouds consolidate

The Internet rewards scale, and with the huge capital costs associated with running clouds, it seems likely that there will be a degree of consolidation in the cloud provider market.

Fierce competition between a few large providers could be a good thing, as it would still drive each of them to experiment with radical technologies. For example, in a bid to cut its internal networking costs and boost utilization, Google has recently moved its entire internal network to the software-defined networking OpenFlow standard, which looks set to shake up the industry as more people adopt it.

Manley of HP argues there will be a variety of clouds that will be suited to specific purposes. "There's going to be diversity," he says. "I think you would only end up with a monopoly if there was an infrastructure around that was sufficiently capable to meet all the non-functional [infrastructure requirements] of those end services."

9. The generational shift

By 2020, a new generation of CIOs will have come into companies, and they will have been raised in a cloudy as-a-service world. There will be an expectation that things are available "as-a-service", Merrill says: "Our consumption model is changing as a generational issue."

And this new generation may lead to a shake-up in how businesses bill themselves for IT, Merrill says. "We have these archaic, tax-based, accounting-based rules that are prohibiting innovation," he adds.

10. Clouds will stratify

Today clouds are differentiated by whether they provide infrastructure-as-a-service, platform-as-a-service or software-as-a-service capabilities, but by 2020 more specialized clouds will have emerged.

According to Forrester, we can expect things like 'middle virtualization tools' and 'dynamic BPO services' to appear by 2020, along with a host of other inelegant acronyms. In other words, along with some large providers offering basic technologies like storage and compute, there will also be a broad ecosystem of more specific cloud providers, allowing companies to shift workloads to the cloud that would otherwise be dealt with by very specific (and typically very expensive) on-premise applications.

Merrill says clouds will, like any utility, be differentiated by their infrastructure capabilities into a whole new set of classes. "Just as we have power generation from coal, from natural gas, nuclear, hydroelectric, there will be differences," he says. "The economics, in my opinion, help us with differentiation and categorization."

Coming Events of Interest

ICOMM 2012 Mobile Expo — September 14th-15th in New Delhi, India. The 7th annual ICOMM International Mobile Show is supported by the Government of India, MSME, DIT, NSIC, CCPIT China, and several other domestic and international associations. On show: new technologies, new products, mobile phones, tablets, electronics goods, and business opportunities.

ITU Telecom World 2012 — October 14th-18th in Dubai, UAE. ITUTW is the most influential ICT platform for networking, knowledge exchange, and action. It features a corporate-neutral agenda where the challenges and opportunities of connecting the transformed world are up for debate; where industry experts, political influencers, and thought leaders gather in one place.

CLOUD COMPUTING WEST 2012 — November 8th-9th in Santa Monica, CA. CCW:2012 will zero in on the latest advances in applying cloud-based solutions to all aspects of high-value entertainment content production, storage, and delivery; the impact of cloud services on broadband network management and economics; and evaluating and investing in cloud computing services providers.

Third International Workshop on Knowledge Discovery Using Cloud and Distributed Computing Platforms — December 10th in Brussels, Belgium. Researchers, developers, and practitioners from academia, government, and industry will discuss emerging trends in cloud computing technologies, programming models, data mining, knowledge discovery, and software services.

Copyright 2008 Distributed Computing Industry Association
This page last updated August 12, 2012
Privacy Policy