Distributed Computing Industry
Weekly Newsletter


November 8, 2010
Volume XXXII, Issue 11


Facebook Acquires File-Sharing Service

Excerpted from Digital Media Wire Report by Mark Hefflinger

Facebook has acquired most of the technology and assets of online file storage and sharing service Drop.io.

Financial terms of the transaction were not disclosed. 

Founded in 2007, New York, NY-based Drop.io lets users upload files to its servers and then share those files via private links. 

The company will discontinue its service for all users as of December 15th, and has encouraged users to download all of their account data before then. 

Drop.io said that more than 10 million files have been uploaded to its service. 

The company had raised nearly $10 million in venture capital, from backers including DFJ Gotham Ventures, RRE Ventures and Rose Tech Ventures.

Zennstrom Shares Some of What He's Learned 

Excerpted from Wired News Report by David Rowan

Think of all the giant technology companies that have changed your life, and the chances are that they're American. Facebook, Google, Twitter, Apple, Microsoft - there's something about the US digital economy (not least the piles of Silicon Valley cash) that churns out tech billionaires faster than Europe can generate teetering banks.

That's why Niklas Zennstrom is such a role model to entrepreneurs on the east side of the Atlantic. Not only did he co-found and run Skype, the London-based Internet-phone start-up that eBay bought in 2005 for $3.1 billion; he also, with business partner Janus Friis, created the game-changing P2P software Kazaa, launched the online video-sharing service Joost, and now runs a Mayfair-based investment firm called Atomico, which recently raised $165 million.

Not bad for a 44-year-old Swedish-born Londoner listed in the latest Sunday Times "Rich List" as worth a mere $518 million.

What, then, can the rest of us learn from Zennstrom? I recently spent an afternoon with him in search of lessons he's picked up on the way.

"It's hard work. When a business becomes successful seemingly overnight, no one knows about all the months and years you've invested, all the projects you've tried before that didn't work."

"You shouldn't be afraid of failure - when something fails, you think, what did I learn from that experience, I can do better next time. Then kill that project and move on to the next. Don't get disappointed."

"Often you're the only one who believes in what you're doing. Everyone around you will say, 'Why not give up? Don't you see it won't work?' You then have to find out, are they right or am I right? It took a year to raise money for Skype: we went to 26 different venture capitalists, asking for 1.5 million euros and prepared to give away a third of the company. But no one wanted to invest."

"Surround yourself with smart, dedicated people - to build something isn't a one-man show. It's more important to have smart people who really believe in what you're doing than really experienced people who may not share your dream."

"Try to prove there are people actually interested in your product before you spend money building a business. Test it on your mother, sister, friends - I tried Skype on them very early on. Though you never know with the 'mum test' if they're saying good things because they just want to be nice."

"Think globally. If you don't think big, it's unlikely you'll become big. We made sure from day one that Skype was an international business - we were incorporated in Luxembourg, we had software developers in Estonia, we moved to London. The Internet has no country boundaries."

"If you want to be an entrepreneur, it's not a job, it's a lifestyle. It defines you. Forget about vacations, about going home at 6 PM - last thing at night you'll send e-mails, first thing in the morning you'll read e-mails, and you'll wake up in the middle of the night. But it's hugely rewarding as you're fulfilling something for yourself."

"If you're married, your spouse needs to be into it. My wife's salary could support us while we were founding both Kazaa and Skype. With children it becomes harder."

"Money, for me, was one motivation - but so was the drive to change something, to make something happen. And to prove to the world you can do something real. If you're only driven by making money, you're not going to be as likely to make it."

"None of my family were entrepreneurs - my parents were teachers. But I thought early on, in school in Sweden, that one day I wanted my own company as that was the way to make real money. I wanted to prove to others and myself that I could make it big."

"Don't give up if you meet some resistance. I didn't need to raise this fund - but we continued right through the financial storm and raised $165 million. So don't run for cover."

"Once you're successful, people listen to you more. You get taken much more seriously. And people expect that the next thing you do will be instantaneously successful - which makes everything much more difficult. Just because you had one success, doesn't mean you'll have another."

"I'm doing a lot of philanthropy. I don't feel any obligation to do so, but I'm passionate about the environment and climate change. It's very rewarding."

"With success, you have the ability to inspire people. I feel a public figure to some extent - that comes with the job. I'm comfortable with that."

"The UK is the best country in Europe when it comes to setting up companies. But it's no longer as attractive for entrepreneurs to move here, and David Cameron should reset the conditions to those I found when I moved here: taxes shouldn't be as high for stock options, and there should be taper relief so that if you invest all your savings to build something, you don't get taxed away."

"Of course there's envy, but you have to manage it. I don't see that as a big problem."

Report from CEO Marty Lafferty

Since 2008, cloud computing has been the fastest-growing trend in distributed computing, combining and extending grid computing, P2P, and virtualization.

Essentially, cloud computing addresses in a fresh way the ever-present challenge of adding new services for users, developers, and IT departments quickly and effectively.

The conventional response to this challenge has been to expand infrastructure: buy new servers, increase software costs, and provision more data-center capacity.

A promising alternative is to look to the cloud: pay for the bandwidth and server resources that you need. When your big push is done, turn the whole thing off.

High-profile skeptics of this vision of "the cloud as panacea" have included Oracle's Larry Ellison, who said, "We've now defined cloud computing to include everything we already do. Our industry is more trend-driven than the fashion industry. I have no idea what anyone's talking about. It's complete gibberish. It's insane. When is this idiocy going to stop?"

And indeed, we are probably guilty of ascribing the potential benefits of cloud computing to everything and the kitchen sink: databases, app servers, software code, mobile devices, PCs, and all that connects them.

But the fact is that you are probably already in the cloud. Online e-mail accounts (Gmail, Yahoo), social networking sites (Facebook, Twitter), photo-sharing sites (Flickr, Picasa), search engines (Google, Bing), productivity software (Google Docs), and personal finance software (Quicken, online banking and bill-paying tools) have all migrated to the cloud and, if you use them, congratulations, so have you!

There are four basic attributes of cloud computing: it's virtual - the physical location and underlying infrastructure details are transparent to users; it's scalable - able to break complex tasks into pieces to be served across an incrementally expandable infrastructure; it's efficient - using Service-Oriented Architecture (SOA) for dynamic provisioning of shared compute resources; and it's flexible - able to serve a variety of workload types, both consumer and commercial.

Understanding cloud architecture gets more involved with such considerations as outsourced processes, online storage, platforms, online offices, shared calendars, online resources, third-party integration, online collaboration, and offline access.

A simpler way to look at this is from the perspective of the end user - whose ultimate access to the desired web application is accomplished basically through a user interface (UI), system management, provisioning services, and cloud servers.

One of the best cloud definitions comes from the National Institute of Standards and Technology (NIST). It includes five essential characteristics: measured service, rapid elasticity, on-demand self-service, broad network access, and resource pooling; three service models: software-as-a-service (SaaS), platform-as-a-service (PaaS), and infrastructure-as-a-service (IaaS); and four deployment models: private, public, hybrid, and community.
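
For discussion purposes, the taxonomy is compact enough to restate as a small data structure. The sketch below (plain Python written for this column - a convenience, not a NIST artifact) simply encodes the definition so a described offering can be checked against it:

    # The NIST cloud definition restated as plain Python data - a convenience
    # for discussion, not a NIST artifact.
    NIST_CLOUD_MODEL = {
        "essential_characteristics": [
            "measured service", "rapid elasticity", "on-demand self-service",
            "broad network access", "resource pooling",
        ],
        "service_models": ["SaaS", "PaaS", "IaaS"],
        "deployment_models": ["private", "public", "hybrid", "community"],
    }

    def check_offering(service_model, deployment_model):
        """Confirm an offering names a recognized service and deployment model."""
        assert service_model in NIST_CLOUD_MODEL["service_models"]
        assert deployment_model in NIST_CLOUD_MODEL["deployment_models"]
        return service_model, deployment_model

    check_offering("IaaS", "hybrid")   # e.g., a virtual private cloud offering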

These deployment models are configured into the cloud ecosystem. Private cloud data centers are internal to an enterprise, and may connect to external public clouds, whose data centers support many customers, and may also include virtual private clouds (VPCs).

Closely related to VPCs are hybrid clouds, which combine private internal clouds with public external ones.

The three fundamental cloud service models typically are geared toward certain types of users: SaaS offers applications to end-users, PaaS supports execution platforms for developers, and IaaS provides infrastructure for IT operations. In each case, the enabling technology delivers the cloud service at scale - and in some cases delivers all three in the form of unified service delivery.

The classic cloud service pyramid descends from a pinnacle of SaaS applications serving narrow niches, through PaaS, down to IaaS, with the breadth and scope of the services provided increasing at each level.

Looking at SaaS, PaaS, and IaaS more closely, SaaS is increasingly popular with small-to-midsize enterprises (SMEs) because there is no hardware or software to manage and the service is delivered through a browser.

Examples include customer relationship management (CRM), financial planning, human resources, and word processing offered by cloud providers like Salesforce.com and emailcloud.

PaaS is attractive because platforms are built upon infrastructure, which is expensive, and since estimating demand is not even close to a science, platform management isn't exactly fun.

Examples include Google App Engine, Mosso, and Amazon Web Services' S3.

IaaS more broadly involves the infrastructure stack comprising full operating system (OS) access, firewalls, routers, and load balancing.

Examples include Flexiscale and Amazon's well-known EC2.

Common factors for all of these include pay-per-use, instant scalability, security, reliability, and application program interfaces (APIs).

And advantages include lower cost of ownership, reduced infrastructure management responsibility, the ability to allow for unexpected resource loads, and faster application rollout.

Looking more closely at cloud economics, these advantages derive from multi-tenancy, from virtualization that lowers costs by increasing resource utilization, from the economies of scale the technology affords, and from automated update policies.

A core economic benefit to cloud users comes from paying by actual use instead of provisioning for peak usage. By having the available capacity of data center resources more closely match demand over time, the waste of unused resources can be largely eliminated.

In basic terms, cloud economics address the fundamental risk of data center capacity over-provisioning, which is under-utilization.

And, what may not be so obvious, they also address the two potentially heavy penalties of under-provisioning: lost revenue and, arguably worse, lost users who over time must find alternatives if their demand for computing resources is not met.
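
To make that concrete, here is a toy calculation; the rates and the 24-hour load curve below are invented for illustration and do not come from any provider's price list:

    # Toy comparison of peak provisioning vs. pay-per-use; every number here
    # is an assumption made up for illustration.
    hourly_demand = [4]*7 + [10, 30, 80, 100, 90] + [40]*6 + [20]*6  # 24 hours

    OWNED_RATE = 0.05   # assumed amortized cost per owned server-hour
    CLOUD_RATE = 0.12   # assumed on-demand rate (higher per unit)

    # Owning: provision for the peak and pay for it around the clock.
    peak = max(hourly_demand)
    owned_cost = peak * 24 * OWNED_RATE

    # Cloud: pay only for the server-hours each hour actually uses.
    cloud_cost = sum(hourly_demand) * CLOUD_RATE

    utilization = sum(hourly_demand) / (peak * 24)
    print(f"owned: ${owned_cost:.2f}/day at {utilization:.0%} utilization")
    print(f"cloud: ${cloud_cost:.2f}/day")

Run as written, the owned data center costs $120 a day at 29% utilization against $83.76 for pay-per-use; flip the assumptions to a flat load near full utilization and owning wins, which is why the economics favor bursty workloads.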

Cloud computing is not without risks, the greatest of which is security - having proprietary and confidential data in the hands of a third party. Others include downtime, access concerns, dependency on a mission-critical vendor, and interoperability with other computing functions.

But these are mitigated by six compelling properties of cloud computing: it is user-centric, task-centric, powerful, accessible, intelligent, and programmable.

There are four giants among cloud computing vendors today: Amazon, which really started the current trend by getting so good at managing resources for its own e-commerce business that it decided to market this capability to others; Google, with its Google Apps at $50 per user per year among many more offerings; Microsoft, with its Azure platform for developers; and Salesforce.com, which offers a free version for up to 100 users of a single web app, or unlimited apps for unlimited users at $75 per user per month.

There are a number of other players now emerging with very attractive cloud computing solutions, including Appistry, B-hive, CA, Citrix, CycleComputing, Geschickten, EngineYard, GoGrid, HP, IBM, Joyent, Mosso, Oracle, Path, Q-layer, Sun, Verio, VMOps, VMware, 3tera, and many more.

Looking ahead, Google's products and services are nearly all cloud-based already, and it has become the first big-name company to announce a cloud-based operating system - Chrome OS.

Microsoft is in the process of rolling out a cloud-based version of Microsoft Office and is introducing its own cloud-based OS - Midori.

Apple is making up for lost time in the cloud arena with its web-based services MobileMe and iWork, and is also investing in a several-hundred-million-dollar data center.

The DCIA continues to see enormous potential for cloud computing, going back to the 2008 study by Merrill Lynch, which projected that the market for cloud computing services would surpass $95 billion by 2013. IBM is spending $360 million on its cloud computing data center. Microsoft has made the cloud one of its highest priorities. And the Gartner Group has named cloud computing this year's most important strategic technology.

When considering cloud computing, it's critical to weigh both the pros and cons: scale and cost versus questions of security; encapsulated change management versus being locked-up with a vendor; next generation architectures versus reliability concerns; and choice and agility in many areas versus a lack of control in some others.

To discuss this more fully, please join me at the Stifel Nicolaus Telecom, Media & Tech Boot Camp this Thursday, November 11th, at the Newseum here in Washington, DC. Share wisely, and take care.

Technology and Politics after the Midterms

Excerpted from SmartBrief Report by Adam Mazmanian

The Republican takeover of the US House of Representatives is expected to lead to an about-face in the way Congress approaches much of the Obama administration's technology policy. A key panel to watch is the House Subcommittee on Communications, Technology, and the Internet. This is the panel that will take up issues related to the power of the Federal Communications Commission (FCC), network neutrality, the national broadband initiative, and online privacy.

But tech watchers will have to wait for some political jockeying among House Republicans to head the powerful Energy and Commerce Committee before the subcommittee race shakes out. It's shaping up to be a three-way contest.

Congressman Cliff Stearns (R-FL) is the ranking Republican on the subcommittee, but he may be setting his sights higher. The chair of the powerful Energy and Commerce Committee is a plum assignment for any GOP lawmaker, since that committee will steer efforts to roll back national health care legislation. Stearns cites his 96% rating from the American Conservative Union as evidence of his bona fides.

He's also made his case in more tangible ways, cutting a pair of $300,000 checks to the Republican campaign efforts. If Stearns wins the chairmanship (considered a long shot), under GOP rules he'll have to give up his subcommittee post, which will set off a scrum among senior members of the panel.

Stearns faces stiff competition for the chairmanship. Congressman Joe Barton (R-TX) is the ranking Republican on the full committee, but it's possible he'll need a waiver from the leadership to be eligible for a second stint as chairman, because House Republican conference rules limit members to three terms atop a committee. Barton believes that House rules allow him three terms as chairman, but he'll have to convince minority leader and presumptive House Speaker John Boehner of that.

This could be a tough sell, in part because of public reaction to Barton's famous (and eventually retracted) apology to BP's then-CEO Tony Hayward for a government "shakedown." Still, Barton is reportedly confident that he'll secure the chairmanship of the Energy and Commerce Committee. He has also been generous with campaign help, with $1.1 million in GOP donations.

To some, Congressman Fred Upton (R-MI) is the front-runner for the post. An article on the future of health care reform in Politico casually names Upton as the likely chairman of the Energy and Commerce Committee. A piece in Upton's hometown Kalamazoo Gazette points out that Upton will face challenges from the right wing of his party, because he is viewed as a moderate - particularly on environmental issues.

But Upton is the next most senior member after Barton, and if Barton loses his bid either because of conference rules or because he's politically untenable, Upton seems the likely pick. He has also paved the way for his ascendancy with more than $1 million in donations to party coffers.

Once the Energy and Commerce chair is spoken for, tech watchers can hope for a clearer picture of the race (if there is one) to head the communications subcommittee.

On the Democratic side, current subcommittee Chairman Rick Boucher lost his seat in Tuesday's elections, in a tight race with Republican Morgan Griffith. Candidates to lead the opposition on the panel include Congresswoman Anna Eshoo, a California Democrat whose district includes Google headquarters, and Congressman Ed Markey (D-MA).

Reinventing Distributed Computing 

Excerpted from Channel Tech Center Report by Michael Vizard

Distributed computing has been around for 40 years or so and has been very good to the channel. After all, all the components that make up a fully-functional distributed computing environment have to come from somewhere.

But many customers have always balked at the complexity of distributed computing. So what would happen if a vendor came up with a new, much simpler approach to distributed computing?

That's the thinking that went into the Translattice Application Platform. According to Translattice CTO Michael Lyle, the Translattice Application Platform consolidates all the information technology (IT) infrastructure required for distributed computing into a single appliance.

Those appliances then manage all the nodes on the network as if they were a single cluster. This not only provides for high availability, said Lyle, it also creates an architecture where application logic can migrate to where it needs to be deployed on that network based on performance and compliance requirements.

The Translattice Application Platform manages application state and all the required commit mechanisms across all the nodes on the network, leveraging an approach based on the Paxos consensus protocol. This approach, says Lyle, not only reduces the complexity of distributed computing, it significantly reduces the amount of IT infrastructure required to support these applications.
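
Translattice has not published its implementation, but the Paxos idea itself can be sketched compactly. Below is a toy single-decree Paxos round in Python - in-process calls stand in for messages, and failure handling is omitted - just to show the two-phase promise/accept structure such commit mechanisms build on:

    # A toy single-decree Paxos round: direct calls stand in for messages and
    # failures are ignored. Illustrative only - not Translattice's code.
    class Acceptor:
        def __init__(self):
            self.promised = -1       # highest proposal number promised
            self.accepted = None     # (number, value) last accepted, if any

        def prepare(self, n):
            # Phase 1b: promise to ignore proposals numbered below n.
            if n > self.promised:
                self.promised = n
                return True, self.accepted
            return False, None

        def accept(self, n, value):
            # Phase 2b: accept unless a higher-numbered prepare was promised.
            if n >= self.promised:
                self.promised = n
                self.accepted = (n, value)
                return True
            return False

    def propose(acceptors, n, value):
        """Run one proposal round; return the value chosen, or None to retry."""
        # Phase 1a: solicit promises; proceed only with a majority.
        replies = [a.prepare(n) for a in acceptors]
        granted = [acc for ok, acc in replies if ok]
        if len(granted) <= len(acceptors) // 2:
            return None
        # Paxos safety rule: adopt the highest-numbered value already accepted.
        prior = [acc for acc in granted if acc is not None]
        if prior:
            value = max(prior)[1]
        # Phase 2a: ask acceptors to accept; a majority of votes decides.
        votes = sum(a.accept(n, value) for a in acceptors)
        return value if votes > len(acceptors) // 2 else None

    acceptors = [Acceptor() for _ in range(5)]
    print(propose(acceptors, 1, "commit txn 42"))   # -> commit txn 42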

Solution providers tend to wince when a vendor promises to take the complexity out of anything IT-related. After all, complexity tends to drive services opportunities in the channel. But when it's the solution provider that delivers the actual service, the cost of providing the distributed computing environment falls on the solution provider's shoulders.

And if that's the case, a more effective approach to distributed computing is going to be too good a thing to ignore.

What Is Personal Cloud Computing?

Excerpted from PC Magazine Report by Rivka Tadjer

Imagine your PC and all of your mobile devices being in sync - all the time. Imagine being able to access all of your personal data at any given moment. Imagine having the ability to organize and mine data from any online source. Imagine being able to share that data - photos, movies, contacts, e-mail, documents, etc. - with your friends, family, and co-workers in an instant. This is what personal cloud computing promises to deliver.

Whether you realize it or not, you're probably already using cloud-based services. Pretty much everyone with a computer has been. Gmail and Google Docs are two prime examples; we just don't think of those services in those terms.

In essence, personal cloud computing means having every piece of data you need for every aspect of your life at your fingertips and ready for use. Data must be mobile, transferable, and instantly accessible. The key to enabling the portable and interactive you is the ability to sync up your data among your devices, as well as access to shared data. Shared data is the data we access online in any number of places, such as social networks, banks, blogs, newsrooms, paid communities, etc.

Ultimately, your personal cloud - which includes everything from your address book and music collection to your reports and documents for work - will connect to the public cloud and other personal clouds. Everything connects. That means every place on the Internet you interact with, as well as every person you interact with can be connected. This includes your social networks, bank, university, workplace, family, friends - you name it.

Of course, you will determine what you show the public and what you keep private. Clusters of personal clouds will form new social networks that will likely have a lot more privacy settings than Facebook, especially if these clusters are family or business oriented. (Privacy will be a huge issue as personal clouds hit critical mass.)

Eventually, like the smart house in the TV series "Eureka," your devices will learn about you and eventually intuit what you are doing, where you are going, and what you intend to do when you get there. Think of all this as helpful, not creepy.

This might all sound a bit like science fiction, but this is exactly where we're headed with cloud computing. We're not quite there yet, though. We're all still creating our personal clouds.

So, what is involved in creating a personal cloud and what can you do with it right now? We'll explain in subsequent pieces.

Level 3 Beefs-Up CDN Network

Excerpted from Von Xchange Report by Kelly Teal

Level 3 Communications has been talking about its content delivery network (CDN) for years, and the investment advisors over at Motley Fool say this is an area where Level 3, anxious to "breathe new life into its stagnant data pipeline operations," must focus.

The Broomfield, CO operator is doing just that.

On Tuesday, Level 3 said it has added 1.65 Tbps to its global capacity, and has beefed up its footprint with two cities in Canada - Montreal and Toronto - and three in Europe - Brussels, Hamburg, and Munich. Level 3 completed most of the work during the third quarter.

This is just the kind of investment the Motley Fool wants to see.

"The faster Level 3 can become a content delivery powerhouse, the better," Motley Fool's Anders Bylund wrote on October 29th. "Being one of three CDN providers to serve up movie streams for digital movie maven Netflix is a good start, but Level 3 clearly can, wants to, and will do more."

To be sure, Level 3 long has provisioned CDN services for entities including Major League Baseball Advanced Media and the Disney group. For Disney, Level 3 oversees the delivery of "major online events as well as video on-demand," said Bud Albers, CTO of Disney Connected and Advanced Technologies.

Level 3 said much of the higher demand it's accommodating comes from content streamed over Adobe, Apple, and Microsoft platforms.

"We have seen a significant increase in demand for our CDN services this year," said Mark Taylor, Level 3's Vice President of Content and Media. "With this rate of growth and with customer needs in mind, Level 3 will continue to develop CDN capabilities, add capacity, and improve performance."

Level 3's CDN business competes against pure-play CDN companies Akamai Technologies and Limelight Networks. Level 3 bought SAVVIS's content delivery business several years ago to enter the CDN market, which has proved a smart move as streaming video mounts a formidable challenge to pay-TV alternatives.

Internet TV Wars: Yahoo & Samsung Step-Up to Google & Sony

Excerpted from Fast Company Report by Kit Eaton

Yahoo and Korea's Samsung aren't going to concede defeat in the Internet TV game to Google and Sony: The two companies are going to sell Yahoo-connected TVs in 26 more nations.

Though Google has recently stolen most of the headlines about net-connected TV tech, thanks to its partnership with Sony and Logitech, Yahoo's actually been operating in this space for a while. It sells Yahoo-connected TVs, enabled with enough computer power to run widgets on screen (revealing the weather, news, finance data, Twitter or Facebook status updates and so on) in 13 nations. This week, Yahoo revealed a partnership with Samsung that will see its TVs sold in 39 nations.

It's an interesting move: Samsung is one of the very biggest players in the HDTV game, and Yahoo's Connect TV system is a third way for consumers who want to embrace the next-gen of TV tech. Yahoo TV sits between the products from Apple and Google.

Apple TV is designed as a super-simple set-top box (STB), with an elegant interface, tightly-controlled functions and Apple's traditional control over what operations you can perform on it.

Google's system is far more sophisticated, something akin to strapping a netbook to the back of your TV - a single glance at the insanely complicated remote control Sony's shipping with its Google TVs reveals that the device really is like the Net injected into a typical household TV experience.

Talking about the new moves, Yahoo's Senior Director of Connect TV Russ Schafer didn't mince words when he confirmed Yahoo's position in the market: "We don't think people want the whole web browser experience crammed in a TV."

The news comes at about the same time that major TV networks are blocking access to their shows via Google's TV system, which highlights how very dynamic the Net TV market is right now.

We know Apple is continuously negotiating with TV and movie content providers to get its TV system up to speed, and that a burgeoning hacker community is tackling the new A4-powered TV unit to expand its powers.

Yahoo's system, which looks much like a normal TV but brings extra functionality, may be the way to appeal to many a consumer who likes the idea of a future-focused TV, but balks at Google's complexity or Apple's walled-garden approach.

Geocities Lives on as Massive Torrent Download

Excerpted from Wired News Report by Scott Gilbertson

Right now, you can download the bulk of "Geocities" in a single, giant 652GB file over BitTorrent.

The seminal free web-hosting site has been off the tubes since last year, when its owner, Yahoo, shut it down.

Most of us probably didn't care about Geocities disappearing. Its content was outdated. The design of most pages made MySpace look like something created by Edward Tufte. And the HTML tables - oh, the tables!

However, enough people did care about the demise of Geocities to form a group that calls itself The Archive Team, which began grabbing as much of Geocities as it could before Yahoo killed it. On Sunday, that archive of Geocities was made available as a 652GB torrent.

If you don't want to download 0.65 terabytes of the web equivalent of space junk, you can merely browse one of the several mirrors the Archive Team has set up at reocities.com, geociti.es, geocities.ws, and oocities.org.

At one of those sites, you can get your fill of jazz midi files, learn about the totally amazing all-female grunge band L7 and pay a visit to Spanky's mushroom-infested link compendium without downloading the entire payload.

It's easy to joke about Geocities. After all, the pages hosted there look very primitive from this web X.x vantage point. But the Archive Team is trying to make a point, both about our "digital heritage" and the short-lived nature of popular websites. In the team's own words:

What we were facing, you see, was the wholesale destruction of the still-rare combination of words and digital heritage, the erasing and silencing of hundreds of thousands of voices, voices that represented the dawn of what one might call "regular people" joining the World Wide Web. A unique moment in human history, preserved for many years and spontaneously combusting due to a few marks in a ledger, the decision of who-knows for who-knows-what.

But you see, websites and hosting services should not be "fads" any more than forests and cities should be fads - they represent countless hours of writing, of editing, of thinking, of creating. They represent their time, and they represent the thoughts and dreams of people now much older, or gone completely.

There's history here. Real, honest, true history. So the Archive Team, along with other independent teams around the world, did what it could, and some amount of Geocities was saved.

If you'd like a little bit of Internet history (OK, a massive bit of Internet history) head on over to The Pirate Bay. And please, remember to seed.

NIST Talks Simulation and Cloud Roadmap

Excerpted from Government Computer News Report by Rutrell Yasin

The National Institute of Standards and Technology (NIST) is working on a simulation model to understand and predict behavior in cloud computing systems, Dawn Leaf, the agency's Senior Executive of Cloud Computing, told attendees at a NIST forum on November 4th.

The cloud computing simulation model project, also known as Koala, focuses on the behavior of infrastructure-as-a-service (IaaS) cloud systems. The objectives are to compare the behavior of proposed resource-allocation algorithms for IaaS clouds, and to discover and characterize complex behaviors that may emerge in those clouds.
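
NIST has not released Koala's internals, but the kind of comparison described is easy to picture. The toy sketch below - its workload, capacities, and both placement algorithms are invented for illustration - drives a simulated IaaS cluster with VM requests and counts how many each algorithm must reject:

    # Toy IaaS placement simulation in the spirit NIST describes for Koala;
    # the workload, capacities, and algorithms are invented for illustration.
    import random

    random.seed(1)
    NODES, CAPACITY = 10, 16                                     # assumed cluster
    requests = [random.choice([1, 2, 4, 8]) for _ in range(60)]  # VM sizes

    def first_fit(free, size):
        # Place on the first host with room.
        for i, f in enumerate(free):
            if f >= size:
                return i
        return None

    def worst_fit(free, size):
        # Place on the emptiest host, spreading load.
        i = max(range(len(free)), key=lambda j: free[j])
        return i if free[i] >= size else None

    for name, place in [("first-fit", first_fit), ("worst-fit", worst_fit)]:
        free = [CAPACITY] * NODES
        rejected = 0
        for size in requests:
            i = place(free, size)
            if i is None:
                rejected += 1        # no host can take this VM
            else:
                free[i] -= size
        print(f"{name}: rejected {rejected}/{len(requests)} requests")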

NIST officials expect to share the initial findings of the project in early 2011, Leaf said during a presentation at NIST's Cloud Computing Forum and Workshop II held this week in Gaithersburg, MD.

The simulation project is an example of work NIST and agencies such as the General Services Administration (GSA) have been doing since May, when NIST held its first Cloud Computing Summit.

Other work has included the release of a draft special publication that gives security guidelines for virtualization, the release of security controls for the Federal Risk and Authorization Management Program (FedRAMP), as well as development work on a portal designed to foster collaborative development of cloud computing standards, known as the Standards Acceleration to Jumpstart the Adoption of Cloud Computing (SAJACC) portal.

NIST is now looking forward to developing a strategic roadmap for cloud computing with the help of federal and industry stakeholders, Leaf said.

The first step is to define target government cloud computing business use cases, Leaf said. These cases would be different from business implementation cases such as the 30-plus cases across federal, state, and local governments published on the Federal CIO Council's website in May.

"The goal here is to identify opportunities for deploying clouds that we have not yet implemented," she said. The aim is to identify the interoperability, portability, and security requirements needed to go forward.

These business use cases are also different from those connected with SAJACC. The 24 SAJACC use cases - announced at the forum - focus on how consumers get data into cloud service providers' environments.

The target business use cases are more operational. A hypothetical example could be determining what requirements are needed to implement a community cloud for export licensing enforcement that supports the Commerce, Defense, Homeland Security, and State departments.

Leaf used this example because in the federal government, "We tend to polarize between public and private cloud." If data is already available on the web, agencies are comfortable putting it in a public cloud. If there are security requirements, the approach is to put the data in a private cloud. However, there are many types of clouds between these two such as community or hybrid clouds that need to be explored, Leaf said.

The next step is to define a neutral cloud computing reference architecture and taxonomy. A hardware manufacturer's reference architecture would be more focused on infrastructure while a data management provider would tend to focus on data management issues.

So it is not clear what a cloud computing reference architecture would look like at this point. The goal is to open the dialogue, Leaf said. What is clear is that the model should not prescribe a particular implementation. Plus, it has to be flexible enough to allow cloud services to be mapped to an overall model so business use cases can be discussed.

The third part of NIST's strategy is generating a cloud computing roadmap. By mapping business mission requirements against a cloud reference model, NIST hopes to identify the gaps that need to be filled with regard to standards. "We can figure out what is missing in terms of standards."

Leaf emphasized that ownership of this cloud computing roadmap is community-based, involving collaboration between the government IT community and industry, which brings its expertise in the form of a reference model, ontology, and technology.

Federal CIO Vivek Kundra also emphasized the need for government and industry partnerships to achieve the goals for cloud computing.

Kundra reflected on the momentum building for the cloud. For example, GSA recently awarded 11 contracts to vendors that will provide IaaS including storage, virtualization, and web hosting. The cities of Los Angeles and New York and the state of Wyoming are moving services to the cloud. IBM and Microsoft have announced government clouds.

Government and industry are on a one-way street headed toward the cloud, he said. "We want to make sure as we think about policy and security it is not done so in an abstract, closed ecosystem," Kundra said. The process has to be done in an open, participatory fashion "so we can be beneficiaries of everyone's thinking," he noted.

CIOs have to make sure they have the right security controls in place as they move agency resources to the cloud. That is why the government launched FedRAMP and this week released a set of proposed controls and models to certify cloud solutions government-wide, Kundra said. He urged those from the public and private sector attending the forum to look at those controls and give the government feedback.

"We want to make sure from an economic perspective as cloud vendors sell into the government they are not doing so in a fragmented manner where they are negotiating with every bureau and agency," Kundra said.

It is vital that the government and private sector get standards right from the beginning because once they are hardwired it will be difficult to change, he said.

Kundra noted that he attended the World Economic Forum on Cloud Computing on November 3rd where interoperability and portability were not the only issues discussed. Attendees were concerned about the future of data sovereignty as data moves across multiple boundaries not just at the state and local levels but between nation states.

Governments need to think about governance models to address that issue, Kundra said.

Throughout the first day of the forum held at NIST's headquarters, panel discussions focused on standards, reference architectures, the global community, and other cloud issues. On day two, government and industry cloud computing practitioners rolled up their sleeves and exchanged ideas during breakout sessions held at the Holiday Inn Gaithersburg.

Telecom Argentina Selects BuyDRM's KeyOS for IRIS

BuyDRM, a pioneer in deploying digital rights management (DRM) technologies for pay media operators, announced this week that Telecom Argentina will deploy the KeyOS DRM Platform within Telecom Argentina's IRIS CDN offering and customer billing platform. The KeyOS Platform's expansive feature set allows quick-to-market deployment of premium encrypted video content in the Windows Media and Smooth Streaming formats.

As a result of this enterprise integration, Telecom Argentina will be able to offer KeyOS-powered DRM services seamlessly to their customers via their IRIS CDN customer portal. The addition of KeyOS-powered DRM services to the IRIS platform will empower Telecom Argentina's customers, drive the usage and revenue of their media services, increase the depth of their service offering, and improve customer retention.

"It was important to Telecom Argentina that we enhance the IRIS CDN platform with more modern transparent DRM technologies to support our growing customer needs" said Sergio Galban, Telecom Argentina's Executive Manager in charge of the development of Telecom CDN services. "Our selection of KeyOS was the result of an exhaustive review of the marketplace and we are confident our customers will be well served with this decision."

"The Smooth Streaming and PlayReady technologies are clearly addressing the needs of the major US and European broadband providers deploying VOD to ensure they are in compliance with their agreements with the major content owners, studios and networks" added Martin Ortiz - product manager for CDN / OVP services, Telecom Argentina.

"We worked with Telecom Argentina to evolve their digital media offering incorporating simple and secure customer experiences that can scale in a cost effective manner. This offering represent a professional grade suite of digital media infrastructure the today's media companies can rely on to power their businesses" said Christopher Levy, CEO and Founder, BuyDRM.

Founded in 1990, Telecom Argentina is the leading telecommunications group in Argentina, where it offers directly or through its controlled subsidiaries local and long distance fixed-line telephony, cellular, data transmission and Internet services, among other services such as ICT solutions, data center, video transport and multimedia content delivery. Additionally, through a controlled subsidiary, the Telecom Group offers cellular services in Paraguay.

Cloud Computing and Oracle's Standards

Excerpted from Data Center Journal Report by Rakesh Dogra

More often than not, lack of standardization can add to the complexity of measuring and deploying emerging technologies. Consider cloud computing, for example. The fact that cloud services can give the user on-demand computing resources makes it a very attractive prospect for many businesses. A company can choose to host its services internally (on a private cloud) or remotely (on a public cloud).

Regardless of this choice, monitoring of performance, usage, and security is crucial. Generally speaking, cloud computing services result in a simplification of IT tasks, but sometimes, they can make the application environment very complex. Such complexity leads to the need for stringent, globally applicable, and accepted standards for managing infrastructure and other aspects of cloud computing. 

Although cloud computing is a powerful tool, not many standardized protocols exist for it, and many vendors are seeking to establish standards that could become an industry norm. Oracle has thrown its hat into the cloud ring with its newly announced APIs. Like other companies, Oracle has submitted its APIs to the Distributed Management Task Force (DMTF), which works to foster collaboration among IT companies worldwide to develop validated and unified systems management standards. The specifications of this API will be considered for inclusion in DMTF's in-the-pipeline standard for infrastructure-as-a-service (IaaS). 

The Oracle Cloud Resource Model Application Programming Interface (Oracle Cloud API) focuses on the management of cloud computing infrastructure. It has two components: one is REST (Representational State Transfer), which outlines the protocol that will be used to communicate with resources, and the other is the Oracle Cloud Elemental Resources Model, which helps specify the volumes associated with storage, virtual machines, and so on. 

The Oracle Cloud API uses dynamic provisioning, clustering, and virtualization to enable easy and efficient management of cloud-based resources. It also has interoperability and extensibility built into its framework, and it reinforces the commitment that Oracle has made to open standards. REST, for instance, uses standard HTTP methods to provision, modify, and manage entities. The more quickly open standards are adopted, the more quickly cloud computing services will be adopted in turn. 
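
In a RESTful resource model of this kind, each cloud entity is addressed as a URI and manipulated with ordinary HTTP verbs. The sketch below shows that general pattern in Python; the base URL and the /vms resource are hypothetical stand-ins rather than the actual Oracle Cloud API surface, and the calls naturally need a live endpoint to execute:

    # The RESTful pattern: resources as URIs, operations as HTTP verbs.
    # BASE and the /vms resource are hypothetical stand-ins, not the actual
    # Oracle Cloud API; a live server is needed to execute the calls.
    import json
    import urllib.request

    BASE = "https://cloud.example.com/api"

    def rest(method, path, body=None):
        req = urllib.request.Request(
            BASE + path,
            data=json.dumps(body).encode() if body is not None else None,
            headers={"Content-Type": "application/json"},
            method=method,
        )
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)

    # List virtual machines, create one, then resize it - each step is just
    # a verb applied to a resource URI.
    vms = rest("GET", "/vms")
    vm = rest("POST", "/vms", {"name": "web-1", "cpus": 2, "memory_gb": 4})
    rest("PUT", f"/vms/{vm['id']}", {"cpus": 4})

This uniformity is what lets clients keep their existing management frameworks: any tool that speaks HTTP to resource URIs can drive the cloud.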

The foundation of this API was set by the Sun Cloud API, which defined a RESTful API for the development, management, and operation of cloud resources. It operated on the primary resources of virtual data centers, clouds, virtual machines, and private virtual networks, to name a few. 

What the customer gets is business agility, flexibility, higher utilization, cost reduction, and higher return on investment (ROI). The simplicity and elegance of this API are evident in that the client does not need to be concerned with what lies below the surface. When clients can manage cloud resources with portability and openness, they can easily move workloads across clouds. More importantly, they can continue to use their existing management frameworks to run the cloud infrastructure for their applications. 

Red Hat and Rackspace submitted Deltacloud and OpenStack, respectively, to DMTF. Eucalyptus Systems along with other companies has also submitted its work for consideration by the DMTF. 

Oracle has taken the step of submitting the cloud management API to its Oracle Technology Network (OTN). This submission is titled the Oracle Cloud Resource Model API, and the one submitted to DMTF is titled the Oracle Cloud Elemental Resource Model. The essential difference between the two is that the version submitted to OTN has a few fewer sections, and the DMTF document is focused almost solely on the technical base for the IaaS framework. 

Interestingly, the API is also structured to handle the phenomenon of cloud bursting - a situation in which companies temporarily use a public cloud to handle peak operations and workloads, then return to their own data centers when the public cloud is no longer needed.
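
At bottom, cloud bursting is a scheduling decision: fill owned capacity first, overflow the remainder to the public cloud, and fall back when the spike passes. A minimal sketch of that decision, with the capacity and demand trace invented for illustration:

    # Minimal cloud-burst dispatcher: fill the private data center first,
    # overflow the remainder to a public cloud. All numbers are invented.
    PRIVATE_CAPACITY = 50      # units of work the data center can absorb

    def dispatch(demand):
        private = min(demand, PRIVATE_CAPACITY)
        burst = demand - private     # only the overflow rents public capacity
        return private, burst

    for hour, demand in enumerate([20, 45, 80, 120, 60, 30]):
        private, burst = dispatch(demand)
        note = f"{private} private + {burst} public" if burst else f"{private} private"
        print(f"hour {hour}: {note}")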

Microsoft Speeds-Up its Shift to Cloud Computing

Excerpted from Bloomberg News Report by Dina Bass

Five years after unveiling a plan to shift into cloud computing, Microsoft may finally be making headway.

In 2005, Microsoft's then-Chief Technology Officer (CTO) Ray Ozzie wrote a memo, saying the company was at risk if it didn't reinvent itself as a provider of software and computing services over the web.

Heeding the warning, Microsoft has signed up customers, including Toyota, 3M, and Lockheed Martin, for its cloud product. By March, 90% of the company's engineers will be working on cloud-related products, Chief Executive Officer (CEO) Steve Ballmer said. And the server unit may generate $10 billion in annual revenue from cloud services in a decade, Microsoft President Bob Muglia said.

"We will see those numbers and more," Muglia said in an interview.

Developers learned more about the cloud strategy at a conference last week at the company's headquarters in Redmond, WA, where Microsoft highlighted tools that make it easier to move applications to the cloud. In a report of its first-quarter results released Thursday, Microsoft said net income rose 51% last quarter to $5.41 billion.

Further progress in cloud computing will hinge on whether Microsoft can narrow Amazon's lead. Microsoft in February released its flagship cloud product, Azure, which stores and runs customers' programs in its own server farms.

That came three years after market leader Amazon introduced a suite of cloud services that let companies rent, rather than buy, servers - the powerful machines that run networks and handle complex computing tasks.

Still, Microsoft ranks high in surveys asking chief information officers which cloud vendors they plan to use, said Sarah Friar, a software analyst at Goldman Sachs.

"What Microsoft is doing well from a cloud perspective is they are enterprise class and understand what it means to both sell to large enterprise but also meet all their requirements," said Friar, who is based in San Francisco.

The approach won over Toyota. The automaker is using Azure to track the 2,000 calls daily that come into the Lexus roadside and crash assistance service. The program took days to implement, compared with weeks for an internal database, said Glen Matejka, a Toyota manager.

3M, whose products range from Post-It notes to flu tests, uses Azure to host a new program, called Visual Attention Service, that lets website designers assess which parts of a site catch the human eye. 3M halved costs by entrusting the program to Microsoft's machines instead of its own, said Jim Graham, technical lead for the program.

Cloud computing can make it affordable for companies to tackle projects that previously would have required purchasing and maintaining tens of thousands of servers.

"Look at the so-called quants on Wall Street," Microsoft General Manager Bill Hilf said. "They say, 'I want to ask a question, but it's going to take me 20,000 servers to answer.' Each time that happens, banks don't want to build a new server farm. They want to just access those machines as needed."

Getting customers on board wasn't easy. During Microsoft's midyear reviews, a series of meetings in hotel ballrooms near Microsoft's campus early this year, executives got a sobering message from sales staff, Muglia said. Customers weren't convinced Microsoft took its cloud push seriously.

Microsoft sales chief Kevin Turner decided that the pitch needed to change. Rather than discussing various options, Microsoft's sales force has altered its pitch to "lead with the cloud" and now focuses customer meetings on cloud technologies, Muglia said.

"These are powerful new platforms that are creating a wave of new opportunity," Ballmer said at Microsoft's developer conference.

Microsoft's biggest challenge in cloud computing may come from Seattle-based Amazon. The online-commerce provider generates about $500 million in sales from cloud services, more than five times Microsoft's cloud-related revenue, according to Friar.

Among small and medium-sized businesses, 77% surveyed by Goldman Sachs said they used Amazon, compared with 10% for Microsoft and 17% apiece for Google and Salesforce.com Inc. Larger businesses reported 12% for Microsoft, compared with 18% for Salesforce and 11% for Google. Amazon wasn't included in that survey.

Microsoft may not lag behind rivals long, Goldman Sachs research suggests. When the firm's analysts asked companies about the future, 29% listed Microsoft in their cloud purchasing plans. IBM came in second, with 25%, Salesforce racked up 24%, and Google garnered 20%. Amazon trailed, with 8%.

Coming Events of Interest

2010 Future of Film Summit - November 9th in West Hollywood, CA. Dealmaking in the Age of Digital Cinema, with over 50 industry leaders as panel speakers. Time is running out to reserve your opportunity to network with the best in the business. Make contacts, get the latest on trends, and make your voice heard.

KMWorld 2010 - November 16th-18th in Washington, DC. This conference provides you with all the essential pieces of the information engine that powers your enterprise - including knowledge creation, publishing, sharing, finding, mining, reuse, and more - which work together to enable business problem-solving, innovation, and achievement.

International CES - January 6th-9th in Las Vegas, NV. With more than four decades of success, the International CES reaches across global markets, connects the industry, and enables consumer electronics (CE) innovations to grow and thrive. The International CES is the world's largest consumer technology tradeshow featuring 2,700 exhibitors.

Content in The Cloud - January 7th in Las Vegas, NV. The DCIA's Conference within CES explores this cutting-edge technology that promises to revolutionize entertainment delivery. Six keynotes and three panel discussions focus on cloud-delivered content and its impact on consumers, the media and telecom industries, and consumer electronics (CE) manufacturers.

Copyright 2008 Distributed Computing Industry Association