Distributed Computing Industry
Weekly Newsletter

In This Issue

Cloud News

P2P Safety

Clouderati

Industry News

Data Bank

Techno Features

Anti-Piracy

January 24, 2011
Volume XXXIII, Issue 10


Comcast/NBCU Celebrate Government Deal Approval

Excerpted from Broadcasting & Cable Report by John Eggerton 

Reaction poured in Tuesday to the news that the FCC and Justice had approved the Comcast/NBCU deal, including from the apparently relieved executives involved, who suggested the deal's online access conditions, among others, would not impede the combined company's ability to operate and grow its business.

"This is a proud and exciting day for Comcast," said Comcast Chairman Brian Roberts. "We are grateful for the leadership of FCC Chairman Julius Genachowski, Assistant Attorney General Christine Varney, the other FCC Commissioners and their staffs for the months of hard work that went into reviewing an unprecedented number of documents and public comments." 

GE Chairman Jeff Immelt suggested GE wasn't so much saying goodbye to NBCU as sharing the wealth with Comcast, which is buying a 51% stake with an option to buy out GE over time.

"NBCU has been a great business for GE over the past 20 years, generating an average annual return of 11%. Reducing our ownership stake from 80% to 49% allows GE to continue sharing in NBCU's growth while also providing significant cash to invest in our high-technology infrastructure businesses, growing an attractive dividend, and continuing our buyback program," he said, pegging that cash as about $8 billion at closing, which is expected by the end of the month. 

On a conference call with reporters Tuesday, Comcast EVP David Cohen said the conditions and commitments on program access and carriage, including online, and on network neutrality had been expected and had been crafted in such a way to allow the company to operate and grow its business. That suggested the access and carriage conditions did not break new ground, extend broadly into the online space, or require the company to make all its content available to anyone who asked for it. 

"We believe the new company will be positioned to compete, and compete fairly," said Cohen, "and these commitments and conditions while bringing significant public interest benefits will not impede our ability to operate the business or compete at all." 

He said he was grateful for the "integrity and the thoroughness of the process." 

Cohen spelled out some of the key things that were not in the FCC or DOJ orders: no divestitures of broadcast stations or of NBC's stake in Hulu (though it must give up any management authority over the online site), no unbundling or program discounts on the access front, no broad extension of program access to online, no arbitration condition for program carriage disputes, no wholesale condition on high-speed data (as there was on the Time Warner/AOL merger), and no programming or cable channel quotas - Comcast has volunteered to add programming and channels, but those are commitments rather than FCC conditions.

And while Comcast will be required to make its programming available to online video providers, there are stipulations, carve-outs, and caveats, including for "industry-typical" exclusivity arrangements such as an exclusive film content window. Comcast will still be able to offer an online authentication model for its programming (giving subscribers online access to their video content). It will have to make that programming available to competitors with similar authentication models, but Comcast says it can live with that.

The online program access conditions also stipulate that the programming must be similar. If, say, Viacom makes its MTV network available to an online video distributor (OVD), Comcast must make a similar channel available, not just any channel; the OVD could not claim that because it had access to MTV, Comcast must offer USA Network, unless the OVD could convince either Comcast or an arbitrator that the two were similar enough to invoke the condition.

Cohen said he was not disappointed with the FCC/Justice decision to require Comcast to become a passive investor in Hulu, with NBC no longer having any operational say, including a board seat. "We're perfectly satisfied with that," he said. Cohen pointed out that there had been calls for divestiture, and he was pleased Justice did not require that. "We're not uncomfortable with the restrictions on governance rights," he added.

BitTorrent Inventor Demos New P2P Live-Streaming Protocol

Excerpted from TorrentFreak Report by Ernesto Van Der Sar

Bram Cohen, the inventor of the BitTorrent protocol that revolutionized file sharing, is finalizing the code for his new P2P live-streaming protocol. With his efforts, he aims to develop a piece of code superior to all other streaming solutions on the market today. The release of the application is still a few months away, but Cohen has shown a demo exclusively to TorrentFreak.

BitTorrent was the first widely adopted technology that made it possible to download large videos online in a timely fashion. Needless to say, BitTorrent inventor Bram Cohen unleashed a small revolution here, even though he never envisioned the technology being used to swap video.

However, a key characteristic of the young Internet is that it constantly evolves, and in 2005 video streaming was brought to the mainstream thanks to YouTube. This online video streaming revolution has hugely increased individual consumers' bandwidth use. At the same time, it has resulted in huge bandwidth bills for streaming sites.

So as we near the 10th anniversary of BitTorrent, its inventor Bram Cohen is finalizing a new protocol, this time aimed at P2P live-streaming. Although P2P live-streaming is not something new per se, Cohen thinks that his implementation will set itself apart from competitors with both its efficiency and extremely low latency.

"Doing live properly is a hard problem, and while I could have a working thing relatively quickly, I'm doing everything the 'right' way," Bram Cohen told TorrentFreak last year when he announced his plans. He further explained that the BitTorrent protocol had to be redone to make it compatible with live streams, "including ditching TCP and using congestion control algorithms different from the ones we've made for UTP"

In the months that followed Cohen figured out most of this complex puzzle and the technology is now mature enough to show to the public. Although there's still a lot of secrecy around the technical details, the BitTorrent team agreed to show TorrentFreak a demo in anticipation of the official release later this year.

Although it's fascinating to see BitTorrent's inventor waving at a computer, it is impossible to tell how this compares to competing technologies without the option of testing a working version and having more technical details.

Over the years we've already seen a few working implementations and adaptations of the BitTorrent protocol that allow for P2P live streaming. Most notable is the SwarmPlayer, which has proven to work well with low latencies in real live tests, though usually supported by high-bandwidth 'fall-back' peers.

"The main areas of innovation relate to techniques he is using to manage latency at an unprecedented low while controlling network congestion," BitTorrent's VP of Product Management Simon Morris told TorrentFreak in our quest for more information.

"As outlined in the academic literature on live P2P content delivery, the management of live p2p streaming on the open internet requires split second reconfigurations to reroute content delivery in the fewest possible round trips between peers in the event of network hiccups."

"Bram's methods to manage network reconfiguration wrap rerouting together with a novel approach to congestion control. Obviously we'll be happy to share more technical details in due course, but only once the technology reaches a level of maturity that it makes sense to share."

This means that the wait continues, and we were told that the official release will take at least a few more months. For some reason we think that it might take until July, which makes sense PR-wise because the BitTorrent protocol then officially celebrates its tenth anniversary.

Report from CEO Marty Lafferty

As DCINFO readers recall, a so-called network neutrality compromise was enacted through a narrow party-line vote in December by a divided Federal Communications Commission (FCC).

Industry observers called it an ill-conceived power-grab by the FCC over a burgeoning Internet marketplace that has not demonstrated the need for such regulation.

The Commission's move was also sharply criticized by representatives of both parties in Congress as being unauthorized and unwarranted: the FCC does not have such jurisdiction.

And the judicial system had already opined - in a US Court of Appeals for the DC Circuit ruling in April 2010 - that the Commission lacks the authority to regulate the Net in this manner.

With the new Congress now focused on more urgently pressing matters, and with an underlying concern that President Obama could veto a Congressional repeal of the FCC's rulemaking, Verizon this week asked the same Court that ruled in April to reaffirm its position by overturning the FCC's latest attempt to overstep its authority.

Indeed, nothing has changed since April that would grant the Commission these new powers. If anything, the private sector, along with two of the three branches of the federal government, has since weighed in more strongly against the FCC's action.

The FCC regulations essentially echo an interim proposal from the House Commerce Committee - but importantly without its critical sunset provision - banning wireline providers from blocking material and also from engaging in unreasonable discrimination - which can include paid prioritization - while also prohibiting wireless providers from blocking sites and competing apps.

The Commission did abandon the most controversial aspect of its original plan - reclassifying broadband access as a Title II telecommunications service under rules written decades ago for single-carrier voice telephony - but this also brought into sharper focus the question of the agency's fundamental authority to enforce this order.

Leading House Republicans raised that issue, with Congressmen Joe Barton (R-TX) and Cliff Stearns (R-FL) questioning "the FCC's statutory authority to adopt these rules under Title I. The DC Circuit ruled in April that the FCC had failed to demonstrate authority under Title I to regulate Internet network management."

"In the absence of clear authority, the FCC should defer to Congress in this matter."

Congressman Fred Upton (R-MI), who now Chairs the Commerce Committee, was even more critical, saying, "We have all grown sick and tired of the Chicago-style politics to ram through job-killing measures at any cost, regardless of the consequences or damage to our economy. Rather than put a gun to the heads of our largest economic engines, now is the time for the FCC to cease and desist."

"The FCC does not have authority to regulate the Internet, and pursuing net neutrality through Title I or reclassification is wholly unacceptable. Our new majority will use rigorous oversight, hearings, and legislation to fight the FCC's overt power grab," he said.

Senator Kay Bailey Hutchison (R-TX) also voiced opposition: "I have not seen any evidence to date that would justify this regulatory overreach," she said. "In fact, the Internet has developed and thrived precisely because it has not been weighed down with burdensome government regulations. I am especially troubled that this action would occur without Congressional input."

"I will explore all options available to keep the FCC from implementing regulations that will threaten the innovation and job creation opportunities associated with the Internet."

Within the Commission itself, Senior Republican Commissioner Robert McDowell "strongly opposed" the order as an "ill-advised maneuver. Such rules upend three decades of bipartisan and international consensus that the Internet is best able to thrive in the absence of regulation."

"Pushing a small group of hand-picked industry players toward a 'choice' between a bad option (Title I Internet regulation) or a worse option (regulating the Internet like a monopoly phone company under Title II) smacks more of coercion than consensus or compromise," he said.

"This 'agreement' has been extracted in defiance of not only the courts, but a large, bipartisan majority of Congress as well. Both have admonished the FCC not to reach beyond its statutory powers to regulate Internet access. By choosing this highly interventionist course, the Commission is ignoring the will of the elected representatives of the American people."

And Republican Commissioner Meredith Attwell Baker added that the FCC does "not have authority to act."

At the time the FCC voted to proceed, the DCIA was less critical of this latest attempt by the Commission to strike a practical balance between protecting the openness of the Internet and encouraging continued investment and innovation in Internet-based services. The removal of the specter of reclassifying broadband under Title II, as well as the open and inclusive process the FCC followed in drawing up this more workable compromise, were laudable.

However, constructing a legal framework to address network management practices in an area evolving as rapidly as broadband still poses enormous challenges to all concerned. The issues are extremely complex - the distinctions between wireless and wired access, to name one - and there are growing concerns over the FCC's failure to obtain Congressional approval before codifying such an order and, perhaps most importantly, its decision not to include a sunset provision.

In this rapidly changing marketplace, inappropriate and premature regulation can be harmful to consumers and companies alike. Above all, policymakers need to be vigilant to ensure that nothing is done to impair the industry's ability to create new jobs, develop and deploy new services, and continue to be a key driver of economic growth.

As Verizon pointed out before the FCC acted, "In tackling this issue, the FCC is hamstrung by an antiquated communications statute. That's why this issue should be addressed by Congress. Verizon has consistently called on Congress to update and reform the statute and adopt public policies that will encourage an open Internet, as well as promote investment and innovation across the Internet marketplace."

The DCIA continues to believe that, absent a compelling marketplace need that would justify permanently accepting the FCC's rulemaking, a comprehensive review and very careful redrafting of the Communications Act for the digital age by Congress would be a more prudent approach, and the Court should overturn the Commission's order. Share wisely, and take care.

Verizon to Continue Rapid Cloud Ramp

Excerpted from Information Week Report by Charles Babcock

Verizon Business rapidly expanded its cloud capabilities last year, and plans to do the same in 2011. It has not previously been seen as a prominent name in cloud computing, but its enhanced capabilities indicate that may be about to change.

It started out in 2009 offering simple infrastructure-as-a-service (IaaS), like Amazon's EC2. Later this year, it will move beyond infrastructure into platform-based services and start offering customer relationship management (CRM) and enterprise resource planning (ERP) applications online as software-as-a-service (SaaS). Not only is it bringing increased capabilities, but its customers are bringing increased demands, as they broaden the role Verizon's computing-as-a-service (CaaS) plays in their operations.

"We are seeing very broad use-cases," said Patrick Verhoeven, Verizon Manager of Cloud Services, as business workloads become more production-oriented and less dominated by website applications or software test and development.

Verhoeven doesn't look under the lid of customers' workloads, but he suspects a few are making use of Verizon's ability to guarantee Payment Card Industry (PCI) compliant infrastructure. PCI-compliant architectures in the cloud gained new credibility when Amazon Web Services (AWS) announced on December 7th that it had achieved PCI compliance, which allows credit card transactions to take place there. Verizon had its own audited and compliant infrastructure in place as of August 18th.

"Customers still have to undergo a third-party audit to ensure their systems connected to the cloud are compliant. But knowing the cloud infrastructure is already compliant makes that easier," said Verhoeven.

Verizon's CaaS cloud is managed following IT Infrastructure Library (ITIL) guidelines and is audited for SAS 70 compliance. Part of Verizon's approach to cloud users this year will give them not only fast, automated provisioning of servers, but also the ability to conduct secure transactions in an environment managed to established standards. One result is that on December 22nd Verizon moved out of the also-ran category into the "Leaders" section of Gartner's so-called Magic Quadrant. Other leaders include Rackspace, Terremark, Savvis, and AT&T, according to Gartner. AWS with EC2 is shown as leading the pack in the "Visionaries" quadrant.

In addition, Verizon is one of a handful of suppliers named a first-tier implementation partner by VMware, which means Verizon has adopted VMware's vCloud compatibility software and can run VMware ESX Server virtual machines. Since VMware is the most widely distributed virtualization software in the enterprise, a VMware-compatible cloud may have broader market appeal than clouds that run only their own brand of virtual machines.

Verizon's CaaS is based in data centers in Amsterdam, Hong Kong, and Beltsville, MD. Additional cloud data center capacity will be added by the end of the first quarter of 2011 in London, Canberra, and San Jose, CA. Two additional data centers will come online in Culpeper, VA and Miami, FL by the end of the first quarter to supply cloud services to the US government. In all, Verizon manages 200 data centers worldwide, with the bulk of them dedicated to its network and telecommunications traffic, hosted services, and co-location services.

Verizon's cloud services differ from some people's notion of cloud computing. Verizon CaaS may host either virtual machines or actual, dedicated blade servers for any customer who wants its own physical resources. Verhoeven said some customers seek their own blade when they're running a major database system in the cloud, and 30% of Verizon's cloud business consists of dedicated blades. Savvis and Rackspace also offer servers in either hardware or virtual form; EC2 is a virtual machine environment.

Cloud users are no longer impressed by the speed with which a provider can spin up a virtual machine. Rather, they may want that virtual machine in a low-cost, plain-vanilla x86 environment one minute and in a highly secure, well-managed one the next. That would allow them to develop and test a system in the cloud at low hourly rates, then upgrade it to a production system with stronger guarantees of high availability. As an established supplier of security services to business, such as intrusion detection and firewall protection, Verizon is also likely to expand its security options.

"There is no single cloud model that is going to prevail," predicts Verhoeven. What Verizon is seeking to do is provide a variety of offerings, such as CRM and ERP, to appeal to many types of consumers. He wouldn't specify who will supply the base applications.

Verizon also plans to offer platform-as-a-service (PaaS) products, where the cloud supplier typically makes development tools and other services available to parties running applications in its environment. But Verhoeven was mum on what direction the platform might take.

He said Verizon can combine network services with its cloud services offerings. For example, a CaaS customer can opt to use Verizon's own private, secure IP network as a substitute for the Internet, for those who want that added measure of security and surveillance.

Many of its earlier cloud offerings appeared to be extensions of Verizon's co-location and managed services offerings, rather than fresh products from a newly minted cloud vendor. Now that it's moving deeper into cloud computing, it's spending less time following the "visionaries" and more time showing what can be done. With its security, availability, and VMware compatibility, it's in a better position than some to offer hybrid cloud computing, with the same workload running sometimes on-premises, sometimes in the cloud.

The Gamification Summit Webcasts Inaugural Event

Even if you weren't one of the lucky people who snagged a ticket to this exclusive sold-out event, co-sponsored by the DCIA on January 20th, you're in luck.

The Gamification Summit teamed up with Fora.tv to bring you streaming coverage of the conference so you can still hear amazing keynotes, case studies, and panels from top thought-leaders in the comfort of your own home or office.

If you register now, you'll get access to the entire day of sessions, with 30 days to watch and review, featuring speakers like Neal Freeland from Microsoft and Jesse Redniss from Comcast/NBCU.

There were in-depth case studies, raucous panels, and blockbuster keynotes at the summit - the webcast truly is the next best thing to being there.

You simply can't get this kind of Gamification wisdom in any other setting.

Amazon Brings Cloud Computing to the Masses

Excerpted from Forbes Report by Eric Savitz

Amazon this week unveiled Elastic Beanstalk, a new free service to make it easier for application developers to use the company's suite of cloud-based computing services.

Amazon's growing suite of Web-based computing services offers an increasingly popular way for operators of Internet-based businesses to fill their computing requirements. The new service is intended to hide some of the complexity involved in running a business on Amazon's infrastructure.

"Developers simply upload their application, and Elastic Beanstalk automatically handles the deployment details of capacity provisioning, load balancing, auto-scaling, and application health monitoring," the company said in announcing the service.

"At the same time, with Elastic Beanstalk, developers retain full control over the Amazon Web Services (AWS) resources powering their application and can access the underlying resources at any time." Customers pay nothing for Elastic Beanstalk, paying instead for the resources used to run their applications.

Elastic Beanstalk provides a quick way for developers to make use of a range of Amazon services - Amazon EC2 (for computing power), Amazon S3 (for data storage), Amazon Simple Notification Service, Elastic Load Balancing, and Auto-Scaling. "Elastic Beanstalk handles the provisioning and deployment of the infrastructure needed to run the application," the company said.
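As a minimal sketch of what "upload and deploy" looks like in code, the following uses the modern boto3 SDK rather than the API available at the service's launch; the application name, S3 bucket and key, and solution stack string are all invented for illustration:

    # A minimal sketch of driving Elastic Beanstalk programmatically with
    # boto3; every name and value below is an illustrative assumption.
    import boto3

    eb = boto3.client("elasticbeanstalk", region_name="us-east-1")

    eb.create_application(ApplicationName="my-app")
    eb.create_application_version(
        ApplicationName="my-app",
        VersionLabel="v1",
        # the application bundle the developer uploaded to S3
        SourceBundle={"S3Bucket": "my-bucket", "S3Key": "my-app-v1.zip"},
    )
    # Beanstalk provisions EC2 capacity, load balancing, auto-scaling, and
    # health monitoring behind this one call, as described above.
    eb.create_environment(
        ApplicationName="my-app",
        EnvironmentName="my-app-env",
        VersionLabel="v1",
        SolutionStackName="64bit Amazon Linux 2 v3.4.0 running Python 3.8",  # example
    )

Because the developer retains access to the underlying EC2, S3, and load-balancing resources, the same account can inspect or adjust those resources directly at any time.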

PacketExchange Launches Cloud Network Solution

Excerpted from the WHIR Report by Nicole Henderson

Network services provider PacketExchange launched its cloud computing networking solution, network infrastructure-as-a-service, on Monday at the Pacific Telecommunications Council conference in Honolulu, HI.

PacketExchange says its solution routes data traffic across its converged global private and public IP network infrastructure and is ideal for cloud-based applications.

The solution offers software-as-a-service (SaaS) companies a secure and reliable way to access, distribute and share information.

"Our experience in providing private, custom data delivery solutions to businesses that rely on the cloud has allowed us to engineer network solutions that assure the quality of service (QoS) these applications demand," said Grant Kirkwood, CTO of PacketExchange. "With 100% SLA guarantees across our network, PacketExchange provides a reliable and robust infrastructure that offers the bandwidth flexibility needed to meet the dynamic needs of cloud-based data communications."

According to PacketExchange, its solution is a scalable infrastructure enabling resources to be provisioned on-demand.

The company enables customers to choose the best available paths for data through private and public infrastructures.

Joost Video Network Becomes Stand-Alone Business Unit

Excerpted from Online Media News Report by Laurie Sullivan

Adconion Media Group announced Thursday the spinoff of the Joost Video Network into a stand-alone business unit. The newly launched digital media company will provide premium branded solutions for advertisers and brand marketers seeking to target audiences with in-stream and in-banner video advertising.

Nick Higgins will lead the new unit as Executive Vice President. He previously held the position of head of global video at Adconion Media Group. Prior to joining Adconion, Higgins was at MSN, where he held several senior positions during the past 10 years.

Adconion acquired the assets of Joost - digital rights management, video player, and content distribution - from the ex-Skype founders in November 2009, and then launched the Joost Video Network in February 2010 across North America, Europe, and Australia. Since then, Tyler Moebius, CEO of Adconion Media Group, says the company has quadrupled revenue to $30 million and expects to triple digital growth in 2011, making it the largest global video player operating in more than seven countries.

Joost is not the top video destination site but was No. 2 early last year in terms of reach, Moebius says, citing comScore. The company serves display ads to more than 400 million viewers monthly. The opportunity to become aggressive and capitalize on the video market led Adconion to spin off the network into a separate business unit.

Moebius believes the move will position the company to capture more of the video market. Unlike other pure-play video networks, he says, this will allow Adconion to reach across a broader suite of products such as pre-roll, in-banner video, expandable ads, roadblocks, and custom skins and integrations.

Joost also offers branded entertainment services through its partnership with RedLever, a global studio specializing in developing and producing brand-integrated content for the web. The stand-alone business unit plans to leverage its core strengths in its owned-and-operated site, and non-exclusive and exclusive partnerships.

Moebius says Adconion gives advertisers access to premium inventory through exclusive content deals. But the company still has a way to go.

Video ads reached 49% of the total US population an average of 36.8 times in November, according to comScore's latest stats. Americans viewed more than 5.4 billion video ads in November, with Hulu generating the highest number of video ad impressions at more than 1.1 billion. Tremor Media Video Network ranked second overall and highest among video advertising networks, with 477 million ad views, followed by adap.tv at 446 million, and Microsoft Sites at 427 million.

The plan to hire about 30 employees to support the new division this year will increase the dedicated Joost sales staff to about 80 people. The company chiefly serves agencies and Fortune 500 and 1000 companies - those buying video solutions from Hulu, ABC, Microsoft, and portals.

Rackspace and Akamai Partner on Content Optimization

Rackspace Hosting, a specialist in the hosting and cloud computing industry, and Akamai Technologies, a provider of cloud optimization services, announced a strategic relationship that will enable Rackspace to offer Akamai's web acceleration and cloud optimization services as part of its dedicated and cloud hosting portfolio.

According to a release, responding to the needs of its customers, Rackspace will work to integrate key features from Akamai such as CNAMEs, Secure Sockets Layer (SSL), and CDN delivery for Cloud Files, a Rackspace service that provides highly scalable online storage for files and media.

Additionally, Rackspace will resell a range of essential Akamai site and application acceleration services to its customers. The goal is to create a one-stop shop for hosting, cloud, and acceleration services for web content and applications.

"The next decade presents new opportunities for two industry leaders, Rackspace and Akamai, to better help customers succeed with their cloud strategies, especially around feature-rich content, SaaS applications, and dynamic websites," said Lew Moorman, President, Cloud and Chief Strategy Officer for Rackspace. "Having independently served hundreds of Akamai's customers, we are excited to streamline the customer experience on dedicated and cloud platforms with the leading cloud optimization services provider, all backed by our trademark Fanatical Support."

Akamai services are designed to reduce latency and provide for high availability in the cloud by leveraging more than 77,000 servers located in 71 countries.

"This is a strategic relationship for Akamai and Rackspace aimed at helping our enterprise customers realize the full potential of cloud services from the data center to the end-user," said Willie Tejada, Vice President, Application and Site Acceleration for Akamai. "By combining two best-of-class solutions for application hosting and cloud acceleration, Rackspace customers can gain improved performance, security, and reliability for their web-based applications."

Rackspace will include Akamai solutions, either as an add-on or as an embedded suite of services, as part of Rackspace's full range of offerings across dedicated, cloud, and hybrid hosting. The initiative will begin with Akamai's content delivery services being integrated with Cloud Files.
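As a rough sketch of what CDN-enabling a Cloud Files container looked like at the API level: as we understand the Cloud Files CDN management API of this era, a container was published to the CDN with a PUT request carrying CDN headers. The token, management URL, container name, and TTL below are invented, and the header names should be verified against Rackspace's documentation:

    # Hedged sketch of publishing a Cloud Files container to the CDN;
    # URL, token, container, and TTL are illustrative assumptions.
    import requests

    CDN_MGMT_URL = "https://cdn.clouddrive.com/v1/MossoCloudFS_xxxx"  # from auth
    TOKEN = "your-auth-token"

    resp = requests.put(
        f"{CDN_MGMT_URL}/my-container",
        headers={
            "X-Auth-Token": TOKEN,
            "X-CDN-Enabled": "True",  # publish the container to the CDN
            "X-TTL": "86400",         # edge cache lifetime, in seconds
        },
    )
    print(resp.headers.get("X-CDN-URI"))  # public CDN URL for the container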

Boxee May Use Subsidies to Compete with Apple TV

Excerpted from Electronista Report

Boxee may turn to subsidies to help get the price down and better compete in the market, company founder Avner Ronen said in an episode of This Week in Start-Ups recorded on Friday. He said the $200 street price for the Boxee Box was "way too expensive" for mass-market adoption and floated subsidies as one option for future models. Nothing was definite in the talk, and subsidies might not necessarily come about in a future deal.

Ronen also provided some brief hints at the possible future of Boxee. Gaming was a possibility as he noted that games had often driven technology adoption, but it would most likely occur through the browser in HTML rather than through proprietary apps. A TV tuner had been ruled out in at least the short term as it was a matter of "focus and resources."

DVR had some slight promise. Boxee team members "haven't decided" whether to let the hub record TV, but Ronen acknowledged that many on the official forums had called for it. Development resources were again the primary constraint. "We're a small team," he said. Boxee hasn't given out sales numbers but has taken a mixed approach to competing with Apple. The hope of subsidizing the Boxee Box is a first sign of intent to more directly challenge the Apple TV and Roku's Internet Player, both of which hit or dive below the $100 price mark through their use of cheaper but roughly as efficient processors.

Ronen reiterated, though, that he saw Boxee as filling in gaps that Apple didn't cover, giving users content outside the iTunes ecosystem. "We're fine with owning that niche for ourselves," he explained.

GoGrid Private Cloud Leverages Public Cloud Resources

Excerpted from eWeek Report by Fahmida Rashid

GoGrid introduced an enterprise-grade hosted private cloud platform this week, delivering to customers both the benefits of cloud computing and dedicated server hosting.

The GoGrid Hosted Private Cloud features the same capabilities as GoGrid's public cloud offering, the company said. The hosted private cloud offers the same on-demand, programmable, manageable, and scalable service that customers demand from a public cloud.

GoGrid Hosted Private Cloud "delivers the complete set of benefits and features expected from a public cloud computing environment on private, dedicated hardware," said John Keagy, CEO and co-founder at GoGrid.

GoGrid allows customers to deploy infrastructure components with a self-service and metered pricing model, making the environment more like public clouds than the traditional hosted services.

For its platform, GoGrid integrated the hosted private cloud with its public cloud to speed up the process of allocating new resources. Customers use the public cloud to quickly add or expand capacity as needed while running the applications on the hosted private cloud side, the company said. Customers can automatically deploy individual virtual machines on their own, but GoGrid will need extra time to source additional hardware when needed.
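As a loose illustration of that self-service model: GoGrid exposed a REST API for provisioning, and a call to add a server might look roughly like the Python below. The endpoint path and MD5-based request signature follow our recollection of GoGrid's documented API, but every parameter name and value shown is an assumption for illustration only:

    # Hypothetical sketch of self-service provisioning against GoGrid's
    # REST API; endpoint, signing scheme, and all values are assumptions.
    import hashlib
    import time
    import requests

    API_KEY, SHARED_SECRET = "your-key", "your-secret"
    # signature: md5 of key + secret + current unix time (assumed scheme)
    sig = hashlib.md5(
        f"{API_KEY}{SHARED_SECRET}{int(time.time())}".encode()
    ).hexdigest()

    resp = requests.get(
        "https://api.gogrid.com/api/grid/server/add",
        params={
            "api_key": API_KEY, "sig": sig, "format": "json", "v": "1.8",
            "name": "web-01",             # name for the new server
            "image": "centos55_64_base",  # hypothetical image id
            "server.ram": "1GB",
            "ip": "203.0.113.10",         # pre-allocated public IP
        },
    )
    print(resp.json())

The metered-pricing angle follows from the same call: the server starts billing when provisioned and stops when a matching delete call is made.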

The hosted private cloud is more than just adding virtualization software to dedicated hardware, which can be cost-prohibitive, said Keagy.

Customers typically lose some of the financial benefits of the public cloud because they have to order and pay for an entirely new server or blade in order to scale up, instead of just adding the little bit of storage or networking that is needed. In return, they gain the security and reliability of being on a machine that no one else is using.

Gartner analyst Ted Chamberlin said GoGrid's private cloud platform will help the company compete with its larger rivals, such as Amazon and Rackspace Hosting. "Amazon doesn't do this," Chamberlin says, and that "limits their attractiveness to enterprise customers who have security and privacy concerns."

France Telecom's Orange is one of the first companies to use GoGrid's Hosted Private Cloud, GoGrid said.

A number of other companies have recently rolled out hosted private clouds, including Unisys and Rackspace Hosting. Hosted private clouds are often used to power back-end systems because they are unlikely to need additional resources to accommodate spikes in performance and traffic.

GoGrid recently expanded its Hybrid Cloud service, where customers can use a GoGrid data center to build secure, high-performance infrastructure to power applications on dedicated and virtual servers. The GoGrid Hybrid Cloud combines virtual and physical servers in the public cloud, using hardware-based F5 load balancers, private VLANs, integrated cloud storage, and hardware firewall appliances, the company said.

Hybrid Cloud customers can provision dynamically scalable virtual and physical environments via a Web portal, the company said. Users can choose between physical and virtual servers when building out critical business infrastructure, said Philbert Shih, Senior Analyst at Tier1 Research. There are benefits to giving IT teams the flexibility to manage both types of servers from a single interface, instead of creating a physical bridge to connect the two types, according to Shih.

Hybrid Cloud Computing: The Future Trend in Cloud

Excerpted from Cloud Computing Journal by Deepak Vohra

Cloud computing has been a boon to conserving resources by providing a farm of servers that is concurrently used by multiple users. Cloud computing precludes the need to set up per-user servers.

In a recent InformationWeek survey of business technology professionals, most indicated a preference for cloud computing for storage, archiving, and disaster recovery, as well as for business applications, servers, raw computing power, dedicated data center space, databases, and specialized IT services such as security, management, and compliance.

Cloud computing may be public or private. In public cloud computing, resources are shared over the Internet on a fine-grained, self-service basis, as with Amazon EC2. Private cloud computing is the equivalent of public cloud computing on a private network.

The resource efficiencies associated with public cloud computing outweigh those of private cloud computing, because a private cloud's scope is limited to a single enterprise. Private cloud computing does, however, provide the benefits of enterprise-level security and reliability.

Most applications are designed for the tightly coupled enterprise environment and switching costs are involved to migrate them to loosely coupled cloud environments.

Hybrid cloud computing has emerged to combine the benefits of both "private cloud" and "public cloud" architectures by spanning enterprise data centers and public clouds (such as Amazon EC2). Combining virtual and physical co-located assets, such as routers and servers, is the implementation preferred by most enterprises.

The main advantage of cloud computing is scalability. Most enterprises have organization-level resources that may at times need to be extended to a public cloud. If an application requires more resources than anticipated, more resources may be allocated to it on a public cloud.

Hybrid clouds also allow for allocating resources to a different public cloud if the servers at one public cloud become overloaded.
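To make that bursting logic concrete, here is a toy Python placement function, not tied to any vendor's API: work stays on the private cloud until its capacity is exhausted, and the overflow goes to whichever public cloud is least loaded. All capacities, loads, and names are invented numbers for illustration:

    # Toy illustration of "cloud bursting": fill the private cloud first,
    # then spill the remainder to the least-loaded public cloud.
    PRIVATE_CAPACITY = 100  # units of work the enterprise data center absorbs

    def place(workload_units, private_load, public_loads):
        """Return (cloud_name, units) placements for a new workload."""
        placements = []
        spare = PRIVATE_CAPACITY - private_load
        if spare > 0:
            placements.append(("private", min(workload_units, spare)))
            workload_units -= placements[-1][1]
        if workload_units > 0:
            # burst the remainder to the least-loaded public cloud
            target = min(public_loads, key=public_loads.get)
            placements.append((target, workload_units))
        return placements

    print(place(130, private_load=80, public_loads={"ec2": 0.6, "other": 0.3}))
    # -> [('private', 20), ('other', 110)]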

Based on the concept of multiple public clouds, IBM has introduced a hybrid cloud in alliance with Juniper Networks, in which users are reallocated to a different cloud if one cloud becomes overloaded. Juniper will provide hybrid cloud computing to IBM's nine worldwide cloud computing labs.

IBM Cloud Labs provides the world's largest network of cloud computing labs, extending from Silicon Valley, CA to Tokyo, Japan. In collaboration with Juniper, IBM offers drag-and-drop cloud management with its Cloud Management Console to extend computing resources between private and public clouds.

The console shows virtual machines (VMs) as small color-coded boxes (the color indicates whether a VM is in use) that may be selected to allocate resources. Juniper provides remote management of clouds over Multiprotocol Label Switching (MPLS) networks.

IBM's hybrid cloud computing is backed by other computing services: Service Management Center for Cloud Computing, which clients may use to build and deliver cloud services; IBM Rational AppScan 7.8, for securing web services published into a cloud; and IBM Design and Implementation for Cloud Test Environments, for testing clouds within client environments.

Hybrid cloud computing, in which enterprises extend their resources to public clouds, is the trend of the future. In addition to providing the reliability and scalability of public clouds, hybrid cloud computing has the appeal of providing the most suitable environment for each application.

Some applications, such as databases, run better on a dedicated server than on a shared server. Other applications run better on a cloud because they can take advantage of the cloud's elasticity to scale as required.

"Hybrid" implies "public" and "private" clouds, and providing multiple public clouds does not diminish the demand for private clouds. WebSphere Cloudburst appliance provides the ability to create and deploy WebSphere environments to private clouds. Some of the other private cloud products include 3Tera's AppLogic, ParaScale's Virtual Storage Network (VSN), and Cassatt's Collage 4.0.

Hybrid cloud computing is the future direction of cloud computing, and hybrid cloud products are increasingly being offered.

Citrix Cloud Center (C3) technologies provide a hybrid cloud. The Elastra Enterprise Cloud Server has been named one of the "100 Coolest Cloud Computing Products" and provides federated hybrid cloud management.

For a demo of hybrid clouds, refer to the Elastra Hybrid Cloud Administration demo. As recently as a few months ago, Microsoft emphasized the hybrid cloud at TechEd. OpenNebula recently released a Deltacloud adaptor for hybrid clouds. IBM recently acquired Cast Iron Systems to provide support for hybrid clouds.

Hybrid cloud computing is the future trend in enterprises as noted by Lucas Searle, head of virtualization at Microsoft UK.

Coming Events of Interest

Global Services Conference 2011 - January 27th in New York, NY. Cloud computing has implications not only for IT services but also for business processing; cloud-based delivery models present a discontinuous and disruptive shift that will redefine how IT and BPO services are delivered. The conference will present actionable propositions to leverage cloud-based models.

Digital Music Conference East - February 24th in New York, NY. The 11th Annual DMCE is the only event in the United States that brings together the top music, technology and policy leaders for high-level discussions and debate, intimate meetings and unrivaled networking.

Cloud Connect Conference - March 8th-10th in Santa Clara, CA. Learn about all the latest cloud computing innovations in the Cloud Connect Conference - designed to serve the needs of cloud customers and operators - where you will see the latest cloud technologies and platforms and identify opportunities in the cloud.

Media Summit New York - March 9th-10th in New York, NY. This event is the premier international conference on media, broadband, advertising, television, publishing, cable, mobile, radio, magazines, news & print media, and marketing.

NAB Show - April 9th-14th in Las Vegas, NV. For more than 85 years, the NAB Show has been the essential destination for "broader-casting" professionals who share a passion for bringing content to life on any platform - even if they have to invent it. From creation to consumption, this is the place where possibilities become realities.

CONTENT IN THE CLOUD at NAB - April 11th in Las Vegas, NV. What are the latest cloud computing offerings that will have the greatest impact on the broadcasting industry? How is cloud computing being harnessed to benefit the digital distribution of television programs, movies, music, and games?

1st International Conference on Cloud Computing - May 7th-9th in Noordwijkerhout, Netherlands. This first-ever event focuses on the emerging area of cloud computing, inspired by some latest advances that concern the infrastructure, operations, and available services through the global network.

Cloud Expo 2011 - June 6th-9th in New York, NY. Cloud Expo is returning to New York with more than 7,000 delegates and over 200 sponsors and exhibitors. "Cloud" has become synonymous with "computing" and "software" in two short years. Cloud Expo is the new PC Expo, Comdex, and InternetWorld of our decade.
