Distributed Computing Industry
Weekly Newsletter


April 9, 2012
Volume XXXIX, Issue 1


Aspera Delivers on the Promise of the Cloud at NAB 2012

Aspera, creator of next-generation software technologies that move the world's data at maximum speed, will demonstrate what is now possible in the cloud for digital media companies at NAB 2012. Aspera will introduce its Direct-to-S3 transfer software, which enables seamless, high-speed, and secure transfer of file-based digital content at any global distance by combining Aspera's patented fasp transport technology with cloud-based "object" storage such as Amazon S3.

Throughout the event, Aspera will be showcasing its industry-standard product line enabled with this new capability for high-speed data access in public and private cloud environments and, with Amazon Web Services, will be showing the efficient, large-scale workflows that are now possible.

As part of the official NAB conference program, Aspera President Michelle Munson will be presenting the advanced capabilities, new workflow possibilities, and cost advantages of these solutions at the CLOUD COMPUTING CONFERENCE on Monday, April 16th.

The cloud's promise of virtually unlimited, instant increases in compute, network, and storage resources is tempered by significant technology challenges in getting large media data to, from, and across remote infrastructures at long distances. Until now, the full potential of the cloud to transform IT processes for digital media - such as transcoding, archiving, and distributing entertainment content - has been limited by the inherent bottleneck in moving the data.
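
The distance bottleneck described above follows directly from how conventional TCP behaves: the well-known Mathis model bounds steady-state throughput by MSS / (RTT x sqrt(loss)), so throughput falls as round-trip time grows. A minimal sketch with illustrative numbers (this models ordinary TCP, not Aspera's fasp, which is designed to avoid this limit):

```python
import math

def tcp_throughput_mbps(mss_bytes, rtt_ms, loss_rate):
    """Approximate steady-state TCP throughput (Mathis model):
    rate <= (MSS / RTT) * (1 / sqrt(p))."""
    rtt_s = rtt_ms / 1000.0
    bytes_per_s = (mss_bytes / rtt_s) * (1.0 / math.sqrt(loss_rate))
    return bytes_per_s * 8 / 1e6  # convert to megabits per second

# Same 0.1% packet loss, growing distance (RTT): throughput collapses,
# regardless of how much bandwidth is provisioned on the link.
for rtt in (10, 100, 300):
    print(f"RTT {rtt:>3} ms -> {tcp_throughput_mbps(1460, rtt, 0.001):8.1f} Mb/s")
```

At a transcontinental 100 ms round trip, a single conventional TCP flow is limited to a few megabits per second under modest loss, which is why moving tens of terabytes per month to remote cloud storage requires a different transport approach.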

"Aspera pioneered the high-speed enablement of data-intensive workflows throughout the enterprise, and has now brought the same level of innovation to the cloud with its high-speed transport capabilities, available on-demand," said Bhavik Vyas, Director of Cloud Services at Aspera. "We believe that big data should move at the speed dictated by business requirements, and not be bottlenecked by conventional technologies."

Aspera is now first in the world to offer seamless, line-speed ingest and distribution of very large media files to and from cloud-based object storage such as the Amazon S3 service, independent of distance and with complete security. With digital supply chains now spanning the globe, and the complexity of transferring ever-larger files over longer distances increasing exponentially, digital media companies can now realize the full benefits of the cloud with Aspera-on-Demand solutions for the high-speed transfer, processing, and storage of their digital content.

"Our customers and partners using the Amazon Cloud, such as Netflix, Zencoder, SendToNews, and others should never have to worry about the tens of terabytes of content they need to ingest, prepare, and deliver every month, nor should they be concerned about content delivery spikes in demand that immediately require more throughput," commented Michelle Munson, President and Co-Founder of Aspera.

"By implementing our joint solution with Amazon Web Services, media companies can focus on what they do best - delivering the best quality content fast - instead of solving IT infrastructure problems."

Sorenson Media to Showcase Enhanced Video Encoding at NAB

Sorenson Media will be actively participating in the 2012 National Association of Broadcasters (NAB) Show, demonstrating the company's full suite of video encoding solutions throughout the event. In addition, the company will be participating in the associated CLOUD COMPUTING CONFERENCE hosted by the Distributed Computing Industry Association (DCIA) on Monday, April 16th.

Sorenson Media will be demoing its full suite of enhanced video workflow solutions at the show, including the gold-standard desktop encoding and transcoding application Sorenson Squeeze 8; the brand new professional level Sorenson Squeeze 8 Pro, which provides professional level support for Avid DNxHD and Apple ProRes formats; three versions of the high-volume Sorenson Squeeze Server encoding platform, including on-premise, cloud-based and hybrid solutions; and Sorenson 360, the company's innovative online video platform (OVP).

In addition, Kirk Punches, Vice President of Business Development, will represent the company on the "Audio/Video Pre-Production, Production, Post-Production Clouds" panel on April 16th at 1:10 PM. The panel will examine leading examples and key case studies of how cloud-computing solutions are accelerating processes, improving quality, and reducing the costs of collaboration, editing, animation, applying metadata, formatting, transcoding and other functions.

"The power and promise of cloud computing is at the center of Sorenson Media's expertise and product development focus," Punches said. "We are honored to share the expertise we have developed as the leading provider of video encoding and workflow solutions with the top-flight video professionals who will be at NAB and the CLOUD COMPUTING CONFERENCE."

WSO2 Executive to Speak on the Future of Cloud Computing at NAB

WSO2 Vice President of Technology Evangelism Chris Haddad will speak as a panelist in the "Years Ahead for Cloud Computing" session at the 2012 NAB Show for the digital media and entertainment industry.

The session is part of the CLOUD COMPUTING CONFERENCE, the second annual DCIA-within-NAB event presented by the Distributed Computing Industry Association (DCIA). The 2012 NAB Show will run from April 14th through the 19th at the Las Vegas Convention Center in Las Vegas, NV.

This panel session will examine credible forecasts and projections and what they suggest about the long-term impact of cloud computing solutions for the audio/video (A/V) ecosystem. It also will explore the significance of cloud computing for the underlying businesses that are based on content production and distribution. Additionally, experts will provide an overview of how cloud computing positively affects each stage of the content distribution chain, from collaboration and post-production to storage, delivery and analytics.

The panel is scheduled for Monday April 16th at 4:40 PM in N232 of the North Hall at the Convention Center.

Chris Haddad works closely with developers, architects, and C-level executives to increase WSO2 technology adoption and maximize customer value. Previously, Chris led research teams at Burton Group and Gartner advising Fortune 500 enterprises and technology infrastructure vendors on adoption strategies, architecture, product selection, governance, and organizational alignment.

Report from CEO Marty Lafferty

The DCIA proudly announces the final agenda, principal sponsors, and line-up of industry-leading speakers for our CLOUD COMPUTING CONFERENCE at the 2012 NAB Show.

This second annual DCIA "Conference within NAB" is scheduled for Monday April 16th in N232 of the North Hall at the Las Vegas Convention Center.

In addition, the DCIA will exhibit in N3222M at the CLOUD COMPUTING PAVILION, a first-ever special section of the NAB Exhibit Floor totally dedicated to cloud computing.

The NAB Show has evolved over the last eight decades to continually lead its ever-changing industry.

While solutions have changed to keep pace with consumer habits and technologies, aspirations to produce and deliver memorable content have remained constant.

The NAB Show will be attended by 90,000+ media and entertainment professionals from over 150 countries. More than $18.8 billion in purchasing power will be represented onsite. 1,500+ companies spread over 745,000 net square feet will exhibit. There will be more than 500 skill-building sessions, and 1,300+ members of the press will cover the event.

For more information or to register, please click here.

This very timely conference will demonstrate how software developers are addressing two major concerns with respect to cloud-based solutions for audio/video (A/V) delivery - reliability and security.

Experts will provide insights into how cloud computing impacts each stage of the content distribution chain, from collaboration to storage and delivery all the way through analytics.

Sponsoring companies include Aspera, Avid, Chyron, Front Porch Digital, and Rackspace.

The agenda for the conference, which begins at 10:30 AM and continues until 6:00 PM PT, will open with "Latest Trends in Cloud Computing Solutions for the A/V Ecosystem" and then move to "Key Pitfalls Associated with Cloud Computing in High-Value Content Implementations."

The conference will then explore "Various Ways that Cloud Computing Is Being Applied to the Content Creation Process - from Pre- to Post-Production," followed by "Alternative Approaches for Implementing Cloud Storage of Content Catalogs and Libraries and Leveraging Cloud-Based Distribution," and then, "New Levels of Media Performance Data Enabled by Cloud Computing - and Impact on Other Sectors."

The agenda will close out with "Navigating the Current Cloud Environment and Planning for What's Next," and finally, "Disruptive Effects of Cloud Computing Will Continue."

Keynote speakers for each of the above areas, respectively, include Bill Kallman, CEO, Scayl; Jim Burger, Member, Dow Lohnes; Mark Davis, CEO, Scenios; Jonathan King, SVP, Joyent; Scott Brown, GM & SVP Strategic Partnerships, Octoshape; Jean-Luc Chatelain, EVP, Strategy & Technology, DataDirect Networks; and James Hughes, VP and Cloud Storage Architect, Huawei.

The first panel will explore "Advanced Capabilities, New Features, Cost Advantages of Cloud Computing Solutions" with Mike Alexenko, Senior Director of Market Development, Cloud & Mobility, G-Technology; Scott Campbell, Principal, Media, Entertainment, and Telecoms, SAP; David Frerichs, Strategic Consultant, Pioneer Corporation; David Hassoun, Founder, RealEyes Media; AJ McGowan, CTO, Unicorn Media; Samir Mittal, CTO, Rimage; Michelle Munson, CEO, President, and Co-founder, Aspera; and Robert Stevenson, EVP, Interactive Entertainment, Gaikai.

The second panel will examine "Privacy Issues, Reliability Questions, Security Concerns in the Cloud Computing Space" with Dave Asprey, VP, Cloud Security, Trend Micro; Tom Mulally, Consultant, Numagic Consulting; Graham Oakes, Chairman, Digital Watermarking Alliance (DWA); Rajan Samtani, SVP, Sales & Marketing, Peer Media Technologies; Dan Schnapp, Partner & Chairman of New Media, Entertainment & Technology, Hughes, Hubbard & Reed; Yangbin Wang, CEO, Vobile; Marvin Wheeler, Chairman, Open Data Center Alliance (ODCA); and Vic Winkler, Author, "Securing the Cloud."

The third panel will focus on "A/V Pre-Production, Production, Post-Production Clouds" with Tony Cahill, Chief Engineer, CET Universe; Guillermo Chialvo, Gerente de Tecnologia, Radio Mitre; Gerald Hensley, VP, Worldwide Entertainment Sales, Rovi Corporation; Chris Kantrowitz, CEO, Gobbler; Ajay Malhotra, EVP, North America, Prime Focus Technologies; Todd Martin, SVP, Strategic Solutions Group, Chyron; Kirk Punches, VP, Business Development, Sorenson Media; and Jostein Svendsen, CEO, WeVideo.

The fourth panel will cover "Cloud Media Storage & Delivery" with Bang Chang, VP, Server and Storage, SeaChange International; Stephen Condon, VP, Global Marketing Communications, Limelight Networks; Thomas Coughlin, President, Coughlin Associates; Gianluca Ferremi, VP Sales & Marketing, Motive Television; Corey Halverson, Product Director, Media Business Solutions, Akamai; Kshitij Kumar, SVP, Mobile Video, Concurrent; Kyle Okamoto, Sr. Mgr. Product and Portfolio Mgt., Verizon Digital Media Services; and Mark Taylor, VP, Media and IP Services, Level 3.

The fifth panel will address "Cloud Measurement, Analytics, Implications" with Sean Barger, CEO, Equilibrium / EQ Network; Steve Hawley, Principal Analyst & Consultant, TVStrategies; Jonathan Hurd, Director, Altman Vilandrie & Co.; Monica Ricci, Dir. of Product Marketing, CSG International; John Schiela, President, Phoenix Marketing International (PMI); Nick Strauss, Director of Sales, Verizon Digital Media Services; and Mike West, CTO, GenosTV.

The sixth panel will forecast the "Years Ahead for Cloud Computing" with Saul Berman, Lead Partner, IBM Global Business Services; Ian Donahue, President, RedThorne Media; Chris Haddad, VP, Technology Evangelism, WSO2; Wayne Josel, Counsel, Media & Entertainment, Hughes, Hubbard & Reed; Steve Mannel, Senior Director, Media & Communications, Salesforce.com; James Mitchell, CEO & Founder, Strategic Blue; David Sterling, Partner, i3m3 Solutions; and Chuck Stormon, CEO, Attend.

Moderators will include Adam Marcus, Technology Advisor, DCIA; Brian Campanotti, CTO, Front Porch Digital; and Sari Lafferty, Business Affairs, DCIA. Share wisely, and take care.

Gartner Predicts Cloud Will Replace PC by 2014

Excerpted from OctechAdvisors Report by Jason Makevich

We all know the important role that cloud computing plays in our lives. If a business is struck by fire or natural disaster but its files are stored in the cloud, then business can go on as usual; and many Internet service providers (ISPs) include security along with backup and recovery in their service offerings, so business leaders can rest assured that their mission-critical data is reasonably protected from hackers as well. Consumers have already embraced the cloud, streaming movies and television shows on Hulu and Netflix, downloading e-books from Amazon, etc.

But does that mean that the cloud will soon become the be all and end all of our existence? Gartner seems to think so.

Gartner Research Vice President Steven Kleynhans argued that our current era is not the "post-PC" era that many have named it. Instead, he contends that the cloud will, eventually, unify all the devices that consumers use, including PCs. That means if business leaders want to attract new customers while simultaneously holding on to existing ones, they will have to embrace the cloud.

"When we first started talking about the cloud and the technological benefits we would be reaping from the explosive growth of that medium, who knew that it would be the consumer market taking the lead, leaving business enterprises in the dust when it comes to leveraging the cloud," said Luis Alvarez, founder of Internet service company Alvarez Technology Group. "I am amazed by the way that consumer technology has embraced hosted solutions, from something as simple as Facebook to the complexity of entertainment-on-demand as offered by Apple TV or Hulu."

Change doesn't happen overnight, mainly because people tend to resist it. So, the changes that business leaders have to make to incorporate cloud computing into their business strategies will not happen right away. Nor should they. Such drastic changes in business operations should happen gradually over time. But if small-business owners in particular are committed to the success of their businesses, then they will start making those changes sooner rather than later. This is where Internet service providers can prove their worth to their clients.

"If we are going to be the technology advisors our business clients depend on us to be, then we need to figure out how to leverage the cloud aggressively on their behalf, but do it in a secure and safe manner. If you haven't figured out the benefits of cloud computing and how to fold those benefits into the products and services you are offering your clients, you need to do it now or find yourself out of the game in a few short years," said Alvarez.

Maybe saying that the PC will be completely replaced by the cloud is an exaggeration. But it's no exaggeration to say that the role the cloud plays in our personal and professional lives will continue to increase year after year. In one respect, perhaps Gartner is right: by 2014, the cloud will be the "glue that holds all of our devices together."

Dell's Bid for Wyse Validates Cloud Vision

Excerpted from Financial Times Report by Paul Taylor

The trend towards desktop virtualization - running corporate desktop PC software on servers "in the cloud" rather than on a local device - is likely to be given a further fillip after Dell agreed to acquire Wyse Technology.

"Desktop virtualization can help organizations streamline IT management, improve productivity and security, and increase cost efficiency," said Jeff Clarke, president of Dell's end user computing division, announcing the deal.

For Dell, the deal could make it easier for it to migrate from "legacy" desktop PCs towards a potentially more flexible, more secure, and more easily managed cloud-computing model in which both applications and data are stored remotely on cloud servers rather than on a local hard drive.

Krista Macomber and Jack Narcotta, analysts with Technology Business Research (TBR), argue that the acquisition of Wyse represents "an aggressive step to bring Dell's solutions strategy closer to an area of historical strength it has with PCs: end-users' work spaces.

"From the core PC customer base to its newly formed software division, Dell is aligning its corporate strategy to build its reputation as an end-to-end, enterprise-caliber solutions provider to remain relevant in an IT industry migrating to as-a-service delivery models," the TBR analysts said.

For Wyse, the proposed acquisition represents a validation of a strategy it has adopted in recent years to reposition the company as a cloud services pioneer. Wyse, a "thin client" pioneer, was founded in 1981 by a husband and wife team while they were studying engineering at Illinois University. It struggled, however, over the years to win broad acceptance for its thin client technology model despite its claimed advantages.

While the PC industry grew dramatically in the 1980s and 1990s, the slow uptake of thin-client computing reflected a number of factors, including relatively slow and unreliable data connections, concerns about the security of data stored remotely and, perhaps most importantly, the desire of employees to have their own PCs and local storage rather than rely on a "dumb terminal" and unseen server.

However, the growing popularity of cloud computing has enabled Wyse and other desktop virtualization specialists to recast themselves as being in the vanguard of the next wave of corporate computing, enabling access to corporate applications from any device - including smart-phones and PC tablets.

Wyse itself has de-emphasized the hardware element of its business and positioned itself as primarily a software supplier that enables companies to accelerate their move to cloud computing as "the leader in cloud client computing".

Crucially, the deal with Dell should help Wyse reach a much broader market. "The combination of Wyse and Dell provides us with tremendous growth opportunities for our core desktop virtualization business, helps us expand into new and fast-growing market segments including mobility and cloud computing, and provides us with reach and scale we did not previously have," said Tarkan Maner, Wyse's chief executive.

For Dell, which faces the prospect of continued soft PC sales in developed markets like the US and Europe, the deal represents an interesting hedge and should help bolster sales of higher margin products including corporate servers, storage systems, data center systems and services.

As TBR's analysts point out, "industry-wide PC and laptop sales are flat or struggling as customers transition PC infrastructures to virtualized and cloud-based environments, or seek services and other means to extend the capital investments in hardware client fleets.

"As PC's contribution to Dell's revenue and margins steadily declines, the company needs a solution that securely and efficiently links end points with core datacenter components such as servers, storage and networking."

As a result they argue that the acquisition of Wyse creates an end-to-end virtualized desktop offering that better positions Dell to earn new, lucrative business globally from large enterprises, and which is critical as Dell makes its transition from high-volume hardware manufacturer to trusted provider of comprehensive services and solutions.

"Wyse expands Dell's Virtualized Desktop Infrastructure (VDI) portfolio and enables Dell to sell comprehensive solutions that include higher-margin offerings in its hardware portfolio, such as PowerEdge servers and EqualLogic storage," TBR said.

The research firm added that this should better enable Dell to compete against established desktop virtualization competitors, such as HP and Cisco and smaller software specialists like MokaFive.

Carriers Make Their Play in the Cloud

Excerpted from ComputerWorld Report by Brandon Butler

The wave started last year when Time Warner Cable, the telecommunications company serving much of the Eastern United States, spent $230 million to purchase NaviSite, a provider of cloud services for businesses.

Two months later, in April, CenturyLink, the southern telecom, bought Qwest for $12.2 billion worth of stock. That same month, Verizon responded by purchasing Terremark, the cloud infrastructure-as-a-service (IaaS) provider. CenturyLink followed by purchasing Savvis, another IaaS provider, for $2.5 billion.

It's been a busy past 18 months for domestic telecommunication companies that have aggressively entered the increasingly crowded cloud marketplace. But what's the core strategy for these telecoms? And do they really have a chance of competing against industry leaders?

"The telecoms' core business is changing around them, and the cloud is a very natural place for them to go," says Chris Drumgoole, Senior Vice President of Client Services for IaaS company Terremark, which is now owned by Verizon.

As telecoms iron out their cloud strategy, experts say they need to move quickly. Revenues from their traditional voice offerings are eroding and competitors are moving quickly to diversify their offerings and attract new customers. Amazon Web Services (AWS), Microsoft Azure, and Google have all reduced prices on their cloud offerings in the past month, for example.

Telecoms do offer some advantages - namely, they already have a large nationwide network infrastructure. But some experts believe telecoms don't have a chance to compete against market leaders such as AWS, IBM, and HP, and that telecoms instead need to focus on new value-add services they can provide.

"This is a very new direction for the carriers who have in the past played it close to the vest," says Bob Rosenberg, an independent analyst who has been tracking telecommunications companies since 1990.

The carriers are "gobbling up data centers," he says, which in a sense amounts to a game of catch-up. IaaS competitors already own massive IT infrastructures that telecoms are now trying to build up themselves. But there is one area where telecoms have an edge: "They have the wires," Rosenberg says.

Armed with their newly acquired data centers, Rosenberg says the natural move is to play up the security and network infrastructure strengths their legacy wired networks inherently bring. From an end-user perspective, that appeals to customers looking for low-latency connections for large amounts of data transfer.

"As providers of infrastructure services, we have to be able to improve the value proposition for customers," says Bryan Doerr, CTO at Savvis. "Telecoms help us do that. We can leverage the scale, capital efficiencies and combine them with our expertise." One significant differentiator of now being part of CenturyLink, he says, is security. "It's about more than just hosting now; it's about having a network that can connect these enterprises, which is a big part of the security story," he says.

But other analysts say that instead of competing in the IaaS field, telecoms should find their own niche. "Competing on a like-to-like basis on IaaS is not a winning strategy," says Dan Bieler, a telecom industry analyst at Forrester.

Telecoms, he says, have an opportunity to provide value-added services wrapped around their strength, which is connectivity. Bieler believes the best play in the cloud for telecoms is to offer a platform for hosting software programs on their infrastructure. This could include enterprise resource planning (ERP) and customer relationship management (CRM) services, plus additional complementary services related to connectivity between multiple locations, data transfer security and software management on top of that.

For example, a telecommunications company could offer to host an enterprise's Oracle or SAP offering on its database. "Then, they can create a whole bundle of services around connectivity, application management, up to and including device management as part of a range of offerings," he says. Vendors - the Oracles and SAPs of the world - have an incentive to work with telecom providers because it spreads their products out into the market further, he says. End users don't have to invest in the infrastructure to host the applications on site, and can take advantage of the efficiencies the cloud offers, he says.

On an even broader scale, telecoms may be able to provide an entire unified communication platform, Bieler suggests. They can offer enterprises an opportunity to purchase communications systems, and use their existing infrastructure to connect a range of devices, from phones to mobile devices and tablets.

Some telecoms have already recognized the opportunity. John Potter, a Vice President at AT&T's business solutions division, says the company doesn't want to just compete on the IaaS-level. Instead, AT&T wants application developers to write programs that run on AT&T's network.

"There is an opportunity for developers to work with us, on our platform, leveraging our toolsets and capabilities to make a truly robust application," Potter says. And, while AT&T is making a play in the platform market, it is also making it easier for enterprises and SMBs to connect into the AT&T offerings. Earlier this year, for example, it announced that enterprise private clouds running VMware can more easily integrate with the company's public cloud offering.

Other telecoms are putting their eggs in the IaaS basket. Drumgoole, the Terremark official, says IaaS is a natural fit for Verizon because Terremark is already a strong player in that field. He expects Verizon, CenturyLink, AT&T and others to "opportunistically" look at software-as-a-service (SaaS) and platform-as-a-service (PaaS) offerings.

Bieler says telecoms are in a good position because they are seen as trusted players and have an existing set of customers. "You can't underestimate the soft factors these companies have going for them: integrity, reliability, security," he says. Plus, another underestimated asset of the telecoms is the massive sales force they already have to go out and sell these systems.

Still, Bieler says telecoms need to move quickly on their new strategy, be it a straight IaaS play or a more nuanced platform offering as he described. Telecoms, he says, currently rely on voice for a large portion of their revenue and cash flow. Those revenues are eroding, so they need to find a way to replace that cash. The cloud, he believes, could be the answer.

Telefonica and Intel Launch an Innovative Pilot for Hybrid Clouds

Telefonica Digital and Intel have developed a joint pilot to demonstrate that enterprises can easily scale up their computing capacity by moving or growing applications from their internal data centers to the external public or third party cloud.

This solution allows enterprises to move as much of their applications and information technology (IT) as they wish to the public cloud, while keeping their important applications (or parts of applications) running internally. This flexibility allows enterprises to meet temporary spikes in demand, maintain business continuity, and optimize their data center use and capacity.

Companies can now seamlessly interact between their internal cloud and the external public cloud, with no portability or interoperability issues between these two different environments.

The public cloud provides an integrated portal where enterprises will be able to manage their internal data center resources and the resources deployed on the public cloud.

Enterprises will be able to interoperate transparently between these two environments. The hybrid cloud provides complete IT capabilities to IT managers in a quick, secure, efficient and flexible manner.

This joint pilot has demonstrated successfully the cloud-bursting exchange from a local private cloud (played by the Intel Cloud Builder facility) to the public cloud (played by the Telefonica Virtual Data Center public cloud service). Cloud resource provisioning across the hybrid cloud is achieved under the OpenStack framework, proving the feasibility of distributing workloads from private to public clouds on top of this open source platform.
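
The cloud-bursting pattern demonstrated in the pilot can be sketched in a few lines. This is a minimal illustration of the idea, not the pilot's actual implementation: the cloud names, the 80% threshold, and the capacity numbers are all assumptions introduced for the example.

```python
# Illustrative cloud-bursting placement logic. The names "private" and
# "public", the burst threshold, and the capacity figures are assumptions
# for this sketch, not details of the Telefonica/Intel pilot.

def choose_cloud(cores_used, cores_total, burst_threshold=0.8):
    """Place new workloads on the private cloud until its utilization
    crosses the threshold, then burst to the public cloud."""
    utilization = cores_used / cores_total
    return "private" if utilization < burst_threshold else "public"

def provision(workload, cores_used, cores_total):
    target = choose_cloud(cores_used, cores_total)
    # With an OpenStack-based framework, provisioning against either
    # environment is the same call - only the cloud profile differs
    # (hypothetical identifiers shown):
    #   conn = openstack.connect(cloud=target)
    #   conn.compute.create_server(name=workload, image_id=..., flavor_id=...)
    return target

print(provision("render-job", cores_used=30, cores_total=100))  # "private"
print(provision("render-job", cores_used=90, cores_total=100))  # "public"
```

The point of running both environments on the same open source framework, as the pilot does with OpenStack, is precisely that the provisioning call is identical on both sides, so only the placement decision changes.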

The next phase of activities is planned to illustrate the trustworthy enforcement of infrastructure management policies across cloud service boundaries. Operational parameters such as power management policies will be deployed using Intel Trusted Execution Technology, Intel Node Manager and Intel Data Center Manager.

"This capability brings a previously unavailable level of transparency and federation between cloud consumers and cloud providers, helping preserve the service policies across cloud boundaries down to the hardware platform level. Cloud consumers will have an easier time assessing service quality and SLA compliance. Cloud providers will be able to offer differentiated cloud services even when their offering is part of a larger service," says Jim Blakley, Architecture Systems Integration Division Director at Intel.

Moises Navarro, Cloud Global Director for Telefonica Digital, says: "Innovations like 'GoToCloud' or 'Real Elastic Cloud', developed by Telefonica, show our clear commitment to taking care of our clients' needs as well as to providing value to the cloud service model. From the Telefonica Digital perspective, the hybrid cloud will contribute to the data-center operator's 'Go to Cloud' strategy that aims to boost the migration of corporate applications to the cloud, as an evolution of its Virtual Data Center cloud product."

Huawei Teams with Intel for Server Program in 2012

Excerpted from ChinaTechNews Report

With the launch of its Tecal V2 servers with Intel Xeon E5 processors inside, Chinese telecom equipment maker Huawei has announced its server product plans for 2012, including rack, blade, data center servers, and application acceleration solutions.

Huawei's new server products will reportedly be mainly based on Intel's chips. Tecal V2 servers use the newest Intel Xeon E5-2600 processors and focus on the enterprise information technology (IT) and data center sectors. Thanks to their energy-saving design, Huawei Tecal servers received a "Restriction of Hazardous Substances Certificate" and the first "China's Environmentally Friendly Product Symbol Certificate".

According to comments made to local media by Chen Shijun, General Manager of Huawei Server Product Domain, Huawei's new servers feature more powerful performance, larger storage capacity, and better scalability. Chen said that server development strategy plays a very important role in Huawei's ICT strategy. Cloud computing technology has brought about profound changes in IT infrastructure.

As the cornerstone of IT infrastructure, the server is the ladder to the cloud. Tecal servers have witnessed outstanding growth in the past three years. With a wide range of products including rack servers, blade servers, and high-density, scalable data center servers, Huawei serves clients in various industries such as Internet, government, medical, and energy.

At present, Huawei's major clients are from data center and Internet sectors; however, the Chinese company plans to expand its server market and enter the overseas market.

During the first three quarters of 2011, Huawei reportedly realized contractual value of $185 million for server products, a year-on-year increase of 136%. For the entire year of 2011, the company expects to achieve contractual value of $250 million.

Terremark Offers Private Clouds for Enterprises

Excerpted from InformationWeek Report by Charles Babcock

Verizon cloud unit Terremark is offering "single-tenant cloud computing," which until recently was known as dedicated hosted services - or some other name for outsourced system operation. But the times they are a changin' and the formerly distinct lines of service providers are blurring.

Terremark will offer Enterprise Cloud, Private Edition, a cookie-cutter copy of its public infrastructure-as-a-service (IaaS), except it has additional isolation and security features. Amazon Web Services, Rackspace Cloud, and the Savvis unit of CenturyLink have already launched versions of "private" cloud operation as an auxiliary service of the public IaaS.

Terremark, however, will bring an ease-of-migration capability, due to its acquisition last August of CloudSwitch. With CloudSwitch, Terremark customers will be able to take legacy workloads and move them - without re-engineering - into the Private Edition of Terremark's IaaS. If the user company becomes comfortable with its operation there, it will have the option of moving the workloads again, into the nearby public cloud infrastructure, to take advantage of lower hourly rates.

In Private Edition, "The compute and storage resources are dedicated, so the customer doesn't have to be concerned about sharing a server with a competitor," said Ellen Rubin, VP of Terremark and Co-Founder and former VP of Products at CloudSwitch. A customer is assigned a set of servers and disk drives and is the only customer to use those resources in the Private Edition offering.

As far as experts know, customer isolation holds even in multi-tenant, public cloud environments, such as Terremark's own public cloud, dubbed Enterprise Cloud, Amazon's EC2, or Rackspace's Cloud. Multi-tenant architectures use firewalls and logical boundaries around virtualized applications to keep them from trespassing in others' memory spaces or storage. For example, Salesforce.com CRM applications have operated for over a decade in multi-tenant environments without publicly known customer data corruption.

Even so, enterprise users put security at the top of their list of concerns about public cloud environments. Terremark's Enterprise Cloud, Private Edition, is meant to meet that heightened level of concern, said Rubin.

Terremark's private cloud edition is based on the virtual machine hypervisor most frequently found in large corporations, VMware's ESX Server, with cloud-enablement software also provided by VMware with Terremark's own enhancements.

Enterprise Cloud, Private Edition isolates customers' network services as well, but the distinguishing feature of Terremark's offering is probably CloudSwitch, produced by Rubin's former firm.

CloudSwitch wraps a legacy workload in a "cloud isolation layer" that contains its existing virtual machine format, security policies, and networking details and moves it to a Terremark facility where it can be run with those attributes intact. In effect, CloudSwitch translates between the old environment and the new, without programming intervention to convert the legacy system into something suitable to the new environment.

With a translation system, such as CloudSwitch, enterprise cloud users will gain a tool that helps them migrate legacy workloads to a Terremark facility. If they start out in the Private Edition portion of the data center, they will have to pay more for dedicated resources than they would in Terremark's public Enterprise Cloud. "Customers who want Private Edition with dedicated servers understand that it will be more expensive than multi-tenant services," Rubin said in an interview.

Terremark doesn't publish Private Edition prices. The starting price of a small VMware virtual machine in its public cloud is 3.7 cents an hour, on the high end of what may be comparable offerings from Microsoft's Azure and Amazon's EC2, whose entry points are 2 cents and 4 cents an hour, respectively. Direct comparison is difficult without knowing all the characteristics of the respective virtual servers.
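To make the rate comparison above concrete, a quick back-of-the-envelope calculation shows how those hourly entry points compound over a month. This is an illustrative sketch only: it assumes a 730-hour month and flat on-demand pricing with no discounts, which the article does not specify.

```python
# Hypothetical monthly-cost comparison of the entry-level hourly rates
# cited in the article. A 730-hour month and flat on-demand pricing are
# assumptions for illustration only.
HOURS_PER_MONTH = 730

rates = {
    "Terremark (public cloud)": 0.037,  # dollars per hour
    "Microsoft Azure": 0.02,
    "Amazon EC2": 0.04,
}

for provider, hourly in rates.items():
    monthly = hourly * HOURS_PER_MONTH
    print(f"{provider}: ${hourly:.3f}/hr -> ${monthly:.2f}/month")
```

Even at these small per-hour differences, the gap adds up to double-digit dollar differences per virtual machine per month, which is why the Private Edition-to-public-cloud migration path matters to cost-conscious customers.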

Rubin said the decision to move from Private Edition to the lower-cost public cloud would be a simple migration, if the customer chooses to make it.

"For some customers, going to the cloud at all is a big deal. Private Edition may serve as their entry point," said Rubin. Once in a remote facility, "workload portability is really important," she added.

Terremark is the cloud operations unit of Verizon telecommunications. It was acquired by Verizon for $1.4 billion in January 2011 and operates 49 hosted services data centers around the world. The Private Edition service will initially be available in Terremark's data centers in Culpeper, VA; Amsterdam, the Netherlands; Sao Paulo, Brazil; and Terremark's NAP of the Americas in Miami, one of the largest data centers in the world. Rubin said the number of facilities with Private Edition will double by the end of the year.

DataDirect Networks to Supply Radio Telescope Storage

DataDirect Networks (DDN), world leader in massively scalable storage, and the International Center for Radio Astronomy Research (ICRAR) are partnering to develop a system architecture capable of processing over one exabyte of data per day to support the upcoming Square Kilometer Array (SKA), a $1.5 billion scientific instrument intended to be the world's most powerful telescope.

At the center of this partnership are engineering and application architecture enhancements to exploit DDN Storage Fusion Architecture (SFA) In-Storage Processing technology to leverage the unique advantages of executing data processing within a scalable data storage platform.

The DDN Storage Fusion Architecture is a unified virtual server and storage appliance featuring DDN's groundbreaking In-Storage Processing technology. Proven over three years of market availability, this virtualized environment eliminates externalized data processing systems and allows data-intensive applications to run within the storage engine, accelerating data access, minimizing latency, and significantly lowering the cost and complexity of data-intensive computing.

"DDN is pleased to partner with ICRAR on this project and to be recognized as a technology leader in the design of massively scalable systems which advance the scientific agenda of the worldwide research community," said Alex Bouzari, CEO and Co-Founder, DataDirect Networks. "This project is a great example of the potential of Big Data: cutting-edge researchers turning exabytes of machine-generated data into valuable information. DDN's In-Storage Processing technology is a showcase example of how advanced storage processing tools can maximize the value of our customer's information to achieve unprecedented scientific understanding. Our experience and leadership at enabling the most demanding Big Data programs such as ICRAR uniquely benefits any customer in any industry facing Big Data challenges."

Based in Perth, Australia, ICRAR is a joint venture between Curtin University and The University of Western Australia (UWA) to facilitate research excellence in astronomical science and engineering.

"DataDirect Networks is the recognized world leader in solutions for the type of massively scalable data-processing required for the incredible scale of the SKA telescope," said Professor Andreas Wicenec, head of computing for ICRAR. "This project combines our expertise implementing and using in-storage processing of astronomical data based on commodity hardware with the most advanced HPC storage appliance supporting in-storage processing. With this combination we are very well prepared to design, develop and test solutions for the SKA data challenge."

UWA recently installed a new Fornax supercomputer, which is 10,000 times faster than an average office desktop computer, to help drive Australia's powerful new radio telescopes. DDN's previous work on the Australian precursor radio telescopes, the Murchison Widefield Array (MWA) and the Australian SKA Pathfinder (ASKAP), helped to inform its development of the storage solution for the SKA.

Nodejitsu and Joyent Cloud Join Forces to Deliver Developer Tools

Joyent Cloud, the public cloud designed for real-time, high-performance applications, and Nodejitsu, provider of cloud management software and enterprise development tools for Node.js, today announced a partnership to deliver best-of-breed infrastructure and platform services and software to the growing ranks of enterprise and public cloud users developing and deploying applications for Node.js.

Node.js is the fast-growing, high-performance application environment that simplifies the unified development of front-end and server-side architectures using JavaScript, the language of the Web. The new service will allow thousands of developers to easily build, deploy, scale and manage Node.js applications on Nodejitsu's platform.

With this partnership, Nodejitsu's Platform-as-a-Service and orchestration tools will run in the Joyent Cloud to give Nodejitsu users the best front-end connected to the best back-end environments for running Node.js applications.

Joyent Cloud provides public and private cloud computing software and services to popular Web applications that include LinkedIn, Voxer, Gilt Groupe and TaskRabbit. Joyent is the corporate steward of the Node.js open source project and manages the Node.js code repositories.

"This partnership represents a best-of-breed Node infrastructure in collaboration with a best-of-breed Node application orchestration platform to create the best possible experience for customers," said Nodejitsu CEO Charlie Robbins. "We are very excited about working together with Joyent to guide Node.js into the future."

Nodejitsu has long been dedicated to the userland portion of Node.js, maintaining over 200 widely used open source modules that are complementary to its commercial solutions.

"Nodejitsu provides cohesive tools for managing Node.js applications in public, hybrid or private clouds. This mitigates the risk and overhead associated with integrating third party deployment, monitoring, provisioning, and scaling solutions," said Nodejitsu CTO Paolo Fragomeni.

This partnership enables the rapid expansion of Node.js platform services to drive adoption of Node.js. It also enables more developers and operations teams to take advantage of Joyent's unique capabilities for quality assurance and performance tuning of Node.js applications. In particular, Joyent Cloud Analytics, powered by DTrace, provides deep application monitoring and performance analytics equivalent to running core dumps and CPU profiling in a dynamic environment.

Until now, these types of environments have presented challenges for developers seeking deep insights into application and compute processes. Further, Joyent Cloud delivers debugging tools for Node.js that accelerate application development and reduce the cost and man-hours required to achieve stability.

"Nodejitsu is one of the leading players in the Node.js space and our team is very eager to work with them to build out additional capabilities and drive user adoption and awareness of both the Node.js environment and Nodejitsu's tools," said Steve Tuck, general manager of the Joyent Cloud.

What Is Big Data -- And How Does It Matter to Marketing? 

Excerpted from Online Spin Report by Max Kalehoff

It seems as if every new business-tech venture positions itself around "big data."

Not surprisingly, so do VCs. So do business and tech publications and blogs, which are dedicating increasing resources to covering big data. Even the PR spin-sters have caught on, as demonstrated by the pitches I regularly receive.

How big is the buzz? Enough to drive Google searches for the term "big data" upwards of 300% since the beginning of 2011.

The problem is that "big data" is jargon. While it may mean something specific in data and software engineering circles, it is too easy to find myriad definitions and interpretations, even from people who work in the technology industry.

Now, I'm not a software engineer or data scientist. But I can assure you there is substance and weight to big data -- particularly in marketing.

To shed some light on big data for those less technical, I turned to my colleague Sanjay Gupta, who leads our engineering team at my company. He previously spent 17 years at some of the world's top ad-tech venues, including paid search at Microsoft and Yahoo's R&D Center in Bangalore.

His latest challenge is migrating our business to our new technology stack, which includes deployment of Hadoop, a leading framework for big-data analysis.

Kalehoff: What is "big data"?

Gupta: "Big data" refers to the huge volume of data that cannot be stored and processed using conventional databases in a reasonable time. While the storage problem has been solved by large offline storage systems, analysis tasks need to combine data in many ways; data read from one source may need to be combined with the data from multiple sources. This is where conventional data processing systems fail. Big-data technology tackles this problem by running logic on large data chunks in parallel and then combining them together in a manner that gives global results. We're now mining data sets that are so large in volume that they become truly representational of the entire problem space. This leads to more accurate predictions and more valuable insights.

Kalehoff: What industries and applications are most suited to big data?

Gupta: Any industry that can tap past data, or data from other organizations, to improve its business, process or predictive insights is well suited for big data analysis. For example, the New York Stock Exchange generates many terabytes of data per day. Any trading company would like to mine these data along with past data to understand stock trends.

In the Internet media industry, for example, companies can use similarly large volumes of search, visitor and click data to better understand audiences. A broadcast network can use viewership data to unlock prime times over different demographic segments. A bank can use its historical fraud data to raise alerts for suspected new fraud. An e-commerce company can use its huge customer and sales data to better predict demand and supply. In short, any company that can use large volumes of data -- internal or external -- to learn and derive value for better decision-making is a strong candidate for big data.

Kalehoff: What is the promise of big data in marketing?

Gupta: The ability to leverage big data has opened opportunities for startups to build tools and platforms that can surface insights around demand, supply, consumer behavior, segmentation, positioning and targeting. Big data is a natural fit for marketing because of the huge volumes of sample data that can drive analysis and more meaningful predictions.

Kalehoff: Do you feel the marketing industry is properly equipped to manage the voluminous amount of data needed to make smart marketing decisions?

Gupta: Big-data analytics require data to be collected from every step of marketing flow, continuously and accurately, so as to mine the "true function" of the output with a high degree of confidence. The biggest challenge the marketing industry faces is that it is not equipped with technology for real-time collection, storage and analysis. Most marketing organizations also lack process, and face significant restrictions that prohibit the exposure of data for analysis.

Marketing departments need better coordination and capabilities to share and analyze data in a more systematic manner. Interpretation is the biggest challenge, and understanding patterns and surfacing insights demands high intimacy with the individual business and larger market. Analysis of big data trends can be outsourced to external agencies, but the closer analysts are to the business, the better. That's why organically developed capabilities are often best.

Kalehoff: How have Facebook's recent moves changed the big data landscape?

Gupta: Facebook's marketing platform continues to emphasize the need for real-time data and insights. Launching, tracking, and optimizing brand presence on Facebook across multiple APIs certainly requires big-data analysis in real time.

Importantly, marketers want not only insights into Facebook, but insights into Facebook that are integrated with data and insights from multiple other channels, including internal customer databases. There is a huge opportunity to integrate and mine these large volumes of data in order to surface hidden relationships and business insights, and present them beautifully and persuasively to key stakeholders.

Kalehoff: Will developments in big data prompt the Federal Trade Commission or other government regulatory bodies to ramp up efforts to protect consumers' online data?

Gupta: The promise of big-data analysis relies on accessing trends and patterns on a large volume of past data to predict the future. As long as consumers' online data are used to mine insights of entire segments and marketplaces without isolating, storing, and tracking personally identifiable information, there should be no issue. A clear mandate is that data from different sources will need to be processed (transformed or encrypted) at the source in real time, and not stored, transported, or aggregated offline in original form. This is going to be the biggest challenge for big-data systems.
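One common way to realize "transform at the source" is to replace identifying fields with one-way hashed tokens before records ever leave the collection point. The sketch below is an assumption-laden illustration of that idea: the field names, the salted SHA-256 scheme, and the token truncation are all hypothetical choices, not a prescribed standard.

```python
# A minimal sketch of transforming records at the source so personally
# identifiable fields never travel in their original form. Field names
# and the salted-hash scheme are illustrative assumptions.
import hashlib

PII_FIELDS = {"email", "name"}
SALT = b"site-specific-secret"  # kept at the source, never shipped

def anonymize(record):
    out = {}
    for key, value in record.items():
        if key in PII_FIELDS:
            # One-way hash: analysts can still count and segment by
            # user token without learning who the user actually is.
            digest = hashlib.sha256(SALT + value.encode()).hexdigest()
            out[key] = digest[:16]
        else:
            out[key] = value
    return out

record = {"email": "jane@example.com", "name": "Jane", "clicks": 7}
safe = anonymize(record)
print(safe["clicks"])  # behavioral data survives unchanged
print(safe["email"])   # a hashed token, not the address
```

Because the same input always yields the same token, segment-level analysis still works downstream, while the original identifiers never leave the source system.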

Kalehoff: What comes after big data?

Gupta: Today's big-data analysis is used to optimize existing business. The semantics of big data is shifting from "data of action" to "data of intention." The future of big data will be to use it as a tool to discover new segments and audiences, and to invent new products.

Coming Events of Interest

2012 NAB Show - April 14th-19th in Las Vegas, NV. From Broadcasting to Broader-casting, the NAB Show has evolved over the last eight decades to continually lead this ever-changing industry. From creation to consumption, the NAB Show has proudly served as the incubator for excellence - helping to breathe life into content everywhere. 

CLOUD COMPUTING CONFERENCE at NAB - April 16th in Las Vegas, NV. Don't miss this full-day conference focusing on the impact of cloud computing solutions on all aspects of production, storage, and delivery of television programming and video.

Cloud Computing World Forum - May 8th in Johannesburg, South Africa. The Cloud Computing World Forum Africa is the only place to discuss the latest topics in cloud, including security, mobile, applications, communications, virtualization, CRM and much, much more.

Cloud Expo - June 11th-14th in New York, NY. Two unstoppable enterprise IT trends, Cloud Computing and Big Data, will converge in New York at the tenth annual Cloud Expo being held at the Javits Convention Center. A vast selection of technical and strategic General Sessions, Industry Keynotes, Power Panels, Breakout Sessions, and a bustling Expo Floor.

IEEE 32nd International Conference on Distributed Computing - June 18th-21st in Taipa, Macao. ICDCS brings together scientists and engineers in industry, academia, and government: Cloud Computing Systems, Algorithms and Theory, Distributed OS and Middleware, Data Management and Data Centers, Network/Web/P2P Protocols and Applications, Fault Tolerance and Dependability, Wireless, Mobile, Sensor, and Ubiquitous Computing, Security and Privacy.

Cloud Management Summit - June 19th in Mountain View, CA. A forum for corporate decision-makers to learn about how to manage today's public, private, and hybrid clouds using the latest cloud solutions and strategies aimed at addressing their application management, access control, performance management, helpdesk, security, storage, and service management requirements on-premise and in the cloud.

2012 Creative Storage Conference - June 26th in Culver City, CA. In association with key industry sponsors, CS2012 is finalizing a series of technology, application, and trend sessions that will feature distinguished experts from the professional media and entertainment industries.

Copyright 2008 Distributed Computing Industry Association
This page last updated April 13, 2012
Privacy Policy