Distributed Computing Industry
Weekly Newsletter


February 4, 2013
Volume XLII, Issue 8


Cloud-Based DAM Picks Up Momentum

Excerpted from Technorati Report by Geoff Simon

Early Friday, Wisconsin-based Widen Enterprises announced that its cloud-based digital asset management (DAM) solution, Media Collective, will be adopted by WAGO Corporation, a German company that manufactures components for electrical connection technology and electronic components for decentralized automation technology.

And the automation industry isn't the only one choosing cloud-based DAM solutions. From marketing and software to government and food, companies of all kinds are realizing the benefits of a more centralized digital media library.

Media Collective offers a centralized media library that companies like WAGO can use to manage all their digital media needs. In total, WAGO has over 18,000 individual products, each with numerous images and other digital media attached to it.

"A lot of other providers have similar kinds of things, but I think what really drew us to Widen more than anything else was how intuitive the system was for the user," said WAGO Marketing Communication Manager John Kenworthy. "For us, it's all about the experience for the customer."

And Widen has no shortage of experience: the 60-year-old company started in traditional premedia services like color management. As for user experience, version 6.3 of Media Collective is slated for release sometime this month and will include features like 5-star asset ratings, commenting, and other enhancements to asset collection.

There is also the ability to integrate online creative review and collaboration through ConceptShare, a popular tool for creative teams to manage works in progress.

With the ability to manage projects that center on visual components, this solution suits creative teams of all sizes looking to increase efficiency in managing media, as it allows for easy creation, management, and distribution. It also lets agencies work with clients to deliver assets on spec, because each party can see changes as they happen.

Report from CEO Marty Lafferty

We're very excited to announce the speakers and agenda for our upcoming 2013 CLOUD COMPUTING CONFERENCE at the NABShow, taking place on April 8th and 9th in Las Vegas, NV.

This year's conference has been extended from one to two full days, reflecting the increased importance of and growing interest in its subject matter.

Our 2013 event track will demonstrate the new ways cloud-based solutions are providing increased reliability and security, not only for commercial broadcasting and enterprise applications, but also for military and government implementations.

From collaboration during production, to post-production and formatting, to interim storage, delivery, and playback on fixed and mobile devices, to viewership measurement and big-data analytics, cloud computing is having an enormous impact on high-value multimedia distribution.

Experts will provide a senior management overview of how cloud-based solutions positively impact each stage of the content distribution chain.

DAY ONE will begin with an "Industry Update on Cloud Adoption."

How are cloud-based technologies currently being deployed throughout the audio/video (A/V) ecosystem? What file-based workflow strategies, products, and services are now working best?

A panel discussion with Dr. Frank Aycock, Appalachian State University; Jonathan Hurd, Altman Vilandrie; Rob Kay, Strategic Blue; and Patrick Lopez, Core Analysis will thoroughly examine this emerging market segment.

Next, we'll discuss "Outstanding Issues: Reliability & Security." What remaining pitfalls cause producers and distributors to resist migrating to the cloud? How are liability, predictability, privacy, and safety considerations being addressed?

Speaker Shekhar Gupta, Motorola Mobility, will introduce the topic. And then a panel with Lawrence Freedman, Edwards Wildman Palmer; Tom Gonser, Docusign; Jason Shah, Mediafly; and John Schiela, Phoenix Marketing International, will follow up with further discussion.

Then "Cloud Solutions for Content Creation" will be our subject. How is cloud computing being used for collaboration and other pre-production functions? What do dailies-screening and editing in the cloud offer the content production process?

Speaker Patrick MacDonald King, DAX, will explore this area first. And then a panel with Sean Barger, Equilibrium; Morgan Fiumi, Sferastudios; Rob Green, Abacast; and Brian Lillie, Equinix will continue our examination.

"Post-Production in the Cloud" will follow. What do cloud solutions bring to post-production functions such as animation and graphics generation? How are formatting, applying metadata, and transcoding improved by cloud computing?

Our DAY ONE Marquee Keynote Chris Launie of Disney will speak first.

Then a panel with Jim Duval, Telestream; Joe Foxton, MediaSilo; Jim Heider, RealEyes; and Bill Sewell, Wiredrive will delve into this topic in more detail.

Next, we'll discuss "Cloud-Based Multimedia Storage." How are data centers and content delivery networks (CDNs) at the edge evolving? What do business-to-business (B2B) storage solutions and consumer "cloud media lockers" have in common?

Speaker Jean-Luc Chatelain, DataDirect Networks, will address the topic first. And then a panel with Bang Chang, Xor Media; Tom Gallivan, Western Digital; Tom Leyden, Amplidata; and Douglas Trumbull, Trumbull Ventures, will follow up with further discussion.

DAY ONE will end with "Content Delivery from the Cloud." How is cloud computing being used to enable distribution and playback on multiple fixed and mobile platforms? What does the cloud offer to improve the economics of "TV Everywhere?"

Speaker Chris Rittler, Deluxe Digital Distribution, will explore this area first. And then a panel with Scott Brown, Octoshape; Brian Campanotti, Front Porch Digital; and Mike West, GenosTV will continue the examination.

DAY TWO will open with four cloud implementation case studies.

How was cloud computing used most successfully during 2012 in the multimedia content distribution chain? What lessons can be learned from these deployments that will benefit other industry players?

Case studies will be presented by Jason Suess, Microsoft; Michelle Munson, Aspera; Keith Goldberg, Fox Networks, and Ryan Korte, Level 3; and Baskar Subramanian, Amagi Media Labs. Then the presenters will join in a panel discussion.

Next, we'll look at "Changes in Cloud Computing." How is the cloud-computing industry changing in relation to content rights-holders? What new specialized functions-in-the-cloud, interoperability improvements, and standardization are coming this year?

David Cerf, Crossroads Systems; Margaret Dawson, Symform; Jeff Malkin, Encoding; and Venkat Uppuluri, Gaian Solutions will join in a panel.

Then The Open Group will lead a discussion of cloud standards.

"A Future Vision of the Cloud" will explore what to expect next. What do the latest forecasts project about the ways that cloud-computing solutions will continue to impact the A/V ecosystem over the long term? How will the underlying businesses that are based on content production and distribution be affected?

Panelists Lindsey Dietz, ODCA; John Gildred, SyncTV; Mike Sax, ACT; and Sam Vasisht, Veveo will join in the discussion.

"Military & Government Cloud Requirements" will follow. How do the needs of military branches and government agencies for securely managing multimedia assets differ from the private sector? What do these requirements have in common with commercial practices?

Michael Weintraub, Verizon, will speak first. Then Scott Campbell, SAP America; Fabian Gordon, Ignite Technologies; Linda Senigaglia, HERTZ NeverLost; and Alex Stein, Eccentex will go into more depth.

Next, we'll explore "Unique Cloud-Based Solutions." What are cloud solutions providers currently developing to address specific considerations of the intelligence community (IC) in fulfilling its missions? How will these approaches evolve and change during 2013?

DAY TWO Marquee Keynote Saul Berman of IBM, will address this area first.

Then Kris Alexander, Akamai; Rajan Samtani, Peer Media; Ramki Sankaranarayanan, PrimeFocus; and Dan Schnapp, Hughes Hubbard & Reed will continue this examination.

Four relevant cloud case studies will follow.

How is cloud computing being used to help securely manage sensitive multimedia? What lessons can be learned from these deployments that will benefit military and government organizations?

Grant Kirkwood, Unitas Global; Jack Pressman, Cyber Development Group International; Randy Kreiser, DataDirect Networks; and John Delay, Harris will present case studies.

These presenters will then join in a panel discussion.

The Conference Closing will tie back to the commercial sector. How do those involved in multimedia production, storage, and distribution leverage cloud-based solutions to their fullest potential? What resources are available for comparing notes and staying current on the latest developments?

Our closing session speakers will be Steve Russell, Tata Communications and Jeffrey Stansfield, Advantage Video Systems.

There are special discount codes for DCINFO readers to attend the NABShow. The code for $100 off conference registration is EP35. And the code for FREE exhibit-only registration is EP04. Share wisely, and take care.

Amazon Offers Cloud Video Conversion

Excerpted from Technorati Report by Rahul Manekari

Amazon has announced the general availability of a new video conversion service, Amazon Elastic Transcoder, which can be used to convert video and distribute it for viewing on multiple devices. With the cloud having proven itself as a home for nearly any kind of data, Amazon is taking serious steps to ensure that even heavyweight media can be seamlessly hosted in the cloud and streamed to any mobile or handheld device.

Amazon's video conversion service is new, while Microsoft already rules this corner of the cloud with its Windows Azure Media Services. Windows Azure has more flexibility in handling a variety of media formats, and beyond video conversion it also provides features like content protection, on-demand streaming, and live streaming.

Amazon Elastic Transcoder is still in beta and can be used from Macs, PCs, tablets, and Apple iPhone and Android smartphones. It offers a very handy GUI for conversion, where users can convert their video to the desired format in just a couple of clicks. Elastic Transcoder uses a transcoding pipeline queue that connects to the conversion engine and generates output for the desired device platform.

"You can easily get started by using the AWS Management Console or the API. System transcoding presets make it easy to get transcoding settings right the first time for popular devices and formats," according to the AWS website. One business with an app on the cloud platform had been operating on Windows Azure, but after the launch of AWS Elastic Transcoder it decided to move to AWS. "Windows Azure is a better solution for the Windows platform and .NET, but AWS has a wide range of platforms, including Linux distributions, and support for various languages," according to Writingtoserve.net.

"Content producers no longer need to worry about encoding multiple versions of their media to suit the growing number of devices viewers might be using. Instead, each transcoding session is performed in the cloud," wrote Chris Davies on the technology and gadget site Slashgear.

Users can get started with the Elastic Transcoder at the AWS management portal.
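For readers curious what "using the API" looks like in practice, here is a minimal sketch of assembling an Elastic Transcoder job request in the shape boto3's `elastictranscoder` client expects. The pipeline ID, object keys, and preset ID below are placeholders, not values from the article; a real call requires AWS credentials and an existing pipeline.

```python
# Sketch of submitting an Elastic Transcoder job via the AWS API.
# All IDs and keys below are illustrative placeholders.

def build_transcode_job(pipeline_id, input_key, output_key, preset_id):
    """Assemble request parameters matching boto3's
    elastictranscoder.create_job(**params) signature."""
    return {
        "PipelineId": pipeline_id,
        "Input": {"Key": input_key},    # source object in the pipeline's input bucket
        "Outputs": [{
            "Key": output_key,          # destination object in the output bucket
            "PresetId": preset_id,      # a system preset for the target device/format
        }],
    }

params = build_transcode_job(
    pipeline_id="1111111111111-abcde1",   # placeholder pipeline ID
    input_key="raw/movie.mov",
    output_key="web/movie.mp4",
    preset_id="1351620000001-000010",     # placeholder system preset ID
)
print(params["Outputs"][0]["Key"])

# With credentials configured, the job would be submitted like this:
# import boto3
# client = boto3.client("elastictranscoder", region_name="us-east-1")
# response = client.create_job(**params)
```

The system presets mentioned on the AWS site map to `PresetId` values here, which is what spares users from hand-tuning codec settings per device.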

IBM Expands Enterprise Cloud Services

Excerpted from InformationWeek Report by Charles Babcock

IBM has established a global network of eight data centers to supply infrastructure-as-a-service (IaaS) and is now offering SmartCloud Enterprise+ for running production workloads, including SAP applications.

Unlike Amazon's or Rackspace's IaaS, IBM's cloud offers customers System Z mainframes and System P servers running AIX. That makes SmartCloud more compatible with the workloads of IBM customers. Most competing services are based strictly on standard Intel or AMD x86 servers, and it remains difficult to migrate Z series or P series workloads to x86 servers.

IBM's IaaS data centers are located in Boulder, CO; Raleigh, NC; Montpellier, France; Ehningen, Germany; Sydney, Australia; Sao Paulo, Brazil; Tokyo; and Toronto. It will add a ninth at mid-year in Barcelona, Spain. The company has previously offered IaaS in the US and Europe. Offering greater geographic coverage often helps companies find a site in which to store their data: France, Germany, and Canada all have regulations on what data may leave the country and what may not.

Tuesday's announcement was the first time IBM said its Enterprise+ services, which offer individually tailored service level agreements, have been available in all major regions of the world.

Unlike Amazon, IBM will let customers choose to either manage their own workloads or let IBM do it, similar to a managed hosting service, for an additional fee. It will also gear the service to the degree of availability and security desired by the customer. SmartCloud Enterprise+ offers service levels that guarantee operating-system-instance availability from 98.5% to 99.9%.
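The quoted SLA range sounds narrow, but the two endpoints permit very different amounts of downtime. A quick back-of-the-envelope conversion, assuming a 30-day month, makes the gap concrete:

```python
# Convert an availability percentage into permitted downtime per month.
# Assumes a 30-day (720-hour) month for simplicity.

def downtime_hours_per_month(availability_pct, hours_in_month=30 * 24):
    """Hours of permitted downtime per month at a given availability level."""
    return hours_in_month * (1 - availability_pct / 100)

for pct in (98.5, 99.7, 99.9):
    print(f"{pct}% availability -> {downtime_hours_per_month(pct):.2f} hours/month down")
```

At 98.5% a provider may be down roughly 10.8 hours a month, versus about 43 minutes at 99.9%, which is why the tiered SLAs carry different price tags.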

Jim Comfort, general manager of IBM SmartCloud services, called it "a logical evolution of IBM's sourcing business." By giving customers a choice of servers, reliability, and security, IBM is "defining a new enterprise-grade cloud today," one that lets a company hand off the headaches of maintaining IT systems to IBM, he said.

"We don't believe in one size fits all," Craig Sowell, VP of Cloud Marketing, said. "Our service level agreements (SLAs) can be aligned to what the client is trying to achieve. We can provide different levels of management" of SAP apps, he added. Running SAP applications in the SmartCloud as either customer-managed or IBM-managed applications is a key part of IBM's strategy, said Sowell.

IBM manages the elasticity needed by the applications, as demand rises and falls, rather than the customer needing to learn the infrastructure and make decisions to support it.

By automating the provisioning of a standard SAP application, IBM can promise quick delivery of application services along with 99.7% availability, according to Sowell. The service is available for SAP Business Suite and SAP BusinessObjects, running in production environments.

IBM recently claimed it saw an 80% increase in its cloud revenues in 2012. Sowell said the company is well on its way toward achieving its planned $7 billion in cloud revenues by 2015, but IBM doesn't release how much revenue stems from particular sources, such as SmartCloud Enterprise+. Revenues also flow out of consulting to establish private cloud operations on customers' premises and implementation of standard IBM software, such as Tivoli and DB2, to power those operations.

Sowell said another new offering is IBM's Migration Services, which helps to determine which legacy workloads are best suited to be moved to SmartCloud Enterprise+. The migration path has been standardized and given automation assists, meaning a customer may see a return on investment in six to 18 months, he said.

TWC to Launch Cloud-Based UI on IP Boxes

Excerpted from CED Magazine Report by Mike Robuck

In this morning's fourth-quarter and year-end earnings conference call, Time Warner Cable President and COO Rob Marcus outlined some of the cable operator's key initiatives for this year, which included the rollout of a cloud-based user interface on IP set-top boxes and gateways.

Comcast is already out of the starting blocks with its cloud-based X1 platform, and Time Warner Cable made mention of its cloud-based user interface last year in an earnings call and in a Cable Show session. 

"Our cloud-based user interface running on IP set-top boxes (STBs) and next-gen DVRs will deliver the biggest change to the video experience that our customers have experienced in a decade," Marcus said this morning. 

"These are scheduled for introduction in the second half of this year." Also on the front burner for this year, Marcus said Time Warner Cable is looking at doubling its Wi-Fi hotspots — with an emphasis in New York City where it competes with Verizon — after adding around 10,000 access points last year. 

Both Marcus and Time Warner Cable CEO Glenn Britt touted the company's business services revenue, which grew 26 percent to $515 million in the fourth quarter and 29 percent to $1.9 billion for the year. 

Time Warner Cable added more than 1,500 new employees to its business services headcount, which was a 35 percent increase, and nearly doubled the number of commercial buildings that were connected to its fiber. 

"We posted organic growth of more than 20 percent again in 2012, powered by an expanded sales force, more buildings on net and new products," Marcus said. "We think we can achieve that kind of growth in 2013." In addition to business services, Marcus outlined five key accomplishments from last year that he said would provide a solid foundation for 2013:

Time Warner Cable enhanced the capacity of its network by starting its analog-to-digital conversion project. Time Warner Cable completed its rollout of DOCSIS 3.0 services last year and used the reclaimed bandwidth to increase speeds on its other data tiers, including a 50 percent increase to 15 Mbps for its Standard tier.

In November, Time Warner Cable opened the doors on its first national data center in Charlotte, which enabled it to consolidate its video sourcing and infrastructure for data, cloud and phone services and its internal enterprise system. 

"In addition, we built out our own CDN so we can efficiently deliver our managed IP video service without reliance on third parties," Marcus said. "These investments are already paying dividends, but just as importantly, they position us well for the long term."

In addition to the speed increases for existing tiers, Time Warner Cable launched 75 Mbps and 100 Mbps tiers in some markets and continued to upgrade its TWC TV apps. 

In December, Time Warner Cable, an early adopter of TV Everywhere services, added 4,000 on-demand assets to the iOS version of its TWC TV app, while offering as many as 300 channels via the apps to a range of devices in customers' homes.

"The really good news is our customers are starting to make use of and appreciate the apps," Marcus said. "In December, over three-quarters of a million customers used the TWC TV app, and used it almost four million unique times. We're now focused on adding out-of-home capabilities to the apps to make them even more valuable to customers."

On the customer service front, Time Warner Cable introduced one-hour service windows across most of its footprint and a 30-minute window for the first appointments of the morning in some areas, including New York City. 

Time Warner Cable is also experimenting with real-time appointments. Time Warner Cable has nearly doubled its self-installation rate; last month, almost 30 percent of its installations were performed by customers, which Marcus said increased customer satisfaction while reducing truck rolls.

The integration of the former Insight Communications system has gone smoothly, Marcus said, and Time Warner Cable expects to fully realize the $100 million annual run rate for synergies that it identified before closing on the deal last year.

The 2012 election drove a record year for Time Warner Cable's advertising sales. Time Warner Cable's fourth-quarter ad revenue increased 29 percent to $313 million, and full-year ad revenue rose 20 percent to $1.1 billion.

On the video subscriber front, Time Warner Cable lost about 126,000 subscribers in the fourth quarter, relatively flat compared with the 129,000 it lost in the same quarter a year ago. Marcus said the video subscriber numbers were disappointing after the company had hoped for a turnaround last year, but better retention efforts, along with better services, will hopefully reduce churn.

Time Warner Cable added 75,000 new data subscribers in the quarter and 34,000 telephony customers. "Underlying our 2013 priorities are greater focus and better and faster decision-making, and to that end, the organizational changes we announced last week marked the final step from decentralized geographical operating units to a more centralized structure that we're internally calling 'One Time Warner Cable,'" Marcus said.

"We expect that over time, this more streamlined organization will deliver new products faster and better, in addition to more reliable services. It will give our employees greater clarity in their roles and responsibilities, and, of course, we believe these changes will help us deliver the operational efficiency and profitable growth that our shareholders demand."

Interoute CloudStore Is "Game Changer" 

Excerpted from CloudTech Report by James Bourne

Cloud service provider Interoute has announced the launch of CloudStore, an enterprise app store which doubles as a compute and storage facility, hosting OS and databases, as well as business apps.

Interoute, known for its virtual data center and owning Europe's largest cloud platform, is further focusing its efforts on the enterprise — understandable, given that's where most of its client base lies — but featuring a much more rounded product than just hosting apps.

Counting Microsoft, Red Hat, and Ubuntu among its cast list, the Interoute CloudStore integrates several assets: an integrated network and compute-and-storage platform underneath a layer of appliances, from which secure platforms can be built.

The store also comes with an online knowledge center, designed to help enterprises choose a bespoke model to suit their business needs.

Interoute calls the solution a "game changer" and "the next generation of cloud computing; one that doesn't compromise on data sovereignty, privacy concerns or performance."

But what was the genesis of the CloudStore? Speaking to CloudTech, Interoute CTO Matthew Finnie said that the idea of 'we're just going to fill it up with apps' swiftly altered after consultation with clients.

"We very quickly realized with most of our customers — who are not consumers — it's all about getting the appliance integrated with the piece of infrastructure they can trust and use," said Finnie.

Finnie continued, "It's highlighting the idea that people have got very comfortable conceptually with the idea of 'I've got a smart-phone, I've got a tablet, I can download an app, it does something cool'. The problem for most people who run big infrastructure, internal IT or otherwise, is that their internal infrastructure is not designed to operate like that.

"So we've seen this year, a bunch of customers come to us and say 'the edge of my user base is so diverse, the amount of platforms at the edge is so diverse, I've just got to standardize everything'.

"For us, it's a big shift to the cloud, but it's one which is based on solid, sensible, rational thinking."

The network and underlying infrastructure is pivotal to Interoute's ethos, and it's an often overlooked facet of cloud computing, according to Finnie.

"The biggest challenge for most people with cloud computing isn't necessarily computing — it's always the network," noted Finnie. "It's the one thing that the big incumbent cloud computing providers kind of ignore.

"It's under that list of hard problems to solve, but for someone like us it's a natural bread and butter thing."

Of course, from VMware, to Amazon, and even the UK government's GCloud, there are plenty of cloudy app stores in the market — but Interoute's offering is a more integrated solution. Does this change the app store paradigm?

Telefonica US Partner Program for Cloud Services

Excerpted from FierceMobileIT Report by Fred Donovan

Telefonica today announced its first ever US-based Channel Partner Program, giving resellers and partners the opportunity to offer a range of solutions including connectivity, cloud, and machine-to-machine (M2M) platform services, from Telefonica USA and Telefonica Digital, the digital innovation arm of Telefonica.

Telefonica has more than 314 million corporate and domestic customers across North America, Europe, and Latin America, while serving primarily Fortune 500 companies in the US. This Channel Partner Program is the first opportunity for resellers to access the wide range of state of the art services offered by Telefonica.

With one of the US's largest data centers, located in Miami, FL, Telefonica's cloud services are ideal for companies operating in North America and Latin America. In particular, Telefonica's Instant Servers provides infrastructure-as-a-service (IaaS), delivering on-demand, high-performance cloud computing for developers, digital businesses, and large enterprises. The service is optimized for mobile, enterprise, and M2M applications and backed by Telefonica's world-class infrastructure management and service capabilities.

The Channel Partner Program has been designed to assist partners through all stages, from implementation to post-sales. Partners will have full access to the online portal that provides marketing materials, training and tools to enhance the sales and service process.

There are multiple levels of partnership from which to choose, including full technology partners, becoming an agent, or serving as a referring partner. Each program has a unique compensation structure.

Telefonica USA CEO, Marcelo Caputo said, "This is one of the few programs that allows you to sell a wide range of top-notch telecommunications and value-added solutions, from International Services to Data Center and Cloud Solutions, on a worldwide basis. Together with our dedicated innovation arm, Telefonica Digital, we are constantly evolving state of the art services in high demand areas so our partners have attractive and differentiated offerings. Being able to offer the full program in such an important market as the US allows us to significantly expand our footprint."

Telefonica Digital is a global business division of Telefonica. Its mission is to seize the opportunities within the digital world and deliver new growth for Telefonica through research & development, venture capital, global partnerships, and digital services such as cloud computing, mobile advertising, M2M, and eHealth.

It is also driving innovation in over the top communications under a new umbrella brand called TU and in Big Data through Telefonica Dynamic Insights. Telefonica Digital will deliver these new products and services to Telefonica's 314 million customers as well as entering new markets. It is headquartered in London with regional centers in Silicon Valley, Sao Paulo, Spain and Tel Aviv. Jajah, TokBox, Terra, Media Networks Latin America, 48, and giffgaff are all managed under the Telefonica Digital umbrella.

SAP Expands Partner Program for Cloud Apps

Excerpted from InfoWorld Report by Chris Kanaracus

SAP's burgeoning portfolio of cloud-based applications has prompted it to make some changes to its PartnerEdge program.

More than 500 partners who have relationships with SuccessFactors, the cloud human capital management (HCM) vendor SAP finished acquiring last year, will be added to the program during the course of 2013, SAP said this week.

Partners "now have the opportunity to tackle the fast-growing cloud market and offer best-of-breed cloud solutions and suites to their installed base customers and prospects," SAP said.

SAP is also trying to make life easier for partners on the contractual side of things. A single deal will allow partners to sell all of SAP's cloud offerings, rather than requiring separate agreements for each one, according to the statement.

PartnerEdge also includes the usual sort of things involved with partner programs, such as technical support, certifications and training, and demand generation services.

"SAP's always had a good partner program," said analyst Ray Wang, CEO of Constellation Research. "Extending it into the cloud should help with adoption."

In fact, SAP's stated goal is to generate 40 percent of its revenue through indirect sales by 2015.

But in the long run, SAP will need to build out a more comprehensive platform-as-a-service (PaaS) offering, according to Wang. "What they have now allows a light build. Partners will want a longer term development kit that's easier to use than classic SAP dev tools."

SAP has been doing just that, using the HANA in-memory platform as a foundation for a next-generation PaaS.

However, SAP's relative newness to the PaaS market means it could take some time for its offering to be widely adopted by customers and partners, particularly to the degree of rival platforms such as Salesforce.com's Force.com.

PaaS can be seen as a multiplying force for a vendor like SAP, since it creates a community of developers and systems integrators with a shared interest in its success.

WANdisco Wins Patent for Distributed Computing

WANdisco, a provider of high-availability software for global enterprises to meet the challenges of Big Data and distributed software development, announced that US Patent and Trademark Office (USPTO) patent number 8,364,633 will issue today.

The patent, entitled "Distributed computing systems and system components thereof," claims the fundamentals of active-active replication over a Wide Area Network ("WAN"). The patented technology is incorporated in WANdisco's product range, including Subversion MultiSite, which is used by leading companies such as HP, Intel, Barclays, McAfee, Honda and Wal-Mart for globally distributed software development, and will be incorporated into the company's upcoming Big Data products. Wikibon forecasts the Big Data market to grow at a CAGR of 58% between now and 2017, from $5 billion to over $50 billion.

"This patent, combined with our recent AltoStor acquisition, highlights WANdisco's unique ability to meet the non-stop availability, scalability and performance needs of enterprises," said David Richards, CEO of WANdisco.

"Our patented active-active replication technology is an outstanding differentiator in the software development market, something no other company can achieve over the Wide Area Network. We're excited to port this patented technology to Big Data and have already received very positive feedback. I'm proud of the innovative team we have at WANdisco and look forward to exciting days ahead."

The underlying patent application was allowed by the USPTO in November 2012, after a thorough substantive examination. In accordance with standard Patent Office procedures, following the timely payment of the Issue Fee, the application was assigned US patent number 8,364,633 and scheduled for issuance and publication on January 29, 2013.

This patent will be enforceable for a period of 20 years from filing, plus 912 days (an adjustment period attributed to a delay incurred by the USPTO and added to the patent enforceability term). Therefore, the patent will be enforceable until July 11, 2028.
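The stated expiration date can be checked with a little date arithmetic. The article does not give the filing date; the January 11, 2006 value below is simply the date implied by working the stated term backward, used here for illustration.

```python
from datetime import date, timedelta

# Filing date implied by the stated term (not given in the article):
# 20 years plus 912 days forward from it should land on July 11, 2028.
filing = date(2006, 1, 11)

# Add the 20-year statutory term, then the 912-day USPTO adjustment.
statutory_expiry = filing.replace(year=filing.year + 20)   # 20 years from filing
adjusted_expiry = statutory_expiry + timedelta(days=912)   # plus the adjustment

print(adjusted_expiry)  # 2028-07-11
```

The 912-day adjustment alone pushes enforceability out by roughly two and a half years beyond the ordinary 20-year term.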

Huawei Finds Favor at CERN: UDS Cloud Storage

Excerpted from GigaOM Report by David Meyer

Huawei has become an official partner of CERN openlab, with the physics research facility giving the thumbs-up to the Chinese firm's exascale-targeting, mass object-based storage infrastructure.

China's Huawei may find business tough in the US due to suspicions over its motives, but its cloud efforts are clearly appreciated elsewhere. A year after it started working with CERN on cloud storage — something of a priority for a research organization that generates more than 25 petabytes of physics data each year — Huawei has become an official CERN openlab partner, with at least three more years' collaboration now assured.

The new arrangement was announced on Thursday, along with confirmation of Russia's Yandex becoming an openlab associate in the field of data processing. Huawei's involvement is a bigger deal than that, as it puts the Chinese firm on a par with Intel, HP, Oracle and Siemens, all of which work particularly closely with CERN to see how their technologies can help with the Large Hadron Collider experiments.

In Huawei's case, the company is contributing its self-healing UDS cloud storage system for use and validation. UDS is targeting the upcoming exascale (an exabyte is roughly a million terabytes) era with a mass object-based storage infrastructure that uses ARM's energy-efficient processor architecture alongside cheap SATA disks. It also offers Amazon S3 API compatibility and claims eleven-nines (99.999999999 percent) reliability, so users theoretically don't need to back up data stored in a UDS-toting cloud.
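The eleven-nines claim is worth unpacking: it means the probability of losing any given stored object in a year is about one in a hundred billion. A small illustrative calculation (the object count is an invented example, not a Huawei figure):

```python
# What "eleven nines" (99.999999999%) of annual reliability implies.
# The durability figure comes from the article; the billion-object
# store is a hypothetical example for scale.
durability = 0.99999999999           # 11 nines
loss_probability = 1 - durability    # chance a given object is lost in a year

objects_stored = 1_000_000_000       # hypothetical: one billion objects
expected_losses = objects_stored * loss_probability

print(f"P(loss per object per year): {loss_probability:.0e}")
print(f"Expected losses per billion objects per year: {expected_losses:.2f}")
```

In other words, a billion-object store would expect to lose about one object per century, which is why Huawei argues separate backups become unnecessary.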

UDS provides a bit of insight into how openlab works. Huawei first delivered a 384-node version of UDS to CERN in early 2012, after which the researchers played around with it for three months. In September of that year, Huawei released UDS to the general enterprise market (in more normal eight-node configurations). The benefits for both sides of this partnership are clear: CERN has to push technological limits in order to handle the very big data generated by the LHC, and Huawei gets both valuable feedback from the researchers and a glowing report card to show off to the wider world.

As for the next steps in this partnership, CERN has now hired two computer scientists to work with Huawei on its implementation there, and more UDS storage systems will be deployed at the Swiss facility in the next few months.

PaaS & IaaS: Rising Champs of Cloud Computing

Excerpted from CloudTweaks Report by Arthur Nichols

In the cloud conversation, platform-as-a-service (PaaS) and infrastructure-as-a-service (IaaS) come up far less often than the famed software-as-a-service (SaaS). This is not surprising when you consider that a world already populated with built platforms and infrastructure needs only to operate on them.

However, offering platforms and infrastructure through the cloud has been a boon to the software development field, especially with the recently growing push toward increased collaboration between developers and admins, commonly referred to as development operations (DevOps).

Though it is easy to confuse the two, it may be best to view PaaS as a layer built on IaaS that carries fewer infrastructure responsibilities for the customer. Specifically, choosing PaaS over IaaS moves the responsibility of managing the database, runtime, and middleware from the organization to the vendor of the service.
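The responsibility split described above can be sketched as a simple mapping. This is an illustrative simplification, not any vendor's official matrix, and layer names follow common usage:

```python
# A simplified sketch of the IaaS vs. PaaS responsibility split.
# The exact boundaries vary by vendor; this just illustrates the
# shift of runtime, middleware, and database duties to the vendor.
STACK = ["application", "data", "runtime", "middleware", "database",
         "os", "virtualization", "servers", "storage", "networking"]

VENDOR_MANAGED = {
    "iaas": {"virtualization", "servers", "storage", "networking"},
    "paas": {"virtualization", "servers", "storage", "networking",
             "os", "runtime", "middleware", "database"},
}

def customer_managed(model: str) -> list:
    """Layers the customer still operates under a given service model."""
    return [layer for layer in STACK if layer not in VENDOR_MANAGED[model]]

print("IaaS customer manages:", customer_managed("iaas"))
print("PaaS customer manages:", customer_managed("paas"))
```

Under PaaS, only the application and its data remain with the customer, which is exactly why the model appeals to development teams.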

PaaS and, to a lesser degree, IaaS are a direct route via the cloud to facilitating the work of software developers, though IaaS may be more desirable in cases where an organization has a suitable existing infrastructure and prefers more control over those elements.

Over recent years, the rise of DevOps has highlighted a push toward agile development, a model that favors frequent incremental updates to software rather than the traditional waterfall method. Cloud computing and virtualization enhance this approach by speeding processes and increasing flexibility. In a quickly evolving business environment, agility is a most sought-after quality.

Demand for agility requires a streamlined approach that lowers costs in terms of manpower and time. As such, many large companies have taken to automation in the cloud by employing application deployment software that takes care of application delivery and updates. In order to understand the path application development (AD) is on and how this gives rise to PaaS and IaaS, it is best to look at the market as a whole.

According to Gartner's principal research analyst, Asheesh Raina, "Application modernization and increasing agility will continue to be a solid driver for AD spending, apart from other emerging dynamics of cloud, mobility and social computing." Cloud and mobile technologies are also cited as largely driving the direction of AD.

The prolific use of smartphones and tablets has led to widespread adoption of Bring Your Own Device (BYOD) to increase efficiency in the workplace. As such, much of the AD market is expected to focus on mobile devices. Gartner predicted that applications developed for mobile devices would outnumber those developed for PCs by a 4:1 ratio by 2015.

The vast majority of AD is expected to happen in the cloud in coming years as well, largely thanks to BYOD. Even development that takes place on premises is largely aimed at being cloud-ready. It is undeniable that AD will be central to the growth of cloud computing.

The Bureau of Labor Statistics' projections of employment growth through 2020 show growth in software-development-related jobs at approximately twice the national average. According to Forrester VP and Principal Analyst John Rymer, "PaaS holds the key to the full use of cloud computing." This is because applications are the driving force of our technological advancement into cloud computing, and PaaS offers "easy development, fast deployment, focus on your business problem, reach, elasticity, and self-service," Rymer says.

Still, PaaS has only recently gained significant recognition as a cloud computing category. It is built on IaaS, so it stands to reason that it has come into play only as we delve deeper into the cloud. Businesses have to take a pragmatic approach to emerging technologies and IaaS is often a logical step in migrating an existing application stack into the cloud.

Questions and concerns have to be addressed before businesses become deeply invested. However, just as with cloud computing in general, businesses grow increasingly comfortable with PaaS as they attempt to take advantage of its proposed benefits. Given the growth of industries that stand to reap the benefits of IaaS and PaaS, early projections of these services' growth may yet prove modest.

Cloud Code Speeds Processing of Microscopy Data

Excerpted from Scicast Report

Microscopic images of podosomes, cellular structures thought to be involved in cancer, show the differences in clarity produced by conventional microscopic techniques and super-resolution imaging.

Salk researchers have developed a method of using cloud computing to significantly reduce the time needed to process super-resolution images. They have published a how-to for biologists: code for Amazon's cloud that significantly reduces the time necessary to process data-intensive microscopic images.

The method promises to speed research into the underlying causes of disease by making single-molecule microscopy of practical use for more laboratories.

"This is an extremely cost-effective way for labs to process super-resolution images," said Hu Cang, Salk Assistant Professor in the Waitt Advanced Biophotonics Center and coauthor of the paper. "Depending on the size of the data set, it can save over a week's worth of time."

The latest frontier in basic biomedical research is to better understand the "molecular machines" called proteins and enzymes. Determining how they interact is key to discovering cures for diseases. Simply put, finding new therapies is akin to troubleshooting a broken mechanical assembly line: if you know all the steps in the manufacturing process, it's much easier to identify the step where something went wrong. In the case of human cells, some of the parts of the assembly line can be as small as single molecules.

Unfortunately, in the past conventional light microscopes could not clearly show objects as small as single molecules. The available alternatives, such as electron microscopy, could not be effectively used with living cells.

In 1873, German physicist Ernst Abbe worked out the mathematics to improve resolution in light microscopes. But Abbe's calculations also established the optical version of the sound barrier: the diffraction limit, an unavoidable spreading of light. Think of how light fans out from a flashlight.

According to the Abbe limit, it is impossible to see the difference between any two objects if they are smaller than half the wavelength of the imaging light. Since the shortest wavelength we can see is around 400 nanometers (nm), that means anything 200 nm or below appears as a blurry spot. The challenge for biologists is that the molecules they want to see are often only a few tens of nanometers in size.
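The arithmetic behind the Abbe limit is simple enough to sketch. The half-wavelength form below matches the article's simplification; the full formula also divides by the objective's numerical aperture, which is treated here as 1.0 by assumption:

```python
# The Abbe diffraction limit: two point sources closer together than
# about half the wavelength of the imaging light cannot be resolved.
# Full form: d = wavelength / (2 * NA); NA defaults to 1.0 to match
# the simplified half-wavelength statement in the article.
def abbe_limit_nm(wavelength_nm: float, numerical_aperture: float = 1.0) -> float:
    """Smallest resolvable separation, in nanometers."""
    return wavelength_nm / (2 * numerical_aperture)

# The shortest visible wavelength (~400 nm) gives the ~200 nm limit
# cited above, while molecules of interest are only tens of nm across.
print(abbe_limit_nm(400.0))   # 200.0
```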

"You have no idea how many single molecules are distributed within that blurry spot, so essential features and ideas remain obscure to you," says Jennifer Lippincott-Schwartz, a Salk non-resident fellow and coauthor on the paper.

In the early 2000s, several techniques were developed to break through the Abbe Limit, launching the new field of super-resolution microscopy. Among them was a method developed by Lippincott-Schwartz and her colleagues called Photoactivated Localization Microscopy, or PALM.

PALM, and its sister techniques, work because mathematics can see what the eye cannot: within the blurry spot, there are concentrations of photons that form bright peaks, which represent single molecules. The downside to these approaches is that it can take several hours to several days to crunch all the numbers required just to produce one usable image.
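The core localization idea can be illustrated with a toy example. Real PALM software fits a Gaussian point-spread function to each peak; the intensity-weighted centroid below is a deliberately simplified stand-in, and the photon-count grid is invented:

```python
# Toy illustration of single-molecule localization: within a blurry,
# diffraction-limited spot, the photon counts form a peak whose center
# can be estimated far more precisely than the spot's width. Real PALM
# code fits a Gaussian; this uses a simple intensity-weighted centroid.
def localize(spot):
    """Estimate (row, col) of the emitter from a 2-D photon-count grid."""
    total = sum(sum(row) for row in spot)
    r = sum(i * sum(row) for i, row in enumerate(spot)) / total
    c = sum(j * v for row in spot for j, v in enumerate(row)) / total
    return r, c

# A symmetric 5x5 "blurry spot" whose photon peak sits at pixel (2, 2).
spot = [
    [0, 1, 2, 1, 0],
    [1, 3, 5, 3, 1],
    [2, 5, 9, 5, 2],
    [1, 3, 5, 3, 1],
    [0, 1, 2, 1, 0],
]
print(localize(spot))   # (2.0, 2.0)
```

Repeating such a fit for millions of blinking molecules is what makes the reconstruction so computationally expensive.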

"It's like taking a movie, then you go through some very complex math, so what you see is the end result of processing, which is extremely slow because there's so many parameters," Cang says. "When I first saw PALM, I was shocked by how good it was. I wanted to use it right away, but when I actually tried to use it, I found its usefulness was limited by computing speed."

Even using statistical shortcuts, processing these images was still so intense that a supercomputer was required to reduce the time to a practical level. "Calculating an area of 50 pixels can take nearly a full day on a state-of-the-art desktop computer," says Lippincott-Schwartz. "But what you'll have achieved is the difference between a guess and a definitive answer."

In their Nature Methods paper, the researchers offer other scientists the tools they need to use an easier alternative: the Amazon Elastic Compute Cloud (Amazon EC2), a service that provides access to supercomputing via the Internet, allowing massive computing tasks to be distributed over banks of computers.

To make PALM more practical for use in biomedical research, the team wrote a computer script that allows any biologist to upload and process PALM images using Amazon Cloud.

As a demonstration, Cang, Lippincott-Schwartz and post-doctoral researcher Ying Hu reconstructed the images of podosomes, which are molecular machines that appear to encourage cancer cells to spread. In one instance, they dropped the time needed to process an image from a whole day to 72 minutes. They also imaged tubulin, a protein essential for building various structures within cells. In that case, they were able to drop the time from nine days to under three and a half hours.
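The speedups above come from the fact that the image can be split into independent tiles and farmed out to many machines at once. The Salk script targets Amazon EC2; the sketch below uses Python's thread pool as a local stand-in, and `process_tile` is a placeholder for the expensive per-tile reconstruction:

```python
# Minimal sketch of the divide-and-conquer pattern behind the cloud
# speedup: independent image tiles processed in parallel. In the paper
# the workers are EC2 instances; here a thread pool stands in, and
# process_tile is a placeholder, not the actual PALM code.
from concurrent.futures import ThreadPoolExecutor

def process_tile(tile_id: int) -> int:
    # Placeholder for per-tile peak fitting; just burns some CPU so the
    # example runs, and returns a value tagged with the tile id.
    return sum(i * i for i in range(1000)) + tile_id

def reconstruct(tile_ids, workers: int = 4):
    # Tiles are independent, so throughput scales with worker count
    # (up to hardware and coordination limits).
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(process_tile, tile_ids))

results = reconstruct(range(8))
print(len(results))   # 8
```

With thousands of tiles and hundreds of cloud nodes, the same pattern turns a nine-day job into an afternoon, as the podosome and tubulin examples show.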

Their new paper provides a how-to tutorial for using the code to process PALM images through Amazon's cloud, helping other labs achieve similar increases in speed.

Staggering Revelations About Big Data

Excerpted from Baseline Report by Dennis McCafferty

It is the stuff of e-documents, blogs, Tweets, news footage, and fan forums. It is also a massive collection of family photos, music recordings, corporate webinars, and podcasts, and, yes, Angry Birds and homemade videos of kittens. This and so much more continue to shape what we now know as big data.

And big data is only going to get bigger and bigger, according to a recent IDC report, "Big Data, Bigger Digital Shadows, and the Biggest Growth in the Far East." The report conveys staggering numbers about the size of our digital universe. Even more telling are details that reveal the wealth of good data out there that remains unexploited by organizations for business purposes.

The report is sponsored by EMC, a cloud-based provider of information storage, management, protection, and analytics services.

(Note: The findings measure big data in a unit of measurement known as the zettabyte [ZB], which equals 1,000 exabytes. That's enough to store 250 billion DVDs, according to an estimate from Cisco.)
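The units in that note hang together: a zettabyte is 10^21 bytes, and Cisco's 250-billion-DVD estimate implies about 4 GB per disc, consistent with a standard single-layer DVD's roughly 4.7 GB capacity. A quick check of the arithmetic:

```python
# Sanity-checking the units in the note above: 1 ZB = 1,000 EB = 10^21
# bytes. Cisco's 250 billion DVDs per ZB implies ~4 GB per DVD, close
# to a single-layer disc's ~4.7 GB capacity.
zb_in_gb = 10**21 / 10**9           # 1 ZB expressed in gigabytes
dvds_per_zb = 250_000_000_000
gb_per_dvd = zb_in_gb / dvds_per_zb
print(gb_per_dvd)   # 4.0
```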

Airmobs P2P Wi-Fi Tethering Creates Hotspot Market

Excerpted from Talk Android Report by Jeff Causey

Ever been in one of those locations where your data service has disappeared, but you notice someone on another network chugging away? It would be nice if they could create a mobile hotspot that you could jump on to use instead of resorting to roaming or just doing without service.

Later on you could return the favor to someone else. The Viral Spaces research group at MIT is hoping to address these types of situations with the creation of a new community-based peer-to-peer (P2P) Wi-Fi tethering market running on an app they have dubbed Airmobs.

The concept is fairly simple. When you have network access, you fire up the app and let people use your mobile hotspot for their own connection needs. As you do so, the app starts to credit your account. You can then use those credits to "borrow" someone else's connection when the roles are reversed.
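The share-to-earn, spend-to-borrow mechanic described above amounts to a simple credit ledger. Airmobs' actual accounting scheme is not described in the article, so the one-credit-per-megabyte rate and the class below are purely illustrative:

```python
# Toy sketch of the Airmobs credit idea: serving data through your
# hotspot earns credits, borrowing someone else's connection spends
# them. The 1 credit-per-MB rate is an invented assumption.
class CreditLedger:
    def __init__(self):
        self.credits = 0.0

    def share(self, megabytes: float) -> None:
        """Earn credits for data served through your own hotspot."""
        self.credits += megabytes

    def borrow(self, megabytes: float) -> bool:
        """Spend credits to use someone else's hotspot; refuse if short."""
        if megabytes > self.credits:
            return False
        self.credits -= megabytes
        return True

ledger = CreditLedger()
ledger.share(100.0)                          # tethered 100 MB for others
print(ledger.borrow(40.0), ledger.credits)   # True 60.0
print(ledger.borrow(100.0))                  # False: not enough credit
```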

The app has controls built in to check for battery life, signal strength, motion, and any data limits you may have set to minimize the impact on your device and your data plan.

One question remains, though, and that is carrier reaction to the app. The app's creator, Eyal Toledano, recognizes that carriers would probably not welcome users running their own mobile hotspot markets, which is why he is hesitant to release Airmobs to the Google Play Store.

House Blocks Spotify Citing P2P Security Concerns

Excerpted from The Verge Report by Amar Toor

Concerns over data security have led the House of Representatives to ban the use of Spotify among its members. As Politico reports, the music streaming service appears to have run afoul of an older ban on peer-to-peer (P2P) technology. Implemented during the Napster era, the rule was originally designed to guard against unauthorized file sharing and to prevent malware from infecting House computers.

"To help protect House data, our IT policy generally prohibits the use of P2P technologies while operating within the secure network," a spokesman for the Office of the Chief Administrative Officer (CAO) told Politico. "While Spotify is currently not authorized, the CAO has and will continue to work with outside vendors to enable the popular services that improve member communication capabilities."

"Music is a common language that all political parties speak."

"It is a sad day when a few bureaucrats can block our nation's leadership from enjoying free, secure access to over 20 million songs," a Spotify spokesman said. "Music is a common language that all political parties speak and should be used to bring the legislators of this great country together so they can solve the serious issues facing our nation." The spokesman also pointed out that both Barack Obama and Mitt Romney used Spotify as part of their voter outreach efforts during the 2012 presidential campaign.

"We truly hope the House of Representatives will see the error of their ways and stop blocking Spotify so that all of America can benefit from their collective joy of music," the company added.

Interestingly enough, even the RIAA thinks the House's ban is unjust, as Chairman and CEO Cary Sherman explained in a letter to the CAO last week. "These services are safe and secure, and assuring access to them not only respects the contractual relationship users may have with these services, but also achieves an important public policy goal of promoting legal, safe digital providers," Sherman wrote.

"We appreciate your need to ensure that the House network is secure, and we would welcome the opportunity to work with you to develop a new policy that ensures that users of the House network will be able to gain access to these new legal services."

Coming Events of Interest

2013 Symposium on Cloud and Services Computing - March 14th-15th in Tainan, Taiwan. The goal of SCC 2013 is to bring together researchers, developers, government sectors, and industrial vendors that are interested in cloud and services computing.

NAB Show 2013 - April 4th-11th in Las Vegas, NV. Every industry employs audio and video to communicate, educate and entertain. They all come together at NAB Show for creative inspiration and next-generation technologies to help breathe new life into their content. NAB Show is a must-attend event if you want to future-proof your career and your business.

CLOUD COMPUTING CONFERENCE at NAB - April 8th-9th in Las Vegas, NV. Explore the new ways cloud-based solutions have achieved better reliability and security for content distribution. From collaboration and post-production to storage, delivery, and analytics, decision makers responsible for accomplishing their content-related missions will find this a must-attend event.

Digital Hollywood Spring - April 29th-May 2nd in Marina Del Rey, CA. The premier entertainment and technology conference. The conference where everything you do, everything you say, everything you see means business.

CLOUD COMPUTING EAST 2013 - May 20th-21st in Boston, MA. CCE:2013 will focus on three major sectors, GOVERNMENT, HEALTHCARE, and FINANCIAL SERVICES, whose use of cloud-based technologies is revolutionizing business processes, increasing efficiency and streamlining costs.

P2P 2013: IEEE International Conference on Peer-to-Peer Computing - September 9th-11th in Trento, Italy. The IEEE P2P Conference is a forum to present and discuss all aspects of mostly decentralized, large-scale distributed systems and applications. This forum furthers the state-of-the-art in the design and analysis of large-scale distributed applications and systems.

CLOUD COMPUTING WEST 2013 — October 27th-29th in Las Vegas, NV. Three conference tracks will zero in on the latest advances in applying cloud-based solutions to all aspects of high-value entertainment content production, storage, and delivery; the impact of cloud services on broadband network management and economics; and evaluating and investing in cloud computing services providers.

Copyright 2008 Distributed Computing Industry Association
This page last updated February 10, 2013
Privacy Policy