Distributed Computing Industry
Weekly Newsletter

In This Issue

Partners & Sponsors

ABI Research

Acolyst

Amazon Web Services

Apptix

Aspiryon

Axios Systems

Clear Government Solutions

CSC Leasing Company

CyrusOne

FalconStor

General Dynamics Information Technology

IBM

NetApp

Oracle

SoftServe

Trend Micro

VeriStor

VirtualQube

Cloud News

CloudCoverTV

P2P Safety

Clouderati

eCLOUD

fCLOUD

gCLOUD

hCLOUD

mCLOUD

Industry News

Data Bank

Techno Features

Anti-Piracy

July 21, 2014
Volume XLVIII, Issue 12


Plan Now to Attend CLOUD DEVELOPERS SUMMIT & EXPO 2014

The DCIA & CCA are very pleased to welcome SoftServe as a Gold Sponsor of the upcoming CLOUD DEVELOPERS SUMMIT & EXPO (CDSE:2014) in Austin, TX on October 1st and 2nd.

Founded in 1993, SoftServe is now a leading global provider of high-quality software development, testing, and technology consulting services. The company combines its unmatched experience with best practices to deliver innovative SaaS/cloud, mobility, and SDLC solutions that help its customers drive their businesses and differentiate themselves within their markets. SoftServe has successfully completed over 2,500 projects for more than 150 global companies.

Representing SoftServe as CDSE:2014 speakers will be Neil Fox, Executive Vice President & Chief Technology Officer, and Roman Pavlyuk, Solutions Service Product Manager, Enterprise Technology.

Neil brings more than 30 years of technology leadership to SoftServe, having served as Chief Information Officer, Vice President of Software Development, CEO, and COO at companies including TRW, Macromedia, Management Recruiters International, PeopleFlow, Lawson Software, and Red Hat. For the past decade, he provided a variety of consulting services to leading global companies as VP of Consulting and Chief Innovation Officer with Symphony Services and Ness Software Engineering Services.

Roman has over ten years of experience implementing operations practices for major IT deployments. He is deeply familiar with integration of ITIL practices, open source, Agile, and SaaS operations automation. At SoftServe, Roman leads teams that provide configuration and deployment management for thousands of production and staging instances for several SaaS providers across multiple IaaS platforms. He is also the product owner for an operations automation framework developed from open source tools.

The DCIA & CCA are thrilled also to announce this week that Dell will have a major role at CDSE:2014 with a keynote presentation and two workshops.

Cloud computing not only improves business processes and operational efficiency — it reinvents the role of IT. And when aligned with organizational strategy, it can give you a competitive edge. Contact a Dell expert to find out how Dell can help simplify your path to the cloud. Backed by a decade of experience building cloud environments, Dell matches business needs with the right secure, enterprise-class solution.

Access secure, scalable as-a-service offerings for infrastructure, data, applications and email — anytime, anywhere and from any device. Simplify deployment and management of infrastructure and workloads with Dell cloud-enabled servers, storage, networking and software. Operate, monitor, and manage your cloud infrastructure across legacy, traditional and virtual-based infrastructures.

Keynoting on "The Current State of Private Cloud" will be Michael Elliott, Enterprise Cloud Evangelist. James Urquhart, Director of Product, Cloud Management Systems, will lead a workshop on "Managing Hybrid Cloud Environments — The New Reality for Delivering IT as a Service," and Barton George, Director, Developer Programs, Dell Services, will lead a workshop on "Digital Transformation and the Essential Role of Cloud and Developers."

Michael has over 20 years of enterprise technology experience. As Dell's Enterprise Cloud Evangelist, he consults with companies throughout North America on their cloud architecture and represents Dell at industry conferences. Michael started his career as a mainframe programmer for General Electric and served as an adjunct professor of marketing at the University of Akron. He holds a mathematics degree from the University of Cincinnati and an MBA from Pennsylvania State University.

James is a seasoned field technologist with 20 years of experience in service-oriented architectures, cloud computing, and virtualization. He has been named one of the ten most influential people in cloud computing by both the MIT Technology Review and The Next Web. James joined Dell Software through the acquisition of Enstratius in May 2013. Prior to Enstratius, he held leadership roles at Cisco Systems, Forte Software, and Sun Microsystems. James is also a popular contributing author at GigaOm, and graduated from Macalester College with a Bachelor of Arts in Mathematics and Physics.

Barton is the Director of Developer Programs in Dell Services and founder/leader of Project Sputnik, a client-to-cloud platform for developers. Before Dell, Barton spent 13 years at Sun Microsystems in a variety of roles, the last as Open Source Evangelist and driver of Sun's Linux strategy. He began his professional career with a four-year stint in Tokyo, where he worked for Sony with ISVs for its UNIX-based workstation. Born and raised in Honolulu, Hawaii, he headed east for higher education, attending Williams College and Harvard Business School.

According to the research firm IDC, cloud computing was an estimated $47.4 billion industry in 2013 and is expected to more than double by 2017. The cloud's 23.5% compound annual growth rate is five times faster than that of the broader technology market.
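
A quick back-of-the-envelope check, sketched here in Python using only the figures quoted above, confirms the arithmetic: compounding 23.5% annually for four years more than doubles the 2013 base.

    base_2013 = 47.4              # $ billions, IDC's 2013 estimate
    cagr = 0.235                  # 23.5% compound annual growth rate
    projected_2017 = base_2013 * (1 + cagr) ** 4
    print(f"2017 projection: ${projected_2017:.1f}B")  # ~$110.3B, more than double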

At CDSE:2014, highly focused business strategy and technical keynotes, breakout panels, workshops, and seminars will thoroughly explore cloud computing solutions and offerings, and ample opportunities will be provided for one-on-one networking with the major players in this space.

Register now for CDSE:2014 to take advantage of early-bird rates.

To learn more about conducting an instructional workshop, exhibiting, or sponsoring CDSE:2014, please contact Don Buford, Executive Director, or Hank Woji, VP Business Development, at the CCA. If you'd like to speak at this major industry event, contact Marty Lafferty, CEO, at the DCIA.

BitTorrent in the News

BitTorrent generated a number of interesting media reports this week, including: BitTorrent Works with NAS Vendors to Boost File Sharing, BitTorrent to Try a Paywall and Crowdfunding, BitTorrent Looking for Testers for Mysterious New Product, and A Quick Look at uTorrent — A File-Sharing Protocol for Data Exchange.

Cloud after the Aereo Ruling

Excerpted from Inside Sources Report by Richard Davis & Ross Freedman

Some of the after-effects of the Supreme Court's ruling in ABC v. Aereo are beginning to surface.

Aereo was a service that allowed subscribers to view broadcast television signals over the Internet. Aereo did so by assigning to each subscriber an antenna that would relay those signals to Aereo servers that would convert, store, and stream broadcast television shows over the Internet to the subscriber on demand. Network television broadcasters claimed that Aereo's service violated federal copyright laws. Aereo claimed in response that it was simply enabling viewers to view and store television broadcasts, much like any other seller of television antennas and video recorders.

On Wednesday, June 25, 2014, the Supreme Court ruled 6-3 that Aereo's service violates federal copyright laws because it "transmits" copyrighted material to the "public." The Supreme Court concluded that the Aereo service was similar to the service of cable television providers and, thus, subject to the same re-transmission requirements established by federal law for such efforts, including, without limitation, copyright licensing and fee agreements with broadcasters. The Supreme Court acknowledged that there were "behind-the-scenes technical differences" between Aereo and the cable television services that Congress targeted with the current copyright requirements, but found that such differences did not matter based on Congress' underlying objectives.

The concern for cloud services providers of all types, from cloud storage providers to content streaming services, is that the Aereo decision could be extended or extrapolated to other cloud-based business models. The Supreme Court appeared to go out of its way in the Aereo decision to assuage this concern, expressly and repeatedly instructing that its view of the Aereo service was not to be extended automatically to any other technologies or business models, including, specifically, other cloud-based services.

Despite this, many, including the three dissenting Supreme Court justices, believe that the door has been opened for the impact of the Aereo decision to extend beyond the specific business model and technologies involved in the Aereo service. Indeed, almost immediately after the court handed down its decision, Fox, which has been battling Dish Network over retransmission of its broadcasts, submitted the ruling as part of its case against Dish. Fox argues that Dish has violated copyright laws through two of its product offerings: Dish Anywhere, which allows users to view live or recorded television content from mobile devices or computers, and The Hopper, which allows users to record live content from anywhere and then transfer it to other devices.

The dust has barely settled on the Aereo decision, and much still needs to be gleaned from it. What is certain, however, is that it has left unresolved many lingering questions for the cloud computing industry about how to navigate an evolving and uncertain legal landscape for these new service and technology offerings.

DCINFO Editor's Note: Richard Davis is an Attorney with Edwards Wildman Palmer. Ross Freedman is an Associate with the Distributed Computing Industry Association (DCIA).

Report from CEO Marty Lafferty

We're sharing a summary of the "COMMENTS OF DISTRIBUTED COMPUTING INDUSTRY ASSOCIATION (DCIA)" submitted to the Federal Communications Commission (FCC) this week in accordance with the public notice period for the FCC's proposed rule-making regarding Net Neutrality.

The DCIA kept our comments VERY brief in consideration of FCC readers who need to review some 800,000 submissions on this matter — certainly a record number — and perhaps qualifying this undertaking as a Big Data analytical project in and of itself.

We also made our points in plain language, avoiding both formal legalese and technological jargon, in an effort to communicate as simply and directly as possible with the Commission.

Established in 2003 as an international trade organization, the DCIA advocates commercial advancement of distributed computing technologies through such activities as business development, market research, conferences and expos, industry communications, working groups, standards setting, and related endeavors.

The success of our mission is closely aligned to the satisfaction of end-users, ranging from professionals within large enterprises utilizing solutions for their work to individual consumers accessing services for their personal use.

Our Membership has grown from two to 150 Member Companies, and today comprises broadband network operators, software developers and distributors, and providers of cloud computing solutions and services.

This cross-section of private sector participants affords the DCIA a unique perspective, while also challenging us to find common positions among all constituencies on important and controversial topics such as Net Neutrality.

Our comments in this matter do not represent the views of our Member Companies individually or explicitly — and indeed some may disagree with our recommendations — but rather represent an informal consensus among DCIA participants.

Our experience has provided us with an approach for solving complex problems in this space: joint public-private sector working groups made up of affected parties, facilitated by the DCIA, constituted for short durations, and tightly focused on solving specific problems.

Three examples of this approach: our facilitation of representative software suppliers working with the Federal Trade Commission (FTC) in the Consumer Disclosures Working Group (CDWG) to avoid being characterized as malware by employing ethical business practices; another group of companies working with the Department of Justice (DoJ) in the Peer-to-Peer Parents and Teens React OnLine (P2P PATROL) working group to provide tools for law enforcement to combat redistribution of criminally obscene content; and another assemblage of private sector organizations working with the US Congress and various federal agencies in the Inadvertent Sharing Protection Working Group (ISPG) to protect consumers from unintentionally divulging their private and sensitive data on the Internet.

We do not believe that concerns regarding Net Neutrality have risen to the level of seriousness of the issues addressed in the above examples, and in fact the Internet has been and continues to be a shining example of technological advancement, economic progress, and cultural enhancement due at least in part to the lack of heavy-handed government intervention.

If and when such problems do become as widespread or as critical, however, as may happen in the near term with the subject of our final recommendation (#4 below), we would encourage the FCC to consider such an approach to yield a swift and fair resolution.

In any consideration of Net Neutrality, it's important to bear in mind that the Internet is not controlled by individual Internet service providers (ISPs) that connect users directly to the web: it's made up of a series of interconnected networks.

It's also essential to note the growing use of the Internet by social networks for multimedia user-generated content (UGC) as well as by over-the-top (OTT) Internet protocol television (IPTV) services for streaming high-definition (HD) professional audio/video (A/V) content.

Whether to transmit video to an end-user or to communicate via email, the digital data being transferred is packetized and traverses multiple networks before it reaches its final destination.

While slight delays do not materially impact the perceived quality of text-based and static-image communications, for streaming multimedia, packet delays are very noticeable in the form of degraded performance with anomalies that include buffering and stuttering.
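
To illustrate with a toy model (ours, not part of the DCIA filing): a video player consumes one chunk per second from a playout buffer, and whenever late-arriving packets leave that buffer empty the viewer sees a stall, while the same delays would pass unnoticed in an email client.

    # Toy playout-buffer model; the 20% late-arrival rate is an assumption.
    import random

    random.seed(1)
    buffered, stalls = 2, 0            # start with 2 chunks pre-buffered
    for second in range(60):
        if random.random() > 0.2:      # assume 20% of chunks arrive late
            buffered += 1
        if buffered > 0:
            buffered -= 1              # play one chunk this second
        else:
            stalls += 1                # empty buffer: visible stutter
    print(f"{stalls} stalled seconds out of 60")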

Various techniques have been developed and services gradually brought forth to mitigate such delays and enhance quality, but these do not treat all data packets equally, nor should they do so in order to benefit the overall performance of the Internet among all users.

We strongly urge the FCC to take no action that would discourage either the ongoing investment and innovation in many quarters of the private sector or the good judgment of key industry players at multiple levels, both of which are so vital to continuing this advancement.

Here are our four recommendations.

1. Take a Holistic Approach. For whatever regulatory guidelines the FCC finalizes regarding the delivery and accessing of content, applications, and services over the Internet to be meaningful and effective, a complete and thorough end-to-end understanding of data flow is essential. This inspection needs to start at the point of origin of data packets and continue through inter-network peering arrangements, transit providers, remote data center storage, and content delivery networks. This examination must consider the impact of techniques employed on the other parties in the distribution chain and the related economics. While important in this consideration, the way data is treated in the so-called last mile of the broadband network operator is only a part of the equation.

2. Treat Wireline and Wireless Equally. Consumers, who are the ultimate constituency for the FCC, increasingly demand greater access to content, software, and services on an ever-expanding array of devices that connect to the Internet through a wider variety of access methodologies. Viewers, for example, seek to access TV programs originally developed for delivery to their static analog television sets reformatted also for their desktop computers, as well as their mobile laptops, tablets, and cell phones. And conversely, users seek to access apps originally developed for their smartphones reconfigured also for their tablets, laptops, desktops, and smart TV sets. To avoid interference with technological progress and business practices to serve this growing demand for cross-platform interoperability, the FCC's regulatory guidelines should be seamlessly applicable to wired and wireless Internet access providers.

3. Continue Using a Light Touch. As noted above, the Internet is not broken and there is no overwhelming need to fix it by means of new heavy-handed government intervention or resorting to seriously outdated common-carrier classifications. Whatever regulatory standards the FCC finalizes should be in the form of overriding parameters rather than detailed regulations. The FCC must be careful not to discourage investment and innovation by unintentionally stipulating certain technological approaches while prematurely declaring others unlawful. Its focus rather should be on preventing anti-competitive behavior and unfair business practices. This can be accomplished with a regimen that combines general guiding principles with case-by-case investigations of specific alleged violations.

4. Focus on Cross-Ownership as the Area for Greatest Potential Abuse. Related to the above, the FCC should be especially vigilant in instances of content ownership by Internet access providers in the distribution channel. Vertical integration of a motion picture studio, major broadcast television network, and several cable programming services, for example, under common ownership, should be of exponentially greater concern when that owner is also a cable multiple system operator and major Internet access provider. If prevention of cross-ownership is not possible at this juncture, then extreme vigilance is mandated to ensure equitable treatment of third-party content, applications in app stores, and the like.

To hypothetically illustrate our concern informing this final recommendation, imagine the following scenario: two competing subscription streaming sports services acquire rights to present alternate live sporting events from a league's regular season. Each arranges comparable game coverage for its offering, generating technically equivalent streams from comparable access points, requiring identical bandwidth, and using similar variable-bitrate optimizers.

One service, owned by a large broadband network operator, however, enjoys anomaly-free delivery to its customers, while the other, independently owned by a third party, consistently suffers from delays and service interruptions. What will be the impact on retention?

Above all, it's imperative that the FCC's involvement in Net Neutrality contribute to an Internet environment where content, applications, and services — regardless of ownership interest — receive equitable treatment.

In short, the ultimate driver for the FCC's proposed rule-making, in the DCIA's view, should be to ensure competition. Share wisely, and take care.

Copyright Office Denies Aereo Request to Be Classed as Cable System

Excerpted from Wall Street Journal Report by Keach Hagey

Internet broadcast streaming company Aereo's long-shot chances of surviving a Supreme Court ruling by classifying itself as a cable system suffered a setback.

The US Copyright Office sent a letter to Aereo on Wednesday saying it didn't buy the company's argument that it should qualify for the kind of compulsory copyright license available to cable systems just because the Supreme Court recently ruled Aereo was "substantially similar" to a cable system. A copy of the letter was reviewed by The Wall Street Journal on Thursday.

But the Copyright Office accepted Aereo's application to pay royalties pending a ruling by a federal judge on the issue or legislative action, the letter shows.

Aereo last month suspended its broadcast TV streaming service after the Supreme Court ruled in favor of TV broadcasters that had sued Aereo claiming copyright violations.

Last week, however, Aereo sent a letter to US District Judge Alison Nathan, in whose court the Aereo case originated, arguing that it should be classified as a cable system, and therefore qualify for a compulsory license. That meant it could pay limited royalties for rights to stream broadcast television content in the same way that cable systems do.

Cable systems can get compulsory copyright licenses for the channels they rebroadcast, meaning they don't have to seek individual permission for every copyrighted piece of content. Compulsory license fees are paid to the Copyright Office and are generally considered inexpensive, according to media lawyers. Cable systems then typically pay broadcasters for the right to broadcast their feeds through negotiated retransmission-consent agreements.

In its letter last week to Judge Nathan, Aereo argued that the Supreme Court effectively overturned an earlier judicial decision that had prevented online video firms from obtaining compulsory licenses. Aereo said it was proceeding to file the paperwork required to pay royalty fees.

But the Copyright Office disagreed. "In the view of the Copyright Office, Internet retransmissions of broadcast television fall outside the scope of the Section 111 license," wrote Jacqueline Charlesworth, general counsel and associate register of copyrights, referring to the section of copyright law that allows cable systems to get a compulsory license. "We do not see anything in the Supreme Court's recent decision…that would alter this conclusion."

Despite its decision, the Copyright Office accepted Aereo's filings to pay royalty fees on a provisional basis, wrote Ms. Charlesworth, depending on the outcome of the case now before the lower court. Aereo wanted to pay fees totaling $5,310.74 for the content it broadcast between January 2012 and December 2013.

Other online video services have tried the compulsory-license argument before, without success. FilmOn and Ivi, two other online television streaming startups, argued in their defenses when they were sued in 2010 for streaming broadcast content without permission that they should qualify for the compulsory copyright license available to cable systems. Ivi lost and shut down, while FilmOn, which serves an array of on-demand and live content, decided to settle and develop new technology.

Following the Supreme Court decision on Aereo, FilmOn also applied for a compulsory license with the Copyright Office. It has yet to hear back, according to a FilmOn spokesman.

FCC Extends Net Neutrality Comment Deadline to July 18

Excerpted from Variety Report by Ted Johnson

A surge in traffic to the FCC's website on Tuesday compelled the agency to extend its deadline for filing comments on a proposal to establish rules of the road for the Internet, otherwise known as Net Neutrality.

The deadline had been on Tuesday, but an FCC spokeswoman said that the new deadline would be on Friday, July 18th at midnight.

"Not surprisingly, we have seen an overwhelming surge in traffic on our website that is making it difficult for many people to file comments" through the FCC's electronic filing system, said spokeswoman Kim Hart. "Please be assured that the Commission is aware of these issues and is committed to making sure that everyone trying to submit comments will have their views entered into the record."

The FCC is taking comments via its Electronic Comment Filing System and at openinternet@fcc.gov. An FCC spokesman said that as of late afternoon on Tuesday, it had received about 780,000 comments.

The FCC's system also slowed in the aftermath of a segment on HBO's "Last Week Tonight" in which host John Oliver urged viewers to weigh in on net neutrality. FCC chairman Tom Wheeler has urged Congress to provide funding to upgrade the FCC's IT infrastructure, and other Commissioners have also complained about its site.

Among the companies filing comments on Tuesday was Mozilla, which reiterated its proposal in which the FCC would define only a part of the Internet as a telecommunications service.

Its proposal would classify the local access networks offered to "edge providers," or sites like Netflix and Dropbox, as separate services. Mozilla argues that such a move would give the FCC the kind of firm legal footing it needs to adopt rules prohibiting Internet providers from blocking or discriminating against traffic, as well as from entering into deals for paid prioritization.

Mozilla also argues that the rules should apply to both wired and wireless providers.

Major portions of the FCC's previous Net Neutrality rules were struck down by the DC Circuit Court of Appeals in January, forcing the Commission to come up with a new approach.

The DC Circuit did outline one way to do so, in which the FCC would prohibit blocking and discrimination that is "commercially unreasonable." But Mozilla and other Internet firms argue that such an approach would be too weak to prevent Internet providers from essentially charging them for speedier access to the consumer.

While groups like the Writers Guild of America have argued that the FCC should also move toward reclassification, Hollywood studios are staying silent for now. A spokesman for the MPAA said that they would not be filing comments.

What Cloud Computing Customers Want: Clarity, Simplicity, Support

Excerpted from Forbes Report by Joe McKendrick

The lines between cloud providers and cloud consumers keep getting fuzzier every day. Many enterprises are engaged in both offering services through their own clouds, as well as subscribing to cloud services from public providers. To meet the consumption side of the equation, there is now a mega-industry of public cloud service providers that are aggressively competing for this online market. Vendors within this mega-industry are overwhelming customers with a bewildering assortment of pricing plans and service level agreements.

A new study from Enterprise Management Associates (EMA) finds market confusion abounds. The survey of 415 executives finds a great deal of interest in the cloud, but at the same time, confusion about options and services. For many organizations, clouds represent an entirely new platform, the EMA report, underwritten by iland, states. Many executives admit they lack the expertise to oversee and understand pricing models and operational metrics, and relayed a feeling of "just making it up as we go along."

The average enterprise in the survey subscribes to the services of at least three cloud vendors, EMA analyst Dennis Drogseth said in a webcast related to the EMA study. The reason companies are hedging their bets across numerous vendors, he surmises, is that "cloud is still an experiment… people are looking to optimize their resources with cloud, many are not sure what that means yet. They're still exploring and looking for that fit." Another factor is departmental-level fragmentation — parts of the business subscribe to the cloud for their own reasons, such as development, data storage, or continuity.

The cloud market is a highly fluid one as well, EMA finds. "When asked about their future cloud strategies, almost 60% indicate an interest in adding cloud vendors and 25% plan to switch vendors. Another 20% plan to eliminate cloud providers due primarily to security, cost, compliance, or complexity issues."

Pricing is one of the greatest sources of discontent with cloud services — not necessarily because prices are too high, but because pricing models are too confusing. Vendors' innovative pricing models often end up looking like the exotic financial instruments created by financial services firms, says Lilac Schoenbeck, VP of product management and marketing for iland, who joined Drogseth in the webcast. "While all of this innovation is happening, the people that might be paying the price of this math are the customers," she says. "Pricing can be very confusing or tricky in the cloud…. There's been a great deal of innovation on the vendors' side for pricing models. But at the end of the day, the cloud customers are sort of stuck in the mud with this mess."
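
A sketch with invented rates shows how quickly the math gets murky: the same workload priced under two common schemes, with data-transfer charges layered on, produces figures that are hard to compare at a glance.

    # Hypothetical pricing comparison; every rate below is invented.
    hours_per_month = 730
    on_demand_hourly = 0.12             # pay-as-you-go $/hour
    reserved_monthly = 65.00            # flat monthly reservation fee
    egress_gb, egress_rate = 500, 0.09  # data-transfer charges often surprise

    on_demand = hours_per_month * on_demand_hourly + egress_gb * egress_rate
    reserved = reserved_monthly + egress_gb * egress_rate
    print(f"on-demand: ${on_demand:.2f}/mo vs. reserved: ${reserved:.2f}/mo")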

Along with pricing confusion, cloud-consuming enterprises are often surprised — to the downside — by performance issues (cited by 38%), including "noisy neighbors" who drag down cloud provider server performance. Annoyance with lack of vendor support crops up just as frequently — executives were not aware that the monthly or annual subscription they purchased did not include full support. "The realities of cloud support contracts often take customers by surprise," the EMA report states. "Simple email or ticketing support may only be available to customers at lower tiers. Customers purchasing higher-end support may still have difficulty getting access to adequate levels of hands-on expertise."

The learning curve is often steeper than expected, and vendors just aren't willing to do a lot of hand-holding. An enterprise cloud engagement brings with it a lot of complexity, says Drogseth. "Some of the marketing around cloud has suggested that it's as easy as waving a magic wand, which isn't the case. Performance is very much a shared requirement between the cloud provider and the IT service team."

The EMA report points to four things cloud customers are increasingly demanding from the vendor community:

1) Transparent pricing: Confusion over the assortment of pricing models will "inevitably lead to challenges communicating pricing to management, estimating costs and making sound investment decisions," says EMA. "The clearer the pricing model of the cloud platform, the more likely the cost objectives will be met."

2) Ease of management: "Accept that cloud represents a new platform for IT to manage and select a cloud that eases that transition — whether by presenting familiar metrics, easy-to-use portal environments, or shareable reporting."

3) Support: "The cost and quality of support, as well as the medium by which it is delivered (ticketing system, email, phone), can significantly hamper or accelerate cloud success. Consider the support model and its pricing when making a cloud selection."

4) Services: Understand your own strengths and limitations first, EMA advises. "Survey respondents were remarkably self-aware in identifying areas where their in-house expertise fell short. With an honest assessment in hand, cloud vendors can be evaluated on the basis of their service offerings, from disaster recovery to onboarding."

The bottom line is that despite its name, cloud computing doesn't magically happen in the sky — it comes from someone's server somewhere. It requires proactive management. Cloud services "must be selected, workloads must be migrated and usage must be tracked," says EMA. "Not unlike other complex IT systems, these cloud infrastructures must be monitored and managed. Capacity and performance management continue to be paramount, as the cloud's much-touted 'easy scalability' depends on a watchful eye identifying and correcting problems."

This is good advice for cloud consumers, and it is also something for organizations rendering cloud services to others to keep in mind as well. Even if you aren't charging for the service you're putting out there, you need to make it a positive experience for your end users.

Mobile, Big Data Development Moves to the Cloud

Excerpted from Application Development Trends Report by David Ramel

While not a new trend, the migration of mobile and Big Data development to the cloud has shifted into overdrive lately.

Witness a new report from ABI Research predicting that the growth in cloud application development and management platforms will drive $3.6 billion in mobile enterprise application revenues by 2019.

The research firm said more companies are looking to mobile application platforms — deployed either on-premises or in the cloud — for the development and management of their enterprise apps. Cloud-based solutions, however, are expected to outpace their on-premises counterparts, to the tune of a 42.5 percent compound annual growth rate (CAGR) from this year to 2019.

"Mobile applications allow for greater flexibility when mobilizing content and employees," said ABI Research senior analyst Jason McNicol. "However, the cost in terms of time and resources is fairly expensive to generate a single app for multiple platforms like iOS, Android and Windows Phone. Fortunately, new cloud-based development solutions have evolved and are now gaining traction to reduce the app development time while permitting cross-platform deployment."

These cloud-based development solutions are divided into two camps, the company said: front-facing and back-end integration.

An example of the back-end trend — or Mobile Backend as a Service (MBaaS), because everything has to be an "X as a Service" these days — was just revealed yesterday. Amazon Web Services announced a new mobile SDK and associated services to simplify some of the back-end plumbing like identity and data management, hooking up to storage, sending push notifications, analyzing user behavior, and so on. Think Microsoft Azure Mobile Services or the similar Google initiative just announced two weeks ago.
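
For a flavor of that back-end plumbing, here is a minimal sketch using boto3, the current AWS SDK for Python; the identity-pool ID and endpoint ARN are placeholders, not real resources.

    # MBaaS-style back-end calls with boto3; identifiers are placeholders.
    import boto3

    # Identity management: exchange an identity-pool ID for a per-user identity.
    cognito = boto3.client("cognito-identity", region_name="us-east-1")
    identity = cognito.get_id(IdentityPoolId="us-east-1:EXAMPLE-POOL-ID")
    token = cognito.get_open_id_token(IdentityId=identity["IdentityId"])

    # Push notifications: publish a message to a registered device endpoint.
    sns = boto3.client("sns", region_name="us-east-1")
    sns.publish(
        TargetArn="arn:aws:sns:us-east-1:123456789012:endpoint/APNS/app/EXAMPLE",
        Message="Hello from the back end",
    )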

On the front-end, as ABI Research noted, it's pretty much all about cross-platform app development, an exploding market as evidenced from recent tools announced by Oracle, Embarcadero Technologies, Xamarin, AppGyver, Microsoft and many others.

Meanwhile, in the Big Data arena, things are happening fast as the trend to move large-scale analytics to the cloud accelerates.

The advantages of running your Big Data analytics in the cloud rather than on-premises — especially for smaller companies with constrained resources — are numerous and well known. Oracle summed up some of the major business drivers in the article "Trends in Cloud Computing: Big Data's New Home": cost reduction, reduced overhead, rapid provisioning/time to market, and flexibility/scalability.

"Cloud computing provides enterprises cost-effective, flexible access to Big Data's enormous magnitudes of information," Oracle stated. "Big Data on the cloud generates vast amounts of on-demand computing resources that comprehend best-practice analytics. Both technologies will continue to evolve and congregate in the future."

In fact, they will evolve and congregate to form a $69 billion private cloud storage market by 2018, predicted Technology Business Research. That's why the Big Data migration to the cloud is picking up pace recently — everybody wants a piece of the multi-billion-dollar pie.

As Infochimps forecasted early last year: "Cloud will become a large part of Big Data deployment — established by a new cloud ecosystem."

The following moves by industry heavyweights in just the past few weeks show how that ecosystem is shaping up:

IBM last week added a new Big Data service, IBM Navigator on Cloud, to its IBM Cloud marketplace. With a reported 2.5 billion gigabytes of data being generated every day, IBM said the new Big Data service will help organizations more easily secure, access and manage data content from anywhere and on any device.

"Using this new service will allow knowledge workers to do their jobs more effectively and collaboratively by synchronizing and making the content they need available on any browser, desktop and mobile device they use every day, and to apply it in the context of key business processes," the company said.

The new service joined other recent Big Data initiatives by IBM, such as IBM Concert, which offers mobile, cloud-based, Big Data analytics.

Google last month announced tools to "help developers build and optimize data pipelines, create mobile applications, and debug, trace, and monitor their cloud applications in production." One such tool added to the company's Cloud Platform suite is Cloud Monitoring, designed to "let developers understand, diagnose and improve systems in production."

Another is Google Cloud Dataflow, "a fully managed service for creating data pipelines that ingest, transform and analyze data in both batch and streaming modes."

Dataflow is a successor to MapReduce, the programming paradigm and associated implementation created by Google that became a core component of the original Hadoop ecosystem. Limited to batch processing, MapReduce came under increasing criticism as Big Data tools grew more sophisticated.

"Cloud Dataflow makes it easy for you to get actionable insights from your data while lowering operational costs without the hassles of deploying, maintaining or scaling infrastructure," Google said. "You can use Cloud Dataflow for use cases like ETL, batch data processing and streaming analytics, and it will automatically optimize, deploy and manage the code and resources required."

EMC on Tuesday acquired TwinStrata, a Big Data cloud storage company. The acquisition gives traditional storage vendor EMC access to TwinStrata's CloudArray cloud-integrated storage technology.

That was just one of a recent spate of moves to help EMC remain competitive in the new world of cloud-based Big Data. For example, when the company announced an upgrade of its VMAX suite of data storage products for big companies, The Wall Street Journal reported: "Facing Pressure from Cloud, EMC Turns Data Storage into Service."

The same day, EMC announced "a major upgrade to EMC Isilon OneFS, new Isilon platforms and new solutions that reinforce the industry's first enterprise-grade, scale-out Data Lake." But wait, there's more: EMC also yesterday revealed "significant new product releases across its Flash, enterprise storage and Scale-Out NAS portfolios" to help organizations "accelerate their journey to the hybrid cloud."

EMC's plethora of Big Data/cloud announcements make it clear where the company is placing its bets. As financial site Seeking Alpha reported: "EMC Corporation: Big Data and Cloud Computing Are the Future."

That was in March, and the future is now.

83% of Healthcare Organizations Are Using Cloud-Based Apps Today

Excerpted from Forbes Report by Louis Columbus

HIMSS Analytics' recent survey of cloud computing adoption in healthcare provider organizations found that 83% of IT executives report they are using cloud services today, with SaaS-based applications being the most popular (66.9%).

These and other findings are from the 2014 HIMSS Analytics Cloud Survey recently published by HIMSS Analytics, a subsidiary of the Health Information and Management Systems Society. HIMSS Analytics provides analytical expertise and data analysis to healthcare IT companies and consulting firms. The methodology is found on page 6 of the 2014 HIMSS Analytics Cloud Survey. As the sample population of healthcare organizations not presently using cloud services is small, this data needs to be considered informational and not representative of the industry as a whole. Despite the limitations of the methodology, the findings provide a fascinating glimpse into how healthcare organizations are adopting cloud computing.

Key take-aways from the 2014 HIMSS Analytics Cloud Survey include the following:

83% of IT healthcare organizations are currently using cloud services, 9.3% plan to, and 6% do not intend to adopt cloud-based applications at all, with the balance not knowing their organizations' plans. In aggregate, 92% of healthcare providers see the value of cloud services for their organizations, now or in the future.

67% of IT healthcare organizations are running SaaS-based applications today, with 15.9% running on an Infrastructure-as-a-Service (IaaS) platform, and 2.4% using Platform-as-a-Service (PaaS) applications.

Augmenting technological capabilities or capacity (48.2%), making a positive contribution to financial metrics (46.4%), and time to deploy (44.6%) are the three most common ways healthcare organizations measure the value of cloud services.

Hosting of Clinical Applications and Data (43.6%), Health Information Exchange (38.7%), and Backups & Data Recovery (35.1%) are the most common uses of cloud-based applications today.

37.1% of IT healthcare organizations chose to deploy their cloud applications on a private cloud architecture, 36.3% chose a hybrid cloud model, and 23.4% chose public clouds.

Cloud-based applications are most commonly used for administrative functions, including hosting financial, operational, HR, and back-office applications and data (73.4%), followed by IT functions (73.4%) and clinical applications and data (52.4%). 21.8% of the respondents also mentioned that cloud-based applications are being used in more than five of their departments or divisions, further illustrating how pervasive cloud adoption is in the above categories.

The top three reasons for adopting a cloud solution are lower cost than current IT maintenance (55.7%), speed of deployment (53.2%), and solving the problem of not having enough internal staff and/or expertise to support on-premises alternatives (51.6%).

Security concerns (61.4%), IT Operations being a completely internal function (42.3%), and availability and uptime concerns (38.4%) are the top three reasons why IT healthcare organizations don't adopt cloud services today.

88.7% of all IT healthcare organizations rely on preparation plans to increase the probability of success of their cloud adoption efforts: 49.2% upgraded their network infrastructure and/or monitoring capabilities, 42.7% engaged with a solution provider for assistance with metrics, problems, and outages, and 37.1% created or modified business processes to get more value from their cloud-based applications.

48.3% of IT healthcare providers have had performance and downtime issues with their cloud service providers. Slow responsiveness of hosted applications and data (32.5%), downtime and unavailability of applications and data (23.3%), and response rates too slow for data backup in the cloud (3.3%) are the three most common performance issues IT healthcare organizations cite in the survey.

Please click here for an infographic that summarizes the key points of the 2014 HIMSS Analytics Cloud Survey.

Bringing the Cloud to Data that Cannot be Moved

Excerpted from Federal Times Report by Jane Snowdon

Many regulations, laws, and industry guidelines govern how sensitive personal information and regulated data are managed in the healthcare, banking, and financial industries. Improper release of regulated or sensitive information can result in significant consequences and damage, making compliance with government regulatory acts and industry guidelines paramount.

Cloud computing has many benefits, but cases exist where some data cannot be moved to the cloud for a variety of reasons. For example, security concerns or regulatory compliance requirements might limit use of the cloud. In other cases, data volumes may be too big to move, or data may be generated at rates that exceed transfer capacity, as in surveillance, operations in remote areas, and telemetry applications.

Micro-Cloud technology is a new model for cloud computing especially suited to organizations that are unable to move data onto the cloud due to insufficient bandwidth, latency, location-specific processing needs, Big Data, security, or compliance reasons. Micro-Cloud allows organizations to realize the benefits of cloud computing, creating new insights on their own premises, by dynamically and intelligently moving computations and analytics to where the data resides.

The challenge of not always being able to move data to the cloud is faced by many industries. In the banking and financial industry, huge amounts of data are generated in each location and data center, and regulatory restrictions apply to how and where data can be moved across global locations. In retail, data is generated at numerous branches and outlets, but the network connectivity available at remote branches can be fairly limited.

In education, almost all universities are globalizing, but as they do, hosting and delivering the IT infrastructure needed for overseas campuses suffers because overseas link capacity limits performance. Government and the travel and transportation industries operate ships, airplanes, and tanks that move into areas where the only connectivity is by satellite, and satellite connectivity has both high latency and limited bandwidth compared to most other IT environments. It's clear that managing regulated and sensitive data and handling Big Data is a cross-industry challenge.

Micro-Cloud technology allows solutions developed in a cloud environment to be brought seamlessly to on-premise systems for execution, allowing the benefits of cloud computing to be realized for data that cannot be moved to cloud for processing. A Micro-Cloud has three components: a self-managing, on-premise appliance; a traditional cloud for software-as-a-service (SaaS); and Application Programming Interfaces (APIs) to move SaaS to the appliance. Micro-Cloud hosts a suite of existing applications in either a public or private cloud environment and enables SaaS components to be downloaded and configured to run on an on-premise appliance, which could be either virtual or physical. Computations and analytics are executed in a safe and controlled environment within the appliance. Micro-Cloud allows a user from a single location to move computations and analytics to on-premise data in multiple different locations and essentially enables regulatory compliance or regulation as a client service.
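
A purely hypothetical sketch of that flow, with every URL, endpoint, and analytic name invented for illustration: the SaaS catalog lives in the cloud, an API pushes a selected analytic down to the on-premise appliance, and only the results travel back.

    # Hypothetical Micro-Cloud flow; all URLs and endpoints are invented.
    import requests

    CLOUD = "https://cloud.example.com/api"
    APPLIANCE = "https://appliance.branch.example.com/api"

    # 1. Pick an analytic from the cloud-hosted SaaS catalog.
    analytic = requests.get(f"{CLOUD}/catalog/fraud-detection/latest").json()

    # 2. Deploy it to the on-premise appliance; data never leaves the premises.
    requests.post(f"{APPLIANCE}/deployments", json=analytic).raise_for_status()

    # 3. Run the computation where the data resides; fetch only the results.
    run = requests.post(f"{APPLIANCE}/deployments/fraud-detection/run")
    print(run.json()["summary"])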

Our discussions with government and industry leaders suggest that true IT transformation within federal agencies and enterprises will involve Micro-Clouds customized to their unique needs and interests. Several federal agencies operate in a loose conglomeration and are unable to share data freely with other agencies due to legislative and separation-of-duty concerns. Federal agencies are often required to provide services in remote areas where network connectivity may not be suitable for traditional cloud computing models. Agencies considering Micro-Cloud can realize benefits when dealing with high data generation rates, low bandwidth, low latency requirements, location-specific processing needs, compliance, and security.

Cloud Computing Is Killing the Traditional Office

Excerpted from Newsweek Report by Kevin Maney

Two of the hippest places to work operate out of anti-offices.

At Airbnb's headquarters in San Francisco, CA, every meeting area is decorated to look, in remarkable detail, like some Airbnb rental somewhere in the world. One conference room is modeled on the War Room in "Dr. Strangelove." In New York City, product innovation company Quirky's offices in a former warehouse look like a cross between a hip nightclub and a giant preschool, outfitted with a conference table made from industrial fans and a giant map that shows where your colleagues are going on vacation. (All employees can take as much vacation as they want. Yay!) The sign on the front entrance says: "Deliveries & humans: 7th Floor. Suits: Go away."

Technology is giving the office an identity crisis. Even the word office now sounds like something your father went to. "We're going through a 100-year shift in work," says Adam Pisoni, co-founder of Yammer, which is now part of Microsoft. "There's a real tension today between old and new."

Or as a recent Herman Miller research project concluded: "Long-established workplace norms are giving way to disruption and uncertainty."

Twenty years ago, the office existed because it was the only place to get real work done. The reason to go to the office was to access information and technology…and other employees. This is why so many offices looked like shit, with their Dilbert cubicles and fluorescent lighting. Like an old, single man with a fortune, offices didn't need to look good to attract talent.

Cloud computing is throwing the last shovelfuls of dirt on the traditional office. All the information and software that used to be locked inside offices can be tapped into from anywhere. Think of all the other things you used to have to go to the office for: a computer, a long-distance phone line, copiers, fax machines, files, mail, an art department that could make foils to go in the overhead projector for presentations in pre-PowerPoint days. Now you can get all of that on a laptop while sitting in a Starbucks. Private offices, surveys show, are empty 77 percent of the time.

Starbucks, by the way, has long billed itself as the "third place" in American life. Home is the first place; office the second. Maybe Starbucks is going to suffer its own identity crisis when the third place becomes the second place.

Companies such as Yammer, which makes a kind of intra-company Twitter, and Herman Miller, the furniture maker that invented the cubicle, have been trying to understand the next-generation office. It helps to start with historical context.

If you go back long enough, there were no real offices. The Egyptians constructed pyramids, not office towers. In the Middle Ages, people in Europe erected cathedrals. In London in 1729, the East India Company built perhaps the first office building. Still, in those days most professionals worked at home, in what they called a library. Thomas Jefferson had a library at Monticello. There was no "home office" at Monticello.

The offices of the 20th century reflected the technology driving business and society. The middle part of the century was all about industry and production, so offices looked like Jack Lemmon's workplace in "The Apartment" — rows of desks strung out like an assembly line. The 1960s gave birth to the Information Age, and workers were expected to hunker down and think. Companies gave them cubicles.

So what now? Information is a commodity. Technology is available everywhere to everybody. Employees don't have to go anywhere to access other employees—not in the age of Yammer and Skype video calls and Google Hangouts. Companies aren't even made up of just employees anymore. In this ultra-networked age, a lot of business gets done by a core group from the company connected to a matrix of contractors and freelancers.

For many companies, then, the most valuable assets have become creativity and culture. The companies with the best ideas win. And the companies that can carve out an identity and image win.

As designers look at those changes in business, they're thinking that offices have to be someplace you'd want to go for the same reasons you want to go to a bar, even though you can make a good whiskey sour at home: connections to people, a pleasing place to hang out, and maybe a getaway from your spouse or from that laundry basket crying out to be emptied. Desks and offices are going away in favor of funky gathering spaces and nooks where you can take a laptop and think on your own. It has to feel like a place where employees and outside partners enjoy bonding and collaborating, says Ryan Anderson, Herman Miller's director of future technology.

Companies used to spread the corporate culture by infusing it into employees through training, memos, and gatherings. IBM even had company songs in the 1930s and '40s. But if a company is now more of a constantly morphing band of insiders and outsiders, the office might be one of the most important tools for creating culture. If a bunch of strangers gather in an Irish bar, they'll start singing Irish songs. If a mixed bag of people gather at Airbnb, the surroundings need to help them feel Airbnb-ish.

So this is why Quirky's space is quirky and Airbnb's looks like somebody's kitchen in Helsinki and Etsy's feels as if a flea market might break out at any second. These young companies aren't just eccentric outliers building fanciful work spaces. They seem to be leading the march out of our cubicles, toward the next thing in working together.

Either that or I'll see you at Starbucks.

Coming Events of Interest

Silicon Valley Innovation Summit — July 29th-30th in Mountain View, CA. AlwaysOn's 12th annual SVIS is a two-day executive gathering that highlights the significant economic, political, and commercial trends affecting the global technology industries. SVIS features the most innovative companies, eminent technologists, influential investors, and journalists in keynote presentations, panel debates, and private company CEO showcases.

International Conference on Internet and Distributed Computing Systems — September 22nd in Calabria, Italy. IDCS 2014 is the sixth conference in the series promoting research in diverse fields related to Internet and distributed computing systems. The emergence of the web as a ubiquitous platform for innovation has laid the foundation for the rapid growth of the Internet.

CLOUD DEVELOPERS SUMMIT & EXPO 2014 — October 1st-2nd in Austin, TX. CDSE:2014 will feature co-located instructional workshops and conference sessions on six tracks facilitated by more than one-hundred industry leading speakers and world-class technical trainers.

International Conference on Cloud Computing Research & Innovation — October 29th-30th in Singapore. ICCRI:2014 covers a wide range of research interests and innovative applications in cloud computing and related topics. The unique mix of R&D, end-user, and industry audience members promises interesting discussion, networking, and business opportunities in translational research & development. 

PDCAT 2014 — December 9th-11th in Hong Kong. The 16th International Conference on Parallel and Distributed Computing, Applications and Technologies (PDCAT 2014) is a major forum for scientists, engineers, and practitioners throughout the world to present their latest research, results, ideas, developments and applications in all areas of parallel and distributed computing.

Copyright 2008 Distributed Computing Industry Association
This page last updated July 27, 2014
Privacy Policy