Distributed Computing Industry
Weekly Newsletter

Partners & Sponsors

ABI Research

Acolyst

Amazon Web Services

Apptix

Aspiryon

Axios Systems

Clear Government Solutions

CSC Leasing Company

CyrusOne

FalconStor

General Dynamics Information Technology

IBM

NetApp

Oracle

QinetiQ

SoftServe

Trend Micro

VeriStor

VirtualQube


June 2, 2014
Volume XLVIII, Issue 5


The Freedom of the Cloud

Excerpted from Computer World Report by Chris Poelker

I recently attended the CLOUD COMPUTING EAST 2014 conference put on by the DCIA and the CCA in Washington, DC, and as usual, I was pleasantly surprised at the quality of the speakers at these cloud events.

The information imparted was on par with, and sometimes surpassed, what I have seen at much larger and much more expensive events. Kudos to the DCIA and CCA for sponsoring such an informative conference!

If your life is crazy like mine, then when you are away from your office for a few days, your emails and voice mails probably have a tendency to pile up. When I returned, I began my usual process of sorting through all my messages to see if I missed anything important.

As I was separating the internal stuff from all the vendor inquiries, I began to notice a common thread. All the messages I received from the vendor community were trying to sell me their services or solutions. All the vendors were competing with each other to offer their solutions as the best way to solve a particular information technology (IT) problem.

As an example, there was a call from a technology firm asking me if I was interested in purchasing a solution to reduce the power requirements in our data center. There was another informing me of the unique advantages their solutions could provide in managing our virtual server environment. Yet another informed me that they would be able to make our applications go faster by using their disk technology.

Going through all these calls and messages made me realize the actual tangible freedom that cloud computing has to offer.

I very politely informed each of these vendors that I was sure their products were great, but we no longer required any of their products or services as our entire technology infrastructure is now either in the cloud or provided by cloud-based applications.

Each one of the vendors I spoke with seemed a bit disoriented as they realized the solutions they were trying to sell me were no longer relevant to my situation. I also began to realize that if you are a vendor, and your solutions are not embracing the cloud, you may soon find yourself trying to sell stagecoach wheels in a Ferrari world.

The cool thing about technology is it NEVER becomes boring, as change is constant, so you better keep up!

Report from CEO Marty Lafferty

Wikipedia defines cloud computing as "Internet-based computing, whereby shared resources, software, and information are provided to computers and other devices on demand, like the electricity grid."

In some ways, cloud computing comprises a new client-server mechanism, one that opens up resources from multiple Internet-connected devices with lower barriers for entry.

For web developers, this means access to hosted applications and data, along with cloud-based development services, enabling them to create web applications that have access to data and services like never before.

To help developers learn how to capture their share of this huge opportunity, the Distributed Computing Industry Association (DCIA) and Cloud Computing Association (CCA) are partnering to present the CLOUD DEVELOPERS SUMMIT & EXPO (CDSE:2014) in Austin, TX on October 1st and 2nd.

Microsoft's cloud platform Azure and Apple's iPad, which utilizes cloud-based application services heavily, are driving developer demand, and developers are going to be inclined to use these services, which provide enhanced performance, scalability, and additional security that their own web hosts just can't provide.

At CDSE:2014, highly focused business strategy and technical keynotes, breakout panels, and seminars will thoroughly explore cloud computing solutions and offerings, and ample opportunities will be provided for one-on-one networking with the major players in this space.

Google's App Engine enables developers to build -- and host -- web apps on Google's own servers, allowing for faster development and deployment and simpler administration, without the need to keep up with hardware upgrades, patches, or backups.
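
As a concrete taste of that model, here is a minimal sketch of an App Engine request handler using webapp2, the framework bundled with the Python 2.7 runtime; the route and response text are placeholders invented for this example.

    # A bare-bones App Engine handler; deploying it takes only an app.yaml
    # file and an upload -- no servers, patching, or backups to manage.
    import webapp2

    class MainPage(webapp2.RequestHandler):
        def get(self):
            self.response.headers['Content-Type'] = 'text/plain'
            self.response.write('Hello from App Engine')

    app = webapp2.WSGIApplication([('/', MainPage)], debug=True)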

Amazon Web Services (AWS) provides many cloud services for developers, including Amazon CloudFront for content delivery, Amazon Fulfillment Service for ecommerce, Amazon CloudWatch for monitoring, Amazon Simple Storage Service (S3) for storage, and Amazon Virtual Private Cloud (VPC) for networking.
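
To illustrate how approachable these services are from code, the sketch below writes an object to S3 using boto, the AWS SDK for Python of this era; the bucket and key names are invented for the example.

    # Upload a small object to Amazon S3 with boto; credentials come from
    # the environment or a ~/.boto config file.
    import boto

    conn = boto.connect_s3()
    bucket = conn.create_bucket('example-cdse-bucket')  # hypothetical name
    key = bucket.new_key('hello.txt')
    key.set_contents_from_string('Hello from the cloud')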

Codeita allows developers to do everything from a cloud perspective, from design to coding to publishing, using the same web technologies that power the web today: Linux, Apache, MySQL, and PHP (LAMP). Codeita includes an auto-highlighting code editor, in-browser page content editing, an advanced SVG image editor, project-specific task management, flexible cloud storage, and in-browser FTP publishing.

CDSE:2014 will feature co-located instructional workshops and conference sessions facilitated by more than one-hundred industry leading speakers and world-class technical trainers.

All aspects of the cloud computing sector will be represented: storage, networking, applications, integration, and aggregation.

Attendees will see, hear, learn, and master critical skills in sessions devoted to the unique challenges and opportunities for developers, programmers, and solutions architects in six distinct topic areas.

Three tracks will cover mobile, logistics, and big data considerations that cut across nearly every enterprise vertical migrating business functions to the cloud.

And three tracks will zero in on three economic sectors that are now experiencing the most explosive growth: media and entertainment, government and military, and healthcare and life sciences.

The CCA & DCIA will debut the all-new Cloud Computing Competency Certification (CCCC) program, with opportunities to qualify for and receive Level One Certification (CCCC-L1) on site.

Workshops provide cloud solutions providers with the opportunity to present hands-on and how-to sessions to groups of customers and prospective customers on various aspects of using their services.

These sessions can also be dedicated to training third-party developers on opportunities for creating added-value plug-ins and related complementary offerings to be used and marketed in conjunction with the primary solutions.

An interactive expo will augment strategic conference business sessions and in-depth technical workshops by providing extended exhibit hall hours to allow participants to spend time viewing demos and exhibits and hearing first-hand from the industry's leading service and solution providers.

According to the research firm IDC, cloud computing was an estimated $47.4 billion industry in 2013 and is expected to more than double by 2017. The cloud's 23.5% compound annual growth rate is five times faster than that of the broader technology market.
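
A quick arithmetic check, assuming the quoted rate compounds from the 2013 base through 2017, shows those figures hang together:

    # Project IDC's 2013 estimate forward four years at a 23.5% CAGR.
    base_2013 = 47.4e9
    cagr = 0.235
    projection_2017 = base_2013 * (1 + cagr) ** 4
    print(round(projection_2017 / 1e9, 1))  # ~110.3 billion -- more than double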

Cisco, Forrester and Forbes predict that all the material on the Internet for marketing — video, text, audio, and everything in between — will be encompassed within the cloud and will be fully integrated by 2016. Based on these estimates, marketing needs for virtual space will balloon to 3.3 terabytes per user in two years.

The needs have never been greater for developers, programmers and architects to advance their knowledge, capabilities and skill-sets in order to profit from this revolutionary transformation in the business processes of the future.

The DCIA and CCA are very proud to have joined together to respond to these needs by presenting CDSE:2014 in one of the nation's most vibrant and technologically innovative cities.

To learn more about conducting an instructional workshop, exhibiting, or sponsoring CDSE:2014, please contact Don Buford, CEO, or Hank Woji, VP Business Development, at the CCA.

To review conference topics and apply to join the speaking faculty for this event, please click here. If you'd like to speak at this major industry event, please contact me at the DCIA, at your earliest convenience. Share wisely, and take care.

Bill to Limit FCC's Title II Powers Introduced

Excerpted from CED Report by Brian Santo

Congressman Bob Latta (R-OH) introduced legislation that would explicitly bar the Federal Communications Commission (FCC) from reclassifying broadband under Title II of the Communications Act.

The FCC is in the process of establishing a new regulatory framework for Network Neutrality. FCC Chairman Tom Wheeler has consistently threatened to reclassify broadband as a communications service (Title II) should communications service providers fail to accede to whatever the new rules are.

Latta said his legislation is intended "to ensure the Internet remains open and free from government interference."

"In light of the FCC initiating yet another attempt to regulate the Internet, upending long-standing precedent and imposing monopoly-era telephone rules and obligations on the 21st Century broadband marketplace, Congress must take action to put an end to this misguided regulatory proposal," said Latta on his website.

"At a time when the Internet economy is thriving and driving robust productivity and economic growth, it is reckless to suggest, let alone adopt, policies that threaten its success. Reclassification would heap 80 years of regulatory baggage on broadband providers, restricting their flexibility to innovate and placing them at the mercy of a government agency."

"These businesses thrive on dynamism and the ability to evolve quickly to shifting market and consumer forces. Subjecting them to bureaucratic red tape won't promote innovation, consumer welfare or the economy, and I encourage my House colleagues to support this legislation, so we can foster continued innovation and investment within the broadband marketplace."

As a practical matter, the bill is a promise that Title II reclassification would be resisted, and few want to resist that more than the leading ISPs in the US — cable operators.

The NCTA quickly issued a statement praising Latta's bill: "Since the late 1990s, policymakers and regulators have established a bipartisan consensus that a light regulatory touch provides the best path for ensuring that the Internet will become an engine of economic growth and social prosperity. We support the efforts of Vice Chairman Latta to codify current policy and to ensure that the Internet continues to grow and remains open and free from the burdens of outdated, public utility regulation."

"Light regulatory touch" has become an NCTA mantra, but in his remarks at the recent Cable Show, Wheeler told cable operators that the regulatory constraints they are currently subject to are not just "light" — they're "barely discernible."

Tech Firms Press US Congress to Bolster Privacy Protection

Excerpted from Business Standard Report by Elena Schneider

A law that allows the government to read email and cloud-stored data over six months old without a search warrant is under attack from technology companies, trade associations, and lobbying groups, which are pressing US Congress to tighten privacy protections.

Federal investigators have used the law to view content hosted by third-party providers for civil and criminal lawsuits, in some cases without giving notice to the individual being investigated. Nearly 30 years after US Congress passed the law, the Electronic Communications Privacy Act (ECPA), which government officials have interpreted to cover newer technologies, cloud computing companies are scrambling to reassure their customers, and some clients are taking their business to other countries. 

Ben Young, the General Counsel for Peer 1, a web hosting company based in Vancouver, British Columbia, said his customers were keeping their business out of the US because the country "has a serious branding problem." "We've enjoyed a competitive advantage in Canada," he said, "because the public perception in the business community is that American law enforcement has more access to data than in other parts of the world." 

Places such as Germany, Iceland, and Switzerland are trading on a reputation of stronger protections for companies, but such safeguards are not universally tighter than those in the United States. "Some countries are stricter on privacy, and some of them are not," said Mark Jaycox, a legislative analyst at the Electronic Frontier Foundation (EFF), a technology advocacy group. 

Privacy has been an increasing concern since Edward J Snowden's revelations last year about bulk data collection by the National Security Agency (NSA), but an overhaul of the ECPA has failed to break into the national conversation. 

"Because it's not sexy," said Katie McAuliffe, the Executive Director for Digital Liberty at Americans for Tax Reform. 

The US' image problem has caused "real, tangible harm" for businesses, said Christian Dawson, the Chief Operating Officer at ServInt, a web hosting company based in Reston, VA. "It's very easy for providers outside the country to say, 'Hey, move your business offshore into an area that cares more about your privacy.' They don't have better laws necessarily. They have a better marketing department." 

Silicon Valley giants like Facebook, Twitter, and Google say they will no longer hand over their customers' data without a search warrant. But smaller web hosting and cloud computing companies may be outmuscled by law enforcement officials as they try to protect their customers, said Ron Yokubaitis, the co-chief executive of Data Foundry, a data center company based in Texas. "Mostly, they are going to comply because they don't know their rights or can't spend the money to resist," he said. 

A coalition of technology companies, trade associations and lobbying groups, called Digital Due Process (DDP), is pushing Congress to bolster privacy rules. Bipartisan bills in the House and the Senate have brought together a hodgepodge of supporters, including liberals and Tea Party favorites. 

Senator Mike Lee, Republican of Utah, co-sponsored the Senate bill. He said in a recent interview that "like most Americans," he was shocked to find that the 1986 statute was on the books. "Almost every American thinks that it is frightening that we have a law that suggests that the government has the right to read your email after only 180 days," Lee said. "It's an easy issue in which to achieve bipartisan compromise and consensus." 

The bill would require a search warrant for access to electronic communications, with exceptions for some emergency situations. It would also require the government to notify individuals within 10 days that their information was being investigated. 

However, it does not address rules for location data, like GPS information from an individual's cellphone. The Senate Judiciary Committee approved the bill a year ago, but it has since stalled. 

One reason is resistance from federal investigating agencies that use subpoenas to gain access to electronic communications in civil cases, particularly the Securities and Exchange Commission (SEC). "The SEC cannot get a search warrant, so a bill that requires a warrant to obtain emails from an ISP would undermine the SEC's ability to protect American investors and hold wrongdoers accountable," said Andrew Ceresney, the Director of the Division of Enforcement at the SEC, referring to Internet service providers. Instead, the SEC would have to rely on an individual's voluntary disclosure of digital content. 

But some legal experts, and at least one appeals court, do not find that argument compelling. "The courts say that email on a server somewhere is like email in your virtual home," said Orin Kerr, a Professor at George Washington University Law School. "We wouldn't say the SEC should have the power to tell your landlord to break into your apartment and get evidence. The same rule should apply." 

The United States Court of Appeals for the Sixth Circuit, in Cincinnati, ruled in 2010 that part of the ECPA was unconstitutional. Since the decision, most major technology companies have required a search warrant for customers' content.

FedRAMP Cloud Standards Deadline: What Comes After

Excerpted from InformationWeek Report by Brian Burns

The June 5th deadline for federal agencies to certify their cloud operations is just the beginning. Agencies must still stay on top of daily cloud security.

Back in 2011, when the Office of Management and Budget (OMB) set June 5, 2014, as the deadline for agencies and cloud service providers to meet a new set of cloud security standards called FedRAMP, government agencies had only just begun creating plans to migrate to these cloud platforms.

At the time, FedRAMP -- the Federal Risk and Authorization Management Program -- was still evolving, but it at least prompted agencies to start thinking about cloud security and keep it in the forefront of their tech decision-making.

But thinking and planning are far different from actually executing -- and that is where we are now with June 5 approaching. That's raising lots of questions: Who will be ready? Is there enough time to get ready? What does FedRAMP compliance actually mean? Why does it matter? The short answer: It depends. Here's why.

Because no two cloud service providers (CSPs) offer the exact same product or service -- and given the risk of standing up an application within a non-FedRAMP cloud -- government agencies have turned to systems integrators for help. They can identify the CSPs best qualified to meet their needs for migration -- and for managing daily service operations, which is an extremely important part of the successful deployment.

FedRAMP compliance primarily guarantees that the CSP's infrastructure, from the physical data center through and including the hypervisor, is secure and meets a specific set of standards. Think of this as the securing of the cloud. What's not included in these standards is securing within the cloud.

What securing within the cloud means is designing, deploying, and managing the specific security controls crafted around the agency's applications. This can include patching operating systems, setting up firewalls, intrusion protection and detection, anti-virus and anti-malware software, and connecting external agency networks such as NIPRNet and SIPRNet, as well as the remediation of potential security threats within the cloud, actual breaches, or both.
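
To make one of those tasks concrete, here is a minimal sketch, assuming an AWS-hosted application and the boto library, of the firewall step: allowing only HTTPS into an application tier from an internal address range. The group name and CIDR block are illustrative assumptions, not taken from any agency deployment.

    # Create a security group that admits only HTTPS from an internal range.
    import boto.ec2

    conn = boto.ec2.connect_to_region('us-east-1')
    sg = conn.create_security_group('agency-app-sg',
                                    'HTTPS-only access to the app tier')
    sg.authorize(ip_protocol='tcp', from_port=443, to_port=443,
                 cidr_ip='10.0.0.0/8')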

The responsibility for these types of operating issues typically belongs to the agency, or the systems integrator managing the application for the agency. That responsibility is sometimes referred to as the missing link in the cloud. Moving an application to a FedRAMP-compliant cloud does not alleviate the ongoing daily management responsibilities. If anything, moving to a cloud-based solution means accepting more responsibility for the security of the applications.

When June 5 rolls around, if any agency's CSP is not FedRAMP-certified, that agency is taking a big risk. An agency's IT leaders can opt to obtain a waiver, if they have reason to take that step. But there is a real possibility the agency might be denied the waiver, meaning it would not receive authority to operate an application in the cloud service.

According to GSA's FedRAMP website, as of May 16, 2014, there were 11 FedRAMP-certified cloud services available for government agencies to select from. There are more than 20 additional CSPs close to being granted authority to operate, and even more CSPs in the queue waiting to go through the certification process. That's remarkable when one considers these CSPs made the investment to deploy cloud services capable of meeting FedRAMP's rigorous controls in a span of just two years.

How well agencies meet the spirit, if not the intent, of the deadline remains an open question. Given where agencies stood when FedRAMP was first conceived, there's little question agencies are better prepared to move to the cloud today than they might have been without FedRAMP.

The larger question to ask, though, is will FedRAMP be the bridge to help rebuild citizen confidence in government computing and technology deployments? The answer to that still lies in the clouds.

Octoshape & Mobibase Team on OTT Content Distribution

Excerpted from Rapid TV News Report by Michelle Clancy

Octoshape and Mobibase have teamed up to provide mobile publishers, Internet protocol television (IPTV) providers, and over-the-top (OTT) broadcasters with a consumer-focused content distribution service delivered to connected devices.

The global joint offering combines the Mobibase rich media library, which includes more than 200 themed channels and 15,000 on-demand videos, together with Octoshape's Infinite HD-M OTT distribution platform to offer programmers and OTT service providers increased content on a per subscriber, per channel basis. 

With the integrated service, existing and new Octoshape customers will be able to subscribe to tailored, linear, and on-demand content. Existing and new Mobibase customers will be able to offer an increased quality of service to their end customers by delivering a broadcast-TV-like viewing experience across broadband-connected devices.

"Our partnership with Octoshape accelerates the move from broadcast distribution to broadband distribution," said Vincent Roger, CEO of Mobibase. "We can now offer a modular and unique pricing model that includes content, asset management, set-top boxes and distribution across public networks for a fixed fee, per subscriber, per month."

Telefonica Building NFV Reference Platform 

Excerpted from Light Reading Report by Mitch Wagner

Telefonica is developing a network functions virtualization (NFV) reference platform and lab in conjunction with Red Hat and Intel as part of a two-year roadmap for deploying NFV internationally.

Telefonica SA is looking to deploy NFV "as soon as possible," says Enrique Algaba, Network Innovation and Virtualization Director for its R&D arm, Telefonica I+D. "We need it instantly."

The company needs NFV to shorten time to market for new services, reduce costs, reuse infrastructure, and avoid vendor lock-in, Algaba says. This year, it plans to deploy an NFV infrastructure, and it is testing customer premises equipment virtualization and moving customer premises hardware functions into the network. It also plans to launch a commercial virtual CPE service this year.

Telefonica is running proof-of-concept trials for virtual private clouds, IP Multimedia Subsystems, and remote access services, and it hopes to deploy those technologies by next year. "We need to check on whether the industry is mature enough to provide these new network functions."

Part of the testing process is measuring business value. "We will virtualize if these new functions bring us more benefits than the solution we have today," Algaba says.

As part of the road to NFV, Telefonica announced this week that it is working with Red Hat and Intel on developing a virtual infrastructure management platform based on open-source software running on standard Intel servers. The platform is part of Telefonica's recently created NFV Reference Lab, designed to help partners and network equipment providers test and develop virtual network functions and orchestration services.

The NFV reference platform will be based on the Intel Xeon processor E5-2600 v2, Red Hat Enterprise Linux, a Kernel-based Virtual Machine (KVM) hypervisor, Red Hat Enterprise Linux OpenStack Platform, and OpenFlow-enabled switching. Telefonica, Red Hat, and Intel will contribute engineering and testing, collaborating with partners and the open-source community.
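
As a rough sketch of the kind of call such a platform builds on, the snippet below boots a virtual network function as an OpenStack instance using python-novaclient; the credentials, image, and flavor names are invented for illustration and are not details of Telefonica's lab.

    # Boot a VNF (here, a virtual CPE) as an instance on an OpenStack cloud.
    from novaclient import client

    nova = client.Client('2', 'demo-user', 'secret', 'nfv-project',
                         'http://controller:5000/v2.0')
    server = nova.servers.create(
        name='vnf-vcpe-01',
        image=nova.images.find(name='vcpe-image'),
        flavor=nova.flavors.find(name='m1.medium'))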

The infrastructure will need to link with the current OSS/BSS systems to provision and manage networks.

Also this week, Telefonica, Red Hat, and Cyan announced that they are jointly developing orchestration software for precision VNF deployments -- deploying virtual network functions precisely in the datacenter, rack, or blade for best performance, rather than simply deploying to the cloud and allowing the cloud orchestrator to deploy the service automatically.

Telefonica disclosed its ambitious NFV plans in a closed-door briefing in February with global CTO Enrique Blanco, saying it will roll network virtualization out to its international operations starting this year in an initiative it calls UNICA. Telefonica is a founding member of the European Telecommunications Standards Institute NFV Industry Specification Group.

The company launched NFV trials in Brazil in October, following deployment of virtual home gateway functions with partner NEC. In late March, Telefonica described a series of pilot programs to implement SDN across its network, working with vendors such as Huawei, Infinera, Cisco Systems, and Juniper to achieve a more flexible, automatic networking provisioning system.

Huawei's Agile Data Center Cloud Connect Solution

Excerpted from Converge Network Digest Report

Huawei introduced its Agile Data Center Cloud Connect Solution that ties together its CloudEngine series of data center switches, the Huawei Agile Controller and cloud applications. 

The solution, which was launched at the Huawei Network Congress 2014 (HNC) held in Beijing this week, helps IT administrators to provision network resources. Each type of service can be represented by an independent application profile. 

The Agile Controller is capable of interpreting three types of perspectives: the application profile perspective, the logical network perspective, and the physical network perspective. The Agile Controller automatically converts application profiles into the required logical networks, and delivers the associated configurations to physical network devices, allowing network resources to be dynamically migrated or adjusted on-demand and based on service requirements. 

Huawei said it is actively building a cloud computing data center ecosystem. 

Its Cloud Connect Solution connects to VMware's vCloud cloud management platform and NSX network virtualization platform to provide automated network policy migration and VxLAN based hardware gateway solutions. 

"In 2012, we launched the Cloud Fabric Data Center Solution and the industry's highest performance data center switches, the CE12800 series, allowing us to build scalable, virtualized, and open cloud data center networks for customers. To date, more than 360 global customers have implemented the Cloud Fabric solution and around 1,800 CE12800 switches have been deployed in cloud computing data centers", said Mr. Liu. 

"Today, we've introduced the Agile Data Center Cloud Connect Solution and we want to work with our partners to build a fully integrated cloud service system. The solution will integrate network, compute, and store resources in data centers to unify the virtual and physical network worlds, implementing multi-cloud connectivity and cloud-based network automation to make cloud computing simpler."

In May 2013, Huawei introduced its CE12816 CloudEngine (CE) switch for the data center core, boasting a 64 Tbps total capacity. The new switch uses Huawei's next-generation VRP8 software to deliver high-performance L2/L3 switching capabilities. Like all the switches in Huawei's CloudEngine 12800 family, the CE12816 provides support for 1, 10, 40, and 100 GE connectivity. Densities supported on the CE12816 include up to 192 100GE, 384 40GE, or 1,536 10GE line-speed ports.

The CloudEngine series provides high bandwidth of up to 2 Tbps per slot (scalable to 4 Tbps) and switching capacity of 64 Tbps. Huawei's CloudEngine series incorporates a Cluster Switch System (CSS) feature to virtualize multiple switches into one logical switch, as well as the Virtual System (VS) feature to virtualize one switch into multiple independent logical devices.

Huawei said its CSS and VS capabilities turn the network into a resource pool, allowing network resources to be allocated on demand. The CloudEngine series also supports virtual machines by allowing network administrators to build large-scale Layer 2 networks with over 500 nodes based on TRILL, allowing for fast migration and flexible service deployment. Combined with the nCenter network management system, the CloudEngine series is able to achieve over 10 times the virtual parallel processing capability of the industry average.

CenturyLink Plans New Technology Center

Excerpted from the News Star Report by Greg Hilburn

Glen Post spent much of his address to CenturyLink's shareholders Wednesday talking about the Monroe, LA-based company's future.

"We believe we're well-positioned," CenturyLink's chief executive said during the annual shareholders meeting at the company headquarters on US 165.

And one of the reasons is the company's new facility that will soon share the headquarters' campus, where Post said innovation and new technology will be created and tested.

"We're excited about getting it complete," Post said. "We believe we can attract the top technical talent in the world. Vendors and employees will be testing the latest technology in our labs and network operations center."

Construction on the 300,000-square-foot CenturyLink Technology Center of Excellence will be completed late this year, and the facility will open in early 2015.

The addition will also house 800 new employees for CenturyLink, which is the largest company based in Louisiana.

"It's an exciting time for the company and our employees and shareholders," Post said following the meeting.

And most of the shareholders who attended the meeting agreed.

Bob Garst Jr., whose family is the largest non-institutional stockholder with about 1 million shares, said he's "told everybody to buy all of the shares they can. We haven't sold any and we're buying more," he said.

"If you're asking if we have confidence in the company, you better believe it," he added.

Members of the Garst family, who sold their Webster County (Missouri) Telephone Company to CenturyLink in 1981, have attended 33 straight shareholder meetings. Family patriarch Bob Garst Sr., who had been a fixture at the annual shareholders meeting for decades, died September 2nd at 93.

"Dad loved coming to Monroe and meeting with the leadership," Bob Garst Jr. said.

Harvey Perry of Monroe, a director on CenturyLink's board, said the board "has great confidence in the company's leadership and direction moving forward. It's been amazing to see the evolution of this company."

CenturyLink employee Douglas Schmidt brought his sons Stephen, 9, and Christopher, 14, to the meeting.

"They're both stockholders," Douglas Schmidt said. "Chris is working on a Boy Scout merit badge learning about stocks and bonds so I felt like this would be a great experience for him."

Christopher Schmidt said he would pay close attention to Post's address.

"I believe in the company, and I'm confident we'll see increased growth in the future," Schmidt said.

Government Cloud Use Hits Inflection Point

Excerpted from InformationWeek Report by Michael Biddick

New standards, security, and architectures mean the Cloud First stars are finally coming into alignment.

Sometimes we're our own worst enemies. When White House officials announced the Cloud First mandate in 2010, it created big expectations. Cloud would help rein in the $80 billion (and growing) federal IT budget while delivering efficiency and reuse and off-loading repetitive tasks from federal staff. Cloud vendors leaped into action, spending millions developing new offerings.

Only a few agencies grabbed the ball and ran. The General Services Administration (GSA) is a prime example, having moved email and other apps to the cloud, and it has reaped rewards. The National Oceanic and Atmospheric Administration (NOAA), the Department of Agriculture (DoA), and most recently the Interior Department also have forged ahead to the cloud, especially for public-facing websites and data. The Federal Energy Regulatory Commission is modernizing a decade-old eLibrary application by moving it to the cloud.

But too many federal IT teams resisted and, in the process, shot themselves in the foot as budgets got tighter and working conditions more strained. We're not surprised that about half of the 532 federal government IT professionals responding to InformationWeek's 2014 US IT Salary Survey are looking for new jobs in the wake of a three-year salary freeze. Maybe if cloud adoption hadn't been confined to the low-hanging fruit of email and similarly easy-to-convert systems, CIOs could have freed up money to invest in human capital.

It's not all the agency leaders' fault. They've been handicapped by slow progress in acquisition reform, thorny legal issues surrounding data ownership, and privacy concerns. As happens so often in Washington, a daunting list of policy challenges, armies of lawyers, and stifling bureaucracy beat sparks of progress into submission.

However, while cloud standards are by no means complete, there's enough progress that this is a great time to start or revive a cloud project. As they do so, agency leaders must make sure they address the biggest barriers to cloud implementation: a jumbled vendor landscape, the reality that "hybrid" clouds might just create new information silos, and the fear of getting locked in or surrendering control.

The federal cloud computing market is fiercely competitive, with significant IT infrastructure capacity chasing relatively few federal buyers. We see two primary types of cloud vendors: call them contemporary cloud providers and traditional government contractors.

In the contemporary cloud provider category, think Amazon, Microsoft, and Google. They emerged out of the commercial market and still mainly target the private sector, but they're retrofitting their environments to meet federal security frameworks. Their strengths tend to be granular pay-for-use, immediate availability of resources, and rock-bottom commodity pricing. It's easy to grab a credit card, sign up for a service, and go. Unfortunately, this pay-and-play experience doesn't transfer to the federal market because of governance, acquisition, and security requirements.

The second group, which emerged from the traditional federal-managed service-provider pool, includes providers that typically custom build systems against detailed specs with strict physical and logical security zones. Lockheed Martin, CGI, American Systems, Hewlett-Packard, and IBM fall into this group; they might offer something as a cloud service that incorporates only some aspects of cloud computing, such as virtualization. You're not getting many pricing wars. They do, however, understand their customer base. These providers are working to make their features more similar to commercial contemporary cloud environments in an effort to be more competitive.

Amazon Web Services is often held up as the gold standard for infrastructure-as-a-service features and capabilities. Microsoft and Google are fiercely battling for the office-automation software-as-a-service market. Meanwhile, traditional hardware vendors including NetApp, Dell, EMC, IBM, and HP are advocating that agencies simply modernize their existing data centers, making them more "cloud-like," and avoid the security and control concerns surrounding public cloud altogether.

And then there's a third category: the government provider. Agencies such as the Defense Information Systems Agency, the Navy's SPAWAR Command, and the Treasury and Health and Human Services are making their cloud services available to other departments and might strike interagency agreements to bypass complex procurement challenges. Most haven't figured out how to bill based on usage, but that will come in time. Sharing is the future -- the Office of Management and Budget's (OMB's) Digital Government Strategy demands that agencies function more like data service providers.

But most of the 155 federal government technology professionals responding to our InformationWeek 2013 Federal Government IT Priorities Survey didn't get the memo. Cyber security and disaster recovery lead the list of 32 priorities, interagency collaboration lands in 13th place, and cloud is way down at No. 21.

Federal IT managers have to forge smart partnerships, and there are no easy answers. Each agency and department needs to develop a strategy based on its applications, data, and infrastructure requirements. Generally, for commodity IT resources and applications, the public cloud makes sense. For sensitive and data-driven applications, agencies might need to use on-premise data centers, but make sure they're highly efficient and optimized to take advantage of larger pools of IT resources.

Cloud Computing, Big Data, and Healthcare IT: The Trifecta

Excerpted from Wired Report by Sarah McMullin

When my daughter was born, she was given all the standard tests, pricks, and prods given a newborn, and I was sent on my way with a stack of paperwork and records. I was informed that the state of Texas would keep track of her immunizations in their database, but there was also a small slip of paper, a lab slip, for me to bring to our first appointment with her pediatrician. This lab slip ordered a follow up blood test, standard procedure in the state to check for certain disorders and conditions. We went to that appointment and the pediatrician informed me that her office didn't have a lab, so I needed to take the slip to another facility.

My daughter is now three, and I still don't know what happened to that stupid lab slip. As a sleep-deprived mother of a newborn I was expected to cart around one small slip of paper, the size of an index card, from location to location, call the lab to set an appointment, then call the office to get the results. Happily, my daughter saw another doctor later who did the test in office and everything came back clear, but as a dazed mother, freshly home from the hospital, it was clear the system had a gaping hole for human error.

This hole, handing slips to patients and pharmacists and practitioners, is not just inconvenient, it can be deadly. In February of 2012 a British man died of an allergic reaction to penicillin "because a sticky note was covering a warning in his drug records." A little slip of paper was the difference between life and death.

So what is the solution to this plague of papers clogging our healthcare arteries? While perhaps there is no perfect solution, the market is bringing two ideas together in a way that can drastically reduce mistakes, improve outcomes, and cut costs. Those two concepts are cloud computing and big data.

The last dentist I saw did all my x-rays digitally, and when another specialist needed to see inside my teeth he simply opened the secure digital files from my dentist. As a consumer, the convenience was great but even more important was saving a few hundred dollars on repeat, redundant x-rays. If, when my daughter was born, her medical records were all kept electronically on a secure cloud, I wouldn't have had a lab order slip to lose. Instead of handing over a necessary, tiny piece of paper, doctors have the ability to access patient instructions, send lab requests straight to the lab through secure connections, and take out that one point of human error. Extrapolate that over the entire medical, dental, and pharmaceutical industry and the potential cost savings are astronomical, the potential for error reduction, spectacular.

The cloud is a game changer for several reasons. First, cloud computing allows easy access to information. Potential life-threatening allergies can be flagged in bright red from iPad to Android device, from the hospital to the care facility, assuring that sticky notes aren't impeding communication of life-saving facts. Second, the cloud lowers the barrier to entry for smaller entities. Whether a practice owns a thousand wireless devices or two, the data can be accessed using the same interface.

Equally exciting to the healthcare industry is the possibility of big data being used to improve patient care outcomes. Regulatory agencies are increasingly asking institutions to utilize the power of big data to reconcile patient medication history. This reconciliation stands to reduce dangerous medication interactions as well as identify issues in effectiveness. Raw data by the terabyte, through robust technology and wise analytics, can identify trends that would otherwise be invisible or at least hard to track. The bigger the data the better. Consider the possible public health ramifications of hospitals being able to identify, in real time, patients with a highly contagious illness walking through their doors. Previously undiscovered negative drug interactions could be identified almost immediately if big data is properly mined and managed.
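
As a toy illustration of that reconciliation idea, and emphatically not any hospital's actual system, the sketch below pools a few invented medication records and counts drug pairs that co-occur in the same patients -- the kind of co-occurrence signal that interaction mining starts from.

    # Count drug pairs that co-occur within the same patient's records.
    import pandas as pd

    records = pd.DataFrame({
        'patient': [1, 1, 2, 2, 3],
        'drug': ['warfarin', 'aspirin', 'warfarin', 'aspirin', 'warfarin'],
    })
    pairs = records.merge(records, on='patient')
    pairs = pairs[pairs['drug_x'] < pairs['drug_y']]  # each pair counted once
    print(pairs.groupby(['drug_x', 'drug_y']).size())
    # (aspirin, warfarin): 2 -- a candidate pair for an interaction review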

Perhaps the greatest hurdle for healthcare IT to overcome is the creation of a meaningful way to combine cloud storage and access with big data in a way that is intuitive and useful. Ease of use assures that data is easy to input and share for every person in the chain of health information, regardless of tech skill, an issue faced by doctors and practitioners being asked to adopt and implement new technologies and best practices without training. For decades they have been trained to hand a slip of paper to a patient needing lab work, and even though a digital request is faster and more secure, the movement toward digitization in the healthcare industry has been slow. For the transformation to be successful, the industry needs not just technology but technology that is easy to use, unquestionably secure, and affordable to implement. Because the core of healthcare is people driven, the core of healthcare IT must also be people driven.

Healthcare IT may be slow to change, but as big data and cloud computing continue to grow in ubiquity, the change will continue its inexorable march forward. Combining these two ideas will lead to fantastic increases in efficiency and improved patient outcomes so long as the technology developed is created with usability and ease of implementation in mind.

Now, if only I could find that lab slip.

Global Healthcare Information System Market Forecast to 2019

Research and Markets has announced the addition of the "Global Healthcare Information System (Application Based - Hospital and Pharmacy Information System and Laboratory Information System Delivery Based - Web Based, On-premise and Cloud Based, Software and Services) Market Forecast to 2019" report to their offering. 

A healthcare information system is an extensive integrated system which captures, stores, manages and transmits information related to the health of individuals or the activities of organizations that work within the healthcare sector. 

Globally, the increase in the aging population is playing a major role in boosting demand for healthcare information systems. Older people have weaker regenerative abilities and are more prone to disease and sickness.

Some of the key driving factors for the healthcare information system market are the aging population, rising healthcare costs, rising government initiatives, the rising need for integrated healthcare systems, and rising investments by healthcare IT players.

However, the market faces some restraints, such as a lack of experienced professionals, high maintenance and service expenses, and interoperability issues. North America has the largest healthcare information system market, and Asia is the fastest growing.

Some of the fastest growing markets for healthcare information system are China, India, Japan and the US. Adoption of wireless and cloud computing is constantly on the rise, which is resulting in reduction in operational costs. 

For instance, about 2.8 million patients worldwide used home health monitoring systems in 2012, and the growth rate for home health systems is projected to reach 26.9% in the near future. Similarly, about 5.7 million patients are expected to be monitored with a wireless medical device by 2014. Hospital information systems are the largest application segment in the healthcare information system market and are expected to grow at a CAGR of about 6.9% during 2013-2019.

Based on delivery mode, the healthcare information system market can be classified into web based technology, on-premise technology and cloud based technology. In the segment of geographic analysis, the report identifies and analyzes market sizes and forecast of North America, Europe, Asia Pacific and Rest of the World (RoW). North America region covers the scenario of the US. Europe region covers the scenario of France, Germany and the UK. Asia Pacific region highlights the scenario of India, China and Japan. 

GE Healthcare is the leading player in the hospital information system market. Other major players of healthcare information system market include Philips Healthcare, McKesson Corporation and others.

Mobile Cloud Computing Is on the Rise

Excerpted from Rickscloud Report by Rick Blaisdell

The widespread adoption of cloud computing and mobile is changing our lives, the way we do business and how we handle our day-to-day chores. In many ways, mobility and cloud computing play a significant role from both a consumer and enterprise user standpoint.

While mobility provides worldwide customer experience allowing users to access feature rich applications on the go, cloud provides a robust, scalable and reliable platform to host the data and applications. Basically, cloud with its highly scalable and economical model forms an effective backend for mobile devices and apps for their anytime, anywhere access.

Actually, several instances of such applications can be found today that rely on the cloud in the backend and provide an intuitive user experience on mobile devices: Dropbox for storage, Amazon and iBooks for literature, Pandora for music, Flickr and Instagram for photographs, YouTube for videos, and Google Docs, Office 365, and SkyDrive for document editing and storage.

According to a study from Juniper Research, the cloud-based mobile market will generate annual revenue of $9.5 billion in 2014, up from $400 million in 2009, an average annual increase of 88%. With this in mind, let's take a look at the factors that are fostering the adoption of mobile cloud computing:

Trends and demands - customers expect the convenience of using a company website or application from anywhere and at any time, and mobile devices provide this convenience. Enterprise users require always-on access to business applications and collaborative services so that they can increase their productivity from anywhere, even when commuting.

Improved and increased broadband coverage - 3G, 4G, and LTE, along with WiFi, femtocells, fixed wireless, etc., are providing better connectivity for mobile devices.

Enabling technologies - HTML5, CSS3, hypervisor for mobile devices, cloudlets and Web 4.0.

With the increasing use of smartphones, mobile applications are storing data in the cloud rather than on the device, and the applications become more powerful as processing is also offloaded to the cloud.

Of course, with developments in technology come potential problems. The lack of speedy mobile Internet access everywhere is the primary one. Depending on your coverage, you could be stuck in an area where 3G is a bit spotty, which leads to slow connection speeds. Battery life is also an annoying problem. So, although significant research and development is under way in the mobile cloud computing industry, the following domains still need further research:

Architectural issues - Reference architecture for a heterogeneous environment is a crucial requirement for unleashing the power of mobile computing towards unrestricted computing.

Mobile communication congestion - Mobile data traffic is rising tremendously as mobile users increasingly draw on cloud resources, which puts pressure on mobile network operators and demands future efforts to enable smooth communication between mobile and cloud endpoints.

Energy-efficient transmission - Mobile cloud computing requires frequent transmissions between cloud platforms and mobile devices, and the unreliable nature of wireless networks means the transmission protocol should be carefully designed.

Context-awareness issues - Context-aware and socially-aware computing are inseparable attributes of current computers. To achieve the vision of mobile computing among heterogeneous converged networks and computing devices, designing resource-efficient, environment-aware applications is an essential need.

Trust, security, and privacy issues - Trust is an essential factor for the success of the evolving mobile cloud computing industry.

Mobile cloud computing is gaining steam, and enterprise mobility solutions are mostly driven by how they can provide the business with agility, enhanced productivity, and a platform for innovation using cloud platforms. So, is your business ready?

Coming Events of Interest

2014 Akamai Government Forum — June 5th in Washington, DC. The 2014 Akamai Government Forum will present a new path forward for the future of cloud-based solutions and cybersecurity in federal government. The event will be produced by Government Executive Media Group, the government and defense division of Atlantic Media.

Enterprise Apps World — June 17th-18th in London, England. EAW is a two-day show, co-hosted with Cloud World Forum, that will look at all the implications of going mobile in the workplace and how enterprise apps can help.

Silicon Valley Innovation Summit — July 29th-30th in Mountain View, CA. AlwaysOn's 12th annual SVIS is a two-day executive gathering that highlights the significant economic, political, and commercial trends affecting the global technology industries. SVIS features the most innovative companies, eminent technologists, influential investors, and journalists in keynote presentations, panel debates, and private company CEO showcases.

International Conference on Internet and Distributed Computing Systems — September 22nd in Calabria, Italy. The IDCS 2014 conference is the sixth in its series to promote research in diverse fields related to Internet and distributed computing systems. The emergence of the web as a ubiquitous platform for innovations has laid the foundation for the rapid growth of the Internet.

CLOUD DEVELOPERS SUMMIT & EXPO 2014 — October 1st-2nd in Austin, TX. CDSE:2014 will feature co-located instructional workshops and conference sessions on six tracks facilitated by more than one-hundred industry leading speakers and world-class technical trainers.

International Conference on Cloud Computing Research & Innovation — October 29th-30th in Singapore. ICCRI:2014 covers a wide range of research interests and innovative applications in cloud computing and related topics. The unique mix of R&D, end-user, and industry audience members promises interesting discussion, networking, and business opportunities in translational research & development.

Copyright 2008 Distributed Computing Industry Association
This page last updated June 8, 2014
Privacy Policy