Distributed Computing Industry
Weekly Newsletter

In This Issue

Partners & Sponsors

ABI Research

Aspera

Citrix

Oracle

Savvis

SoftServe

TransLattice

Vasco

Cloud News

CloudCoverTV

P2P Safety

Clouderati

eCLOUD

fCLOUD

gCLOUD

hCLOUD

mCLOUD

Industry News

Data Bank

Techno Features

Anti-Piracy

March 17, 2014
Volume XLVII, Issue 6


How to Get in Shape for hCLOUD Growth?

Annual spending on cloud solutions by the US healthcare sector will grow from $1.8 billion in 2013 to an astonishing $6.5 billion by 2018.

Driven by needs for greater efficiency, the healthcare cloud market is already growing at a 20% CAGR.

New products, government support for electronic medical records (EMR), computerized physician order entry (CPOE), clinical decision support (CDS), medical imaging information systems (MIIS), and pharmacy and laboratory information systems (PALS), and declining implementation costs for healthcare information technology (IT) are the new reality.

With so many opportunities, the question is — how to strengthen your company during this time of massive change?

Not to mention — how to prevent the horrific blunders that struck healthcare.gov?

CLOUD COMPUTING EAST 2014 (CCE:2014) delegates will come away with the tools needed to operate and thrive, and the tips needed to avoid pitfalls, after networking with leaders in this space.

And it will start with basics from the top players in equipment, software, and networking. Create it, use it, transport it.

These three actions form the basis of the hCLOUD, and Dell's Enterprise Technology Strategist Dennis Smith, Microsoft's Technical Evangelist Yung Chou, and Verizon's Distinguished Member of Technical Staff Igor Kantor will lead related plenary and track sessions at CCE:2014.

The CCA & DCIA will present CCE:2014 on May 15th and 16th at the Doubletree by Hilton Hotel in Washington, DC.

Please click here to register, here to learn about exhibiting and sponsorship opportunities, and here to apply to speak at this must-attend strategic business summit.

Dennis Smith is Enterprise Technology Strategist at Dell. As enterprise liaison for Dell, Dennis excels in tech marketing for Dell Server, Networking, Storage, and Virtualization solutions. His experience extends to online communications including public/private resolution of social media and related platforms. Dennis is focused on enterprise networking, server, and storage for large organizations. He is also responsible for customer interaction on DellTechCenter.com and the enterprise forums of the Dell Community Site.

Yung Chou is the Technical Evangelist for Microsoft's US Developer and Platform Evangelism team. Prior to Microsoft, Yung had senior technology roles that demonstrated his capabilities in system programming, application development, consulting services, and IT management. His recent technical focuses have been in virtualization and cloud computing with strong interests in private cloud with service-based deployment and emerging enterprise computing architecture. He is a frequent speaker at Microsoft conferences.

Igor Kantor is Distinguished Member of the Technical Staff at Verizon Communications, with responsibilities for large-scale, distributed system architecture, cloud-based, self-healing system engineering, SDN, and agile/continuous delivery software methodologies. His work entails globally replicated storage (Gluster/Ceph), linearly scalable NoSQL databases (Cassandra, Mongo, Couch), and transcoding optimization. Previously he served for five years in senior technical roles at Deloitte Consulting.

Health Care Revs Up for Big Drive into the Cloud

Excerpted from InfoWorld Report by David Linthicum

According to a report from MarketsandMarkets, we're looking at a cloud computing market for North American health care of about $6.5 billion by 2018, up from $1.8 billion in 2013. The predicted growth is no surprise, given the number of reforms in the Patient Protection and Affordable Care Act (PPACA), the changes in other laws including one protecting our private health data, emerging payment models, and so on.

The use of cloud computing provides those building and deploying healthcare systems with new and more affordable options for creating highly responsive and scalable systems. The use of cloud computing also means systems are more flexible and able to adapt to the changing market dynamics in healthcare.

Of course, cloud computing — particularly public cloud computing — is not a fit for every field. But considering the challenges in healthcare and medical providers' small IT budgets, providers and payers often have no choice but to use cloud computing.

Five years ago, this notion would seem strange to people in healthcare IT — in fact, they pushed back hardest against the use of public clouds. Their rationale involved the usual FUD around security and privacy. Today, most of those people have either changed their tune or moved on. The uses of cloud computing in healthcare simply make sense to those charged with keeping those systems both running and compliant.

What's unique about the cloud's projected market growth in healthcare is that the uptick is mostly driven by pure need, and not a desire to improve IT through the use of cloud-based resources.

That's a big difference from other industries, whose cloud motivations are less desperate. When you're small, you try to punch above your weight. When it comes to sophisticated IT today, healthcare is small.

Report from CEO Marty Lafferty

CLOUD COMPUTING EAST (CCE:2014), the upcoming strategic summit for business leaders and software developers in our nation's capital, is shaping up to be a must-attend event.

Plenary sessions will feature principal representatives of such leading organizations as Amazon Web Services, Google, Microsoft, and Verizon, providing delegates with a real insider's view of the latest issues in this space from those leading our industry's advancement.

CCE:2014 will focus on the gCLOUD (The Government Cloud) and the hCLOUD (The Healthcare Cloud).

Jointly presented by the DCIA & CCA, CCE:2014 will take place on Thursday and Friday, May 15th-16th, at the Doubletree by Hilton Hotel in Washington, DC.

This important gathering of thought leaders and first movers will thoroughly examine the current state of adoption and the outstanding challenges affecting two major and increasingly related sectors of the economy, whose principals are currently engaged in migrating to the cloud.

gCLOUD case studies will expose the truth about cloud computing and the public sector. What really happened when a local municipality tried to streamline operations by moving just a few basic functions to the cloud? Why was the FedRAMP experience of one major cloud provider with government bureaucracy such a total shocker?

gCLOUD speakers will include representatives of such organizations as ActiveState, ASG Software, Aspiryon, Clear Government Solutions, CyrusOne, Globe Inc., IBM, Kwaai Oak, NASA, NetApp, QinetiQ-NA, SAP America, Tech Equity, Unitas Global, V2Solutions, Virtustream, and WSO2.

hCLOUD sessions will range from revelations of the astonishing experience of a medical imaging company new to this arena to a generous sharing of the deep wisdom from a patient-records-storing firm that was doing cloud computing before the name cloud was even coined.

hCLOUD speakers will include representatives of such organizations as Apptix, AVOA, BrightLine, CSC Leasing, Dell, DICOM Grid, DST, Johnson & Johnson, Level 3, Mobily, MultiPlan, NTP Software, Optum, Oracle, The PADEM Group, ServerCentral, SYSNET Intl., SoftServe, Stratus Technologies, VeriStor, Vuzix, and WikiPay.

Other featured speakers will include authors, analysts, industry observers, and representatives of such organizations as Aspera, CDAS, Edwards Wildman Palmer, Expedient, FalconStor, Hewlett-Packard, IBM, Intuit, Juniper Networks, M*Modal, MarkLogic, Numecent, Rackspace, SAP America, SOA Software, Trend Micro, Trilogy Global Advisors, and Visionary Integration Professionals.

The gCLOUD will examine the ways that local, state, and federal governments can improve services and protect citizens with cloud-based tools. It will also address liabilities and challenges that need to concern government agencies regarding cloud-based services, countering NSA-fallout gloom with energized and confident approaches that overcome concerns raised by the Snowden scandal.

The explosion of data, advances in security and reliability, and options for redundant storage; challenges to natural resource management, transportation, and utility grid monitoring; and the impact of cloud services on law enforcement and emergency responsiveness will be featured topics.

The hCLOUD will explore progress being made by the healthcare industry in adopting cloud-based solutions to become more efficient, collaborative, and interactively connected. It will also address legitimate concerns that healthcare organizations must address in implementing cloud-based services.

Managing private patient records; collecting clinical research data; big-data imaging, and remote patient monitoring will be covered.

Speakers will include end-user organizations, public-sector thought-leaders, and private-sector cloud vendors, representatives of hospitals, clinics, multi-physician practices, and healthcare solutions providers, and executives and innovators from the cloud computing industry.

Please contact Don Buford, CEO, or Hank Woji, VP Business Development, at the CCA to learn more about attractive conference exhibition and sponsorship opportunities.

To review conference topics and apply to join the speaking faculty for this event, please click here. If you'd like to speak at this major industry event, please contact me, Marty Lafferty, CEO of the DCIA, at your earliest convenience. Share wisely, and take care.

Cloud Computing and HIPAA: Achieving Compliance

Excerpted from Tech Page One by Dennis Smith

The first stages of cloud computing have been successfully adopted by individual users and corporations alike. Service providers are now shifting to create industry-specific structures for their clients, and the healthcare industry is a primary target, due to regulations that can't be met in standard public cloud frameworks.

Here are some reasons why healthcare providers are increasingly utilizing cloud computing for their patient information resources:

The healthcare industry requires secure clouds, which provide increased security, flexibility and accessibility. Regulations such as the Health Insurance Portability and Accountability Act and the Health Information Technology for Economic and Clinical Health Act lay out very specific requirements for anyone wishing to adopt a virtualized system.

New HIPAA guidelines are placing even greater pressure on healthcare cloud providers. The recent HIPAA Omnibus Rule, which took effect in September, holds all parties accountable for data breaches that threaten online patient records. If a patient's Protected Health Information (PHI) is leaked to an unauthorized party, the healthcare provider and the cloud provider are both held liable. It is thus essential that all parties are on the same page when migrating confidential patient information to the cloud.

Having a secure cloud-based infrastructure can streamline the patient information process, utilizing Electronic Medical Records (EMR) and Electronic Practice Management (EPM) systems to provide flexibility and efficiency for IT operations.

When it comes to critical data, secure and reliable handling is crucial for both patients and healthcare providers. Healthcare organizations that have successfully moved to a cloud infrastructure ultimately improve the overall experience of the patient, while providing monetary benefits to the organization.

Here are some case studies of healthcare providers leveraging the cloud:

Moses-Ludington Hospital faced a dire situation when their archives crashed and their tape-based backup system malfunctioned. They installed an advanced Picture Archiving and Communications System, but it didn't have sufficient backup functionality. They eventually opted for a unified solution which provided offsite archiving capabilities.

The Daughters of Charity Health System faced an issue well known to many health care organizations: the combination of paper and IT-based processes. Patient data was recorded on paper charts, various applications were used between different departments, and different formats were used to store information. They opted for an EMR, which provided single-sign-on for 200 applications, virtualized desktops, and the ability to implement a bring-your-own-device (BYOD) policy.

A sophisticated healthcare BYOD policy allows physicians to access data anywhere, without having to contact a hospital. Healthcare providers are thus able to focus on their core business, rather than worry about optimizing their IT operations.

The need for industry-specific clouds is strong, and healthcare providers are leading the way. MarketsandMarkets predicts that the global healthcare cloud computing market could be worth $5.4 billion by 2017.

Cloud Computing Priorities for Life Science Companies

Excerpted from MedCity Report by Stephanie Baum

A new IMS Institute for Healthcare Informatics report predicts that some of the largest life science companies need to cut $36 billion in operating costs through 2017. Cloud computing offers an attractive way to reduce costs and make it easier to transmit and receive information. But what areas are getting attention now, and where do the growth opportunities exist? A survey of 70 life science companies highlighted areas where they are tapping the cloud across departments to make multichannel marketing easier and to develop better care plans.

About 74 percent of respondents said they are looking to derive greater value from healthcare information that includes de-identified electronic medical records and other real-world data. The priorities are customer relationship management, social media and integrated multichannel marketing solutions.

Analytics: Nearly all respondents (about 98 percent) have analytic systems, but these tend to focus on retrospective analysis, and the majority of respondents don't yet have the ability to go beyond that. One main area with a lot of growth potential is predictive analysis, which is used by only 37 percent of respondents and tends to be delivered via alerts and triggers within applications. Another 28 percent said they had prescriptive analysis, which involves assessing the optimal response to trends and recommending an action. IMS sees a lot of potential for these analytical tools to impact healthcare:

"As the mix of new medicines brought to market by pharmaceutical companies is skewing toward those with relatively small target patient populations, it is more important for analytic systems to help identify those patients and their physicians. This accelerates the improvement of health outcomes while also bringing more efficiency to the entire health system."

Interoperability: About 85 percent of respondents said interoperability was a big priority. Cloud-based applications represent the most efficient way to get beyond that challenge by improving workflow speed, eliminating conflicting data interpretations across departments and reducing the cost of manual data hand-offs.

Collaboration between health systems: Hospital systems can pool data to tweak treatment algorithms and improve outcomes. Life sciences companies can't benefit from this due to the lack of consistent formats and privacy standards among cloud-based service providers. Progress is being made to improve access to shared data sets in the cloud that can be used by multiple groups.

Obamacare Cloud Hosting Contract Exceeds 10X Original Value

Excerpted from NextGov Report by Joseph Marks

The Obama administration has extended and roughly doubled the value of a contract with Verizon's cloud division to host, manage, and secure backend data for healthcare.gov and state Obamacare marketplaces because of a delay in moving that data to a new host, according to contracting documents.

The Centers for Medicare and Medicaid Services will pay the cloud division Terremark up to $58 million to store and manage Obamacare data for an additional seven months while it prepares to move that data to a separate Health Insurance Marketplace Virtual Data Center managed by HP Enterprise Services.

The contract extension, described as a "logical follow on," would roughly double the $60 million the government had already agreed to pay Terremark for hosting services as of December 2013. That was when officials completed a series of repairs and upgrades to the troubled healthcare.gov system, which suffered from severe outages and other usability problems during its first two months online.

The final Terremark price tag of roughly $120 million is more than 10 times the original $11 million value of the contract when CMS first awarded it in 2011. The contract had grown to $46 million as a result of 10 separate modifications by the time healthcare.gov launched in October. Two additional modifications brought the price tag to roughly $60 million by the time repairs were complete in December.

CMS has justified the contract expansions by noting that "at the time of the contract award, the scope of cloud computing needs to support the implementation of insurance exchanges was unknown," and that "CMS believed if the additional services were not added urgently, the exchanges would not function as designed and citizens would continue to have issues using the marketplace."

The government contracted with HP to be Terremark's successor as Obamacare data host in July 2013 but the transition was delayed by months of repair work following healthcare.gov's troubled launch in October. The HP contract is valued at $38 million.

CMS also posted three additional modifications to the Terremark contract on Thursday for a firewall upgrade and several other additional services. Those modifications totaled about $2.5 million.

Microsoft, Amazon Cloud Computing Rivalry May Heat Up

Excerpted from Investor's Business Daily by Reinhardt Krause

Ruling the cloud services business is one road to glory for new Microsoft CEO Satya Nadella, but Amazon's leading Amazon Web Services stands in the way.

AWS is by far the biggest provider of infrastructure-as-a-service, in which companies rent computers and data storage via the Internet cloud.

In cloud services, Microsoft first targeted platform-as-a-service — applications, databases and software that run on cloud infrastructure — but it's been gaining traction as an IaaS provider.

Emerging as a strong No. 2 to Amazon might be enough to win kudos from growth-hungry Microsoft shareholders, analysts say.

Neither Amazon nor Microsoft breaks out cloud revenue in their financial results. Analysts have estimated 2013 revenue for AWS at $2.5 billion to $3.5 billion, which they say represents a 50% to 70% jump from 2012.

Microsoft's cloud revenue also is hard to nail down. BMO Capital Markets estimates it's a more than $1 billion business, but that includes license revenue from software Microsoft has retooled to operate on its Azure cloud platform.

Nadella, who was executive vice president of Microsoft's cloud and enterprise group before being named last month to replace Steve Ballmer as CEO, played a lead role in moving Microsoft to the cloud, along with Ray Ozzie, Microsoft's former chief software architect, who left the company in 2010.

Some analysts expect the Amazon-Microsoft rivalry to heat up as Nadella targets cloud computing growth.

"Microsoft may need a strong No. 1 to chase, and they have that in Amazon," said James Staten, an analyst at Forrester Research. "In the video console market, they had Sony's PlayStation, which helped them rise with Xbox.

"AWS views only Microsoft and potentially Google as their main (cloud) competitors. They're not really concerned about anyone else."

Nadella hasn't said much about any cloud rivalry since being named CEO. But in a letter to Microsoft employees, he wrote: "Our job is to ensure Microsoft will thrive in a mobile and cloud-first world."

AWS and Microsoft face rivals besides Google in cloud computing services, such as IBM (IBM), Oracle (ORCL), VMware (VMW) and Salesforce.com (CRM). IBM last year acquired IaaS vendor SoftLayer for about $2 billion. It also recently purchased Cloudant and plans to invest $1 billion in PaaS.

AWS and Microsoft, though, have strengths most rivals have trouble matching, analysts say.

Amazon expanded into cloud computing as a way to efficiently utilize the massive e-commerce infrastructure it had built up for its own needs. According to one estimate, Amazon has mustered nearly 2.5 million computer servers to power its public cloud services.

Microsoft, its PC and server software widely deployed in corporations, built its own in-house cloud infrastructure to support Office 365, Xbox Live, Bing and other products.

Still, AWS is much bigger. And Oracle has moved faster than Microsoft in making cloud-related acquisitions, says Nomura analyst Rick Sherlund in a research report.

"New management must drive much harder to the cloud," said Sherlund. "It is not too late for either Microsoft or Oracle for PaaS. It is too late for them for leadership in IaaS; they can never catch AWS at the scale they operate."

Microsoft, like VMware, has been adapting as corporate data centers move to cloud technology, says Citigroup analyst Walter Pritchard. VMware's virtualization software is widely used in corporate data centers to increase the flexibility and capabilities of computer servers.

VMware and its majority owner, EMC (EMC), last year launched Pivotal Software, a cloud computing venture that is developing PaaS products that work on Amazon's public cloud platform.

Microsoft's strategic focus is still on Windows Server and other legacy products, says Pritchard.

"With a Windows-only focus, it will be difficult for Microsoft to compete broadly in IaaS. Instead, it must be successful reinvigorating developers around the Azure platform with (PaaS) offerings," wrote Pritchard in a research note.

Forrester's Staten agrees it's vital for Microsoft to attract more software developers to create cloud apps.

The early success of AWS was in making it easy for small and midsize businesses to tap computing resources via the Internet cloud that had only been available to large enterprises. AWS has more recently targeted large companies and other enterprises.

Microsoft's "walled-garden" approach still aims at locking in customers to its software ecosystem, some analysts say.

David Smith, an analyst at research firm Gartner, says Microsoft's Azure cloud platform still needs to invest more in the IaaS market, which is much bigger than the PaaS market.

Customers pay IaaS service providers pennies an hour per server. Amazon has cut its cloud prices nearly 40 times since 2006. Since moving into IaaS last year, Microsoft has been cutting prices to match Amazon.

"Microsoft has a long history of doing things its own way," Smith told IBD. "But in IaaS, they're starting to focus on not just Microsoft OS, (but also) Linux and Java. I expect to see more of that kind of thinking in the cloud."

As Web Turns 25, Creator Talks about Its Future

Excerpted from NY Times Report by Nick Bilton

In 1989, Tim Berners-Lee, a software engineer, sat in his small office at CERN, the European Organization for Nuclear Research near Geneva, and started work on a new system called the World Wide Web.

On Wednesday, that project, now simply called the web, will celebrate its 25th anniversary, and Mr. Berners-Lee is looking ahead at the next 25.

But this moment comes with a cloud. The creators of the web, including Mr. Berners-Lee, worry that companies and telecommunications outlets could destroy the open nature that made it flourish in their quest to make more money.

A "Star Trek" fan site was one of scores that were created on GeoCities, one of the first virtual communities. GeoCities was bought by Yahoo in 1999, after which it faltered. A decade later, it was all but shut down.

Today, more than two people in five are connected to the web. Every minute, billions of connected people send each other hundreds of millions of messages, share 20 million photos and exchange at least $15 million in goods and services, according to the World Wide Web Foundation.

Of course, Mr. Berners-Lee had no idea that what he was building would have such an effect on society or grow so large.

"I spent a lot of time trying to make sure people could put anything on the web, that it was universal," he said in an interview. "Obviously, I had no idea that people would put literally everything on it."

Since then, "everything" has included the GIF (pronounced "jif," like the brand of peanut butter, rather than with a hard G sound), memes, Google, Facebook, Twitter, news sites, Pets.com, YouTube and billions of web pages, by some estimates.

Mr. Berners-Lee wrote the first web page editor and web browser in his office at CERN, and by the end of 1990 the first web page was posted online.

One of the most important aspects of the growth of the web came in April 1993, when the technology was made available for anyone to use, royalty-free.

While Mr. Berners-Lee said he was incredibly grateful for what the web has done since those early days, he warned that people need to realize that a current battle around so-called network neutrality could permanently harm the future of the web.

The idea behind net neutrality is simple: The web material we see on our laptops and smartphones, whether from Google or a nondescript blog, should flow freely through the Internet, regardless of its origin or creator. No one gets special treatment. But companies like Verizon hope some people will pay more to get preferential treatment and reach customers quicker.

"The web should be a neutral medium. The openness of the web is really, really important," Mr. Berners-Lee said. "It's important for the open markets, for the economy and for democracy."

He worries that people online have no idea what could be at stake if large telecommunications companies took control of the web and the type of material we now have access to without any blockades or speed barriers.

Mr. Berners-Lee said he planned to spend the next year working with web consortia to spread awareness of these issues. "It's possible that people end up taking the web for granted and having it pulled out from underneath them," he said.

In addition to helping further net neutrality, the World Wide Web Consortium, the leading web standards organization, hopes to help get the billions of people who are not on the web connected to it. In a news release, the consortium said the goal was to bring those people to the web via mobile phones, which cost less than traditional laptops and Internet connections.

To help celebrate the web's birthday, Mr. Berners-Lee, the World Wide Web Foundation and the World Wide Web Consortium are asking people to share birthday greetings on social media using the #web25 hashtag, and select greetings will be posted online.

US to Cede its Oversight of Addresses on Internet

Excerpted from NY Times Report by Edward Wyatt

The United States will give up its role overseeing the system of Web addresses and domain names that form the basic plumbing of the Internet, turning it over in 2015 to an international group whose structure and administration will be determined over the next year, government officials said on Friday.

Since the dawn of the Internet, the United States has been responsible for assigning the numbers that form Internet addresses, the .com, .gov and .org labels that correspond to those numbers, and for the vast database that links the two and makes sure Internet traffic goes to the right place.
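The mapping described above — human-readable names paired with numeric addresses in one authoritative database — can be made concrete with a toy sketch in Python. This is purely illustrative (the dictionary, addresses, and function are invented for this example, not real registry data or a real resolver); real DNS distributes the table across a global hierarchy rooted in the zone that Icann administers.

```python
# Toy stand-in for the name-to-number "plumbing" of the Internet:
# a single table mapping domain names to numeric addresses.
# Real DNS spreads this table across root, TLD, and authoritative
# servers, but the lookup idea is the same.

TOY_REGISTRY = {
    "example.com": "93.184.216.34",  # illustrative address
    "example.org": "93.184.216.34",
}


def resolve(name: str) -> str:
    """Return the address registered for a name, mimicking a DNS lookup."""
    try:
        return TOY_REGISTRY[name]
    except KeyError:
        # Real resolvers signal this as an NXDOMAIN response.
        raise LookupError(f"NXDOMAIN: {name} is not registered")
```

In practice, Python's standard library performs a real lookup with `socket.gethostbyname("example.com")`, which consults the actual distributed system rather than a local table.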

The function has been subcontracted since 1998 to the Internet Corporation for Assigned Names and Numbers, or Icann, an international nonprofit organization, with the expectation that the United States would eventually step back from its role.

But that transition has taken on a new urgency in the last year because of revelations that the United States intelligence community, particularly the National Security Agency, has been intercepting Internet traffic as part of its global spying efforts.

While other countries have called for the United States to turn over the keys to the system, many businesses around the world, dependent on the smooth functioning of the Internet for their livelihood, have expressed concern about what form the new organization will take.

"We don't want to break the Internet," said Laura DeNardis, a professor at American University and the author of "The Global War for Internet Governance," a recent book on the subject.

For consumers who use the Internet to stream movies or send email, nothing will change, if everything goes according to plan.

"We want to carefully transition to something that doesn't just give the power to one stakeholder, but that takes into account the interests of private industry, of large users of the Internet, of the purchasers of domain names, of governments and of civil society," Ms. DeNardis said.

Lawrence E. Strickling, the assistant secretary of commerce for communications and information, said on Friday that the United States would not accept a proposal that replaced it with a government-led or intergovernmental organization.

The Commerce Department also laid out principles that must govern any new body, including maintaining the openness of the Internet and maintaining its security and stability.

Icann will conduct a meeting that will be the first step in the transition process, beginning March 23 in Singapore.

"We are inviting governments, the private sector, civil society and other Internet organizations from the whole world to join us in developing this transition process," said Fadi Chehadé, the president and chief executive of Icann. "All stakeholders deserve a voice in the management and governance of this global resource as equal partners."

While the announcements were structured to portray a cooperative global community, there has been widespread hostility toward the United States since the former National Security Agency contractor Edward J. Snowden began releasing documents showing the extent of United States global spying.

Those spying programs had nothing to do with the role of the United States or Icann in administering Internet addresses. But the perception that the United States was pulling all the strings led to a global uproar.

President Dilma Rousseff of Brazil canceled a planned visit to the United States last year and called the activities "an assault on national sovereignty and individual rights" and "incompatible with relations between friendly nations."

Brazil also announced it would host Net Mundial, a global meeting on Internet governance, in April in São Paulo to discuss the coming transition.

But by announcing its plans before the Brazil meeting, "the U.S. is trying to make sure the transition happens on its own terms, and that the U.S. is setting the rules for the transition," said Greg Shatan, a partner at the law firm Reed Smith in New York.

With its statement that no government-led organization would take over Icann, the United States also made clear that the International Telecommunication Union, a United Nations affiliate that oversees global telephone traffic, would not be allowed to take over Internet governance. That was an issue last year at an I.T.U. conference in Dubai.

Ms. DeNardis said that a key to a new governance structure would be to keep in place the expertise that currently allows the Internet to function smoothly.

"It is very easy to take the stability of the Internet for granted," she said.

Shared Services Could Save Government $28 Billion

Excerpted from FCW Report by Frank Konkel

What: A MeriTalk study called "Shared Services: Ready or Not?" The study was underwritten by ServiceNow. MeriTalk conducted in-person surveys with 138 federal IT professionals at a January event, 76 percent from civilian agencies and 24 percent from defense and intelligence agencies; 67 percent of those surveyed held IT roles.

Why: The White House defines shared services as an IT function that is provided for consumption by multiple organizations within or between federal agencies. Nearly 75 percent of respondents said shared services are a strategic initiative for their agency CIOs in the coming year, and 96 percent believe it should be. Just over half of agencies are using shared services, while 44 percent are actually providing services — cloud computing, for example — to other agencies.

Yet the survey suggests current shared services are haphazardly organized. Only about 40 percent of agencies have defined goals and objectives, and only 32 percent have established service-level agreements. Even fewer, just 16 percent, have developed a financial model and chargeback system to position their IT shops as brokers of such services. Respondents view culture, security, procurement, cost-savings quantification, and infrastructure as barriers to improved shared services.

The potential, however, is great. Those surveyed felt shared services could save 34 percent of the total federal IT budget, the equivalent of $27.9 billion.
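As a quick sanity check on the survey's arithmetic (the total-budget figure below is derived from the numbers above, not stated in the study): if $27.9 billion is 34 percent of the total federal IT budget, the implied total is roughly $82 billion.

```python
# Sanity-check arithmetic on the survey's savings estimate.
projected_savings = 27.9e9   # dollars, per the survey
savings_share = 0.34         # 34% of the total federal IT budget

# Derived figure, not stated in the study.
implied_total_budget = projected_savings / savings_share
print(f"Implied federal IT budget: ${implied_total_budget / 1e9:.1f}B")
```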

Verbatim:

Feds say agency culture is a more significant hurdle than security.

Nearly nine out of 10 Feds believe cloud computing is transforming views of shared services.

To enable government-wide shared services, agencies call for senior leadership support (81 percent), SLAs (75 percent), and a governance process for IT services (66 percent).

Three Routes to FedRAMP: Choose Wisely

Excerpted from FCW Report by Frank Konkel

There are three paths commercial cloud service providers can take to comply with the government's baseline cloud computing standards, known as the Federal Risk and Authorization Management Program (FedRAMP).

Although the end goal is the same, the journeys can differ in the time it takes to get accredited and potentially in cost, according to FedRAMP Director Maria Roat. CSPs are also likely to experience differing business propositions depending on the paths they take.

Thus far, the most common route CSPs have chosen is gaining a provisional authority to operate (ATO) from the FedRAMP Joint Authorization Board (JAB), which is led by CIOs at the General Services Administration, the Defense Department and the Department of Homeland Security. A FedRAMP-accredited third-party assessment organization (3PAO) is required for this process. As of March 14, 11 companies have earned ATOs for platform-, infrastructure- or software-as-a-service offerings.

Alternatively, an agency can grant an ATO to a company, and other agencies can choose to take advantage of that authority to work with the company. Again, 3PAOs play a role in agency-issued ATOs and work with agencies and CSPs to ensure that security standards are met. Two companies and one agency -- the Agriculture Department -- have earned agency ATOs for a total of four cloud service offerings.

There is a little-known third route that CSPs can take, but to date no companies have made use of it, although Roat said one company has expressed interest. A CSP can hire a FedRAMP-accredited 3PAO to complete all required documentation, testing and security assessments. All that information could be sent to GSA's FedRAMP office for verification. This method is potentially attractive to companies that cannot or do not want to take advantage of existing federal contracts and do not want to partner with another CSP.

Once a cloud service provider earns an ATO, any agency can use that company's cloud services with confidence, and that's certainly an attractive option for CSPs. But going through the FedRAMP pipeline tends to take "a few months longer," Roat said. Most CSPs achieve JAB approval in about six months, which is longer than it typically takes a CSP to earn an agency-issued ATO.

"The JAB has more separate eyes viewing it," Roat said, explaining some of the time difference. "It's a government-wide look. The agency process tends to be shorter because it doesn't have all parties reviewing it."

Yet that government-wide look can carry weight in the federal cloud space because of how risk assessments are conducted in both situations. CSPs can designate whether they wish to be evaluated against a low-sensitivity or a moderate-sensitivity security baseline depending on the types of data their systems are meant to handle. High-sensitivity designations are currently not part of FedRAMP.

But Roat said an agency perspective on risk assessment could differ from JAB's point of view. In addition, an agency issuing an ATO is responsible for continuous monitoring, which JAB performs for the providers it approves.

"The JAB won't accept any high vulnerability," Roat said. "An agency could say yes to a high vulnerability. Agencies could be more flexible."

However, she added that agencies have plenty of incentive to be thorough in their risk assessments for credibility reasons, as do 3PAOs. Agencies know that other agencies could use the CSPs they approve, and therefore, they could damage their reputations if anything goes wrong.

"Agencies know they are under pressure," Roat said. "They know they need to do a job. They want to get it right the first time. They all have credibility on the line."

Costs are also rumored to vary widely, with agency ATOs likely to be a cheaper overall option. One industry source told FCW that one CSP invested $5 million to achieve JAB approval -- a significant amount of money given how much federal business would be required to recoup those costs.

Some general return-on-investment data is publicly available at MeriTalk's FedRAMP OnRAMP, which also provides a snapshot of which cloud providers are in the FedRAMP pipeline and which avenue they are pursuing.

The Foundation of Clouds: Intelligent Abstraction

Excerpted from ComputerWorld Report by Chris Poelker

The term cloud computing is bandied about all the time these days, but many folks are still confused about what all the fuss is about and what it means to them. The IT landscape is changing faster than ever before, and it is becoming more and more difficult for "normal" people to keep up.

Unless you are an uber-geek (think Scotty from Star Trek) who likes to spend all your time reading technical manuals, it can be easy to fall behind on the skill sets needed to stay relevant in this new era of IT. As an example, storage area networks (SANs) are now considered traditional technology, and the SAN is being phased out as more and more solutions adopt storage that brings the data I/O closer to the system itself. Cloud infrastructure, Hadoop, grid computing, and modular building-block data centers are replacing the old Fibre Channel LUN (logical unit number)-based storage solutions.

New (actually old, but resurrected) technology like erasure-coded disk is starting to change the way shops deploy RAID storage. Object-based storage and network-attached storage are becoming more and more prevalent as application servers are virtualized into the cloud. Although good old SAN storage is still the best way to move large blocks of data around quickly, you still need to understand the changes in the storage and server landscape, how they apply to the way information technology is implemented today, and what it will look like in the near future. So let me try to explain what is going on as simply as I can, collapsing all the elements that make up cloud computing into a single easy-to-remember term: intelligent abstraction (IA).
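The article mentions erasure-coded disk without explaining it. As a minimal illustration (not from the article), the simplest erasure code is a single XOR parity block, as in RAID 5: if any one block is lost, it can be rebuilt from the parity and the surviving blocks.

```python
# Minimal sketch of the idea behind erasure coding (illustrative only).
# The simplest erasure code is XOR parity, as used in RAID 5.

def xor_blocks(*blocks: bytes) -> bytes:
    """XOR equal-length blocks together, byte by byte."""
    result = bytearray(len(blocks[0]))
    for block in blocks:
        for i, b in enumerate(block):
            result[i] ^= b
    return bytes(result)

# Three data blocks plus one parity block (a 3+1 scheme).
d0, d1, d2 = b"AAAA", b"BBBB", b"CCCC"
parity = xor_blocks(d0, d1, d2)

# Suppose d1 is lost: rebuild it from the parity and the surviving blocks.
recovered = xor_blocks(parity, d0, d2)
assert recovered == d1
```

Production erasure codes (such as Reed-Solomon) generalize this to tolerate multiple simultaneous failures, but the recover-from-redundancy principle is the same.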

Cloud computing uses something called virtual abstraction to enable the rapid deployment of applications and data to reduce the cost and complexity of providing the underlying infrastructure, which also simplifies operations. The goal is to free up the IT team to get back to more strategic projects, and allow them to use technology as a service rather than something that they build and manage themselves.

I define intelligent abstraction as it pertains to the computer industry as the de-coupling of applications and data from all physical constraints while providing policy-based management, movement, storage, and protection of systems and information.

The concept of IA is simple. The cloud is made up of server and storage virtualization solutions which are tied together with software management and monitoring functions. According to the National Institute for Standards and Technology (NIST), for a solution to be considered cloud, it must include the following five essential characteristics: on-demand self-service, broad network access, resource pooling with location independence, rapid elasticity, and measured service.

IA combines the benefits of artificial intelligence (AI), server and storage virtualization, and policy-based automation together with advances in data management technology designed for the software defined data center to re-define how information and data center infrastructures are managed.

The great news for the storage guys is that since the underlying storage infrastructure is virtualized, it dramatically reduces the requirements for developing a deep knowledge of the storage itself. You only need to learn how the virtual abstraction layer works. The good news for the CFO is that since the storage is virtualized, it can come from any storage provider, which means you can pit your vendors against each other so you can purchase storage at the lowest possible cost. In essence, intelligent abstraction commoditizes storage and servers, and enables IT to become a service, which is the essence of cloud computing.

The Case for Analytics in the Cloud

This is an age of optimization. Consumers want their data to be easily accessible, available in real time, and on multiple devices. Enterprises want real-time data analytics to be a systemic part of their business models. In industries such as retail, information technology, marketing, healthcare, and banking, real-time data analytics is essential for the daily functioning of a business.

In the enterprise space, there are two key factors enabling this push toward data analytics: consumption patterns and big data.

First, consumer-facing enterprises find that data analytics is necessary to predict consumption patterns and consequently inform business decisions. Second, with the advent of big data, there are copious amounts of data that need to be analyzed and shared across industries more efficiently.

Analytics is being looked at as an avenue for companies to add more value for their customers and to identify new streams of revenue. While data mining techniques are efficient, they have several limitations that can be overcome by providing analytics in the cloud.

This market insight evaluates how cloud computing can enhance the benefits of analytics.

Cloud analytics is a service that provides data and predictive analytics through a public or a private cloud model. The intersection between analytics and cloud technologies will create opportunities for companies to process and use big data on a large, secure scale.

These are provided through cloud models: Infrastructure as a Service (IaaS)-based data warehouses, Software as a Service (SaaS)-based intelligence tools, and other analytic products hosted in the cloud. While cloud analytics provides the same offerings as traditional analytics, it enhances the service by integrating the attributes of cloud computing.

- Rapid Deployment: Cloud computing brings rapid deployment to analytics. Through IaaS and SaaS solutions available in the market, enterprises can quickly choose and deploy the solution of their choice and start using it in a matter of minutes to hours. Migrating data into the cloud and setting up reports are the major bottlenecks in the pace of deployment. Rapid deployment also allows for rapid returns on investment.

- Scalable and Agile: As cloud services are highly scalable, analytics services can be scaled up or down depending on demand. This lets enterprises work efficiently: traditional providers require long-term licenses, while companies may need options that scale with requirements (for example, big projects versus small ones).

- Consolidation of Data in the Cloud: For companies that have a share of their data stored in the cloud, it is the obvious choice to have analytics in the cloud as well. This would eliminate the need for conversions of traditional data and would consolidate data.

- Collaboration with Users: One huge benefit of cloud analytics is the ability to collaborate with users by granting them partial or project-specific access so they can view the project being collaborated on. This is easier than giving external users access to the company's network, especially given firewall concerns.

- Mobility: Having data analytics in the cloud gives users the option to log on and view data from their mobile devices with secure access. This is especially useful for users who need to access data out of the office, and hence outside the company firewall, particularly on portable devices such as tablets and mobile phones.

- Speed: Data in the cloud can be analyzed at a much faster speed, unconstrained by limited office bandwidth. It is also easier to disseminate findings, as users on various devices can all log onto the cloud simultaneously.

The Emergence of the Mobile Cloud

Excerpted from Tech Page One Report by Andy Patrizio

The mobile cloud can give you the options you need to keep working on the go.

Using the cloud to provide applications on mobile devices will fundamentally change computing in the enterprise. The cloud empowers end users to sign up for applications and services that can be accessed on their mobile devices, oftentimes without needing IT to intervene. While using such services will make your enterprise end users more productive than ever, IT needs to be prepared to make the "mobile cloud" a reality before users start signing up for disparate services on their own.

The increasing use of mobile computing devices such as smartphones and tablets for business is placing growing demands on IT to provide business executives with access to apps, storage, and services on a variety of devices, whenever and wherever these services are needed. Cloud-based options are a relatively quick and cost-effective way of delivering business applications and services to your mobile users.

The key to remember here is that it's relatively easy for your mobile business users to sign up for many cloud-based mobile services. Integrating these into existing infrastructure is the challenge. So, if IT doesn't provide what users are demanding, you're likely to see them sign on for services on their own, creating an environment of application creep that can eventually undermine your infrastructure.

Here's what IT needs to know about getting out in front of the "mobile cloud" trend. Keep in mind that there is a wide variety of cloud service providers available to offer mobile apps, services and storage, and pricing can vary widely.

The mobile cloud: 8 tips for choosing the right provider

Here's what you need to know about choosing the right cloud service provider to meet the needs of your mobile users:

  1. Determine your goals for adopting these mobile services and applications. Who in your enterprise needs them, and do these users fully understand what providing these apps entails?

  2. Evaluate how the new apps or services will co-exist with your existing solutions. Will they expand them or replace them? Is interoperability seamless? How well will the new services and solutions integrate with your current environment and support existing solutions? The last thing you want is to create silos of data.

  3. Shop around on cost. Storage costs can range from $250 to $2,899 for 1TB (one terabyte) of storage. That's a huge range of prices.

  4. Sort out the budget. Is IT paying for these apps and services, or is the business unit?

  5. Consider the software offered. Google hosted apps might make a good alternative to Office, especially given that Microsoft will not release Office for Android or iOS. However, the costs of these mobile apps can add up fast if you are not careful.

  6. Comparison shop between the mobile services you believe you will need and the cost of building your own internal services and making them available to remote users. Internal expansions have a high up-front expense but can possibly pay for themselves over time.

  7. Check out the reliability rate of the cloud services and their Service Level Agreements (SLAs). Have they suffered repeat outages? If so, that's a red flag. Do they offer a comprehensive SLA, with significant promises to the customer and a willingness to make you whole in the event of an outage?

  8. Kick the tires first. Most mobile providers offer their service free on a trial basis. If they don't, move on. You have plenty of options.
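Tip 6's build-versus-buy comparison comes down to a simple break-even calculation: divide the internal build's up-front cost by the monthly amount it saves over the cloud subscription. A minimal sketch with hypothetical figures (none of these numbers come from the article):

```python
# Break-even point between a cloud subscription and an internal build.
# All figures are hypothetical, for illustration only.
internal_upfront = 120_000.0  # one-time build-out cost for internal services
internal_monthly = 2_000.0    # ongoing internal maintenance per month
cloud_monthly = 7_500.0       # cloud subscription cost per month

# The internal build pays for itself once its monthly savings over the
# cloud option have recouped the up-front expense.
months_to_break_even = internal_upfront / (cloud_monthly - internal_monthly)
print(f"Internal build breaks even after about {months_to_break_even:.0f} months")
```

If the break-even horizon is longer than the expected life of the service, the cloud option wins on cost.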

The mobile cloud: 5 pitfalls to avoid when choosing a cloud provider

Don't start with the product. Start with the needs and requirements of your users. We all know Google and Salesforce, but do you necessarily need what they offer?

  1. Don't lock in with one vendor. If you do, get a promise up front that you can move to another provider with all of your data. This is a volatile market.

  2. Don't let your staff dictate the provider. You've ceded enough power by allowing them to use their own devices for work. If you let them start dictating the mobile apps and services they use, you will have ceded control completely.

  3. Don't expect a cloud vendor to understand your IT situation as well as you do, unless that vendor has also provided your on-premises solutions, which is highly unlikely.

  4. Don't believe cloud will solve all your problems. It's just one of many possible solutions. You may end up finding that using a Virtual Private Network (VPN) to allow mobile access to internal apps and services is the best solution for your enterprise.

  5. Don't operate in a vacuum. Seek outside input. Talk to your peers at other enterprises. When you are at a trade show or conference, that is the time to work the room and get recommendations and guidelines.

IT is accustomed to having apps and services inside the enterprise, well within its control. Mobile apps and services from third party vendors are not. That means a significant amount of due diligence is necessary on your part to make sure your mobile executives get what they need. The effort will be well worth it when you're able to provide your enterprise executives with the level of mobile services they need to move the business forward.

Coming Events of Interest

Interop Las Vegas — March 31st to April 4th in Las Vegas, NV. The leading independent technology conference and expo series designed to inspire and inform the world's IT community. New in 2014: Cloud Connect Summit and the InformationWeek Conference.

CLOSER 2014 — April 3rd-5th in Barcelona, Spain. The Fourth International Conference on Cloud Computing and Services Science (CLOSER 2014) sets out to explore the emerging area of cloud computing, inspired by recent advances in network technologies.

NAB Show — April 5th-10th in Las Vegas, NV. From broadcasting to broader-casting, NAB Show has evolved over the last eight decades to continually lead this ever-changing industry. From creation to consumption, NAB Show has proudly served as the incubator for excellence — helping to breathe life into content everywhere.

Media Management in the Cloud — April 8th-9th in Las Vegas, NV. This two-day conference provides a senior management overview of how cloud-based solutions positively impact each stage of the content distribution chain, including production, delivery, and storage.

CLOUD COMPUTING EAST 2014 — May 13th-14th in Washington, DC. Three major conference tracks will zero in on the latest advances in the application of cloud-based solutions in three key economic sectors: government, healthcare, and financial services.

International Conference on Internet and Distributed Computing Systems — September 22nd in Calabria, Italy. The IDCS 2014 conference is the sixth in its series to promote research in diverse fields related to Internet and distributed computing systems. The emergence of the web as a ubiquitous platform for innovations has laid the foundation for the rapid growth of the Internet.

Copyright 2008 Distributed Computing Industry Association
This page last updated March 23, 2014
Privacy Policy