Distributed Computing Industry
Weekly Newsletter


June 17, 2013
Volume XLIV, Issue 3


Webinar: CLOUD COMPUTING IN THE ENTERPRISE

CLOUD COMPUTING IN THE ENTERPRISE — IMPLEMENTING CLOUD SOLUTIONS, a Market Analyst produced webinar sponsored by Sprint, will take place on Wednesday, June 26th at 2:00 PM ET.

Marty Lafferty, CEO of the Distributed Computing Industry Association (DCIA), will moderate a panel of industry experts who will explore both the benefits of and the factors involved in implementing enterprise cloud solutions.

Topics will include deployment methodologies in different verticals; technological considerations; security issues; potential hurdles to implementation; the impact of pervasive mobility and "bring your own device" (BYOD) trends; growth factors; and evolutionary progress in enterprise cloud computing.

This hour-long webinar is a must-attend event for those implementing a cloud solution or thinking of implementing one.

All attendees will be entered into a drawing to win a Samsung Chromebook 550.

Please click here to sign up now for this free webinar.

Ottolenghi Named to Senior Post at Las Vegas Sands

Las Vegas Sands announced today that Les Ottolenghi will join the company as Senior Vice President and Chief Information Officer. He will be responsible for overseeing the company's technology innovation strategies and infrastructure delivery on a global basis.

Ottolenghi will start with the company on June 17th and will report to Company President and Chief Operating Officer, Michael Leven.

"Les is the ideal leader to join our organization given the opportunities ahead of us and we are pleased to welcome him to the team", said Leven.

"Les is a recognized innovator across established hospitality companies and start-up organizations. His successes in high-growth, high-volume technology-dependent organizations will be extremely valuable as we look to innovate how we leverage technology and data to grow our business while delivering guest experiences that capture the spirit in each of our integrated resorts."

While creating and building technology organizations over the last fifteen years, Ottolenghi has earned a reputation for innovative and impactful technology products and solutions. Since 2004, Ottolenghi has successfully co-founded three technology companies after leading technology in CIO roles at Agentware from 1999 to 2003 and Carlson Wagonlit Travel from 1996 to 1998.

His early experiences in innovating and deploying technology at Carlson Wagonlit Travel and Holiday Inn Worldwide were at the forefront of the hospitality industry's growth via web-based reservations systems and tools.

Ottolenghi earned his MBA in Decision Information Analysis from the Goizueta School of Business at Emory University and a BA from Duke University.

CLOUD COMPUTING WEST 2013 Call for Speakers

There's no question that advances in cloud computing are having enormous effects on the creation, storage, distribution, and consumption of diverse genres of content.

And most profound among these effects are those involving the increased proliferation of portable playback systems and the accompanying generation of unprecedented amounts of viewership, listenership, and usage information from audiences globally.

This week, we'll be sending the third wave of invitations to prospective speakers to participate in CLOUD COMPUTING WEST 2013 (CCW:2013) taking place October 27th-29th at The Cosmopolitan in Las Vegas, NV.

This year's themes are "Revolutionizing Entertainment & Media" and "The Impact of Mobile Cloud Computing & Big Data."

The ubiquity and widespread acceptance of user interfaces that reflect the dynamic interactivity exemplified by smart-phone applications are rapidly replacing the flat linearity of traditional TV channel line-ups and changing expectations for a new generation of consumers.

Cloud-based information and entertainment of all kinds, accessible anywhere and always on every connected device, will become the new norm.

And perfect data related to consumer behaviors associated with discovering and consuming this content will displace metering and ratings technologies based solely on statistical sampling.

You are encouraged to get involved in CCA's and DCIA's CCW:2013 as exhibitors, sponsors, and speakers.

Two CCW:2013 conference tracks will zero in on the latest advances in applying cloud-based solutions to all aspects of high-value entertainment production and storage, as well as media delivery and analysis options; along with the growing impact of mobile cloud computing on this sector, and the related expansion of big data challenges and opportunities.

The CCA is handling exhibits and sponsorships. Please click here for more information.

The DCIA's role is to provide keynotes, panelists, and case-study presenters to participate in our comprehensive agenda of sessions in ENTERTAINMENT & MEDIA and MOBILE CLOUD & BIG DATA.

Please click here to see which topics are of the most interest and here to apply to speak at CCW:2013.

Report from CEO Marty Lafferty

Last week's revelations about PRISM and other forms of National Security Agency (NSA) dragnet surveillance should not have come as a surprise to industry participants.

The same cloud computing solutions that are revolutionizing the private sector, enabling unprecedented data collection capabilities among numerous other benefits, are obviously available to government agencies as well.

The NSA reportedly obtained direct access to servers at major US Internet service providers including Microsoft, Yahoo, Google, Skype, AOL, Facebook, Apple, and others to collect material including search history, e-mails, file transfers, and live chats.

Google and the others have denied complicity. Google said in a statement:

"Google cares deeply about the security of our users' data. We disclose user data to government in accordance with the law, and we review all such requests carefully. From time to time, people allege that we have created a government 'back door' into our systems, but Google does not have a back door for the government to access private user data."

An Apple spokesman said it had "never heard" of PRISM.

Companies are legally obliged to comply with requests for users' communications under US law, but the PRISM program reportedly allows the intelligence services direct access to companies' servers.

The NSA's usage raises many questions about involuntary disclosures of private information — and particularly, as we discussed last week, due process — or the absence thereof.

There are plenty of reasons to care about privacy, disclosure, and surveillance beyond the moral outrage and sense of personal violation that constitute most people's first responses.

First, substantial expansion of surveillance activity may in and of itself actually make us less safe.

That's because increased accumulation of private data, whether about consumers or organizations, produces exponentially larger "haystacks" of information without necessarily exposing a greater number of "needles."

The 9/11 Commission concluded that government agencies had all they needed to predict the attacks — but couldn't isolate important elements within the noise of over-collected data.

They lacked the appropriate Big Data analytical tools to make the information actionable.

Since then, over-collection has drastically increased with no indication that this has brought authorities any closer to predicting or preventing future bad acts — just a vague hope that all this data may be useful some day.

And second, once a software algorithm ascribes suspiciousness to a party based on its analysis of reams of associated private data, everything else about that entity can be made to seem sinister or inexplicable.

The documentary Naked Citizens presents several horrifying cases of police incorrectly concluding from surveillance data that innocent individuals are engaging in suspicious activities, and thereafter mistakenly — or with intent to prosecute — interpreting everything they learn about those suspects as evidence of wrongdoing.

The impact of wrongful accusation in this context can seriously compromise the subjects, if not cause even greater harm to society at large.

Once surveillance is built into the networks and connected devices, over-zealous or corrupt authorities can use it to attack individuals and organizations, which ultimately will undermine civilization and foster revolt and anarchy.

Communications networks, computing devices, and data centers are more secure if they're designed to keep everyone out. Adding any form of back door can make an entire system insecure. Once any element is designed for surveillance, anyone who can bribe or impersonate an official can access private data.

This week, as awareness of PRISM spread, hundreds of thousands of American citizens have demanded that the US Congress investigate the NSA's spying programs. And many are questioning why lawmakers have failed to protect civil liberties by not properly exercising their oversight powers.

And now, the Free Press Action Fund has teamed up with Mozilla, the Electronic Frontier Foundation (EFF), Access, Demand Progress, Fight for the Future, and other groups to launch StopWatching.Us, a call for Congress to investigate the NSA's privacy-killing surveillance schemes.

The DCIA encourages DFINFO readers to stand up for privacy and insist that Congress demand due process for accessing private data in the digital age. Share wisely, and take care.

In a Cloud Computing Economy, the NSA is Bad for Business

Excerpted from GigaOM Report by Derrick Harris

The real problem for the National Security Agency's data-collection programs might not be citizen outrage, but something far more powerful — corporate outrage. We have an economy increasingly dependent on web and mobile services (broadly defined as "the cloud"), and it doesn't make a whole lot of sense to put up barriers or conditions on using them. For consumers, it's bad for privacy. For companies that sell cloud services, it's bad for business.

Perhaps you noticed former Microsoft Chief Software Architect Ray Ozzie address the issue in a weekend post on Hacker News, where he wrote:

"In this world where "SaaS" and "software eats everything" and "cloud computing" and "big data" are inevitable and already pervasive, it pains me to see how 3rd Party Doctrine may now already be being leveraged to effectively gut the intent of U.S. citizens' Fourth Amendment rights. Don't we need a common-sense refresh to the wording of our laws and potentially our constitution as it pertains to how we now rely upon 3rd parties? It makes zero sense in a "services age" where granting third parties limited rights to our private information is so basic and fundamental to how we think, work, conduct and enjoy life."

Ozzie might be truly concerned about citizen privacy, but as the founder of a new cloud-based mobile communications service, he's probably also concerned about attracting users. He's not alone. His former employer and other large companies — including some named as part of the NSA's secretive PRISM program — have been pushing for privacy reform for years as a means to cement the viability of the cloud.

Maybe now they'll finally get their way.

The argument — as I've highlighted before — might not be so much about privacy as it is about profit: Companies like Microsoft, Google, Apple and all the startups that Congress seems to love so much rely on people trusting the cloud in order to make money. If the PATRIOT Act, the Electronic Communications Privacy Act (ECPA), or any other legislation scares users away, that money goes with it.

Thus far, most of the activity has centered on ECPA, an antiquated piece of legislation written in 1986 that makes it relatively easy for government agencies to obtain people's e-mail and other electronic communications (e.g., Twitter direct messages) without a search warrant. The companies have banded together with unlikely allies like the Electronic Frontier Foundation and EPIC to create the Digital Due Process coalition, a group whose name references the Fourth Amendment right to due process that prohibits unreasonable searches and seizures.

At long last, their efforts are gaining some real traction. Updated versions of the ECPA are making their way through both the House and the Senate right now, with Sen. Patrick Leahy's bill arguably the most high-profile of the bunch.

More generally, though, the third-party doctrine that Ozzie referred to is part of a larger legal theory that treats any information in the possession of someone else — your credit card transactions, call records, your journal, you name it — differently than if you alone were in possession of that information. It's a hot topic of debate among legal scholars, and it seems the advent of the cloud has some members of the Supreme Court ready to weigh in on it should the right case arise.

In a 2012 case notable for its holding regarding the legality of warrantless GPS tracking, Justice Sonia Sotomayor addressed the bigger picture in a concurring opinion. She called the third-party doctrine "ill suited to the digital age, in which people reveal a great deal of information about themselves to third parties in the course of carrying out mundane tasks."

"I would not assume," she added, "that all information voluntarily disclosed to some member of the public for a limited purpose is, for that reason alone, disentitled to Fourth Amendment protection."

Importantly, though, while ECPA governs the actions of law enforcement agencies investigating crimes or just trying to gather data, it doesn't address the types of surveillance that the NSA carries out under the banner of national security. But the tech industry isn't blind to laws such as the PATRIOT Act, either. Here's what it had to say about data privacy in 2011, via a think-tank composed largely of IT industry executives:

(1) modernize legislation (the Electronic Communications Privacy Act) governing law enforcement access to digital information in light of advances in IT; (2) study the impact of the USA PATRIOT Act and similar national security laws in other countries on companies' ability to deploy cloud in a global marketplace; and (3) have the U.S. government take the lead on entering into active dialogues with other nations on processes for legitimate government access to data stored in the cloud and processes for resolving conflicting laws regarding data.

Already, it appears Europeans are searching for ways to withdraw from American service providers. Users in other parts of the world might, or should, be even more hesitant to use American services. And even if some Americans say they're not creeped out by the government collecting their phone records (and, presumably, the rest of their digital communications), many are.

It's hard to say how intensely the tech lobby will step up its privacy efforts in light of the NSA scandal, but it's hard to imagine it will stay quiet if its constituents see potential users bailing on their services. And what's bad for corporations in this situation is probably bad for the economy. A bad economy is bad for politicians always looking toward the next election.

It seems crazy to think the NSA will willingly give up its surveillance powers or that a court could come to a decision on this issue any time soon, but some members of Congress could be swayed to act. In a debate between privacy and the economy on one hand and national security on the other, you'd think something will have to give.

Where Cloud Goes Next

Excerpted from Network World Report by John Considine

It's difficult to define what the "cloud of tomorrow" will look like because of all the changes happening in the IT industry -- changes to fundamental application architecture, service models, and interactions between components. The cloud continues to disrupt IT in new ways so predicting tomorrow is a perpetual moving target.

Understanding that, a clearer picture of the future of the cloud might be gleaned by looking at the current offerings and identifying how they should change to accelerate the cloud market.

It's important to recognize why the cloud is different and what — from a technical standpoint — drives the disruption it causes:

Independent architecture: Historically, enterprises select technologies, partners and vendors and then integrate these (mostly) different elements to form an IT solution. With cloud services, businesses aren't infrastructure decision-makers anymore. The cloud provider makes all of the decisions about the infrastructure and how it can be configured and managed.

Multi-tenant infrastructure: Public clouds are inevitably multi-tenant. This is how a cost-effective and scalable architecture is built and it has profound effects on the behavior and capabilities of the infrastructure, and ultimately the user applications.

Different control models: When a business designs and manages its own infrastructure, it leverages familiar tools, expertise and technology controls. Internal experts in each and every layer of the infrastructure -- from servers to storage arrays to network switches/routers/firewalls -- are leveraged to manage and monitor this complex stack. In a cloud, almost all of this work has moved to the cloud provider and the "consumer" interaction changes to the controls that the cloud provider exposes instead of those familiar controls from internal infrastructure.

These differences are forcing enterprises to change their cloud use habits and are limiting functionality and what can be done within the cloud. The applications and services that have already made the jump to cloud have most often been web-based applications specifically designed to work in the cloud.

Of course, there are many applications that have not made the transition -- these are mostly enterprise applications such as CRM, ERP, B2D, SCM (and just about any other three letters you choose to combine). These applications are "the hard stuff" that makes a business run.

These applications are deemed difficult because they are restricted by the limitations imposed by cloud architecture. This leaves two choices -- change the applications or change the cloud. As elsewhere in the technology industry, all things evolve, and these applications will eventually be "re-written" or discarded as newer ones are developed. However, there is an opportunity to change the cloud so that it does more, sparing us the wait for applications to evolve.

What do we have to do to make the cloud better for tomorrow?

Security: Often a twofold problem, as there are different measures taken to protect both the servers and the data:

Network protection: We have spent many hours protecting data center perimeters, while allowing applications to communicate with one another within the data center. With cloud, the developer or the cloud service provider is responsible for protecting the server. Future cloud offerings should provide isolation that is similar to traditional IT environments. At the same time, the cloud should provide strong protections coupled with policies that can be applied across the whole organization -- freeing developers from this responsibility.

Data protection: Several cloud providers have given data protection special attention and developed policies to control access to information. The cloud should provide more data protection services, such as user-controlled encryption functions, to give customers customizable ways of protecting their data independent of the cloud provider.
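To make the idea of user-controlled encryption concrete, the sketch below encrypts data on the customer's side before anything is uploaded, so the provider only ever stores ciphertext and the key never leaves the customer. It is a toy illustration, not a real cipher: a SHA-256-derived keystream stands in for a production scheme such as AES-GCM, and all names here are invented.

```python
# Toy sketch of client-side ("user-controlled") encryption before upload.
# Illustration only: a hash-based keystream stands in for a real cipher.
import hashlib
import os

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Derive a pseudo-random keystream by hashing key + nonce + counter.
    out = bytearray()
    counter = 0
    while len(out) < length:
        out.extend(hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest())
        counter += 1
    return bytes(out[:length])

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    nonce = os.urandom(16)
    ks = keystream(key, nonce, len(plaintext))
    return nonce + bytes(a ^ b for a, b in zip(plaintext, ks))

def decrypt(key: bytes, blob: bytes) -> bytes:
    nonce, ciphertext = blob[:16], blob[16:]
    ks = keystream(key, nonce, len(ciphertext))
    return bytes(a ^ b for a, b in zip(ciphertext, ks))

key = os.urandom(32)                       # stays with the customer
blob = encrypt(key, b"quarterly results")  # only this blob is uploaded
assert decrypt(key, blob) == b"quarterly results"
```

The point of the pattern is that the provider's storage, backups, and administrators only ever see `blob`; whoever holds `key` controls access, independent of the cloud provider's own policies.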

Management: The elements that compose a virtual infrastructure differ from their traditional counterparts, and the ways IT staff interact with those infrastructure tools and components vary. Today, cloud management is complicated because the systems and applications must account for each different deployment environment. The next cloud evolution needs to provide enough control that the user can match the various environments and eliminate the need for additional translations and conversions.

Maintenance: The sheer number of pieces that make up an IT solution makes maintenance extremely challenging and time-consuming for cloud users due to the various dependencies of each element. In a typical IT model, when security vulnerabilities are discovered, a security patch is usually a good solution. In a cloud environment, it's critical that the rest of the components are taken into consideration to ensure the new security patch does not affect other functionality. When other functionality is affected, most cloud solutions require the addition of agents, scripts or device drivers to carry out control of the cloud environment -- driving up cost. This creates a huge number of new dependencies and interactions, complicating maintenance of the deployed applications. Going forward, cloud providers need to simplify the management process to assure systems stay up while costs stay down.

Performance: Performance is critical. Security has traditionally been the top concern of companies considering cloud deployments, but increasingly performance is overtaking it. It's extremely difficult to manage and predict performance in current cloud offerings but several cloud providers have developed a series of procedures that help control performance issues through careful provisioning and a strong infrastructure. The major hurdle is overcoming the multi-tenancy effect. Future offerings need to address performance as a fundamental goal. The cloud of tomorrow will treat performance as a critical function by allowing the users to select the amount of performance they need for their applications.

If we don't just accept the limitations that have been built into the cloud today, we will see many more cloud-based applications. By enabling more traditional workloads to join the cloud revolution we don't just enable the "old," but we provide greater controls and capabilities to build even more amazing cloud applications. We must change the cloud in order to make things easier for the users and not force users to adapt to the cloud. The only question that remains: Who will be the first to bring powerful, yet simple, answers to the cloud space?

How Cloud Computing Democratizes Big Data

Excerpted from ReadWrite Report by Seth Payne

Big Data, just like Cloud Computing, has become a popular phrase to describe technology and practices that have been in use for many years. Ever-increasing storage capacity and falling storage costs, along with vast improvements in data analysis, have made Big Data available to a variety of new firms and industries.

Scientific researchers, financial analysts and pharmaceutical firms have long used incredibly large datasets to answer incredibly complex questions. Large datasets, especially when analyzed in tandem with other information, can reveal patterns and relationships that would otherwise remain hidden.

As a product manager within the Global Market Data group at NYSE Technologies, I was consistently impressed with how customers and partners analyzed the vast sets of market trade, quote and order-book data produced each day.

On the sell side, clients analyzed data spanning many years in an attempt to find patterns and relationships that could help fund portfolio managers build long-term investment strategies. On the buy side, clients mined more-recent data regarding the trade/quote activities of disparate assets. University and college clients sought data spanning decades. Regardless of the specific use case, clients required technology to process and analyze substantial and unwieldy amounts of data.

Various technologies are employed to meet the needs of these various use cases. For historical analysis, high-powered data warehouses such as those offered by 1010data, ParAccel, EMC and others, are incredible tools. Unlike databases, which are designed for simple storage and retrieval, data warehouses are optimized for analysis. Complex event processors such as those from One Market Data, KDB and Sybase give high-frequency and other algorithmic traders the ability to analyze market activity across a wide array of financial instruments and markets at any given microsecond throughout the trading day.

These technologies are now being deployed within new industries. Business intelligence tools such as those offered by Tableau and Microstrategy can now deal with very large and complex datasets. To a lesser extent, even Microsoft Excel has been retooled to handle Big Data with newly architected pivot tables and support for billions of rows of data within a single spreadsheet.

But Big Data is useful only if analysts ask the right questions and have at least a general idea of the relationships and patterns Big Data analysis may illuminate.

Is Big Data right for your company? The first question any firm must ask is whether it will benefit from Big Data analysis. Begin by understanding the data sets available to you. Analysis of 20 years of stock closing prices, for example, would not likely require the power of Big Data systems. Given the relatively small size of this dataset, analysis can, and probably should, be performed using SQL or even simply Excel.

But large sets of unsorted and unordered data — such as financial transactions, production output records and weather data — do require Big Data analysis to bring order to the chaos and shed light on relationships, trends and patterns made visible only by structured and systematic analysis.

To start, formulate a relatively simple hypothesis and use Big Data analysis to test it. The results of this analysis should reveal information that will lead to further, more complex questions.
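As a minimal sketch of "start with a simple hypothesis," the toy example below tests whether afternoon trades carry larger sizes than morning trades. The records and times are invented; a real analysis would run the same aggregation over the full dataset in a data warehouse or with SQL.

```python
# Hypothetical trade records: (time, trade size). Invented sample data.
records = [
    ("09:31", 200), ("10:05", 150), ("11:40", 180),
    ("13:02", 320), ("14:45", 410), ("15:55", 500),
]

# Split the sample at noon and compare average trade sizes.
morning = [size for t, size in records if t < "12:00"]
afternoon = [size for t, size in records if t >= "12:00"]

def avg(xs):
    return sum(xs) / len(xs)

# Hypothesis: afternoon trades are larger on average.
print(avg(morning), avg(afternoon))   # ~176.7 vs 410.0 on this sample
print(avg(afternoon) > avg(morning))  # hypothesis holds here
```

A result like this would then prompt the more complex follow-up questions the article describes, such as whether the pattern holds across instruments or persists on high-volume days.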

It is no surprise that the rise of Big Data has coincided with the rapid adoption of Infrastructure-as-a-Service (IaaS) and Platform-as-a-Service (PaaS) technologies. PaaS lets firms scale their capacity on demand and reduce costs, while IaaS allows the rapid deployment of additional computing nodes. Together, additional compute and storage capacity can be added almost instantaneously.

For example, a large hedge fund in New York used a cluster of computing nodes and storage to analyze the day's trade/quote activity across all U.S. equity markets. The size of the datasets used in the analysis - typically 10GB to 12GB compressed - was growing steadily, allowing the market data manager to accurately plan his capacity needs. On occasion, however, trade/quote volumes explode, creating exponentially larger data sets. On these occasions, the market data manager can deploy additional virtual machine (VM) nodes in the cluster, ensuring that even unusually large datasets do not significantly delay analysis, without having to permanently add expensive computing resources.
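The bursting decision in that example can be sketched as a simple sizing rule: keep a fixed baseline cluster, and rent extra short-lived VM nodes only when the day's dataset exceeds what the baseline can handle. The baseline size and per-node throughput below are hypothetical figures, not numbers from the article.

```python
# Hedged sketch of burst-capacity sizing for a daily analysis job.
import math

BASELINE_NODES = 4   # permanently provisioned cluster (assumed)
GB_PER_NODE = 3.0    # compressed GB one node can process in the window (assumed)

def nodes_needed(dataset_gb: float) -> int:
    # Never shrink below the baseline; round node count up.
    return max(BASELINE_NODES, math.ceil(dataset_gb / GB_PER_NODE))

def burst_nodes(dataset_gb: float) -> int:
    # Extra VMs to request from the IaaS provider for today only.
    return nodes_needed(dataset_gb) - BASELINE_NODES

print(burst_nodes(11))   # typical 10-12 GB day: no extra nodes
print(burst_nodes(30))   # unusual spike: rent extra nodes, release tomorrow
```

The economics follow directly: the firm pays for `BASELINE_NODES` year-round and for burst nodes only on spike days, instead of owning hardware sized for the worst case.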

The flexibility of cloud computing allows resources to be deployed as needed. As a result, firms avoid the tremendous expense of buying hardware capacity they'll need only occasionally.

While the cloud grants tremendous flexibility and reduces overall operation costs, it is not appropriate for all Big Data use cases.

For example, firms analyzing low-latency real-time data — aggregating Twitter feeds, for example — may need to find other approaches. The cloud does not currently offer the performance necessary to process real-time data without introducing latency that would make the results too "stale" (by a millisecond or two) to be useful. Within a few years, virtualization technology should accommodate these ultra low-latency use cases, but we're not there yet.

Cloud computing has given businesses flexible, affordable access to vast amounts of computing resources on demand - bringing Big Data analysis to the masses. As the technology continues to advance, the question for many businesses is how they can benefit from Big Data and how to use cloud computing to make it happen.

An Inside View of the State of BYOD

Excerpted from CIO Insight Report by Don Reisinger

In the land of the enterprise, there is one trend that CIOs, no matter how hard they try, simply can't sidestep: Bring your own device, or BYOD.

At a shockingly rapid rate, companies around the globe are letting their employees bring their own devices to the office.

What's more, those smart-phones, tablets, and laptops are being used for all types of corporate functions, including checking e-mail, accessing company data, and connecting to the corporate network.

While BYOD does improve employee collaboration and productivity, it also raises important security issues.

A new study from the security firm Webroot describes in depth the extent to which BYOD has penetrated the corporate market.

Webroot found that not only do a majority of companies have BYOD policies in place, but an overwhelming number of them are watching employees use personal devices for work even when they're not authorized to do so.

In other words, it's either support the BYOD trend or accept that employees will no longer listen to the IT side of the business.

Verizon Offering Private IP out of 16 Equinix Data Centers

Excerpted from Data Center Knowledge Report by Jason Verge

Verizon is offering its Private IP service out of 16 Equinix data centers around the world. The partnership means Verizon customers gain access to over 300 cloud service providers inside Equinix data centers, and Equinix bolsters its connectivity options. Fully integrated solutions will be available by the end of 2013.

"By leveraging Verizon's Private IP Service and Equinix data center and interconnection services, customers can count on high levels of security, reliability and performance as they grow their businesses and expand into new markets," said Equinix CEO and President Steve Smith.

Verizon said in the release that rapid growth of cloud computing and software-as-a-service (SaaS) applications is driving demand for data center services, so there's a reason to get cushy with Equinix, which is the leading player in interconnection and has been building up its ecosystem of partners.

"Verizon and Equinix are market leaders, and the new alliance we have formed will provide customers of both companies the foundation to transform their businesses," said John Stratton, president, Verizon Enterprise Solutions. "Whether it is e-commerce applications, big data, media and entertainment or cloud computing, enterprises need to quickly and securely access and move information from various points around the world."

Verizon's Private IP solution is offered in 147 countries across six continents (not much infrastructure on the 7th). Verizon offers reporting and monitoring tools with its private IP solution to let users customize their network to meet needs. The company recently announced 100GE access for Private IP at the end of May.

Octoshape to Power Deutsche Telekom's LiveStream Video Perform

Excerpted from Telecom Lead Report

Deutsche Telekom's new cloud-based video solution provides the best quality video distribution of over-the-top (OTT) content via fixed and mobile broadband networks.

The LiveStream Perform solution is capable of providing the dial-tone reliability of broadcast TV through broadband networks.

Powered by Octoshape's streaming technology, LiveStream Perform enables scale and efficiency for operators and provides predictable business models for broadcasters.

Karim El-Khazen, vice president, Business Development and Innovation, said: "Thanks to the Octoshape technology, we can expand our product portfolio with a service that represents a differentiating factor and accompanies our customers into the next phase of video distribution over the Internet."

Infonetics Research recently said the cost of implementing a service delivery platform (SDP) is a major barrier to deployment, and operators are turning to more productized solutions, including commercial off-the-shelf software.

"Though many of the basic drivers behind SDP investments haven't changed — things like shortening time to market for new services and reducing Opex — operators are definitely looking to SDPs to monetize their networks and manage the impact of over-the-top providers on the bottom line," said Shira Levine, directing analyst for service enablement and subscriber intelligence at Infonetics Research.

According to an Infonetics Research survey, getting services to market faster is overwhelmingly the top business driver behind survey respondents' SDP investments, followed closely by exposing network APIs for third-party developers and internal development teams.

Around 46 percent of operators surveyed say their SDPs are or will be based primarily on commercial off-the-shelf (COTS) software with only minor customization. Half of respondent operators named their IT department as the group that holds the purse strings for SDP purchase decisions, and 33 percent named marketing.

4 Technology Insights from Citrix Synergy 2013

Excerpted from Government Technology Report by Hilton Collins

Last month, technology professionals traveled to Anaheim, CA, to network with peers and learn more about application virtualization technology, cloud computing, and mobility at Citrix Synergy 2013, a conference dedicated to insight, training, and networking.

David Smith, Citrix's Director of State and Local Government, and Tom Simmons, the company's Area Vice President for the US public sector, spoke to Government Technology at the conference about issues that public-sector leaders deal with in data management, virtualization, mobility, and cloud deployment.

1. Policy Forces the Government to Approach Data Management Differently than the Private Sector

Public-sector agencies have the same technology needs as private enterprises, but the government's policy restrictions often force the public sector to manage data differently than the private sector would have to.

Simmons spoke of requirements unique to federal agencies. "We do see some unique technology requirements," he said, "but generally, I'd say 90 percent of the technology that the private sector embraces will meet the needs of the federal [government]."

He mentioned the Federal Information Processing Standard (FIPS), for example, which is a set of standards for encoding and encrypting data. And when mobile technology is involved, further adjustments must be made in government data management. IT leaders typically want to restrict access to some data types if an employee is accessing the government network on a personal tablet or smartphone.

"A lot of people are looking at BYOD as a way to handle some mobility challenges," Smith noted, "but depending on what type of data, policy may dictate that that's not the right type of device to deliver information to."

2. The Mobile Workforce Creates New Data Management Challenges for Government

Simmons and Smith spoke further about mobility challenges. Data management isn't the only thing that can get complicated when employees go mobile; application services delivery can grow challenging as well.

"Mobility has caused customers to look a lot more at how they deliver applications and data to employees," Smith said. "How can you deliver an application to an end user or data to an end user, but not actually leave that data resident on the device?"

But not all government workplaces have the same standards when it comes to allowing data to reside on mobile endpoints. It's up to a particular agency to determine what should be segregated and what shouldn't.

"In some civilian agencies," Simmons said, "it's a concept of whether it's a government-furnished device or a bring-your-own-device kind of a concept, 'How do I segregate and separate my government data and applications from my personal data and applications, like my photos and my iTunes and things like that?'"

3. Virtualization Can Play a Role in More Sophisticated Data Management and Storage

Can virtual server deployment keep data centers from becoming crammed with physical servers? Perhaps, but Simmons and Smith feel that the problems and solutions to data server and storage issues are more complicated than that.

"I think the cost of storage has come down so dramatically over the last five years that it's not an issue of capacity. It's an issue of the ability to analyze and access data," Simmons said, adding that the most pressing concern is figuring out how to analyze and make use of surging data volumes to increase operational intelligence.

Smith said that it takes a combination of technologies to solve the data storage volume problem, when one exists, and virtualization can play a part in it.

"I think that new strategies on how to manage the massive amount of data that is increasingly created and maintained is a larger problem beyond just virtualization," he said. "Virtualization can provide some keys to simplifying some of the challenges."

4. Cloud Computing Helps Government Modify Cybersecurity Strategy

When government enterprises migrate data and operations to cloud providers, they typically give up some control over how everything gets managed, including security. But adopters can configure cloud environments with security in mind.

According to Simmons, many Citrix government clients want segregated cloud environments, so data within one private cloud stays separate from data in a public cloud.

Cloud computing can also simplify security as employees become more mobile, Smith said. "Finding a new strategy to consolidate the delivery of information through a cloud-like service is actually providing a more secure environment than what they have traditionally done in having data reside often on endpoint," he said, adding that in his opinion, the cloud computing model may allow governments to create new, innovative ways to provide services.

"I think cloud shows promise in providing new ways to deliver services [and] make it quicker and more expedient to provide access to different agency services to the citizen but also in finding new and different ways to deliver services that may not have been thought of in the past," he said.

Edward Snowden: The Whistleblower behind the NSA Surveillance Revelations

Excerpted from The Guardian Report by Glenn Greenwald

The individual responsible for one of the most significant leaks in US political history is Edward Snowden, a 29-year-old former technical assistant for the CIA and recently terminated employee of the defense contractor Booz Allen Hamilton.

Snowden has been working at the National Security Agency for the last four years as an employee of various outside contractors, including Booz Allen and Dell.

The Guardian, after several days of interviews, is revealing his identity at his request. From the moment he decided to disclose numerous top-secret documents to the public, he was determined not to opt for the protection of anonymity. "I have no intention of hiding who I am because I know I have done nothing wrong," he said.

Snowden will go down in history as one of America's most consequential whistleblowers, alongside Daniel Ellsberg and Bradley Manning. He is responsible for handing over material from one of the world's most secretive organizations — the NSA.

In a note accompanying the first set of documents he provided, he wrote: "I understand that I will be made to suffer for my actions," but "I will be satisfied if the federation of secret law, unequal pardon and irresistible executive powers that rule the world that I love are revealed even for an instant."

Despite his determination to be publicly unveiled, he repeatedly insisted that he wants to avoid the media spotlight. "I don't want public attention because I don't want the story to be about me. I want it to be about what the US government is doing."

He does not fear the consequences of going public, he said, only that doing so will distract attention from the issues raised by his disclosures. "I know the media likes to personalize political debates, and I know the government will demonize me."

Despite these fears, he remained hopeful his outing will not divert attention from the substance of his disclosures. "I really want the focus to be on these documents and the debate which I hope this will trigger among citizens around the globe about what kind of world we want to live in." He added: "My sole motive is to inform the public as to that which is done in their name and that which is done against them."

He has had "a very comfortable life" that included a salary of roughly $200,000, a girlfriend with whom he shared a home in Hawaii, a stable career, and a family he loves. "I'm willing to sacrifice all of that because I can't in good conscience allow the US government to destroy privacy, Internet freedom and basic liberties for people around the world with this massive surveillance machine they're secretly building."

Three weeks ago, Snowden made final preparations that resulted in last week's series of blockbuster news stories. At the NSA office in Hawaii where he was working, he copied the last set of documents he intended to disclose.

He then advised his NSA supervisor that he needed to be away from work for "a couple of weeks" in order to receive treatment for epilepsy, a condition he learned he suffers from after a series of seizures last year.

As he packed his bags, he told his girlfriend that he had to be away for a few weeks, though he said he was vague about the reason. "That is not an uncommon occurrence for someone who has spent the last decade working in the intelligence world."

On May 20, he boarded a flight to Hong Kong, where he has remained ever since. He chose the city because "they have a spirited commitment to free speech and the right of political dissent", and because he believed that it was one of the few places in the world that both could and would resist the dictates of the US government.

In the three weeks since he arrived, he has been ensconced in a hotel room. "I've left the room maybe a total of three times during my entire stay," he said. It is a plush hotel and, what with eating meals in his room too, he has run up big bills.

He is deeply worried about being spied on. He lines the door of his hotel room with pillows to prevent eavesdropping. He puts a large red hood over his head and laptop when entering his passwords to prevent any hidden cameras from detecting them.

Though that may sound like paranoia to some, Snowden has good reason for such fears. He worked in the US intelligence world for almost a decade. He knows that the biggest and most secretive surveillance organization in America, the NSA, along with the most powerful government on the planet, is looking for him.

Since the disclosures began to emerge, he has watched television and monitored the Internet, hearing all the threats and vows of prosecution emanating from Washington.

And he knows only too well the sophisticated technology available to them and how easy it will be for them to find him. The NSA police and other law enforcement officers have twice visited his home in Hawaii and already contacted his girlfriend, though he believes that may have been prompted by his absence from work, and not because of suspicions of any connection to the leaks.

"All my options are bad," he said. The US could begin extradition proceedings against him, a potentially problematic, lengthy and unpredictable course for Washington. Or the Chinese government might whisk him away for questioning, viewing him as a useful source of information. Or he might end up being grabbed and bundled into a plane bound for US territory.

"Yes, I could be rendered by the CIA. I could have people come after me. Or any of the third-party partners. They work closely with a number of other nations. Or they could pay off the Triads. Any of their agents or assets," he said.

"We have got a CIA station just up the road — the consulate here in Hong Kong — and I am sure they are going to be busy for the next week. And that is a concern I will live with for the rest of my life, however long that happens to be."

Having watched the Obama administration prosecute whistleblowers at a historically unprecedented rate, he fully expects the US government to attempt to use all its weight to punish him. "I am not afraid," he said calmly, "because this is the choice I've made."

He predicts the government will launch an investigation and "say I have broken the Espionage Act and helped our enemies, but that can be used against anyone who points out how massive and invasive the system has become".

The only time he became emotional during the many hours of interviews was when he pondered the impact his choices would have on his family, many of whom work for the US government. "The only thing I fear is the harmful effects on my family, who I won't be able to help any more. That's what keeps me up at night," he said, his eyes welling up with tears.

Snowden did not always believe the US government posed a threat to his political values. He was brought up originally in Elizabeth City, North Carolina. His family moved later to Maryland, near the NSA headquarters in Fort Meade.

By his own admission, he was not a stellar student. In order to get the credits necessary to obtain a high school diploma, he attended a community college in Maryland, studying computing, but never completed the coursework. (He later obtained his GED.)

In 2003, he enlisted in the US army and began a training program to join the Special Forces. Invoking the same principles that he now cites to justify his leaks, he said: "I wanted to fight in the Iraq war because I felt like I had an obligation as a human being to help free people from oppression".

He recounted how his beliefs about the war's purpose were quickly dispelled. "Most of the people training us seemed pumped up about killing Arabs, not helping anyone," he said. After he broke both his legs in a training accident, he was discharged.

After that, he got his first job in an NSA facility, working as a security guard for one of the agency's covert facilities at the University of Maryland. From there, he went to the CIA, where he worked on IT security. His understanding of the Internet and his talent for computer programming enabled him to rise fairly quickly for someone who lacked even a high school diploma.

By 2007, the CIA stationed him with diplomatic cover in Geneva, Switzerland. His responsibility for maintaining computer network security meant he had clearance to access a wide array of classified documents.

That access, along with the almost three years he spent around CIA officers, led him to begin seriously questioning the rightness of what he saw.

He described as formative an incident in which he claimed CIA operatives were attempting to recruit a Swiss banker to obtain secret banking information. Snowden said they achieved this by purposely getting the banker drunk and encouraging him to drive home in his car. When the banker was arrested for drunk driving, the undercover agent seeking to befriend him offered to help, and a bond was formed that led to successful recruitment.

"Much of what I saw in Geneva really disillusioned me about how my government functions and what its impact is in the world," he says. "I realized that I was part of something that was doing far more harm than good."

He said it was during his CIA stint in Geneva that he thought for the first time about exposing government secrets. But, at the time, he chose not to for two reasons.

Please click here for the full report.

Coming Events of Interest

2013 Creative Storage Conference - June 25th in Culver City, CA. Co-sponsored by the DCIA, CSC:2013 offers attendees opportunities to make valuable connections and participate in the latest trends and requirements for digital storage to serve creative minds and create new and growing markets. Register now and save $100.

Cloud World Forum - June 26th-27th in London, England. The Cloud World Forum offers a comprehensive agenda and speaker line-up from the cloud sector, making it an ideal platform for global authorities to present their "how-to" strategy and vision. It features many recognized headline participants along with detailed coverage of the enterprise IT market.

Cloud Computing Summit - July 16th-17th in Bryanston, South Africa. Advance your awareness of the latest trends and innovations from the world of cloud computing. This year's ITWeb-sponsored event will focus on key advances relating to the infrastructure, operations, and available services through the global network.

NordiCloud 2013 - September 1st-3rd in Oslo, Norway. The Nordic Symposium on Cloud Computing & Internet Technologies (NordiCloud) aims at providing an industrial and scientific forum for enhancing collaboration between industry and academic communities from Nordic and Baltic countries in the area of Cloud Computing and Internet Technologies.

P2P 2013: IEEE International Conference on Peer-to-Peer Computing - September 9th-11th in Trento, Italy. The IEEE P2P Conference is a forum to present and discuss all aspects of mostly decentralized, large-scale distributed systems and applications. This forum furthers the state-of-the-art in the design and analysis of large-scale distributed applications and systems.

CLOUD COMPUTING WEST 2013 — October 27th-29th in Las Vegas, NV. Two major conference tracks will zero in on the latest advances in applying cloud-based solutions to all aspects of high-value entertainment content production, storage, and delivery; and the impact of mobile cloud computing and Big Data analytics in this space.

CCISA 2013 – February 12th–14th in Turin, Italy. The second international special session on Cloud Computing and Infrastructure as a Service (IaaS) and its Applications, held within the 22nd Euromicro International Conference on Parallel, Distributed and Network-Based Processing.

Copyright 2008 Distributed Computing Industry Association
This page last updated June 23, 2013
Privacy Policy