Distributed Computing Industry
Weekly Newsletter

In This Issue

Partners & Sponsors

A10 Networks

Aspera

Citrix

Oracle

Savvis

SoftServe

TransLattice

Vasco

Cloud News

CloudCoverTV

P2P Safety

Clouderati

eCLOUD

fCLOUD

gCLOUD

hCLOUD

mCLOUD

Industry News

Data Bank

Techno Features

Anti-Piracy

October 7, 2013
Volume XLV, Issue 7


Learn How to Overcome NSA Surveillance at CLOUD COMPUTING WEST

Is the controversy surrounding NSA surveillance a threat to advancement of the cloud computing industry? Or is this an opportunity for responsive solutions, legislative reform, and new business practices to accelerate growth? Make your voice heard, understand the real impacts, and come away with a clear action plan that will help increase sales and boost profitability.

Register today for CLOUD COMPUTING WEST 2013 (CCW:2013), the Cloud Computing Association's (CCA) and Distributed Computing Industry Association's (DCIA) business strategy summit taking place October 27th-29th at The Cosmopolitan in Las Vegas, NV.

Don't miss the timely and important Sunday afternoon Opening Session TOWN HALL MEETING ON THE NSA PRIVACY SCANDAL AND THE CLOUD COMPUTING INDUSTRY.

What can be done in response to this challenge so that your business actually improves as a result and the industry continues to advance? What improvements in architecture, encryption, and data processing methods could mitigate threats like this through new technological solutions?

Will the NSA controversy spur passage of new laws and the establishment of new regulations — how can legislative reform benefit the cloud computing industry? What business practices, voluntary industry standards, and other private sector actions can we take individually and jointly to overcome this issue and help expand our businesses?

Our newly added Town Hall Meeting will assess the impact of this controversy, outline the legislative reform process underway in Congress, and drive for proactive responses the cloud computing industry can make to foster growth.

Has this scandal affected your business and if so how? What should Congress do to mitigate this and prevent a recurrence? A coalition of more than two dozen affected parties is now advocating federal legislative reform to ensure the privacy of our data stored in the cloud. Learn first-hand from Jim Dempsey, Vice President for Public Policy, Center for Democracy and Technology (CDT) about the very latest developments on the Hill.

How can the cloud computing industry respond to increase our growth prospects? This is your chance to better understand the impacts of this controversy on business, to make your voice heard on this vital issue, and to come away with a clear action plan that will help expand sales and boost profitability through more advanced solutions and improved business practices.

At the conference you'll interact with media and entertainment sector companies like ABC-Disney-ESPN, Comcast, DirecTV, Netflix, Sony Games, and Warner Bros.; cloud computing leaders like Amazon Web Services (AWS), IBM, Microsoft, Oracle, Rackspace, and TransLattice; mobile cloud players like AT&T, Dell, Hewlett-Packard, NTT Data, Sprint Nextel, and Toshiba; insightful analysts, advocates, and industry observers — like ABI Research, CDT, Hughes Hubbard, and the authors of 21st Century Television: The Players, The Viewers, The Money and Securing the Cloud.

SIGN UP NOW for CCW:2013. Please click here for exhibiting and sponsoring information and here to apply to speak at this event.

The Internet Society Blasts US Online Spying

In a position paper responding to reports of the US government's circumvention of encryption technology, the Internet Society expressed alarm at the alleged programs that "are a fundamental threat to the Internet's economic, innovative, and social potential."

The Society's Paul Brigner talks about the group and its reaction to government online surveillance.

Watch the Video.

Report from CEO Marty Lafferty

Plan now to attend CLOUD COMPUTING WEST 2013 (CCW:2013), the Cloud Computing Association's (CCA) and Distributed Computing Industry Association's (DCIA) business strategy summit taking place October 27th-29th at The Cosmopolitan in Las Vegas, NV.

The event will begin on Sunday afternoon October 27th with the newly added "Town Hall Meeting on the NSA Privacy Scandal and the Cloud Computing Industry" featuring Center for Democracy and Technology (CDT) Vice President for Public Policy Jim Dempsey, along with Las Vegas Sands Corporation Global CIO Les Ottolenghi, Rackspace Cloud Products Program Manager Tom Hopkins, Rafelson Media CEO Peter Rafelson, Edwards Wildman Palmer Partner Larry Freedman, Wilson Sonsini Goodrich & Rosati Of Counsel Gerry Stegmaier, and others followed by a Networking Meet-Up for speakers, delegates, and exhibitors.

Monday morning October 28th will open with plenary session keynotes on "The State of Cloud Computing Adoption for Entertainment" by Amazon Web Services (AWS) Media & Entertainment Partner Eco-System Manager Bhavik Vyas and ABI Research Practice Director Sam Rosen, who will address the "Consumer Transition to the Cloud: Service Provider & OTT Video, Gaming, and Music Services."

Next we'll explore "The Needs of Enterprise End-Users in the Media Sector" with Netflix Architect and Principal Engineer Mikey Cohen examining key "Cloud Migration Considerations" and Las Vegas Sands Corporation Global CIO Les Ottolenghi outlining "International Media Enterprise Requirements."

After a mid-morning Networking Break, we'll delve into some of the "Latest Trends and Newest Offerings" with Microsoft Platform Technology Evangelist Yung Chou keynoting on "Hybrid Cloud, An Emerging IT Computing Model" and Rackspace Cloud Products Program Manager Tom Hopkins presenting "Strawberry Coconut Cloud — You Choose the Flavor."

After a panel discussion of "Outstanding Issues," which will add Hughes Hubbard & Reed New Media, Entertainment & Technology Leader Dan Schnapp to the morning's keynote speakers, we'll break for our Conference Luncheon, followed by Dessert and Coffee Service in the Exhibit Hall.

Then a series of eCLOUD sessions will feature such topics as "The Cloud & Television" by Frank Aycock, author of 21st Century Television: The Players, The Viewers, The Money, "Collaboration & Production" by TransLattice CEO Frank Huerta, "Editing & Transcoding" by V2 Solutions VP of Media Technology and Solutions Adam Powers, and a panel discussion with Rafelson Media CEO Peter Rafelson, GenosTV CTO Mike West, TransLattice Architect and Director of Research Robert Ross, and ZYNC Render CMO Todd Prives.

A series of mCLOUD sessions will cover "Mobile Storage Considerations" by CSS Corp. VP and CTO Carrier Services Melody Yuhn, "Lowering Latency" by Aryaka President & CEO Ajit Gupta, "HyperElasticity" by Kwaai Oak CTO Reza Rassool, and a panel discussion on "Mobile and Big Data Management" that will add Sprint Nextel Cloud Solutions Manager Jay Gleason and Wilson Sonsini Goodrich & Rosati Of Counsel Gerry Stegmaier.

After a mid-afternoon Networking Break, the eCLOUD will continue with "Cloud-Based Content Management Processes" by Autodesk Senior Global Industry Marketing Manager Richard Blatcher, "Distribution Channel Storage" by Savvis Senior Director - Media Tom Moran, "Cloud-Based Delivery Systems" by Intertrust Technologies Corp. Vice President for Product Management John Gildred, "Cloud Media Lockers" by Securing the Cloud Author Vic Winkler, and a panel discussion on "Security & Reliability Issues" that will add PADEM Group President and Chief Analyst Allan McLennan.

The mCLOUD will continue with sessions including "Big Data Infrastructure" by HP Converged Systems Senior Vice President & General Manager Tom Joyce, "Cloud & Big Data — The Perfect Storm" by ViaWest CTO Jason Carolan, "Analytic Programs" by Master Control Senior Product Manager Cloud Solutions Victor Gill, "Big Data Software Applications" by Oracle Director Product Marketing SDP and Cloud Solutions Brian Kracik, and a panel discussion on "Security & Reliability Issues" that will add Akamai Product Line Director Enterprise Cloud Gary Ballabio, BrightLine Principal Cloud Assurance and Compliance Doug Barbin, and VASCO Data Security VP of Product Marketing Michael O'Malley.

Monday will end with an Evening Networking Reception.

Tuesday morning, the eCLOUD will resume with IBM Cloud Architecture Executive Mark Sorency explaining "How to Build Your Cloud Strategy," Unitas Global Co-Founder Grant Kirkwood discussing "Cloud Vendor Selection for Media Companies," and SAP America Media Industry Principal Kurt Kyle examining "Cloud Economics in the Entertainment Sector," followed by a panel forecasting "Future Cloud Opportunities for Media & Entertainment" that will add Citrix Principal Product Manager Cloud Platform Group Manan Shah and Gaikai Chief Business Officer & SVP of Strategy Robert Stevenson.

The mCLOUD will continue with FalconStor VP Enterprise Solutions Chris Poelker on "The Cloud and Big Data," Red Bend Software EVP Marketing Lori Sylvia offering guidance on "Differentiating with Cloud-Based Mobile Services," and "Mobile Cloud / Big Data Economics" presented by SoftServe VP Technology Solutions Russ Hertzberg, followed by a panel on "Future Mobile Cloud & Big Data Opportunities" that will add NTT Data Vice President Alkesh Shah and Rafelson Media CEO Peter Rafelson.

After a mid-morning Networking Break, "Final Considerations" will feature DataDirect Networks Director of Marketing for Cloud, Content & Media Mike King answering the question "Does Object Storage Actually Fit into File-Based Workflows?" and Dell Enterprise Cloud Evangelist Michael Elliott discussing "Hybrid Clouds — The End State," as well as plenary keynotes from Aspera Director of Cloud Platforms & Services Jay Migliaccio and Equilibrium CEO Sean Barger. The closing plenary panel on "What's Next for Cloud Computing" will add Edwards Wildman Palmer Partner Larry Freedman and Equilibrium VP of Business Development Daniel Kenyon.

Please click here to exhibit or sponsor the show and click here to apply to speak at CCW:2013. Share wisely, and take care.

Twitter Plans to Raise $1 Billion through IPO

Excerpted from Multichannel News Report by Mike Farrell

Social media phenomenon Twitter removed the veil from its initial public offering (IPO) intentions Thursday, revealing plans to raise as much as $1 billion in public funds as it moves to cement its place in the television and mobile landscapes.

Twitter first revealed plans to go public last month, announcing via tweet that it had made private filings with the Securities and Exchange Commission (SEC) as part of a program that allows smaller companies to keep financial information shielded from the public as they try to drum up investor support.

According to the documents, the company nearly tripled revenue from $106.3 million in 2011 to $316.9 million in 2012 and is on pace to grow by another 60% in 2013, reporting $253.6 million in sales in the first half of this year. At the same time the social media giant is hurtling toward profitability, reporting its first year of positive cash flow in 2012 — $21.2 million. In the first half of this year, EBITDA has grown to $21.4 million.

The IPO is a fraction of that of Twitter's main social media rival — Facebook — which launched a $16 billion offering last year. But that is mainly due to Twitter's size — Facebook reported revenue of $5.1 billion in 2012, compared to Twitter's $316.9 million.

Twitter, once derided because of doubts that it could build a business model on the 140-character ramblings of its users, has proven to be a force in advertising and is gaining a foothold in the television market, with recent deals with ratings measurement giant Nielsen. In the IPO, the company noted that it has grown monthly active users (MAUs) from 138 million in March 2012 to 218 million by June 30, 2013. Since its inception in 2007, users have issued more than 300 billion tweets.

And Twitter believes it has only scratched the surface. It notes in the prospectus that there is still a significant opportunity to expand its base — of the world's 2.4 billion Internet users and 1.2 billion smartphone users, only 215 million are MAUs of Twitter.

According to the document, some of the proceeds from the IPO will be used to address expanding that base through geographic expansion, additional mobile applications, product development and establishing platform partners.

The company has been in a race for dominance of the second-screen with rival Facebook, and has created campaigns around big TV events like the NBA Finals, the NCAA Basketball Tournament, the Super Bowl (more than 24 million tweets were sent during last year's contest alone) and shows like ABC's Scandal. The company said in the prospectus that it is looking to expand its presence in the television arena.

"We plan to continue to leverage our media relationships to drive more content distribution on our platform and create more value for our users and advertisers," the company said in the prospectus.

Twitter hasn't set a date for its IPO yet; that should come in subsequent filings. The company did not specify which exchange it will trade on, but has picked its ticker symbol — "TWTR." Goldman Sachs is the lead underwriter for the offering, followed by Morgan Stanley and JP Morgan.

Verizon Launches New Cloud Service

Verizon today announced Verizon Cloud — its new cloud Infrastructure as a Service (IaaS) platform and cloud-based object storage service. With this service, Verizon is fundamentally changing how public clouds are built.

Large enterprises, mid-size companies and small development shops will get the agility and economic benefit of a generic public cloud along with the reliability and scale of an enterprise-level service with unprecedented control of performance. The public beta for Verizon Cloud will launch in the fourth quarter of this year.

"Verizon created the enterprise cloud, now we're recreating it," said John Stratton, president of Verizon Enterprise Solutions. "This is the revolution in cloud services that enterprises have been calling for. We took feedback from our enterprise clients across the globe and built a new cloud platform from the bottom up to deliver the attributes they require."

Verizon Cloud has two main components: Verizon Cloud Compute and Verizon Cloud Storage. Verizon Cloud Compute is the IaaS platform. Verizon Cloud Storage is an object-based storage service.

Verizon Cloud Compute is built for speed and performance. Virtual machines (software-based computers and servers) can be created and deployed in just seconds, and users build and pay for what they need. With Verizon Cloud Compute, users can determine and set virtual machine and network performance, providing predictable performance for mission critical applications, even during peak times.

Additionally, users can configure storage performance and attach storage to multiple virtual machines. Previously, services had pre-set configurations for size (e.g. small, medium, large) and performance, with little flexibility regarding virtual machine and network performance and storage configuration. No other cloud offering provides this level of control.

In addition, while Verizon built the solution for enterprises, it is nimble enough to meet the needs of small and medium businesses, individual IT departments and software developers.

Verizon Cloud Storage is an object-addressable, multitenant storage platform providing safe, durable, reliable and cost-effective storage accessible from anywhere on the Web. Object storage is extra robust and Web-traffic reliable, making it ideal for cloud-based applications. Verizon Cloud Storage overcomes latency issues that have plagued many traditional storage offerings, providing improved performance.

Cloud Computing and the Rise of Big Data

Excerpted from TechRepublic Report by Nick Hardiman

The cloud enables big data processing for enterprises of all sizes by relieving a number of problems, but there is still complexity in extracting the business value from a sea of data.

At first glance, it isn't obvious why the unstructured data methods of the new big data world are even necessary. Even if new methods bring new business value, why not stay on-premise? Why bother with cloud databases?

Big data is one of those new, shiny labels, like SDN, DevOps and cloud computing, that is both hard to ignore and hard to understand. There is no single "big data" type — it is a collective label stuck on unstructured data, the technology stack it inhabits, and the new business processes that are growing up around it.

For instance, the discipline of big data analytics is about getting business value out of large data sets. Data scientists work with resources and processes to turn data into useful information. The classic Relational DataBase Management System (RDBMS) can handle a lot of data, and has been doing so for decades.

Why can't a data scientist stick with structured data in an RDBMS? Which is best — RDBMS or NoSQL?

The technical stack an enterprise chooses is dictated by the type of data they need to store, and the type of data is dictated by business requirements.

The RDBMS is good for managing structured, highly relational data and will continue to be the software of choice for many requirements.

For the growing amount of unstructured data produced by social media, sensor networks, and federated analytics, and for constantly changing data that needs to be replicated to other operating sites or mobile workers, NoSQL technologies are a better fit. Unstructured data can be terabytes or even petabytes in size.

The RDBMS is the type of storage software that has been dominant for decades. All data in an RDBMS is structured — clean, ordered and easy to understand. That makes it good for some work but bad at others. RDBMS products are also well known; a generation of DB administrators is experienced in RDBMS care and feeding.
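The contrast can be sketched in a few lines of Python, using SQLite for the relational side and a plain dictionary of JSON blobs standing in for a document store (the table, keys, and field names are illustrative, not drawn from any particular product):

```python
import sqlite3
import json

# Relational storage: a fixed schema must be declared up front,
# and every row conforms to it.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT, email TEXT)")
conn.execute("INSERT INTO users (name, email) VALUES (?, ?)",
             ("Ada", "ada@example.com"))
row = conn.execute("SELECT name, email FROM users WHERE id = 1").fetchone()

# Document-style storage: each record is a free-form JSON blob,
# so new or nested fields can appear without a schema migration.
doc_store = {}
doc_store["user:1"] = json.dumps({
    "name": "Ada",
    "email": "ada@example.com",
    "posts": ["hello world"],   # nested, unstructured field
})
doc = json.loads(doc_store["user:1"])

print(row)            # ('Ada', 'ada@example.com')
print(doc["posts"])   # ['hello world']
```

The relational row is easy to query and join but rigid; the document can grow arbitrary structure per record, which is exactly the trade-off the article describes.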

One big problem with an RDBMS is when it gets too busy. When the quantity of data starts filling up the disk, and the queries are thrashing the CPU and the result sets choke the RAM, more resources are required to keep the DBMS working. There is only one way to scale, and that's "up." Scaling out doesn't work because a relational database service only has one front door. And the only way to scale up is to buy a bigger box.

Scaling up does not cure RDBMS problems. Even the biggest computer, with its huge IT budget-gobbling price tag, only solves the resource problem. The IT department still has to solve other problems like HA fail-over, disaster recovery and storing data where it's needed.

If the infrastructure is on-premise, there are traditional problems to overcome. Managing an on-premise RDBMS is expensive and time consuming. An on-premise MySQL, Oracle or SQL Server database service is propped up by an overloaded IT department with a queue of work and inflexible hardware. If an enterprise rents Microsoft Azure Database, Google Cloud SQL or Amazon RDS, these infrastructure headaches go away.

In theory, managing cloud-based big data is cost-effective, scalable, and fast to build. Unfortunately, it's not all good news.

DB administrators don't have an easy ride. The NoSQL databases that have appeared in the last few years, with their key-value pairs, document stores, and missing schemas, don't look like the relational databases they are slowly replacing. Also, the new rivers of data are difficult to capture, store, process, report on, and archive.

It's not so bad for system administrators. If they run a private cloud, the new unstructured data technology stack of hardware and software looks like the old structured data stack — IaaS at the bottom, a database service in the middle, and applications on top delivering the business value. If they manage public cloud services, they don't have to touch the lower layers of the technology stack.

Sticking data in Windows Azure Tables, Amazon SimpleDB, or MongoDB is just the start of the data science required to make the most of big data. There is plenty of business partnering, re-skilling and other attitude adjustment to take care of.

Mobile Cloud Forces Redesign to Combat Threats

Excerpted from v3.co.uk Report by Alastair Stevenson

Businesses will have to redesign their networks from the ground up if they hope to protect their data from next-generation hackers, according to McAfee President Michael DeCesare.

DeCesare said business and high-tech companies will have to use a by-design strategy if they wish to remain ahead of the threats they face. He was speaking during a keynote at the McAfee Focus conference.

"We have to figure out how to integrate security into networks from the get-go. We have to redefine the role of network security. Companies are going to have to change. All companies will be rebuilding their networks," he said.

DeCesare cited new trends resulting from developments in mobile cloud technologies, such as bring-your-own-device (BYOD), as proof of the weakness of current networks.

"We are asking so much of our networks these days, not just with security, but in general. But, when we designed these networks five or 10 years ago we did not contemplate what we'd be asking of them today: to ingest the concept of a public or a private cloud, adjusting to the parameters of BYOD," he said.

"We're also asking them to be able to offer higher levels of security on any bit of information or device connecting to them and we're balancing that with the concept of software-defined networking. What is happening in the network space is the same thing that was happening with the data centre space over the last 10 years — virtualization is coming to the network."

He added that businesses will have to move quickly to address the problem as it is now far easier for hackers to target them. "There is a physical divide as the budgets we have aren't growing at the same rate as the technological sophistication of the adversaries that we face every day. Lastly, there is a playing-field divide: it is asymmetrical warfare. As security professionals we try and guard against this growing number of attacks on every IP-enabled device," he said.

"But every one of these that comes online is on different platforms, different software versions and we as security professionals have to guard all of these. But the adversaries we deal with every day just have to find one way in. The adversaries also don't have to worry about usability standards, they don't care if they break machines."

DeCesare said the trend is even more troubling as hackers have already begun using the new technologies to create next-generation cyber attacks. "We've seen an increase over the last 12 months of targeted Trojans. Sure we've seen these before, what's changed is the deployment model. A big thing is the concept of free apps. Ten years ago you'd have never downloaded anything free to your laptop or phone, free was bad news. Now this has changed and applications can be built for different purposes," he said.

"We've also seen an increase in evasion techniques. Malware is now able to know if it's in a sandbox and sit idle until the scan's finished before moving on. These are all examples of the growing technical sophistication of where the cyber criminals are heading."

McAfee is one of many security companies to warn of the dangers posed by new smart devices and cloud services. Last month F-Secure web reputation service expert Christine Bejerasco claimed that the failure of free cloud services, such as Facebook, Twitter and Dropbox, to adequately test their security before launching helped to ignite the current cybercrime boom.

John McAfee to Launch NSA-Proof P2P Networking Tool

Excerpted from The Guardian Report by Alex Hern

If there's anyone who would know about staying under the government's radar, it's John McAfee.

That might be why the controversial programmer's return to the world of IT is a device designed for making personal encrypted networks — perfect for keeping users' business away from the NSA.

McAfee made a public appearance at the C2SV Conference in San Jose, CA to reveal his new company, Future Tense, and its first product — a small piece of hardware called D-Central.

The device, which can be dropped in a pocket or a bag, creates a localized wireless network designed to exist on a "lower scale" than the Internet.

D-Central can either be set in a private mode, which provides encryption for all users but leaves them mutually identifiable, or a public mode, allowing users to make files available to the public while still maintaining anonymity.

"The NSA helped create every single encryption algorithm that we use, and therefore can get access to anything they want," McAfee told the conference.

"I'm 68 years old and if you can just give me any small amount of information about yourself, I promise you within three days, I can turn on the camera on your computer at home and watch you do whatever you're doing."

He claimed that users could request any file, which would then automatically download once a user with that file joined the network.

"If you're on a college campus, you'll probably get responses within a quarter of a second," he said.

With no unique identifier for the devices, the recipient would not know who had provided the file, and the sender wouldn't know who received it.

McAfee confirmed that there is a road map to launch. "We have the design in place … I would say we are six months out from the first prototype," he said.

In 2012 the anti-virus pioneer, who founded McAfee Associates in 1987 before leaving the company in 1994, fled his home in Belize after police attempted to interview him as a "person of interest" following the murder of a neighbor.

After making it to nearby Guatemala, he was arrested and eventually deported to the US.

BitTorrent Experiments with Secure Chat

Excerpted from CNET News Report by Seth Rosenblatt

The aftermath of the NSA spying revelations has people and companies scrambling for ways to create more secure communications, which has led BitTorrent to build an instant-message chat client that follows the torrenting principle of decentralized data transfer.

The first release of BitTorrent Chat is a private alpha, meaning you have to go to the BitTorrent Chat sign-up page to get an invite, which will take you to a download.

The client uses the concept of decentralized technology that's at the heart of torrents to run instant messages between people, but BitTorrent was cagey about confirming details about the program.

There's no central server that stores communications, although it apparently works "similar to BitTorrent Sync, but adapted for real-time communications," said BitTorrent's communications chief Christian Averill.

Eventually, the service is expected to work with other instant-messaging accounts and be interoperable with SIP standards, but for now it requires a BitTorrent account. BitTorrent has not yet confirmed whether the alpha will be available on Windows, Mac, or Linux.

Mobile apps are also planned for BitTorrent Chat.

Averill was unable to provide details on how the service logs your chats, so it's not clear at this time whether message logs are stored locally, or even available as an option. BitTorrent Chat, he said, came about during one of the company's internal hackathons, which has led to BitTorrent Labs projects such as Sync.

It may have been a routine hackathon that led to BitTorrent Chat, but if it works as advertised, it would appear to be perfectly poised to take advantage of the surprising number of NSA spying revelations that have been making headlines since Edward Snowden first leaked documents to the press earlier this year.

Instant-message chat logs and traffic are governed by the same legal standards as e-mail and mobile-phone text messages, so it's likely that the government has been asking for IM logs along with e-mail and other online communication services offered by companies like Google, Facebook, and Microsoft at the center of the controversy.

When asked what BitTorrent's response would be to potential requests from government agencies like the National Security Agency for a BitTorrent Chat back door, Averill said, "We're not familiar with specifics of NSA programs, so it's not something we can really comment on."

"We are focused on creating something durable that respects user privacy and that has real consumer benefits," he said.

Identifying & Mitigating Cloud Computing Vulnerabilities

Excerpted from TechTarget Report by Amy DeCarlo

In less than a decade, cloud computing has grown from an intriguing niche to a mainstream market segment. Future expectations are high, with Morgan Stanley projecting Amazon Web Services (AWS) will hit the $24 billion revenue mark in 2022. Of course, how successful any single provider is in growing its cloud business depends on its ability to help dispel the cloud security worries that still sideline some on-demand deployments.

The presumption that a highly virtualized, multi-tenant environment is intrinsically more susceptible to attack is a byproduct of the belief that the level of accessibility and flexibility that makes the cloud so appealing to customers also opens the door to opportunistic hackers who are ready to capitalize on the many points of entry.

These concerns about cloud computing vulnerabilities can translate into real reticence for customers deciding whether to deploy their most critical applications in the cloud. Yet there are indications that cloud security is becoming less of a barrier to entry than it was in the past. The allure of the on-demand model is strong enough that many businesses are willing to set aside some concerns around data security and privacy, at least on an experimental basis, with project-based Infrastructure as a Service (IaaS) deployments to support short-term capacity needs.

The good news is this next wave of cloud deployments has been fairly successful, which has helped to bolster confidence in the model. There is, however, still a confidence gap between businesses still on the sidelines and those that have already taken the plunge.

A Microsoft-commissioned survey of more than 200 small and medium-sized businesses (SMBs), conducted by research firm comScore Inc., found 42% of those organizations that aren't using the cloud find it inherently unreliable. Contrast that with the 94% of SMBs polled in the June 2013 survey that say the level of security they are getting with cloud-based applications is higher than what they had implemented in an on-premises model.

These findings support the idea that many businesses actually find one of the most compelling benefits of the cloud is that providers can offer a level of expertise and integrated security superior to what many businesses can provide internally. Simply put, security can be a critical differentiator for a cloud provider.

So, what cloud-specific vulnerabilities and threats are the most dangerous, and how can providers best protect their cloud environments? The reality is that neither the general nature of security threats nor the types of controls deployed to mitigate risk are radically different from those in a traditional environment.

Attackers tend to follow similar patterns and employ many of the same methods they use to breach a traditional environment: bypassing access controls, discovering valuable data, taking control of the asset where the data resides and then stealing or exposing the data. However, the nature of the cloud means providers need to adjust their approach to address issues specific to an on-demand environment.

Just as in a conventional IT environment, a cloud provider needs a multilayer approach that addresses security in a comprehensive manner, incorporating access management, perimeter security and threat management, encryption, distributed denial-of-service (DDoS) mitigation, and privacy and compliance management. But in a shared cloud environment, elements like identity and access management become especially crucial because data from multiple clients is stored in and accessed through the same shared environment.

Cloud providers need to assure customers they have an effective solution in place that not only grants access, but can also validate identity in a virtualized environment using such methods as multifactor authentication.
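As a hedged illustration (not drawn from the article), one widely used multifactor method is the time-based one-time password (TOTP) of RFC 6238, in which the server and the user's device derive a short-lived code from a shared secret. A minimal sketch in Python follows; the secret value and the choice to accept one previous time window are illustrative assumptions:

```python
import hashlib
import hmac
import struct
import time

def totp(secret, timestamp, step=30, digits=6):
    """Derive an RFC 6238 time-based one-time password (SHA-1 variant)."""
    counter = timestamp // step                      # which time window we are in
    msg = struct.pack(">Q", counter)                 # counter as big-endian 64-bit int
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def verify(secret, submitted, now=None):
    """Accept the code for the current or immediately previous time window."""
    now = int(time.time()) if now is None else now
    return any(hmac.compare_digest(totp(secret, now - drift), submitted)
               for drift in (0, 30))
```

A provider would pair a check like `verify` with a conventional password login, so that a stolen password alone does not grant access to the shared environment.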

Providers also need to address hypervisor security by using monitoring tools that can detect suspicious behavior, including unusual traffic patterns and unusual transactions, which might signify a threat to the integrity of the environment. Providers also need to answer questions around data commingling from both a privacy and a compliance perspective by outlining how they logically partition client data.

Many hackers launch volumetric attacks on the cloud, designed to flood the environment and expose vulnerabilities. To this end, providers need the right DDoS mitigation strategies in place, ones that can help identify traffic anomalies before they disrupt operations.
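As one simplified sketch of what such anomaly identification can look like (the window size, threshold, and class name below are illustrative assumptions, not figures from the article), a provider might track per-source request rates over a sliding window and flag sources that exceed a baseline:

```python
from collections import defaultdict, deque
import time

class RateAnomalyDetector:
    """Flags sources whose request rate within a sliding window exceeds a threshold."""

    def __init__(self, window_seconds=10.0, max_requests=100):
        self.window = window_seconds
        self.max_requests = max_requests
        self.history = defaultdict(deque)   # source IP -> recent request timestamps

    def record(self, source_ip, now=None):
        """Record one request; return True if this source now looks anomalous."""
        now = time.monotonic() if now is None else now
        timestamps = self.history[source_ip]
        timestamps.append(now)
        # Drop requests that have aged out of the sliding window.
        while timestamps and timestamps[0] < now - self.window:
            timestamps.popleft()
        return len(timestamps) > self.max_requests
```

Real mitigation platforms work at far larger scale and on many more signals, but the principle is the same: establish a per-source baseline and act before the flood saturates the environment.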

Also, in a multi-tenant cloud environment, providers need to make sure that businesses migrating application workloads from a traditional environment have correctly configured communication settings for elements such as encrypted or unencrypted data channels, IP addresses, and host names, so that data is transmitted over a secure channel.

Providers face a host of challenges as they work to protect cloud data, but the real test may actually come in learning how to effectively communicate the effort to customers, which will involve outlining security controls and highlighting incidents where the provider thwarted a breach.

Success in the cloud comes down to a number of factors. While issues like price and the geographic location of data are important, what really distinguishes a cloud provider is its ability to act as a trusted partner to its customer, supplying not only the appropriate infrastructure, but also delivering on its promise to be trustworthy.

Mistakes, Misconceptions, and Misuses of Cloud Computing

Excerpted from Web Not War Report by Kamil Shafiq

In the late 1890s, there was a fuel-burning, smoke-blowing generator in the basement of every factory and business in Chicago. In the same basement were men working to maintain and service the generator, keeping it running as best they could.

When the generator was working, it was business as usual. When it wasn't, machines stopped, lights went out and businesses were powerless. All operations were at a standstill. Then, the Edison power company created a turbine power station that could generate and distribute large-scale power to businesses throughout the city.

This was a cheaper, cleaner, and more reliable solution than any on-premise generator. Fast-forward twenty years, and on-site power generators were nowhere to be found; every factory and business accessed power by plugging into a wall.

Just like on-site power generators, today's organizations need to invest a great deal of time and money in maintaining their IT infrastructure. Components such as hardware, software, and services need to be constantly updated to meet ever-changing business needs. With on-site IT infrastructure, the scaling process can be slow, expensive and far from optimal.

Like Edison's turbine power station, cloud computing is a paradigm shift: it provides computing over the Internet. Instead of managing it on-site, businesses can now integrate and access their entire IT infrastructure on a cloud.

A cloud computing service consists of very large data centers connected to high-speed, low-cost networks. They provide various software, hardware, and information resources as needed. Organizations can simply connect to the cloud and use the available resources on a pay-per-use basis. This helps companies avoid capital expenditure on additional on-site infrastructure resources.

Cloud computing encompasses the following three service models: IaaS (Infrastructure as a Service), PaaS (Platform as a Service) and SaaS (Software as a Service). This is often called the SPI model.

IaaS is where most businesses start when it comes to using cloud computing. Think of IaaS as a power station, where organizations receive infrastructure components such as computing power and virtual storage. Applications are the same, but instead of running on-site, they run on a more reliable cloud infrastructure. Here, the organization has control over every element in the IT department, including the hosting environment and its applications. Even so, the organization still needs to allocate additional staff to maintain and manage the infrastructure and applications.

The PaaS model provides organizations with a platform, or run-time environment, to create and deploy applications. Here, the organization is only responsible for the development, maintenance, and management of the applications. For both IaaS and PaaS, Microsoft offers the Windows Azure platform as a viable service model.

The SaaS model provides organizations with ready-to-use applications that use a combination of cloud-based computing and storage services. Essentially, SaaS is fully-serviced software running on fully-serviced infrastructure. Beyond Azure, Microsoft provides various online services, such as BPOS (Business Productivity Online Suite) and Microsoft Dynamics CRM Online, as SaaS offerings.

Cloud computing is based on the idea of multi-tenancy. With a multi-tenant app, there isn't a separate copy of the app allocated to each business that's using it. Instead, it's one app that everyone shares. Think of it as a giant office building, where everyone shares the infrastructure and services, but each person can customize their own office space or cubicle.
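The office-building analogy can be made concrete with a toy sketch: in a multi-tenant data store, every row carries a tenant identifier and every query filters on it, so tenants share one "building" without ever seeing each other's "offices." The class and field names below are illustrative assumptions, not any particular vendor's schema:

```python
from dataclasses import dataclass, field

@dataclass
class MultiTenantStore:
    """One shared table in which every row is tagged with its owning tenant."""
    rows: list = field(default_factory=list)

    def insert(self, tenant_id, record):
        # Tag the row with its owner at write time.
        self.rows.append({"tenant_id": tenant_id, **record})

    def query(self, tenant_id):
        # Isolation is enforced here in the store itself, not left to each caller.
        return [row for row in self.rows if row["tenant_id"] == tenant_id]
```

The design point is that tenant isolation lives in one enforced layer rather than being re-implemented (and potentially forgotten) in every piece of application code.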

You can deploy a cloud computing service using three different models: a private cloud, a public cloud or a hybrid cloud. A private cloud functions solely for one organization on a private network and is highly secure. A public cloud is owned by the cloud service provider and offers the highest level of efficiency in shared resources. A hybrid cloud is a combination of private and public deployment models. In a hybrid cloud, specific resources are run or used in a public cloud and others are on-premises in a private cloud. This combination provides increased efficiency, and allows companies to customize their cloud usage to fit their needs.

But why are these "solutions" good for business? The first reason is infrastructure costs. With cloud computing, you no longer have to worry about buying, building and managing system servers, data storage, firewalls, routers, etc. Cloud services cover them all.

The second reason is scalability. Cloud computing allows users to instantly enhance or diminish different facets of their IT infrastructure according to business requirements.

The third main reason is flexibility. As long as you have a computer and an Internet connection, you'll be able to access your applications from any location in the world, and now on many other devices as well. Some people see this as a solution, and others as a problem.

Though cloud computing has drastically changed the way businesses operate and manage their applications, many are still hesitant to make the leap. The past five years have taught us quite a bit about the common problems and misconceptions that cloud-users share.

Firstly, many providers partake in what has now been deemed "cloud washing," labeling their existing services as cloud solutions despite their not being so. The main difference between a cloud-washed service and an authentic cloud solution is that an authentic service can support a pay-per-use payment plan, while a cloud-washed one cannot.

One of the common mistakes that the average user makes is assuming that any application or service that is not on-site is automatically "in the cloud". This is untrue, and poses a problem for firms that are seeking a genuine, agile, and elastic cloud computing service.

Other common problems users encounter with cloud computing involve integration. These can often be broken down into data, application, and access/identity issues. The problem with data integration is that once an application is moved onto a cloud network, new data is often generated there.

If this new data needs to be used in conjunction with on-premises data that is not on the cloud, there is simply a disconnect. Application integration issues are similar. When some applications are housed on-premises and others on a cloud, communication among them can be far from seamless.

Access/identity integration issues are perhaps the most important, given that they can compromise security if not addressed immediately. An example of this kind of issue can be seen in the situation where an employee leaves a firm, but retains access to data on the cloud despite being removed from on-premise networks. To avoid this type of problem, it's important to establish internal access policies with the cloud provider when initiating a service.
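Such an access policy can be reduced to a simple rule: when an employee departs, revoke their identity in every connected system, on-premises and cloud alike, and confirm that each revocation took effect. The `Directory` class below is a hypothetical stand-in for those systems, not a real provider API:

```python
class Directory:
    """Hypothetical stand-in for any system that grants access (AD, cloud IAM, a SaaS app)."""

    def __init__(self, name, users):
        self.name = name
        self.users = set(users)

    def revoke(self, user_id):
        """Remove the user and report whether the revocation took effect."""
        self.users.discard(user_id)
        return user_id not in self.users

def deprovision(user_id, directories):
    """Revoke a departing employee's access in every connected system and report results."""
    return {directory.name: directory.revoke(user_id) for directory in directories}
```

In practice this role is played by identity-federation or directory-synchronization tooling, so that the cloud provider's user store is never updated independently of the on-premises one.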

This way, there's a protocol to follow when an employee leaves or is let go. It's uncertainty in these kinds of situations that can threaten business security and make cloud users over-cautious. Oftentimes, newcomers feel the need to enforce encryption everywhere: online, offline, on-site, and off-site.

Layering so many differing encryption schemes causes conflicts and simply isn't necessary when using a reliable cloud service. It's important to note, however, that most of these issues are relevant only to users who are new to the world of cloud computing.

Many large businesses have started incorporating different elements of cloud computing in their business, while others have already integrated it entirely. It's obvious that cloud computing is still in a transitional phase, where skeptics remain hesitant to disrupt traditional business practices.

"If it's not broken, don't fix it" is no longer an attitude that's holding businesses back from taking the leap.

I think Joe Weinman, the author of Cloudonomics, puts it best when he writes, "Ultimately, the cloud is the latest example of Schumpeterian creative destruction: creating wealth for those who exploit it; and leading to the demise of those that don't."

Though some may say that the future of IT management is still up in the air, I think there's a fair share of evidence supporting the idea that it is up in the clouds.

Cloud Adoption Grows Despite New Challenges

Excerpted from CIO Insight Report by Samuel Greengard

Over the last few years, cloud computing has fundamentally changed the nature of IT and business. It has created new opportunities and introduced more efficient ways to manage everything from data and software to infrastructure and platforms.

However, a new report from 451 Research, in conjunction with the Uptime Institute and the Yankee Group, indicates that a number of non-IT related obstacles are slowing the pace of project completions.

The report, TheInfoPro Wave 5 Cloud Computing Study, notes that 60 percent of respondents view cloud computing as a natural evolution of IT service delivery and, as a result, do not allocate separate budgets for cloud computing projects. However, among the group that does establish a separate budget for cloud projects, 69 percent expect to increase their spending over the next two years.

Not surprisingly, private clouds continue to dominate the IT landscape. Overall, 61 percent of respondents use internal clouds for virtualization, 26 percent rely on them for automation, 10 percent for orchestration, 2 percent for consolidation and 1 percent for standardization. However, over the past six months, Infrastructure as a Service (IaaS) and Software as a Service (SaaS) activity doubled to between 30 percent and 33 percent of the total projects mentioned.

"Current adoption of cloud computing is a work in progress for most organizations as they continue to implement server virtualization, automation and orchestration capabilities that are the prerequisite underpinnings for the majority of cloud-ready data centers," says Peter ffoulkes, research director for servers, virtualization and cloud computing at TheInfoPro, a branch of 451 Research.

But the report also found that despite growing cloud adoption, 83 percent of respondents face significant roadblocks when it comes to deploying projects. The figure rose by 9 percent since the end of 2012. Interestingly, IT roadblocks have declined to 15 percent while non-IT roadblocks have increased to 68 percent. Respondents cited a number of general issues, mostly revolving around people, processes, budgets, time, politics, security challenges, contractual agreements and change management issues. These "non-IT roadblocks are usually harder to solve than technology problems," ffoulkes says.

Regulatory and compliance issues, particularly as they pertain to a cloud environment, are another pain point. Too often, respondents noted, cloud-based systems are set up on a pass/fail basis and they don't adequately address the nuances of today's business and IT environment, particularly in the security arena. As a result, some professionals are shying away from cloud projects they might otherwise embrace.

ffoulkes says that there are a number of key takeaways for CIOs and other business and IT leaders. They include:

It's crucial to have a clear understanding of organizational requirements and to deal with technology issues. But executives must also focus on "cultural and organizational change management" in order to enable a successful transition to the cloud.

Rejection and selection criteria are different, ffoulkes points out. It's important to separate these issues and understand what drives each decision.

Costs can always be negotiated, but security, reliability, integration, service levels and compliance issues are not negotiable.

It's critical to build a high level of flexibility into a cloud infrastructure. The environment is likely to change significantly over the next few years, and an organization should have an "escape route" if something goes astray.

Transparency is critical. Visibility into technology and processes is essential and an enterprise must be able to control and audit systems easily and effectively.

Security must be built into the foundation of any cloud environment.

"The cloud is the foreseeable future for IT but it will exist in many different forms and will evolve significantly over the next several years," ffoulkes concludes. He says that today's business and IT leaders must change with the evolving cloud space or wind up being replaced. "The transition to cloud computing models is an inevitable evolution of IT delivery."

Coming Events of Interest

CLOUD COMPUTING WEST 2013 - October 27th-29th in Las Vegas, NV. Two major conference tracks will zero in on the latest advances in applying cloud-based solutions to all aspects of high-value entertainment content production, storage, and delivery; and the impact of mobile cloud computing and Big Data analytics in this space.

Government Video Expo 2013 - December 3rd-5th in Washington, DC. Government Video Expo, co-located with InfoComm's GovComm, brings the east coast's largest contingent of video production, post, digital media, and broadcast professionals together with the government AV/IT specialists. The combined event features over 150 exhibits and nearly 6,000 registrants.

GOVERNMENT VIDEO IN THE CLOUD - December 4th in Washington, DC. This DCIA Conference within Government Video Expo focuses specifically on cloud solutions for and case studies related to producing, storing, distributing, and analyzing government-owned video content.

International CES - January 7th-10th in Las Vegas, NV.  The International CES is the global stage for innovation reaching across global markets, connecting the industry and enabling CE innovations to grow and thrive. The International CES is owned and produced by the Consumer Electronics Association (CEA), the preeminent trade association promoting growth in the $209 billion US consumer electronics industry.

CONNECTING TO THE CLOUD - January 8th in Las Vegas, NV. This DCIA Conference within CES will highlight the very latest advancements in cloud-based solutions that are now revolutionizing the consumer electronics (CE) sector. Special attention will be given to the impact on consumers, telecom industries, the media, and CE manufacturers of accessing and interacting with cloud-based services using connected devices.

CCISA 2013 - February 12th-14th in Turin, Italy. The second international special session on Cloud Computing and Infrastructure as a Service (IaaS) and its Applications within the 22nd Euromicro International Conference on Parallel, Distributed, and Network-Based Processing.

NAB Show - April 5th-10th in Las Vegas, NV. From broadcasting to broader-casting, NAB Show has evolved over the last eight decades to continually lead this ever-changing industry. From creation to consumption, NAB Show has proudly served as the incubator for excellence — helping to breathe life into content everywhere.

CLOUD COMPUTING EAST 2014 - May 13th-14th in Washington, DC. Three major conference tracks will zero in on the latest advances in the application of cloud-based solutions in three key economic sectors: government, healthcare, and financial services.

Copyright 2008 Distributed Computing Industry Association
This page last updated October 13, 2013
Privacy Policy