Distributed Computing Industry
Weekly Newsletter

Partners & Sponsors

Edwards Wildman

IBM

Iron Mountain

OutSystems

Paragon

Rackspace

SoftServe

October 6, 2014
Volume XLIX, Issue 11


Join The DCIA's Internet of Things at CES

The Distributed Computing Industry Association (DCIA) is pleased to announce The DCIA's Internet of Things (IoT) at CES 2015, a four-day marathon presentation running January 6th through 9th in Las Vegas, NV, on the newest and arguably largest distributed computing industry phenomenon yet.

As we reported last week, the IoT is now on a growth trajectory to surpass 50 billion objects by 2020.

The 2015 International CES is the ideal place to start learning in depth about the multiplicity of opportunities that this rapidly emerging movement offers product developers, software engineers, marketers, entrepreneurs, and other forward-looking professionals across many economic sectors.

The IoT traces its roots to 1969, when the first nodes of what would eventually become known as ARPANET, the precursor to today's Internet, were established at UCLA and Stanford; and to 1982, when the TCP/IP protocol suite became a standard, ushering in the worldwide network of fully interconnected networks that we now call the Internet.

In 1990, John Romkey and Simon Hackett created the world's first connected device (other than a computer): a toaster that could be controlled over the Internet.

In 1999, Kevin Ashton coined the term "Internet of Things," and established MIT's Auto-ID Center, a global research network of academic laboratories focused on RFID and the IoT. That year, Andy Stanford-Clark of IBM and Arlen Nipper of Arcom (now Eurotech) introduced the first M2M protocol for connected devices: MQ Telemetry Transport (MQTT).
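
As a rough illustration of what MQTT's publish/subscribe model looks like in practice, here is a minimal sketch using the open source paho-mqtt Python client (1.x API); the broker host, port, and topic names are placeholders, not details from the original 1999 protocol work.

    # Minimal MQTT publish sketch; broker and topic names are illustrative.
    import json
    import paho.mqtt.client as mqtt  # Eclipse Paho client, 1.x API

    client = mqtt.Client(client_id="thermostat-42")
    client.connect("broker.example.com", 1883)  # 1883 is the default MQTT port

    reading = {"temperature_c": 21.5, "humidity_pct": 40}
    client.publish("home/livingroom/temperature", json.dumps(reading), qos=1)
    client.disconnect()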

In 2005, the United Nations (UN) first mentioned the IoT in an International Telecommunication Union (ITU) report. Three years later, the first international IoT conference took place in Zurich, Switzerland. The IPSO Alliance was formed in 2008 to promote IP connections across networks of "smart objects."

Google introduced a self-driving vehicle project in 2010, a major milestone in the development of a connected and autonomous car. Also that year, Bluetooth Low Energy (BLE) was introduced, enabling applications in the fitness, healthcare, security, and home entertainment industries.

In 2011, Nest Labs (now part of Google) introduced sensor-driven, WiFi-enabled, self-learning, programmable thermostats and smoke detectors; and IPv6 launched, a protocol that exponentially expands the number of objects able to connect to the Internet by providing roughly 340 undecillion IP addresses.
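
For context on the size of that address space, a quick back-of-the-envelope check (IPv6 addresses are 128 bits long) confirms the figure:

\[
2^{128} \approx 3.4 \times 10^{38} \ \text{addresses (340 undecillion)}, \qquad \text{versus IPv4's } 2^{32} \approx 4.3 \times 10^{9}.
\]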

Last year, Google Glass, controlled through voice recognition software and a touchpad built into the device, was released to developers; and this year, Apple announced HealthKit and HomeKit, frameworks for health and home automation, while the firm's iBeacon advanced context-aware and geolocation services.

Approximately 12.1 billion Internet-connected devices were in use in April 2014. Currently, about 100 things connect to the Internet every second, and the number is expected to reach 250 per second by 2020.
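
As a rough check on those rates (simple arithmetic, not from the source), the per-second figures translate into annual additions of:

\[
100 \ \tfrac{\text{devices}}{\text{s}} \times 86{,}400 \ \tfrac{\text{s}}{\text{day}} \times 365 \ \tfrac{\text{days}}{\text{yr}} \approx 3.2 \times 10^{9} \ \tfrac{\text{devices}}{\text{yr}},
\]

rising to roughly 7.9 billion per year at 250 per second, a pace broadly consistent with growth from about 12 billion connected devices today toward the 50 billion projected for 2020.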

As more and more "things" get connected to the Internet — expanding from the initial smartphones, phablets, tablets, laptops, game consoles, toasters, physical activity monitors, home security systems, industrial equipment, and vehicles of all kinds — the stakes grow exponentially larger.

As Gilad Meiri, CEO of tech startup Neura, notes, "The IoT holds potential for disruptive change, and its evolution will likely be faster than the Internet."

"The DCIA's IoT at CES" will feature the very latest in connected consumer device innovations, wearable creations, machine-to-machine (M2M) advances, RFID developments, micro-sensor discoveries, smart environment construction, and more, which are leading the way in this world-altering trend.

AWS Security Certs Advance DoD Cloud

Excerpted from TechTarget Report by Beth Pariseau

Amazon Web Services (AWS) was recently awarded provisional authority to host highly sensitive workloads for the Department of Defense (DoD), but the agency hasn't fully embraced public cloud yet.

Government bureaucracy remains slow-moving despite AWS security certifications that allow the cloud provider to host sensitive data.

AWS was the first public cloud to receive a provisional authorization from the DoD under the Defense Information Systems Agency's (DISA's) Cloud Security Model to host Level 3-5 workloads, in late August. Levels 3-5 refer to unclassified but highly sensitive data. Level 6, which is still excluded from the provisional authorization, pertains to classified data.

Meanwhile, the DoD is methodical in deploying new technologies, said US Air Force Brigadier General Steve Spano, who now works as Amazon's General Manager for Defense and National Security, in a keynote here this week during the Distributed Computing Industry Association's (DCIA) and Cloud Computing Association's (CCA) trade event, the CLOUD DEVELOPERS SUMMIT & EXPO 2014 (CDSE:2014).

Spano described four stages of public cloud adoption: from test and dev apps, to migration of production applications, to migration of mission-critical applications, and eventually all-in.

"By and large, I would say that DoD is in Phase 1 as an entity and an organization," Spano said. "Moving an organization such as the DoD, having lived it for 28 years, is quite challenging."

But while the move has been slow, it is still definitely occurring, according to Spano. Some agencies within DoD are at Phase 2 or 3.

"This wasn't the case a couple years ago," he said. "Now we're beginning to turn the corner."

To earn the Level 3-5 provisional authority, AWS' GovCloud region, already compliant with federal regulations such as the International Traffic in Arms Regulations (ITAR), had to implement 45 new security controls to satisfy the DoD's security concerns.

These concerns will evolve and probably relax a little, Spano said.

For Amazon, it means "a day-to-day investment in the challenge of continually educating and pushing the transformation in large bureaucracies that aren't used to moving as fast as others, particularly within the commercial sector," Spano said.

Spano also had some pointed words about those who insist on-premises security trumps the public cloud's ability to secure workloads.

"When I was on active duty, I often thought that security was used as a smokescreen for what really is a lack of trust and control," he said. "I'm giving something up -- I can't hug my server and thus it's not secure." However, the demands on the department will increase as time goes on, while resources, particularly as the military cuts its force in response to government sequestration, are in decline.

"Believing that that gap of risk is mitigated by the fact that systems are on-premises is a false sense of security in my mind," Spano said.

Report from CEO Marty Lafferty

Thanks to all who contributed to the Distributed Computing Industry Association's (DCIA) and Cloud Computing Association's (CCA) industry-leading CLOUD DEVELOPERS SUMMIT & EXPO 2014 (CDSE:2014) this week at the Hilton Austin in Austin, TX.

We're especially grateful to Conference Sponsors and Exhibitors IBM, SoftServe, Rackspace, Edwards Wildman, Paragon, Iron Mountain, and OutSystems; and to CCA Executive Don Buford and his entire team for all that they did to support the highly stimulating and provocative CDSE:2014.

An archival website has been created featuring audio recordings of all sessions, keynote and workshop presentations, the conference brochure, a photo album, workshop flyers, and the CCA website event page.

On the Business Strategy Conference track, Tim Hayden's new book, "The Mobile Commerce Revolution," made its debut. It was followed by a broad look at the state of cloud adoption in the public sector, with the opening keynote from Amazon Web Services and an address from Google; a leading global cloud attorney from Edwards Wildman Palmer then joined the panel discussion that followed.

Meanwhile the first Workshops & Special Seminars, from Microsoft and IBM, featured "Getting Started with Azure" and "DevOps Services in the Cloud." Microsoft and IBM conference keynotes focused on Enabling DevOps and Mobile Cloud Architectures.

There was a fascinating case study from Sense Corp. on the role of OutSystems in the College Bound cloud initiative. And again the keynotes were followed by an in-depth panel discussion.

Google and Dell presented workshops on "GIS & GPS in DriveTexas" and "Managing Hybrid Cloud Environments."

Back on the conference track, Dell and Rackspace addressed the "Current State of Private Cloud" followed by Aspera on "Cloud Object Stores," and then a panel discussion.

Rackspace led the concurrent workshop followed by SAP on "Ariba — the World's Largest Sourcing Network."

Media & Entertainment advances were highlighted in solo presentations on "Advanced Private Cloud Technologies and Services," "The Slow Lurch to the Cloud," and "Media Delivery and the Cloud" from Front Porch Digital, Mediafly, and RealEyes Media, and a follow-on panel discussion; and workshops from SoftServe on DevOps and Microsoft on Azure.

Mobile Cloud was the focus of Adjacent Technologies, JW Secure, and NetSuite in "Mobilizing Content," "The Secret Life of Mobile Data," and "The Mobile-Enabled Supply Chain" and the panel discussion that followed.

Meanwhile, workshops featured "IBM Bluemix: Mobile Apps in the Cloud" and "FedRAMP Accreditation" by Paragon.

Big Data was covered by SoftServe, Talend, and Oracle, in "Big Analytics: A Next Generation Roadmap," "Talend Big Data," and "Database as a Service (DBaaS) Private Cloud," plus a panel discussion; and workshops on "Digital Transformation — Avoid Becoming Digital Prey" from Dell and "DirectTrust — Healthcare Exchange" from DigiCert.

A series of cutting-edge sessions focused on "Predictive Analytics" from QRhythm, "Cloud Platform Management" from VDI Space, and "Application Continuity" from Iron Mountain, with a panel discussion to follow, while workshops covered "Real-Time Data Solutions with SAP's HANA" and "Contained Excitement using Docker" from Rackspace.

SAP presented a fascinating case study about the 2014 World Cup "Staying Ahead of the Game" and Google offered a workshop called "Hangout with Google."

The Healthcare program block had keynotes on "HIPAA in the Cloud," "Digital Transaction Management," and "MyDirectives" from OnRamp, Kinetic Concepts, and ADVault, plus a panel discussion.

Workshops covered "Building Managed Cloud Environments for the Public Sector" from IBM and "Critical Regulatory Issues" from Edwards Wildman.

The Conference Closing Session covered "Mining the Cloud" and "Data Security" by SERC and Paragon, followed by some much needed "Reality Therapy" from Xvand, and then a final panel discussion; and the Closing Workshop prophetically focused on "The Coming Internet of Things" from Flux7. Share wisely, and take care.

How the IoT Will Change Cloud Computing

Excerpted from MSPMentor Report by Michael Brown

Some experts are predicting that the Internet of Things (IoT) will accelerate use of the cloud, improve predictions about consumer preferences, and change the scope of services managed service providers (MSPs) can offer. Here's a quick look at the trend.

The global IoT market is still in its infancy and positioned for exponential growth over the next couple of years as people continue to connect to the Internet through billions of devices and applications.

As an MSP that provides cloud-based file sharing, IoT needs to be on your radar.

Here's a look at how MSPs will be able to leverage the cloud as the only platform and service that is flexible, scalable, and analytics-capable enough to support the IoT as it grows.

Just how big is the IoT ecosystem?

The IoT ecosystem includes any form of technology that can connect to the Internet. This means connected cars, wearables, TVs, smartphones, fitness equipment, robots, ATMs, vending machines, and all of the vertical applications, security and professional services, analytics and platforms that come with them.

And this is only the beginning. As this infographic by Intel illustrates, the IoT is predicted to have 4 billion people using 50 billion devices by 2020, nearly doubling the amount of connected technology we see now.

The goal of the IoT is to make these applications, services, and devices as ubiquitous as possible, all while enabling the gathering of vast quantities of data about user and consumer preferences.

As the IoT expands, so will cloud computing in the following ways.

1. Startups: Given the amount of innovation evolving out of the IoT, you can expect to see many more start-ups offering new devices and services, which is great for cloud vendors. Startups often embrace the cloud because of its "no upfront payment necessary" model. SaaS-enabled enterprise level applications allow smaller businesses to use sophisticated software for project and customer relationship management.

2. Developing countries: Much of the cloud growth we see is actually the result of developing countries that have been slow to adopt the cloud. In fact, 90 percent of the revenue generated from the IoT has come from developing countries. Although this percentage is expected to wane once these countries have finished playing "catch-up," developing countries are still a great market for cloud growth.

3. Analytics and advertising: Data analytics will become even more accurate in predicting consumer preference and behavior. The IoT will dramatically change the way we live our daily lives and what information is stored about us.

How do you believe the cloud might evolve as the IoT does?

100 Million Smart Things: Broadcom's Scott McGregor

Excerpted from NY Times Report by Quentin Hardy 

The technology industry is adding to its already impressive capacity for seeking the greatest possible scale. In a little over five years, it is said, 50 billion devices will be connected to the Internet. That is on top of data centers with millions of computer servers, several billion mobile devices, and all the gaming devices, personal computers, and other things with a chip inside.

Scott McGregor, the chief executive of the Broadcom Corporation, would like his company's communications-based semiconductors in all of that. He may be hungry, but he is also thoughtful: Distributing machine-based intelligence almost everywhere, he says, will change work and invention like never before.

McGregor, 58, has run Broadcom since 2005. He has degrees in psychology and computer science and worked at the famed Xerox Palo Alto Research Center in the early days of personal computers. He joined Microsoft in 1985 — his work there included making the original version of Windows — and was the head of emerging business at Philips Semiconductors.

The following conversation was condensed and edited.

Q. How do things compare with when you started in this industry?

A. In 1985, there were probably 10 different types of central processing units (CPUs), the semiconductor brains, in the world. Each went into a computer. Now we make thousands of different kinds of chips, because there is a market for them.

Q. What size of market?

A. You could probably create a processor with a Wi-Fi connection for a light bulb or a watch and sell 100 million units.
At the high end, our newest chip has 7.2 billion transistors. It handles video traffic at the equivalent of one million homes streaming a movie at the same time. It's designed to go inside a data center that has a million servers. In about a year, that will be touching consumers, for example, when people access videos in the cloud through their smartphones.

Q. Or maybe through your connected light bulb sending information about my lighting habits to the cloud?

A. Through all kinds of things. Last month we released a $20 unit to make your own wearable device. It has sensors to measure speed, a gyroscope, a compass, and can measure humidity and temperature. It communicates with iOS or Android devices.

Q. Why offer that?

A. For $20, someone can be in the business of wearable computing. We think someone will invent something with it that will sell 100 million units of our device. I don't know what. One of my managers taped it to a washing machine and had it beep his phone when the washer stopped. Someone built a pet sensor; someone else, a tooth-brushing game for the iPad.

Q. What is it like trying to supply an Apple or a Samsung, the world's biggest makers of mobile devices?

A. In our business now, you either win the socket that the chip is going into, or you don't. Cellphones are made with such tight specifications and thin profit margins for a supplier that they tend not to have a second one.

Q. Why get into something so tough and cutthroat?

A. It has never before been possible to get an order for 100 million of something. If you can win it, it's profitable. You can afford to make huge capital investments, spend on intense research and development — as long as you keep winning contracts.
It also means it costs $100 million or more to start a new chip company, which is why you see an industry roll-up and no venture capitalists funding new ones. We try to focus on having a breadth of intellectual property (IP) and agility, to move quickly.

Q. What will happen with ubiquitous, massive computing in the hands of the powerful?

A. I was in China, and at a large manufacturer I saw a camera that could look at a crowd and instantly record and recognize every face.

Q. Who was that?

A. I'm not saying.

Q. Massive instant logging of all faces, in a totalitarian state. I'm not entirely comfortable.

A. All sorts of things are going to be recorded, and when the speed of computing increases, there will be all kinds of ways to use computing. Talking to computers will become normal, since that will work faster. We're very interested in payments, access to vendors and security authorization through voice. Down the road, you may leave your DNA as a signature on a transaction — it's hard to copy.

Q. I recently talked with two young engineers about how long it will be before we have chips in our brains that give us access to all human literature or any language.

A. People will elect to have things like that. It will make them more competitive. The trick is going to be to connect things to greater sources of intelligence, with an attraction to them that is greater than the creepy factor of using them.

IoT & Cloud Computing: Big Opportunities for Youth

Excerpted from Want China Times by Lin Shu-hui

New technology, especially the Internet of Things (IoT) and cloud computing, will bring Taiwan huge business opportunities in China, Southeast Asia, and other emerging markets, Taipei's Want Daily reports, citing comments from Google Taiwan Managing Director Chien Lee-Feng.

Chien made the comments at an innovation forum for youth held in Taipei yesterday to encourage young people to be brave in innovation and to start their own businesses.

Taiwan isn't short of talent, but the overall atmosphere is not good due to the recent food safety scandal and the Kaohsiung gas explosion, which have people scared; as a result, developments in neighboring Japan, mainland China, and Southeast Asia have been overlooked in the country, Chien said. Taiwan is surrounded by huge business opportunities, but the island has, as yet, failed to adjust to the changes in the outside world, he said.

Chien said that in just a few years, the iPhone, Android, Facebook, YouTube, and Google Plus have emerged and become popular globally, Alibaba Group has listed on the New York Stock Exchange, and Huawei and Xiaomi handsets have also expanded their reach. The global Internet population will soon reach 5 billion, with Taiwan's neighbors China and Southeast Asia set to account for 2.5 billion, or 50%, of this total.

He urged Taiwan to recognize that new technology means the arrival of new opportunities, especially the IoT. Global IoT installations will reach 50 billion units by 2020, valued at around US$1.9 trillion.

As Taiwan has a strong information and communication technologies (ICT) industry, Taiwanese firms should use the existing, competitive supply chain resources to create a new economy and new industries, especially mobile internet and cloud services. Chien encouraged Taiwan's young generation to embrace these new opportunities, and to engage with the global market beyond the island.

Lu Yi, Vice President of Sina Weibo, who was also at the forum, suggested that Taiwan's young people use social marketing to create their own businesses in several fields, including the individual credit financing, transportation, merchant, services, and food industries.

DoD May Invite Cloud Vendors Into Government Data Centers

Excerpted from Information Week Report by Jai Vijayan

In an effort to tap commercial cloud technology without sacrificing security or control, the Department of Defense (DoD) is considering two potential models.

The US DoD is exploring the idea of having commercial cloud vendors use secure DoD data centers and facilities to deliver private cloud services to the military.

The goal, explained in a just-published Request for Information (RFI) document, is to put in place an ecosystem that will allow the DoD to take advantage of commercial cloud computing technologies while ensuring the level of security needed to run highly sensitive workloads.

One option being explored is a Data Center Leasing Model (DCLM), under which cloud vendors would be allowed to lease out rack or floor space in DoD data centers and run their hardware and software from them.

Selected vendors would be subjected to security scrutiny and an accreditation process before being allowed leased space in DoD's core data centers. The vendors would deliver their services for the military wholly from inside the DoD data centers.

The second model being explored is dubbed the On-Premises Container Model (OPCM) and involves the cloud vendor delivering services to the military via containerized data centers.

Under the proposed model, prefabricated containers filled with data center equipment would be dropped off outside select DoD data centers, where they would be supplied with the required heating, cooling, redundant power supplies, and network connectivity.

Since both models require commercial cloud vendors to operate inside of or in close proximity to a DoD data center, they would be considered secure enough to support Level 5 and Level 6 workloads -- the military's most sensitive data.

DISA released details of the two options it is exploring Wednesday in a formal Request for Information (RFI) from commercial cloud vendors. It described the RFI as an attempt to assess vendor readiness to provide commercial cloud services on DoD networks for use by the military.

"DISA is exploring several possible ways to integrate commercial cloud services with DoD networks," the RFI said. "These models are being considered as possible alternatives in providing cloud ecosystems and services to the DoD community."

The RFI seeks information from vendors about their willingness and ability to deliver services from DoD facilities and data centers.

The cloud services that the DoD is particularly interested in over the short term include workload and virtual machine management systems and object and block storage systems.

The DoD is unsure of the exact size and scale of its cloud infrastructure requirements, but it expects the infrastructure to range from small configurations of up to 10,000 virtual machines to large configurations exceeding 200,000 virtual machines, the RFI said.

The two deployment options being considered by DISA appear similar to the CIA's $600 million, 10-year initiative to get Amazon to deliver private cloud services behind the agency's firewall.

The DISA RFI reflects the level of attention the DoD is putting into ensuring the security of its cloud deployments.

Like many other federal departments, the DoD has committed to accelerating commercial cloud adoption over the next few years in a bid to pare costs and improve efficiencies. But it has been very cautious in how it has gone about doing that so far because of the especially sensitive nature of its operations.

"The Department has specific cloud computing challenges that require careful adoption considerations, especially in areas of cybersecurity, continuity of operations, [and] information assurance (IA)," DoD CIO Teresa Takai said in a report outlining the department's plans back in 2012.

The DoD is taking advantage of the Federal Risk and Authorization Management Program (FedRAMP) to put in place standard processes for assessing and authorizing public cloud computing services on its network.

The department is also using FedRAMP to define requirements for continuous monitoring and auditing for cloud computing providers.

What it has been somewhat slower in doing is actually implementing commercial cloud services, according to John Pescatore, former Gartner analyst and director of emerging security trends at the SANS Institute in Bethesda, Md. "FedRAMP has been phenomenally successful in getting commercial cloud services certified for government use, especially for low- to medium-risk workloads," he said. Even so, federal IT departments have to clear several other hurdles, most notably from their own inspector general, before they can actually deploy cloud services.

"Think of an infrastructure-as-a-service application where Amazon has a FedRAMP certification and some agency is running their software on that infrastructure," Pescatore said. "That's not something that IGs are used to auditing. So they are very conservative."

The DoD's apparent interest in having cloud providers deliver service out of containers placed in close proximity to their data centers also reflects the lingering concerns over loss of control that many organizations have when migrating to the cloud, Pescatore said. "It shows a certain 'server-hugger' stance that says, 'Wait a minute. Unless I have physical control of the data center, it will never work.'"

Open NFV Group Uncloaks its Platform Plan 

Excerpted from Light Reading Report by Carol Wilson

The Linux Foundation today made its long-awaited formal announcement of the Open Platform for NFV Project (OPNFV), promising to deliver a carrier-grade, open source reference architecture as a means of speeding up network functions virtualization (NFV) deployment. The group will focus initially on developing the NFV infrastructure and virtualized infrastructure management, two key pieces not already under development, and promises its initial results in the first half of 2015.

OPNFV, which includes some but not all of the pioneering telecom operators behind the European Telecommunications Standards Institute (ETSI) group which created NFV as a concept, was first discussed publicly last spring, but has been holding its cards close to the vest on details until today. In briefings in advance of the announcement, Linux Foundation Executive Director Jim Zemlin said the new organization will build on existing open source projects, including OpenDaylight, Linux, and OpenStack, but will be significantly different from those in its intent.

All of those projects are what Zemlin terms "upstream projects" focused on specific components, while OPNFV is more focused on the underlying architecture and how that is going to be developed. In that respect, the new group is more similar to Debian or Fedora from the Linux world -- both of those are operating systems based on free or open source software that were later integrated into commercial deployments.

"This group wants to focus on NFV specifically and on creating a carrier-grade environment," Zemlin says. "That involves integrating a collection of open source components together but it also has a much broader scope in terms of looking at things like hardware acceleration and service chaining all of the things you need in an NFV environment, which are broader than the scope of any of the individual components that are part of open source projects."

The network operators currently committed to OPNFV are AT&T, China Mobile, NTT DoCoMo, Telecom Italia, and Vodafone as platinum members, and CableLabs, CenturyLink, Orange, and Sprint as silver members.

Notably absent are Telefonica, which has openly called for an open source approach to NFV-I and virtualized infrastructure management, as well as BT Group and Verizon Communications, which played prominent roles in the ETSI NFV ISG. Zemlin says he expects more network operators to join the project, once they see its direction, and thinks some operators are still unfamiliar with open source development in general or don't have R&D organizations that can operate externally as part of an open source group, delivering useful code.

Please see Telefonica's Push for Open NFV Infrastructure.

"They need to understand how to engage -- the intellectual property (IP) frameworks, the development process, etc.," Zemlin says. "We are engaging with a lot of those folks right now. So I think you'll see more come into the organization rather quickly."

Vendors committed to OPNFV at the platinum level include Brocade, Cisco Systems, Dell, Ericsson, Hewlett-Packard, Huawei, IBM, Intel, Juniper Networks, NEC, Nokia and Red Hat. Fifteen other vendors came in as silver supporters.

The OPNFV reference platform is intended to provide consistency, interoperability and performance among the open source components that will be involved, and the group will be working with the other groups to continuously share information and carry out integration and testing, Zemlin said.

According to the Linux Foundation's announcement, the project's initial objectives include developing "an integrated and tested open source platform that can be used to investigate and demonstrate core NFV functionality", as well as proactively engaging end-user companies for their input and establishing an open NFV ecosystem, and promoting its own efforts.

"What we expect to have is a baseline reference implementation that companies will 'productize' and you'll see it deployed in some commercial environments," Zemlin says.

The first project developers' meeting takes place this week and the group will decide, among other things, how and where the developing reference architecture will be tested -- whether it happens in operator labs, vendor labs or within the Linux Foundation. Zemlin expects code contributions from multiple parties, including network operators, to kick off the development. Many of the early service provider participants are among the more "progressive" in terms of having their own software development efforts, he notes.

Adoption of Software-Defined Networks Is Growing

Excerpted from Baseline Magazine Report by Dennis McCafferty

The adoption of software-defined networks (SDNs) is growing rapidly in enterprises, according to a recent survey from Network Instruments. The accompanying report, the "Seventh Annual State of the Network Global Study," covers a wide range of IT topics, including the elevated profile of unified communications (UC) and bring-your-own-device (BYOD) initiatives, but SDN is growing at a strong pace for a relatively new technology.

Surprisingly, there's no standard definition to describe SDN. More than one-third of the survey participants described it as the automated provisioning of network resources, while another quarter said it's the replacing of tools for network traffic optimization and acceleration with software.

However, 37 percent of respondents said SDN is "undefined, like a trip without a map." Clearly, SDN remains a work in progress. "As with any emerging technology, IT management is grappling over the definition of SDN, as well as its benefits and importance to the organization," says Brad Reinboldt, Manager of Product Marketing for Network Instruments.

More than 240 network engineers and senior-level IT managers took part in the research.

US Stakes Out Net Battleground Ahead of ITU Meeting

Excerpted from The Register Report by Richard Chirgwin

America is once again steeling itself to defend the Internet from the grasp of the ITU.

A White House blog post by State Department officials Daniel Sepulveda, Christopher Painter, and Scott Busby says the US still believes there are countries and groups that hope to use ITU mandates over the Internet as a means to impose local control over "content, technologies, or services".

In the post, they reiterate that, "The US government remains committed to supporting the evolution of the multi-stakeholder approach to Internet governance and has taken steps to demonstrate this commitment."

However, with the ITU plenipotentiary meeting scheduled for South Korea in October, the State Department is fearful of the agenda being hijacked. In particular, they write, any proposals to give governments "the sole authority" over the Internet will be "categorically" rejected.

As the US government relaxes its former exclusive control over core Internet activities — particularly naming and numbering via ICANN — the best way for management to be restructured has become a hot topic.

The theory behind the multi-stakeholder model is that by imitating institutions like the Internet Engineering Task Force, 'net administration can continue without excessive interference from governments.

However, there have been pushes to restructure Internet administration in other ways. Some countries have sought to replicate international telephone call charging principles on the Internet, in proposals that would see originating countries pay to land their data in destination countries.

Other proposals have been more sinister, with countries including Russia, China and Saudi Arabia looking for ways to legitimize content controls on an international basis.

"Remitting the Internet to intergovernmental control — whether the ITU or otherwise — would produce three negative outcomes," the post states, saying that it would lead to slow decision-making; would exclude civil society, academia and industry from the processes; and would encourage oppressive regimes to introduce content controls.

Cloud Computing Gives Rise to New Revenue

Excerpted from Automated Trader Report by Jane Shurtleff

Cloud computing has proven to be not only a disruptive technology, but also an incredibly fast-growing and lucrative one. Forbes' Roundup of Cloud Computing Forecasts and Market Estimates, 2014 cites an IHS report which predicts that by 2017, enterprise spending on cloud computing will reach $235.1 billion, triple the $78.2 billion spent in 2011.

This rise in spending translates into a rise in revenue for many cloud platform companies. For example, Microsoft reported last month that its commercial cloud revenue increased by 147% over the comparable quarter last year and now stands at $4.4 billion.

In April 2014, we announced our partnership with Microsoft, bringing their Azure ExpressRoute service to our IBX® data centers worldwide. We also announced the Equinix Cloud Exchange, an advanced interconnection solution that enables on-demand and direct connectivity to multiple clouds and multiple networks across the globe. Launching these two solutions has made Equinix an industry-leading provider of cloud access and interconnection services.

Thanks to the growth in cloud services, we have seen tremendous gains in our Cloud and IT Services business this year, with record bookings and revenue growth at 19% year over year. Our cloud business accounted for a third of our new customers, including wins from ClearGov, Digital Ocean, and Velo Cloud. During Q2 of 2014, cloud players like Oracle, Workday, and Marketo expanded into additional global markets, capturing the performance and user experience benefits of customer proximity only achieved within Platform Equinix data centers.

Many enterprises cite reducing and controlling costs as the number one challenge to managing their business. Moving resource-intensive applications and IT services to the cloud is an extremely attractive alternative to maintaining costly, in-house IT infrastructures, especially for small-to-medium businesses trying to compete in markets with larger enterprises. And the pricing war among the big three cloud providers, Amazon Web Services, Google, and Microsoft Azure, is definitely working in businesses' favor, with cuts ranging from 30% to 60% over the last year.

We know from our own rapid growth in the cloud marketplace that cost-effective connectivity is the key to making the cloud work. Businesses need to be able to connect to multiple clouds, while lowering operational costs. At the same time, cloud service providers need to instantaneously connect to thousands of partners and customers, giving them the fastest and most economical path to market expansion and revenue acceleration.

As the home of the interconnected cloud, we get it. Your cloud interconnection platform needs to be as agile as the cloud services it supports and efficient enough to give you the cost-savings your business requires. We have amassed a diverse inventory of cloud and network providers worldwide. More than 450 cloud providers reside inside Equinix data centers today, making Equinix the place to go for both large and small businesses adopting cloud services.

Many cost reductions come from the close proximity to such a high concentration of cloud providers within Equinix IBX data centers; however, they're also a result of the increased simplicity in managing cloud environments. Automated provisioning of multiple cloud connections, as supported by our Equinix Cloud Exchange, is a perfect example of how companies can simplify interconnecting to multiple cloud services. We are also the home of 1,000 networks, giving businesses the choice to select the most cost-effective connectivity solution and take advantage of our IP peering services to reduce networking costs.

The numbers don't lie. Equinix has achieved the critical mass of cloud services and interconnectivity options to drive the convergence of cloud services and cost savings in our data centers that businesses need.

Big Data & Cloud Computing: A Bigger Foothold

Excerpted from Search Data Management Report

Big data in the cloud is something like science-fiction writer William Gibson's famous description of the future: It's here -- it's just not very evenly distributed.

High-profile vendors such as Amazon Web Services, Google, Microsoft, IBM, and Rackspace offer cloud-based Hadoop and NoSQL database platforms to support big data applications. A variety of startups have introduced managed services that run on top of the cloud platforms, freeing users from the need to deploy their own systems there. And mixing big data and cloud computing is often the first choice for Internet companies, especially software and data services vendors that are just getting started themselves.

But many mainstream organizations don't view managing data in the cloud the way the Web wunderkinds do. Some get white knuckles about data security and privacy protections in the cloud. Others still run most of their operations on mainframes and other well-entrenched systems far removed from cloud architectures. And the sheer mass of data stored in such systems makes moving it to the cloud physically challenging. Moreover, available processing capacity in existing data centers makes the promised financial benefits of using public clouds like AWS and the Google Cloud Platform less compelling, even to companies that are interested in taking advantage of the reduced costs and increased flexibility that cloud-based systems can provide.

Citigroup Inc. is a case in point. The financial services company is faced with an incoming flood of unstructured data as the Web becomes a ubiquitous application interface. It also has to deal with a mix of different data structures in online financial applications. Those challenges led Citi to adopt MongoDB's namesake NoSQL database. MongoDB is supported on AWS and other cloud platforms, and Citi is taking a cloud approach with the software, said Michael Simone, global head of CitiData platform engineering. But it's a private cloud, built within the confines of the New York company's corporate firewall and fully managed by its IT department.

"For now, we're not committing to extended or public cloud integration," Simone told attendees at the MongoDB World conference in New York in June. "Citigroup's data centers are vast and deep themselves, and we feel we can build an economical on-premises cloud."

Giving Away Software to Make It More Valuable

Excerpted from NY Times Report by Quentin Hardy

A big-data start-up just destroyed itself to save itself. That is perhaps not a big deal, but it says a lot about where the world of big computing is headed.

The company was called Continuuity, and it had spent three years building a cloud-based method to more easily run data analysis and related software applications, by improving the way data is collected and prepared.

Now led by people who previously held senior positions at Yahoo and Facebook, it is called Cask, and all that software it built is being offered to the world free, as open source.

There is a race on to build profitable big-data applications. Open-source versions of big data could accelerate that race, but they could also affect the economics of the business for some of big data's corporate champions.

"The new part of the big-data story is all these companies trying to bring analytics to the masses," said Jonathan Gray, a Cask co-founder and its chief executive. "Ultimately, open source is a requirement for widespread adoption."

Of course, open source is an approach and a project, not a business. Mr. Gray said that Cask was hoping to attract the kind of interest Continuuity never could when it sold its software and that it would make money by selling subscriptions and support to people who want a more closely managed version of the software. It may be hard to give up three years of work, but it may also be a way to give it new life.

If the model works, it will be interesting to see where open-source data ingestion fits into the development plans of enterprise software companies. Big companies have spent a lot of money on finding unseen, profitable patterns in their customers' records, most notably as hinted in a leak from Salesforce.

The software giant Oracle's big show starts this Sunday in San Francisco, and it is likely to include Oracle-branded big-data announcements.

Certainly, plenty of open source is going into corporations now, to a point where its development is increasingly steered by big business. Companies like Cloudera, whose "open source, but you can pay too" Hadoop data-framework model is being copied by Cask, have found plenty of success coexisting with enterprises.

In March, Intel invested in Cloudera at a level that people inside Cloudera refer to as "a mini-I.P.O." Intel also discontinued work on its own version of Hadoop. Bringing things full circle, a former executive involved in Intel's Hadoop efforts is joining Cask.

The Future of Cloud Computing

Excerpted from InformationWeek Report by Charles Babcock

Cloud computing is beginning to transform the way enterprises buy and use technology resources, and that was evident at the Interop 2014 conference and exhibition in New York this week. Cloud experts and practitioners of all stripes were in attendance and provided some insights -- positive and negative -- on where this trend is heading.

In a workshop on Designing Infrastructure for Private Clouds, Ivan Pepelnjak, the network architect for ipSpace.net AG, a consulting company in Slovenia, hit upon one of the defining characteristics of cloud computing. "Cloud is all about self-service. You need to be able to allow your internal customer to change the rules on the load balancer and firewall. When someone says, 'This will never fly in my organization,' just tell them, 'Your developers are already using Amazon.'"

In weighing an open source internal cloud versus commercial products, he asked: "How expensive will it be to operate cloud? On the open source side you will have total control. But can you afford that? You either pay vendors or pay staff."

A questioner later in the session noted his academic institution had commissioned a staff-implemented, open source cloud to save money, but when it was up and running after a year of work by five to six IT staffers, the institution decided "it needed technical support. We proceeded to sign a contract for support with Red Hat" that erased most of the savings of a year's worth of work.

The audience member, who didn't identify his employer, said open source cloud builders should be realistic when they undertake the project themselves and realize that, when they get to their goal, they may do something -- hire a vendor for support -- that they attempted to avoid at the start.

Paul Savill, Senior VP of Level 3 Communications, said during the Next Generation Applications workshop that he once received a call from a customer complaining of a network slowdown. It was "a major beverage company" that was seeing a slowdown in a network application that required coast-to-coast message exchanges. The service had been provided at 14 milliseconds, but the network added 2 more milliseconds of latency so the round trip took 16 milliseconds. The customer's fine-tuned application felt the impact of the extra 2 milliseconds and customer requests backed up in the queue. The service level agreement called for a maximum of a 40-millisecond round trip so the service provider didn't think increasing the latency to 16 milliseconds was a problem. But the customer did and complained. "The performance of the network is critical to cloud services. If you think the network is just the last piece and not that important, then you're going to make some mistakes," he warned.

A debate on the future of PaaS and whether a file packaging system for Linux containers, Docker, is really a needed element of a platform-as-a-service drew an emphatic response from Krishnan Subramanian, director of strategy for Red Hat's OpenShift platform. Red Hat is heavily invested in a partnership with Docker and is bringing out a version of Red Hat Enterprise Linux that's geared to running Docker containers, Atomic RHEL. "Offering PaaS that doesn't support Docker Linux containers is the equivalent of selling snake oil," said Subramanian.

Michael Biddick, CEO of Fusion PPT and a writer for InformationWeek, spoke on hybrid clouds. "Hybrid operations provide us with agility, scalability, and economic benefits. But things will get in the way of delivering those benefits. There are concerns about compliance, security, and the previous investment in legacy systems. It's difficult too because people don't fully understand it and they resist it," he said.

Biddick later made some pointed comments on the value of public cloud SLAs. "The service level agreements have you agreeing to a lot of things that aren't really too good for you. … We see service credits given and cash payments, but rarely in significant amounts. … They contain language like 'will make a commercially reasonable effort to provide continuous service.' It's almost irrelevant after an hour of downtime for your business to get a $200 check back," he said.

Docker makes it much easier to turn a cloud workload into a set of files that is portable across cloud environments, said Shashi Kiran, senior director of market management for Cisco, in a session called Managing the Hybrid Cloud. Linux "containers are probably the next major push that we will see that will drive hybrid cloud," Kiran said. On the same day, Cisco announced it had added 150 data centers to its Intercloud partnership that has a group of companies offering compatible OpenStack cloud services. BT, the former British Telecom, NTT Data, Deutsche Telekom, and Equinix were among the biggest service providers in the group. Cisco sees containers as one way to make workloads portable across such a group. "Portability across hybrid environments. That's Intercloud," he said.
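
One simple way to see that portability (a sketch driving the standard docker CLI from Python; the image and file names are illustrative and assume Docker is installed on both hosts) is to export an image to a plain tar file that can be copied to any other Docker host and loaded there:

    # Sketch: package a container image as a portable tar file, then reload it.
    import subprocess

    # On the source host: serialize the image's layers and metadata to one file.
    subprocess.check_call(["docker", "save", "-o", "myapp.tar", "myapp:1.0"])

    # Copy myapp.tar to another cloud or data center, then on the destination
    # host restore the identical image and run the workload unchanged.
    subprocess.check_call(["docker", "load", "-i", "myapp.tar"])
    subprocess.check_call(["docker", "run", "-d", "--name", "myapp", "myapp:1.0"])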

In the same discussion, Payal Chakravarty Jain, senior product manager for IBM Cloud and Smarter Infrastructure, said: "We think DevOps will mature and become the standard way to practice IT operations and development in the future. New startups are way ahead that way."

Mark Russinovich, CTO of Microsoft Azure, said in an interview that Microsoft expects Azure in the future to host many Linux workloads running in Windows Hyper-V virtual machines. Up until now, Microsoft hasn't branded itself as a welcoming and hospitable place for Linux. The company has also developed, for its own internal use, a way to run Windows workloads in containerized form -- without adding the overhead of virtual machines. He said Microsoft uses its "Drawbridge system to provide greater security and isolation for our internal operations."

Coming Events of Interest

IEEE International Conference on Cloud Computing for Emerging Markets — October 15th-17th in Bangalore, India. The third annual CCEM will address the unique challenges and opportunities of cloud computing for emerging markets in a high-quality event that brings together industry, government, and academic leaders in cloud computing.

CloudComp 2014 — October 19th-21st in Guilin, China. The fifth annual international conference on cloud computing. The event is endorsed by the European Alliance for Innovation, a leading community-based organization devoted to the advancement of innovation in the field of ICT.

International Conference on Cloud Computing Research & Innovation — October 29th-30th in Singapore. ICCRI:2014 covers a wide range of research interests and innovative applications in cloud computing and related topics. The unique mix of R&D, end-user, and industry audience members promises interesting discussion, networking, and business opportunities in translational research & development. 

GOTO Berlin 2014 Conference — November 5th–7th in Berlin, Germany. GOTO Berlin is the enterprise software development conference designed for team leads, architects, and project management and is organized "for developers by developers". New technology and trends in a non-vendor forum.

PDCAT 2014 — December 9th-11th in Hong Kong. The 16th International Conference on Parallel and Distributed Computing, Applications and Technologies (PDCAT 2014) is a major forum for scientists, engineers, and practitioners throughout the world to present their latest research, results, ideas, developments and applications in all areas of parallel and distributed computing.

Storage Visions Conference — January 4th-5th in Las Vegas, NV. The fourteenth annual conference theme is: Storage with Intense Network Growth (SWING). Storage Visions Awards presented there cover significant products, services, and companies in many digital storage markets.

International CES — January 6th-9th in Las Vegas, NV. The International CES is the world’s gathering place for all who thrive on the business of consumer technologies. Held in Las Vegas every year, it has served as the proving ground for innovators and breakthrough technologies for more than 40 years — the global stage where next-generation innovations are introduced to the marketplace.
