Distributed Computing Industry
Weekly Newsletter

Partners & Sponsors

ABI Research

Acolyst

Amazon Web Services

Apptix

Aspiryon

Axios Systems

Clear Government Solutions

CSC Leasing Company

CyrusOne

FalconStor

General Dynamics Information Technology

IBM

NetApp

Oracle

QinetiQ

SoftServe

TrendMicro

VeriStor

VirtualQube

June 16, 2014
Volume XLVIII, Issue 7


Cloud Market Much, Much Bigger Than We Thought

Excerpted from ReadWrite Report by Matt Asay

A billion dollars isn't what it used to be. Indeed, anyone who has been hanging around the cloud computing industry for the past couple of years has been barraged by $1 billion strategic commitments from big vendors like IBM and HP. While $1 billion sounds like a big number, it's not always clear how much of those billions are being spent on hand-waving marketing versus serious cloud engineering.

Even if we credit these efforts as serious attempts to transform old-school enterprises into new-school cloud vendors — and we should — what's more interesting is to understand just how much business is actually moving to the cloud.

A start-up out of Portland, OR, called Cloudability may be able to tell us. And the answer is, "far more than we may have imagined."

Cloudability gives companies visibility into how much money they're spending for cloud computing services like Amazon Web Services (AWS).

Last week, Cloudability crossed the $1 billion mark in cloud spending managed by its system. That's $1 billion in cloud spend in the three years since its founding, $999,000,000 more than the company managed in 2011.

For a still relatively small start-up to be notching those kinds of numbers, it suggests that even our most optimistic estimates of cloud computing spend may be low.

Earlier this week I talked with Cloudability's CEO, Mat Ellis, to get his thoughts on the significance of the company's $1 billion milestone and why cloud spending is accelerating.

One of the big shifts driving more money being spent on AWS and other cloud services, Ellis points out, is that the cloud is becoming an essential part of the "compute supply chain." In other words, rather than building and running vertically-oriented IT services within their own data centers, companies are shifting more information technology (IT) services to public cloud providers and plugging into published application programming interfaces (APIs) rather than writing their own code.

A shift is also taking place away from isolated, departmental deployments of public cloud resources and toward enterprise-wide spending. Some of this is driven by a desire to shift spending from capital expenditure to operational expenditure. Some of it is simply driven by increasing comfort with public cloud security and performance. Whatever the reason, enterprises are clearly spending lots of money on cloud computing.

The growing amount of money going into enterprise cloud deployments is why a company like Cloudability can exist. As Ellis suggests, while everybody knows cloud is growing, "what they don't know is the havoc it's having inside companies as they try to manage their spending."

Developers who report into a line of business are tasked with writing applications; how they do it is of less importance. This "bottom-up" IT phenomenon is "a big reason cloud cost analytics have become mandatory for large buyers of cloud services" because, "if you're spending lots of money on cloud, you have to manage it actively," according to Ellis.
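
Mechanically, the first job of any cloud cost-analytics tool is attributing spend to services and teams. Here is a minimal sketch of that idea in Python, assuming a hypothetical CSV export of billing line items with "service", "team", and "cost" columns (roughly the kind of data AWS's detailed billing reports provide); it illustrates the concept and is not Cloudability's implementation:

    import csv
    from collections import defaultdict

    def spend_by(path, key):
        # Sum cost per value of `key` (e.g., "service" or "team"), largest first.
        totals = defaultdict(float)
        with open(path) as f:
            for row in csv.DictReader(f):
                totals[row[key]] += float(row["cost"])
        return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

    # Who is spending, and on what? The answers drive budgeting, not just cost-cutting.
    for name, cost in spend_by("billing.csv", "team"):
        print("{:<24} ${:,.2f}".format(name, cost))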

Which, of course, is exactly what we saw happen with the growth of the open source movement.

Open source, beloved by developers even as it was initially shunned by company executives, became part of the business fabric of software development without the business leaders ever realizing what happened. Cloud spending has followed this same path. But unlike open source, where lawyers got involved to ensure nobody was giving away critical intellectual property (IP), in cloud computing it has largely become a matter of managing cost.

As Ellis points out, this isn't about cutting cloud spending, but rather about channeling it:

"Cloud analytics really isn't about cutting costs, like most people think, but rather to see where it's happening and where it could be best spent. In fact, nearly all of our enterprise customers are using us to help them expand their cloud usage, but in a controlled and efficient way. Once you've demonstrated this stuff works and makes you a profit there's really no stopping it."

Developers, Developers ... Developers!

When I asked Ellis what all this means for traditional IT, his response was immediate: "The CIO role is fundamentally changing." Not changing in a "the CIO is doomed" sort of way, but changing in terms of a fundamental shift in what the chief information officer does.

CIOs, Ellis says, generally understand that the entire way they buy, sell and support IT services is shifting. One of the biggest challenges CIOs have is deciding when and how to really engage with the cloud. I recently quoted Fidelity Investments' CIO: "We know about all the latest software development tools. What we don't know is how to organize ourselves to use them."

This is true of CIOs and public cloud services, too. The spirit is willing; the flesh proves very weak.

But the first step is to understand how their users are spending money, Ellis points out.

As for developers, cloud's impact on them is huge, too. Ellis suggests that, "They're now operating in territory beyond the code they write," with "corporate visibility into their spending that forces new levels of fiduciary responsibility."

But, again, it's not really a matter of cutting down spending, but instead "analytics tools like ours give them the ability to justify asking for more resources to do more things."

More resources to do more things. Billions more.

Report from CEO Marty Lafferty

The all-new Distributed Computing Industry Association (DCIA) and Cloud Computing Association (CCA) co-hosted CLOUD DEVELOPERS SUMMIT & EXPO 2014 (CDSE:2014) is shaping up to be a remarkable event for cloud-computing customers.

In addition to numerous hands-on instructional workshops and special seminars, on Wednesday October 1st and Thursday October 2nd, in Austin, TX, enterprise end-users from six sectors will take center-stage to share case-studies from their adoption of cloud-based business solutions.

If you represent a company that has adopted cloud computing for logistics, big data, or mobile, and has an interesting story to tell, please contact me at your earliest convenience.

Likewise, if your organization is based in media and entertainment, healthcare and life sciences, or government and military -- and has experiences to share based on your implementation of cloud computing, please get in touch as soon as possible.

If you are a cloud solutions provider, and would like to recommend your enterprise customers for one or more of these speaking roles -- possibly in a joint presentation session with you at this major industry event — also please call or email ASAP.

We're finalizing the conference agenda and speakers list to be included in the promotional materials and conference program now.

Call or e-mail at your earliest convenience — this week if at all possible — for more information.

This inaugural summit and expo actually co-locates two related but distinct events with broader audience appeal than prior CCA & DCIA offerings.

First, it will provide the kind of senior-level strategic business conference we've pioneered with the CLOUD COMPUTING EAST and CLOUD COMPUTING WEST conference series — and for which audiences will be of the same caliber as the decision-maker attendees for those;

And second, it adds an all new opportunity for cloud-solution providers and vendors to present hands-on instructional workshops and special seminars — and for which audiences will be more directly involved in developing, programming, and implementing cloud-computing solutions.

To make it even more attractive for end-user enterprises and other cloud-solutions customer organizations to attend and take advantage of both events, we're offering senior-level speakers the opportunity to bring two staff members at no charge.

These more junior attendees will get enormous value from attending several of the hands-on workshops as well as being able to attend the relevant keynotes and panel sessions at the business conference.

The schedule has been carefully organized so that workshop attendees do not have to miss out on the thematically related conference sessions that are most directly related to their areas of interest.

According to the research firm IDC, cloud computing was an estimated $47.4 billion industry in 2013 and is expected to more than double by 2017. The cloud's 23.5% compound annual growth rate is five times faster than that of the broader technology market.
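
Those two figures are consistent; as a quick back-of-the-envelope check (ours, not IDC's):

    $47.4B x 1.235^4 ≈ $47.4B x 2.33 ≈ $110B by 2017

which is indeed more than double the 2013 total.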

The needs have never been greater for developers, programmers and architects to advance their knowledge, capabilities and skill-sets in order to profit from this revolutionary transformation in the business processes of the future.

During the conference part of CDSE:2014, highly focused business strategy and technical keynotes, breakout panels, and seminars will thoroughly explore cloud computing solutions, and ample opportunities will be provided for one-on-one networking with the major players in this space.

Also, as noted above, the event will feature co-located instructional workshops and special seminars facilitated by more than one-hundred industry leading speakers and world-class technical trainers.

Attendees will see, hear, learn, and master critical skills in sessions devoted to the unique challenges and opportunities for developers, programmers, and solutions architects.

All aspects of cloud computing will be represented: storage, networking, applications, integration, and aggregation.

Three tracks will cover mobile, logistics, and big data considerations that cut across nearly every enterprise vertical migrating business functions to the cloud.

Three tracks will zero-in on three economic sectors that are now experiencing the most explosive growth: media and entertainment, government and military, and healthcare and life sciences.

The DCIA and CCA will debut the new Cloud Computing Competency Certification (CCCC) program with opportunities to qualify and receive Level One Certification (CCCC-L1) on site.

Share wisely, and take care.

Google to Dockerize Future of Cloud Computing

Excerpted from US Finance Post Report by Asif Imtiaz

Back in 2012, Google hired Eric Brewer, a computer science professor from the University of California, Berkeley, to design a brand-new computing platform that could span dozens of data centers across the globe and process billions of user requests within milliseconds.

Brewer delivered a keynote speech at a conference in San Francisco, CA, announcing that Google will put its considerable weight behind a new cloud-computing platform, Docker.

"Docker is an open platform for developers and sysadmins to build, ship, and run distributed applications. Consisting of Docker Engine, a portable, lightweight runtime and packaging tool, and Docker Hub, a cloud service for sharing applications and automating workflows," says its website.

Docker is the brainchild of Solomon Hykes, who dreams of wrapping any software in Docker containers, or "Dockerizing" it, as he prefers to call it, allowing it to run in any data center regardless of the operating system, whether hosted in the public cloud or in a privately owned data center.

Docker is a dream come true for companies like Google that rely heavily on distributed computer networks to provide web services. When users search on Google's website or open an email in Gmail, they are not simply fetching data from a single server.

Google uses computing power from many server processors in order to balance the load that would otherwise fall on a single server. When you are serving billions of requests within milliseconds, even a warehouse full of dedicated servers is not enough.

Docker will effectively "standardize" the operating system environment of any software, making it portable across platforms. That means software developed on a Mac can easily share resources from a Linux server. It also enables software developers to span their networks across multiple server platforms. For example, a single piece of software can run in the cloud by sharing server resources from both Google's cloud servers and Amazon Web Services virtual servers at the same time.

Before Docker, companies had to build mammoth data centers to share computing power across servers, as each server had to adhere to the same standards. With Docker, any piece of software can leverage the scalability of multiple data centers to deliver seamless content across multiple operating system platforms.
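
To make "Dockerizing" concrete, here is a minimal sketch of a Dockerfile, the recipe Docker uses to package an application and its dependencies into a portable container image (the application file, app.py, is hypothetical):

    # Start from a standard base image; any Docker host can run the result.
    FROM ubuntu:14.04
    # Install the app's dependencies inside the image, not on the host.
    RUN apt-get update && apt-get install -y python
    # Copy the application into the image.
    ADD app.py /srv/app.py
    # What runs when the container starts.
    CMD ["python", "/srv/app.py"]

Built once, the resulting image can run unchanged on a laptop, in Google's cloud, or on AWS.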

Docker is a big deal, and Google knows it.

A developer from eBay, Ted Dzuiba, told Wired that "if you believe that what makes life easier for developers is where things are moving, then this containerization thing is where things are moving."

Google uses enough electricity to power 200,000 homes at any given moment. Docker will not only help companies like Google cut down their investments and carbon footprint; it has the potential to alter the face of the entire Internet infrastructure.

Docker 101: What It Is and Why It's Important

Excerpted from Network World Report by Brandon Butler

Docker is a hot topic this week. If you're unfamiliar with what this technology is or what it means for your business, here's a guide.

Docker is both an open source project and the name of a start-up that focuses on Linux containers. Containers are a way of running multiple isolated applications on a single host.

It's similar to compute virtualization, but instead of virtualizing a server to create multiple operating systems, containers offer a more lightweight alternative by essentially virtualizing the operating system, allowing multiple workloads to run on a single host.

Docker the company has released the 1.0 version of its product this week (read more about the 1.0 release here), and in conjunction with doing so is hosting an event named DockerCon. Docker Founder and CTO Solomon Hykes said the open source Docker project has been downloaded (for free) more than 2.75 million times and more than 460 contributors helped create this version.

Docker has built up partnerships to support its product, and service providers are jumping on board to offer Docker services.

Containers, and specifically Linux containers, are not new. Tech giants such as Oracle, HP, and IBM have been using containers for decades. In recent years, though, the open source project Docker has gained popularity as an alternative, or complement to virtualization.

Recognizing a market opportunity to provide support around the open source project, a company named dotCloud was formed, and was later renamed Docker. In January the company received a Series B funding round worth $15 million, led by Greylock Partners. Red Hat has committed a major investment in the company as well. (Read more about Red Hat's work with Docker here.)

The open source project has two major aspects: control groups, or cgroups, which define the compute, memory, and disk I/O that a workload needs; and namespaces, which isolate and separate each of the workloads.
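
Docker surfaces those kernel features as command-line flags. For example, with the Docker 1.0-era CLI (the image and limits here are illustrative):

    # cgroups cap what the workload may consume: -m limits memory, -c sets CPU shares.
    # Namespaces give the container its own process, network, and filesystem view,
    # so `ps aux` inside it sees only the container's own processes.
    docker run -m 256m -c 512 ubuntu ps aux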

Docker the commercial product has two major components as well: Docker Engine, which is the core software platform that enables users to create and use containers; and Docker Hub, a SaaS-based service for creating and sharing Docker services.

With the release of the 1.0 version and Docker Hub, the company says it has more than 14,000 applications that can be used with its containers.

Tech blogger Scott Lowe, comparing containers to virtual machines, writes: "Containers, on the other hand, generally offer less isolation but lower overhead through sharing certain portions of the host kernel and operating system (OS) instance."

Containers are an attractive option for environments where there is only a single operating system, whereas virtual machines (VMs) and hypervisors can be useful if there is a need to run multiple OSs in an environment.

VMs are not going away, but containers could offer a better way to run certain applications instead of virtualization. (Read more about how containers can replace VMs here.)

One of the major benefits of containers is portability. Containers can run on top of VMs, or bare metal servers. They can run on-premises or in the cloud.

This has made software development one of the earliest popular use cases for containers. Coders can write an application, place it in a container, and then move the application across various environments, as it is encapsulated inside the container.
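
With the Docker command line, that workflow looks roughly like this (the image name myorg/webapp is hypothetical):

    docker build -t myorg/webapp .   # package the app and its dependencies
    docker push myorg/webapp         # publish it to a registry such as Docker Hub
    # ...then, on any other Docker host, on-premises or in the cloud:
    docker pull myorg/webapp
    docker run -d myorg/webapp       # the app runs identically wherever it lands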

Docker the open source project is free to download from GitHub. Docker the product offers privately hosted repositories of containers, which are about $1 per container. See full Docker pricing here.

With all the buzz around Docker, many tech companies are looking to get in on the action. Docker is building up its partnerships, too. The commercial version of Docker comes with support from the company, and integrations with a variety of other software platforms, including Linux distros from Red Hat, SuSE and Ubuntu, and automation and continuous-integration tools such as Puppet, Chef, Ansible and Jenkins.

Other service provider vendors are enabling Docker on their platforms.

Rackspace CTO John Engates, for example, wrote a blog post this week saying that initially he and the cloud hosting company were not terribly impressed with Docker. But then after customers started using it and asking for Rackspace to support it, the company was "pulled" into the community, Engates says.

Now, they're converts; Engates calls containerization "next generation virtualization."

Rackspace is using Docker to test and deploy new applications in various environments; it's even using containers in networking, because it allows for multi-tenancy of software-based load balancers.

The biggest impact though, he says, could be the way containers could usher in an era of portability of workloads across environments.

"Docker could provide the abstraction that makes swapping workloads between clouds possible. They don't have to be OpenStack clouds either. OS-level virtualization makes the application agnostic to the underlying infrastructure. Docker could enable spot markets for cloud computing and the ability for users to find a best-fit solution for their needs."

He goes on to list some of the ways users can get involved in the Docker community if they're interested.

Revamped HealthCare.gov to Run on Amazon Cloud

Excerpted from Executive Biz Report by Mary-Louise Hoffman

Amazon Web Services has been selected by the Obama administration to host cloud computing functions on the federally-run health insurance exchange portal, FCW reported Tuesday.

Adam Mazmanian writes that HealthCare.gov is undergoing a technical makeover to address glitches and feature new user tools when the next enrollment period commences in November.

The Centers for Medicare and Medicaid Services said testing of the revamped marketplace is scheduled to occur this summer, according to FCW.

Mazmanian writes the portal will implement an insurance comparison tool, an identity management technology and a platform designed to simplify the application process.

The site is also being redesigned to help applicants to enroll for coverage through the EZ App, according to FCW.

IBM Builds New Data Centers for Federal Government

Excerpted from Washington Post Report by Mohana Ravindranath

IBM is planning to open a new group of Internet cloud data centers designed to deliver security, desktop virtualization and other services to the federal government, the company announced on Wednesday.

The first center, in Dallas, TX, is scheduled to be online this month. IBM is planning another, based in Ashburn, VA, for the fall.

The new data centers are part of IBM's recent $1.2 billion investment in its global cloud business; the company aims to have 40 data centers up and running by the end of 2014, adding 15 new sites — in Hong Kong, London, and Mexico City among other locations — to its existing 25.

The centers are designed to meet federal cybersecurity requirements, and each has the capacity to support 30,000 servers, according to IBM. They use a software infrastructure provided by SoftLayer, a Dallas-based company IBM acquired in 2013 for $2 billion.

IBM generated about $4.4 billion in cloud revenue in 2013, and since 2007 has invested more than $7 billion in cloud-related acquisitions. The company aims to take in $7 billion in annual cloud revenue by 2015.

Kaspersky Lab & Telefonica Join Forces

Kaspersky Lab has announced a new strategic cooperation agreement with Telefonica, to provide its customers worldwide with cyber-security services. 

In a statement Friday, Kaspersky said Telefonica, one of the world's leading integrated operators in the telecommunications sector, is better known through its commercial brands O2, Movistar, and Vivo. 

"Under this agreement, Telefonica will incorporate into its cyber-security portfolio Kaspersky Lab's superior threat intelligence services fueled by the cloud-based Kaspersky Security Network, as well as the extensive expertise of Kaspersky Lab's Global Research & Analysis Team (GReAT)." 

"Telefonica's cyber-security portfolio is built on top of internally developed technology and brings together the expertise of its Global CyberSOC (security operations center) team with the experience of key security players such as Kaspersky Lab," it said. 

Kaspersky Lab's Vice President of Corporate Sales and Business Development, Veniamin Levtsov, expressed confidence that the cooperation would allow Telefonica to extend its wide range of cyber-security services. "Being equipped with such a powerful instrument, the experts at Telefonica will be better prepared to react to the threats targeting their customers," he said.

BitTorrent Experiments with Decentralized Chat Service

Excerpted from VentureBeat Report by Ruth Reader

Are you longing to chat in private — not just "off the record" but off the grid? BitTorrent today released an internal alpha of its server-less chat app, making your dreams of truly private chats a near-reality.

BitTorrent realized the word "privacy" means different things to different people and wants this new app to account for everyone's needs, BitTorrent senior product manager Jaehee Lee explained in a blog post today. That doesn't just mean encrypting messages, like other messaging apps have done, but also keeping your metadata "decentralized." Instead of messages passing through a central server, where they are unencrypted and vulnerable to data sweeps, BitTorrent's chat app will enable you to communicate device-to-device, without a server.
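
The architectural difference is easy to sketch. The toy Python below shows direct, device-to-device delivery with no relay in the middle; it is only an illustration of the topology, not BitTorrent's actual (unpublished) design, and it omits the hard parts a real system needs: peer discovery, NAT traversal, and end-to-end encryption.

    import socket

    def send_direct(peer_ip, peer_port, message):
        # The message travels straight to the peer's device; no central
        # server ever stores the content or the metadata about who talked.
        with socket.create_connection((peer_ip, peer_port)) as s:
            s.sendall(message.encode("utf-8"))

    def receive_direct(listen_port):
        # The recipient listens on its own device instead of polling a server.
        with socket.create_server(("", listen_port)) as srv:
            conn, _addr = srv.accept()
            with conn:
                return conn.recv(4096).decode("utf-8")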

BitTorrent will have many iterations of the chat app, said Lee, who is full of feature ideas. It will be device-to-device. It will offer VPN for those times when you need to remain anonymous even to the person you're talking to. Maybe it will even be a completely transparent service.

"What if we created a way to inform users of how their messages were being routed and so they could decide for themselves if they feel comfortable chatting through that connection? What if we allowed them to choose a specific type of connection?" wrote Lee.

For now, it sounds like BitTorrent has a lot of ideas to contend with before it unveils the ultimate privacy chat app. Nevertheless, Lee insists a private alpha launch is "just around the corner." Stay tuned.

12 Cloud Computing Companies to Watch

Excerpted from Network World Report by Brandon Butler

Venture capital investment has rebounded in recent years to healthy levels and one segment of the technology market that's basking in this is cloud computing.

Investments in cloud computing companies rose from $2.1 billion in 2010 to $4.2 billion in 2012, and according to Dow Jones VentureSource's numbers from 2013, $3.4 billion more was invested through the first three quarters of the year.

"We're at the early stages of the impact cloud computing will have on the broader market," says North Bridge Venture Partners' Michael Skok, who estimates that the cloud has penetrated less than 10% of the roughly $300 billion software market. "That makes people very bullish."

Start-ups are forming up and down the cloud stack, from optimizing infrastructure services to offering development platforms. There are applications served from the cloud and new classes of apps that could not have been made without the cloud. Network World has scanned the universe of cloud computing startups founded within the past few years and built this non-exhaustive list of companies (listed in alphabetical order) that give a glimpse of what's to come in this fast-growing market. (Watch a slideshow version of this story.)

CloudLock provides security on top of Google Apps and Salesforce.com by ensuring that sensitive information is encrypted and being handled correctly. CEO Gil Zimmerman was entrepreneur-in-residence at VC firm Cedar Fund before co-founding CloudLock. Before that, he held positions at Sun and EMC. Co-founder Tsahy Shapsa is formerly of Sun and Network Appliances. Founded: 2007 (rebranded in 2010). Headquarters: Waltham, MA. Funding: $28.2 million. Backers: Cedar Fund, Ascent Venture Partners, Bessemer Venture Partners. Fun fact: Co-founders Shapsa and Zimmerman served in the Israeli Defense Forces, and Shapsa served in the Israeli Prime Minister's office as a security team leader. Now, both executives run networking events in the Boston area for Israeli start-ups.

CloudMunch offers application lifecycle management software for devops and agile development processes. Co-founder and CEO Pradeep Prabhu is former VP and head of SaaS at Infosys. Co-founder and CTO Prasanna Raghavendra was head of engineering in Infosys's SaaS practice. Founded: 2011. Headquarters: Seattle, WA. Funding: $1M seed round. Backers: Svapas Innovations. Fun fact: CloudMunch has a completely distributed workforce across the globe with no central office. All employees work from home, and the company manages its operations using its own technology.

CloudPhysics uses a SaaS platform to deliver big data analytics that optimize how data centers run. Here's the story on the four co-founders: CEO John Blumenthal was formerly director of product management for VMware's ESX storage stack, and before that he worked at Veritas Software. CTO Irfan Ahmad was also a VMware engineer, while VP of Operations Jim Klechner was with Currenex and Chief Scientist Xiaojun Liu worked at Google, Salesforce.com and Sun. Founded: 2011. Headquarters: Mountain View, CA. Funding: $12.5M. Backers: Kleiner Perkins Caufield & Byers, the Mayfield Fund, Mark Leslie, Peter Wagner, Carl Waldspurger, Nigel Stokes, Matt Ocko, and three VMware co-founders: Diane Greene, Mendel Rosenblum and Ed Bugnion.

CloudVolumes provides application virtualization and management software. CEO Raj Parekh is a founder of Redwood Ventures and previously held CTO and VP-level positions at Sun. Co-founder and CTO Matthew Conover is a former technical director at Symantec, while fellow co-founder and VP of Products Shaun Coleman was director of product management for Citrix XenDesktop. Founded: 2011. Headquarters: Santa Clara, CA. Funding: $23.5M. Backers: TiE Angels, Kumar Malavalli, Sanjog Gad, Rob Thomas, and Bill Crane (former VP of Engineering at LinkedIn and at Proofpoint). Fun fact: CEO Parekh is a serial entrepreneur who in 1998 founded Redwood Venture Partners, which has invested more than $250 million across 30 companies. He sits on the boards of more than 15 companies, serving as chairman at some.

Digital Ocean offers public IaaS. The team came from hosting company Server Stack, including CEO Ben Uretsky and VP of Marketing Mitch Wainer. Founded: 2011. Headquarters: New York City, NY. Funding: $40.4M. Backers: Andreessen Horowitz, IA Ventures, CrunchFund, TechStars. Fun fact: Digital Ocean was born out of the TechStars startup accelerator program, which provides seed funding via more than 75 venture capital firms and angel investors. The name Digital Ocean is meant to be a play on words: the company's compute services are named Droplets, like the drops of ocean water that form clouds.

Docker offers an alternative approach to virtualization for application developers via containerization technology. dotCloud was founded in 2010 by Solomon Hykes as a Platform-as-a-Service company. In March 2013, Solomon and other members of the dotCloud team released Docker.io, an open source engine to deploy any application and its dependencies as a lightweight container that runs virtually anywhere. Later in 2013, the company rebranded to Docker, and Ben Golub, former CEO of Gluster and Plaxo, was brought on board as CEO. Founded: 2010. Headquarters: San Francisco, CA. Funding: $26M. Backers: Greylock Partners, Insight.

ElasticBox allows apps to be portable and run regardless of whether their underlying infrastructure is one of many public or private clouds. CEO and co-founder Ravi Srivatsav has held senior engineering and product management positions at IBM and Microsoft, where he most recently worked in cloud computing. Other executives include co-founder and CTO Alberto Arias Maestro, the former chief architect of DynamicOps, which was bought by VMware and now serves as the basis of the company's multi-cloud provisioning tool. Timothy Stephan, VP of Product, was formerly head of product at mobile device management company MobileIron. Founded: 2011. Headquarters: Mountain View, CA. Funding: $3.4M. Backers: Sierra Ventures, Andreessen Horowitz, Intel Capital, Nexus Venture Partners, AngelPad, Raymond Tonsing. Fun fact: Co-founders Srivatsav and Maestro each had twins within two months of each other, and all four of their kids are younger than 3 years old.

Jelastic is hoping to pioneer the idea of a Platform as an Infrastructure: a cloud platform that runs on bare metal or virtualized servers to create a cloud that intelligently monitors and provisions the infrastructure based on the needs of the applications running on it. Co-founders are CTO Ruslan Synytsky and CFO Alexey Skutin. Synytsky formerly led engineering and software architecture teams at iQueLab, SolovatSoft and Datamesh. Jelastic brought on CEO John Derrick, who has extensive experience in advising and growing startups. Founded: 2011. Headquarters: San Mateo, CA. Funding: $2.5M. Backers: Maxfield Capital, Runa Capital, Almaz Capital. Fun fact: Co-founder Synytsky is a former engineer and programmer for the National Space Agency of Ukraine.

Koality speeds up code testing via the cloud. Developers spend a lot of time building code, which for many businesses turns into the lifeblood of their operations. An inevitable part of coding is testing it to make sure it actually works, and on big projects testing can take a long time. Co-founders Jonathan Chu (CEO), Jordan Potter (COO) and Brian Bland (CTO) all worked at software analytics firm Palantir Technologies. Founded: 2012. Headquarters: San Francisco, CA. Funding: $1.8M. Backers: FF angel investors, Webb Investment Network, Felicis Venture, Index Ventures. Fun fact: The name Koality plays off the idea of code "quality." The logo is of a koala bear hugging a repository.

nCrypted Cloud encrypts files stored in consumer cloud services like Box, Dropbox and Google Drive. Founder and CEO Nick Stamos served as CTO of Phase Forward, a health care IT company that was bought by Oracle after he left. He also founded security company Verdasys and served as its president until 2011. Co-founder and CTO Igor Odnovorov also worked at Phase Forward and Verdasys. Founded: July 2012. Headquarters: Boston, MA. Funding: $3M in angel funding. Backers: former Cisco and Microsoft executives.

PernixData offers flash virtualization. Compute and server resources have been virtualized, and network virtualization is underway as well, but what about storage? As cloud computing has become more pervasive in the enterprise, users are running into storage bottlenecks. Co-founder and CEO Poojan Kumar most recently served as head of data products at VMware; co-founder and CTO Satyam Vaghani was VMware's Principal Engineer and Storage CTO. Founded: 2012. Headquarters: San Jose, CA. Funding: $27M. Backers: Lightspeed Venture Partners, Mark Leslie, John Thompson, Lane Bess, Kleiner Perkins Caufield & Byers. Fun fact: Investor John Thompson is now Microsoft's Chairman of the Board.

Salsify provides cloud-based product information management and exchange for e-commerce websites. All three co-founders formerly worked at Endeca, an e-commerce search and navigation software company purchased by Oracle for $1.1 billion in 2011. Salsify CEO Jason Purcell was Endeca's 24th employee and ran its e-commerce business. VP of Products Jeremy Redburn ran product management and marketing for Endeca's business intelligence unit. VP of Marketing Rob Gonzalez ran marketing and product management for Cambridge Semantics. Founded: 2012. Headquarters: Boston, MA. Funding: $8M Series A; the company had been bootstrapped by its founders before then. Backers: North Bridge Venture Partners, Matrix Partners. Fun fact: The salsify plant is a cousin of the dandelion and is the namesake of the company. "It is beautiful, low maintenance, and spreads like wildfire -- all qualities we think the worlds of product content management, syndication, and distribution need," the Salsify website says.

Microsoft, Google, Amazon Entertaining $10B Spotify Purchase

Excerpted from Digital Music News Report by Paul Resnikoff

Earlier this week, sources pointed to efforts — by major labels — to sell Spotify for as much as $10 billion, with a 20% cumulative equity interest motivating the effort. Now, according to more sources talking to Digital Music News, Microsoft, Google, and possibly Amazon are all entertaining the idea of buying Spotify, though $10 billion could be considered an aggressive ask.

Telecommunications companies were initially tipped as prospective buyers, though they may not be the marrying type. Initial sources pointed us to acquisition interest from major telecommunications and mobile giants, though additional sources have since poked serious holes in that intel. One reason is that major mobile companies already have music applications on their decks, which makes a purchase a bit superfluous.

That said, alliances between mobile companies and music services continue to form, with AT&T+Beats and Sprint+Spotify just two examples in the US alone and plenty of others sprinkled across Europe, Asia, South America, Australasia, and beyond. That suggests that the best strategy for telecommunications companies might be 'casual dating,' especially given the inflexibility and limited upside that comes with a multi-billion dollar purchase.

Additionally, technologies and delivery methods can change dramatically in just a few years, making ownership a dicey bet.

Against that backdrop, one source pointed to Verizon as a company actively seeking an alliance with a major music service, especially in the wake of Sprint+Spotify. That raises the possibility of a deal with any number of smaller but highly-competent services, including Rhapsody, Deezer, and Rdio, with Deezer a particularly attractive target given its extensive experience linking with telecommunications companies (Orange, Tigo, etc.).

That shifts the discussion to mega-companies like Microsoft, Google, and Amazon, all of whom are struggling to compete in the streaming arena. Google's Play Music All Access remains over-named and underused; Amazon's Prime Music is just starting as consumers migrate away from downloading, and Microsoft's streaming efforts are perennially underachieving.

Enter Spotify, which actually has traction with 10 million paying subscribers, not to mention 40 million active users and a nearly-ubiquitous presence among music listeners. That makes an acquisition attractive, as it would offer one of the larger players an instantly-competitive streaming solution overnight, and a seriously potent weapon against Apple and its just-acquired Beats.

That said, the usual elephantine suspects have been balking at a double-digit billion price tag, according to our sources, and Spotify may lack the leverage to pull it off. "A big problem is the initial public offering (IPO) market," one source relayed. "Everyone thinks that Spotify can't go public successfully, so that limits their options and price."

"This isn't WhatsApp," one source relayed after seeing the $10 billion target. "The price would be more like $5 billion."

Netflix vs. Verizon: Sign of Cloud Wars to Come

Excerpted from InfoWorld Report by David Linthicum

Fighting has erupted between Netflix and Verizon over who bears responsibility for the low quality of service some Netflix subscribers purportedly experience on Verizon's FiOS broadband service. The angry banter escalated sharply on Thursday when Verizon sent Netflix a letter threatening legal action if the video-streaming company doesn't stop talking smack about Verizon.

Verizon sells a cloud service of its own that Netflix does not use, and both companies sell streaming video services. Verizon customers who are also Netflix subscribers must access their Netflix content over their Verizon connections. That seems to be where things go wrong for Netflix customers with Verizon accounts. Who didn't see this coming?

This appears to be an emerging pattern in cloud computing: Competitors are getting more aggressive and even nasty toward one another.

The reasons are clear: The cloud-based technology market is beyond exploding, and most of the larger providers view 2014 and 2015 as the time for a cloud land grab. Most are sensitive to any obstacle that stands in the way of capturing that market, and that sensitivity manifested itself most recently as streaming-video services talking trash.

They're also going after writers like me when they disagree with our viewpoints. Although most cloud providers take criticism in the spirit it's intended, a number of them view any slightly negative slant as "fighting words" and push back as hard as they can. With such providers, expressing a point of view about cloud concepts and technology doesn't just get you an angry comment on your blog page, but an angry communique to your editor, your employer, or your colleagues.

What's more, there are growing attacks on the messenger rather than on the message, which is another disturbing trend. The irony is that those who attack most often are typically representatives of companies that thought cloud computing was a mere fad as recently as a few years ago.

I write and speak a lot about cloud computing. Lately, I've seen the emergence of aggressive behavior that simply wasn't around 10 years ago in the "good old days" of cloud computing. It's sad.

As the technology grows in market share and in importance to enterprises, it's critical that analysts, reporters, bloggers, and consultants who focus on cloud computing have a point of view that is honest, even if critical at times.

Nothing is perfect. Those who sell cloud technology or services understand there are good things and bad things about what they are doing. At the same time, enterprises are trying to figure out a confusing space, and they require all the points of view they can get. We need to encourage critical thinking, not stamp it out.

New Data Questions Netflix's Assertion

Excerpted from Streaming Media Report by Dan Rayburn

In the Netflix versus ISPs peering dispute, there are a lot of opinions and debate around who's at fault for letting some peering points degrade and who should be responsible for upgrading them. To date, many are having a hard time separating facts from opinions because Netflix and the ISPs haven't released any concrete data to back up their claims.

In most industries, if one company accused another of doing something wrong, it would be expected that the company making the claim would back up their position with detailed data that proves their point and leaves little doubt as to who's responsible for the problem. Netflix has yet to do that.

Most seem to be giving Netflix a pass, with very few demanding real transparency into what took place, or changed, that degraded Netflix performance back in September 2013. No company should try to force us to take their word for it; they should simply make the data public and let us decide on our own. Netflix says they are bringing transparency to the debate, but they are doing the opposite by using vague, high-level terms with no definition.

To date, Netflix has yet to set forth any details on how they want the current business models to change, how it should be regulated, or what they consider "strong" net neutrality, nor have they even submitted a proposal to the FCC.

The best example of this is how Netflix's player recently gave out messages saying that Verizon was at fault regarding quality issues, but then when challenged by Verizon to back up their claim, Netflix announced they would discontinue showing these messages on June 16th. Originally Netflix said these messages would be rolling out in a phased deployment on all networks, but in their blog post yesterday, they now say these messages were just a "test."

To me, it looks like Netflix simply created noise in the market, again with no data, and then, when pressured by Verizon to prove their case, decided to stop sending the messages rather than release any details. Why? If the problem lies within Verizon, Netflix should let us see the data that shows this and stand behind it. Why back down if they have the data to show where the problem is coming from?

This is just another example of many where all sides simply point the finger at each other and say it's the other guy's fault, but then provide zero details to back up their claims. However, that may change soon, as Netflix will likely publish network and performance graphs around a peering event taking place in DC on June 18th, to bolster their argument.

At the same time, some ISPs are actively working to release some data to the market. Please click here for the full report.

BitTorrent Shows What a Non-Neutral Net Might Look Like

Excerpted from DSL Report by Karl Bode

Over the years most of you have probably seen this graphic, which tries to show what a non-neutral network looks like — a world where Internet service providers (ISPs) charge different amounts for different tiers of access for different content.

Not to be outdone, BitTorrent recently launched its Join the Fastlane website, which tries to show a fractured future where some content is available on some tiers at some speeds, some of the time.

In a blog post showing its support for net neutrality, BitTorrent CEO Eric Klinker has this to say:

"In a world where we speak in shared photos and video streams, to bias traffic is to bar free speech. In a world where Internet access is fundamental to enterprise and invention, to bias traffic is to effectively end innovation...

A fast lane marks the end of consumer choice. We will no longer be able to decide how we want to use the Internet. Instead the chasm between fast and slow content will continue to grow until we are forced towards a curated Internet that is devoid of diversity."

Outside of urging people to comment at the FCC, BitTorrent's blog post and its letter to the FCC rather tap-dance around specifically what it thinks the FCC should do, avoiding recommending solutions like reclassifying ISPs as Title II common carriers.

Broadband Shouldn't Be like Cable TV — Why Peering Matters

Excerpted from GigaOM Report by Stacey Higginbotham

Peering may feel esoteric and difficult to understand, but here's an example of why consumers should care about how these interconnection fights play out between Netflix and ISPs.

When you're curled up on the couch, set to watch the second season of "Orange is the New Black," and the video stream pixelates or just stops, it's the modern-day equivalent of the "all circuits are busy now" message one can still hear on landline phones (or one could, if people were calling on them). And the issues behind both problems are similar — somewhere in the network there is too much demand and not enough capacity.

But unlike the days of landline phones, when one industry controlled the calling experience (telephone companies that were forced by FCC regulations to connect calls on their networks), our broadband networks and the Internet itself are controlled by varied industries, and there are no rules around interconnections. This is why we're seeing Netflix and various ISPs battling it out in the press.

For much of the Internet's life this wasn't an issue, but in 2014, as video takes over more and more network traffic (and cuts into the ISPs' triple-play bundles), ISPs have been pushing back against large content providers like Netflix, Google, Amazon, and others. Where in many cases US ISPs (and most participants on the Internet) have signed interconnection agreements that link two parties' networks together without one party charging the other, this is now changing.

Seeking control over what traffic gets on their networks and another source of revenue, several of the nation's largest ISPs have been engaging in negotiations to charge Netflix, and the transit providers that carry Netflix traffic, for the ability to connect directly to the ISPs' networks. This has led both Netflix and ISPs to engage in behavior that has hurt consumers. We've detailed the problem here and here, and also explained why this is an issue that the FCC should investigate.

While the FCC, the tech press, and a few other entities are paying attention to peering, it's a hard sell for consumers, since it's happening out of sight in data centers and requires an understanding of how the Internet works. Other than bad Netflix performance, consumers may never see the issue. And bad video streaming could be caused by any number of things, from bad Wi-Fi to a server problem at the content provider's end.

Plus, it can be even harder to understand why Netflix shouldn't pay and why that might be bad for consumers.

But I have a perfect example of why this matters. It starts at my cable box. Since the beginning of the year, my husband and I have had trouble watching Amazon or Hulu during prime time over our Time Warner Cable connection. The video streams would fail, and we'd get messages on our Blu-ray player telling us our connection was too slow. Yet a check on speedtest.net would reveal we were getting at least 30 Mbps, not the anemic 1.2 Mbps or 0.3 Mbps our player would show.

TWC offered a software upgrade to our modem and was actually quite helpful with regard to sending someone out to fix the problem. While I'm well aware that any cable modem is a shared service, it's ridiculous to think that someone paying for 50 Mbps would be content to get 1 Mbps. Time Warner agreed.

Unfortunately, the problems remained, and then my modem would just drop offline for anywhere from 15 minutes to as much as 2 hours. This was untenable, and the cable guys came back out to eventually replace the coaxial cable on my entire street. While the problems with both intermittent service and prime-time video playback are still occurring, they have lessened. But my husband and I decided to seek an alternative to Time Warner Cable.

We live in Austin, TX, which is often held up as an oasis of broadband competition, with an existing gigabit network provided by Grande Communications, a soon-to-be-gigabit network from AT&T, and another planned gigabit network from Google. But none of those are available in my Austin neighborhood. So our choice was TWC or AT&T's U-verse with 24 Mbps down and 3 Mbps up.

AT&T was cheaper, but it is also having an interconnection fight with Netflix at the moment, leading Netflix to say that AT&T's U-verse speeds are slower than DSL. Since we watch a lot of Netflix and other Internet video services (we don't usually have cable in our house because we don't watch much TV), AT&T would only fix one of our problems.

And what if Netflix hadn't signed a peering agreement with AT&T, but Amazon had? What if TWC had a deal with Netflix but not Hulu? Then here I am: a consumer who pays for the fastest broadband speeds available getting high-quality access to only some of the Internet. That's not a choice consumers should have to make. When it comes to both peering and the network neutrality rules that the FCC is considering, ISPs are seeking to use their access to my home to charge everyone, at every point in the network, to deliver content.

This will make them the gatekeepers to content and force consumers into lose-lose situations with regard to picking a broadband provider. The Internet isn't like cable TV. We shouldn't have to pick from two or perhaps three service providers who have the deals in place to deliver the content we want. Especially if the other alternative is to go with a provider whose service doesn't even work all the time.

As we've said before, the only broadband that matters is the broadband you have access to at your home. In most places, that's not a competitive market. And with fights over interconnection agreements and the possibility that network neutrality transforms into paying for priority access, consumers get screwed again. Take it from me. Having a bunch of bad choices is like having no choice at all.

Coming Events of Interest

Enterprise Apps World — June 17th-18th in London, England. EAW is a two-day show, co-hosted with Cloud World Forum, that will look at all the implications of going mobile in the workplace and how enterprise apps can help.

BroadcastAsia2014 — June 17th-20th at the Marina Bay Sands in Singapore. BroadcastAsia, back for its 19th year, continues to serve as the area's leading platform where professionals gather to network, exchange business ideas, gather market information and source for the latest products and solutions.

Silicon Valley Innovation Summit — July 29th-30th in Mountain View, CA. AlwaysOn's 12th annual SVIS is a two-day executive gathering that highlights the significant economic, political, and commercial trends affecting the global technology industries. SVIS features the most innovative companies, eminent technologists, influential investors, and journalists in keynote presentations, panel debates, and private company CEO showcases.

International Conference on Internet and Distributed Computing Systems — September 22nd in Calabria, Italy. IDCS 2014 is the sixth conference in its series to promote research in diverse fields related to Internet and distributed computing systems. The emergence of the web as a ubiquitous platform for innovations has laid the foundation for the rapid growth of the Internet.

CLOUD DEVELOPERS SUMMIT & EXPO 2014 — October 1st-2nd in Austin, TX. CDSE:2014 will feature co-located instructional workshops and conference sessions on six tracks facilitated by more than one-hundred industry leading speakers and world-class technical trainers.

International Conference on Cloud Computing Research & Innovation — October 29th-30th in Singapore. ICCRI:2014 covers a wide range of research interests and innovative applications in cloud computing and related topics. The unique mix of R&D, end-user, and industry audience members promises interesting discussion, networking, and business opportunities in translational research & development. 

PDCAT 2014 — December 9th-11th in Hong Kong. The 16th International Conference on Parallel and Distributed Computing, Applications and Technologies (PDCAT 2014) is a major forum for scientists, engineers, and practitioners throughout the world to present their latest research, results, ideas, developments and applications in all areas of parallel and distributed computing.
