Distributed Computing Industry
Weekly Newsletter


August 27, 2012
Volume XL, Issue 9


CSC Leasing Sponsors CLOUD COMPUTING WEST 2012

The DCIA and CCA proudly announce that CSC Leasing has signed on as a sponsor of the CLOUD COMPUTING WEST 2012 (CCW:2012) business leadership summit taking place November 8th-9th in Santa Monica, CA.

CSC Leasing was founded in 1986 as an independent lessor of technology equipment. It provides custom lease solutions to meet the ever-changing business and competitive needs of growing organizations — from startups to Fortune 500 companies.

CSC's clients include companies across a wide array of industries, municipalities, and non-profits located throughout the United States. Its corporate offices are located in Richmond, VA and it has branch offices in Virginia, North Carolina, Colorado, and the Metro DC area.

Its principals bring a wealth of experience to the business and are actively involved with all transactions. Its leasing team is knowledgeable in both technology and finance. This, coupled with its hands-on personal approach, distinguishes CSC in the leasing industry and with its clients.

CSC Leasing is keenly focused on the needs of its clients and vendor partners. Its approach is efficient, expedient, and results-oriented. CSC works closely with customers and their reseller partners to structure leases that will fit their budget, cash flow, and technology requirements.

Since its founding, CSC has developed extensive industry knowledge that has helped shape viable leasing options for its clients. This has earned CSC the respect and business of clients in a significant number of key industry sectors.

CSC's primary concentration of business is the leasing of technology products in the areas of Enterprise Servers and Storage; Laptops, Desktops, and Printers; Routers/Switches & Telecommunications; Midrange Systems; Barcode & Retail Point-of-Sale Systems; Medical Equipment; and Visualization & Audio Systems.

CCW:2012 will feature three co-located conferences focusing on the impact of cloud-based solutions in the industry's fastest-moving and most strategically important areas: entertainment, broadband, and venture financing.

CSC Leasing will participate in a panel discussion at the Investing in the Cloud conference within CCW:2012.

CCW:2012 registration enables delegates to participate in any session of the three conferences being presented at CCW:2012 — ENTERTAINMENT CONTENT DELIVERY, NETWORK INFRASTRUCTURE, and INVESTING IN THE CLOUD.

At the end of the first full day of co-located conferences, attendees will be transported from the conference hotel in Santa Monica to Marina del Rey Harbor, where they will board a yacht for a sunset cruise and networking reception.

So register today to attend CCW:2012 and don't forget to add the Sunset Cruise to your conference registration. Registration to attend CCW:2012 includes access to all sessions, central exhibit hall with networking functions, luncheon, refreshment breaks, and conference materials. Early-bird registrations save $200.

Cloud Computing Goes Mainstream

Excerpted from Baseline Report by Dennis McCafferty

More than eight-in-ten companies currently deploy cloud computing solutions or services, and more than 50 percent plan to increase their cloud investment by 10 percent or more in 2012, according to the latest Annual Trends in Cloud Computing survey from CompTIA.

These numbers reflect a notable rise in mainstream acceptance of cloud computing, as 85 percent of survey respondents now feel more positive about the cloud than they did last year. And information technology (IT) departments are having to adjust by adding employees with cloud-friendly skill sets.

"Internal IT departments are on the edge of major transformation," says Seth Robinson, Director of Technology Analysis for CompTIA. "The option for cloud solutions is opening the doors for IT professionals to perform new tasks — or, at least, perform old tasks in new ways. It's also creating new job roles and functions to more tightly integrate IT teams with lines of business."

CompTIA's findings are based on two separate online surveys of 500 US IT and business professionals involved in IT decision making, as well as 400 IT firms.

Report from CEO Marty Lafferty

"Big Data," "Internet TV," and "Cloud Computing" are three of the fastest-moving technologies identified in Gartner's 2012 Hype Cycle for Emerging Technologies.

Gartner analysts said that these technologies have moved noticeably along the Hype Cycle since 2011, while "Consumerization" is now expected to reach the Plateau of Productivity in two to five years, down from five to 10 years in 2011.

The Hype Cycle has been used by Gartner since 1995 to highlight the common pattern of over-enthusiasm, disillusionment, and eventual realism that accompanies each new technology and innovation.

The Hype Cycle for Emerging Technologies report is the longest-running annual Hype Cycle, providing a cross-industry perspective on the technologies and trends that senior executives, CIOs, strategists, innovators, business developers, and technology planners should consider in developing emerging-technology portfolios.

Although the Hype Cycle presents technologies individually, Gartner encourages enterprises to consider the technologies in sets or groupings, because so many new capabilities and trends involve multiple technologies working together.

Often, one or two technologies that are not quite ready can limit the true potential of what is possible. Gartner refers to these technologies as "tipping point technologies" because, once they mature, the scenario can come together from a technology perspective.

The technology industry has long talked about scenarios in which any service or function is available on any device, at anytime and anywhere. This scenario is being fueled by the consumerization trend that is making it acceptable for enterprise employees to bring their own personal devices into the work environment.

The technologies and trends featured on this Hype Cycle that are part of this scenario include bring-your-own-device (BYOD), hosted virtual desktops, HTML5, and the various forms of cloud computing. Although all these technologies and trends need to mature for the scenario to become the norm, HTML5 and hosted virtual desktops are particularly strong tipping point candidates.

A world in which things are smart and connected to the Internet has been in the works for more than a decade. Once connected and made smart, things will help people in every facet of their consumer, citizen, and employee lives.

This broad scenario portrays a world in which analytic insight and computing power are nearly infinite and cost-effectively scalable. Once enterprises gain access to these resources, many improved capabilities are possible, such as better understanding customers or better fraud reduction.

The enabling technologies and trends on the 2012 Hype Cycle include quantum computing, the various forms of cloud computing, big data, complex-event processing, social analytics, in-memory database management systems, in-memory analytics, text analytics, and predictive analytics.

The tipping point technologies that will make this scenario accessible to enterprises, governments and consumers include cloud computing, big data and in-memory database management systems.

This scenario describes a world in which people interact a lot more naturally with technology. The technologies on the Hype Cycle that make this possible include human augmentation, volumetric and holographic displays, automatic content recognition, natural-language question answering, speech-to-speech translation, big data, gamification, augmented reality, cloud computing, near field communication (NFC), gesture control, virtual worlds, biometric authentication methods, and speech recognition.

Humans are social by nature, which drives a need to share — often publicly. This creates a future in which the "voice of customers" is stored somewhere in the cloud and can be accessed and analyzed to provide better insight into them.

The 2012 Hype Cycle features the following enabling technologies and trends: automatic content recognition, crowdsourcing, big data, social analytics, activity streams, cloud computing, audio mining/speech analytics and text analytics. Gartner believes that the tipping point technologies are privacy backlash and big data.

Share wisely, and take care.

July Saw Largest Monthly Expansion of IT Force in Three Years

Excerpted from CIO Insight Report by Don Reisinger

In the IT world, job data is a solid indicator of how companies are spending money, using resources, and perhaps most importantly, thinking about their plans for the future.

Large expansions in the IT workforce typically indicate good things for CIOs and other professionals, while bad news, like job cuts, tells us trouble is ahead.

In the latest data from the US Department of Labor Bureau of Labor Statistics, it appears good times could be back again.

In fact, as Foote Partners discovered in analyzing the information, the US IT labor force expanded more in July than it had in more than three years.

For CIOs who have hired new people, that might not come as much of a surprise. But for CIOs looking to bring on more IT professionals, it should be considered with caution: competitors are also looking to hire qualified candidates.

So, it's better to make that move now than to wait. Here CIO Insight takes a look at the IT sector's job performance in July, and other key data extracted from the BLS report by Foote Partners.

Dell Makes Moves to Survive in Cloud-Centric World

Excerpted from NY Times Report by Quentin Hardy

Despite headwinds in its core business, Dell is trying hard to move into the new world of corporate computing.

On Tuesday, the company said that Marius Haas, a well-regarded executive with a noted history at Hewlett-Packard (HP), would take over Dell's Enterprise Solutions business, which sells the servers, network, and storage equipment for big corporate data centers and cloud computing systems.

It is a future that can't come fast enough: Mr. Haas's position was announced along with Dell's second-fiscal-quarter earnings. Dell had net income of $732 million, or 42 cents a share, on revenue of $14.5 billion.

Net income was 18 percent below year-earlier levels, owing largely to a collapse in demand for personal computers and laptops. The data center and cloud equipment businesses looked relatively strong.

On a nonstandard accounting basis, Dell had per-share earnings of 50 cents a share. While this was higher than the 45 cents a share projected by analysts surveyed by Thomson Reuters, Dell's stock traded lower after the markets closed because Dell had also projected that third-quarter revenue would be 2 percent to 5 percent below second-quarter levels.

The revenue drop seems mostly limited to what the company called "challenging" demand for PCs, laptops, and peripheral devices, particularly by consumers. The Enterprise Solutions business Mr. Haas is heading grew 6 percent over the quarter, Dell said; it looks strong in the current quarter as well and is expected to bring in more revenue in the future.

Michael Dell, the founder, has already said he wants to move Dell into providing comprehensive solutions for big data centers. Mr. Haas will be an important part of making this work, in a world seemingly dominated by Google, Apple, and Microsoft.

In July, Dell said it would buy Quest Software, which makes software for things like data backup and data center management, for $2.4 billion. Deals like that take Dell further into enterprise software and data center management, an important way for Mr. Haas to also sell servers.

At HP, Mr. Haas, a Dutch national, led the fast-growing networking division, and before that he played an important role under Mark Hurd, then the chief executive, as the head of strategy and corporate development.

"He was one of Hurd's golden boys," said a former HP executive, who asked not to be named in order to maintain relations with many of the tech companies. "He was well liked by the board, and people thought he'd play a role in top management." Mr. Haas left HP in 2011, after Mr. Hurd was replaced by Leo Apotheker. Before coming to Dell, he was at Kohlberg Kravis Roberts, looking for investments in technology for that company.

As the head of strategy at HP, Mr. Haas learned about the company's many enterprise computing businesses, and he probably developed relationships with senior corporate executives at a lot of big companies. He was also closely involved in some of HP's biggest acquisitions, including the purchase of Electronic Data Systems for $13.9 billion in 2008.

That may not be something Mr. Haas wants to talk about. Last month, HP announced that it would take a charge of $8 billion against the EDS acquisition, which never yielded the high-value growth HP had hoped for.

Cloud Computing Services Growth Is Off the Charts

Excerpted from CRN Report by Steven Burke

Customers are embracing recurring revenue cloud computing services at a breakneck pace.

That was the word from hundreds of partners and vendor executives attending the three-day XChange 2012 conference billed as Channel without Limits at the Gaylord Texan this week in Dallas, TX.

"Cloud services' demand is exploding," said Andrew Pryfogle, Senior Vice president and General Manager of cloud services for Intelisys, a master agency distributor of cloud and telecom services working with solution providers. "Customers are sick of writing checks and continuing to invest in on-premise boxes and professional services under the old information technology (IT) model."

Intelisys' solution providers are seeing as many as five deals a day driven by customers frustrated with the high cost and complexity of antiquated on-premise IT solutions, said Pryfogle.

One example, said Pryfogle, is SolutionSet, a digital consultancy in San Francisco, CA with 1,500 users at 17 locations, which moved from an on-premise call manager product to a cloud-hosted VoIP solution from iCore. The move produced big cost savings for the customer and a robust recurring revenue stream for Intelisys partner FusionStorm, a national solution provider headquartered in San Francisco.

"The economics of the old solution didn't make sense anymore," said Pryfogle. "They were growing too fast, and it was too difficult to maintain the on-premise solution. They wanted to do it in the cloud."

Axciss Solutions, a Groveland, FL solution provider, has in the last year driven its recurring revenue cloud computing services business from 20 percent of sales to 85 percent of sales with about 1,500 customer seats now operating completely in the cloud, said Axciss CEO Michael Coburn.

Even with that astronomical recurring revenue growth, Axciss' Coburn said he sees the recurring revenue cloud computing services opportunity as just in its "infancy." He sees his company's recurring revenue cloud computing services opportunity growing exponentially.

"This is just starting to take off," said Coburn. "You are going to see 70 percent of all corporate data in the cloud in the next five years. Solution providers that don't get on board are going to go away."

ProVisionIT, an Orlando, FL solution provider, is moving all of its managed services customers to a cloud computing services model, said ProVisionIT CEO Josh Phillips. He said the company's goal is to move from just 20 percent of its business coming from cloud computing services to 100 percent within three years.

"One hundred percent of our focus right now is bringing on board new cloud computing customers," he said. "When a customer is at a point where they need to purchase new hardware that is when we move them to the cloud. We are moving those potential capital expenditure equipment sales to a cloud-based recurring revenue operating expense model."

Vigilant Technologies, a Chandler, AZ solution provider, is aiming to move from 50 percent of its business coming from cloud-based services to as much as 80 percent next year, said Vigilant Technologies CTO Carl Ingram. He sees a huge cloud computing services opportunity coming from Microsoft's Windows 8 and the Microsoft Surface Tablet. "That could double our revenue from $2 million to $4 or $5 million over the next two years," he said.

Alcala Consulting, a Los Angeles, CA cloud computing services provider, has seen a number of its small business customers shut down their local offices and move to a virtual, cloud computing-based model with employees working at home.

"The cloud is a concept customers are grasping," said Alcala CEO Marco Alcala. "They want to know how quickly they can move to the cloud."

Amazon Glacier Is Introduced for Data Archiving in the Cloud

Excerpted from TechCrunch Report by Steve O'Hear

Amazon Web Services (AWS) has extended its cloud-storage offerings with the launch of a new product: Amazon Glacier, described as a "secure, reliable and extremely low-cost storage solution" designed for data archiving and backup.

The e-tailer is pitching it for use-cases where the data is infrequently accessed, such as media archives, financial and healthcare records, raw genomic sequence data, long-term database backups, and data that must be retained for regulatory compliance.

The latter is an area where, thanks to legal compliance requirements, there is a massive market, and it's not hard to see how Glacier could appeal to small to midsize enterprises (SMEs) wishing to reduce the cost of meeting those requirements. Typically, says Amazon, companies currently rely on in-house legacy tape systems for this type of archiving.

To that end, Amazon claims that Amazon Glacier offers "significant savings" compared to on-premise solutions, with costs starting from as little as $0.01 per GB per month.

Alyssa Henry, Vice President of AWS Storage Services, pushes the case for using Glacier, echoing the promise of the cloud in general:

"Amazon Glacier changes the game for companies requiring archiving and backup solutions because you pay nothing upfront, pay a very low price for storage, are able to scale up and down whenever needed, and AWS handles all of the operational heavy lifting required to do data retention well."

Amazon Glacier is available in the US-East (N VA), US-West (N CA and OR), Asia Pacific (Tokyo) and EU-West (Ireland) regions.

Google Compute Engine Rocks the Cloud

Excerpted from InfoWorld Report by Peter Wayner

You're sitting around. You have some computing to do. Ten years ago, you would ask your boss to buy a rack or two of computers to churn through the data. Today, you just call up the cloud and rent the systems by the minute. This is the market that Google is now chasing by packaging up time on its racks of machines and calling it the Google Compute Engine.

Google took its sweet time entering this corner of the cloud. While Amazon, Rackspace, and others started off with pay-as-you-go Linux boxes and other "infrastructure" services, Google began with the Google App Engine, a nice stack of Python that held your hand and did much of the work for you.

Now Google is heading in the more general direction and renting raw machines too. The standard distro is Ubuntu 12.04, but CentOS instances are also available. And you can store away your own custom image once you configure it.

Why rent machines from Google instead of Amazon or Rackspace or some other IaaS provider? Google claims its raw machines are cheaper. This is a bit hard to determine with any precision because not everyone is selling the same thing despite claims of computing becoming a commodity. Google sells its machines by the Google Compute Engine Unit (GCEU), which it estimates is about a 1GHz to 1.2GHz Opteron from 2007.

All of Google's machines rent for 5.3 cents per GCEU per hour, but that isn't really what you pay. The smallest machine you can rent from Google today, the so-called n1-standard-1-d, goes for 14.5 cents per hour. That's because the n1-standard-1-d — which comes with one virtual core, 3.75GB of RAM, and 420GB of disk space — is equivalent to 2.75 GCEUs, according to Google. You can get machines with two, four, and eight virtual cores all at the same price per GCEU.
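
The per-GCEU figure can be checked with a few lines of arithmetic; note that the assumption that GCEU ratings scale linearly with the number of virtual cores is an illustration here, not something Google states.

```python
# Check the per-GCEU price implied by the figures quoted above.
hourly_price = 0.145   # $/hour for the n1-standard-1-d instance
gceu_rating = 2.75     # GCEUs Google assigns to that instance

print(round(hourly_price / gceu_rating, 4))   # ~0.0527, i.e. about 5.3 cents

# Illustration only: if GCEU ratings scaled linearly with virtual cores,
# the larger instances would price out roughly like this.
for cores in (2, 4, 8):
    print(cores, "cores:", round(0.053 * gceu_rating * cores, 3), "$/hour")
```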

These numbers are bound to evolve soon according to a member of the Google Compute Engine team. The product is said to be in "limited preview," and as it grows more polished, the company will probably experiment with adding more options with more or less power.

Is 5.3 cents per GCEU a good deal? It depends upon what you want to do with your machine. Rackspace prices its machines by the amount of RAM you get. It has stopped selling the anemic 256MB RAM VMs, but rents its 512MB boxes at only 2.2 cents per hour or $16.06 per month. If you want a machine with 4GB from Rackspace, it will cost you 24 cents each hour, about $175 per month.

Is that a better deal? If your computation doesn't need the RAM, a basic instance from Rackspace is much cheaper. Even if the CPU might not be as powerful, you would be better off with a cheaper machine. But I suspect many will need fatter machines because modern operating systems suck up RAM like a blue whale sucks up krill.

After you get past the differences over RAM and disk space, the Google machines are meant to be essentially the same as the machines from Amazon or Rackspace — or even the machines you might buy on your own. Like Amazon and Rackspace, Google makes it easy to start off with Ubuntu; after that, you're talking to Ubuntu, not Google's code. There are differences in the startup and shutdown mechanisms, but these aren't substantial. More substantial is Google's inability to snapshot persistent storage, as you can in Amazon, but Google promises this is coming soon.

If you're migrating from Amazon or Rackspace, you'll need to rewrite your scripts because the APIs are full of linguistic differences, even if they offer most of the same features.

Another big part of the equation is bandwidth. Google doesn't charge for ingress, but it has a fairly complicated model for egress. Shipping data to a machine in the same zone in the same region is free, but shipping it to a different zone in the same region is one penny per gigabyte. Then the cost for letting the data "egress" to the Internet depends upon whether it's going to the Americas/EMEA or the APAC. For what it's worth, egressing the data to some website visitor from the APAC is almost twice as expensive as egressing it to someone in the United States. The costs are set on a sliding scale with discounts for big egressers.
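
As a rough sketch of how such a model composes, the estimator below uses the one rate the article cites (a penny per gigabyte between zones in the same region); the internet-egress rates are hypothetical placeholders rather than Google's published prices, and the sliding-scale discounts are ignored.

```python
# Hedged sketch of an egress-cost estimator for the model described above.
# Only the $0.01/GB cross-zone rate comes from the article; the per-destination
# internet rates are HYPOTHETICAL placeholders, and volume discounts are ignored.
CROSS_ZONE_PER_GB = 0.01
HYPOTHETICAL_INTERNET_RATES = {
    "americas_emea": 0.12,   # placeholder $/GB
    "apac": 0.21,            # placeholder $/GB (article: roughly twice as costly)
}

def egress_cost(gigabytes: float, destination: str) -> float:
    """Estimate the cost of data leaving a Compute Engine instance."""
    if destination == "same_zone":
        return 0.0                                   # free within a zone
    if destination == "cross_zone":
        return gigabytes * CROSS_ZONE_PER_GB         # same region, different zone
    return gigabytes * HYPOTHETICAL_INTERNET_RATES[destination]

print(egress_cost(500, "cross_zone"))   # 5.0 dollars
```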

While the complexity of the pricing table will send the purchasing managers to their calculators, it's interesting what Google is trying to do with this scheme. By making intermachine communications free, Google is no doubt banking on people using the racks in the same zones to actually work together on solving problems. In other words, Google is giving us the tools for stitching together our own supercomputers.

In general, Google is doing a good job of making some of the dangers of the cloud apparent. Like compute instances in Amazon, Rackspace, and other IaaS clouds, each Google instance comes with "ephemeral disk," a name that makes the storage sound more fragile than it really is. Keep in mind that the file system that comes with your cloud computer — be it on Amazon, Rackspace, or Google — is not backed up in any way unless you code some backup routines yourself. You can run MySQL on your cloud box, but the database won't survive the failure of your machine, so you better find a way to keep a copy somewhere else too.

Calling the storage "ephemeral" makes it obvious that the data might go elsewhere during a real failure or even a "maintenance window." If anything, the name might overstate the dangers, but it all becomes a gamble of some form or another. The solution is to purchase separate "persistent disk" space and store your information there. Or you might want to put it in Google Cloud SQL, the BigQuery data store, or one of the other services offered by Google.
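
One pragmatic response to the ephemeral disk is to dump the database on a schedule and copy the dump somewhere durable, for instance to a Google Cloud Storage bucket via gsutil. A minimal sketch, with hypothetical database credentials and bucket name:

```python
# Minimal sketch: dump MySQL and copy the dump off the ephemeral disk.
# Assumes mysqldump and gsutil are installed and configured; the database
# credentials and bucket name are hypothetical.
import datetime
import subprocess

stamp = datetime.datetime.utcnow().strftime("%Y%m%d-%H%M")
dump_file = f"/tmp/mydb-{stamp}.sql.gz"

# Dump the database and compress it in one pipeline.
subprocess.run(
    f"mysqldump --user=backup --password=secret mydb | gzip > {dump_file}",
    shell=True, check=True,
)

# Copy the dump to durable object storage outside the instance.
subprocess.run(["gsutil", "cp", dump_file, "gs://my-db-backups/"], check=True)
```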

If words like "ephemeral" still sound off-putting, the documentation says Google will negotiate service-level agreements for enterprise customers that begin with promises of 99.95 percent uptime.

Google is also making the dangers of location apparent. One section of the documentation addresses just how you should design your architecture around potential problems. The various zones and regions may go down from time to time, and it's your responsibility to plan ahead for these issues. Google makes the costs of shipping the data transparent, so you can come to intelligent decisions about where to locate your servers to get the redundancy you need.

Google Compute Engine is just one part of the Google APIs portal, a grand collection of 46 services. These include access to many of Google's biggest databases such as Books, Maps, and Places, as well as to some of Google's lesser-known products like the Web Fonts Developer API.

I suspect many developers will be most interested in using Google Compute Engine when they want to poll these Google databases fairly often. While I don't think you're guaranteed to be in the same zone as the service you want, you're still closer than when traveling across the generic Web. Google offers "courtesy" limits to many of these APIs to help out new developers, but you will end up paying for the best services if you use them extensively. These prices are changing frequently as Google and the developers try to figure out what they're really worth.

Google says some experimenters are already pairing the Compute Engine with the App Engine to handle expensive computations. In one of the experiments, Google worked with a biology lab to analyze DNA. The data was uploaded through an App Engine front end, then handed over to a block of Compute Engine cores to do the work. The Compute Engine machines were started up when the data arrived, and they were shut down and put back in the pool as soon as their work was done.

You can start and stop your machines by hand and track them with the Web portal, but I suspect many will end up using the command-line tool. Google distributes some Python code that handles most of the negotiations for reserving, starting up, and stopping servers. While the Web portal is OK for small jobs, the ability to easily write scripts makes the command-line version more useful.
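
Google's bundled tool is one option; the same start-and-stop housekeeping can also be scripted against the Compute Engine REST API, for example with the Google API Python client. A sketch under assumed project, zone, and instance names (instance creation is omitted for brevity):

```python
# Hedged sketch: scripting instance housekeeping against the Compute Engine
# API with the Google API Python client. Project, zone, and instance names
# are hypothetical; authentication uses application-default credentials.
from googleapiclient.discovery import build

compute = build("compute", "v1")
PROJECT, ZONE = "my-project", "us-central1-a"

# List the instances currently running in one zone.
result = compute.instances().list(project=PROJECT, zone=ZONE).execute()
for inst in result.get("items", []):
    print(inst["name"], inst["status"])

# Tear an instance down once its batch work is finished.
compute.instances().delete(project=PROJECT, zone=ZONE, instance="worker-0042").execute()
```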

The command-line tool is also more powerful. You can create instances through the Web GUI, but there's a limit to how far you can go. I couldn't figure out how to log in with SSH through the portal, so I switched back to the command line. Perhaps Google should check out some of the HTML5-based tools like FireSSH that integrate SSH with a web page. The only real challenge is finding a good way to hold the SSH keys.

One of the more interesting features is the way to bind metadata to each computer. Google is clearly intending for people to write their own automatic routines for bringing machines online and off. If you want your software to be self-aware, it can look at the metadata for each instance, and the instance can also read the metadata about itself. This lets you pass in configuration information so that each new machine is not born with a clean slate.
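
For instance, a newly booted machine can read a custom metadata key and decide what role to play. The sketch below uses the current form of the metadata server endpoint, and the "role" attribute is a hypothetical example.

```python
# Hedged sketch: an instance reading its own custom metadata at boot so it
# does not start with a blank slate. The endpoint shown is the current v1
# metadata server path; the "role" attribute is a hypothetical example.
import urllib.request

URL = "http://metadata.google.internal/computeMetadata/v1/instance/attributes/role"
req = urllib.request.Request(URL, headers={"Metadata-Flavor": "Google"})
role = urllib.request.urlopen(req).read().decode()

if role == "worker":
    print("joining the compute pool")
else:
    print("starting as a frontend")
```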

If you want to build your own collection of Linux boxes, Google Compute Engine offers a nice, generic way to buy servers at what — depending on the size of compute instance you need — can be a great price. The most attractive feature will probably be the proximity to the other parts of the Google infrastructure. Google is as much a data vendor as an advertising company, and the collection of APIs is growing nicely. I can see how some companies will want to run their computational jobs in the Google cloud just to be closer to these services.

Highwinds Incorporates Solid State Networks Technology into Game Delivery

Excerpted from TMCNet Report by Madhubanti Rudra

The Winter Park, FL based provider of market leading content delivery network (CDN) solutions has reportedly upgraded its game delivery network, Highwinds GDN, which was launched last year as the first CDN specifically tuned for game delivery.

In its expanded version, Highwinds GDN includes an integrated download manager, launcher, patcher, and other new products and services. The expansion of its GDN suite is intended to benefit online gaming companies, Highwinds said.

The company reportedly extended its partnership with Solid State Networks to make this expansion possible. Both companies worked together to help meet the needs of game operators at the highest level by identifying and developing new capabilities.

In its enhanced version, Highwinds GDN includes Solid State's sophisticated game management product suite, improves player onboarding, simplifies game patching, allows publishers more opportunity to engage with players, supports multi-CDN deployment, load balancing and optional peer-to-peer (P2P) assisted delivery, provides for deep, robust, real-time analytics, and much more.

The new improvements are expected to have a meaningful impact on the business operations of the gaming companies, addressing the unique challenges of delivering online games to players around the world.

Solid State Networks has been developing game delivery solutions since 2005, and its portfolio includes the leading commercially available download manager, game patcher, and game launcher solution. The company provided Highwinds with some exclusive technologies within the gaming vertical.

"We introduced Highwinds GDN a year ago as the first CDN specifically tuned for game delivery, and since that time, our commitment to the online gaming community hasn't wavered. We listen to the needs of our customers, and we continue to develop innovative solutions for them," Founder and CEO of Highwinds Steve Miller said.

"Our partnership with Solid State is rooted in both the spirit and the strength of collaboration, bringing together leading technology and intelligence from both companies to benefit our mutual customers. The realization of these efforts and this partnership is a comprehensive go-to-market solution for game developers and publishers,"

In October of last year, Highwinds announced the expansion of its managed services.

Telefonica Digital Pursues Apps & Online Business Services

Excerpted from Light Reading Report by Ray Le Maistre

The premise of an almost autonomous "new telco" division within a traditional network operator is an exciting one for the industry, suggesting as it does all sorts of startup-type behavior and "new world" mentalities.

That's all very well. But can such entities generate any revenues?

Yes (or "Si"), says the team at Telefonica Digital, formed in September 2011 by Telefonica SA to lead the group into the promised land of digital applications and online business models. (See Telefonica's Looking Trendy and Telefonica Holds Key to Digital Model.)

The Spanish giant has given the digital division not only the opportunity to act like a separate company but also to develop and make money from a number of existing and developing service lines, namely advertising, cloud, security, M2M, digital content distribution, eHealth, and financial services, as well as Tuenti (Spanish social networking service), the Terra Internet services unit and Media Networks (TV distribution in Latin America).

As a result, Telefonica Digital was, at launch, a business with annual revenues of $2.95 billion in 2011.

That's a great place to start, but Digital's CEO Matthew Key is tasked with generating annual sales growth of 20 percent each year up to 2015, by which time Digital is expected to be generating revenues of about $6.15 billion.
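
That target is consistent with straightforward compounding from the 2011 base, as a quick check shows:

```python
# Quick check: 20 percent compound annual growth from the 2011 base.
base_2011 = 2.95                         # $ billions
print(round(base_2011 * 1.20 ** 4, 2))   # ~6.12, in line with the ~$6.15B goal
```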

Key expects big things from some of those service lines: Up to $1.84 billion in annual revenues from content distribution in 2015 and up to $983 million from M2M in the same year.

And by that time there should be revenues from new developments, such as Firefox-based devices and from services currently in development or yet to be dreamed up in-house, or by the many startups that Telefonica Digital's Wayra unit is courting and financing.

There is also hope that such developments will also impact the Telefonica group at large by improving revenue per customer, reducing churn (through the development of sticky applications) and improving market share. "The more products customers use, the lower churn will be," stated Key at a recent media briefing in London. "We need to help Telefonica retain its relationships with its customers."

Key is equally clear about what Telefonica Digital will not be. "We don't want to be, or can't be, the next Facebook or Google, and we don't want to enter the hardware market. We're all about creating products that build on our core strengths," added the CEO.

Dallas HD Films Is Excited to Team up with Ignite Technologies

Excerpted from Dallas HD Films Report

Need to stream a live message from your CEO to all your employees? Or maybe provide your employees with a "YouTube" experience where everyone can rate and share videos? If you want to make sure your new HD video is viewed by all your employees without taking down the corporate network — Get Ignite.

The new partnership provides Dallas HD Films customers with various solutions to maximize their business video experience. Deliver HD video directly to employee desktops, smartphones, or tablets. Report on who viewed the video and for how long, or attach a poll to get fast feedback.

Click here to find out what Ignite can do for you.

Internet Archive Turns up the Speed with BitTorrent 

Excerpted from InfoToday Report by Nancy Herther

On August 7th, the Internet Archive gave peer-to-peer (P2P) file sharing a major boost by making more than 1 million books, movies, and other media immediately available as "torrents" from BitTorrent instead of solely relying on Hypertext Transfer Protocol (HTTP) for downloading content.

By combining downloads from two of the Internet Archive's servers with distribution among other users requesting the same material, the Archive can deliver content faster regardless of a user's mode of Internet connection.

Eric Klinker, BitTorrent CEO, noted that "BitTorrent is now the fastest way to download complete items from the Archive, because the BitTorrent client downloads simultaneously from two different Archive servers located in two different datacenters, and from other Archive users who have downloaded these torrents."

The Internet Archive reflects the deep commitment of its Founder, Brewster Kahle, who has built it in true collaboration with libraries, volunteers, and foundations, as well as with his own investments. The Archive was founded in 1996, when it began archiving webpages. Today, the Archive includes a wide range of component archives, each focused on specific media or goals yet sharing the same overall mission.

The heart of the Archive is the Ebook and Texts Archive, which includes more than 3.5 million titles from Google Book Search, collaborations with libraries, and work at its 23 scanning centers across the globe. Collections can be browsed or searched, and the main access point is the Open Library — "one web page for every book" — which provides basic book metadata gleaned from the Library of Congress, Amazon, or other sources, along with links to digital/ebook versions or other information. Funded by both Kahle's foundation and the California State Library, the Open Library aims to create "an open, editable library catalog, building towards a web page for every book ever published," and currently includes more than 20 million records and 1 million free ebook titles.

The Moving Images Archive includes nearly 700,000 free movies, films, and videos ranging from "classic full-length films, to daily alternative news broadcasts, to cartoons and concerts." Many can be downloaded as well. In the past, there were frequent access issues due to heavy traffic, browser hang-ups caused by insufficient user hard-disk or temporary disk space, and incompatibilities among user computers, browsers, and all of the standards that exist for various players. With BitTorrent, I was able to download four titles quickly without any problem.

BitTorrent is based on the BitTorrent Protocol, invented by company co-founder Bram Cohen in 2001, providing an efficient, distributed way of delivering files. The company doesn't host content but, instead, acts as a form of switching station to move content quickly but carefully using new nonlinear models of the process. (A torrent holds information about the location of different pieces of the target file.)

If you've ever been frustrated by slow downloads that can freeze your computer, note that BitTorrent downloads actually get faster as more computers join in requesting a file. Instead of downloading a program in a single one-to-one stream from one source to each requester, BitTorrent sends pieces of the program to each of the requesting computers (or peers), then distributes those pieces from each peer so that every computer receives the entire completed program.
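
For readers who prefer to script such downloads rather than use the desktop client, here is a minimal sketch with the Python bindings for libtorrent; the .torrent file name is a hypothetical example, and API details vary somewhat between libtorrent releases.

```python
# Hedged sketch: downloading an Internet Archive item from its .torrent file
# with the python-libtorrent bindings. The file name is hypothetical and the
# API varies a little between libtorrent releases.
import time
import libtorrent as lt

ses = lt.session()
info = lt.torrent_info("archive-item.torrent")   # fetched from archive.org first
handle = ses.add_torrent({"ti": info, "save_path": "./downloads"})

while not handle.status().is_seeding:
    s = handle.status()
    print(f"{s.progress * 100:.1f}% complete from {s.num_peers} peers")
    time.sleep(5)

print("done:", info.name())
```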

Anyone can easily and quickly download the BitTorrent program, which then appears as an empty screen (much like a newly opened reference manager) until you begin to populate it with download requests. BitTorrent is fast, efficient, free, and comes without irritating ads or pop-ups. With more than 150 million users, it has become a global standard for delivering large files over the Internet, being used by companies such as Wikipedia, Twitter, and Facebook, among others.

When it announced the BitTorrent relationship on August 7th, the Archive began with 1.5 million torrents (nearly a petabyte of data) including live music concerts, the Prelinger movie collection, the LibriVox audio book collection, feature films, old time radio shows, more than 1.2 million books, and "all new uploads from patrons who are into community collections."

The Internet Archive is also hosting all of the original content for which it makes torrents available. Electronic Frontier Foundation's John Gilmore commented in the Archive's press release that, "I supported the original creation of BitTorrent because I believe in building technology to make it easy for communities to share what they have. The Archive is helping people to understand that BitTorrent is a great way to get and share large files that are permanently available from libraries like the Internet Archive."

In keeping with its open culture, the Archive is posting data on downloads and torrents at its site. Data released so far shows a strong interest and success in this new approach to sharing and accessing these files. Users are finding much faster file transfers of even large multimedia files, such as movies.

As a 501(c)(3) nonprofit, the Archive is using torrents to apply state-of-the-art technology to its mission of building an Internet library to offer "permanent access for researchers, historians, scholars, people with disabilities, and the general public to historical collections that exist in digital format."

In the past few years, the library digitization story has resembled the Tortoise and the Hare story. Google is the agile and well-financed Hare leaping ahead with its grand plans to digitize the world's literature in collaboration with library partners, who would gain digital copies and be spared the expense.

The Archive — the slow but sure Tortoise — is moving more slowly in a collaboration that requires that all participants share some of the costs of the process. With Google apparently now pulling back on its project — and many reports surfacing of how academic libraries have found themselves paying far more than they anticipated for their share of the cost of these services — the lowly Tortoise would seem to be moving into the lead.

"The Internet has put universal access to knowledge within our grasp. Now we need to put all of the world's literature online. This is easier to do than it might seem, if we resist the impulse to centralize and build only a few monolithic libraries," Kahle has written. "We need lots of publishers, booksellers, authors, and readers — and lots of libraries. If many actors work together, we can have a robust, distributed publishing and library system, possibly resembling the World Wide Web."

Lee Rainie, director of the Pew Research Center's Internet & American Life Project, believes that BitTorrent is a strong positive for the Archive and web searchers alike. "Everything in our data suggests that users will take advantage of and be grateful for such sharing," Rainie said.

Perhaps slow but steady will eventually win this race.

Cloud Control and File Sharing

Excerpted from CloudTweaks Report by Joseph Walker

A recent small business cloud computing survey from Microsoft found that a chief concern of potential small to midsize business (SMB) cloud customers is the security and privacy of their data. A full 70% of small businesses are concerned about where their data is stored. Just over half of all SMBs cite data privacy as a potential deal breaker for adopting cloud services. And only 36% of businesses think their data is as secure or more secure in the cloud than in their current on-premises solution.

Most data security and privacy concerns revolve around four general scenarios: 1) Hackers compromising data center servers that contain customer or proprietary information. 2) Hackers "sniffing" improperly secured network traffic. 3) Data center employees accessing (and possibly sharing) confidential information, especially within a corporate espionage or financial cyber crime context. 4) Employees losing improperly secured laptops or mobile devices with saved credentials for accessing cloud services.

Thankfully, simple and relatively inexpensive solutions exist for all of these concerns.

Local Data Encryption: Most cloud storage services offer end-to-end data encryption as a standard feature. Unfortunately, relying on a storage provider's encryption could still leave data vulnerable to data center employees or hackers who directly compromise the data center's servers. The simplest way for cloud storage customers to ensure data security is to locally encrypt files before uploading them to the cloud. Programs like BoxCryptor allow one-click encryption of individual files or folders.
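
The underlying idea is simple to demonstrate. The sketch below uses the Python cryptography package rather than any particular commercial product, with deliberately simplified key handling and a hypothetical file name.

```python
# Hedged sketch: encrypt a file locally before it ever leaves the machine.
# Uses the Python "cryptography" package; key handling is deliberately
# simplified, and the file name is hypothetical.
from cryptography.fernet import Fernet

key = Fernet.generate_key()            # keep this secret, never upload it
fernet = Fernet(key)

with open("quarterly-report.xlsx", "rb") as f:
    ciphertext = fernet.encrypt(f.read())

with open("quarterly-report.xlsx.enc", "wb") as f:
    f.write(ciphertext)                # only this encrypted copy goes to the cloud
```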

Encrypted Backup Services: For customers who rely on the cloud for automated backup (without the hassle of individually encrypting files) a third party backup tool can provide an additional layer of security. For example, Duplicati will locally pre-encrypt all designated files using a single user-provided encryption key before automatically archiving and uploading data to a cloud storage provider of the customer's choice.

Email Encryption: Companies that share confidential information via email should seriously consider PGP for Outlook or GnuPG for Thunderbird. These products encrypt individual email messages using strong ciphers such as 256-bit AES. Users who prefer webmail can also use FireGPG for Mozilla Firefox to encrypt their email. Messages encrypted with PGP or GnuPG are encrypted to the recipient's public key, so only the holder of the matching private key can decrypt and read the contents of a message.
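
Where scripted encryption is preferred over mail-client plug-ins, the python-gnupg wrapper around GnuPG can do the same job; the recipient address below is a hypothetical example and must correspond to a public key already in the local keyring.

```python
# Hedged sketch: encrypting a message to a recipient's public key with the
# python-gnupg wrapper around GnuPG. The address is hypothetical and its
# public key must already be in the local keyring.
import gnupg

gpg = gnupg.GPG()
encrypted = gpg.encrypt("Q3 numbers attached (confidential).",
                        recipients=["cfo@example.com"])

if encrypted.ok:
    print(str(encrypted))              # ASCII-armored ciphertext, safe to email
else:
    print("encryption failed:", encrypted.status)
```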

Third Party Services and Appliances: An entire industry has sprung up around data security in the cloud. Porticor is an example of one such company. The Israeli startup combines a virtual cloud appliance and key management service to securely encrypt data stored in the cloud for Microsoft and VMware cloud applications. Porticor enables companies to run applications in the cloud while keeping their data encrypted.

A number of third party apps, such as Lookout Mobile Security, also exist for locking or wiping mobile devices that may contain saved credentials for cloud services.

HTTPS vs. HTTP Web Services: Many websites offer both HTTP and HTTPS versions of their apps. HTTPS combines the standard HTTP web protocol with the SSL/TLS encryption protocol to provide secure end-to-end data transfer over the Internet. Users concerned with data security should select services which offer the much more secure HTTPS protocol.

When properly deployed, most of these solutions are all but foolproof, but they do require both employee training and commitment. For such security measures to be effective, businesses must invest time and effort into communicating the importance of data security and reinforcing standard security routines.

Coming Events of Interest

ICOMM 2012 Mobile Expo — September 14th-15th in New Delhi, India. The 7th annual ICOMM International Mobile Show is supported by Government of India, MSME, DIT, NSIC, CCPIT China and several other domestic and international associations. New technologies, new products, mobile phones, tablets, electronics goods, and business opportunities.

ITU Telecom World 2012 - October 14th-18th in Dubai, UAE. ITUTW is the most influential ICT platform for networking, knowledge exchange, and action. It features a corporate-neutral agenda where the challenges and opportunities of connecting the transformed world are up for debate; where industry experts, political influencers and thought leaders gather in one place.

CLOUD COMPUTING WEST 2012 - November 8th-9th in Santa Monica, CA. CCW:2012 will zero in on the latest advances in applying cloud-based solutions to all aspects of high-value entertainment content production, storage, and delivery; the impact of cloud services on broadband network management and economics; and evaluating and investing in cloud computing services providers.

Third International Workshop on Knowledge Discovery Using Cloud and Distributed Computing Platforms - December 10th in Brussels, Belgium. Researchers, developers, and practitioners from academia, government, and industry will discuss emerging trends in cloud computing technologies, programming models, data mining, knowledge discovery, and software services.

2013 International CES - January 8th-11th in Las Vegas, NV. With more than four decades of success, the International Consumer Electronics Show (CES) reaches across global markets, connects the industry and enables CE innovations to grow and thrive. The International CES is owned and produced by the Consumer Electronics Association (CEA), the preeminent trade association promoting growth in the $195 billion US consumer electronics industry.

CONTENT IN THE CLOUD at CES - January 9th in Las Vegas, NV. Gain a deeper understanding of the impact of cloud-delivered content on specific segments and industries, including consumers, telecom, media, and CE manufacturers.
