Distributed Computing Industry
Weekly Newsletter

Partners & Sponsors

A10 Networks

Aspera

Citrix

Oracle

Savvis

SoftServe

TransLattice

Vasco


November 18, 2013
Volume XLVI, Issue 1


Don't Miss GOVERNMENT VIDEO IN THE CLOUD

The DCIA will present GOVERNMENT VIDEO IN THE CLOUD (GVIC), a Conference within the Government Video Expo 2013 (GVE) on Wednesday December 4th at the Washington Convention Center in Washington, DC.

The opening keynote by Tim Bixler, Federal Manager, Solutions Architecture, Amazon Web Services (AWS), will offer an "Update on Cloud Video Services Adoption in the Public Sector."

Two case studies will explore "Cloud Solutions for Government Video Production" by John Heaton, Director of Sales Engineering, Americas, Aspera, and "Cloud-Based Management of Government Video Assets" by Frank Cardello, General Manager, Platform, T3Media.

Cirina Catania, Independent Video Producer, will join the earlier speakers for a panel discussion covering "Considerations for Creating Government Video in the Cloud."

After a networking break, the second GVIC keynote by Adam Firestone, Director, Solutions, WSO2 Federal Systems, will address "Security & Reliability Concerns Unique to Government Video in the Cloud."

Two additional case studies will cover "Distribution of Government-Owned Video from the Cloud" by Adam Powers, VP of Media Technology & Solutions, V2Solutions, and "Analysis of Aggregated Government Video Content" by Michael Rowny, CEO, PixSpan.

The closing GVIC panel discussion will add Larry Freedman, Partner, Edwards Wildman Palmer, and examine "Considerations for Cloud Dissemination of Government Video."

GVE, co-located with InfoComm's GovComm, brings the east coast's largest contingent of video production, post, digital media, and broadcast professionals together with government AV/IT specialists. The combined event features over 150 exhibits and nearly 6,000 registrants.

The CCA offers sponsorship opportunities for this event.

DCIA Member Company employees and DCINFO readers are entitled to a $100 discount by using registration code GVE.

Please click here to register.

A New Day at the FCC

Excerpted from TV Technology Report by Leslie Stimson

New Federal Communications Commission (FCC) Chairman Tom Wheeler says the commission is a "pro-competition" agency.

Wheeler told staffers that during his confirmation hearing, he described himself as an unabashed supporter of competition because competitive markets produce better outcomes than regulated or uncompetitive markets. "Yet we all know that competition does not always flourish by itself; it must be supported and protected if its benefits are to be enjoyed," he said.

Outlining his goals, Wheeler said it's important to promote economic growth, to maintain the historic compact between networks and users and to make networks work for everyone. "Our challenge is to be as nimble as the innovators and network builders who are creating these great opportunities," he said.

Wheeler, a former telecom and cable lobbyist and venture capitalist, says most VC investments don't work out as intended; "but without taking those risks there can be no big rewards. The industries with which we work are always taking reasonable risks; I hope we won't shy away from a similar approach."

Diane Cornell, Wheeler's new Special Counsel, will head a temporary working group to look into proposals made by Congress, FCC staff and other stakeholders regarding the way the agency conducts business. He says a report will be on his desk within 60 days. His office will use crowdsourcing too, to gather ideas about rules that are past their prime and procedures that can be improved.

Wheeler praised Commissioner Mignon Clyburn's tenure as Acting Chairwoman, and said she and her colleagues addressed tough issues during that six-month period.

Report from CEO Marty Lafferty

The DCIA commends US Senator Jay Rockefeller (D-WV), Chairman of the Senate Commerce Committee, for introducing legislation this week to bolster the advancement of cloud-based video offerings of independent Internet protocol television (IPTV) operators.

Senator Rockefeller drew a parallel between the challenges faced today by online video providers and those that confronted satellite providers in the 1990s.

His bill in many ways would do for IPTV services what the 1992 Cable Act did for satellite services: prevent incumbents from limiting access to television programming channels as a way to stifle competition.

We have increasingly come to believe that Congress needs to provide a legal framework for developers and distributors of new technologies to be able to retransmit conventional television programming — and for content rights holders to be fully compensated for such distribution.

The basic concept that non-facilities-based multichannel video programming distributors (MVPDs) should be able to enjoy the same rights and have the same responsibilities as those that own physical plant is one we fully support.

Online services should be able to choose to be treated like cable or satellite providers, with comparable retransmission consent and must-carry regulations governing their carriage of broadcast signals, and with comparable access to carriage agreements for non-over-the-air television programming services.

While most television programmers believe a migration to IPTV is inevitable, they are also coming to agree that cable and satellite MVPDs will block such new and improved distribution technologies absent a legislative mandate.

The 63-page Consumer Choice in Online Video Act, which Rockefeller introduced on Tuesday afternoon, would bar cable, satellite, and large media companies from engaging in "anti-competitive" practices against the new so-called over-the-top (OTT) video distributors.

Under the bill, OTT MVPDs would be exempt from syndicated exclusivity ("syndex"), network non-duplication, and the sports blackout rule, all of which empower the Federal Communications Commission (FCC) to enforce broadcasters' local exclusivity.

IPTV service providers instead stand to benefit television programmers — including local broadcasters — by shifting the balance in carriage rights and retransmission negotiations more towards the rights holders.

In addition, Internet-based MVPDs are in a position to provide an immediate conduit for TV signals to be delivered to mobile devices, which incumbent video distributors have also resisted.

Rockefeller's bill would level the playing field for Internet-based services and accelerate the evolution of broadband network operators to new business models that support multiple MVPDs, not just the traditional ones that they control.

The essence of the legislation is that it would allow online video providers to choose to be considered like cable and satellite providers, opening for them the "pathway to negotiate for content."

Under the proposal, broadband providers would also be restricted from putting limits on Internet connections that could degrade the quality of online video services.

The bill tasks the FCC with monitoring broadband Internet service billing practices to make sure that consumers understand exactly how much they're paying for cable and Internet and that "broadband billing practices are not used anti-competitively."

Although the legislation would not prevent cable and satellite companies from offering usage-based pricing for their Internet services, the measure is intended to make billing clearer and more understandable.

In announcing the bill, Rockefeller noted that, "Evidence is growing that some traditional media and broadband companies are attempting to discourage the growth of online video platforms through various anti-competitive practices."

"My legislation aims to give consumers the ability to watch the programming they want to watch, when they want to watch it, how they want to watch it, and pay for only what they actually watch."

"I strongly believe that my legislation will help foster a consumer-centric revolution in the video marketplace."

While we agree with much of the bill as currently drafted, we cannot agree that antenna rental services should be exempt from paying retransmission consent fees, which seems inconsistent with the measure's other fair and balanced provisions.

The Aereos of the world need to receive the same treatment as other categories of MVPDs, which is the core approach taken by this bill, rather than an unjustifiable carve-out.

But correcting this shortcoming would be a relatively minor revision.

It is imperative that Rockefeller's likely successor in the next session of Congress, Senator Bill Nelson (D-FL), champion this well-intended, forward-looking, and mostly thoughtful measure. Share wisely, and take care.

Integration with Financial Cloud Services

Excerpted from Tech World Report

Enterprises, both small and large, are evaluating cloud computing and, in increasing numbers, are moving their IT infrastructure to the cloud.

While deploying a cloud application, or subscribing to a cloud-based service, organizations can struggle to integrate it with their existing enterprise applications.

In this whitepaper, we give you a look at each type of integration and identify the associated functional area for each integration point.

Please click here to download the whitepaper.

Sign-Up Now for CONNECTING TO THE CLOUD

The DCIA will present CONNECTING TO THE CLOUD (CTTC), a Conference within the 2014 International Consumer Electronics Show (CES), on January 8th at the Las Vegas Convention Center in Las Vegas, NV.

The CCA is handling sponsorships.

CTTC at CES will highlight the very latest advancements in cloud-based solutions that are now revolutionizing the consumer electronics (CE) sector or, as ABI Research's Sam Rosen dubbed the category last week at CLOUD COMPUTING WEST, the "cloud electronics (CE) sector."

Special attention will be given to the impact on consumers, telecom industries, the media, and CE manufacturers of accessing and interacting with cloud-based services using connected devices.

An opening panel moderated by Tanya Curry-McMichael, VP of Strategy and Marketing, Verizon Digital Media Services, will examine "Millennials, Online TV, and Gaming: Now and Tomorrow."

What are the implications of the digital revolution in the way Millennials discover, access, and consume video, music, and gaming content online?

Hear it first-hand from young voices representing leading companies in the digital, social, and tech arenas.

Bhavik Vyas, Media & Entertainment Partner Eco-System Manager, Amazon Web Services (AWS), will further examine this issue in "Who's Connecting What to the Cloud?"

And Sam Rosen, Practice Director, TV & Video, Consumer Electronics, ABI Research, will address, "Where Are There Problems Connecting to the Cloud?"

Next, in two back-to-back presentations, Robert Stevenson, Chief Business Officer & VP of Strategy, Gaikai, will explore "Consumer Benefits of Cloud-Delivered Content: Ubiquity, Cost, Portability Improvements." And Reza Rassool, Chief Technology Officer, Kwaai Oak, will expose "Consumer Drawbacks of Cloud-Delivered Content: Availability, Reliability, Scalability Issues."

The follow-on panel with Jay Migliaccio, Director of Cloud Platforms & Services, Aspera; Andy Gottlieb, VP, Product Management, Aryaka; Larry Freedman, Partner, Edwards Wildman Palmer; David Hassoun, Owner & Partner, RealEyes Media; Jay Gleason, Cloud Solutions Manager, Sprint; and Grant Kirkwood, Co-Founder, Unitas Global, will discuss "The Impact on Telecommunications Industries of Cloud Computing."

Then two sessions will delve into "Telecommunications Industry Benefits of Cloud-Delivered Content: New Opportunities" with Doug Pasko, Principal Member of Technical Staff, Verizon Communications. And then "Telecommunications Industry Drawbacks of Cloud-Delivered Content: Infrastructure Challenges" with Allan McLennan, President & Chief Analyst, PADEM Group.

The next panel will address "The Impact on Entertainment Industries of Cloud Computing" with Mike King, Dir. of Mktg. for Cloud, Content & Media, DataDirect Networks; Venkat Uppuluri, VP of Marketing, Gaian Solutions; Mike West, Chief Technology Officer, GenosTV; Arnold Cortez, IT Consulting Specialist, IBM; Kurt Kyle, Media Industry Principal, SAP America; and Adam Powers, VP of Media Technology & Solutions, V2Solutions.

Two solo presentations with Les Ottolenghi, Global CIO, Las Vegas Sands Corporation, and Saul Berman, Partner & Vice President, IBM Global Business Services, will highlight "Entertainment Industry Benefits of Cloud Computing: Cost Savings & Efficiency" and "Entertainment Industry Drawbacks of Cloud Computing: Disruption & Security" respectively.

Additional sessions will introduce the subjects "Consumer Electronics Industry Benefits of Cloud-Based Services: New Revenue Streams" with Mikey Cohen, Architect & Principal Engineer, Netflix, and "Consumer Electronics Industry Drawbacks of Cloud-Based Services: Complexity" with Tom Joyce, SVP & GM, HP Converged Systems, Hewlett Packard.

The closing panel will draw on all the preceding sessions to more deeply analyze "The Impact on the Consumer Electronics Industry of Cloud Computing" with Michael Elliott, Enterprise Cloud Evangelist, Dell; David Frerichs, President, Media Tuners; Thierry Lehartel, VP, Product Management, Rovi; Russ Hertzberg, VP, Technology Solutions, SoftServe; Guido Ciburski, CEO, Telecontrol; and Scott Vouri, VP of Marketing, Western Digital.

Top program topics will include case studies on how cloud-based solutions are now being deployed for fixed and mobile CE products — successes and challenges; the effects on consumers of having access to services in the cloud anytime from anywhere — along with related social networking trends.

Also featured will be what broadband network operators and mobile Internet access providers are doing to help manage — and spur — the migration to interoperable cloud services.

Some in traditional entertainment industries find this technology overwhelmingly threatening and disruptive, while others see enormous new opportunities. The value proposition for CE manufacturers will also continue to evolve substantially toward providing cloud-based value-adding services rather than conventional hardware features.

Please register now for CTTC at CES.

BitTorrent Sync Goes Universal

Excerpted from Cult of Mac Report by John Brownlee

BitTorrent Sync is one of the best Dropbox alternatives out there. Drawing upon the power of BitTorrent, BitTorrent Sync allows you to keep folders synced between multiple Macs easily, but without storing them in the cloud or having to pay for things like storage.

If you're a BitTorrent Sync user — and you really should at least consider being one — great news. BitTorrent Sync just got an iPad app.

When it was launched on iOS in late August, BitTorrent Sync only came in iPhone and iPod touch flavors, which was a little odd, considering that you're probably more likely to make use of any desktop files you sync over BitTorrent on an iPad than an iPhone.

It appears, however, that the issue was simply one of resources. BitTorrent Sync has now been updated for iPad, and while it won't allow you to torrent files off of the Internet, it will allow you to sync your files locally.

Oh, and it's also been overhauled for iOS 7, gaining the ability to send and sync files in other apps via Sync, as well as to save pictures and video from sync folders to the Camera Roll, which is a pretty rad addition.

The universal version of BitTorrent Sync is now available on the App Store for the beautiful price of free.

Is Distributed Cloud Storage a Viable Option?

Excerpted from Midsize Insider Report by Daniel Cawrey

The popularity of distributed systems in technology circles is growing, and cloud computing is likely to be a beneficiary of this trend. Proof of this comes via the peer-to-peer (P2P) purveyor BitTorrent.

According to TechCrunch's Ingrid Lunden, its Sync distributed cloud storage service has hit an impressive 1 million users. Also, a new application programming interface (API) for Sync was recently released that enables developers to build P2P cloud storage functionality into their applications.
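
For developers curious what building on such an API might look like, here is a minimal Python sketch of driving a local sync client over an HTTP interface. The port, endpoint, credentials, and method names are illustrative assumptions, not the documented BitTorrent Sync API.

    # Minimal sketch of using a local Sync client's HTTP API to add P2P storage
    # to an application. Port, endpoint, and method names here are assumptions.
    import requests

    API = "http://localhost:8888/api"      # local Sync client, port assumed
    AUTH = ("api_user", "api_password")    # credentials configured in the client

    def list_folders():
        """Return the folders the local Sync client is currently keeping in sync."""
        resp = requests.get(API, params={"method": "get_folders"}, auth=AUTH)
        resp.raise_for_status()
        return resp.json()

    def add_folder(path, secret=None):
        """Start syncing a local directory; peers holding the same secret replicate it."""
        params = {"method": "add_folder", "dir": path}
        if secret:
            params["secret"] = secret
        resp = requests.get(API, params=params, auth=AUTH)
        resp.raise_for_status()
        return resp.json()

    if __name__ == "__main__":
        add_folder("/srv/backups")
        print(list_folders())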

If anything, IT professionals might find Sync's API useful because its storage characteristics lend it a high degree of redundancy. Cloud services rely on a number of factors, including electricity, bandwidth and proper temperatures in the data center; but with distributed cloud computing, those risks can be minimized without a central authority.

This could, in theory, provide midsize businesses with better redundancy than that of cloud providers who use a smattering of data centers to provide a service. Redundancy is important; IT professionals keenly understand that fact, which lends credibility to the notion of distributed systems being valuable. It raises the question whether information distribution in this manner may ultimately be more reliable.

Notwithstanding the perceived benefits, some concern remains in the minds of IT professionals regarding data storage in this type of implementation. In general, most businesses would not be comfortable with the idea of storing data on third-party systems. This discomfort appears even more reasonable when one considers that there could be countless unknown third parties using this type of distributed platform.

Nonetheless, it seems it would be very difficult to break into a distributed system such as this. The operation of BitTorrent's P2P technology relies on scattering bits of information, and Sync uses industry-standard AES-128 encryption. It is clear that this system is, in fact, built with security in mind.
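
For readers unfamiliar with the cipher mentioned above, the short Python sketch below shows what 128-bit AES symmetric encryption looks like using the widely available "cryptography" package. It is a generic illustration of the primitive, not a description of how Sync applies it internally.

    # Generic example of 128-bit AES (here in GCM mode) with the "cryptography"
    # package; not a claim about BitTorrent Sync's internal protocol.
    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    key = AESGCM.generate_key(bit_length=128)   # 128-bit key = AES-128
    aead = AESGCM(key)

    nonce = os.urandom(12)                      # must be unique per message
    plaintext = b"chunk of a synced file"
    ciphertext = aead.encrypt(nonce, plaintext, None)

    # Only holders of the same key (e.g., peers sharing a folder secret) can decrypt.
    assert aead.decrypt(nonce, ciphertext, None) == plaintext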

Distributed systems are not very popular solutions in business IT today, but that could change with time. With BitTorrent Sync's API, small and midsize businesses have an opportunity to build on a reliable distributed cloud storage solution. It is an innovative idea, one that other cloud providers are likely to consider offering to companies in the future.

It is difficult to predict whether distributed cloud storage will take off for business use, but the potential is surely there. There is not much of a question about the concept's reliability, yet security could be a concern for many, and one that will take time to allay.

Security remains one of the largest issues with which small and midsize businesses are confronted in regard to cloud computing. Given the enormous value that companies place upon their business intelligence and other critical data sets, it is no surprise that information protection is incredibly important.

New Computing Model to Advance Medical Research

For personalized and customized medicine to fulfill its promise, one extremely important tool is knowledge of a person's unique genetic profile. Obtaining that knowledge has been made practical by the advent of next-generation sequencing (NGS), which has brought the cost of sequencing a genome, such as the human genome, down from $9 million to a mere $5,700.

Now the research problem is no longer how to collect this information, but how to compute and analyze it. "Overall, DNA sequencers in the life sciences are able to generate a terabyte -- or 1 trillion bytes -- of data a minute. 

This accumulation means the size of DNA sequence databases will increase 10-fold every 18 months," said Wu Feng, a Professor with the Department of Computer Science in the College of Engineering at Virginia Tech. 

"In contrast, Moore's Law (named after Intel co-founder Gordon E. Moore) implies that a processor's capability to compute on such 'BIG DATA' increases by only two-fold every 24 months," added Weng. 

"Clearly, the rate at which data is being generated is far outstripping a processor's capability to compute on it. Hence the need exists for accessible large-scale computing with multiple processors ... though the rate at which the number of processors needs to increase is doing so at an exponential rate." 

For the past two years, Feng has led a research team that has now created a new generation of efficient data management and analysis software for large-scale, data-intensive scientific applications in the cloud. 

Cloud computing is a term, coined within the computing field, that in general describes a large number of connected computers located all over the world that can simultaneously run a program at large scale. Feng announced his work in October at the O'Reilly Strata Conference + Hadoop World in New York City.

For background on Feng's announcement, one needs to go back more than three years. In April 2010, the National Science Foundation teamed with Microsoft on a collaborative cloud computing agreement.

One year later, they decided to fund 13 research projects to help researchers quickly integrate cloud technology into their research. Feng was selected to lead one of these teams. His target was to develop an on-demand, cloud-computing model, using the Microsoft Azure cloud. 

It then evolved naturally to make use of Microsoft's Hadoop-based Azure HDInsight Service. "Our goal was to keep up with the data deluge in the DNA sequencing space. Our result is that we are now analyzing data faster, and we are also analyzing it more intelligently," Feng said.

With this analysis, and the ability of researchers from all over the globe to see the same sets of data, collaborative work is facilitated on a 24/7 global perspective. "This cooperative cloud computing solution allows life scientists and their institutions easy sharing of public data sets and helps facilitate large-scale collaborative research," Feng added.

Cloud-Based Solutions Improving Data Protection

Excerpted from CenterBeam Report

IT operations and mission-critical data that's stored in company servers have become the foundation of business continuity. The dependency on advanced data backup is driving the demand for cloud computing solutions; new technologies and growing vendor offerings have enabled firms to embrace new data protection and DR strategies.

Cloud-based backups have provided businesses with new solutions that can simplify and achieve practically all such requirements, according to "Your Strategic Guide to Backup & Recovery," a joint publication by the editors of CIO, Computerworld, CSO, InfoWorld, ITworld and Network World.

Cited in the publication, an Enterprise Strategy Group study from earlier this year found that, currently, a quarter of respondents said they use "cloud-based data protection services in some capacity," including backup, data recovery and storage. An additional 46 percent stated they planned to implement cloud services for the aforementioned reasons.

While ESG collectively referred to cloud-based storage, backup and recovery as data protection as a service (DPaaS), one of the chief reasons that firms deployed this method was the ability to remotely store data at an off-site location for DR purposes.

Cost-effectiveness of DPaaS was an obvious incentive for the majority of organizations, but as cloud solutions improve, companies are beginning to build more trust in the maturing technology. The survey found that 59 percent of those polled protected no more than 40 percent of their applications in cloud-based backup services; however, more than half said they would be shifting over 40 percent of applications to cloud backups by 2015.

When ESG asked about the benefits of DPaaS, 36 percent said it reduced the cost of on-site data protection hardware; 33 percent noted it reduced IT staff expenses; and 37 percent cited improved security as the top benefit. While many businesses are slow to adopt cloud solutions because of security concerns, ESG's research seemingly shows these traditional hurdles to adoption are beginning to wane.

As noted in a recent IDG study on top security issues, as cyberthreats and the consumerization of IT (including the bring-your-own-device trend) increase, so will the demand for DPaaS that incorporates solutions such as mobile device management, especially as consumer confidence in cloud-based services grows.

Greg Schulz, founder and senior analyst at consulting firm StorageIO, who was interviewed in the joint publication, said he's seen improvements in cloud backups in just six months. Once a skeptic, Schulz now has complete faith in DPaaS solutions.

"Every time I've bet against the cloud, I've been wrong," he said, according to the source.

Big Data & Cloud Computing Democratize Health Data

Excerpted from Federal Times Report by Andy Medici

Advances in big data and cloud computing are driving the democratization of health care data, according to federal officials.

Niall Brennan, the Director of the Office of Information Products and Data Analytics, said the ability to share data in the cloud "is going to revolutionize the way people access data."

Where the agency previously packaged data onto encrypted hard drives and shipped them to researchers, Brennan said the same data can now be accessed through a cloud portal from any vetted research institution.

Brennan made the remarks at the Armed Forces Communications and Electronics Association Bethesda chapter's annual Health IT conference at the Centers for Medicare and Medicaid Services at the Health and Human Services Department in Bethesda, Md.

He said that while the agency has collected data before, it is now working to make sure that health care providers, customers, and researchers have the data they need to make better decisions.

"We need the right data in the right form at the right time to the right people," he said.

By building better tools and sharing data across the cloud, more scientists and researchers will be able to access the data in productive ways, said Mike Tartakovsky, the chief information officer and director of the Office of Cyberinfrastructure and Computational Biology at the National Institutes of Health's National Institute of Allergy and Infectious Diseases.

"We need to democratize access," Tartakovsky said.

George Komatsoulis, the deputy director of the Center for Biomedical Informatics and Information Technology (CBIIT) at the National Cancer Institute, said within the next 10 years every patient is going to want access to large-scale biomedical informatics because of the capability for personalized cancer treatments.

Komatsoulis said the agency is also reaching out to map the genome of 11,000 tumors and place the data in the cloud in order to help drive cancer research. To date, the agency has gathered 2.5 petabytes of data — which could take months to transfer to research institutions — he said.

By placing that data within the cloud, it can be accessed far more easily and in less time, Komatsoulis said.

"We are trying to democratize access to the data and provide compute and paid-for data sets so that scientists across the globe can go and work on this data," he said.

Turning Data into Revenue

Excerpted from Baseline Report by John Lucker

The concept that a company's data might actually be worth something to someone else isn't new. Many consider data to be a strategic asset and one of the most valuable off-balance-sheet assets a company might have.

For several years now, online companies have put their data to work, and some have earned hefty returns by making it available to others in a way that adds value without cannibalizing their own business. Social media companies sell some of their data to advertisers, and online merchants use their data to link with other merchant sites to target their customers via a variety of tools for cross-selling, up-selling, next-best offers and attrition management.

These activities are old news, however, and a new breed of savvy companies understands that data has real, lasting value and can be turned into an ongoing revenue stream. How? One way is via direct or brokered data markets—a business model that has defined a new market category. Here, companies sell, buy or barter data for mutual benefit.

For example, let's take a company that provides cloud-based services. This company would have a very expansive technical architecture. What if it could coalesce all the data it collected on the architecture's performance and reliability, and then provide that data to hardware and software vendors that develop and market products to serve the cloud?

Those vendors would welcome the field-tested insights that data could provide. They'd be willing to pay a good price for it too—if not in hard dollars, then in the form of free or discounted product enhancements or service-level customizations in exchange for the feedback.

In a similar arrangement, a snack food maker might put its proprietary product serving and nutrition information on a data market. In exchange, it could receive aggregate consumer feedback on the product from data brokers whose role is to facilitate value-added exchanges for each party in a data exchange.

As that data becomes productized, a mobile software developer might buy the information to include in its newest app that compares the information on various products and offers healthful recommendations. Or perhaps the company uses the data for users to track daily food consumption as part of a diet regimen. With these beneficial exchanges, everybody wins—either financially or indirectly via quid-pro-quo scenarios.

The caveats here are at least twofold. First, privacy and identity preservation are increasing concerns for consumers, and companies should be diligent in complying with right-to-use data agreements: End User License Agreements (EULAs). Second, those in the C-suite might worry that they'll give up their competitive edge by providing valuable information to current or future competitors.

If customer terms of service include salient details and appropriate opt-in or opt-out provisions that are coupled with worthwhile benefits to the consumer, privacy concerns can be satisfactorily mitigated. Companies also should think very carefully about how provisioned data could be manipulated to provide competitors with unanticipated benefits.

It's clear that much careful thought and strategic decisioning is necessary for effective data monetization. However, pioneering business case studies have proven that the potential value can be great.

Special note: Mistakes or failure are not options here, and executives must take the time to become confident that the potential risks will be offset by even greater potential rewards. There's money in the data market. Are you in?

Big Data's Little Brother

Excerpted from NY Times Report by Quentin Hardy

Premise, founded by David Soloff and Joe Reisinger, uses smartphones to collect data.

The company created a smartphone application that is now used by 700 people in 25 developing countries. Using guidance from Mr. Soloff and his co-workers, these people, mostly college students and homemakers, photograph food and goods in public markets.

By analyzing the photos of prices and the placement of everyday items like piles of tomatoes and bottles of shampoo and matching that to other data, Premise is building a real-time inflation index to sell to companies and Wall Street traders, who are hungry for insightful data.

"Within five years, I'd like to have 3,000 or 4,000 people doing this," said Mr. Soloff, who is also Premise's chief executive. "It's a useful global inflation monitor, a way of looking at food security, or a way a manufacturer can judge what kind of shelf space he is getting."

Collecting data from all sorts of odd places and analyzing it much faster than was possible even a couple of years ago has become one of the hottest areas of the technology industry. The idea is simple: With all that processing power and a little creativity, researchers should be able to find novel patterns and relationships among different kinds of information.

For the last few years, insiders have been calling this sort of analysis Big Data. Now Big Data is evolving, becoming more "hyper" and including all sorts of sources. Start-ups like Premise and ClearStory Data, as well as larger companies like General Electric, are getting into the act.

A picture of a pile of tomatoes in Asia may not lead anyone to a great conclusion other than how tasty those tomatoes may or may not look. But connect pictures of food piles around the world to weather forecasts and rainfall totals and you have meaningful information that people like stockbrokers or buyers for grocery chains could use.

And the faster that happens, the better, so people can make smart — and quick — decisions.

"Hyperdata comes to you on the spot, and you can analyze it and act on it on the spot," said Bernt Wahl, an industry fellow at the Center for Entrepreneurship and Technology at the University of California, Berkeley. "It will be in regular business soon, with everyone predicting and acting the way Amazon instantaneously changes its prices around."

Standard statistics might project next summer's ice cream sales. The aim of people working on newer Big Data systems is to collect seemingly unconnected information like today's heat and cloud cover, and a hometown team's victory over the weekend, compare that with past weather and sports outcomes, and figure out how much mint chip ice cream mothers would buy today.

At least, that is the hope, and there are early signs it could work. Premise claims to have spotted broad national inflation in India months ahead of the government by looking at onion prices in a couple of markets.

The photographers working for Premise are recruited by country managers, and they receive 8 to 10 cents a picture. Premise also gathers time and location information from the phones, plus a few notes on things like whether the market was crowded. The real insight comes from knowing how to mix it all together, quickly.

Price data from the photos gets blended with prices Premise receives from 30,000 websites. The company then builds national inflation indexes and price maps for markets in places like Kolkata, India; Shanghai; and Rio de Janeiro.
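
The basic arithmetic of such an index can be sketched in a few lines; Premise's actual methodology is proprietary and not described in the article, so the basket, prices, and simple averaging below are purely illustrative.

    # Toy price index: average price change for a fixed basket, relative to a
    # base period. Illustrative only; not Premise's methodology.
    base = {"tomatoes": 30.0, "onions": 22.0, "shampoo": 85.0}   # base-month prices
    today = {"tomatoes": 33.0, "onions": 29.0, "shampoo": 86.0}  # prices observed now

    def price_index(base_prices, current_prices):
        ratios = [current_prices[item] / base_prices[item] for item in base_prices]
        return 100 * sum(ratios) / len(ratios)  # base period = 100

    print(f"index: {price_index(base, today):.1f}")  # above 100 means prices rose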

Premise's subscribers include Wall Street hedge funds and Procter & Gamble, a company known for using lots of data. None of them would comment for this article. Subscriptions to the service range from $1,500 to more than $15,000 a month, though there is also a version that offers free data to schools and nonprofit groups.

The new Big Data connections are also benefiting from the increasing amount of public information that is available. According to research from the McKinsey Global Institute, 40 national governments now offer data on matters like population and land use. The United States government alone has 90,000 sets of open data.

"There is over $3 trillion of potential benefit from open government economic data, from things like price transparency, competition and benchmarking," said Michael Chui, one of the authors of the McKinsey report. "Sometimes you have to be careful of the quality, but it is valuable."

That government data can be matched with sensors on smartphones, jet engines, and even bicycle stations that are uploading data from across the physical world into the supercomputers of cloud computing systems.

Until a few years ago, much government and private data could not be collected particularly fast or well. It was expensive to get and hard to load into computers. As sensor prices have dropped, however, and things like Wi-Fi have enabled connectivity, that has changed.

In the world of computer hardware, in-memory computing, an advance that allows data to be crunched in main memory rather than shuttled to and from slower storage, has increased computing speeds immensely. That has allowed for some real-time data crunching.

General Electric, for example, which has over 200 sensors in a single jet engine, has worked with Accenture to build a business analyzing aircraft performance the moment the jet lands. G.E. also has software that looks at data collected from 100 places on a turbine every second, and combines it with power demand, weather forecasts and labor costs to plot maintenance schedules.

IBM also recently announced commercial deployment of software that learns and predicts the behavior of large, complex systems to improve performance while things are happening.

One customer, an Illinois telecommunications company called Consolidated Communications, uses the software to oversee 80,000 elements of its network, like connectivity speeds and television performance, for each of its 500,000 clients. IBM also announced new products it said would improve analysis and make it easier for customers to work with different kinds of data.

Traditional data analysis was built on looking at regular information, like payroll stubs, that could be loaded into the regular rows and columns of a spreadsheet. With the explosion of the Web, however, companies like Google, Facebook and Yahoo were faced with unprecedented volumes of "unstructured" data, like how people cruised the Web or comments they made to their friends.

New hardware and software have also been created that sharply cut the time it takes to analyze this information, fetching it as fast as an iPhone fetches a song.

Last month, creators of the Spark open-source software, which speeds data analysis by 100 times compared with existing systems, received $14 million to start a company that would offer a commercial version of that software.

ClearStory Data, a start-up based in Palo Alto, Calif., has introduced a product that can look at data on the fly from different sources. With ClearStory, data on movie ticket sales, for example, might be mixed with information on weather, even Twitter messages, and presented as a shifting bar chart or a map, depending on what the customer is trying to figure out. There is even a "data you might like" feature, which suggests new sources of information to try.

The trick, said Sharmila Shahani-Mulligan, ClearStory's co-founder and chief executive, was developing a way to quickly and accurately find all of the data sources available. Another was figuring out how to present data on, say, typical weather in a community, in a way that was useful.

"That way," said Ms. Shahani-Mulligan, "a coffee shop can tell if customers will drink Red Bull or hot chocolate."

Banks Bet on Cloud as IT Budgets Ease

Excerpted from Computer World Report by Matthew Finnegan

Spending on IT services in the UK financial sector has grown to £9.8 billion, according to analysts TechMarketView, as banks increase budgets and rethink strategies on cloud computing.

The 'UK Financial Services Sector SITS Market Trends and Forecasts 2013' report, launched as part of TechMarketView's new FinancialServiceViews analysis stream, indicates that spending on software and IT services (SITS) is expected to continue to grow by 3.6 percent each year up to 2016, outpacing spending in the wider private sector.

The report claims that banking, financial market and insurance firms are responding to a "better economic and commercial outlook" after the financial crash of 2008, and are seeking to invest in IT and services to help build competitive advantage.

The top priorities for spending in all three sectors will be around legacy system modernisation and simplification, meeting regulations, and the increasing use of cloud computing.

Cloud computing will offer benefits around cost for all sectors, for example helping incumbent banks reduce costs and lowering barriers to entry for new competitors, as the technology becomes more widely accepted.

The report stated: "It is clear from discussions with both customers and vendors that the sector's attitude to cloud services has evolved significantly over the past 18 months. Consequently, we expect that the move to cloud services will represent a major part of the sector's cost transformation."

The report also indicates that spending on enterprise software and applications services is set to grow at 3 percent per year in order to enable central cost control, as well as to use data to improve customer experience and create more tailored offerings.

The use of business process services will also continue to grow, with a 6 percent increase in spending per year expected.

Meanwhile, infrastructure service investments are set to grow more slowly than other areas, at 2 percent, but will remain ahead of the wider market spending, as firms look to third party infrastructure suppliers in order to help drive cost reductions.

Despite the increase in IT investment, the report said that CIOs will face new demands to keep costs down.

"Our positive view on overall spending is however constrained by the very different financial environment in which the financial services companies will operate over the next four years, particularly when compared with the pre-2008 crash," the report states.

"Competition and greater regulatory scrutiny will mean that margins and return on capital across the whole of the financial services industry will continue to be under pressure."

Environmental Benefits of Cloud Computing

Excerpted from Rickscloud Blog by Rick Blaisdell

Both cloud computing and sustainability are emerging as trends in business and society. Most consumers, whether they are aware of it or not, are already heavy users of cloud-enabled services like email, social media, online gaming, and many mobile applications.

At the same time, sustainability continues to gain importance as a performance indicator for organizations and their IT departments. Corporate sustainability officers, regulators and other stakeholders have become increasingly focused on IT's carbon footprint, and companies of all sizes are also placing more emphasis on developing long-term strategies to reduce their carbon footprint through more sustainable operations and products.

While cloud computing may not seem all that eco-friendly at first glance, a closer look reveals a number of benefits. A six-month study conducted by Lawrence Berkeley National Laboratory and Northwestern University with funding from Google has found that moving common software applications used by 86 million US workers to the cloud could save enough electricity annually to power Los Angeles for a year.

The report looks at three common business applications: email, CRM software and bundled productivity software (spreadsheets, file sharing, word processing, etc.). Moving these software applications from local computer systems to centralized cloud services could cut their IT energy consumption by up to 87 percent, a saving roughly equal to the amount of electricity used each year by all the homes, businesses and industry in Los Angeles.

Moving to the cloud can mean big energy savings for an organization, both in direct power costs and indirect measures, such as the reduced need for shipping and manufacturing. Here are some of the ways that the cloud can help a company cut its carbon footprint down to size:

Fewer machines — With the cloud, server utilization rates are typically 60-70%, while in many small business and corporate environments, utilization rates hover around 5 or 10%. As a result, shared data centers can employ far fewer machines to deliver the same capacity (a rough calculation follows this list).

Equipment efficiency — Larger data centers often have the resources to allow them to upgrade to energy-saving equipment and building systems. Usually, this is not an option in smaller organizations where this efficiency is not the focus.

Consolidated climate control costs — In order for a server to run at its peak performance, its temperature and humidity level must be carefully controlled, and cloud providers can use high density efficient layouts that are hard for in-house centers to replicate.

Dynamically allocated resources — In-house data centers need extra servers to handle peak data loads, and cloud providers can dynamically allocate resources where necessary in order for fewer machines to sit idle.
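
The "fewer machines" point above can be made concrete with a rough calculation based on the utilization figures cited; the workload and per-server capacity below are arbitrary, illustrative numbers.

    # Rough illustration of how utilization drives server count for a fixed workload.
    import math

    def servers_needed(workload_units, capacity_per_server, utilization):
        # Effective capacity per machine is its raw capacity times how busy it runs.
        return math.ceil(workload_units / (capacity_per_server * utilization))

    workload = 1000     # arbitrary units of steady work (assumed)
    per_server = 10     # units one server handles at 100% utilization (assumed)

    in_house = servers_needed(workload, per_server, 0.07)   # ~5-10% utilization
    shared = servers_needed(workload, per_server, 0.65)     # ~60-70% utilization
    print(f"in-house: {in_house} servers; shared cloud data center: {shared} servers")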

Cloud computing has enormous potential to transform the world of IT: reducing costs, improving efficiency and business agility, and contributing to a more sustainable world. Do you use the cloud or other green technologies in your business?

Google Is Living a Few Years in the Future

Excerpted from ZDNet Report by Nick Heath

Want to understand the type of systems global businesses will be using in five years? Look at the technology used by Google today.

Enterprise has a history of riding in Google's slipstream. It was in 2004 that Google revealed the technologies that inspired the creation of Hadoop, the platform that is only today starting to be used by businesses for big data analytics.

Hadoop's co-creator Doug Cutting believes industry will continue to borrow from Google's toolbox, and sees a bright future in enterprise for the recently announced Google Spanner.

"Google is living a few years in the future and sending the rest of us messages," he said at the O'Reilly Strata Conference in London.

Spanner was unveiled by Google last year as the technology that allows the search giant to provide almost instantaneous access to its services to millions of people worldwide without its software falling over. Primarily it stops Google's systems from getting tangled up while trying to keep up to date with what each other is doing.

In creating Spanner, Google had built a planet-spanning distributed database that allowed its global datacenters to keep in sync without suffering huge latencies.

At the heart of Spanner is Google's TrueTime service, which allows systems to get accurate timestamps based on readings from atomic clocks and GPS receivers installed in each of Google's datacenters. Because Google can rely on TrueTime systems in different datacenters being in sync it can ensure applications situated on other sides of the world are able to read, write and replicate data without falling out of step with each other.
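
The mechanism described above can be illustrated with a toy version of the "commit wait" idea from Google's published Spanner paper: the clock API returns an uncertainty interval rather than a single value, and a writer waits out that uncertainty before exposing its commit, so timestamp order matches real-time order. The Python below is a conceptual sketch with an assumed uncertainty bound, not Google's implementation.

    # Toy illustration of TrueTime-style "commit wait"; conceptual only.
    import time

    CLOCK_UNCERTAINTY = 0.004  # assumed +/- 4 ms bound from GPS/atomic references

    def tt_now():
        """Return (earliest, latest) bounds on the current time."""
        t = time.time()
        return t - CLOCK_UNCERTAINTY, t + CLOCK_UNCERTAINTY

    def commit(write):
        _, latest = tt_now()
        commit_ts = latest                 # a timestamp no node can consider "future"
        while tt_now()[0] <= commit_ts:    # commit wait: let the uncertainty pass
            time.sleep(0.001)
        write["timestamp"] = commit_ts     # safe: timestamp order matches real time
        return write

    print(commit({"key": "row1", "value": "v"}))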

For Cutting, Spanner shows the future possibilities for open source distributed processing platforms like Hadoop.

Hadoop allows data to be spread over large clusters of commodity servers and processed in parallel. Today the platform is generally used to analyse data that sits outside of online transaction processing (OLTP) systems that are the engine of businesses — the likes of e-commerce, CRM and HR systems.
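
As an illustration of the kind of job Hadoop spreads over such a cluster, here is the standard word-count example written for Hadoop Streaming in Python; the jar name, paths, and flags in the comment are illustrative, and a real job would also ship the script to the cluster nodes.

    # Minimal Hadoop Streaming word count: the same script acts as mapper or
    # reducer depending on its argument. Invocation (illustrative):
    #   hadoop jar hadoop-streaming.jar \
    #       -mapper "wordcount.py map" -reducer "wordcount.py reduce" \
    #       -input /data/text -output /data/counts
    import sys

    def mapper():
        # Each mapper reads one slice of the input and emits "word<TAB>1".
        for line in sys.stdin:
            for word in line.split():
                print(f"{word}\t1")

    def reducer():
        # Hadoop sorts mapper output by key, so all counts for a word arrive together.
        current, total = None, 0
        for line in sys.stdin:
            word, count = line.rstrip("\n").split("\t")
            if word != current and current is not None:
                print(f"{current}\t{total}")
                total = 0
            current = word
            total += int(count)
        if current is not None:
            print(f"{current}\t{total}")

    if __name__ == "__main__":
        mapper() if sys.argv[1] == "map" else reducer()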

Spanner demonstrates how major corporations may soon use a Hadoop-like platform to run these OLTP systems at a globally distributed scale, said Cutting, who is also chief architect at Hadoop specialist Cloudera.

"I think it [Spanner] is the Holy Grail for big data," he said. "Just a couple of years ago people would talk about OLTP and say 'You can't do that sort of stuff on a Hadoop-like platform'. Google demonstrated that you can."

Google created Spanner because it needed a technology with global reach to underpin its massive software platforms, said Cutting, a need that other large enterprises may struggle with in future.

"In a lot of cases people are served just fine by their existing relational solutions to OLTP problems, and there's no need to drive it to Hadoop.

"However as enterprises become more Google like that might not be satisfactory. I think the rest of us will be driven there as well. In the next couple of years I think we'll see it."

Facebook is already demonstrating one way of effectively linking a Hadoop cluster spanning multiple datacenters worldwide with its Prism system.

The type of processing that can be carried out on a Hadoop cluster is evolving, with the general availability of Hadoop 2 last month refining Hadoop's software tools to make it easier to use clusters for more than batch processing.

As Hortonworks CEO Arun Murthy told ZDNet, the introduction of a separate job scheduler called YARN widens potential uses for Hadoop.

"It opens up Hadoop to so many new use cases, whether it's real-time event processing, or interactive SQL. Machine learning is another example — people are building native machine-learning apps on top of Hadoop right now, thanks to YARN," he said.

Cutting said that another use for YARN is for dynamically reassigning computing resources from the Hadoop cluster according to factors that are important to an organisation, such as who is running the job or at what time, which could be spelled out in SLAs.

Cutting sees Hadoop evolving into an OS for the datacentre, a platform for running a broad range of backend software designed to run on server clusters.

"Changes are going very quickly. I think there will be a trend towards seeing it as a kernel for datacenters in the way that Linux is the kernel for single nodes. More and more it will be the center of a wide range of applications," he said.

"In the last year we've seen huge numbers of the traditional enterprise software vendors start to move and make their software available on top of Hadoop.

"We've had SAS, Tableau Software; across the board, people are starting to see this as a platform that their customers want. I think that's going to accelerate until it's the default platform for most vendors."

Coming Events of Interest

Government Video Expo 2013 - December 3rd-5th in Washington, DC. Government Video Expo, co-located with InfoComm's GovComm, brings the east coast's largest contingent of video production, post, digital media, and broadcast professionals together with government AV/IT specialists. The combined event features over 150 exhibits and nearly 6,000 registrants.

GOVERNMENT VIDEO IN THE CLOUD - December 4th in Washington, DC. This DCIA Conference within Government Video Expo focuses specifically on cloud solutions for and case studies related to producing, storing, distributing, and analyzing government-owned video content.

International CES - January 7th-10th in Las Vegas, NV.  The International CES is the global stage for innovation reaching across global markets, connecting the industry and enabling CE innovations to grow and thrive. The International CES is owned and produced by the Consumer Electronics Association (CEA), the preeminent trade association promoting growth in the $209 billion US consumer electronics industry.

CONNECTING TO THE CLOUD - January 8th in Las Vegas, NV. This DCIA Conference within CES will highlight the very latest advancements in cloud-based solutions that are now revolutionizing the consumer electronics (CE) sector. Special attention will be given to the impact on consumers, telecom industries, the media, and CE manufacturers of accessing and interacting with cloud-based services using connected devices.

CCISA 2013 – February 12th–14th in Turin, Italy. The second international special session on Cloud Computing and Infrastructure as a Service (IaaS) and its Applications, within the 22nd Euromicro International Conference on Parallel, Distributed, and Network-Based Processing.

NAB Show - April 5th-10th in Las Vegas, NV. From broadcasting to broader-casting, NAB Show has evolved over the last eight decades to continually lead this ever-changing industry. From creation to consumption, NAB Show has proudly served as the incubator for excellence — helping to breathe life into content everywhere.

Media Management in the Cloud — April 8th-9th in Las Vegas, NV. This two-day conference provides a senior management overview of how cloud-based solutions positively impact each stage of the content distribution chain, including production, delivery, and storage.

CLOUD COMPUTING EAST 2014 - May 13th-14th in Washington, DC. Three major conference tracks will zero in on the latest advances in the application of cloud-based solutions in three key economic sectors: government, healthcare, and financial services.

Copyright 2008 Distributed Computing Industry Association