Distributed Computing Industry
Weekly Newsletter

Cloud Computing Expo

In This Issue

Partners & Sponsors

CSC Leasing

Dancing on a Cloud

DataDirect Networks

Extreme Reach

Hertz Neverlost

Kaltura

Moses & Singer

SAP

Scayl

Scenios

Sequencia

Unicorn Media

Digital Watermarking Alliance

MusicDish Network

Digital Music News

Cloud News

CloudCoverTV

P2P Safety

Clouderati

gCLOUD

hCLOUD

fCLOUD

Industry News

Data Bank

Techno Features

Anti-Piracy

December 2, 2012
Volume XLI, Issue 11


A Great Day for Internet Rights & Innovation

Excerpted from CDT Report by Mark Stanley

On Thursday, the Senate Judiciary Committee passed legislation to update ECPA. The legislation would finally require the government to get a warrant before accessing e-mails and documents stored in the cloud. This would ensure our online information receives the same Fourth Amendment protections as postal mail, phone calls, and documents in our homes. It was a historic step toward bringing Constitutional protections into the 21st Century.

The legislation received strong bipartisan support. In fact, the only Committee Member to record a 'no' vote was Senator Jeff Sessions (R-AL). For years, we've been seeking bipartisan support for ECPA reform in the Senate, with little to no success. Thursday, the partisanship that had previously prevented ECPA reform from moving forward crumbled, in a major way.

The Committee also (somewhat astoundingly) beat back an amendment from Senator Charles Grassley (R-IA) that would have created an exception to the warrant requirement for certain crimes involving women and children. The amendment was defeated because emergency exceptions already in ECPA ensure quick law enforcement access to necessary information. Senator Mike Lee (R-UT), who joined all Democrats in voting against the amendment (and who was an adamant supporter of ECPA reform), rightly questioned the wisdom of putting some crimes under different standards than others.

In another remarkable turn of events, a controversial amendment that Grassley filed in September to allow federal agencies access to content without a warrant was not even offered.

The Committee also adopted an amendment by Senator Dianne Feinstein (D-CA) that would ensure video service providers obtain consent every two years to share consumers' video selections. (The bill would have amended the Video Privacy Protection Act to permit companies to only get consent once.)

The chances are slim this bill will pass in 2012, if only because there is so little time left. But this legislation will definitely be taken up again next year, hopefully early, and the outcome we got Thursday is a big step toward finally getting Congress to pass, in 2013, a bill that applies Fourth Amendment protections to the Internet.

The work that went into this effort from the Digital Due Process coalition, the Vanishing Rights coalition, Internet users, and startups and VCs has set us on a path toward major reform in 2013.

DCINFO Editor's Note: The DCIA thanks all who responded to our call-to-action last week to contact their US Senators and voice support for ECPA reform.

Report from CEO Marty Lafferty

We hope to see you this Tuesday, December 4th, at the INTELLIGENCE IN THE CLOUD (IITC) Workshop being conducted in partnership with the National Association of Broadcasters (NAB) and sponsored by Front Porch Digital in Washington, DC.

Cloud computing's massive impact on the management of commercial multimedia is poised to bring major changes to government and military.

IITC's full agenda of industry-leading speakers ensures that every attendee will leave with an understanding of valuable lessons learned to date.

Nine intensive case studies will reveal exactly how the cloud is revolutionizing collaboration, storage, distribution, and analysis of video and images.

Three panels will identify common challenges shared by the private and public sectors, regulatory considerations, and where we go from here.

Space is limited, so please register now. This hands-on workshop is a must-attend experience for everyone involved in the ongoing migration of sensitive multimedia to the cloud.

Our opening keynote will feature Dr. Suzanne Yoakum-Stover, Executive Director, Institute for Modern Intelligence (IMI), who will address What Does Cloud Computing Bring to the Intelligence Community? What's a working definition of "cloud computing" and related terminology for this workshop? What are the unique attributes of cloud computing that should benefit the missions of military and government agencies charged with managing large quantities of sensitive multimedia data?

In Securing Multimedia Traffic on Broadband Data Networks, Michael Weintraub, Executive Director of Technology at Verizon Communications, will offer observations from the perspective of a major broadband network operator on issues associated with securely storing and distributing multimedia data among multiple users of different clouds, with anecdotal examples from the private sector to illustrate how related hurdles have been overcome and lessons learned.

Next, Scott Campbell, Industry Principal, Media & Telecom, SAP America, and Marlyn Zelkowitz, Global Director, Public Services Cloud Solutions, SAP America will discuss Business Intelligence & Mobile Device Management. What lessons can be learned from private sector business intelligence (BI) and mobile device management (MDM) activities in the cloud that have value to related military and government initiatives? SAP's Sybase acquisition and its work with Gartner and T-Mobile on real-time analysis combined with predictive analytics solutions provide important insights.

In Data Collaboration & Analysis, speaker Randy Kreiser, Chief Storage Architect, Federal Division, DataDirect Networks (DDN) will share DDN's real-world experiences in optimizing every step involved in gathering, analyzing, and sharing visual data — including full-motion video and imagery. What can be gleaned from its NRL "large data" program that will be useful to additional deployments by other branches of the military and in government agencies?

Next, a panel discussion will address Identifying Common Challenges. Moderator John Bordner, WISC Enterprises will lead a group of government and private sector professionals from Google, IMI, NGIA, and SAP, in exploring common challenges and ways to leverage the learning from commercial experience to date with ongoing government projects.

During lunch, in a break from our primary focus on operational and technological issues, a special session on Legal Considerations led by Lawrence Freedman, Partner, and Isaac Brown of Edwards Wildman Palmer will highlight late-breaking policy and regulatory developments in the cybersecurity and data privacy arenas and what to expect from lawmakers on these matters in 2013.

In Cloud 101 for Multimedia Assets, speaker Allan McLennan, President, The PADEM Group will present PADEM's recent IBC tutorial, which addressed the business models and investment opportunities surrounding the growth, ease, and security of cloud-based distribution of media, analytics, and services, especially in the over-the-top (OTT) video and connected-TV space. How do the underlying case studies relate to the problems that must be overcome for Military & Government implementation of cloud solutions for critical visual data?

John Heaton, Director at Aspera will discuss Collaboration, Transfer, Storage, Access. With its numerous cloud deployments for media, collaborative research, and government agencies for gathering intelligence data from the field faster and more reliably than previously, what insights can Aspera provide regarding key technologies needed for real-time collaboration, transfer, storage, and access?

In Managing Your Cloud for Security and Compliance, speaker Jeff Reich, Chief Risk Officer, Layered Tech will outline the essential characteristics of cloud computing that can make the cloud dynamic, agile, and responsive. Sometimes, traditional controls become ineffective. Layered Tech shares experiences gained from its cloud offering that provides security with guaranteed compliance. Some essential characteristics had to be interpreted in order to allow important process controls to work.

Sean Jennings, VP, Solutions Architecture, Virtustream will provide guidance on the key question of Private, Hybrid, or Public -- Which Cloud for What Purpose. He will share lessons learned from XStream deployments as a secure high-performance cloud software solution, appliance, and managed service; how uVM technology enables granular accountability and works with existing hardware and virtualization software to run private clouds, manage hybrid clouds, and securely operate public clouds; and how cloud staging and network services have helped overcome obstacles in transitioning assets to the cloud and co-locating non-cloud physical assets and applications.

In Secure Interactive Broadcasting, speaker Greg Parker, CEO, DarpaTV will offer lessons learned from a proprietary digital broadcasting network enabling live, secure, interactive broadcasting with two-way video and audio interaction among program hosts and audience members. Viewers watch and participate in the show (video, voice, text, surveys, polls, etc.), or create their own shows — through real-time cloud-based collaboration. A large multinational firm used DarpaTV for secure interactive CEO and CFO broadcasts to all employees around the world, allowing workers to video-call into the show to ask questions and take part in live polls on certain topics, and also for secure interactive multimedia broadcasts (inserting images, video clips, and audio segments) with its customers and suppliers for product development and production.

Eric Klinker, CEO, BitTorrent will provide valuable insights into What, not Where: What the Cloud Changes Next. The BitTorrent ecosystem is one of the world's largest clouds. BitTorrent was designed as a replacement for HTTP, and already moves more content than all of the websites in the world combined. What are the advantages of a distributed cloud like BitTorrent? And what does this mean for the future of the Internet? Klinker will uncover some of the far-reaching strategic considerations, technical oversight, and infrastructure needs associated with managing sensitive multimedia data in the cloud.

And finally, Prabhat Kumar, Managing Partner, i3m3 Solutions; David Sterling, Partner, i3m3 Solutions; and Vic Winkler, Author, "Securing the Cloud" will discuss Putting It All Together and Where Do We Go from Here in a panel that we will moderate. What were the most salient points and most important takeaways from each of the workshop's earlier sessions? What critical questions remain relating to an IP and security framework? How can IP Multimedia Subsystem (IMS) platforms enhance hybrid cloud and legacy TDM environments? And finally, how can security frameworks best be applied to cloud environments?

We hope to see you at what promises to be an extremely valuable and instructive workshop. Share wisely, and take care.

Deep Dive: ECPA and the Future of Electronic Privacy

Excerpted from EFFector Report

Thursday was a watershed moment in the fight for electronic privacy: the Senate Judiciary Committee overwhelmingly passed an amendment that mandates the government get a probable cause warrant before reading our e-mails. The battle isn't over – the reform, championed by Senator Patrick Leahy (D-VT), still needs to pass the rest of the Senate and the House, and be signed by the President to become law. But Thursday, thanks to thousands of people speaking out, we were able to begin the process of bringing our archaic privacy laws into alignment with modern technology.

It was a big win for us, even if it was only the first step in the process of reforming privacy law to keep the government out of our inboxes. So we're dedicating this EFFector to the battle to reform outdated privacy law: what the government can get, what the law ought to be, and what we're doing to fix the gaping loopholes that leave users vulnerable to government snooping.

The Fourth Amendment protects us from unreasonable government searches and seizures. In practical terms, this means that law enforcement has to get a warrant – demonstrating to a judge that it has probable cause to believe it will find evidence of a crime – in order to search a place or seize an item. In deciding whether the Fourth Amendment applies, courts always look to see whether people have both a subjective expectation of privacy in the place to be searched, and whether society would recognize that expectation of privacy as reasonable. The Supreme Court made this point clear in a landmark 1967 case, Katz v. United States, when it ruled that a warrantless wiretap of a public payphone violated the Fourth Amendment.

In 1979, the Supreme Court created a crack in our Fourth Amendment protections. In Smith v. Maryland, the Court ruled that the Fourth Amendment didn't protect the privacy of the numbers we dialed on our phones because we had voluntarily shared those numbers with the phone company when we dialed them. This principle – known as the Third Party Doctrine – basically suggests that when we share data with a communications service provider like a telephone company or an e-mail provider, we know our data is being handed to someone else and so we can't reasonably expect it to be private anymore.

The government took this small opening created by Smith v. Maryland and blew it wide open. It argued that this narrow 1979 decision about phone dialing applied to the vast amount of data we now share with online service providers – everything from e-mail to cell-phone location records to social media. This is bogus and dangerous. When we hand an e-mail message to Gmail to deliver on our behalf, we do so with an intention that our private communications will be respected and kept in strict confidence, and that no human being or computer will review the message other than the intended recipient. But the government argues that because we handed our communications to a service provider, the Fourth Amendment doesn't require them to get a warrant before snooping around our inbox.

Luckily, the courts are beginning to agree with us. In a leading case where EFF participated as amicus, United States v. Warshak, the Sixth Circuit Court of Appeals agreed with us that people had a reasonable expectation of privacy in their e-mail, even if it is stored with a service provider, and therefore the government needed a search warrant to access it. And in the recent Supreme Court case, United States v. Jones, Justice Sotomayor said that she thought the Third Party Doctrine was outdated, while she and four other Justices – including Justice Alito – raised concerns about the information gathered by our cell-phones.

It's not just the Constitution, however. Congress has made clear that certain forms of data are protected by federal statute as well. Following the Katz decision, Congress passed the Wiretap Act in 1968, supplementing the strong Fourth Amendment privacy protections in phone conversations by enacting a comprehensive set of federal statutes. These statutes were designed to ensure that law enforcement has a compelling reason before intercepting phone calls.

And as electronic communication became more prevalent, Congress passed the Electronic Communications Privacy Act (ECPA) in 1986, which somewhat improved the privacy rights around certain electronic communications. But because it reflects the technology of 1986, ECPA has aged poorly. It doesn't address documents stored in the cloud, information revealing our personal associations, or the vast quantities of location data our mobile devices collect on us every day. And, as a result of loopholes in the law, the Department of Justice, citing ECPA, has argued that it has a right to access e-mails without a warrant as soon as they are 180 days old, or have been opened and left on the server.

We think the 180-day limit and the distinction between opened and unopened e-mail are arbitrary and wrong. As the Washington Post said in an editorial earlier this week, "If you left a letter on your desk for 180 days, you wouldn't imagine that the police could then swoop in and read it without your permission, or a judge's."
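To see just how mechanical that statutory line is, the 180-day rule reduces to simple calendar arithmetic. The sketch below is illustrative only — the function name is invented, and real ECPA analysis turns on more than dates:

```python
from datetime import date, timedelta

# Under the pre-reform DOJ reading of ECPA, a message stored with a
# provider for more than 180 days could be sought without a warrant.
ECPA_WINDOW = timedelta(days=180)

def warrant_required_under_180_day_rule(stored_on, today):
    """Return True while the message is still inside the 180-day window."""
    return (today - stored_on) <= ECPA_WINDOW

# Stored June 6, checked December 2: 179 days elapsed, still protected;
# one day later, under the old reading, it would not be.
print(warrant_required_under_180_day_rule(date(2012, 6, 6), date(2012, 12, 2)))  # prints True
```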

That's why this week's vote was so important: it was a critical first step in updating ECPA to evolve with the modern technologies we use today, and to close archaic loopholes that give government too much access with not enough judicial oversight.

Cloud Works for Government, Says Ovum

Excerpted from ComputerWorld Report by Stephen Bell

Case studies conducted by industry analyst Ovum into cloud computing implementations "illustrate that benefits were greater than expected, while risks and difficulties were lower than typically experienced by traditional ICT projects", says Ovum's Steve Hodgkinson.

Speaking to a Wellington breakfast seminar last week, Hodgkinson related the positive experience of a number of Australian public-sector bodies with the cloud. It's time, he says, that we turn from talking theory and look at the practical lessons learned by some of the early adopters of cloud solutions.

There were passing references to NZ Post's use of Google applications. Hodgkinson describes NZ Post as "a leader in use of Google Apps at large scale".

Auckland University is another early Google Apps adopter worth examining, he says.

However, agencies that have made a success of cloud computing are still reluctant to talk on the record about what they have done in detail. This makes it difficult to use them as the basis of meaningful case studies.

This leads to an unbalanced view of the cloud as "evil, immoral and dangerous"; something that offshores local jobs and is a risk to security and sovereignty, Hodgkinson says.

A lot of cloud developments still "have a kind of serendipitous feel to them," he says, and this encourages a cautious view. A significant exception is the New South Wales Department of Trade and Investment.

"It has gone through a full-on public procurement tender process to select a software-as-a-service provider for an ERP application as the kernel of a new shared-services strategy," says Hodgkinson.

The department is implementing SAP Business ByDesign for finance, human resources, and procurement across nine agencies. Implementation started in July, after a 13-week tender process, and the applications are scheduled to go live in early January.

If that comes to pass as expected, it will be a striking contrast with the typical large-scale government-agency development and "a great example of a fundamental transformation in the whole approach to doing IT in government."

Frequently expressed fears of security shortcomings in cloud-computing use fail to make a realistic comparison with the security of in-house systems, says Hodgkinson. A certain constituency of developers with lucrative government clients are hoping for a major security failure involving the cloud to preserve their seat on the "gravy train", he suggests.

But the comparison of cloud and in-house services goes further than security, he says. It is reflected in general capability to develop well and speedily. In that respect, "my view is that there is an increasing gap between the capability of mature enterprise-grade cloud services and the capability of the average government agency — because of the constraints under which agencies operate," he says.

"It's all about focus," Hodgkinson explains. "Cloud providers are free to choose what goes into their service catalog"; they can concentrate on meeting selected needs that they know they can provide well.

"Agencies are in the business of providing any services government asks for.

"That's not to say all government services can be put into the cloud," he says. "They can't; but some can."

In expanding specialized in-house services to shared services, as many groups of government agencies are, they face the challenge of "re-engineering things that were never designed to be shared".

Outsourced cloud services, on the other hand, are designed from the ground up to be shared.

Cloud Technology Overturns Healthcare IT Assumptions

Excerpted from HealthLeaders Media Report by Scott Mace 

I'm here to say that healthcare should be thankful it has come late to part of the technology party.

Why? Because healthcare doesn't have to play by the so-called rules that existed a few years ago. Healthcare can challenge the assumptions that drove decisions a short while ago and take advantage of cloud computing technology that overturns the conventional wisdom — and price structure — of IT services.

Want an example? Recently, I spoke to Qualsight, a healthcare provider you probably haven't heard of, even though it serves more than 75 million health plan members.

Chicago-based Qualsight launched eight years ago to connect independent ophthalmologists to healthcare plan sponsors to provide their members laser vision correction services. Today, with ophthalmologists operating out of 800 locations, Qualsight can boast of being the nation's largest LASIK services manager.

Surprise number one: The main third-party vendor Qualsight uses to process credit cards for payments is PayPal, the eBay subsidiary.

That's right, PayPal is big business now.

Like others who deal with credit card information, PayPal requires Qualsight to comply with the Payment Card Industry (PCI) standard. With 800 practices, Qualsight could have implemented its own virtual private network (VPN). But instead, Qualsight is using a cloud-based, HIPAA-compliant VPN and database server to securely serve transactions through the cloud. Instead of going with the usual Oracle or MySQL database, Qualsight uses open-source PostgreSQL.
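The article doesn't describe Qualsight's actual configuration, but the pattern it sketches — an open-source PostgreSQL database reached only over an encrypted channel — is typically enforced in the connection string itself. Here is a minimal, hypothetical sketch; the host, database, and credential names are invented:

```python
from urllib.parse import quote, urlencode

def build_pg_dsn(user, password, host, dbname, require_tls=True):
    """Build a PostgreSQL connection URL. sslmode=verify-full makes the
    client refuse unencrypted links and servers whose certificates
    don't match the hostname."""
    params = {"sslmode": "verify-full" if require_tls else "prefer"}
    return (f"postgresql://{quote(user)}:{quote(password, safe='')}"
            f"@{host}/{dbname}?{urlencode(params)}")

# Hypothetical names for illustration only
dsn = build_pg_dsn("app_user", "s3cret!", "db.internal.example", "claims")
```

A standard driver such as psycopg2 accepts a URL in this form; the point of the sketch is simply that the encryption requirement can be pinned in configuration rather than left to convention.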

The key to making all this work, apparently, is to find just the right cloud hosting vendor, which in Qualsight's case is Firehost. "We've been with Firehost for probably a year and a half by now, and I've been very happy," says Carlos Navarro, Manager of IT at Qualsight.

In January 2010, Qualsight was running its own instance of the database from its offices. Such on-premises operation is another assumption of many healthcare providers today.

Then came the hackers.

"Nobody was here in the office," Navarro says. "There was an attempt to hack us from China. We determined that later; there were 15,000 attempts, and they successfully did penetrate. However, no damage was done."

The intrusion made Qualsight consider the possibility of hosting its database elsewhere.

Before the evaluation was complete, fate intervened one more time. A major power outage in Chicago took Qualsight's services offline for six hours.

"We lost a lot of data, and at this point, the company decided we need to select a cloud vendor very quickly," Navarro says.

Firehost's security stood out. Navarro hasn't regretted the decision.

"The applications are shared among 800 practices, and most of the information that they're entering is completely HIPAA," he says. "We're talking about patients. We're talking about patient social security numbers. We're talking about outcomes of surgeries. We're talking information that's very delicate."

The switchover to the cloud was accomplished in a single weekend. "There were some changes that were required on our end, programming changes, just to make it compatible, but we did this over a weekend, so the practices never noticed anything," Navarro says.

Like other providers I've talked to about the cloud, Navarro takes solace in the kind of penetration testing that a cloud provider such as Firehost can perform on a monthly basis — testing that a healthcare provider can hardly claim as a core competency. "This is all part of the service," he says.

The average healthcare executive can be forgiven for forgetting that the software powering today's systems is a patchwork quilt of updates, security fixes, and bug workarounds. The CIOs reading this, however, know all too well that it becomes less practical every day for this cost to be shouldered entirely by your average hospital or healthcare provider. 

Remember this when you're watching IT assumptions from the past decade crash and burn all around us: Every organization that's switched to the cloud seems to have its own version of the hackers-from-China story or the power outage story.

Remember this when you have to hire outside consultants to test your firewall's open ports, and then wonder how long it's been since the last test. Three months? Six months? Would your auditors be happy? If you're not testing often enough, are you meeting the spirit and the letter of HIPAA?
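A periodic spot check of the kind described above can even be automated in a few lines. This is a toy sketch, not a substitute for a professional penetration test, and the host and port list are placeholders:

```python
import socket

def check_ports(host, ports, timeout=0.5):
    """Map each port to True if a TCP connection to host:port succeeds."""
    results = {}
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            # connect_ex returns 0 on success instead of raising an exception
            results[port] = (s.connect_ex((host, port)) == 0)
    return results

if __name__ == "__main__":
    # Placeholder host and ports; point this at your own perimeter only
    print(check_ports("127.0.0.1", [22, 80, 443]))
```

Run from cron, with the output diffed against an expected baseline, a check like this at least answers the "how long since the last test?" question.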

A lot of CIOs tell me they don't like the lack of transparency of cloud services. There's a reason they call it the cloud. What goes on inside there is, well, cloudy.

That doesn't dissuade Navarro. "I am not sure about all the details inside that makes the cloud tick," he says. "We get a report on a daily basis where we can see at any time and go historically back, I think three months or so, any intrusions or attempts of intrusions, which is phenomenal. We can see our backups. We can see reports on our vulnerability tests. We can see basically any information, anything that's eventually protecting our data."

Cloud vendors have to be very, very good at managing and applying all these fixes, or they'll be out of business in a real hurry.

Maybe, in a few years, this wheel will turn again and the pendulum will swing away from hosted applications. But I doubt it. Cloud technology just makes sense. Obviously, your mileage may vary. But as long as the cloud vendors say what they mean and mean what they say, the cloud will proliferate.

Cloud Is Now Mainstream in Banking Sector

Excerpted from Delimiter Report by Renai LeMay

It's finally happened. After years of expressing concern about the privacy risks, regulatory challenges, and technical inadequacies of the new clutch of technologies broadly known as "cloud computing," Australia's financial services sector has embraced the new paradigm wholesale. It's about time.

It wasn't that long ago that the words "cloud computing" represented something of a taboo for Australia's big banks and insurance giants. In the context of stern warnings from regulators like the Australian Prudential Regulation Authority and a strong history of maintaining their own physical IT infrastructure in all areas, this is one sector which didn't jump on the cloud computing paradigm early. While other industries embraced technologies such as Salesforce.com, Google Apps, and Amazon Web Services with aplomb, the land of cloud computing was one which, to Australia's financial services giants, appeared cloaked in storm clouds.

You only need to search any of Australia's major IT media outlets for "cloud" and "banks" to see this kind of pessimism in action. The most recent batch of anti-cloud sentiment was probably reported this time last year, when ANZ Bank Chief Information Officer Anne Weatherston pooh-poohed the cloud phenomenon and said it could be another five years before Australia's banks got serious about the cloud, but the anti-cloud sentiment started as early as 2007, when the Commonwealth Bank evaluated Google Apps and found it wanting.

I've covered enterprise IT in Australia for the best part of a decade now, and during that time I found it hard to imagine that we might see a major Australian bank using a software-as-a-service (SaaS) platform such as Salesforce.com for a mission-critical purpose, for example; let alone platforms such as Amazon Web Services or Google's Cloud Platform which have been the darling of startups almost since they were launched.

But something cracked this year in that steel gate of denial.

This month Amazon Web Services launched a dedicated datacenter in Australia to serve local clients. The foundation customer? None other than the Commonwealth Bank itself; one of the largest consumers of IT products and services in Australia, and one of the most anti-risk.

We now live in a world where the Chief Information Officer of the Commonwealth Bank — a man personally responsible for spending billions of dollars on information technology — stood up in front of the entirety of Australia's IT industry and said that he had had enough of all of the "excuses" which the industry had been holding the cloud back with for the past decade.

Security excuses for not moving to the cloud, regulatory excuses, financial excuses: According to Michael Harte, all of these excuses are "absolute garbage."

And CommBank is reaping the benefits of using Amazon's platform: to the tune, Harte told the AustralianIT this week, of tens of millions of dollars. Hardly pocket change. And you don't have to be an accountant to imagine that the bank is one of Amazon's largest Australian customers right now. Nor is the bank the only major Australian financial services organization to feel this way.

At a media briefing several weeks ago, National Australia Bank (NAB) was singing from the same choirbook. In fact, the bank's Group Executive of Group Business Services, Gavin Slater, and its Executive General Manager of Enterprise Transformation, Adam Bennett, couldn't stop talking about the cloud.

NAB, the pair told media, now only paid for what it actually used in its IT infrastructure outsourcing contract with IBM. But it's not just servers and storage that the bank is consuming from the cloud. It's also telecommunications, where the bank has moved its huge enterprise contract with Telstra to a consumption-based pricing model as well, during its recent network consolidation.

Then there's Oracle's CRM on demand platform, which NAB is deploying to service its business banking platform. The platform is hosted by Oracle in Australia. And like CommBank, NAB is also getting its head around public cloud computing platforms, which it doesn't host customer information on, but it does host marketing material from. Then there's the way the bank has set up its entire private cloud infrastructure along environmental lines in an effort to contribute heavily to its carbon neutrality.

You'll have to stop me there; I've rabbited on about what CommBank and NAB are doing long enough.

But allow me to talk a little about Bank of Queensland, which last month revealed it was deploying Salesforce.com as one of its new customer relationship management platforms for front-line staff, and how the bank recently told its investors that software as a service platforms would allow it to boost its flexibility and cut its capital investment in IT infrastructure.

Or, allow me to talk a little about ANZ, which was already also using Salesforce.com, although Singapore's regulators don't appear too keen about the situation. Or about ING Direct, which has been so enthusiastic about Microsoft's private cloud offerings that it decided to virtualize its entire core banking infrastructure in a so-called "bank in a box" model that allows it to deploy copies of its entire banking platform to developers for testing and development purposes.

Or what about Westpac, which already has its own private cloud and also uses Microsoft's Windows Azure platform, as well as implementing a project to punt its collaboration platform into the cloud. The bank has even publicly discussed the potential to pool some IT infrastructure and applications between banks — for example, in the area of mortgage platforms, where the bank believes there is no competitive advantage in the different platforms operated by the major banks.

To be honest, I could go on like this forever, but I suspect most of you would get bored; unless you work in banking IT infrastructure, in which case you're probably making a note of all of these examples, so that you can demonstrate to your boss that everyone else is doing this kind of stuff and your particular organization should be too.

So how and why did this happen?

Personally I don't know for sure, but I suspect that it's as simple as the fact that Australia's major financial services organizations experimented with various cloud computing technologies over the past several years, and nothing much went wrong, apart from a few early bugs. Certainly I haven't heard of any catastrophes involving mass customer data loss or huge outages.

Buoyed by early wins, I would surmise, Australia's financial services giants further examined the business case for adopting cloud computing technologies in various areas of their businesses and found it so compelling that they couldn't ignore the opportunity. And things progressed from there. Right now, I'm betting that most of Australia's banks and insurers are looking to push everything they can into the cloud. In the absence of any examples of huge problems (the "excuses" that Michael Harte talked about at the Amazon launch) and in the presence of the cloud's compelling business case, there is no reason to do anything else. And what we're seeing right now is the banks gradually going hog wild for the cloud.

The "tens of millions" of dollars in savings figure which Michael Harte threw around this week? That sounds like the very definition of what Gartner, in its legendary hype cycle, would describe as the "Plateau of Productivity". I doubt many would have thought we'd be here this soon; I certainly didn't just a year or so ago.

So what's next? That's the easy part. Government. With Australia's financial services sector jumping on the cloud bandwagon, the next target for cloud computing vendors should be the nation's departments and agencies, which have historically been even more risk-averse than the banks. We're already seeing some early deployments of SaaS and cloud computing technologies in forward-thinking states such as NSW, and I'm sure that virtually everyone in IT management in any government agency in Australia right now is aware that they will need to integrate the cloud into their IT strategy for the next five years — or be left behind.

Looking to Industry for the Next Digital Disruption

Excerpted from NY Times Report by Steve Lohr

When Sharoda Paul finished a postdoctoral fellowship last year at the Palo Alto Research Center, she did what most of her peers do — considered a job at a big Silicon Valley company, in her case, Google. But instead, Ms. Paul, a 31-year-old expert in social computing, went to work for General Electric.

Ms. Paul is one of more than 250 engineers recruited in the last year and a half to GE's new software center here, in the East Bay of San Francisco. The company plans to increase that work force of computer scientists and software developers to 400, and to invest $1 billion in the center by 2015. The buildup is part of GE's big bet on what it calls the "industrial Internet," bringing digital intelligence to the physical world of industry as never before.

The concept of Internet-connected machines that collect data and communicate, often called the "Internet of Things," has been around for years. Information technology companies, too, are pursuing this emerging field. IBM has its "Smarter Planet" projects, while Cisco champions the "Internet of Everything."

But GE's effort, analysts say, shows that Internet-era technology is ready to sweep through the industrial economy much as the consumer Internet has transformed media, communications, and advertising over the last decade.

In recent months, Ms. Paul has donned a hard hat and safety boots to study power plants. She has ridden on a rail locomotive and toured hospital wards. "Here, you get to work with things that touch people in so many ways," she said. "That was a big draw."

GE is the nation's largest industrial company, a producer of aircraft engines, power plant turbines, rail locomotives and medical imaging equipment. It makes the heavy-duty machinery that transports people, heats homes and powers factories, and lets doctors diagnose life-threatening diseases.

GE resides in a different world from the consumer Internet. But the major technologies that animate Google and Facebook are also vital ingredients in the industrial Internet — tools from artificial intelligence, like machine-learning software, and vast streams of new data. In industry, the data flood comes mainly from smaller, more powerful and cheaper sensors on the equipment.

Smarter machines, for example, can alert their human handlers when they will need maintenance, before a breakdown. It is the equivalent of preventive and personalized care for equipment, with less downtime and more output.
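The "alert before breakdown" idea can be sketched in a few lines. The sensor, window size, and threshold factor below are illustrative assumptions for the sketch, not details of GE's systems:

```python
from collections import deque

def maintenance_alerts(readings, window=5, factor=1.5):
    """Yield the index of any reading that exceeds `factor` times the
    mean of the previous `window` readings -- a crude drift detector."""
    recent = deque(maxlen=window)
    for i, value in enumerate(readings):
        if len(recent) == window and value > factor * (sum(recent) / window):
            yield i
        recent.append(value)

# A vibration trace that drifts sharply at index 6:
vibration = [1.0, 1.1, 0.9, 1.0, 1.2, 1.1, 2.4, 1.0]
print(list(maintenance_alerts(vibration)))  # [6]
```

In a production system the baseline would come from fleet-wide historical data rather than a short rolling window, but the shape of the problem is the same: compare live sensor streams against an expected envelope and flag deviations before they become failures.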

"These technologies are really there now, in a way that is practical and economic," said Mark M. Little, GE's Senior Vice President for Global Research.

GE's embrace of the industrial Internet is a long-term strategy. But if its optimism proves justified, the impact could be felt across the economy.

The outlook for technology-led economic growth is a subject of considerable debate. In a recent research paper, Robert J. Gordon, a prominent economist at Northwestern University, argues that the gains from computing and the Internet have petered out in the last eight years.

Since 2000, Mr. Gordon asserts, invention has focused mainly on consumer and communications technologies, including smartphones and tablet computers. Such devices, he writes, are "smaller, smarter and more capable, but do not fundamentally change labor productivity or the standard of living" in the way that electric lighting or the automobile did.

But others say such pessimism misses the next wave of technology. "The reason I think Bob Gordon is wrong is precisely because of the kind of thing GE is doing," said Andrew McAfee, Principal Research Scientist at MIT's Center for Digital Business.

Today, GE is putting sensors on everything, be it a gas turbine or a hospital bed. The mission of the engineers in San Ramon is to design the software for gathering data, and the clever algorithms for sifting through it for cost savings and productivity gains. Across the industries it covers, GE estimates such efficiency opportunities at as much as $150 billion.

Some industrial Internet projects are already under way. First Wind, an owner and operator of 16 wind farms in America, is a GE customer for wind turbines. It has been experimenting with upgrades that add more sensors, controls, and optimization software.

The new sensors measure temperature, wind speeds, location and pitch of the blades. They collect three to five times as much data as the sensors on turbines of a few years ago, said Paul Gaynor, chief executive of First Wind. The data is collected and analyzed by GE software, and the operation of each turbine can be tweaked for efficiency. For example, in very high winds, turbines across an entire farm are routinely shut down to prevent damage from rotating too fast. But more refined measurement of wind speeds might mean only a portion of the turbines need to be shut down. In wintry conditions, turbines can detect when they are icing up, and speed up or change pitch to knock off the ice.
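The selective-curtailment refinement described above can be illustrated with a toy comparison. The 25 m/s cut-out speed and the readings are assumptions for the sketch, not figures from First Wind:

```python
CUT_OUT_MS = 25.0  # assumed cut-out wind speed, meters/second

def farm_wide_curtailment(speeds):
    """Old behavior: stop every turbine if any local reading is too high."""
    if any(s > CUT_OUT_MS for s in speeds):
        return [False] * len(speeds)  # all stopped
    return [True] * len(speeds)

def per_turbine_curtailment(speeds):
    """Refined behavior: stop only the turbines that are actually at risk."""
    return [s <= CUT_OUT_MS for s in speeds]

readings = [18.0, 27.5, 21.0, 26.1, 19.4]  # local wind speed per turbine
print(sum(farm_wide_curtailment(readings)))   # 0 turbines keep running
print(sum(per_turbine_curtailment(readings))) # 3 turbines keep running
```

The refinement is simply moving the shutdown decision from the farm level to the turbine level, which only becomes possible once each turbine reports its own local wind speed.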

Upgrades on 123 turbines on two wind farms have so far delivered a 3 percent increase in energy output, about 120 megawatt hours per turbine a year. That translates to $1.2 million in additional revenue a year from those two farms, Mr. Gaynor said.
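Those figures can be checked directly; the implied power price below is an inference from the reported numbers, not a figure from the article:

```python
turbines = 123
extra_mwh_per_turbine = 120   # reported additional output, MWh/turbine/year
extra_revenue = 1_200_000     # reported additional revenue, USD/year

extra_mwh_total = turbines * extra_mwh_per_turbine  # 14,760 MWh/year
implied_price = extra_revenue / extra_mwh_total     # about $81 per MWh
print(extra_mwh_total, round(implied_price, 2))     # 14760 81.3
```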

"It's not earthshaking, but it is meaningful," he said. "These are real commercial investments for us that make economic sense now."

For the last few years, GE and Mount Sinai Medical Center have been working on a project to optimize the operations of the 1,100-bed hospital in New York. Hospitals, in a sense, are factories of healthcare. The challenge for hospitals, especially as cost pressures tighten, is to treat more patients more efficiently, while improving the quality of care. Technology, said Wayne Keathley, President of Mount Sinai, can play a vital role.

At Mount Sinai, patients get a black plastic wristband with a location sensor and other information. Similar sensors are on beds and medical equipment. An important advantage, Mr. Keathley said, is to be able to see the daily flow of patients, physical assets and treatment as it unfolds.

But he said the real benefit was how the data could be used to automate and streamline operations and then make better decisions. For example, in a typical hospital, getting a patient who shows up in an emergency room into an assigned bed in a hospital ward can take several hours and phone calls.

At Mount Sinai, GE has worked on optimization and modeling software that enables admitting officers to see beds and patient movements throughout the hospital, to help them more efficiently match patients and beds. Beyond that, modeling software is beginning to make predictions about likely patient admission and discharge numbers over the next several hours, based on historical patterns at the hospital and other circumstances — say, in flu season.
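As an illustration only, a toy version of the bed-matching step might look like the following. The wards, bed IDs, and greedy strategy are assumptions for the sketch; the actual software models far more, such as patient movements and predicted discharges:

```python
def assign_beds(patients, open_beds):
    """Greedily give each patient a free bed in the preferred ward,
    falling back to any free bed; returns {patient: bed or None}."""
    free = set(open_beds)            # open_beds maps bed ID -> ward
    assignment = {}
    for name, preferred_ward in patients:
        match = next((b for b in sorted(free)
                      if open_beds[b] == preferred_ward), None)
        if match is None and free:   # no bed free in the preferred ward
            match = min(free)
        if match is not None:
            free.remove(match)
        assignment[name] = match
    return assignment

beds = {"3A-1": "cardiology", "3A-2": "cardiology", "5B-1": "oncology"}
patients = [("patient_1", "oncology"), ("patient_2", "cardiology")]
print(assign_beds(patients, beds))  # {'patient_1': '5B-1', 'patient_2': '3A-1'}
```

A real admitting system replaces the greedy rule with optimization over predicted demand, which is exactly where the modeling software described above comes in.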

The software, which Mount Sinai has been trying out in recent months, acts as an intelligent assistant to admitting officers. "It essentially says, 'Hold off, your instinct is to give this bed to that guy, but there might be a better choice,' " Mr. Keathley explained.

At a hospital like Mount Sinai, GE estimates that the optimization and modeling technologies can translate into roughly 10,000 more patients treated a year, and $120 million in savings and additional revenue over several years.

The origins of GE's industrial Internet strategy date back to meetings at the company's headquarters in Fairfield, CT in May 2009. In the depths of the financial crisis, Jeffrey R. Immelt, GE's chief executive, met with his senior staff to discuss long-term growth opportunities. The industrial Internet, they decided, built on GE's strength in research and could be leveraged across its varied industrial businesses, adding to the company's revenue in services, which reached $42 billion last year.

Now GE is trying to rally support for its vision from industry partners, academics, venture capitalists and start-ups. About 250 of them have been invited to a conference in San Francisco, sponsored by the company, on Thursday.

Mr. Immelt himself becomes involved in recruiting. His message, he says, is that if you want to have an effect on major societal challenges like improving health care, energy and transportation, consider GE.

An early convert was William Ruh, who joined GE from Cisco, to become vice president in charge of the software center in San Ramon. And Mr. Ruh is taking the same message to high-tech recruits like Ms. Paul. "Here, they are working on things they can explain to their parents and grandparents," he said. "It's not a social network," even if the GE projects share some of the same technology.

The Cloud Is Robin Hood: Bridges Gap between Rich & Poor

Excerpted from VentureBeat Report by Christina Farr

Who would have thought that cloud computing would be the modern day equivalent of Robin Hood?

In a report published by the University of San Diego, Unlocking the Benefits of Cloud Computing for Emerging Economies, researchers found countless benefits in increased global access to cheap data storage and processing power. The authors intimated that in the future, cloud computing technologies will be an economic stabilizer.

Cloud-based technologies have experienced explosive growth in recent years, and evidence suggests that they will continue to grow. Gartner has predicted that by 2014, 60 percent of the world's server workloads will take place on virtualized cloud servers.

It's great news for vendors and businesses, but what does this mean for people in countries like India, Mexico, and South Africa?

The researchers make the case that cloud computing is keeping the cost of storing information down, and is making broadband faster as more people can access it. In combination, these two factors will enable people in the low- and middle-income brackets to enter the competitive global economy.

"This growth emerges from the cloud's economic advantages of scale and scope that lower costs, improve speed of service, expand operational flexibility for users and reduce risks in IT deployment," the report explains.

If conditions are reasonable (broadband is sufficient and there is the freedom to operate data centers), there are five major implications for people in the developing world:

1. The cloud enables people to be more competitive in "higher value-added products because goods and services are becoming more information and communications technologies (ICT) intensive." As ICT intensity grows, the cloud enables emerging economies to capture the economic gains.

2. It's vital to ensure that these countries can compete in the "knowledge economy."

3. The cloud can stimulate the growth of small- to medium-sized businesses and improve job creation. It's beneficial for entrepreneurs because it reduces the cost of, and upfront investment in, the necessary IT infrastructure.

4. It enables governments to share and deliver information to citizens more effectively.

5. There is a strong synergy between the growth of the cloud and the build-out of broadband networks.

With an eye to the future, the researchers expect the cloud to improve transparency and create a dialog between governments and citizens. As a final step, they recommend that these governments work with multiple stakeholders to improve access to information and create policies that will enable people in need to benefit from the cloud.

Coming Events of Interest

INTELLIGENCE IN THE CLOUD - December 4th in Washington, DC. This workshop continues the NAB's series of programs developed for military and government professionals to demonstrate how advances in the commercial industries can benefit the military and government sectors. The atmosphere for the workshop is interactive with attendee participation welcome.

Third International Workshop on Knowledge Discovery Using Cloud and Distributed Computing Platforms - December 10th in Brussels, Belgium. Researchers, developers, and practitioners from academia, government, and industry will discuss emerging trends in cloud computing technologies, programming models, data mining, knowledge discovery, and software services.

2013 International CES - January 8th-11th in Las Vegas, NV. With more than four decades of success, the International Consumer Electronics Show (CES) reaches across global markets, connects the industry and enables CE innovations to grow and thrive. The International CES is owned and produced by the Consumer Electronics Association (CEA), the preeminent trade association promoting growth in the $195 billion US consumer electronics industry.

CONTENT IN THE CLOUD at CES - January 9th in Las Vegas, NV. Gain a deeper understanding of the impact of cloud-delivered content on specific segments and industries, including consumers, telecom, media, and CE manufacturers.

2013 Symposium on Cloud and Services Computing - March 14th-15th in Tainan, Taiwan. The goal of SCC 2013 is to bring together researchers, developers, government sectors, and industrial vendors that are interested in cloud and services computing.

NAB Show 2013 - April 4th-11th in Las Vegas, NV. Every industry employs audio and video to communicate, educate and entertain. They all come together at NAB Show for creative inspiration and next-generation technologies to help breathe new life into their content. NAB Show is a must-attend event if you want to future-proof your career and your business.

CLOUD COMPUTING CONFERENCE at NAB - April 8th-9th in Las Vegas, NV. Explore the new ways cloud-based solutions have accomplished better reliability and security for content distribution. From collaboration and post-production to storage, delivery, and analytics, decision makers responsible for accomplishing their content-related missions will find this a must-attend event.

CLOUD COMPUTING EAST 2013 - May 20th-21st in Boston, MA. CCE:2013 will focus on three major sectors, GOVERNMENT, HEALTHCARE, and FINANCIAL SERVICES, whose use of cloud-based technologies is revolutionizing business processes, increasing efficiency and streamlining costs.

Copyright 2008 Distributed Computing Industry Association
This page last updated December 9, 2012