Distributed Computing Industry
Weekly Newsletter


April 8, 2013
Volume XLIII, Issue 5


CLOUD COMPUTING CONFERENCE at NAB This Week

The Distributed Computing Industry Association (DCIA) invites DCINFO readers to attend the CLOUD COMPUTING CONFERENCE at the 2013 NAB Show in the Las Vegas Convention Center, Las Vegas, NV.

The code for $100 off conference registration is EP35. And the code for FREE exhibit-only registration is EP04.

This third annual DCIA event within the NAB Show takes place this Monday and Tuesday, April 8th-9th, in N249 at the Las Vegas Convention Center.

"The opening keynote speaker will be Amazon Web Services' Global Digital Media Business Strategy Leader Mark Ramberg (MON 10:30 AM). Marquee keynotes include Disney's Program Director, Cloud Hosting, Chris Launey (MON 1:45 PM), and IBM's Lead Partner for Global Business Services, Saul Berman (TUE 1:45 PM)," said DCIA CEO Marty Lafferty.

CCC at NAB will feature more than seventy speakers in a two-day event track that will demonstrate the new ways cloud-based solutions are providing increased reliability and security, not only for commercial broadcasting and enterprise applications, but also for military and government implementations.

From collaboration during production, to post-production and formatting, to interim storage, delivery, and playback on fixed and mobile devices, to viewership measurement and big-data analytics, cloud computing is having an enormous impact on high-value multimedia distribution.

Sponsors for this year's CLOUD COMPUTING CONFERENCE include Amazon Web Services, Aspera, DAX, Equinix, and YouSendIt.

US Digital TV Users Soaring

Excerpted from Media Daily News Report by Wayne Friedman

US digital TV users are climbing faster than expected. The number of US digital TV users — those who view at least one TV show per month via the Internet — will climb 37% in four years to 145 million in 2017, from 106 million in 2012. 

This amounts to digital TV user growth climbing at a 6.9% compound annual growth rate — a higher increase than previously forecast in August 2012 by eMarketer. Next year, it says digital TV viewers will cross a critical tipping point — surpassing 50% of the US Internet user population. 

Those users who watch at least one movie per month on any Internet-capable device will climb to 115 million in 2017 from nearly 80 million in 2012, a 9.7% annual growth rate. 

A Belkin and Harris Interactive survey of US Internet users said 12% would consider replacing their cable or satellite subscription with a streaming media subscription, such as Netflix or Hulu Plus, in 2013.

A total of 30% of respondents were inclined to at least consider cord-cutting. Still, another 37% "strongly disagreed" when asked whether they would consider replacing cable and satellite with only digital Internet TV. 

Evidence of growing digital TV/movie usage, says eMarketer, comes from Netflix — which reported US streaming revenues of $2.19 billion for 2012, growing moderately from quarter-to-quarter, with its US rental DVD revenues totaling $1.14 billion and declining each quarter.

Report from CEO Marty Lafferty

The Cloud Computing Association (CCA) and the Distributed Computing Industry Association (DCIA) proudly announce our first wave of speakers and charter sponsors for CLOUD COMPUTING EAST 2013 (CCE:2013) taking place from May 19th through 21st at the Marriott Boston Copley Place in Boston, MA.

Three sectors of the economy that are leading the way in adopting cloud-based IT solutions are government, healthcare, and financial services.

CCE:2013 will focus on these three major sectors, whose use of cloud-based technologies is revolutionizing business processes, increasing efficiency, and streamlining costs.

An opening workshop on Sunday afternoon will orient delegates and provide a fundamental grounding in cloud models, deployment methods, and technology definitions.

Then a welcome reception will offer great networking opportunities for speakers, exhibitors, and audience members.

On Monday morning Stefan Bewley, Director, Altman Vilandrie; Fabian Gordon, CTO, Ignite Technologies; and Cameron Jahn, Product Marketing Manager, ShareFile, will present attendees with an overview of cloud computing adoption in the healthcare, government, and financial services sectors.

Brad Maltz, Chief Cloud Architect, Lumenate; Michelle Munson, President & CEO, Aspera; Matt McSweeney, VP, Sales, AppNeta; and Omar Torres, Director, Cloud Services Architecture and Operations, VeriStor, will then explore the latest trends and newest cloud offerings being introduced in these markets.

Next, Brian Benfer, Director, Healthcare, ShareFile; Joe Foxton, VP, Business Development, MediaSilo; Larry Veino, Director, Solutions Architecture, Presidio; and Larry Freedman, Partner, Edwards Wildman Palmer, will candidly assess current obstacles to adoption.

After a pace-changing workshop entitled "Take the Bore out of Boardroom" by Wes and Amy Peper, and a conference luncheon, attendees will have the opportunity to choose from among three event tracks offering half-a-dozen panel discussions and some two-dozen case studies.

Chris Christy, Healthcare Principal, SAP America, for example, will offer a business-process-as-a-service (BPaaS) case study from the healthcare services sector.

Adam Firestone, Director, Defense and Government Solutions, WSO2, will present a case study on a different deployment model from the public sector.

And Yung Chou, Platform Technology Evangelist, Microsoft, will offer a case study on hybrid clouds in the financial services space.

At day's end, there will be an evening networking reception in the Exhibit Hall.

On Tuesday, we will continue with in-depth examinations of the subject matter — such as data storage considerations for healthcare with David Cerf, EVP, Corporate and Business Development, Crossroads; for government with Marlyn Zelkowitz, Global Director, SAP America; and for financial services with Chris Poelker, VP, Enterprise Solutions, FalconStor.

The hCLOUD (health) track will address the challenges of an enormous, fragmented, and technologically disconnected industry in adopting cloud-based solutions to help it become more efficient, collaborative, and interactively connected.

The gCLOUD (government) track will explore how an explosion in data has created an unprecedented need for redundant and highly secure storage, along with fundamental changes to natural resource management, transportation and utility grid monitoring, public safety, law enforcement, and emergency responsiveness.

The fCLOUD (financial) track will reflect the fact that international financial transactions and currency exchange; domestic banking and insurance services; as well as timely and efficient investment decision-making are all being impacted by cloud computing. "The Cloud" is becoming the most advanced platform for an industry that makes up over one-fourth of our economy.

Platinum sponsors for CCE:2013 include Citrix Systems and ShareFile, and gold sponsors are A10 Networks, Aspera, FalconStor, and VeriStor.

Citrix Systems transforms how businesses and people work and collaborate. With market-leading cloud, collaboration, networking, and virtualization technologies, Citrix makes complex enterprise IT simpler and more accessible for 260,000 organizations. Citrix products touch 75 percent of Internet users every day, and the company partners with more than 10,000 companies in 100 countries.

ShareFile is an enterprise data solution that enables IT to deliver a robust data sharing and sync service to meet the mobility and collaboration needs of users and the data security requirements of enterprises. By making follow-me data a seamless and intuitive part of every user's day, ShareFile enables optimal productivity for today's highly mobile, anywhere, any-device workforce.

A10 Networks is the technology leader in application networking. Aspera is the creator of next-generation transport technologies that move the world's data at maximum speed regardless of file size, transfer distance, and network conditions. FalconStor software redefines data protection, with solutions that transform traditional backup and recovery into next-generation service-oriented offerings. And VeriStor is an end-to-end solutions provider specializing in enterprise data storage, virtual infrastructure, cloud services, migration and technology financing.

Cloud computing is revolutionizing the way virtually every sector of the economy does business, and according to a recent study, spending on cloud-based technologies will grow by over 70% per year, creating over 213,000 new jobs each year for at least the next 5 years.

The gCLOUD, hCLOUD, and fCLOUD conference tracks will share a common exhibit hall, luncheons, networking breaks, and receptions. The exhibit hall will serve as a hub around which participants from industry verticals and multi-disciplinary backgrounds will be able to interact, do business, and exchange ideas.

Investors and strategic-minded value-added resellers (VARs) will not want to miss hearing about the next BIG thing in the CLOUD! Share wisely, and take care.

What Netflix Could Do for Cloud Computing

Excerpted from Information Week Report by John Engates

It's popular to kick Netflix for its outages, for building on a proprietary cloud service or for not streaming enough of its extensive movie library. The company is often depicted as a poster child for cloud lock-in — the prisoner of Amazon.

There's some validity to each of these criticisms. But I think they miss a larger point. IT is rapidly evolving beyond the Amazon-focused "Cloud v1.0," whose shortcomings Joe Masters Emison recently called out in these pages. Thanks to the 165 companies who sponsor OpenStack, and the 860 developers who have contributed to that open-source project, we are well on our way to open standards that will end customer reliance on any one cloud provider's proprietary code. Netflix isn't going to stop the open cloud movement. But it's a shame that it doesn't join it, because it's in a unique position to take a leading role -- for its own benefit, and that of all cloud users.

The truth is that Netflix has been on the cutting edge of cloud innovation right from the beginning. It has some of the planet's smartest and most creative developers and systems architects working on a really big problem. And they're coming up with amazing solutions. The team there is probably at the pinnacle of engineering prowess when it comes to building a complex business at scale on a public cloud.

You don't have to take my word for it. Netflix has open-sourced many of the tools it created and you can see for yourself how hard they've worked to make a workable service on a closed platform. It's a gift to the community of cloud users and something that can help accelerate cloud adoption. I'm excited to see the results of its recently announced developer challenge.

But it could be doing much more. It could be making all of public cloud computing stronger, more stable and better able to perform at scale. Instead, it has limited the scope of its work to just adding tools to make a single, proprietary cloud run a little less erratically.

So I take exception when I read that "Netflix Is Ruining Cloud Computing" by creating tools to help people better use a major cloud service. It's not ruining cloud; it's just not contributing at the elemental level. It's adding intricate workarounds instead of rolling up its sleeves and fixing the underlying problems.

A perfect example of this is the Simian Army. This is a set of tools that help developers wrap their heads around the idea that their instances could fail at any time. When you can't see what's driving your infrastructure, you just have to accept regular instance failure. It's just a fact of life for some public cloud users (including many smaller ones who lack Netflix-scale resources to architect apps across multiple "availability zones" and geographies.)
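To make the failure-injection idea concrete, here is a minimal chaos-monkey-style sketch in Python. It is only an illustration under assumptions, not Netflix's actual Simian Army code (which is Java): it assumes the boto3 AWS library with configured credentials, and a hypothetical "env: chaos-test" tag marking instances that are fair game to terminate.

import random
import boto3

# Find running instances that have opted in to chaos testing.
ec2 = boto3.client("ec2", region_name="us-east-1")
reservations = ec2.describe_instances(
    Filters=[
        {"Name": "instance-state-name", "Values": ["running"]},
        {"Name": "tag:env", "Values": ["chaos-test"]},
    ]
)["Reservations"]
instances = [i["InstanceId"] for r in reservations for i in r["Instances"]]

# Terminate one at random; a resilient application should ride through this.
if instances:
    victim = random.choice(instances)
    print("Terminating", victim, "to exercise failure handling")
    ec2.terminate_instances(InstanceIds=[victim])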

Of course cloud outages don't have to be something to engineer around. Power users such as Netflix could actually work directly on the code that runs the cloud and make it more stable and scalable for themselves and other users. That's the beauty of OpenStack. Companies and individuals can improve the underlying architecture of cloud computing instead of laboring to create the ever-more complex workarounds and band-aids that are required to use a closed cloud.

That's what bums me out about the Netflix challenge. I'd love to see that creative energy applied to actually making better cloud code. Anyone can find the weaknesses in an open-source cloud operating system and improve on them. Just fix it. Then everyone who uses an open cloud can benefit from those insights, innovations and improvements.

Netflix isn't ruining cloud computing. It's pushing the boundaries of what's possible in the closed system it's committed itself to. But I'd like to see it doing more. Creating open-source tools on a closed cloud isn't enough.

We at Rackspace believe an open cloud takes us all further. That's why we open-sourced the cloud by co-founding OpenStack, in collaboration with NASA. That's why we contributed millions of dollars worth of our code to the project, then turned OpenStack over to an independent foundation. We're delighted that OpenStack now has more than 8,200 registered members in more than 100 countries, and big corporate sponsors such as Dell, HP, Red Hat and IBM (which recently announced that all of its new cloud hardware and software will be based on OpenStack.)

The cloud is still a new technology. It needs to be improved at the elemental level, not bettered by a series of bolt-ons. The open cloud will be made more reliable by the wisdom of crowds -- by the sheer number of developers, from many companies and cultures, working to make it better each day. It's a pity that Netflix has isolated itself. But the rest of us will work together to take this thing as far as we can.

The Big Bet at Intel That Could Change TV

Excerpted from Variety Report by Andrew Wallenstein

Employees at Intel Corp. are free to roam almost anywhere across the vast Santa Clara, Calif., campus they call home. Certain laboratories are off-limits of course; that's understandable when the assembly of microprocessors could be contaminated by a human eyelash. But there is one other building where a standard-issue Intel ID isn't going to get you past the guards inside the 80,000-square-foot space, one of the oldest edifices at the company's headquarters. Only 300 of the more than 100,000 people who work for Intel have clearance. And they've been instructed not to talk.

If there's anything conspicuously different about the structure, it's the massive 32-foot satellite dish that sits atop an adjoining parking garage. Dishes that size are typically found on the premises of a breed of company nothing like Intel: pay-TV distributors who need the kind of equipment that can sweep the entire hemisphere to acquire all national TV signals. Without this kind of hardware, it would take 48 separate satellites to accomplish the same task. The Intel of old would have no use for a tool like this one.

For Erik Huggers, head of the division housed inside, Intel Media, having the only outward clue of his work to date be visible to his Silicon Valley neighbors has a deeper meaning, one he hopes will soon resonate among U.S. consumers.

"When you think Intel, you think something inside, a component of something sitting in your computing device," he said. "I think there's an opportunity for the company for the first time to have the perception of 'Intel outside.' If we succeed, I do think it will change the perception of Intel dramatically."

Huggers has walled off Intel Media from the rest of the company for most of the past two years because he believes achieving his goal requires developing a start-up culture separate from the one the chip manufacturer has cultivated for itself. There's also a more practical concern: He doesn't want anyone to get a glimpse of the product Intel Media is solely devoted to creating until it is ready.

But full disclosure became an inevitability a little over a year ago when the first press leaks began to shed light on just what Intel was developing. By the end of 2012, in the run-up to the Consumer Electronics Show, the leaks became a stream so steady that the company was essentially forced to confirm what anyone who cared by then pretty much already knew: Intel intended to invade the pay-TV business with a nationwide multichannel TV service of its own.

The move could be deemed either inspired or insane, depending on your point of view. It's a direct challenge to leading pay-TV providers like cable operator Comcast and satcaster DirecTV, which may often be characterized as vulnerable but aren't actually showing signs of weakness right now. Intel is making a headlong leap into a business where Google has barely stuck a toe in the form of Google Fiber, which has rolled out its own broadband network in two Midwest cities. And though everyone expects Apple to rock this competitive field with some sort of solution of its own, what it is doing exactly and when it will arrive is completely undefined.

The notion that a Silicon Valley hardware stalwart would make a play for the living room seemed so preposterous it was still a little shocking to see Huggers go public in March with additional details about his plans at an industry conference. Most salient was his vow that Intel would start selling a device that would deliver video to TVs via broadband by the end of the year. He'd gone farther than any other tech giant rumored to be considering similar services, from AT&T to Sony: actually committing to coming to market.

What made the move all the more unlikely is that Intel isn't the innovative force it was when the company first came on the scene in 1968. Today it's seen as still sturdy but kinda stodgy. Consumers have a vague awareness of the brand left over from the days before its fortunes faded right alongside the company with which it is most associated: Microsoft, which used Intel's chips to power so many of the machines operating Windows.

The bigger knock against Intel is its terrible track record when it comes to diversifying beyond its comfort zone of creating chips for PCs and servers.

And yet the maturation of that business makes branching out a must. "I do believe there's a big-picture plan at Intel to avoid missing the next big shift," said Doug Freedman, an analyst who covers Intel for RBC Capital Markets.

And while sources at several leading conglomerates confirm that Intel is in discussions with them to secure content, no deals have been announced (though Bloomberg News reported March 26 that the company has gotten closer with several major cable groups). To make those deals happen, Huggers has enlisted some media-biz insiders with impeccable credentials to consult on the project, including attorney Ken Ziffren, former MTV Networks affiliate sales chief Nicole Browning and TV programming exec Garth Ancier. But the absence of carriage contracts isn't a confidence-builder. "Intel has nothing if they don't get the programmers on board," noted Richard Greenfield, who also analyzes the tech sector for BTIG Research.

Bottom line: If Intel is going to launch a product with a breadth of content on par with the cable, satellite and telco giants with which it hopes to compete, it could cost billions of dollars in carriage fees. Huggers declined to specify how much the company, which has a market capitalization of approximately $100 billion, is spending on the project, but made clear this is not some skunkworks lark. "We're ambitious," he said. "We wouldn't be investing the way we're investing if we thought we were going to be a niche player."

Huggers understands how improbable Intel's foray is. But that's exactly the challenge that enticed him away from his previous post leading digital media at the BBC — another crusty monolith that didn't seem too well-poised for a technological revolution until it happened on his watch. He's eager to do it again in a bid to redefine what Intel is.

"We know this is very left field, we know this is crazy, but as a company we understand we must become more user-experience-driven," he admitted.

Here's a precis of what little is known at this point: Intel intends to allow subscribers to purchase a package of broadcast and cable channels that will be supplemented by various VOD options. The package will also be available across mobile devices. Please click here for the full report.

Cloud: It's Time to Just Do It — Ready or Not

Excerpted from InfoWorld Report by David Linthicum

It's speaking season, so I'm at a conference each week through mid-May. As always, I'm looking for what's hot or trendy in cloud computing right now, and trends point to a new acronym: JDID (just do it, dummy).

The chatter is not about what cloud computing is or what new concepts vendors are trying to push. The theme now is how to get this stuff working in the enterprise and making money for the business. The C-level executives are moving past the studies and strategies, and they want real results for their money.

As a result, those in IT charged with creating and implementing cloud computing strategies are almost in a panic. They are tasked with getting something running, no matter if it's a small private storage cloud, a few instances on Rackspace, or an application migrated to Azure. It's all about the doing, but in the JDID context, a few common issues are popping up and making it hard to deliver:

This is new stuff, so it's difficult to find people with experience. Enterprises are working their way through their first projects without the experience and talent typically required. That will result in lots of mistakes and a few failures.

The technology is showing its age -- meaning it's too young. For example, many organizations using OpenStack distributions are working through some of the limitations the standard imposes on OpenStack products, due to its early state of maturity.

The technology solutions are much more complex than we originally expected. Most private clouds are made up of four to six different technologies, covering usage monitoring, security, and management, so system integration and a good amount of testing are required.

This JDID trend is only beginning. Over the next few years, we'll see how well cloud computing actually meets the needs of the business. The fact is, it will follow the same patterns as the adoption of other platforms over the years, including the discovery that there is no magic. At the end of the day, it's just software.

Mind-Bending World of Cloud-on-Cloud Computing

Excerpted from Wired Magazine Report by Cade Metz

John Engates is the Chief Technology Officer at Rackspace, and even he had trouble wrapping his mind around the way his company runs its most important technologies.

Rackspace, you see, runs its cloud software on its cloud software.

At first blush, this seems like some sort of cruel joke. Cloud software isn't the easiest concept to grasp — even when you're running the stuff all by itself. And it doesn't help that the world's PR departments have co-opted the cloud moniker and applied it to, well, just about everything.

But in the end, Rackspace's cloud-on-cloud arrangement makes good sense. This sort of multilayered setup is rather common in the world of computer science, and if you take the time to think through what Rackspace is doing, it may even help you grasp the very real but often elusive ideas that underpin cloud computing — ideas that are remaking the way the world runs software and stores data.

Headquartered in San Antonio, TX, Rackspace is one of Amazon's chief rivals in the cloud computing game. Much like Amazon, Google and Microsoft, it operates a set of web services that give you instant access to computing power, letting you run sweeping software applications and store massive amounts of data without setting up your own fleet of machines. But the company is also one of the driving forces behind a movement to change the way these cloud services are built.

Little more than two years ago, Rackspace teamed with NASA to create an open source software project called OpenStack. The aim was to bootstrap what you might call a Linux for cloud computing — free software anyone could use to create their own cloud services — and the project has taken off with a speed few expected. It's now backed by everyone from Red Hat to HP, IBM, and Cisco.

But Rackspace is still the project's flagship. It now runs OpenStack in the massive data centers that underpin its own services — and then some. As Engates explains, the company prefers to run OpenStack on top of OpenStack.

"We're running our cloud inside of a cloud," Engates says. "All of the control nodes that are necessary to serve the customers on our cloud are running in another OpenStack cloud."

He chuckles, acknowledging just how odd this sounds. But then he says — quite plainly — that the arrangement is only natural. "It's a better way of doing things," he says. "We're merely eating our own dog food. The application we're offering to customers is running on that same application."

OpenStack is a way of pooling resources from a vast collection of machines, including processing power and storage space. Rather than running your software application on a particular server, you run it on OpenStack, a platform that spans hundreds of servers, and this platform can grab you as much processing power as you need, whenever you need it (at least in theory).

This makes it easier to launch applications, but it also makes it easier to expand or "scale" them — to reach more users with more servers. And when a server fails, the platform is smart enough to move the machine's work to a new one.
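To make that concrete, here is a minimal sketch of asking an OpenStack cloud for capacity using the openstacksdk Python library; the image, flavor, and network names are hypothetical, and "mycloud" refers to an entry in a local clouds.yaml. The point is that the caller never names a physical server, only the resources it wants.

import openstack

conn = openstack.connect(cloud="mycloud")  # credentials come from clouds.yaml

# Look up the pieces of the request by name (all names here are hypothetical).
image = conn.compute.find_image("ubuntu-server")
flavor = conn.compute.find_flavor("m1.small")
network = conn.network.find_network("private")

# The platform, not the caller, decides which physical host runs this instance.
server = conn.compute.create_server(
    name="web-1",
    image_id=image.id,
    flavor_id=flavor.id,
    networks=[{"uuid": network.id}],
)
server = conn.compute.wait_for_server(server)
print(server.status)  # "ACTIVE" once the instance is scheduled and booted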

But OpenStack is itself a software application. It too can run on OpenStack, and there's good reason for it to do so.

If OpenStack runs on OpenStack, Rackspace can more easily deploy it and expand it and update it. The setup isn't that far removed from using Amazon Web Services to run some other cloud service — something that happens all the time. Salesforce.com's cloud service, Heroku, for instance, runs atop Amazon.

This is just the way software works. You use software to create software, and sometimes, you use software to create itself.

"The art that software developers have perfected more than any other art is the art of decomposition. They create these blocks — like LEGO building blocks — that you can build incredibly complex things out of, including the building blocks themselves," says Chandler Carruth, a Google engineer who helps design the developer tools the company uses to build its many software applications. "Of course you would run OpenStack on OpenStack. I'm sure, the first time someone did it, it seemed like such a new idea. But it just makes sense."

In similar fashion, Carruth and Google use those developer tools to, well, build those developer tools.

Rackspace isn't the only one backing the idea of running OpenStack atop OpenStack. Monty Taylor, a former Racker who now works on OpenStack at HP, is overseeing an effort to transform this idea into an open source project. It's called TripleO, for OpenStack on OpenStack.

Taylor believes this is only the beginning of a new way of thinking about computing power. OpenStack is designed so that a central server — or controller — can configure all the other machines on a network to behave as a whole. Nowadays, you start by loading software on this central server and plugging it into your network. But soon, Taylor says, you'll be able to do this with a notebook or some other handheld device.

"I'll be able to hand you a thing to run on your laptop, and you can plug it in to the network at the data center, bootstrap the whole thing, and then run all of your bare-metal physical infrastructure like it was a cloud," he says. "Then, of course, one of the things you can do with that is run a cloud on it."

The Nitty-Gritty of Mobile Cloud Computing

Excerpted from EzineMark Report by David Parker

Instead of developing mobile apps for a single rigid platform, cloud computing has made developing applications much more flexible. Cloud computing encompasses on-demand availability of storage, software, and processing power. It offers device independence, reliability, ease of access, security, and minimal required maintenance. Cloud computing services have become easy to reach for all enterprises that deliver compatible and resilient services to employees and customers with enhanced business agility.

Bringing cloud computing services to the mobile environment is termed mobile cloud computing. It consolidates the elements of cloud computing and mobile networks, providing excellent services to mobile users. In mobile cloud computing, a powerful mobile configuration is not required, since all the complicated computing modules are processed in the cloud itself.

It is the amalgamation of cloud computing and mobile networks, offering optimal applicability for mobile users. Rather than being kept on individual devices, data is kept on the Internet, where cloud computing provides on-demand access.

One of the key issues most cloud providers are considering is securing user protection and the integrity of application data. Because mobile cloud computing combines cloud computing and mobile networks, its security concerns fall into two categories: mobile network users' security and cloud security.

Mobile network users' security — mobile handsets such as smart-phones, laptops, PDAs, and cellular phones are exposed to numerous security instabilities and vulnerabilities, such as malicious code.

Some cloud computing applications can lead to security breaches on these devices, ultimately troubling the user.

The best way of catching security vulnerabilities is to run security software and anti-virus programs on mobile devices. But mobile devices are saddled with processing limitations, which is why securing them from these threats is harder than it is for regular computers. A number of approaches can be developed that transfer the vulnerability detection and protection mechanisms to the cloud. An application should be handed over to the user only after it passes some level of threat evaluation, and all files need to be verified as non-malicious before being sent to the user.
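One simple form such a cloud-side check could take (purely an illustrative sketch, not any particular vendor's service) is comparing a file's SHA-256 digest against a blocklist of known-malicious hashes before the file is released to the device.

import hashlib

# Hypothetical digests of files already flagged as malware.
KNOWN_MALICIOUS = {"<digest-of-known-bad-apk>", "<digest-of-known-bad-doc>"}

def sha256_of(path):
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def safe_to_deliver(path):
    # Release the file to the mobile device only if it passes the check.
    return sha256_of(path) not in KNOWN_MALICIOUS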

Privacy issues arise when personal information, such as the user's current location or other private data, is revealed.

Information secured on the cloud — enterprises and individuals alike reap the bonanza of storing large amounts of data in the cloud.

The integrity of the information stored on the cloud must be properly ensured by the user, and every user access must be authenticated. Different approaches can be proposed to ensure the integrity of the information stored on the cloud.
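A minimal sketch of one such approach, assuming nothing beyond Python's standard library and not any specific vendor's method: the client keeps a keyed HMAC-SHA-256 tag for each object, computed with a secret the cloud never sees, and re-verifies the tag on every download.

import hmac
import hashlib

def tag(data, user_key):
    # Integrity tag computed on the user's device before upload.
    return hmac.new(user_key, data, hashlib.sha256).hexdigest()

def verify(data, user_key, stored_tag):
    # Constant-time comparison on download; fails if the cloud copy changed.
    return hmac.compare_digest(tag(data, user_key), stored_tag)

key = b"per-user secret kept off the cloud"   # hypothetical key handling
original = b"contents of patient-record.pdf"
t = tag(original, key)                        # stored alongside the object
assert verify(original, key, t)               # untampered data passes
assert not verify(original + b"x", key, t)    # any modification is detected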

Numerous authorization mechanisms have been proposed for cloud computing in order to secure data access. Some make use of open standards and support the integration of various authorization methods.

Android to Join the Search for Black Holes

Excerpted from Android Authority Report by Robert Triggs

Some of you may have heard of distributed computing projects like Folding@Home or SETI@Home; if not, let me explain. Distributed computing essentially involves performing a computationally intensive task across multiple computers in order to achieve a common goal. @Home networks simply allow users all over the world to contribute to various computing projects by using their spare PC processing power.

It's pretty clever, but this isn't a new idea; SETI@Home started all the way back in the '90s, and currently at least 400,000 machines are collaborating over various networks to solve problems ranging from looking for aliens to studying Parkinson's disease.

OK, so where does all this fit in with Android, black holes, and other crazy space phenomena? Well, in a recent interview with Wired, Professor David Anderson, the computer scientist behind the original SETI@Home project, talked about the continued effort to bring distributed computing to Android, including a relatively new project called Einstein@Home.

For the last six months, Anderson and his team have been building and updating their BOINC software, which allows for distributed computing on Android smart-phones and tablets. There's actually an app already available in the Google Play Store. The move to Android has only recently been possible thanks to the increasingly powerful hardware used in the latest smart-phones; as a result, the BOINC app is now specifically designed to take advantage of ARM chip-sets.

He also mentioned that the Einstein@Home computing network would be supported in the next few months, which will allow users to assist in the discovery of pulsars, black holes, and gravitational waves. It's pretty awe-inspiring to potentially be a part of discovering some of the most mysterious and elusive aspects of our universe, and you won't even need to break out a calculator or telescope.

The upcoming updates will also hook Android users into other projects running on IBM's World Community Grid, which is looking to help tackle malaria through drug research.

Also, you needn't worry that computing all this stuff will drain your battery or totally take over your device. BOINC will only start data-crunching when your smartphone is charging and connected to a WiFi network, and even then it makes sure not to interfere with other CPU-intensive tasks or overheat your device.

If you're interested in becoming involved in the search for black holes or want to dedicate some of your computer's spare processing power to tackling cancer, then check out BOINC for Android or similar projects like Folding@Home. After all, it's for science.

DaaS, MaaS & DRaaS: Next Phase of Cloud Computing

Excerpted from ReadWriteCloud Report by Scott Geng

It's no secret that the public cloud market has been growing like gangbusters. In fact, a recent Gartner study found spending on public cloud services is growing at more than 28% per year and private cloud spending is three times that of public cloud. That projects total cloud spending in 2016 to hit $240 billion.

Cloud computing (both public and private) will pave the way forward for how companies will deploy new IT services. Lower price points will help those organizations innovate faster, launch new services more quickly, be more responsive to market conditions and evolve their own business models.

The focus in the industry over the past few years has been on the core cloud management services of SaaS, PaaS and IaaS. But to truly understand how cloud computing is evolving you have to dive deep below the surface. Two major developments are driving the evolution of cloud: Management and Specialization.

In the management space, innovations like self-service portals have given IT shops and end-users a much-preferred way to request and consume services.

Specialization, meanwhile, is a natural development of any market. A few of the specialized services that will contribute significantly to the adoption of cloud based products and services in 2013 include Desktop-as-a-Service (DaaS), Metal-as-a-Service (MaaS) and DisasterRecovery-as-a-Service (DRaaS).

Desktop management is a fundamental service for IT organizations. It's critical for keeping the employees of a company productive. But there have been long-standing challenges with managing the traditional desktop. The investment in desktop hardware can be a significant capital expense, especially for large organizations, and day-to-day management of these devices can be a huge time sink.

DaaS solutions are secure, cost-effective, easy-to-use and portable — you can get the same desktop on any device.

According to the 451 Research Group, "Interest in third-party DaaS is at a fever pitch." IT consumerization, BYOD (Bring Your Own Device) initiatives, increase in mobile workers, Windows 7 migrations and security/IP concerns are driving organizations to reevaluate their desktop strategy.

Virtual Desktop Infrastructure (VDI) was supposed to address many of these challenges, but it came with its own set of issues. While it has been promoted as a technology that can save businesses money, large upfront capital expenses and complexity have created barriers to virtual desktop adoption.

With DaaS, savings come from operational expense reductions from centralizing and reducing administration and hardware savings over time. DaaS delivers faster desktop deployment, enhanced security, less downtime and lower support costs - and can enable a truly mobile workforce.

MaaS - the dynamic provisioning and deployment of whole physical servers, as opposed to the provisioning of virtual machines - is a drastically underrated cloud service. MaaS services will finally open the floodgates to allow any application to be run in the cloud — any application with any service level. That means multi-tiered apps with a backend Oracle database, home grown, performance-intensive applications, low latency trading applications, etc.

It's been hard for people to pay attention to MaaS, mostly because server virtualization has been "the shiny new toy" over the past few years and frankly MaaS is not an easy thing to provide. But that may change once IT administrators see the speed, scalability, agility and simplicity with which they can deploy and protect their underlying server infrastructure.

The statistics are clear — a large percentage of servers have been virtualized in the enterprise (40% - 50% now and heading to 60% - 70%). However, there are still a large number of applications that remain running on bare metal. That important (and underappreciated) fact means that MaaS could be a key ingredient to driving more widespread adoption of cloud technology.

Over the past few years, IT departments have had to live in a culture of cost reduction — it's just been the way of life. That culture has resulted in aging equipment, overworked staff and lots of cut corners - a perfect recipe for higher failure rates. The fact is that hardware failure and human error are still the leading causes of unplanned outages - but devastating storms and other catastrophes are also forcing businesses to get serious about geographic disaster recovery planning. Some estimates put 2011 weather related disaster costs at almost $150 billion worldwide, up 25% from 2010. And that is just weather, and doesn't include the earthquake and tsunami in Japan.

Another strong driver for disaster recovery is public cloud outages. The public cloud companies are under intense scrutiny - every major outage is noticed and publicized. The statistics show that public cloud outages are on the rise year-over-year, and because so many businesses use these services, public-cloud service failures are felt very broadly — e.g. the Amazon outage that impacted Netflix.

One of the forces driving the next phase of cloud computing adoption is the delivery of specialized services like DaaS, MaaS and DRaaS. These services will help improve the service level of cloud resources, boost efficiency and automation and deepen the consumerization of IT resources. They will also give companies more confidence in placing business-critical applications into cloud infrastructures.

Amazon & Google Cut Cloud Computing Prices

Excerpted from The Motley Fool Report by Evan Niu

E-tail giant Amazon has now announced that it is reducing prices on cloud computing services in its Amazon Web Services division. Instances of Microsoft Windows running on its Elastic Compute Cloud, or EC2, service will receive price cuts of up to 26% as Amazon continues to reduce costs and pass those savings on to customers.

Amazon has a long history of reducing AWS pricing. In its last earnings release, the company noted that AWS has lowered prices 24 times since launching in 2006, with 10 price reductions in 2012 alone.

At the same time, rival Google similarly said it would lower prices for its Google Compute Engine, nine months after launching the service. Google is dropping prices by 4% across all Compute Engine pricing. Additionally, the search giant has expanded availability and added new features.

It's incredible to think just how much of our digital and technological lives are almost entirely shaped and molded by just a handful of companies. Find out "Who Will Win the War Between the 5 Biggest Tech Stocks?" in The Motley Fool's latest free report, which details the knock-down, drag-out battle being waged by the five kings of tech. Click here to keep reading.

Cloud's Next Era Near, Cisco Says

Excerpted from Information Week Report by Charles Babcock

The reorganization of computing into larger, more demand-responsive cloud-based data centers run by Google, Amazon Web Services, Rackspace and others is part of a shift in business that replaces transaction systems with "systems of interactions," said Cisco Systems VP of cloud computing Lew Tucker on Wednesday in an address at the Cloud Connect 2013 conference, a UBM Tech event in Santa Clara, CA.

The transaction systems were systems of record. The interaction systems are "systems of engagement" that will be key to business success in the future, Tucker said, crediting Geoffrey Moore, author of "Crossing the Chasm," with coining the "systems of engagement" phrase.

Tucker gave one of the opening keynotes at the Cloud Connect 2013 show, and said there were deeper trends behind mammoth data centers like Facebook's Prineville, Ore., complex and consumers' love of smart-phones, iPads and other handheld devices. The small computing device communicates from many locations with the big data center, using a small application to get a piece of work done, he noted.

"I don't think we'll see any more big productivity suites, like Microsoft Office," he said. Instead, users will learn a constantly changing mix of small apps that do the things they're most interested in doing now. It's all part of corporations trying to become more responsive and interactive with their environment -- to behave "less like organizations, more like organisms," Tucker said.

"Analytics becomes business critical" because data is being generated by the Internet of things, the billions of devices about to be connected to the Internet and feed data into it, said Tucker. By 2020, there will be 50 billion connected devices, and businesses will rely on the information they provide to help them map what they should be doing next, he said.

The billions of connected devices drive a need for cloud storage and cloud analytics; the creation of big data drives business decision-making and businesses' need to keep employees in constant collaboration and communication, driving a need for a new style of internal networking: the software-defined network that responds more flexibly to changing conditions, he said.

The changes are not only driving cloud computing but changing the role of IT in business. The "systems of interaction" both hold the key to the future for a business and also contain the value of IT to the business, he said.

Space Monkey Shows Off P2P Storage System

Excerpted from TechCrunch Report by Anthony Ha

We've written about Google Ventures-backed Space Monkey before, but last week we actually got to film the peer-to-peer (P2P) storage service in action.

Co-founder Clint Gordon-Carroll described the technology as "our way of disrupting the cloud" — you store your data on your own Space Monkey device, but it's then encrypted and backed up on other devices across the company's user network.
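The article doesn't detail Space Monkey's actual scheme, but a minimal sketch of the general pattern it describes (encrypt locally, then split the ciphertext into chunks for backup on peer devices) might look like this in Python, assuming the third-party cryptography library.

from cryptography.fernet import Fernet

def prepare_for_peers(plaintext, chunk_size=1 << 20):
    # The key never leaves the owner's device, so peers hold only ciphertext.
    key = Fernet.generate_key()
    ciphertext = Fernet(key).encrypt(plaintext)
    chunks = [ciphertext[i:i + chunk_size]
              for i in range(0, len(ciphertext), chunk_size)]
    return key, chunks  # chunks would then be replicated to peer devices

key, chunks = prepare_for_peers(b"contents of family-photos.tar")

A production system would likely also add erasure coding so that the data can be rebuilt from a subset of peers, but that is an assumption beyond what the article describes.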

The goal is to give you the advantages of cloud storage (backup, sharing, and accessibility from any device) at faster speeds and lower costs (a basic subscription costs $10 a month and includes a terabyte of storage).

You can see Gordon-Carroll walk through the Space Monkey interface towards the end of the video above. He browses the folders he's stored on Space Monkey, then starts playing music without any noticeable delay. He also shows off the pinning feature, where certain files are also stored locally on your computer for offline access.

I asked Gordon-Carroll and his co-founder Alen Peacock about their target user. They said they're looking for people with lots of data.

"It's a generic market, but we see a trend, and that trend is the amount of data people are generating or creating over time," Gordon-Carroll said.

As for what's next, they said that they've got most of the technology and manufacturing in place. In the next few weeks, they plan to launch a Kickstarter campaign to fund their efforts to actually bring Space Monkey to market.

Coming Events of Interest

2013 NAB Show - April 4th-11th in Las Vegas, NV. Every industry employs audio and video to communicate, educate and entertain. They all come together at NAB Show for creative inspiration and next-generation technologies to help breathe new life into their content. NAB Show is a must-attend event if you want to future-proof your career and your business.

CLOUD COMPUTING CONFERENCE at NAB Show - April 8th-9th in Las Vegas, NV. Demonstrates the new ways cloud-based solutions have achieved better reliability and security for content distribution. From collaboration and post-production to storage, delivery, and analytics, decision makers responsible for accomplishing their content-related missions will find this a must-attend event.

Digital Hollywood Spring - April 29th-May 2nd in Marina Del Rey, CA. The premier entertainment and technology conference. The conference where everything you do, everything you say, everything you see means business.

CLOUD COMPUTING EAST 2013 - May 19th-21st in Boston, MA. CCE:2013 will focus on three major sectors, GOVERNMENT, HEALTHCARE, and FINANCIAL SERVICES, whose use of cloud-based technologies is revolutionizing business processes, increasing efficiency and streamlining costs.

P2P 2013: IEEE International Conference on Peer-to-Peer Computing - September 9th-11th in Trento, Italy. The IEEE P2P Conference is a forum to present and discuss all aspects of mostly decentralized, large-scale distributed systems and applications. This forum furthers the state-of-the-art in the design and analysis of large-scale distributed applications and systems.

CLOUD COMPUTING WEST 2013 — October 27th-29th in Las Vegas, NV. Three conference tracks will zero in on the latest advances in applying cloud-based solutions to all aspects of high-value entertainment content production, storage, and delivery; the impact of cloud services on broadband network management and economics; and evaluating and investing in cloud computing services providers.

Copyright 2008 Distributed Computing Industry Association
This page last updated April 18, 2013
Privacy Policy