Distributed Computing Industry
Weekly Newsletter

In This Issue

Partners & Sponsors

A10 Networks

Aspera

Citrix

Oracle

Savvis

SoftServe

TransLattice

Vasco

Cloud News

CloudCoverTV

P2P Safety

Clouderati

eCLOUD

fCLOUD

gCLOUD

hCLOUD

mCLOUD

Industry News

Data Bank

Techno Features

Anti-Piracy

November 11, 2013
Volume XLV, Issue 12


GOVERNMENT VIDEO IN THE CLOUD in WDC on 12/4

The DCIA will present GOVERNMENT VIDEO IN THE CLOUD (GVIC), a Conference within the Government Video Expo 2013 (GVE), on Wednesday, December 4th at the Washington Convention Center in Washington, DC.

GVE, co-located with InfoComm's GovComm, brings the east coast's largest contingent of video production, post, digital media, and broadcast professionals together with government AV/IT specialists. The combined event features over 150 exhibits and nearly 6,000 registrants.

The CCA offers sponsorship opportunities for this event.

Please click here to register. DCIA Member Company employees and DCINFO readers are entitled to a $100 discount by using registration code GVE.

The need for cloud solutions for producing, storing, distributing, and analyzing government-owned video content is greater than ever.

GVIC will focus on cloud solutions for government video, including intelligence, surveillance, and reconnaissance; and other use cases such as agency communications and law enforcement.

The opening keynote by Tim Bixler, Federal Manager, Solutions Architecture, Amazon Web Services (AWS), will offer an "Update on Cloud Video Services Adoption in the Public Sector."

The first two case studies will be: "Cloud Solutions for Government Video Production" by John Heaton, Director of Sales Engineering, Americas, Aspera, and "Cloud-Based Management of Government Video Assets" by Frank Cardello, General Manager, Platform, T3Media.

The first GVIC panel will add Cirina Catania, Independent Video Producer, to the discussion and cover "Considerations for Creating Government Video in the Cloud."

After a networking break, the second GVIC keynote by Adam Firestone, Director, Solutions, WSO2 Federal Systems, will address "Security & Reliability Concerns Unique to Government Video in the Cloud."

The next two case studies will cover "Distribution of Government-Owned Video from the Cloud" by Adam Powers, VP of Media Technology & Solutions, V2Solutions, and "Analysis of Aggregated Government Video Content" by Michael Rowny, CEO, PixSpan.

The closing GVIC panel discussion will add Larry Freedman, Partner, Edwards Wildman Palmer, and examine "Considerations for Cloud Dissemination of Government Video."

Report from CEO Marty Lafferty

The DCIA salutes incoming Federal Communications Commission (FCC) Chairman Tom Wheeler and looks forward to working with his new administration to continue broadband development initiatives spearheaded by former Chairman Julius Genachowski and to launch additional endeavors that will help advance the distributed computing industry.

Chairman Wheeler's carefully chosen staff includes Chief of Staff Ruth Milkman, Senior Counselor Phil Verveer, Head of the Technology Transitions Policy Task Force Jon Sallet, Special Counsel Diane Cornell, Special Counsel for External Affairs Gigi Sohn, Acting Managing Director and Advisor to the Chairman for Management Jon Wilkins, Acting Chief of the Wireless Telecommunications Bureau Roger Sherman, and Legal Advisors to the Chairman Daniel Alvarez, Maria Kirby, and Renee Gregory.

The new Chairman takes over the Commission at a time when the speed of innovation of Internet-based services in the private sector holds promise to continue at a breathtaking pace, and the potentially beneficial impacts to society and the economy of transformative technologies which rely on network connectivity — with cloud computing arguably chief among them — are nothing short of enormous.

At the same time, the challenges to continued advancement have never been greater due to several factors, ranging from threats to end-users' trust of their network operators and service providers as a result of old and new abuses; to threats to innovators' abilities to deploy new offerings as a result of unchecked hampering by threatened industry incumbents whose entrenched business models are being disrupted by these advances.

President Obama could not have nominated a better candidate for FCC Chairman than Tom Wheeler, whose breadth of experience and track record of accomplishments uniquely qualify him for this role.

We're pleased that the Chairman has tasked Diane Cornell with heading a temporary working group over the next two months to identify FCC regulations that are past their prime and FCC procedures that can be improved upon — and to use crowdsourcing, among other contemporary communications techniques, in this group's work.

DCINFO readers should seize this opportunity to be a part of that crowd.

If successful, this effort will help prioritize matters the Commission needs to address to ensure continued expansion of such promising areas as mobile cloud computing, big data, and the migration to IP transport of high-value multimedia.

The concepts behind "content everywhere available to each person at any time" have an excellent chance of moving closer to reality under Chairman Wheeler's expert guidance.

We are also heartened by Chairman Wheeler's acknowledgment of the Commission's responsibility "to act in the public interest, convenience, and necessity" to assure that innovation and technology advance with speed while preserving the relationship of trust between networks and those connected by them.

And we're thrilled by his description of himself as "an unabashed supporter of competition because competitive markets produce better outcomes than regulated or uncompetitive markets."

An increase in wireless broadband spectrum is an obvious need at this time.

But our chief concerns as a new and formative industry center on two other areas: privacy violations by unbridled over-reaching federal agencies; and collusive resistance to change by entrenched industries whose power structures permit and support anti-competitive behavior.

Specifically, we need resolution of the issues that threaten our continued growth internationally as a result of Edward Snowden's exposures of scandalous NSA practices.

And even more importantly, we need resolution of the real issues at the core of the Aereo broadcast retransmission dispute.

An extension of compulsory licensing to this new technology could provide a stopgap measure to end the current litigation, protect the now necessary dual revenue streams for over-the-air TV stations, and enable innovators like Aereo and FilmOn to emerge from the shadow of copyright infringement.

Our view of the real problem here goes much deeper, however.

And it is that independent IPTV needs to be legitimized as a multichannel video programming distributor (MVPD) and enabled to enter into carriage agreements with television programming services.

It's time for the realtime distribution of TV channels to break free from the limitation that currently shackles them: being licensed only for exclusive delivery by broadband network operators.

The future can best be secured with the support of a pro-competition FCC: by encouraging and not discouraging investment; by nurturing and not stifling innovation; by increasing and not reducing competitive opportunities; by protecting and not violating the trust of consumers; and by ensuring that the benefits of new communications technologies are accessible to all and not just a few.

We are fully aware that the FCC alone does not have the power unilaterally to address the issues that are posing such serious threats to the further advancement of our industry, the economy, and society at large.

But the Commission's abilities to advise Congress and to influence other agencies are strong, and its leadership in these areas can be unequalled in the federal government.

And to fulfill its role as the "Optimism Agency," act it must, and as Chairman Wheeler has requested, act nimbly. Share wisely, and take care.

CONNECTING TO THE CLOUD in LV on January 8th

The DCIA will present CONNECTING TO THE CLOUD (CTTC), a Conference within the 2014 International Consumer Electronics Show (CES), on January 8th in the Las Vegas Convention Center, Las Vegas, NV.

The CCA is handling sponsorships.

CTTC at CES will highlight the very latest advancements in cloud-based solutions that are now revolutionizing the consumer electronics (CE) sector or — as ABI Research's Sam Rosen referenced that category last week at CLOUD COMPUTING WEST — the "cloud electronics (CE) sector."

Special attention will be given to the impact on consumers, telecom industries, the media, and CE manufacturers of accessing and interacting with cloud-based services using connected devices.

An opening panel moderated by Tanya Curry-McMichael, VP of Strategy and Marketing, Verizon Digital Media Services, will examine "Millennials, Online TV, and Gaming: Now and Tomorrow."

What are the implications of the digital revolution in the way Millennials discover, access, and consume video, music, and gaming content online?

Hear it first-hand from young voices representing leading companies in the digital, social, and tech arenas.

Bhavik Vyas, Media & Entertainment Partner Eco-System Manager, Amazon Web Services (AWS), will further examine this issue in "Who's Connecting What to the Cloud?"

And Sam Rosen, Practice Director, TV & Video, Consumer Electronics, ABI Research, will address "Where Are There Problems Connecting to the Cloud?"

Next, in two back-to-back presentations, Robert Stevenson, Chief Business Officer & VP of Strategy, Gaikai, will explore "Consumer Benefits of Cloud-Delivered Content: Ubiquity, Cost, Portability Improvements." And Reza Rassool, Chief Technology Officer, Kwaai Oak, will expose "Consumer Drawbacks of Cloud-Delivered Content: Availability, Reliability, Scalability Issues."

The follow-on panel with Jay Migliaccio, Director of Cloud Platforms & Services, Aspera; Andy Gottlieb, VP, Product Management, Aryaka; Larry Freedman, Partner, Edwards Wildman Palmer; David Hassoun, Owner & Partner, RealEyes Media; Jay Gleason, Cloud Solutions Manager, Sprint; and Grant Kirkwood, Co-Founder, Unitas Global, will discuss "The Impact on Telecommunications Industries of Cloud Computing."

Then two sessions will delve into "Telecommunications Industry Benefits of Cloud-Delivered Content: New Opportunities" with Doug Pasko, Principal Member of Technical Staff, Verizon Communications. And then "Telecommunications Industry Drawbacks of Cloud-Delivered Content: Infrastructure Challenges" with Allan McLennan, President & Chief Analyst, PADEM Group.

The next panel will address "The Impact on Entertainment Industries of Cloud Computing" with Mike King, Dir. of Mktg. for Cloud, Content & Media, DataDirect Networks; Venkat Uppuluri, VP of Marketing, Gaian Solutions; Mike West, Chief Technology Officer, GenosTV; Arnold Cortez, IT Consulting Specialist, IBM; Kurt Kyle, Media Industry Principal, SAP America; and Adam Powers, VP of Media Technology & Solutions, V2Solutions.

Two solo presentations with Les Ottolenghi, Global CIO, Las Vegas Sands Corporation, and Saul Berman, Partner & Vice President, IBM Global Business Services, will highlight "Entertainment Industry Benefits of Cloud Computing: Cost Savings & Efficiency" and "Entertainment Industry Drawbacks of Cloud Computing: Disruption & Security" respectively.

Additional sessions will introduce the subjects "Consumer Electronics Industry Benefits of Cloud-Based Services: New Revenue Streams" with Mikey Cohen, Architect & Principal Engineer, Netflix, and "Consumer Electronics Industry Drawbacks of Cloud-Based Services: Complexity" with Tom Joyce, SVP & GM, HP Converged Systems, Hewlett Packard.

The closing panel will draw on all the preceding sessions to more deeply analyze "The Impact on the Consumer Electronics Industry of Cloud Computing" with Michael Elliott, Enterprise Cloud Evangelist, Dell; David Frerichs, President, Media Tuners; Thierry Lehartel, VP, Product Management, Rovi; Russ Hertzberg, VP, Technology Solutions, SoftServe; Guido Ciburski, CEO, Telecontrol; and Scott Vouri, VP of Marketing, Western Digital.

Top program topics will include case studies on how cloud-based solutions are now being deployed for fixed and mobile CE products — successes and challenges; the effects on consumers of having access to services in the cloud anytime from anywhere — along with related social networking trends.

Also featured will be what broadband network operators and mobile Internet access providers are doing to help manage — and spur — the migration to interoperable cloud services.

Some in traditional entertainment industries find this technology overwhelmingly threatening and disruptive, while others see enormous new opportunities. And the value proposition for CE manufacturers will continue to evolve substantially toward providing cloud-based value-adding services rather than conventional hardware features.

Please register now for CTTC at CES.

CES 2014: What to Look for

Excerpted from CES Blog by Shelly Palmer

The 2014 International CES (January 7th-10th in Las Vegas) is around the corner. Thinking back to last year's show, there were a few trends that stood out:

The Goldilocks Screen-size Strategy. If the 4″ iPhone screen was too small and the 5.7″ Samsung Galaxy Note screen was too big, perhaps the 4.7″ HTC One screen was just right. Now it's even more extreme — phones are getting bigger, tablets are getting smaller — from the 1.6″ screen of Samsung's Galaxy Gear all the way up to the 110″ 4K TVs that everyone was lusting after — a screen is a screen is a screen. Just pick the one you like.

Health and Wellness. 2013 was the year of The Quantified Self. I've documented my Quantified Journey and spent plenty of time talking about The Quantified Executive and the problems those executives face. Health and fitness apps and technology were all the rage last year at CES, and there's no sign of slowing down. We're only going to get more data to help us understand our bodies better and quantify ourselves more. Health and wellness may have burst onto the scene at the last CES, but it'll be even bigger and better this January.

The Connected Home. Just like health and wellness, we're getting more technology to make more areas of our home smarter than ever. Just think about the areas our homes have gotten smarter in the last year: thermostats, smoke detectors and even door locks. I could go on, but you get the point: our homes are smarter than they've ever been and they're going to get even smarter next year.

Ultra HD TVs and Curved OLED Screens. While Ultra HD (4K) hasn't taken off yet — 65″ 4KTVs are still pretty expensive — 2k curved OLED HDTVs are shipping and both Samsung (the Galaxy Round) and LG (the G Flex) have unveiled smartphones with curved screens in the past month.

Self-Driving Cars. Google and Tesla are each working on self-driving cars. Mercedes' latest S-Class car features some self-driving tech. In the past week, self-driving cars have made major splashes in the tech news world. Google has improved its self-driving cars so much that they're statistically safer than the average driver. They're so safe that an independent research study by the Eno Center for Transportation said that wide proliferation of self-driving cars could cut vehicular injuries by 90 percent and save us $450 billion annually. That's amazing. Expect self-driving cars to be on full display in January.

What else will we see at the 2014 International CES?

Blending the physical and digital world. Look at wearables. Look at Google Glass. We're becoming exo-digitally enhanced humans and we're incorporating our digital world more and more into our physical world.

Driverless cars. You think cutting down injuries by 90 percent and saving us $450 billion annually won't get the attention of every major automotive manufacturer?

Consumer health care. All over the place. The Quantified Self.

WIWWIWWIW ("What I Want, When I Want, Where I Want") Video. Think Aereo, think Netflix's TV disruption, think mobile apps (like this one from FiOS) that let you watch live TV on the go. Couple all that with major strides in amazing and affordable internet speeds like Google Fiber. Now think about all the companies that will follow suit in the worlds of free broadcast TV, original content online and mobile apps letting you stream live TV and you'll see exactly where WIWWIWWIW video is headed.

Robots. From treating concussions to drawing blood to protecting sporting events, robots are making our lives safer, healthier and easier. Technology (and robots) are changing the way we live and revolutionizing healthcare and other industries. Expect even more robots in January.

To stay on top of the world of tech, you need to be diligent. You can't just read about the latest news — you need to think about what new tech really means and how it will change consumer behavior (your behavior and your life).

To really get the most out of CES, take an official tour of the show floor.

Millennial Leaders Are Taking Charge at Work

Excerpted from Baseline Report by Dennis McCafferty

They represent the future of our global economy, and if you ask them about their capabilities and prospects, they'll tell you that the future is in great hands.

They're the modern-day "Millennial Leaders," as they're dubbed in a recent survey released by Telefonica and the Financial Times.

For the purposes of the survey, Millennial Leaders are defined as those who strongly agree that they're on the cutting edge of technology.

They seem to be professionals you'd want on your IT team: These leaders are very driven to succeed at work and are having a much easier time transitioning from an academic to a corporate environment than other millennials are.

"The Millennial Leaders are those who are most likely to drive change through their use of cutting-edge technology," says Jose María Alvarez-Pallete, Chief Operating Officer of Telefonica.

"They are also likely to participate in solving local and global challenges and to strive for career leadership."

The findings also provide some insight into millennials in general—not just the leaders. Nearly 12,200 millennials worldwide took part in the research.

Click here to see more.

Charter CEO Surprised Users Want Broadband with No TV

Excerpted from DSL Report by Karl Bode

Charter's third-quarter earnings indicate that the cable operator's net loss shrank to $70 million, down from $103 million one year earlier. Video losses also slowed, with Charter losing 27,000 TV subscribers, down from 71,000 last year.

The company also managed to add 86,000 broadband subscribers, and broadband revenues jumped 23% to $575 million courtesy of price hikes.

However, the most interesting bit has to be comments made by Charter CEO Tom Rutledge during the company's conference call with the press and analysts. Notably, Rutledge expressed "surprise" at the fact that most of their broadband subscriber growth last quarter came from users that only signed up for broadband (aka "single play" users).

"I would say that the one thing that surprised me...is that our broadband-only growth has been greater than I thought it would be," said Rutledge. 

"The bigger part of it is that Charter's video product was inferior, and we had brand issues around that," said the CEO. "While you can see some of these trends occurring throughout the whole industry, it's more exaggerated at Charter because of the way we let our video product deteriorate." 

This "surprise" comes from an industry that has spent the last several years insisting that the broadband-only cord cutter was as mythical as the yeti and the unicorn.

While a candid CEO is refreshing, perhaps Charter and Charter chair John Malone should spend a little less time relentlessly pushing for cable industry consolidation, and a little more time paying attention to what their customers want (more flexible programming options, lower prices) if the goal is to avoid additional surprise.

BitTorrent Awarded Distributed Storage Patent

Excerpted from The Register Report by Simon Sharwood

BitTorrent has been awarded a patent for something called "Distributed storage of recoverable data."

Available here, the patent is described as, "A system, method, and computer program product to replace a failed node storing data relating to a portion of a data file."

The invention looks an awful lot like RAID storage for resources located on different bits of a wide area network or, if you will, a cloud.

Let's step through the patent, beginning with its explanation of related art, to wit:

"A central problem in storage is how to store data redundantly, so that even if a particular piece of storage fails, the data will be recoverable from other sources. One scheme is to simply store multiple copies of everything. While that works, it requires considerably more storage for a particular level of reliability (or, contrapositively, it provides considerably less reliability for a particular amount of storage)."

Nothing to tax a storage admin's mind there, nor in the next bit:

"To achieve better reliability, erasure codes can be used. An erasure code takes an original piece of data and generates what are called 'shares' from it. Shares are designed so that as long as there are enough shares that their combined size is the same as the size of the original data, the original data can be reconstructed from them."

BitTorrent's scheme is to create a "tracker" that knows where each share is stored and, if a share is erased, to copy data from other locations that hold the same data to restore the desired level of distributed redundancy.

"The available storage nodes each contain a plurality of shares generated from a data file," the patent's abstract says. "These shares may have been generated based on pieces of the data file using erasure coding techniques. A replacement share is generated at each of the plurality of available storage nodes. The replacement shares are generated by creating a linear combination of the shares at each node using random coefficients. The generated replacement shares are then sent from the plurality of storage nodes to the indicated new storage node. These replacement shares may later be used to reconstruct the data file."
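The mechanics are easier to see in code. The sketch below illustrates the general technique the abstract describes — shares as linear combinations of data symbols, and replacement shares generated directly as random linear combinations of surviving shares, without first reconstructing the file. It is an illustration of the published idea, not BitTorrent's implementation; for readability it works over the small prime field GF(257), where a production system would typically use GF(2^8).

```python
import random

P = 257  # a small prime field; data symbols are bytes (0..255)

def make_shares(data, n):
    """Encode the k symbols of 'data' into n shares. Each share is a
    random linear combination of the data symbols over GF(P), stored
    together with its coefficient vector."""
    k = len(data)
    shares = []
    for _ in range(n):
        coeffs = [random.randrange(P) for _ in range(k)]
        value = sum(c * d for c, d in zip(coeffs, data)) % P
        shares.append((coeffs, value))
    return shares

def replacement_share(survivors):
    """Build a new share as a random linear combination of surviving
    shares -- no need to reconstruct the original data first."""
    k = len(survivors[0][0])
    weights = [random.randrange(P) for _ in survivors]
    coeffs = [sum(w * c[i] for w, (c, _) in zip(weights, survivors)) % P
              for i in range(k)]
    value = sum(w * v for w, (_, v) in zip(weights, survivors)) % P
    return (coeffs, value)

def recover(shares, k):
    """Reconstruct the k data symbols by Gauss-Jordan elimination over
    GF(P), given at least k (almost surely independent) shares."""
    rows = [list(coeffs) + [value] for coeffs, value in shares]
    for col in range(k):
        pivot = next(r for r in range(col, len(rows)) if rows[r][col])
        rows[col], rows[pivot] = rows[pivot], rows[col]
        inv = pow(rows[col][col], P - 2, P)  # modular inverse via Fermat
        rows[col] = [x * inv % P for x in rows[col]]
        for r in range(len(rows)):
            if r != col and rows[r][col]:
                f = rows[r][col]
                rows[r] = [(a - f * b) % P for a, b in zip(rows[r], rows[col])]
    return [rows[i][k] for i in range(k)]
```

The payoff is exactly the property the patent is after: losing a node costs only the shares it held, and as long as any k independent shares survive somewhere on the network, both the original file and fresh replacement shares can be regenerated — redundancy without storing full copies of everything.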

There's a lot of detail that goes into the reconstruction but you probably get the idea by now. You may also be thinking that sounds too good to be true, and you're right because the patent also says "The above technique faces limitations when used for distributed storage over the Internet. For Internet storage, the scarce resource is bandwidth, and the storage capacity of the end nodes is essentially infinite (or at least cheap enough to not be a limiting factor), resulting in a situation where the limiting factor on any storage is the amount of bandwidth to send it."

BitTorrent says it has found a way to overcome that problem with a scheme that behaves an awful lot like, well, BitTorrent.

Just what BitTorrent plans to do with the software is anyone's guess. The company has tried to go "straighter" over the years, with products like secure messaging, "bundles" and a share 'n' sync tool.

Perhaps a storage application is in the works? BitTorrent is occasionally used as a file distribution method by makers of commercial software, but it's hard to see business users queueing up to buy "Backup Software Brand X — Powered By BitTorrent."

BitTorrent is not the only outfit keen on erasure codes: Singaporean researchers are trying to put them to work, while they're also a key part of RAID 6.

Why Are so Many Businesses Switching to Cloud Computing?

Excerpted from The Guardian Report

As businesses look to maximize their IT return on investment (ROI) and increase functionality, hosted services are playing an increasingly prominent role.

Cloud computing adoption is continuing apace, across a host of industry sectors, as more organizations recognize the benefits it can potentially offer.

According to the Cloud Industry Forum, the number of first-time cloud computing users has increased by 27% over the last 18 months. The organization expects that, by the end of 2013, over 75% of UK businesses will be using at least one cloud service formally.

In addition, 80% of current cloud users will have increased their spending in this area.

But why exactly are businesses choosing to spend on hosted services, delivered remotely by a third party, rather than investing in their own on-premise infrastructure?

Here are some of the main benefits of cloud technologies:

Reduced costs.

With businesses able to source IT services on-demand according to need, there is less of a requirement for capital expenditure. Cloud vendors are responsible for managing the majority of servers and connections, and for ensuring the security of IT hardware - whereas cloud subscribers are merely consumers. They do not physically own the IT solutions they use, but they have no need to do so since services are provided online at a fraction of the cost.

Flexibility and scalability.

Cloud subscribers can increase or decrease their use of cloud services according to demand and how much they want to spend. There is no longer any need to pay for services you do not use - businesses simply sign up for the specific IT functions they require. If needs change over time, they can pay more or less each month for access to cloud services.

Mobility and agility.

Cloud computing empowers professional people to work from a variety of locations, from a variety of devices. They can sign in to their cloud account - for instance, through Office 365 - on a PC, laptop, smartphone or other mobile device, and pick up where they left off earlier on a different platform. This is because files, documents, software and locations are available online, with no physical tie to the system being used. They are technology-neutral, meaning that - so long as you have an internet connection with sufficient bandwidth - you can work from almost anywhere.

Easier upgrades.

The cloud vendor is responsible for upgrading cloud solutions, meaning subscribers do not need to worry about keeping up to date. When new solutions become available, cloud computing providers will invest in them and then make them available to subscribers. Failing to keep up to speed is bad business for the provider, since they are likely to be competing against various other cloud vendors for work.

Business continuity.

With cloud services enabling employees to work from almost any location, an on-premise IT disaster will not be as severe as it otherwise could be. Many employees will be able to carry on working from another location, given that they can access the tools and solutions they need over the internet. So even if there is a fire, flood, theft or technology outage, it should be possible to keep functioning normally.

IT security.

Cloud computing providers' reputations are built on providing secure, constantly available services to their customers. As such, they invest significant amounts of money securing their servers, data centres and connections, which has a positive knock-on effect for their users. Individual companies may not be able to spend thousands of pounds on IT security, but as cloud users, they benefit from economies of scale.

Check out more hints and tips here.

Cloud Computing Is All About Trust

Excerpted from Windows IT Report by Paul Thurrott

As we move more and more of our data and computing infrastructure into the cloud, the primary concern in many ways moves from technical concerns around performance, functionality, and the like to something more basic, even primal: trust. And when it comes to the cloud, it's all about trust.

Oddly enough, trust might be a Microsoft strong point moving forward, especially when you consider that 58 percent of the firm's revenues currently come from its enterprise customers.

Those customers don't just occupy the pole position in Microsoft's customer list. They represent its primary growth opportunity as the company moves forward into its devices and services future.

Microsoft, to date, has proven to be a trusted partner in getting these customers online in the first place, in connecting their PCs and servers to each other and to those of its partners.

Microsoft has pushed forward with on-premises infrastructure when doing so made the most sense and is now offering a hybrid strategy that will get customers over the humps—conceptual and technical—needed to get them to the cloud more fully.

All along the way, the firm has proven itself, again and again, to be trustworthy. We can quibble with specific products and strategies, of course. But it's hard to argue with the broad results.

Contrast this with some of Microsoft's peers.

Google Chairman Eric Schmidt is in the news this week for describing alleged NSA spying on the company as "outrageous," itself an outrageously ironic (and hypocritical) assertion for a leader of a company that pathologically compromises customer privacy in its bid to attach advertising to everything we view online.

In the midst of launching a slew of gorgeous new hardware devices, consumer electronics giant Apple made news recently for the wrong reasons as well: Apple announced that it was giving away its iWork productivity suite to anyone who buys a new Mac or iOS device.

Tech reporters heralded this as a major assault on Microsoft Office but declined to mention that, to achieve parity between the Mac and iOS versions of those apps, Apple removed features from the Mac versions rather than make the iOS versions significantly better.

The result? I'm sure it will get better over time, but today's iWork doesn't just not threaten Office, it's not even as good as the free Office Web Apps.

Lesson? Apple isn't, has never been, and most likely will never be, a trusted enterprise partner. This is a company that excels at marketing and making shiny devices for consumers.

And remember when BlackBerry was the definition of enterprise trust? The rapidly nose-diving firm can't sell any of its newest phones, has laid off its most important employees, toyed with a $4.7 billion buyout offer that would have seen the final dissolution of the company, and then this week suddenly reversed course yet again: It has fired its most recent CEO, Thorsten Heins — who, to be fair, had the most thankless job in tech — and, armed with a $1 billion investment that Fairfax Financial Holdings has curiously decided to flush down the toilet, will instead make another go of it. Yep, they're back. Do you trust BlackBerry? No, you don't. Because you're not crazy.

Although it's often drudgery, I spend a lot of time moderating user comments, both here on Windows IT Pro and on the SuperSite for Windows. It's been at turns amusing and alarming to see a new kind of comment spam appear this year whose aim, very clearly, is to spread fear, uncertainty, and doubt (FUD, for you tech acronym fans) about cloud computing.

And the NSA has neatly provided a convenient whipping boy for anyone who values spreading FUD over a reasoned — and reasonable — conversation about the issues surrounding the move from on-prem to cloud. There isn't a cloud-related article on this site, I bet, that hasn't been targeted by someone, or someones, who have an axe to grind.

Most of these comments are so similar and lacking in original thought they're clearly coming from the same person or organization. They basically involve a variation on the sentence "NSA spying proves that the move to cloud computing will be a temporary one and that only on-premises servers can be trusted."

I'll keep my eye out for those pointless NSA comments. But I know that many reading this, of course, aren't sold on the cloud. You see it as a threat to your career, which is both serious and understandable, and you can cite valid reasons — regulatory and legal requirements key among them — why such a move is either impossible today or at least highly complicated. You at least know what you're talking about. But the more I study where Microsoft and the broader industry are headed, the more I see the cloud as inevitable. Yes, like aging or taxes. (Pick your own favorite negative comparison.) But eventually unavoidable by all.

And that's where I keep coming back to Microsoft. Whether you view cloud computing as a golden opportunity for good — a chance to modernize your infrastructure, lower costs, and expand your organization's capabilities — or as the tech version of an unwelcome and invasive medical procedure, one thing will always be true: You'll need a trusted partner to help get you through it.

Gartner's 2014 Trends: Smart Machines, Personal Cloud…

Excerpted from AntHill Online Report by Bala Murali Krishna

When Gartner talks, the IT world lends an eager ear, and the research firm's year-end pronouncements are among its most keenly awaited.

In keeping with a recent tradition, the US company presented its vision for 2014 and beyond at its annual symposium in Orlando, FL. It did an encore at a conference on the Gold Coast last month.

Gartner defines a strategic technology as one with the potential for significant impact on the enterprise in the next three years. Factors that denote significant impact include a high potential for disruption to IT or the business, the need for a major dollar investment, or the risk of being late to adopt.

"We have identified the top 10 technologies that companies should factor into their strategic planning processes," said Gartner's David Cearley. "This does not necessarily mean adoption and investment in all of the listed technologies, but companies should look to make deliberate decisions about them during the next two years."

Gartner defines a strategic technology as "an existing technology that has matured and/or become suitable for a wider range of uses;" or even "an emerging technology that offers an opportunity for strategic business advantage for early adopters or with potential for significant market disruption in the next five years."

Gartner believes the convergence of a "Nexus of Forces" — social, mobile, cloud and information — will continue to drive change and create new opportunities, creating demand for advanced programmable infrastructure that can execute at web-scale.

Gartner's top 10 "strategic technology trends" for 2014 may not all play out within the year itself, but each is expected to shape enterprise planning beyond it. For those curious about Gartner's 2013 predictions, click here. Or else read on for what it foresees for 2014 and beyond:

Mobile device diversity, management. "Everything everywhere" (ubiquitous computing across any device, anywhere) is an enterprise utopia. But it will remain one, at least through 2018, simply because corporate resources are, well, limited.

Mobile apps and applications. Improved JavaScript performance will begin to push HTML5 and the browser into the mainstream as an enterprise application development environment. Still, no single tool will be optimal for all types of mobile applications, so expect to employ several.

Internet of Things. The Internet is expanding to more and more things, but neither enterprises nor technology vendors are ready. In the meantime, enterprises can leverage the Internet of people, information and places.

Hybrid cloud and IT as service broker. With cloud computing still evolving, the new imperatives would include bringing together personal clouds and external private cloud services. So be ready to welcome such things as cloud service brokers and private infrastructure as a service (IaaS).

Cloud/client architecture. Cloud/client computing models are shifting, thanks to robust capabilities of client devices, mobile and desktop alike. This could help reduce bandwidth and cloud costs but the growing complexities and demands of mobile users will drive apps toward server-side computing and storage.

The Era of Personal Cloud. This shift has a clear consequence: as services and content move to the personal cloud, the device becomes far less important, though still necessary.

Software-defined anything. Software-defined anything (SDx) is a term for the growing trend toward improved standards for infrastructure programmability and data center interoperability driven by automation inherent to cloud computing, DevOps and fast infrastructure provisioning.

Web-scale IT. The rise of large cloud service providers such as Amazon and Google has raised the bar. Consequently, enterprises need to re-invent the way IT is delivered across the company. Gartner calls this "web-scale IT."

Smart machines. Siri has been around for a while now but expect more savvy personal assistants including automated chauffeurs. The "smart machine era" will blossom through 2020 with a proliferation of "contextually aware, intelligent personal assistants, smart advisors (such as IBM Watson), advanced global industrial systems and public availability of early examples of autonomous vehicles." Gartner believes the smart machine era could be the "most disruptive" phenomenon in the history of IT.

3-D printing. Worldwide shipments of 3D printers are expected to grow 75% in 2014 and nearly double the following year as the technology emerges as a real, viable and cost-effective means for improved design, streamlined prototyping and short-run manufacturing.

Cloud Migrations: Don't Forget About the Data

Excerpted from InformationWeek Report by Michael Daconta

Gartner Research Director Richard Watson once observed, "When the CIO issues the simple directive: 'Move some applications to the cloud,' architects face bewildering choices about how to do this, and their decision must consider an organization's requirements, evaluation criteria and architecture principles."

Unfortunately, many architects assume that migrating a legacy system means migrating only its applications to the cloud. In reality, such a migration involves both data migration and application migration. Each must be assessed, planned, designed and executed separately and then integrated. They are two parts of the same legacy-system migration, but each requires different analysis steps and different skill sets.

By focusing on your data migration separately from your application migration, you will ensure that both will scale properly.

It's worth noting that government agencies are taking the lead in this area. For example, the Department of Defense Cloud Computing Strategy calls for both metadata tagging and a data cloud that will implement "data-as-a-service" (DaaS). The creation of that requires separating the application from its data during the migration phase. This is the same strategy that the Intelligence Community is pursuing with its community cloud.

The goal of both data and application migration to the cloud is to achieve scalability and elasticity. Your general migration model for analysis is to start with a clear understanding of the sources of the applications and data to be migrated, examine your options in achieving scalability, and then select the target implementation that achieves those objectives.

Focusing on the source for your data migration, you need to look at how your application currently stores its data. Typically that is in one of three ways: in a relational database management system (accessed via SQL), in a no-SQL data store, or in files on the file system. Your first assessment is whether your data needs to be scalable in terms of processing and storage space. For your target options the major cloud providers all offer SQL stores, no-SQL stores and BLOB (binary large objects) storage.
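The source-to-target assessment described above can be sketched as a simple decision: map each legacy store type to its natural cloud counterpart, and let scalability needs override the default. A minimal, hypothetical sketch in Python (the mapping and the names are illustrative assumptions, not any provider's API):

```python
# Hypothetical sketch of the source-to-target assessment described above.
# The mapping and target names are illustrative, not a vendor offering.

def pick_target(source_kind: str, needs_scale: bool) -> str:
    """Map a legacy data store type to a candidate cloud storage option."""
    targets = {
        "relational": "managed SQL store",     # RDBMS accessed via SQL
        "nosql": "managed no-SQL store",
        "files": "BLOB (object) storage",      # files on a file system
    }
    target = targets[source_kind]
    # Heavy processing/storage scalability needs may push a relational
    # source toward a no-SQL target, schema permitting.
    if source_kind == "relational" and needs_scale:
        target = "managed no-SQL store (if the schema permits)"
    return target

print(pick_target("files", needs_scale=True))  # BLOB (object) storage
```

The point of the sketch is only that the assessment is a separate, data-side decision, made before any application code moves.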

Data migration consists of three parts: the migration lifecycle on your data subsystem, the transfer of legacy data, and, finally, the integration with the rest of the migrated application. Let's examine each in more detail:

1. Data Migration Lifecycle

Similar to your systems development lifecycle (SDLC), the migration lifecycle begins with an assessment phase instead of a requirements phase. Additionally, the development phase is replaced by the migration phase. I explain this in greater detail in my new book, The Great Cloud Migration; for now, suffice it to say that data migration is assessed by cloud platform type, scalability requirements, data type or by data volume. At the end of the assessment you will have selected a target implementation that achieves your scalability and elasticity goals.

2. Transfer of Legacy Data

Moving your data from your legacy data stores to the cloud is a unique opportunity for quality control, metadata tagging, data dictionary, data lineage and other data management best practices. While you could assume your data is fine and opt to move it without change, leveraging the cloud migration to clean and harmonize your data across your enterprise is a golden opportunity. Some organizations may go even further and look to cloud migration as an opportunity to centralize their data and abstract it via an enterprise data layer.
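As a rough illustration of cleaning and tagging data in flight, the sketch below normalizes a legacy record and attaches lineage metadata during transfer; the record shape, field names, and metadata keys are assumptions made for the example:

```python
# Illustrative sketch (assumed record shape) of cleansing and metadata-tagging
# legacy records as they are transferred, per the best practices noted above.
import datetime

def cleanse_and_tag(record: dict, source_system: str) -> dict:
    """Normalize field names/values and attach data-lineage metadata."""
    cleaned = {
        k.strip().lower(): (v.strip() if isinstance(v, str) else v)
        for k, v in record.items()
        if v not in (None, "", "N/A")          # drop empty/placeholder values
    }
    cleaned["_meta"] = {
        "source_system": source_system,         # data lineage
        "migrated_at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "quality_checked": True,
    }
    return cleaned

rec = {" Name ": " Alice ", "Dept": "N/A", "ID": 17}
print(cleanse_and_tag(rec, "legacy-hr"))
```

Running every record through one such function during the transfer is what turns the migration into the quality-control opportunity described above, rather than a straight copy.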

3. Integration with the Migrated Application

Once your connectivity to the target platform is completed, you must integrate your data storage subsystem with the rest of your migrated cloud application. This integration depends on how loosely coupled the data storage subsystem is with the rest of the application.

By focusing on your data migration separately from your application migration, you will guarantee that both achieve your goals for scalability, elasticity and metered billing.

Forgetting the steps needed to migrate your data can be costly from both a migration perspective and an enterprise perspective. Instead, seizing upon your cloud migration as an opportunity to implement enterprise-wide information management practices can help the organization become more efficient and more effective at the same time. Yes, you will have killed the proverbial two birds with one stone!

A Tale of Two Object Stores

Excerpted from Network Computing Report by Howard Marks

As the volume of unstructured data they need to store has grown over the past few years, organizations have discovered that their data is pushing up to, or over, the limitations of classic block and file based storage systems. Object storage systems, such as Amplidata's AmpliStor, Data Direct Networks' WOS, and Amazon's S3, provide the ability to store huge numbers of objects across exabytes of disks.

More recently, the designers of new storage systems are using object storage concepts on the back end of their systems while actually providing more traditional block or file access.

Traditional object stores -- although it seems a bit strange to call object stores traditional -- present their data through RESTful, HTTP-based Get/Put APIs. Relieved of the overhead of maintaining a hierarchical directory structure and of supporting in-place data updates with all the locking overhead that implies, object stores can more easily scale out to enormous dimensions.
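A toy, in-process sketch of that flat Get/Put model (a dict-backed stand-in, not a real HTTP client or any vendor's API) shows why there is no directory hierarchy to maintain: a slash is just another character in the key.

```python
# Toy illustration of the flat Get/Put model: no directory tree, no in-place
# updates, just keys mapped to whole objects. Real stores expose this over
# RESTful HTTP; this dict-backed sketch only shows the model.

class FlatObjectStore:
    def __init__(self):
        self._objects = {}          # flat namespace: key -> bytes

    def put(self, key: str, data: bytes) -> None:
        # Objects are written whole; an "update" is simply a new Put.
        self._objects[key] = bytes(data)

    def get(self, key: str) -> bytes:
        return self._objects[key]

store = FlatObjectStore()
# The '/' characters are part of the key, not a path to be traversed.
store.put("reports/2013/q4", b"quarterly data")
print(store.get("reports/2013/q4"))
```

Because a Put replaces an object atomically under a single key, there is no directory locking to coordinate, which is exactly the overhead the paragraph above says object stores shed.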

No current object store is as pure as Seagate's new Kinetic drives, which use a native key-value store, actually maintaining data on the disk drive sequentially by key. Some of the best known object stores, such as OpenStack Swift and EMC's ViPR, run on top of much more conventional file systems, scaling beyond the limitations of file systems by spreading the objects across many of them. Others translate objects to block IDs before writing them to local SATA drives.

The new class of object storage systems isn't out to create hugely scalable systems with RESTful interfaces but to take advantage of the power and scalability of object storage to build block and/or file based systems. Rather than map an incoming file or object to an object in their back end, storage systems from vendors like Exablox, SolidFire and Coho Data break incoming logical volumes and/or files into smaller objects and then store those.

Several of these new-age storage systems break the data into fixed-size objects of 4KB-64KB, calculate a hash for each block, and then use the hash value as the URI for the data chunk, turning their back end into CAS (content-addressable storage). Each node, and ultimately each disk drive in the system, is responsible for storing the objects that fall within its range of hash values.

Data protection is typically provided by assigning two or three disk drives, in separate nodes, to hold each range of hash values and replicating the object across them. As the more observant reader will have already figured out, systems using this type of small object CAS get data deduplication as a side benefit of the architecture.
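The chunk-hashing scheme just described can be sketched in a few lines; the 4KB chunk size and the choice of SHA-256 are illustrative assumptions within the ranges the article mentions:

```python
# Sketch of the small-object CAS scheme described above: fixed-size chunks,
# the hash of each chunk as its key, and identical chunks stored only once.
import hashlib

CHUNK_SIZE = 4096  # 4KB, the low end of the 4KB-64KB range mentioned above

def store_blob(data: bytes, cas: dict) -> list:
    """Chunk data, store each chunk under its hash, return the key list."""
    keys = []
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        key = hashlib.sha256(chunk).hexdigest()
        cas.setdefault(key, chunk)   # identical chunks deduplicate here
        keys.append(key)
    return keys

cas = {}
keys_a = store_blob(b"A" * 8192, cas)      # two identical 4KB chunks
print(len(keys_a), len(cas))               # 2 chunk references, 1 stored chunk
```

The `setdefault` line is the whole deduplication story: a chunk whose hash is already present costs only a reference, which is the "side benefit" noted above.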

Since the object back-end, like a more traditional object store, doesn't modify objects in place when a volume or file is updated, new objects are created to store the new data and the file's metadata is modified to include the new object. Logical volumes, and files, are defined by their metadata, which makes snapshots and file versioning essentially free in terms of both performance and capacity consumed.
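A sketch of that metadata-defined approach, under the same content-addressed assumptions: a file is just an ordered list of chunk keys, so a snapshot copies only the key list, and an update writes a new chunk rather than modifying one in place.

```python
# Illustrative sketch: files defined by metadata (an ordered list of chunk
# keys) over a shared content-addressed chunk store. Names are assumptions.
import hashlib

def put_chunk(cas: dict, chunk: bytes) -> str:
    key = hashlib.sha256(chunk).hexdigest()
    cas.setdefault(key, chunk)
    return key

cas = {}
file_meta = [put_chunk(cas, b"block-1"), put_chunk(cas, b"block-2")]
snapshot = list(file_meta)                 # snapshot: copy the metadata only

# An update never modifies a chunk in place: write a new chunk, repoint metadata.
file_meta[1] = put_chunk(cas, b"block-2-v2")

print(len(cas))                            # 3 chunks; the snapshot still resolves
```

Since the snapshot is a short list of keys pointing at immutable chunks, it costs almost nothing in capacity or performance, which is why snapshots and versioning come "essentially free" in these designs.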

The real question is whether the vendors of these systems should call them object stores. From a technical point of view, they do use object technology but when I hear object store, I think of flat name spaces, RESTful interfaces and essentially unlimited scaling, all with limited random access.

These new systems, all of which use a significant amount of flash, are more scalable general purpose storage systems that just happen to use object storage as underlying technology. Do I care? Sure, but I'm not convinced we should lump them in with the RESTful crowd.

How to be Successful with Cloud Computing

Excerpted from CenterBeam Report

Cloud computing has presented businesses with significant opportunities, including more agility, streamlined processes and lower IT costs. Since cloud adoption has shown no signs of slowing down, organizations that have yet to make the switch can take certain measures to establish a cloud deployment strategy that is effective and smooth.

Steps for successful cloud deployment.

While utilizing a managed service provider is essentially the easiest way to deploy cloud-based solutions, there are multiple steps businesses can take to ensure success, according to CloudTweaks contributor Brian King.

Before making the switch, organizations should identify which business infrastructure components, processes and systems can be outsourced.

Businesses need to calculate the amount of money they'll save before adopting cloud-based solutions. Maintaining legacy hardware can be costly, and cloud services eliminate those associated costs.

Making sure stored data abides by compliance requirements is also important, especially as it pertains to security and how data is transferred. Examples include HIPAA compliance for healthcare data and Sarbanes-Oxley for financial systems. SSL encryption is becoming the norm for cloud compliance policies.

If a company is considering the cloud, it's important to know the differences between the three main platforms: private, public and hybrid. A private cloud is hosted internally for employees and provides high levels of security. A public cloud is what software-as-a-service providers use to offer their services (e.g., Google Drive, Facebook). A hybrid cloud often offers the most effective deployment, such as a business using public cloud computing resources to scale more quickly.

A service-level agreement with managed service providers guarantees that in the event of downtime, businesses are able to restore operations quickly and keep performance maximized. Creating policies so employees understand how to use the cloud correctly, and addressing security implications by implementing endpoint security reduce the risks of data leaks.

Monitoring how the cloud is deployed and used can provide valuable insight regarding future trends and issues. Additionally, managing users and licenses is important for effective use of the cloud. Tools such as unified endpoint management can make this process significantly easier.

Lastly, staying informed on changes and improvements in cloud computing technology will help businesses use it to its full potential.

It's more than just economical.

While lower IT costs is certainly one of the most beneficial aspects of the cloud, that's not the sole reason businesses are deciding to adopt it, according to a recent article by Guy Clapperton for The Guardian.

Phil Bagnall, head of operations for the children's luggage manufacturer Trunki, noted the main reason for adopting the cloud was because the company did not want to run its own IT department as it grew. Aside from not being cornered by IT demands and the need to hire skilled staff, the problems associated with legacy technology also created an unnecessary burden.

"A good cloud provider offering applications (and they don't all - some specialize in storage space or in other elements) will update software without bothering the end user; it's just there next time they log on," noted Clapperton.

Coming Events of Interest

Government Video Expo 2013 - December 3rd-5th in Washington, DC. Government Video Expo, co-located with InfoComm's GovComm, brings the east coast's largest contingent of video production, post, digital media, and broadcast professionals together with the government AV/IT specialists. The combined event features over 150 exhibits and nearly 6,000 registrants.

GOVERNMENT VIDEO IN THE CLOUD - December 4th in Washington, DC. This DCIA Conference within Government Video Expo focuses specifically on cloud solutions for and case studies related to producing, storing, distributing, and analyzing government-owned video content.

International CES - January 7th-10th in Las Vegas, NV.  The International CES is the global stage for innovation reaching across global markets, connecting the industry and enabling CE innovations to grow and thrive. The International CES is owned and produced by the Consumer Electronics Association (CEA), the preeminent trade association promoting growth in the $209 billion US consumer electronics industry.

CONNECTING TO THE CLOUD - January 8th in Las Vegas, NV. This DCIA Conference within CES will highlight the very latest advancements in cloud-based solutions that are now revolutionizing the consumer electronics (CE) sector. Special attention will be given to the impact on consumers, telecom industries, the media, and CE manufacturers of accessing and interacting with cloud-based services using connected devices.

CCISA 2013 – February 12th–14th in Turin, Italy. The second international special session on Cloud Computing and Infrastructure as a Service (IaaS) and its Applications, within the 22nd Euromicro International Conference on Parallel, Distributed, and Network-Based Processing.

NAB Show - April 5th-10th in Las Vegas, NV. From broadcasting to broader-casting, NAB Show has evolved over the last eight decades to continually lead this ever-changing industry. From creation to consumption, NAB Show has proudly served as the incubator for excellence — helping to breathe life into content everywhere.

Media Management in the Cloud — April 8th-9th in Las Vegas, NV. This two-day conference provides a senior management overview of how cloud-based solutions positively impact each stage of the content distribution chain, including production, delivery, and storage.

CLOUD COMPUTING EAST 2014 - May 13th-14th in Washington, DC. Three major conference tracks will zero in on the latest advances in the application of cloud-based solutions in three key economic sectors: government, healthcare, and financial services.

Copyright 2008 Distributed Computing Industry Association
This page last updated November 17, 2013
Privacy Policy