Distributed Computing Industry
Weekly Newsletter

February 18, 2013
Volume XLII, Issue 10


Obama's Executive Order: What You Need to Know

Excerpted from ZDNet Report by Zack Whittaker

Embargoed until the delivery of this week's State of the Union address, US President Obama's expected and highly anticipated Cybersecurity Executive Order has now been signed. With potentially serious implications for the privacy of US and foreign citizens, here's what you need to know.

There was grave concern that the President could effectively sign into law some, if not most, parts of the proposed Cyber Intelligence Sharing and Protection Act (CISPA). Though it was passed by the US House, it failed to gain traction in the Senate, and also faced threats by the White House to veto the bill altogether.

However, CISPA is now being brought back by the House. According to TechDirt, nothing has been changed since it first stalled in the Senate.

The final Executive Order doesn't have half of the concerning privacy implications that CISPA does, and according to The Hill has also garnered support from a major privacy group, the American Civil Liberties Union (ACLU). Having said that, the privacy implications of this cybersecurity order have yet to be defined, and could still pose a significant risk to the privacy of web citizens.

In the President's State of the Union address, however, he repeated his call for Congress to "pass legislation to give our government a greater capacity to secure our networks and deter attacks." In the past, action by Congress has fallen afoul of not only privacy groups, but also online activists and the concern of the wider web population.

Although the privacy implications may not be as stark or concerning as CISPA would have been, there is still a lot of uncertainty around what the Obama administration plans to do regarding the ever-growing threat of cyberterrorism and cyberattacks. And as ZDNet's Violet Blue explained, certain terms have yet to be defined, which could lead to potential abuses by the government.

This executive order was designed simply to lay the foundations on which a "framework" can be constructed between the government and private sector industries. It doesn't mean that intelligence sharing will automatically begin tomorrow, and there is a long road ahead before a system can be set up that is effective, reliable, and as secure as it can possibly be.

The "framework" will effectively allow intelligence to be gathered on cyberattacks and cyberthreats to privately owned critical national infrastructure — such as the private defense sector, utility networks, and the banking industry — so they can better protect themselves, as well as the general US population, the economy, and other nations that are reliant on US support.

However, certain terms have yet to be defined. "Cyberthreat" and "cyberintrusions" remain vague, leading to the suggestion that those involved in distributed denial-of-service (DDoS) attacks, one of the main "weapons" of choice for protest by hacktivist groups on the web, could also be at risk of being targeted by the US government.

The executive order spelled out what "critical national infrastructure" actually is, making it easier for the US government to identify businesses and private sector organizations that hold the keys to the wider US economy:

"Critical infrastructure means systems and assets, whether physical or virtual, so vital to the United States that the incapacity or destruction of such systems and assets would have a debilitating impact on security, national economic security, national public health or safety, or any combination of those matters."

This could range from energy networks to telecommunications networks, and ultimately to companies whose services are important to the effective running of the economy — cloud-based services, Fortune 500 companies with a massive stake in the stock market, and companies that provide services vital to the government.

For now, the order explicitly excludes certain companies — although not named, private firms that offer social networking and consumer products and services — from the list of critical infrastructure.

The text states that "within 150 days of the date of this order", Secretary of Homeland Security Janet Napolitano shall use a "risk-based approach to identify critical infrastructure where a cybersecurity incident could reasonably result in catastrophic regional or national effects on public health or safety, economic security, or national security."

Report from CEO Marty Lafferty

We're very excited to announce new speakers for our upcoming 2013 CLOUD COMPUTING CONFERENCE at the NABShow, taking place at the Las Vegas Convention Center on April 8th and 9th.

This year's event track will demonstrate the new ways cloud-based solutions are providing increased reliability and security, not only for commercial broadcasting and enterprise applications, but also for military and government implementations.

From collaboration during production, to post-production and formatting, to interim storage, delivery, and playback on fixed and mobile devices, to viewership measurement and big-data analytics, cloud computing is having an enormous impact on high-value multimedia distribution.

Our 2013 conference has been extended from one to two full days, reflecting the increased importance of, and growing interest in, its subject matter.

Experts will provide a senior management overview of how cloud-based solutions positively impact each stage of the content distribution chain.

DAY ONE will begin with an "Industry Update on Cloud Adoption."

How are cloud-based technologies currently being deployed throughout the audio/video (A/V) ecosystem? What file-based workflow strategies, products, and services are now working best?

A panel discussion with Dr. Frank Aycock, Appalachian State University; Jonathan Hurd, Altman Vilandrie; Rob Kay, Strategic Blue; and Patrick Lopez, Core Analysis will thoroughly examine this emerging market segment.

Next, we'll discuss "Outstanding Issues: Reliability & Security." What remaining pitfalls cause producers and distributors to resist migrating to the cloud? How are liability, predictability, privacy, and safety considerations being addressed?

Speaker Shekhar Gupta, Motorola Mobility, will introduce the topic. And then a panel with Lawrence Freedman, Edwards Wildman Palmer; Tom Gonser, Docusign; Jason Shah, Mediafly; and John Schiela, Phoenix Marketing International, will follow up with further discussion.

Then "Cloud Solutions for Content Creation" will be our subject. How is cloud computing being used for collaboration and other pre-production functions? What do dailies-screening and editing in the cloud offer the content production process?

Speaker Patrick MacDonald King, DAX, will explore this area first. And then a panel with Sean Barger, Equilibrium; Morgan Fiumi, Sfera Studios; Rob Green, Abacast; and Brian Lillie, Equinix, will continue our examination.

"Post-Production in the Cloud" will follow. What do cloud solutions bring to post-production functions such as animation and graphics generation? How are formatting, applying metadata, and transcoding improved by cloud computing?

Our DAY ONE Marquee Keynote Chris Launey of Disney will speak first.

Then a panel with Jim Duval, Telestream; Joe Foxton, MediaSilo; Jun Heider, RealEyes; and Bill Sewell, Wiredrive will delve into this topic in more detail.

Next, we'll discuss "Cloud-Based Multimedia Storage." How are data centers and content delivery networks (CDNs) at the edge evolving? What do business-to-business (B2B) storage solutions and consumer "cloud media lockers" have in common?

Speaker Jean-Luc Chatelain, DataDirect Networks, will address the topic first. And then a panel with Bang Chang, XOR Media; Tom Gallivan, Western Digital; Tom Leyden, Amplidata; and Douglas Trumbull, Trumbull Ventures, will follow up with further discussion.

DAY ONE will end with "Content Delivery from the Cloud." How is cloud computing being used to enable distribution and playback on multiple fixed and mobile platforms? What does the cloud offer to improve the economics of "TV Everywhere?"

Speaker Chris Rittler, Deluxe Digital Distribution, will explore this area first. And then a panel with Scott Brown, Octoshape; Brian Campanotti, Front Porch Digital; Malik Khan, LTN Global Communications; and Mike West, GenosTV, will continue the examination.

DAY TWO will open with four cloud implementation case studies.

How was cloud computing used most successfully during 2012 in the multimedia content distribution chain? What lessons can be learned from these deployments that will benefit other industry players?

Case studies will be presented by Jason Suess, Microsoft; Michelle Munson, Aspera; Keith Goldberg, Fox Networks, and Ryan Korte, Level 3; and Baskar Subramanian, Amagi Media Labs. Then the presenters will join in a panel discussion.

Next, we'll look at "Changes in Cloud Computing." How is the cloud-computing industry changing in relation to content rights-holders? What new specialized functions-in-the-cloud, interoperability improvements, and standardization are coming this year?

David Cerf, Crossroads Systems; Margaret Dawson, Symform; Jeff Malkin, Encoding; and Venkat Uppuluri, Gaian Solutions will join in a panel.

Then the OPEN Group will lead a discussion of cloud standards.

"A Future Vision of the Cloud" will explore what to expect next. What do the latest forecasts project about the ways that cloud-computing solutions will continue to impact the A/V ecosystem over the long term? How will the underlying businesses that are based on content production and distribution be affected?

Panelists Lindsey Dietz, ODCA; John Gildred, SyncTV; Mike Sax, ACT; and Sam Vasisht, Veveo will join in the discussion.

"Military & Government Cloud Requirements" will follow. How do the needs of military branches and government agencies for securely managing multimedia assets differ from the private sector? What do these requirements have in common with commercial practices?

Michael Weintraub, Verizon, will speak first. Then Scott Campbell, SAP America; Fabian Gordon, Ignite Technologies; Linda Senigaglia, HERTZ NeverLost; and Alex Stein, Eccentex will go into more depth.

Next, we'll explore "Unique Cloud-Based Solutions." What are cloud solutions providers currently developing to address specific considerations of the intelligence community (IC) in fulfilling its missions? How will these approaches evolve and change during 2013?

DAY TWO Marquee Keynote Saul Berman of IBM will address this area first.

Then Kris Alexander, Akamai; Rajan Samtani, Peer Media; Ramki Sankaranarayanan, PrimeFocus; and Dan Schnapp, Hughes Hubbard & Reed will continue this examination.

Four relevant cloud case studies will follow.

How is cloud computing being used to help securely manage sensitive multimedia? What lessons can be learned from these deployments that will benefit military and government organizations?

Grant Kirkwood, Unitas Global; Jack Pressman, Cyber Development Group International; Randy Kreiser, DataDirect Networks; and John Delay, Harris will present case studies.

These presenters will then join in a panel discussion.

The Conference Closing will tie back to the commercial sector. How do those involved in multimedia production, storage, and distribution leverage cloud-based solutions to their fullest potential? What resources are available for comparing notes and staying current on the latest developments?

Our closing session speakers will be Steve Russell, Tata Communications, and Jeffrey Stansfield, Advantage Video Systems.

There are special discount codes for DCINFO readers to attend the NABShow. The code for $100 off conference registration is EP35. And the code for FREE exhibit-only registration is EP04. Share wisely, and take care.

Intel to Launch Web TV Service This Year

Excerpted from Wall St. Journal Report by Don Clark

Intel confirmed plans to offer a paid Internet video service and accompanying set-top box (STB), an unusual gamble for a chip maker that has rarely marketed directly to consumers.

The Silicon Valley company said the service will be introduced later this year. It didn't disclose the name of the offering — which will carry a new brand separate from Intel's — nor its pricing.

Intel joins an array of companies attempting or considering ways to help transform the TV-watching experience; they include media firms as well as technology players such as Apple, Microsoft, and Google.

Intel plans to offer a selection of live and on-demand TV programming, "catch-up" features and a programming interface that makes it much easier to find shows than with existing guides on STBs from cable and satellite companies, said Erik Huggers, who heads a new group at the company called Intel Media.

But there will be several unique elements, he said.

For one thing, the STB the company plans to offer — which will be powered by Intel chips — will include a high-definition (HD) video camera and microphone that will enable several novel applications.

For example, Mr. Huggers said, the camera and facial-recognition technology will be able to identify who is watching the TV and tailor programming appropriately — such as blocking children in a household from watching adult TV shows. Families in multiple locations will also be able to conduct videoconferencing more easily, without laptop computers or tablets that can only easily show one face at a time, he said.

In another scenario, Mr. Huggers said, users could see each other in different locations while watching and commenting on the same show, a modern-day analog to the days families once congregated around the TV set. For those concerned about privacy, a shutter will allow users to make the camera inoperable, he said.

"I think we can bring an incredible experience," Mr. Huggers said.

Intel faces no shortage of skepticism. For one thing, like other companies entering the field, it must negotiate with content companies for rights to TV programming. Mr. Huggers said such negotiations are in process, but provided no details.

The company also isn't expecting, at least initially, to be able to side-step one of users' biggest complaints about cable-TV packages — that users have to take "bundles" of shows, some of which they might not want. Mr. Huggers said content companies are "not ready" to offer purely a la carte programming, but Intel expects to be able to offer "more intelligent" or "more convenient" selections of shows.

In addition, Intel isn't expecting that its services will necessarily be less expensive than cable bills today, but Mr. Huggers contends the experience will be much better. "It's not a value play," he said.

If Intel's service and set-top device can boost demand for video services, there is little reason that content companies shouldn't embrace it, said Paul Zwillenberg, a London-based partner at Boston Consulting Group who works with media and technology companies. "I think it's only good for the industry," he said.

Intel, of course, mainly sells chips to computer makers rather than marketing products to end-users. But Mr. Huggers — who previously worked at the British Broadcasting Corp. (BBC) and helped it launch a high-profile service called iPlayer — said Intel Media has hired a team of veteran digital-TV specialists and is being operated as a largely separate unit with offices apart from the rest of Intel.

He noted that Microsoft did much the same thing, and successfully, when entering the videogame console market with Xbox.

As for the reception from content companies, Mr. Huggers noted they were approached before his group conceived the service and had input on its features — rather than being consulted after the fact. "We have been working hand in glove with the industry to figure this out," he said.

Adap.tv Could Draw TV Crowd to Online Video

Excerpted from RTMDaily Report by Tyler Loechner

Adap.tv last week announced In-Target Audience Optimization, a technology that allows advertisers to deliver ads to the right demographics with the help of Nielsen Online Campaign Ratings and comScore Validated Campaign Essentials.

Perhaps most importantly, Toby Gabriner, President, Adap.tv, believes that In-Target Audience Optimization will draw advertisers in from TV. He says that there is now a "currency that can attract the TV buyers into the online video world." The technology allows for TV-like targeting from sellers and more targeted audiences for buyers.

Gabriner also says there's an important connection to be made between the new technology, the draw of TV advertisers, and programmatic buying. "It really helps to demonstrate the power of programmatic, which is starting to really gain traction as a buying and selling mechanism," he said.

The new technology could draw TV advertisers to online video because it's now a more familiar model. "Historically, there has been a challenge around the currency that they should be using. This helps to solve for that," Gabriner said. In addition, Gabriner says that the TV community can optimize effectively in the digital space, something that isn't possible when they buy TV.

Using benchmarked data from its partnerships with Nielsen and comScore, Gabriner says Adap.tv can give accurate forecasts "before the first impression." If those expectations are not met, the system alerts the people behind the process in real time so that optimization can begin — something people in the TV world often have to wait weeks for.

In beta, ad delivery saw a 30% boost when using In-Target Audience Optimization. That's a promising number, and if the floodgates open and TV advertisers make their way online, there will be a massive increase in dollars spent on online video. The RTB process will heat up and the "little guy" will have to get extra creative. It hasn't happened yet, but if the "power of programmatic" — as Gabriner put it — is strong enough to pull TV advertisers online, it's powerful enough to do anything.

BitTorrent Courts the Entertainment Industry

Excerpted from US News & World Report by Simon Owens

Once a clearinghouse for unauthorized Internet downloads, BitTorrent is now testing whether its services can benefit content producers.

A little more than a year ago, Michael Fiebach had a phone call with Randy Reed, the manager of the well-known electronic music artist Pretty Lights.

The artist's music had just topped The Pirate Bay's "most downloaded" list, a superlative that usually signals a high level of music infringement, but Fiebach, the CEO of Fame House, a digital marketing firm that has worked with some of the biggest names in the electronic music world (including Eminem's record label Shady Records), viewed the listing without the slightest bit of umbrage.

"Here we are celebrating hitting #1 on Pirate Bay," Reed told Fiebach, "while major labels would be kicking, cursing, and sending take-down notices."

That's because Fiebach is one of dozens within the entertainment industry who have partnered with BitTorrent, the company that manages the peer-to-peer (P2P) file-sharing protocol of the same name, to promote and distribute free audio, text, and video content within its network.

For the past few years, BitTorrent, which first launched in 2001, has engaged in an experiment to determine whether its users would be able to drive real revenue toward content producers. In the process, BitTorrent hopes to transform industry players who have long viewed the company with disdain into its allies.

For Pretty Lights, BitTorrent and Fame House bundled four of his songs (including his latest single) along with a video of his 2011 performance at the Bonnaroo music festival into a single BitTorrent file. From there, the file-sharing company employed a number of methods to promote the item to its 130 million active users.

Within months, the file had surpassed 6 million downloads worldwide. Pretty Lights' e-mail list had grown by 60,000 subscribers, his Facebook page by 30,000 likes, and his website traffic by more than 700 percent.

"In terms of the value of 80,000 new fans," says Fiebach, "there's a sliding scale there in terms of different types of fans in different geographic areas. But I can tell you that the experiment significantly grew his e-mail list, and each person on that e-mail list is a potential purchaser of something.

"So if you're going to say the average click-through rate in an e-mail is 10 percent, that means you just got about 8,000 new people who are going to buy something at some point. The value of that in a year? That might be $80,000 a year, $100,000 a year. It might be much more than that."
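Fiebach's back-of-the-envelope math is easy to reproduce. A minimal sketch in Python, where the 10 percent click-through rate and the per-buyer dollar values are his rough assumptions rather than measured figures:

```python
new_fans = 80_000          # new fans from the promotion, per Fiebach
click_through_rate = 0.10  # his assumed average e-mail click-through rate

likely_buyers = int(new_fans * click_through_rate)  # about 8,000 people

# Assumed spending of $10 to $12.50 per eventual buyer reproduces his
# $80,000-to-$100,000-a-year range; these unit values are illustrative.
low_estimate = likely_buyers * 10.00
high_estimate = likely_buyers * 12.50

print(likely_buyers, low_estimate, high_estimate)  # 8000 80000.0 100000.0
```

The point of the arithmetic is that even a modest conversion assumption turns a free giveaway into a six-figure annual revenue estimate.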

Those who have followed and advocated for BitTorrent likely weren't surprised by such results. Copyright activists have long touted the benefits of such loss-leader promotions. In May 2012, many felt vindicated when North Carolina State University economist Robert Hammond released a study indicating that rampant BitTorrent download activity can boost music sales.

For his paper, titled "Profit Leak? Pre-Release File Sharing and the Music Industry," Hammond amassed a number of download statistics between May 2010 and January 2011 and devised a model to derive the connection between unauthorized downloads and music sales.

While skeptics would simply point out that popular music titles would be the most likely to have a high number of unauthorized downloads, the economist claimed his model isolated a causal effect "by exploiting exogenous variation in how widely available the album was prior to its official release date."

Another study, this one conducted by researchers from the University of Minnesota and Wellesley College, found that box-office movie sales were negatively affected by only about 7 percent, and only when there was a significant time gulf between a US and international release.

"We do not see evidence of elevated sales displacement in US box office revenue following the adoption of BitTorrent, and we suggest that delayed licensed availability of the content abroad may drive the losses to infringement," the authors concluded.

BitTorrent VP of Marketing Matt Mason, who joined the company a little more than a year ago, has been directly involved in the partnerships and views these promotions as only the beginning of a long, fruitful relationship with the industry, one that can eventually be monetized at a mass scale.

"This is clearly a valuable audience to speak to," he says. "And these 170 million people worldwide are not simply infringers who won't pay for anything. All the myths we hear about BitTorrent users simply aren't true. They will reward content creators, and we've seen that with every single experiment we've run."

This is not to say he doesn't understand why his company has attracted the reputation it has. The original HTTP protocol was invented for the transfer of text, and as the Internet matured it became a venue for richer media like audio and video, the large files of which became a strain to transfer in large quantities online. The engineer Bram Cohen invented the BitTorrent protocol to spread the files across thousands of distributors, thereby reducing the strain (and download time) on any one network.
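The core idea Cohen's protocol introduced — split a file into pieces, fetch them from many peers at once, and verify each piece by hash — can be sketched in a few lines of Python. This is a toy in-memory illustration of the concept, not the real BitTorrent wire protocol; the peer names and piece layout are invented:

```python
import hashlib

# A file is split into fixed-size pieces; any peer holding a piece can serve it,
# spreading the transfer load across the whole swarm.
FILE_DATA = b"some large media file" * 1000
PIECE_SIZE = 4096

pieces = [FILE_DATA[i:i + PIECE_SIZE] for i in range(0, len(FILE_DATA), PIECE_SIZE)]
# A .torrent metadata file carries a hash per piece, so a downloader can verify
# each piece independently, no matter which peer supplied it.
piece_hashes = [hashlib.sha1(p).hexdigest() for p in pieces]

# Simulated swarm: each peer holds only a subset of the pieces.
peers = {
    "peer_a": {i for i in range(len(pieces)) if i % 2 == 0},
    "peer_b": {i for i in range(len(pieces)) if i % 2 == 1},
}

def download(wanted, peers, pieces, piece_hashes):
    """Fetch each wanted piece from some peer that has it, verifying by hash."""
    assembled = {}
    for idx in wanted:
        for name, held in peers.items():
            if idx in held:
                data = pieces[idx]  # stand-in for a network fetch from `name`
                assert hashlib.sha1(data).hexdigest() == piece_hashes[idx]
                assembled[idx] = data
                break
    return b"".join(assembled[i] for i in sorted(assembled))

result = download(range(len(pieces)), peers, pieces, piece_hashes)
assert result == FILE_DATA  # reassembled file matches the original
```

Because no single host serves the whole file, the strain (and download time) on any one network drops as the swarm grows — which is exactly the property that made the protocol attractive for rich media.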

"The reason that BitTorrent became thought of as a tool for infringement was because most of the people on the Internet moving rich media saw BitTorrent for the potential for infringement and used it for that," says Mason. "And its name quickly became marred as far as the content industries were concerned." That, however, was never the company's intention.

So far, BitTorrent has formed content partnerships with one to two artists a month, and it employs a number of methods to promote the files. Perhaps its most successful promotion occurs when a new user downloads the BitTorrent software; Mason says the client receives between 600,000 and 800,000 downloads a day, so BitTorrent simply offers the free file on what he calls the install path.

On most days, the promotion sees a 40 to 50 percent conversion rate. In other words, nearly half of the 500,000 new users who sign up for the service each day will download the accompanying content file. To put that in perspective, the average display ad online gets a click-through rate of less than one percent. It's not difficult to surmise why the entertainment industry would find this kind of engagement appealing.
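The engagement gap Mason describes is stark when the figures quoted above are put side by side; a quick sketch, using the approximate numbers from the article and a deliberately generous 1 percent display-ad click-through rate:

```python
daily_installs = 500_000   # new BitTorrent users per day (approximate)
bundle_conversion = 0.45   # midpoint of the quoted 40-50% conversion rate
display_ad_ctr = 0.01      # generous: display ads average under 1%

bundle_downloads = daily_installs * bundle_conversion   # content bundles taken per day
ad_clicks_equivalent = daily_installs * display_ad_ctr  # clicks the same audience would yield

print(bundle_downloads / ad_clicks_equivalent)  # 45.0 — roughly 45x the engagement
```

Even with rounded inputs, the install-path promotion delivers on the order of forty times the engagement of a comparable display-ad campaign.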

BitTorrent also began rolling out banner ads within its client last year, and though it was met with resistance from a "very vocal minority" of its users, the ads have gotten higher-than-average click-through rates and are now serving upward of 5 billion impressions a month. Recently, BitTorrent aggressively promoted best-selling author Tim Ferriss' new ebook, "The 4-Hour Chef."

"In the first week he had 200,000 people downloading the content bundle he did, and over 89,000 went and visited his Amazon page," Mason says. "We couldn't see how many people bought the book from his Amazon page, but what we did see is that Tim hit the New York Times, Wall Street Journal, and USA Today best seller lists, and 89,000 people hitting your Amazon page in your first week, I can tell you that's a really crazy number."

This was likely a welcome number for Ferriss, whose book had been boycotted by more than 1,000 independent bookstores that felt betrayed he had abandoned his traditional book publisher in favor of Amazon's ebook services.

Sometimes, especially for lesser known content creators, this promotional outpouring can almost be overwhelming.

Josh Bernhard and Bracey Smith, two filmmakers based in New York City, got to experience BitTorrent's geyser of user interaction when they uploaded the pilot episode of their science fiction show Pioneer One onto the network. The two had used Kickstarter to raise a shoestring budget of $7,000 to create the pilot, and they didn't originally intend to make any additional episodes -- the pilot was meant to simply act as a "proof of concept."

But when the episode was featured on Vodo.net, a curator of BitTorrent content, they received $20,000 in PayPal donations in only two weeks.

"We sort of realized that we had the means to keep on making more and finish the season," says Bernhard. They eventually raised enough money to shoot six episodes with the help of about 4,000 individual donations, he says.

But the success was so sudden, and the demand so overwhelming, that the filmmakers didn't have time to plot out the season in a way they would have wanted.

"The problem was we were slowly picking up money and releasing the series over a year and a half," Bernhard says. "And that made people really frustrated, because even though they were longer episodes than your typical web series — we were aiming for an hour long length, so each episode was between 33 and 45 minutes — most people were used to regular schedules of five minute episodes, and we often had a two or three month delay between episodes."

But he also sees how this completely changes the dynamic for independent filmmakers.

"We got to a place where we knew whatever we did, we had an opportunity to get it seen by at least hundreds of thousands of eyeballs. And coming from the independent film world, that's kind of staggering. Because the problem used to be how do I get it seen, how do I find someone who has the reach and the means to get it out there and be seen by a lot of people?"

Given that BitTorrent has proven that it can catapult content in front of millions of paying customers, the question now is how it can scale that success. The financial ascendancy of companies like Google and Facebook stems not only from their ability to amass millions of users, but also their technological capacity for delivering millions of ads to micro-targeted communities within their networks.

With BitTorrent moving more information a day than Facebook, Google, YouTube, and all other websites combined, it must devise an avenue for any artist or company, not just the few anointed by its partnership program, to reach potential customers. Mason says that this will be the main focus of the company in 2013, and whether the entertainment industry makes amends with BitTorrent hinges on it effectively converting its millions of users into paying customers — either through the purchasing of content, merchandising, or concert tickets.

"I'm in no way pro infringement," says Fiebach of Fame House. "I'm pro music promotion and pro artist. I've done three campaigns with BitTorrent, all of which I've seen benefit artists. If they can keep figuring out how to do that, and they can scale it, then I'm all for it. If they can't and people are using it for infringement, then I'm not."

Intacct Rides Cloud Computing Momentum

Sererra, a premier VAR and cloud consulting group, congratulates Intacct on its record results for calendar year 2012.

Over the past twelve months, Intacct, a leader in cloud financial management and accounting software, increased new customer bookings by more than 47% over 2011. In the company's second fiscal quarter, ending December 31, Intacct rode strong momentum from companies outgrowing QuickBooks, and from those looking to switch from outdated mid-market on-premises software, to secure a record number of new customer additions.

All of this points to increasing momentum in the cloud financials market as gains in other business application areas, such as customer relationship management and human resources, are now reaching core financials.

Existing customers also voiced their satisfaction and deepened their commitment to Intacct. Add-on business with current Intacct customers remained strong, with these companies adding new users and subscribing to additional Intacct applications in record numbers during 2012.

"There is no doubt that 2012 was the year cloud financial applications went mainstream," said Robert Reid, CEO of Intacct. "While on-premises financial software growth is essentially flat, cloud vendors continue to grow rapidly. With new customer additions at an all-time high, we are seeing increased demand for our award-winning financial applications."

"In addition, as the cloud partner of choice for the channel, we continue to see momentum across both traditional resellers and top accounting firms. Intacct customers will receive a slate of significant product enhancements in 2013 that will extend the value of their investment and provide the opportunity for accelerated growth for Intacct and our partners."

Reflecting more broadly on 2012, Intacct achieved many significant milestones, including one of the strongest channel programs of any cloud financial vendor. Included in this channel program is Sererra Consulting Group, a Top 100 VAR and one of Accounting Today's Technology Pacesetters.

Vision Cloud Protection & Recovery Service

Excerpted from Talkin' Cloud Report by Chris Talbot

Vision Solutions has launched a new cloud protection and recovery service to provide customers with recovery-as-a-service (RaaS) while offering a low-risk route to the cloud.

There's another new player in the cloud protection and recovery space. Vision Solutions, which provides replication, availability, and disaster recovery services and solutions, has launched its Cloud Protection & Recovery (CP&R) offering, which combines Vision's Double-Take and MIMIX products into a service that cloud service providers can use to accelerate their own cloud service practices.

As part of a cloud services broker practice, Vision's CP&R RaaS was designed with service providers in mind. While providing a "low-risk path to the cloud," the CP&R service also gives service providers a new offering to take to their customers—something customers are looking for, according to the company.

The new service includes a RaaS platform that combines Vision's technology with an Apache CloudStack edition, enabling providers running Citrix CloudPlatform or Apache CloudStack to offer customers a cloud-integrated RaaS. Additionally, it's built on Vision's Double-Take real-time replication technology.

The service features metered usage that enables cloud providers to consume licenses in a pay-as-you-go model. According to Vision, this model reduces risks for cloud providers while helping them create cloud-based disaster recovery, high availability, and migration solutions in the cloud.

Vision also offers a software development kit (SDK) that provides access to APIs for custom integration of Double-Take into the provider's cloud ecosystem. According to Vision, this will provide a higher degree of automation and reduce administration costs.

"Companies of all sizes, from SMB to enterprise, are considering cloud-based recovery as a non-disruptive path to cloud adoption and it follows the scalable, comprehensive disaster recovery and availability solution that every cloud option requires," said Alan Arnold, CTO of Vision Solutions.

As Vision launches this new cloud-based solution, it's also realigning its product portfolio into three categories: cloud protection and recovery; high availability and disaster recovery; and migration and cross-platform data sharing.

Making the Cloud Invisible

Excerpted from TV Technology Report by Al Kovalick

What does it take to make the cloud "invisible?" Put another way, in the context of an end user running a media-focused application, what parameters create an environment such that the user cannot discern whether the app is running locally or in a remote cloud?

For example, using a video editor app and doing the classic jog/shuttle function across the timeline, can a user tell by the "feel of the app" that the runtime code is local or remote? If the app feels local in all aspects then the cloud is invisible to the end user. For SaaS apps in particular it's good to aim for this goal; users will demand it.

For sure, it's not easy to create an invisible cloud environment. There are many aspects of "Quality of Service" that determine the user experience. The main quality domains are:

Transport QoS from premise to cloud, including the lossy and delay-prone Internet;

Compute QoS, including deterministic latency of a short operation and speed of a long operation (e.g., a transcode);

Storage QoS, including IOPS, bandwidth, deterministic latency, and other storage-related parameters;

Availability QoS, including uptime percentage and access latency to a service or resource — tightly bound to system reliability; and

Security QoS, including access control, authentication, encryption, DDoS attack prevention, and more.

This column is the first of several to explore the QoS of the cloud from the perspective of a user or system component at a remote facility. Let's start by examining transport QoS (No. 1 in the list). Fig. 1 outlines the salient aspects of transport QoS.

Transport QoS is measured using four main parameters: bandwidth (data rate), latency, packet jitter, and packet loss. Internet marketing has corrupted the meaning of "bandwidth" to mean data rate, so I use this term reluctantly. Of course, availability (uptime) is also a measure of transport QoS, but for now let's treat availability separately. Fig. 1 shows that the end-to-end QoS is divided across three areas: facility premise, Internet (or direct connect), and cloud provider. Each of these areas contributes to the QoS either positively or negatively.

Here is a brief summary of the effects of each of the big four contributors:

Data rate (bandwidth): sufficient to meet the simultaneous and peak needs of all the apps and services required at the premise. Some apps will require continuous streaming bandwidth, and others can use variable file transfer rates. The key is not to starve any media streams.

Round-trip latency: the lower the better. Delay is the enemy of reliable file transfer. TCP (which FTP and HTTP use) is very delay-sensitive and can operate ~80 times slower in the presence of large (200 ms) round-trip delays compared to small delays (10 ms). There are practical ways to circumvent the slowdown using "transfer acceleration" techniques.

Jitter: the time variation in latency. For most apps and modest jitter (±25 ms), this metric is not critical, but large jitter values ruin TCP transfer rates.

Packet loss: the lower the better. The raw Internet has a packet loss of about 0.1 percent, although this can vary widely depending on traffic conditions. Also, loss is often not directionally symmetric, which can adversely affect transfer rates due to TCP's behavior. TCP-based data rate is reduced by about a factor of 3 when loss increases by a factor of 10.
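The latency and loss effects above are consistent with the well-known Mathis approximation for steady-state TCP throughput, which scales as MSS / (RTT × √loss). This sketch is not from the column itself; the constant and parameter values are illustrative:

```python
import math

def mathis_throughput_mbps(mss_bytes=1460, rtt_s=0.010, loss=0.001, c=1.22):
    """Rough steady-state TCP throughput per the Mathis model:
    rate ~ C * MSS / (RTT * sqrt(p)), returned in Mbps."""
    return c * mss_bytes * 8 / (rtt_s * math.sqrt(loss)) / 1e6

fast = mathis_throughput_mbps(rtt_s=0.010)   # short 10 ms round trip
slow = mathis_throughput_mbps(rtt_s=0.200)   # long-haul 200 ms round trip
lossy = mathis_throughput_mbps(loss=0.010)   # loss up 10x from the baseline

print(f"10 ms RTT, 0.1% loss: {fast:.1f} Mbps")
print(f"200 ms RTT, 0.1% loss: {slow:.1f} Mbps")
print(f"10 ms RTT, 1% loss:   {lossy:.1f} Mbps")
```

Under this model a 20x increase in round-trip time cuts throughput 20x, and a tenfold rise in loss cuts it by √10 ≈ 3.2x — close to the factor-of-3 figure quoted above.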

There are three links in the overall QoS chain. Let's consider the middle link first. Fig. 1 shows two paths from premise to cloud. The most common connection is the "best effort" Internet, path A. No carrier will guarantee Internet delay, loss, or bandwidth; you get what you get. Of course, your facility connection to the Internet (the so-called "last mile") has some QoS guarantees, but this is a small contributor to the overall Internet QoS. So purchasing a 100 Mbps clean pipe to the Internet does not guarantee 100 Mbps end-to-end data transfers by any means.

An alternative to Internet connectivity is to purchase a direct connection to your cloud vendor (path B in Fig. 1). Amazon, Terremark, and other cloud providers offer this option. An example of this relies on the famous One Wilshire telecom hub in Los Angeles. It has direct paths to many cloud vendors, bypassing the Internet. It's possible to link from a media facility to One Wilshire using Metro Ethernet (for example), thus creating an Internet bypass and guaranteeing an excellent QoS for the facility-to-cloud transport chain.

For sure, paths A and B are big contributors to the overall transport QoS, but next in line is usually the local premise QoS. Running apps and other services in the cloud puts strict demands on the in-house networking, and some facilities are not geared to support the required QoS. This could make the facility the weak link in the transport chain. Don't assume "all will be OK" with in-house networks. Run tests and measure performance; don't assume anything, measure it.
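As a minimal sketch of that "measure it" advice, the hypothetical helper below summarizes a batch of round-trip probe samples into the metrics discussed above (mean latency, jitter, loss); the sample values are illustrative, not measurements from the column:

```python
import statistics

def summarize_probes(rtts_ms):
    """Summarize round-trip probe samples in milliseconds.
    None marks a lost probe. Returns (mean latency, jitter as the
    standard deviation of successful RTTs, loss percentage)."""
    ok = [r for r in rtts_ms if r is not None]
    loss_pct = 100.0 * (len(rtts_ms) - len(ok)) / len(rtts_ms)
    return statistics.mean(ok), statistics.stdev(ok), loss_pct

# Illustrative probe run: two lost packets and one latency spike.
samples = [21.0, 23.5, 22.1, None, 24.9, 21.8, 80.2, 22.4, None, 23.0]
mean_rtt, jitter, loss = summarize_probes(samples)
print(f"mean {mean_rtt:.1f} ms, jitter {jitter:.1f} ms, loss {loss:.0f}%")
```

Even a simple summary like this exposes the outlier spike (high jitter) and the loss rate that a raw "the link is up" check would hide.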

The good news is that facility managers have total control over local transport QoS so you have a good chance of an excellent overall end-to-end QoS. Finally, the cloud-provider portion should offer the best QoS performance in the three link chain.

It has been shown that the cloud can be invisible if transport QoS parameters are sufficiently defined, measured, and managed on a daily basis. I have personally used a video editor app executing 4,000 round-trip miles from the UI, and the cloud was invisible in my tests. My first experience of this made me a believer in the promise of the cloud.

Transport QoS is only one aspect of the total "quality of experience." In the next column other cloud QoS metrics will be considered.

Cloud Computing and Fall of the Old Guard

Excerpted from AppSense Report by Jon Wallace

Late last year I attended the AWS re: Invent Conference, Amazon's first global conference dedicated to all things cloud. The conference itself was spectacular, hosting an array of impressive sponsors and an enviable lineup of speakers. 

Throughout the various sessions, one theme was prevalent: cloud computing is fundamentally changing the landscape of enterprise technology, eradicating "the old-guard mentality" that opposes change faster than ever before.

As the enterprise continues to adopt cloud computing at a quicker-than-anticipated rate, a new era of enterprise development is being ushered in.

Historically, developers who wanted to launch a global application had much to consider. Substantial infrastructure needed to be implemented to ensure failover and high availability, and a certain element of predictability needed to be applied to cater for scale.

This is no longer the case. Today application developers can write their software using languages like Python or PHP in a few months and deploy in a massively scaled environment in a matter of hours.

This shift to more user-centric technology in the enterprise has been difficult for "old-school" developers to adapt to. It's always interesting to speak to the new generation of app developers and hear how easily they grasp terms like "compute" without the need for further clarification. 

While this is often the case for the "new kid on the block" it is not necessarily true for the seasoned but disconnected industry veteran, who may often dismiss cloud computing technology out of simple frustration.

Like any new technology (think Apple OS X), there will be a natural resistance towards cloud computing and the many services it can provide. However, those who have decided to become early adopters have learned quickly that cloud computing can enable convenient, on-demand access to a shared pool of resources with minimal management effort, a huge plus for users seeking real-time and efficient collaboration.

Giant software organizations will continue to be challenged as red tape prevents them from being as agile as smaller newcomers, and CIOs will ultimately succumb to the will of users who demand a more efficient means of productivity in the workplace.

While cloud has been around for a number of years, like Moore's law, its pace of adoption will continue to rapidly increase.

The future of cloud computing is here. With cloud connectors like DataNow, IT teams will be able to shift backend storage to the cloud without disrupting user workflow. Exciting times for the increasingly mobile enterprise!

Stations' Emergency Alert Warns of Zombies

Excerpted from Broadcasting & Cable Report by Michael Malone

A hacker with an apparent taste for zombie culture reportedly broke into the Emergency Alert System at multiple TV stations and broadcast a bogus zombie-related emergency.

The message on KRTV Great Falls (MT) interrupted programming February 11th and told viewers, in a garbled voice, that zombies were rising from their graves and hunting down the living. 

"Our Emergency Alert System was hacked," said News Director Heath Heggem. "The matter is under investigation." 

The KRTV website added, "Our engineers are investigating to determine what happened and if it affected other media outlets." 

"This appears to be a breach of security of a product used by some local broadcasters," said a spokesman for FEMA. 

"FEMA's integrated public alert and warning system was not breached or compromised and this had no impact on FEMA's ability to activate the Emergency Alert System to notify the American public. FEMA will continue to support the FCC and other federal agencies looking into the matter." 

Zombies have emerged as a pop culture phenomenon. The AMC series The Walking Dead returned February 10th with 12.3 million viewers tuning in to the season premiere.

Compared with the massive numbers for AMC's show, the affected markets reach modest audiences. KRTV is a CBS affiliate with the CW on its dot-two. It is owned by Cordillera. Great Falls is DMA No. 190.

"The Bachelorette" and "The Carrie Diaries" were airing on WBUP-WBKP Marquette (MI) when the zombie message broke through the programming. 

"It appears to be the same content," says Cynthia Thompson, Station Manager and News Director at the duopoly. Thompson says local police, Michigan State Police, and the FBI are investigating, as is the FCC. 

"It involves all of them," she says. "You're dealing with a security issue, a communications issue, a safety issue." She added that she believes a handful of stations in other states were hit by the zombie message. 

Ed Czarnecki, Senior Director of Strategy and Regulatory Affairs at Monroe Electronics, which manufactures EAS systems, noted that the events highlighted the importance of improved IT security at stations. 

"There has been a lot of interesting speculation about what happened but the EAS devices were not themselves hacked," he argues. 

Rather, someone or some group hacked through the stations' firewalls and then was able to gain access to the EAS devices because the devices' default passwords had not been changed.

"There is no flaw in the device, this is a matter of not updating the administrative password," Czarnecki said. In some ways, the bogus zombie alerts were a blessing in disguise, he added. 

"It is a reminder that stations need to inspect all of their devices and make sure they aren't set to factory defaults." 

In 2011, Monroe issued a white paper on best security practices.

Obama: $222K Fine for Sharing 24 Songs OK

Excerpted from Digital Music News Report

This is no accident: after all, the Obama Administration is littered with ex-Recording Industry Association of America (RIAA) lawyers. But the question is whether the US Government is "doing it wrong" by raiding infringement compounds in New Zealand, bullying programmers into suicide, and doling out heavy-handed advice like this.

In an opinion just delivered to the Supreme Court, the Administration has urged the Justices to uphold a $222,000 judgment against Jammie Thomas for uploading 24 songs — that is, by simply declining to hear the case at all. The case dates back to 2007, around the time when Kazaa was the file-sharing app du jour.

Thomas' lawyers want the Supreme Court to rule on whether the fine is unfairly excessive and therefore unconstitutional.

But this is not an issue of constitutionality, according to the Administration, but rather an issue of the sanctity of copyright and the incentives it creates. "In particular, the exclusive rights conferred by a copyright are intended to motivate the creative activity of authors by the provision of a special reward, and to allow the public access to the products of their genius after the limited period of exclusive control has expired," the document asserts.

"That public interest cannot be realized if the inherent difficulty of proving actual damages leaves the copyright holder without an effective remedy for infringement or precludes an effective means of deterring further copyright violations."

Indeed, copyright needs to stand for something. But... $222,000 for 24 songs, which would have given major record labels about $16.80 (24 songs x iTunes' $0.70 royalty) if Thomas had paid?

Absolutely "appropriate," according to the recommendation, especially since a jury actually fined Thomas $1.5 million at one point (this is a case that has gone through three full trials).

There's also the issue of whether Thomas, a single mom with limited computer literacy, should be sent to the gallows of near-certain personal bankruptcy. Thomas is no Bambi, but then again, there are far worse copyright criminals on the high seas.

"Jammie Thomas-Rasset's copyright infringement was willful in the extreme," the major labels, as represented by the RIAA, blasted in a statement. "Three separate juries have concluded that her blatant and unapologetic violation of respondents' rights warranted a substantial award under the Copyright Act's statutory damages provision."


Coming Events of Interest

2013 Symposium on Cloud and Services Computing - March 14th-15th in Tainan, Taiwan. The goal of SCC 2013 is to bring together researchers, developers, government sectors, and industrial vendors that are interested in cloud and services computing.

NAB Show 2013 - April 4th-11th in Las Vegas, NV. Every industry employs audio and video to communicate, educate and entertain. They all come together at NAB Show for creative inspiration and next-generation technologies to help breathe new life into their content. NAB Show is a must-attend event if you want to future-proof your career and your business.

CLOUD COMPUTING CONFERENCE at NAB - April 8th-9th in Las Vegas, NV. Explore the new ways cloud-based solutions have accomplished better reliability and security for content distribution. From collaboration and post-production to storage, delivery, and analytics, decision makers responsible for accomplishing their content-related missions will find this a must-attend event.

Digital Hollywood Spring - April 29th-May 2nd in Marina Del Rey, CA. The premier entertainment and technology conference. The conference where everything you do, everything you say, everything you see means business.

CLOUD COMPUTING EAST 2013 - May 20th-21st in Boston, MA. CCE:2013 will focus on three major sectors, GOVERNMENT, HEALTHCARE, and FINANCIAL SERVICES, whose use of cloud-based technologies is revolutionizing business processes, increasing efficiency and streamlining costs.

P2P 2013: IEEE International Conference on Peer-to-Peer Computing - September 9th-11th in Trento, Italy. The IEEE P2P Conference is a forum to present and discuss all aspects of mostly decentralized, large-scale distributed systems and applications. This forum furthers the state-of-the-art in the design and analysis of large-scale distributed applications and systems.

CLOUD COMPUTING WEST 2013 — October 27th-29th in Las Vegas, NV. Three conference tracks will zero in on the latest advances in applying cloud-based solutions to all aspects of high-value entertainment content production, storage, and delivery; the impact of cloud services on broadband network management and economics; and evaluating and investing in cloud computing services providers.

Copyright 2008 Distributed Computing Industry Association
This page last updated March 2, 2013
Privacy Policy