Distributed Computing Industry
Weekly Newsletter


June 10, 2013
Volume XLIV, Issue 2


Plan Now to Participate in CCW:2013

This week, we'll be sending a second wave of invitations to prospective speakers to participate in CLOUD COMPUTING WEST 2013 (CCW:2013) taking place October 27th-29th at The Cosmopolitan in Las Vegas, NV.

This year's themes are "Revolutionizing Entertainment & Media" and "The Impact of Mobile Cloud Computing & Big Data."

There's no question that advances in cloud computing are having enormous effects on the creation, storage, distribution, and consumption of diverse genres of content.

Most profound among these effects are the proliferation of portable playback devices and the accompanying generation of unprecedented amounts of viewership, listenership, and usage data from audiences worldwide.

The ubiquity and widespread acceptance of user interfaces that reflect the dynamic interactivity of smart-phone applications are rapidly replacing the flat linearity of traditional TV channel line-ups and changing the expectations of a new generation of consumers.

Cloud-based information and entertainment of all kinds, always accessible on every connected device, will become the new norm.

And complete data on the consumer behaviors associated with discovering and consuming this content will displace metering and ratings technologies based solely on statistical sampling.

Two CCW:2013 conference tracks will zero in on the latest advances in applying cloud-based solutions to all aspects of high-value entertainment production and storage, as well as media delivery and analysis options; along with the growing impact of mobile cloud computing on this sector, and the related expansion of big data challenges and opportunities.

DCINFO readers are encouraged to get involved in CCA's and DCIA's CCW:2013 as exhibitors, sponsors, and speakers.

The CCA is handling exhibits and sponsorships. Please click here for more information.

The DCIA's role is to provide keynotes, panelists, and case-study presenters to participate in our comprehensive agenda of sessions in ENTERTAINMENT & MEDIA and MOBILE CLOUD & BIG DATA.

Please click here to apply to speak at CCW:2013.

Report from CEO Marty Lafferty

We urge our US readers to join us this week in contacting federal lawmakers in both Houses of Congress to support three timely and important measures critical to our industry's continued advancement, and to oppose a troubling proposal by the Securities and Exchange Commission (SEC).

We have previously cited the SEC — in a positive way — for its early adoption of cloud services, and continue separately to support that work.

The SEC now collects more than 400 gigabytes of market information daily as part of an exemplary implementation of Big Data.

This effort, which relies on the Market Information Data Analytics System (MIDAS), helps foster a better understanding of rapidly emerging market trends, including high-frequency trading and flash-crashes, and informs policy making in constructive ways.

The $2.5 million per year cloud-based MIDAS service, hosted by SEC vendor Tradeworx, is powered by Amazon Web Services, including Amazon S3 storage, Amazon EC2 infrastructure-as-a-service (IaaS), and Amazon Elastic Block Store.

MIDAS captures data such as time, price, and trade-type on every order posted on national stock exchanges, as well as cancellations and order modifications. MIDAS does not show the identities of entities involved in trades, however, and MIDAS doesn't look at trades executed outside the system.
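To make the shape of such a pipeline concrete, here is a minimal sketch, in Python with the boto3 library, of batching order-event records of the kind MIDAS captures (time, price, trade-type) into Amazon S3. The bucket, key, and sample records are hypothetical; the actual MIDAS implementation is not public.

import csv
import io

import boto3

def upload_daily_events(records, bucket="midas-example-bucket",
                        key="events/2013-06-10.csv"):
    """Serialize a day's order events to CSV and store the batch in S3."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["time", "price", "trade_type"])
    writer.writeheader()
    writer.writerows(records)
    s3 = boto3.client("s3")  # credentials come from the environment
    s3.put_object(Bucket=bucket, Key=key, Body=buf.getvalue().encode("utf-8"))

# Hypothetical sample: one added order on a national exchange.
upload_daily_events([{"time": "09:30:00.001", "price": "101.25",
                      "trade_type": "add"}])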

Our problem with the SEC regards its Senate proposal to go far beyond this admirable deployment of cloud-based Big Data analytics in ways that are deeply troubling.

Predating the recent string of related scandals involving government overreach into the private lives of Americans, the DCIA has been participating in the Digital Due Process (DDP) Coalition, which advocates Electronic Communications Privacy Act (ECPA) reform.

The outdated ECPA statute allows government agencies to access private online content, including information stored in data centers by users of cloud computing services, without a warrant from a judge.

DDP's exemplary work on this issue helped advance Senate bill S. 607, the ECPA Amendments Act, sponsored by Judiciary Committee Chairman Patrick Leahy (D-VT) and Senator Mike Lee (R-UT), which would close this loophole in privacy protection.

S. 607 was approved by the Senate Judiciary Committee in April.

DDP also helped drive the like-named House companion to the Leahy-Lee bill, H.R. 1847, the ECPA Amendments Act, which has since been introduced by Congressman Matt Salmon (R-AZ) and has attracted 18 co-sponsors.

And now, with momentum growing as a result of revelations that the IRS has been playing politics with its review of applications for tax-exempt status and has claimed that Americans who use e-mail have no expectation of privacy, Congressmen Kevin Yoder (R-KS) and Tom Graves (R-GA) have introduced a third related measure in the House, H.R. 1852, the E-mail Privacy Act.

H.R. 1852 has rapidly picked up nearly 80 co-sponsors, including Congressman Jared Polis (D-CO), a long-term advocate of digital rights.

All three of these bills require that the government get a warrant before accessing private digital content stored in the cloud and have the DCIA's full support.

In addition, the embattled Attorney General Eric Holder recently testified that the Department of Justice (DoJ), traditionally an opponent of such reform, will also support a warrant requirement for online content.

In opposition to this very positive progress, the SEC has now proposed to the Senate that it and other investigative agencies be allowed to gain access, secretly, to information that companies outsource to cloud providers.

Such an absence of legal protection for cloud-based information is exactly what we're seeking to reverse with the ECPA Amendments Acts, which would establish a clear warrant requirement for all content stored in the cloud.

The SEC's requested exception to the warrant requirement would drastically alter for the worse the way it and other regulatory agencies are permitted to conduct civil investigations.

Right now, investigating agencies are required to serve a subpoena on targets of their investigations when they seek to obtain responsive documents, with unrelated documents excluded, in whatever formats they exist.

A target's attorneys are permitted to determine which documents, whether hard-copy or electronic, are responsive, and which are irrelevant or legally privileged.

The SEC proposal would put cloud service providers in the position of having to turn over ALL information in the target client's account — regardless of relevance or privilege — and the target would not be notified that this was happening until after the fact.

This would constitute an outrageous breach of reasonably expected privacy protection under the law.

Accordingly, we ask DCINFO readers to join with us in requesting that the Senate reject the SEC proposal and continue to move ahead with ECPA reform.

Updating ECPA to ensure warrant protection for our digital information represents one of the rare issues in Washington, DC on which both parties should agree.

ECPA reform has bipartisan support and we have a new public stance from the DoJ indicating its support. In order to make ECPA reform a reality, now we need a strong push from DCINFO readers.

Contact your Congressional representatives and ask them to support a warrant requirement for all digital content.

Ask them to co-sponsor Leahy-Lee S. 607, Salmon H.R. 1847, and Yoder-Graves H.R. 1852, and to oppose the SEC proposal. Share wisely, and take care.

FBI & NSA Secretly Mine Data from 9 US Tech Giants

Excerpted from ZDNet Report by Rachel King

Turns out US government agencies might be tapping into a lot more than just Verizon customer records.

Both the National Security Agency and the Federal Bureau of Investigation are said to have been secretly mining data directly from the servers of at least nine top US-based technology companies, according to The Washington Post.

According to a leaked presentation intended only for senior analysts within the NSA Signals Intelligence Directorate, which was obtained by the Post, this has all been done since 2007 under a highly classified program dubbed "PRISM."

As for the companies involved, it's a "who's who" list filled with Silicon Valley behemoths that is surely going to upset lawmakers and average Internet users alike.

The ring of nine consists of Microsoft, Yahoo, Google, Facebook, AOL, Skype, YouTube, Apple, and video chat room community PalTalk. Apparently Dropbox was slated to be the next one added to the list.

The kinds of content being extracted from the central servers at these tech companies include audio, video, photos, e-mails, documents, and connection logs.

According to the report, the data was extracted to produce analysis that could point toward tracking a person's movements and contacts over time.

The Washington Post's Barton Gellman and Laura Poitras highlighted why the NSA's involvement is particularly alarming:

"It is all the more striking because the NSA, whose lawful mission is foreign intelligence, is reaching deep inside the machinery of American companies that host hundreds of millions of American-held accounts on American soil."

The NSA is already under fire after it was discovered on Wednesday that the agency has been collecting millions of Verizon Wireless customer records on a daily basis.

As first reported by The Guardian, based on another leaked "top secret" court order, the nation's largest mobile provider was ordered on an "ongoing, daily basis" to hand over information outlining call data in its systems to the NSA.

On Thursday, ZDNet obtained a copy of a note sent by Verizon Chief Counsel Randy Milch to employees.

In the note, he didn't confirm or deny the story. But in describing it as an "alleged" court order, he stressed that the text "forbids Verizon from revealing the order's existence."

Europe Continues Wrestling with Online Privacy Rules

Excerpted from NY Times Report by James Kanter and Somini Sengupta

More than a year ago, the European Union's top justice official proposed a tough set of measures for protecting the privacy of personal data online.

But because of intense lobbying by Silicon Valley companies and other powerful groups in Brussels, several proposals have been softened, no agreement is in sight and governments are openly sparring with one another over how far to go in protecting privacy.

On Thursday, justice ministers from the European Union's 27 member states agreed to a business-friendly proposal under which what companies do with personal data would be scrutinized by regulators only if there were "risks" to individuals, including identity theft or discrimination.

The ministers debated a proposal under which companies would no longer need to obtain "explicit" consent from users whose personal data they collect and process, but only "unambiguous" consent, which is considered a lower legal threshold. And they discussed a proposal on balancing an individual's right to data protection with other rights, including the freedom to do business.

The ministers deferred discussion of the other most fractious provision, the so-called right to be forgotten. But in recent weeks, public comments by lawmakers and draft language suggested a softening of approach.

"The right to be forgotten has been softened, made more palatable," said Viktor Mayer-Schonberger, professor of Internet governance at the University of Oxford. "But it is by no means dead."

Although a final version of the legislation is not expected to be completed for many months, and maybe not until next year, the developments on Thursday are an early signal that the technology industry's lobbying efforts are gaining some traction.

The lobbying has been "exceptional" and legislators in Europe need "to guard against undue pressure from industry and third countries to lower the level of data protection that currently exists," said Peter Hustinx, the European data protection supervisor, referring to countries outside of the European Union.

"The benefits for industry should not and do not need to be at the expense of our fundamental rights to privacy and data protection," Mr. Hustinx warned in e-mailed comments.

In the last year, American technology companies have dispatched representatives to Brussels and issued white papers through industry associations arguing that stringent privacy regulations would hamstring businesses, already suffering from the recession in Europe.

United States government officials have also made trips across the Atlantic to urge policy makers like Viviane Reding, the union's justice commissioner, who drafted the original, strict measures, to adopt a less restrictive approach to data privacy.

The industry's arguments have found a ready audience among some European governments. They include Ireland and Britain, where there are acute worries that the European Union is failing to take advantage of growth opportunities from Internet businesses that might help revive the economy. Apple, Facebook and Google all have European headquarters in Dublin.

"Europe is not sleepwalking into unworkable regulations," said Richard Allan, Facebook's director of policy for Europe, echoing a cautious optimism among industry officials about the data privacy law. "What's positive is that over the last year, the debate has broadened out. There are other voices in the debate, who are saying: 'Hang on a minute. What about the economic crisis?' "

The proposed law would affect most companies that deal in personal information — including pictures posted on social networks or information on what people buy on retail sites or look for using a search engine.

Whatever is enacted would serve as the privacy law in every country in the European Union and potentially have a bearing on other countries drafting data protection laws of their own.

The ministers took up their version of the law on Thursday; another version is under discussion by the European Parliament.

The Parliament is considering its version while intense lobbying is being conducted by companies, governments and other organizations that have submitted about 4,000 amendments. The various versions will need to be reconciled before they can become law.

"Ultimately, nothing is agreed until everything is agreed," Alan Shatter, the Irish justice minister, who was the chairman of Thursday's meeting, told his colleagues.

The liberal-leaning language on obtaining consent from Web users was suggested by the Irish. Another provision the Irish suggested would remove the restriction on companies to collect only a "minimum" amount of data. That could be important for any company that wants to use the data it collects in new ways.

But Chris Grayling, the British justice secretary, said those concessions were not enough. He said the rules were still overly burdensome for European businesses, along with United States technology giants.

Although the most contentious provision — the right to be forgotten — did not come up on Thursday, language is circulating that could limit it. The provision as originally conceived would enable users to demand that their accumulated data be deleted forever. But it has raised questions about what to do with data that is in turn shared with third-party companies.

The softer language would tweak it so that the data may not necessarily disappear from the Internet as a whole, but be scrubbed from the site where it was originally posted.

The shift, said David A. Hoffman, global privacy officer for Intel, shows "a much greater degree of nuance."

Mr. Hoffman added, "They are going to move it much closer to a more limited, balanced ability for individuals to access their data and then to request their data be corrected, amended or deleted."

That is fairly close to what exists in most European countries now.

Also still to be debated are the sanctions for breaking the law, which could include fines of up to 2 percent of a company's annual worldwide sales.

Ms. Reding sharply disagreed with some of the modifications put forth by the Irish. And she won strong backing during the meeting from Italy and France for maintaining robust rules requiring companies to give consumers every chance to guard their privacy online.

The French justice minister, Christiane Taubira, said at the meeting: "Not saying anything is not the same thing as saying yes, so we think it's very important that we have explicit consent. We have to protect people."

CIA Told to Rebid Amazon Cloud Contract

Excerpted from Wall St. Journal Report by Spencer Ante

The Government Accountability Office (GAO) recommended on Thursday that the Central Intelligence Agency (CIA) reopen negotiations for a large cloud-computing services contract that was originally awarded to Amazon.

The GAO was acting in response to a protest filed by IBM over the contract worth an estimated maximum of $600 million over its initial four-year period.

A spokesman for the GAO said the requirements for the contract hadn't been released publicly, but he added that the contract is aimed at funding the development of a private cloud-computing service that would be run and operated by a commercial provider for the U.S. intelligence community.

The CIA has 60 days to say whether it will follow the recommendation. A GAO spokesman said government agencies almost always follow the recommendations.

The GAO upheld two portions of the IBM protest. One claim was that the CIA failed to evaluate prices for a possible task provided for in the contract. The second sustained protest was that a software-security requirement in the contract was waived by the CIA only for Amazon during negotiations.

The GAO denied IBM's argument that the CIA didn't properly evaluate Amazon's past performance given certain service outages that occurred with Amazon's cloud service in 2012.

"The CIA selected Amazon Web Services based on its superior technological platform which will allow the agency to rapidly innovate while delivering the confidence and security assurance needed for mission-critical systems," said a spokeswoman for Amazon Web Services, the company's unit that provides cloud computing, in an email.

The spokeswoman said Amazon looks forward to a fast resolution of the two issues so the CIA can move forward with an important contract.

"We now anticipate the reopening of the contract proposal process and look forward to competing for the opportunity to serve this important federal agency on this vital program," IBM said in a written statement.

The GAO's decision was issued under a protective order because it may contain proprietary and sensitive information.

If the CIA doesn't follow the recommendation, the GAO spokesman said the GAO would be required to alert several congressional committees.

Then it would become a matter of a political disagreement, said the spokesman.

Amazon didn't respond immediately to a request for comment.

Cloud Computing Rains Billion-Dollar Deals

Excerpted from San Jose Mercury News Report by Michael Liedtke

A decade ago, the mere idea of cloud computing was a difficult concept to explain, let alone sell. Today, the technology is spurring a high-stakes scramble to buy some of the early leaders in the cloud-computing movement.

The latest examples of the trend emerged Tuesday as two major technology companies announced acquisitions aimed at seeding their own clouds.

Cloud-computing pioneer Salesforce.com said it will spend about $2.5 billion to buy ExactTarget, a specialist in helping other companies manage marketing campaigns and other business functions through email, social networks and a variety of digital services that can be reached on any device with an Internet connection.

The more time-tested IBM is snapping up SoftLayer Technologies, a privately held company that leases extra computing horsepower to startup companies and medium-size businesses that don't have the resources or desire to build their own data centers. IBM didn't disclose the financial terms of the deal, but The Wall Street Journal pegged the cost at about $2 billion. The Journal cited an unidentified person familiar with the matter.

ExactTarget, based in Indianapolis, IN, and SoftLayer, based in Dallas, TX, are just the latest in a batch of billion-dollar babies hatched by what was once viewed as a kooky craze.

Cloud computing refers to the practice of renting software and other computing accessories over the Internet, an approach that once seemed out of step with the long-standing policies of corporate customers and government agencies who preferred to own their machines and the applications running on them.

But that sentiment has changed in the past six years as the popularity of powerful smart-phones and tablet computers has driven the demand for services that can be reached from any Internet-connected device.

The phenomenon has helped propel cloud computing, and driven lucrative deals in the space. In the past two years alone, long-established technology companies such as IBM, Oracle, and SAP have each spent several billions of dollars acquiring cloud-computing vendors.

"Deals beget deals," said Peter Falvey, a Boston investment banker specializing in technology. "There are no doubt other companies now saying, 'What else is there to buy out in this space?' It's all part of the maturation process."

Questions are already being raised whether the buyers are getting overzealous and paying too much.

Investors seemed particularly troubled by Salesforce's decision to buy ExactTarget for $33.75 per share -- 53 percent above ExactTarget's market value before the deal was announced. It's a steep premium for a company that has suffered losses in each of the past four years, including nearly $58 million since the end of 2009.

Wall Street's misgivings about the deal caused Salesforce's stock to plunge $3.24, or 7.9 percent, to close Tuesday at $37.80. It was the steepest one-day decline in the stock in 13 months.

ExactTarget's setbacks came during a period of rapid growth. The company's annual revenue climbed from $72 million in 2008 to $292 million last year while its payroll has ballooned from under 400 employees in 2008 to nearly 1,700 now.

Salesforce expects ExactTarget to trim its adjusted earnings during its current fiscal year ending in January 2014 by 16 cents per share. The deal is expected to close by the end of next month.

Salesforce CEO Marc Benioff has become accustomed to shaking off the skeptics. He was frequently mocked when he started his San Francisco-based company in 1999 and boldly predicted its model for leasing business software applications would revolutionize the technology industry. Even after Tuesday's sell-off, Salesforce now boasts a market value of $22 billion. Benioff's personal fortune stands at an estimated $2.6 billion.

Benioff, an executive known for sweeping statements, hailed the ExactTarget deal "as really a historic date for cloud computing" during a Tuesday conference call for analysts. He is counting on ExactTarget to help Salesforce sell more marketing services on mobile devices. As part of its marketing expansion, Salesforce previously spent nearly $1.1 billion to buy Buddy Media and Radian6 Technologies during the past two years.

Once the ExactTarget deal is completed, Benioff said Salesforce plans to take a "vacation" from deal making for 12 to 18 months.

"That's really because we're going to double down and focus on the success of ExactTarget," Benioff assured analysts.

Cloud Computing Driving Data Center Automation

Excerpted from Data Center Knowledge Report by Bill Kleyman

The dynamic nature of cloud computing has pushed data center workload, server, and even hardware automation to whole new levels. Now, any data center provider looking to get into cloud computing must look at some form of automation to help them be as agile as possible in the cloud world.

New technologies are forcing data center providers to adopt new methods to increase efficiency, scalability, and redundancy. Let's face facts: there are numerous big trends that have driven increased use of data center facilities. These trends include more users, more devices, more cloud, more workloads, and a lot more data.

As infrastructure improves, more companies have looked towards the data center provider to offload a big part of their IT infrastructure. With better cost structures and even better incentives in moving towards a data center environment, organizations of all sizes are looking at co-location as an option for their IT environment.

With that, data center administrators are teaming with networking, infrastructure and cloud architects to create an even more efficient environment. This means creating intelligent systems from the hardware to the software layer. This growth in data center dependency has resulted in direct growth around automation and orchestration technologies.

Now, organizations can granularly control resources, both internally and in the cloud. This type of automation can be seen at the software layer as well as the hardware layer. Vendors like BMC and ServiceNow, and Microsoft with SCCM/SCOM, are working towards unifying massive systems under one management engine to provide a single pane of glass into the data center workload environment.

Furthermore, technologies like the Cisco UCS platform allow administrators to virtualize the hardware layer and create completely automated hardware profiles for new blades and servers. This hardware automation can then be tied into software-based automation tools like SCCM. Already we're seeing direct integration between software management tools and the hardware layer.

Finally, from a cloud layer, platforms like CloudStack and OpenStack allow organizations to create orchestrated and automated fluid cloud environments capable of very dynamic scalability. Still, when a physical server or hardware component breaks — we still need a person to swap out that blade.

To break it down, it's important to understand what layers of automation and orchestration are available now — and what might be available in the future.

Server layer. Server and hardware automation have come a long way. As mentioned earlier, there are systems now available which take almost all of the configuration pieces out of deploying a server. Administrators only need to deploy one server profile and allow new servers to pick up those settings. More data centers are trying to get into the cloud business. This means deploying high-density, fast-provisioned, servers and blades. With the on-demand nature of the cloud, being able to quickly deploy fully configured servers is a big plus for staying agile and very proactive.

Software layer. Entire applications can be automated and provisioned based on usage and resource utilization. Using the latest load-balancing tools, administrators are able to set thresholds for key applications running within the environment. If a load-balancer, a NetScaler for example, sees that a certain type of application is receiving too many connections, it can set off a process that will allow the administrator to provision another instance of the application or a new server which will host the app.
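As a sketch of that trigger logic, the Python below polls a load-balancer for an application's connection count and provisions a new instance when a threshold is crossed. get_connection_count() and provision_instance() are hypothetical stand-ins for the load-balancer and provisioning APIs in use (a NetScaler and an SCCM task sequence, for example); a production loop would also need cool-down periods and scale-in logic.

import time

CONNECTION_THRESHOLD = 5000   # connections that count as "too many"
POLL_INTERVAL_SECONDS = 60

def get_connection_count(app_name):
    """Hypothetical: ask the load-balancer for active connections."""
    raise NotImplementedError

def provision_instance(app_name):
    """Hypothetical: kick off provisioning of another app instance."""
    raise NotImplementedError

def autoscale_loop(app_name):
    """Watch one application and scale out when it is overloaded."""
    while True:
        if get_connection_count(app_name) > CONNECTION_THRESHOLD:
            provision_instance(app_name)
        time.sleep(POLL_INTERVAL_SECONDS)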

Virtual layer. The modern data center is now full of virtualization and virtual machines. In using solutions like Citrix's Provisioning Server or Unidesk's layering software technologies, administrators are able to take workload provisioning to a whole new level. Imagine being able to set a process that will kick-start the creation of a new virtual server when one starts to get over-utilized. Now, administrators can create truly automated virtual machine environments where each workload is monitored, managed and controlled.

Cloud layer. This is a new and still emerging field. Still, some very large organizations are already deploying technologies like CloudStack, OpenStack, and even OpenNebula. Furthermore, they're tying these platforms in with big data management solutions like MapR and Hadoop. What's happening now is true cloud-layer automation. Organizations can deploy distributed data centers and have the entire cloud layer managed by a cloud-control software platform. Engineers are able to monitor workloads, how data is being distributed, and the health of the cloud infrastructure. The great part about these technologies is that organizations can deploy a true private cloud, with as much control and redundancy as a public cloud instance.
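As one hedged illustration of cloud-layer provisioning, the sketch below boots a server through OpenStack's compute API with python-novaclient; the endpoint, credentials, image, and flavor names are placeholders rather than a real deployment.

from novaclient import client

# Placeholder credentials and endpoint for a demo OpenStack project.
nova = client.Client("2", "admin", "secret", "demo",
                     "http://controller:5000/v2.0/")

image = nova.images.find(name="ubuntu-12.04")   # boot image to use
flavor = nova.flavors.find(name="m1.small")     # instance size

# Ask the cloud controller to schedule and boot a new server.
server = nova.servers.create(name="web-01", image=image, flavor=flavor)
print("Requested server %s (%s)" % (server.name, server.id))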

Data center layer. Although entire data center automation technologies aren't quite here yet, we are seeing more robotics appear within the data center environment. Robotic arms already control massive tape libraries for Google and robotics automation is a thoroughly discussed concept among other large data center providers. In a recent article, we discussed the concept of a "lights-out" data center in the future. Many experts agree that eventually, data center automation and robotics will likely make its way into the data center of tomorrow. For now, automation at the physical data center layer is only a developing concept.

The need to deploy more advanced cloud solutions is only going to grow. Organizations of all verticals and sizes are seeing the benefits of moving towards a cloud platform. At the end of the day, all of these resources, workloads, and applications have to reside somewhere. That somewhere is always the data center.

In working with modern data center technologies, administrators strive to be as efficient and agile as possible. This means deploying new types of automation solutions that span the entire technology stack. Over the coming years, automation and orchestration technologies will continue to grow in popularity as the data center becomes an even more central piece of any organization.

3 More Cloud Computing Myths Dispelled

Excerpted from InfoWorld Report by David Linthicum

Speaking season will be over this month. So far this year, I've done more than 25 presentations at various conferences around the country and have spoken with hundreds of people who are building or using clouds. All of this traveling and interaction has provided more data points for me to share via this blog.

Although some of the conversations have been encouraging, many have been downright disturbing because of the misconceptions people have. There continues to be much mythology in the cloud computing space, as I've pointed out in the past. It's time we dispel three more myths so that we can better understand the reality of the cloud.

Myth 1: Private clouds are, by default, secure.

Many enterprises are implementing private clouds with the assumption that just because it's private, it's also secure. Not true.

Security is something you design and engineer into the cloud solution -- it's not automatic. Thus, private clouds are not secure by default, and public clouds are not insecure by default.

You have to design and implement the appropriate security solution into the cloud. Just because you can see your server in the data center doesn't mean anything. After all, the data could be compromised as you watch it.

Myth 2: If I go OpenStack, I guarantee portability with other OpenStack providers.

Although OpenStack is becoming a solid IaaS cloud standard, there is no portability guarantee among OpenStack distribution providers. Who knows what the future holds? If you think you can write an application on an OpenStack private cloud and move it to an OpenStack public cloud without any modifications, you're dreaming. Those moving to OpenStack should be doing so because of the potential of this technology, not for portability.

Myth 3: The public cloud providers will access and analyze my data without my knowledge.

Public cloud providers couldn't care less about your data. They do care that you're successful with your use of their cloud -- and that you pay your cloud computing bills. The myth that they are selling your data to third parties is just not true, nor are they using it for their own market intelligence.

If you're concerned about such use, then encrypt the information you place on the public cloud. That way, nobody can see it even if it's seized by the government or accessed by a bored cloud data center admin. If you're even more paranoid, don't use a public cloud provider.
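For readers who want to act on that advice, here is a minimal sketch of client-side encryption using the Fernet recipe from Python's cryptography package; the payload is a stand-in, the upload step is elided, and safe key storage is the part you must get right.

from cryptography.fernet import Fernet

key = Fernet.generate_key()   # keep this secret, and off the cloud
f = Fernet(key)

ciphertext = f.encrypt(b"contents of a sensitive document")
# ...upload ciphertext, not plaintext, to the public cloud provider...
plaintext = f.decrypt(ciphertext)   # recoverable only with the key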

DataDirect Networks Appoints Joseph L. Cowan President 

DataDirect Networks (DDN) announced today that veteran technology executive Joseph L. Cowan has been named President of the company, reporting to CEO Alex Bouzari effective immediately.

In his role, Cowan will assume full operational management of the company, allowing Bouzari to focus on refining and enhancing the company strategy.

Cowan brings a proven track record in executive management to DDN, as well as significant experience repositioning private and publicly owned companies for sustainable growth. Cowan has served on DDN's Board of Directors since 2011 and most recently held the position of President and CEO of Online Resources Corporation, a leading provider of online financial services, which was acquired in January 2013 by ACI Worldwide.

Prior to that role, Cowan served as CEO and a member of the Board of Directors of Interwoven, a provider of content management software, from April 2007 until its acquisition by Autonomy in March 2009. Cowan also has held executive leadership roles at Manugistics Group, EXE Technologies, and Invensys Automation & Information Systems, among others.

At DDN, Cowan's responsibilities will include driving accelerated revenue and profit growth in the global big data and cloud infrastructure marketplace, as well as leading and aligning the company's worldwide corporate strategy to meet its overall business objectives and enhance its competitiveness.

Cowan received a BS degree from Auburn University and an MS degree from Arizona State University.

Tennis Channel Selects Octoshape

Octoshape, an industry leader in cloud-based streaming technology, and America ONE Sports, a leading provider of live broadband sports, announced today that they have been chosen for the third consecutive year by Tennis Channel as the exclusive provider of video streaming services for the prestigious 121st French Open on TennisChannel.com.

Tennis Channel will provide over 200 hours of coverage of the French Open, the premier clay-court tournament in the world and second of the four Grand Slam tournaments. The network is providing a live stream from its web site that allows online viewers to watch all the coverage for free.

"America ONE Sports and Octoshape have given Tennis Channel viewers the opportunity to watch the French Open where they want, when they want," said Robyn Miller, Senior Vice President, Marketing, Tennis Channel.

The consumer experience will include both live and video on demand, as well as advanced features such as instant rewind, picture-in-picture viewing and proprietary bonus features developed by America ONE Sports, such as live chat and real time event statistics. America ONE's sister company, ONE CONNXT, an award-winning leader in broadcast delivery, will provide the contribution transport technology so fans will not miss a moment of the excitement.

"This will be our third year providing broadband distribution for the French Open utilizing our partnership with Tennis Channel. Each year we truly try to push the envelope on features to enhance the user experience for this marquee event," said Paul Dingwitz, Chief Technology Officer, ONE Media Corp.

The video broadcast is delivered to over-the-top consumers via Octoshape's Infinite HD-M Federated Multicast Broadband TV platform. This technology enables the quality, scale and economics of traditional broadcast TV over the Internet. Telco and cable operators that are part of the Infinite HD-M Federated network receive the signals via native IP Multicast in a way that allows them to easily manage large volumes of traffic without upgrading their Internet capacity — ensuring a technically smoother streaming experience regardless of the usage.

"We are very proud to again be selected to support the Tennis Channel for the French Open," said Michael Koehn Milland, CEO of Octoshape. "We continue to raise the bar for providing TV to broadband-connected devices, and this year is no exception."

Scalability — The New Buzzword for Cloud Computing

Excerpted from Cloud Computing Journal Report by Liz McMillan

An exclusive Q&A with Barbara Aichinger, Co-Founder of FuturePlus Systems and VP of New Business Development.

"I think Social, Mobile, Analytics, Cloud (SMAC) will continue to grow and mature and the demand for more sophisticated data like streaming videos and online television will be a real game changer," stated Barbara Aichinger, co-founder of FuturePlus Systems and VP of New Business Development, in this exclusive Q&A with Cloud Expo Conference Chair. "Analytics is also very important and will become part of "Big Data," meaning the data pulled and pushed to and from the cloud will have an analytic associated with it."

Cloud Computing Journal: The move to cloud isn't about saving money, it is about saving time - agree or disagree?

Barbara Aichinger: It's about saving time and money, but it can be scary. As someone who really understands how the hardware works and, more important, how it doesn't work, I can understand the apprehension. The cloud needs standards, especially in the quality and reliability area, so that folks know their data is safe.

Cloud Computing Journal: How should organizations tackle their regulatory and compliance concerns in the cloud? Who should they be asking/trusting for advice?

Aichinger: I see two pieces here - first is the actual hardware, networking and the physical building that the machines are in. The second is all the layered products and software. My company is on the hardware side so I can speak to that. We make validation tools used by the designers of cloud servers. We see cost pressures causing hardware vendors to take short cuts on system validation. This can then show up in the data center as "ghost errors," i.e., memory errors that just happen once in a while but over time cause system outages.
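One hedged way an operator can watch for such ghost errors on Linux hosts is to poll the kernel EDAC driver's corrected and uncorrected memory-error counters in sysfs, as in the Python sketch below; the paths are the standard EDAC locations, though their availability depends on the hardware and kernel.

import glob

def read_edac_counts():
    """Return {controller: (corrected, uncorrected)} memory-error counts."""
    counts = {}
    for mc in glob.glob("/sys/devices/system/edac/mc/mc*"):
        with open(mc + "/ce_count") as fh:
            ce = int(fh.read())
        with open(mc + "/ue_count") as fh:
            ue = int(fh.read())
        counts[mc.rsplit("/", 1)[-1]] = (ce, ue)
    return counts

for controller, (ce, ue) in read_edac_counts().items():
    print("%s: %d corrected, %d uncorrected errors" % (controller, ce, ue))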

The industry does not have standards for memory subsystem compliance. What has happened is that the industry has created tools to test the memory subsystem for compliance to the JEDEC memory specification. This is different from an actual compliance program. It's basically voluntary. If you are a good vendor, you use the right equipment to make sure your servers are compliant. The problem is many vendors do not, and since the end user has no idea about memory standards, they never ask what type of compliance testing is being done. 

IT managers should insist on seeing the validation reports for the servers they buy. System integrators who package up various hardware pieces and sell them as a complete server should also take a good hard look at what memory subsystem compliance testing has been done.

Cloud Computing Journal: What does the emergence of open source clouds mean for the cloud ecosystem? How does the existence of OpenStack, CloudStack, OpenNebula, Eucalyptus and so on affect your own company?

Aichinger: I think open source is great especially for small companies trying to put something together for initial product releases. We use open source Ubuntu and the Google Stress App Test to exercise cloud hardware so that our tool can see if the memory subsystem is violating the JEDEC rules. In eight out of 10 systems we look at we find violations. 

These don't cause errors right away but over months and years they are statistically very likely to cause system crashes and cloud outages. OpenStack, CloudStack, OpenNebula, and Eucalyptus are all great additions to the cloud ecosystem. Our role at FuturePlus Systems is to make sure the hardware stays up by validating the design so these products can add value to the users of the cloud.
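For context on what such an exerciser run looks like, here is a hedged sketch of invoking Google's open-source stressapptest from Python while external analysis tools watch the memory bus; the duration and memory size are arbitrary, and -s (seconds to run) and -M (megabytes to test) are standard stressapptest flags.

import subprocess

result = subprocess.run(
    ["stressapptest", "-s", "600", "-M", "4096"],  # 10 minutes over 4 GB
    capture_output=True,
    text=True,
)
print(result.stdout[-500:])             # tail of the PASS/FAIL summary
print("exit status:", result.returncode)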

Cloud Computing Journal: With SMBs, the two primary challenges they face moving to the cloud are always stated as being cost and trust: where is the industry on satisfying SMBs on both points simultaneously - further along than in 2011-12, or...?

Aichinger: I think consumers don't have too much of a problem with moving to the cloud, but for small businesses it can be a challenge. For a core engineering company like FuturePlus Systems the issue is more trust than cost. We know the ins and outs of the cloud hardware and the network, so we take a good hard look at what data we keep in the cloud. Having said that, we are excited to see our tools being used more and more by system integrators and cloud IT managers. We are teaching them how to make sure the systems they deploy in the cloud are high quality and that the memory subsystems are compliant with the standards. As the cloud becomes "healthier" and more secure, more SMBs will be comfortable moving to the cloud.

Cloud Computing Journal: 2013 seems to be turning into a breakthrough year for Big Data. How much does the success of cloud computing have to do with that?

Aichinger: With all the sensors and mobile devices on the market Big Data is inevitable and that pushes the need for expansion in the cloud. Scalability is the new buzzword for cloud computing. I have read some papers that say Google already has one million servers deployed. Facebook is heading there quickly right along with others.

What many don't stop and think about is how failures scale. In 2009 a landmark study looked at the failure rate of the memory DIMMs in the Google fleet of servers. The data suggested that failures occurred far more often than the vendors' specifications indicated they should. Scaling those errors out to the one million servers Google reportedly has, the reported uncorrectable error rate of 1.3% to 4% would have servers going down 13,000 to 40,000 times a year.

Boil that down and you have two to five failures every hour somewhere in the Google fleet. These failures are expensive. The system has to be taken offline and repaired or replaced. Energy costs, labor and parts can quickly add up.
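A quick back-of-the-envelope check of those figures:

servers = 1000000            # reported size of the Google fleet
hours_per_year = 365 * 24    # 8,760

for rate in (0.013, 0.04):   # 1.3% and 4% annual uncorrectable-error rates
    per_year = servers * rate
    print("%.0f failures/year -> %.1f per hour"
          % (per_year, per_year / hours_per_year))
    # prints 13000 -> 1.5 per hour, then 40000 -> 4.6 per hour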

Cloud Computing Journal: What about the role of social: aside from the acronym itself SMAC (for Social, Mobile, Analytics, Cloud) are you seeing and/or anticipating major traction in this area?

Aichinger: I think SMAC will continue to grow and mature. I think the demand for more sophisticated data like streaming videos and online television will be a real game changer. Analytics is also very important and will become part of "Big Data," meaning the data pulled and pushed to and from the cloud will have an analytic associated with it. 

FuturePlus Systems has been dealing with Big Data for decades as we capture every signal on every clock edge in our tools that validate the cloud hardware. With cloud hardware going faster and becoming greener we have lots more data points to add to our own "Big Data" problem. I also think SMAC will drive better visualization techniques so that humans will be better able to digest all of the analytics associated with it.

Cloud Computing Journal: To finish, just as real estate is always said to be about "location, location, location", what one word, repeated three times, would you say Cloud Computing is all about?

Aichinger: Standards, Standards, Standards. When I meet up with engineers and managers who actually have to deploy cloud hardware, or who provide it, they seem to be constantly exhausted. Pricing pressures are causing them to look hard at the nuances between the various server platforms, disk vendors, and DIMM DRAM memory vendors. How do they know if it will work reliably? How do they know what the performance is? Will it be hard to maintain?

Most people never think about the actual machines that run in the data centers. These data centers are the lifeblood of the cloud. If they don't work well, I don't care how good or open your software is... it's not going to do anything. Good solid standards that address the functionality of the hardware, the reliability of the disks, and the JEDEC compliance of the memory subsystem would go a long way toward advancing cloud computing.

This is where FuturePlus Systems comes in. We have been providing validation tools for both hardware and software developers for more than 20 years. We are moving into the data center to help customers evaluate the servers they are purchasing and allowing software developers to see what performance loads they are putting on the system.

Coming Events of Interest

2013 Creative Storage Conference June 25th in Culver City, CA. Co-sponsored by the DCIA, CSC:2013 offers attendees opportunities to make valuable connections and participate in the latest trends and requirements for digital storage to serve creative minds and create new and growing markets. Register now and save $100.

Cloud World Forum - June 26th-27th in London, England. The Cloud World Forum offers a comprehensive agenda and speaker line-up from the cloud sector, making it an ideal platform for global authorities to present their "how-to" strategy and vision. Many recognized headline participants along with detailed coverage of the enterprise IT market.

Cloud Computing Summit - July 16th-17th in Bradenton, South Africa. Advance your awareness of the latest trends and innovations from the world of cloud computing. This year's ITWeb-sponsored event will focus on key advances relating to the infrastructure, operations, and available services through the global network.

NordiCloud 2013 - September 1st-3rd in Oslo, Norway. The Nordic Symposium on Cloud Computing & Internet Technologies (NordiCloud) aims at providing an industrial and scientific forum for enhancing collaboration between industry and academic communities from Nordic and Baltic countries in the area of Cloud Computing and Internet Technologies.

P2P 2013: IEEE International Conference on Peer-to-Peer Computing - September 9th-11th in Trento, Italy. The IEEE P2P Conference is a forum to present and discuss all aspects of mostly decentralized, large-scale distributed systems and applications. This forum furthers the state-of-the-art in the design and analysis of large-scale distributed applications and systems.

CLOUD COMPUTING WEST 2013 — October 27th-29th in Las Vegas, NV. Two major conference tracks will zero in on the latest advances in applying cloud-based solutions to all aspects of high-value entertainment content production, storage, and delivery; and the impact of mobile cloud computing and Big Data analytics in this space.

Copyright 2008 Distributed Computing Industry Association