April 1, 2013
Volume XLIII, Issue 4
CLOUD COMPUTING CONFERENCE at the 2013 NAB Show
Don't miss the CLOUD COMPUTING CONFERENCE at the 2013 NAB Show next Monday and Tuesday, April 8th and 9th, at the Las Vegas Convention Center in Las Vegas, NV.
Sponsors for this year's CLOUD COMPUTING CONFERENCE include Amazon Web Services, Aspera, DAX, Equinix, and YouSendIt.
Our 2013 event track will demonstrate the new ways cloud-based solutions are providing increased reliability and security, not only for commercial broadcasting and enterprise applications, but also for military and government implementations.
From collaboration during production, to post-production and formatting, to interim storage, delivery, and playback on fixed and mobile devices, to viewership measurement and big-data analytics, cloud computing is having an enormous impact on high-value multimedia distribution.
The 2013 Conference has been extended from one to two full days, reflecting the increased importance of and growing interest in its subject matter.
There are special discount codes for DCINFO readers to attend the NAB Show. The code for $100 off conference registration is EP35. And the code for FREE exhibit-only registration is EP04.
Pre-NAB Cloud Computing Survey
John Schiela, President of Converged Technology & Media at independent market research and consulting firm Phoenix Marketing International (PMI), invites you to participate in a brief survey regarding cloud computing services.
The survey is available exclusively to DCINFO readers and should be completed as soon as possible. All respondents who complete it will receive a $25 restaurant.com gift certificate.
To complete the survey, please follow this link: Cloud Computing Survey.
If you are attending the upcoming National Association of Broadcasters' 2013 NAB Show, John will be presenting during the NAB's CLOUD COMPUTING CONFERENCE on Monday, April 8th, in Room N249 at 11:30 AM.
Thank you for your participation in the survey.
Report from CEO Marty Lafferty
We're very excited to announce additional speakers for our upcoming CLOUD COMPUTING CONFERENCE at the 2013 NAB Show taking place next Monday and Tuesday, April 8th and 9th, at the Las Vegas Convention Center.
DAY ONE will begin with an "Industry Update on Cloud Adoption."
How are cloud-based technologies currently being deployed throughout the audio/video (A/V) ecosystem? What file-based workflow strategies, products, and services are now working best?
After an introductory presentation by Mark Ramberg, Amazon Web Services, a panel discussion with Dr. Frank Aycock, Appalachian State University; Jonathan Hurd, Altman Vilandrie; Rob Kay, Strategic Blue; and Patrick Lopez, Core Analysis, will thoroughly examine this emerging market segment.
Next, we'll discuss "Outstanding Issues: Reliability & Security." What remaining pitfalls cause producers and distributors to resist migrating to the cloud? How are liability, predictability, privacy, and safety considerations being addressed?
Speaker Shekhar Gupta, Motorola Mobility, will introduce the topic. And then a panel with Lawrence Freedman, Edwards Wildman Palmer; Tanya Frerichs, Docusign; Jason Shah, Mediafly; and John Schiela, Phoenix Marketing International, will follow up with further discussion.
Then "Cloud Solutions for Content Creation" will be our subject. How is cloud computing being used for collaboration and other pre-production functions? What do dailies-screening and editing in the cloud offer the content production process?
Speaker Patrick MacDonald King, DAX, will explore this area first. And then a panel with Sean Barger, Equilibrium; Morgan Fiumi, Sfera Studios; Rob Green, Abacast; and Robert Blackburn, Equinix, will continue our examination.
"Post-Production in the Cloud" will follow. What do cloud solutions bring to post-production functions such as animation and graphics generation? How are formatting, applying metadata, and transcoding improved by cloud computing?
Our DAY ONE Marquee Keynote, Chris Launey of Disney, will speak first.
Then a panel with Jim Duval, Telestream; Joe Foxton, MediaSilo; Jun Heider, RealEyes; and Bill Sewell, Wiredrive, will delve into this topic in more detail.
Next, we'll discuss "Cloud-Based Multimedia Storage." How are data centers and content delivery networks (CDNs) at the edge evolving? What do business-to-business (B2B) storage solutions and consumer "cloud media lockers" have in common?
Speaker Dave Fellinger, DataDirect Networks, will address the topic first. And then a panel with Bang Chang, XOR Media; Tom Gallivan, WD; Mike Wall, Amplidata; and Douglas Trumbull, Trumbull Ventures, will follow up with further discussion.
DAY ONE will end with "Content Delivery from the Cloud." How is cloud computing being used to enable distribution and playback on multiple fixed and mobile platforms? What does the cloud offer to improve the economics of "TV Everywhere?"
Speaker Chris Rittler, Deluxe Digital Distribution, will explore this area first. And then a panel with Brian Campanotti, Front Porch Digital; Malik Khan, LTN Global Communications; John Maniccia, Octoshape; and Mike West, GenosTV, will continue the examination.
DAY TWO will open with four cloud implementation case studies.
How was cloud computing used most successfully during 2012 in the multimedia content distribution chain? What lessons can be learned from these deployments that will benefit other industry players?
Case studies will be presented by Jason Suess, Microsoft; Andrea DiMuzio, Aspera; Keith Goldberg, Fox Networks, and Ryan Korte, Level 3; and Baskar Subramanian, Amagi Media Labs. Then the presenters will join in a panel discussion.
Next, we'll look at "Changes in Cloud Computing." How is the cloud-computing industry changing in relation to content rights-holders? What new specialized functions-in-the-cloud, interoperability improvements, and standardization are coming this year?
First, David Cerf, Crossroads Systems; Margaret Dawson, Symform; Jeff Malkin, Encoding; and Venkat Uppuluri, Gaian Solutions, will join in a panel. And then Mark Davis, Scenios, will speak on this topic.
"A Future Vision of the Cloud" will explore what to expect next. What do the latest forecasts project about the ways that cloud-computing solutions will continue to impact the A/V ecosystem over the long term? How will the underlying businesses that are based on content production and distribution be affected?
Panelists John Gildred, SyncTV; Karen Keehan, ODCA; Mike Sax, ACT; and Sam Vasisht, Veveo, will join in the discussion.
"Military & Government Cloud Requirements" will follow. How do the needs of military branches and government agencies for securely managing multimedia assets differ from the private sector? What do these requirements have in common with commercial practices?
Michael Weintraub, Verizon, will speak first. Then Scott Campbell, SAP America; Fabian Gordon, Ignite Technologies; Linda Senigaglia, HERTZ NeverLost; and Alex Stein, Eccentex, will go into more depth.
Next, we'll explore "Unique Cloud-Based Solutions." What are cloud solutions providers currently developing to address specific considerations of the intelligence community (IC) in fulfilling its missions? How will these approaches evolve and change during 2013?
DAY TWO Marquee Keynote, Saul Berman of IBM, will address this area first.
Then David Bornstein, Akamai; Rajan Samtani, Industry Consultant; Ganesh Sankaran, PrimeFocus; and Dan Schnapp, Hughes Hubbard & Reed, will continue this examination.
Four relevant cloud case studies will follow.
How is cloud computing being used to help securely manage sensitive multimedia? What lessons can be learned from these deployments that will benefit military and government organizations?
Grant Kirkwood, Unitas Global; William Michael, NEC Corporation; Randy Kreiser, DataDirect Networks; and John Delay, Harris, will present case studies.
These presenters will then join in a panel discussion.
The Conference Closing will tie back to the commercial sector. How do those involved in multimedia production, storage, and distribution leverage cloud-based solutions to their fullest potential? What resources are available for comparing notes and staying current on the latest developments?
Our closing session speakers will be Steve Russell, Tata Communications, and Jeffrey Stansfield, Advantage Video Systems.
There are special discount codes for DCINFO readers to attend the NAB Show. The code for $100 off conference registration is EP35. And the code for FREE exhibit-only registration is EP04. Share wisely, and take care.
10 Quotes on Cloud Computing That Really Say it All
Excerpted from Forbes Report by Joe McKendrick
Plenty has been said or written on cloud computing in recent years — pro, con and somewhere in between. Periodically throughout the rise of cloud computing, there have been some real gems put out there, aptly describing what's on people's minds — and maybe what needed to be said.
Oracle CEO Larry Ellison's famous "fashion-driven" analogy back in 2008 is the stuff of legend, and makes the list compiled below. Here are some memorable and apropos quotes about the cloud that have surfaced over the years:
1) "First to mind when asked what 'the cloud' is, a majority respond it's either an actual cloud, the sky, or something related to weather." - Citrix Cloud Survey Guide (August 2012).
2) "Ultimately, the cloud is the latest example of Schumpeterian creative destruction: creating wealth for those who exploit it; and leading to the demise of those that don't." - Joe Weinman, Senior VP at Telx and author of "Cloudonomics: The Business Value of Cloud Computing."
3) "Cloud computing is often far more secure than traditional computing, because companies like Google and Amazon can attract and retain cyber-security personnel of a higher quality than many governmental agencies." - Vivek Kundra, former Federal CIO of the United States.
4) "Discontinued products and services are nothing new, of course, but what is new with the coming of the cloud is the discontinuation of services to which people have entrusted a lot of personal or otherwise important data — and in many cases devoted a lot of time to creating and organizing that data. As businesses ratchet up their use of cloud services, they're going to struggle with similar problems, sometimes on a much greater scale. I don't see any way around this — it's the price we pay for the convenience of centralized apps and databases — but it's worth keeping in mind that in the cloud we're all guinea pigs, and that means we're all dispensable. Caveat cloudster." — Nick Carr, author of "Does IT Matter?," "The Big Switch," and "The Shallows."
5) "We believe we're moving out of the Ice Age, the Iron Age, the Industrial Age, the Information Age, to the participation age. You get on the Net and you do stuff. You instant message (IM), you blog, you take pictures, you publish, you podcast, you transact, you distance learn, you telemedicine. You are participating on the Internet, not just viewing stuff. We build the infrastructure that goes in the data center that facilitates the participation age. We build that big friggin' webtone switch. It has security, directory, identity, privacy, storage, compute, the whole Web services stack." — Scott McNealy, former CEO, Sun Microsystems.
6) "The interesting thing about cloud computing is that we've redefined cloud computing to include everything that we already do. I can't think of anything that isn't cloud computing with all of these announcements. The computer industry is the only industry that is more fashion-driven than women's fashion. Maybe I'm an idiot, but I have no idea what anyone is talking about. What is it? It's complete gibberish. It's insane. When is this idiocy going to stop?" — Larry Ellison, Chairman, Oracle.
7) "I don't need a hard disk in my computer if I can get to the server faster… carrying around these non-connected computers is byzantine by comparison." — Steve Jobs, late Chairman of Apple (1997).
8) "Line-of-business leaders everywhere are bypassing IT departments to get applications from the cloud (also known as software as a service, or SaaS) and paying for them like they would a magazine subscription. And when the service is no longer required, they can cancel that subscription with no equipment left unused in the corner." — Daryl Plummer, Gartner analyst.
9) "If you think you've seen this movie before, you are right. Cloud computing is based on the time-sharing model we leveraged years ago before we could afford our own computers. The idea is to share computing power among many companies and people, thereby reducing the cost of that computing power to those who leverage it. The value of time share and the core value of cloud computing are pretty much the same, only the resources these days are much better and more cost effective." — David Linthicum, author, "Cloud Computing and SOA Convergence in Your Enterprise: A Step-by-Step Guide."
10) "Flying by the seat of the pants must have been a great experience for the magnificent men in the flying machines of days gone by, but no one would think of taking that risk with the lives of 500 passengers on a modern aircraft. The business managers of a modern enterprise should not have to take that risk either. We must develop standard cloud metrics and ROI models, so that they can have instruments to measure success." — Dr. Chris Harding, Director for Interoperability and SOA at The Open Group.
BitTorrent CEO: How to Upgrade the Internet
Excerpted from GigaOM Report by Stacey Higginbotham
Could distributed computing hold the future for scaling out the Internet and meeting our increasing demands for broadband? The CEO of BitTorrent argues it does have a place in next generation architectures.
If we persist in thinking of the Internet as an information superhighway, then we'll continue to handle congestion by adding more lanes, via expensive upgrades in the core network, at the edge and at the last mile. The end result of our love affair with connectivity is a losing proposition for ISPs who are forced to upgrade their networks to meet the ongoing demand for broadband without taking enough of a share from the growing Internet economy to meet their margins.
So writes Eric Klinker, in the Harvard Business Review blog, in a solid post about how we're going to manage the growth of the Internet. While Klinker sounds like many a telco-funded astroturfer in his worries about ISP profits, he's actually the CEO of file-sharing site BitTorrent. And his arguments are worth listening to on both sides of the Internet divide — the ISPs and the content companies looking to ride those pipes.
In the post, which is similar in spirit to one he wrote for GigaOM in 2011, he argues that the problem on the Internet is congestion, and that there are far more ways to address congestion than just adding more lanes. And of course as the CEO of BitTorrent, which has a proprietary file transfer system that is composed of masses of distributed computers, his main idea is distributed computing. From the article:
"Distributed computing systems work with unprecedented efficiency. You don't need to build server farms, or new networks, to bring an application to life. Each computer acts as its own server; leveraging existing network connections distributed across the entirety of the Internet. BitTorrent is a primary example of distributed computing systems at work. Each month, via BitTorrent, millions of machines work together to deliver petabytes of data across the web, to millions of users, at zero cost. And BitTorrent isn't the only example of distributed technology at work today. Skype uses distributed computing systems to deliver calls. Spotify uses distributed computing systems to deliver music."
The challenges associated with this are obvious. Customers have to download clients in order to use such networks, and they will still affect the end user's connection at the last mile or in the airwaves and at cell sites on mobile networks. Thus, they can tax ISP networks (although they can be optimized). But with video a huge driver of congestion on the consumer side, it's a solution that could work, since people will download software in order to watch TV. Even ISPs have tested distributed computing when they tried out the P4P network protocol way back in 2008.
Distributed computing would force many popular web services to reconsider how they build their applications and stream their files, which could have a large effect on big websites such as Facebook or Google as well as content companies and content delivery networks. Another option, and one that we're inching toward, is smart routers and prioritization schemes where the user can set their own network parameters to best use the bandwidth they have available. Software-defined networks will also make such prioritization easier and cheaper to manage inside the core telco network as well.
There's also a more controversial idea of ISPs charging more for broadband during peak times, as opposed to current data caps that limit people no matter whether they download information at 2 AM or during prime time. True congestion pricing would also force users to bear the cost of overburdening the ISP network, although ISPs would then have to be open about how often their networks are congested and would risk consumers losing their appetite for broadband. My hunch is that neither the ISPs nor the content companies want that to happen, although it's still far from clear that upgrades are the death knell for the cable and telco companies, as opposed to a painful shift in their margin profiles.
Regardless, we're only asking for more broadband and more Internet services, so Klinker's article is a welcome reminder that none of that will come for free.
Verizon Brings SMS to the Cloud
Excerpted from WallStCheatSheet Report by Dan Ritter
Verizon has introduced a service called Integrated Messaging, which according to its website, takes "texting to a new level with a seamless messaging experience!"
At its core, the service decentralizes text and multimedia messages, opening up the SMS platform to tablets and PCs. This means that conversations begun on a smart-phone can be continued through the web browser on a PC, or through a client on a tablet.
For any archivists out there, conversations can even be saved to SD cards, and a Content Finder feature can be used to "quickly find photos, web links, and contact information in your messages."
Superficially, this is a face-lift. Looking ahead, this represents what could be the first step toward more cloud-based messaging services.
The next step in the chain would be something like the capacity to send voice or video messages from any device to any device, using someone's cell-phone number as their identifier.
This would address competition from other messaging services that have begun to take more and more traffic from traditional SMS, like e-mail or platform-specific services like Apple's iMessage or BlackBerry Messenger.
Amazon CloudHSM Aims To Ease Security Worries
Excerpted from InformationWeek Report by Charles Babcock
Amazon Web Services (AWS) is adding a single-tenant, secure hardware cloud appliance to its usual software services to give customers an extra-secure method of storing encryption keys, issuing digital signatures, and executing digital rights management (DRM) in compliance with strict regulations.
AWS CloudHSM uses Safenet's Luna-SA appliances. AWS is making them available in EC2 only to Virtual Private Cloud customers, who access their virtual servers over virtual private networks (VPNs) and use other security precautions. The appliance is given an IP address within the virtual private cloud and is accessible only to the customer contracting for it, even though Amazon monitors it and ensures that it remains up and running.
The availability of a hardware security module (HSM) inside Amazon's EC2 allows a cloud user to store a cryptographic key, digital signature, digital rights, etc. in the cloud instead of having to maintain them on premises and upload them to an application in the cloud when they're needed. The latter inevitably slows performance and adds to the time needed to get work done.
The appliance is an Ethernet device that is tamper-resistant and can call up and use a cryptographic key without exposing it outside the device's boundaries. AWS CTO Werner Vogels considered the hardware addition significant enough to alert his thousands of Twitter followers. Noting virtual private clouds already come with security protection measures, he referred followers to an AWS blog post that said rigorous contractual or regulatory requirements in some cases require "additional protection."
When it came to security keys themselves, "Until now, organizations' only options were to maintain data in on-premises data centers or deploy local HSMs to protect encrypted data in the cloud. Unfortunately, those options either prevented customers from migrating their most sensitive data to the cloud or significantly slowed application performance," AWS said in its blog post.
Amazon released a whitepaper this month describing its existing AWS security measures.
An Amazon FAQ suggested CloudHSM would be good for encrypting databases in the cloud, storing the keys of public key infrastructure, authentication and authorization, document signing, DRM, and transaction processing.
Amazon will charge a one-time fee of $5,000 to set up the CloudHSM and $1.88 per hour or $1,373 per month on average thereafter until the CloudHSM is terminated. As usual with a new service, it is available only in AWS's US East data center in Ashburn, VA in the US, and at its Dublin, Ireland, facility in Europe. Amazon did not say whether CloudHSM would also become available in its other regional centers in the US or around the world.
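As a rough sanity check on those numbers, $1.88 per hour works out to the quoted monthly figure at an average of about 730 hours per month. Below is a minimal Python sketch using only the rates quoted above; the 730-hour month is a standard averaging assumption, and the function name is purely illustrative.

```python
# Rough cost model for the CloudHSM pricing quoted above.
# Assumes the article's figures: $5,000 one-time setup fee and $1.88/hour,
# with ~730 hours in an average month (8,760 hours per year / 12).

HOURLY_RATE = 1.88            # USD per hour, per the article
SETUP_FEE = 5_000.00          # one-time upfront charge
HOURS_PER_MONTH = 8_760 / 12  # average hours in a month

def estimated_cost(months: int) -> float:
    """Estimated total spend after `months` of continuous use."""
    return SETUP_FEE + HOURLY_RATE * HOURS_PER_MONTH * months

monthly_run_rate = HOURLY_RATE * HOURS_PER_MONTH
print(f"Monthly run rate: ${monthly_run_rate:,.0f}")    # ~$1,372, matching the article's average
print(f"First-year total: ${estimated_cost(12):,.0f}")  # setup fee plus 12 months of usage
```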
Telefonica Brings Cloud Storage Services to Businesses
Excerpted from Mobile Entertainment Report by Daniel Gumble
CTERA has revealed that it is powering a new cloud storage service for businesses from Telefonica.
The service, named Disco Virtual en Red, utilizes CTERA's cloud storage enablement suite, which includes secure file sync and share for laptop and smart-phone users, as well as hybrid backup using cloud storage gateways.
Aimed at providing a storage, file sharing and data protection solution for Telefonica's enterprise customers in Spain, the service is available as part of the IT offering for the enterprise market at Telefonica's Aplicateca business app store and in its Fusion Empresas offering.
"We identified a need in the market for Spain-based, secure cloud storage services," said Andres Lopez Hedoire, Marketing Manager, Cloud Security and Management at Telefonica. "We chose CTERA as our partner in creating this innovative service because they provide a unified platform for enabling cloud storage services, supporting the scalability we need and the security that our customers expect."
"Thanks to CTERA's multi-tenant architecture, enterprise customers have their own 'private hosted' environments, with complete separation of data, security and access privileges."
Telefonica will host the cloud infrastructure in its own data centers. It will also monitor and troubleshoot devices and software agents remotely, as well as perform remote firmware and software upgrades.
"It is becoming increasingly evident that cloud-based storage, backup, and file sync & share services represent a major opportunity for cloud service providers," said Herve Bourgeois, Vice President of International Sales at CTERA.
"We are thrilled that Telefonica, one of the world's largest telecom providers, chose to deliver those services based on our platform. The new partnership is an important milestone in our growth in the European market."
Huawei Helps Build Efficient Cloud Data Center
Excerpted from DataCenter Dynamics Report by Laura Luo
Huawei said it will work with Fountain Data Solution, a high availability IT and ISP/IDC service provider in China, to build the Yangpu Cloud Data Center using its own cloud data center solution.
Located at the Zhoujiazui Road, Yangpu District of Shanghai, the Yangpu Cloud Data Center will cover an area of 6,000 sq meters, 2,400 sq meters of which will be used as whitespace hosting 800 standard cabinets.
Constructed in accordance with Tier III+ standards, the new data center will be equipped with dual grid power supply systems, 2N Uninterruptible Power Supply (UPS) systems and an N+1 cooling system. It will also have fiber connections through multiple telecommunication carriers.
The new data center broke ground in September 2012 and is scheduled to be online in August 2013. It will provide colocation, cloud infrastructure, and network resource services to financial, government, and high-end enterprise users upon completion.
Huawei said its cloud data center solution will bring flexibility and security, as well as energy-saving advantages, to the Yangpu Cloud Data Center.
The solution provides elastic resource pooling through virtualization, which enables the customer to use resources based on requirements and reduce IT expenditure.
It also uses energy-saving technologies, including in-row cooling to reduce the Power Usage Effectiveness (PUE) rating and operation and maintenance costs.
Huawei said the solution will provide end-to-end security guarantees across infrastructure, networks, resource pools, virtual machines, and terminal devices to prevent customer data loss and leakage.
It will adopt best practices in Information Technology Infrastructure Library (ITIL) for overall management of the data center to enhance business continuity.
Huawei said it will also deploy its modular data center solution in the Yangpu Cloud Data Center of Fountain Data Solution.
It said this modular approach integrates pre-fabricated racks, power distribution units, cooling systems, monitoring systems, cabling systems, and fire protection systems.
It said the solution is designed to bring business online within a short time, achieving high density deployment (9kW/cabinet at maximum) and reduced energy costs.
Prior to project implementation, Huawei used advanced Computational Fluid Dynamics (CFD) simulation software to simulate and analyze the air flow of the whole data center to ensure high levels of cooling efficiency.
And it said its modular solution will also utilize advanced cooling technologies to enhance energy efficiency in the data center.
"Our modular data center solution adopts in-row cooling technology and achieves complete separation of cold air and hot air by sealing the cold aisle, which saves about 30% of energy in total compared with traditional data centers," said Sun Zhenjian, General Manager of the Enterprise Data Center Department of IT Product Line of Huawei.
"Each IT module is equipped with one separate redundant air conditioner, which ensures the reliability of the cooling system."
"Our in-row cooling technology also removes the stringent requirement for data center ceiling height. For traditional data centers using the raised floor cooling technology, the ceiling height requirement is about four meters. Yet with the in-row cooling technology, the ceiling height requirement of our modular data center is only about three meters, which means customers may deploy data centers in commercial residential buildings (about 2.9 to 3 meters in height), making data center construction and renovation much easier."
Leif Zheng, President of the IT Product Line of Huawei, told Focus that with data growing explosively, energy shortage and environmental protection issues are becoming increasingly salient.
"There are higher and higher requirements for high density and quick data center deployment, and intelligent operation and maintenance," Zheng said.
"Modular data centers enable enterprises to deploy capacity based on requirement and bring their business online quickly, thus reducing energy cost greatly. For medium-sized and small enterprises which are short of capital and land, modular data center is also the best option."
Huawei's modular data center solution has been deployed in about ten data centers both domestically and internationally, including the South China Base of China Mobile (or International Information Hub of China Mobile), the data center of NxtGEN, the largest IT service provider in India, and CANTV in Venezuela.
"Take South China Base of China Mobile as an example, it saves two thirds of space thanks to the deployment of our modular data center solution, compared with traditional data centers," Zheng said.
"Moreover, by taking advantage of in-row cooling and aisle containment technology, coupled with the utilization of Huawei's ManageOne management platform for overall monitoring of the data center, the data center can achieve a PUE as low as 1.6.
Storage Joins the Cloud, Virtualization & Big Data
Excerpted from Baseline Magazine Report by Tony Kontzer
It's not every day that the IT storage community can claim to be on the cutting edge, but this is one of those days.
Whether we're talking about the cloud, virtualization, or big data, storage has emerged as an important component in today's hottest technology trends. Industry experts say these new technologies have enabled the storage industry to overcome an era of difficult-to-integrate silos that made getting at the right data a challenge.
"It's so exciting to see it all come together," says Greg Schulz, Senior Advisor for Server and StorageIO, an IT infrastructure consultancy. "It used to be together, but it got all broken apart."
What follows are three tales of organizations that have taken advantage of storage innovation in the cloud, in software-defined virtual environments and to contend with big data — all in an effort to use data more effectively, and to make themselves more nimble and competitive.
Like many organizations, Weitz, a commercial construction firm, has seen an explosion in demand for mobile access to data over the past couple of years. But the Des Moines, IA-based company's attempts to satisfy that demand underscored a weakness it had to address: Its aging document management system wasn't up to the task of rendering files on tablets and smart-phones.
That shortcoming had spurred numerous field workers to start using consumer cloud service Dropbox, which enabled easy access to copies of photos, building plans, and virtual models via its mobile application. This introduced a new problem: Having multiple versions of this information scattered on various Dropbox accounts — totally out of the company's control — represented a serious discovery risk should Weitz have to respond to potential litigation.
"We had to find something that would offer the same functionality, but that we could also manage," says Karmyn Babcock, IT Director.
Early in 2012, the company started looking into its options, and quickly focused its sights on cloud storage provider Box. By March, it was testing the service with four proof-of-concept projects in which large files were being shared among designers and construction crews.
The tests were so successful that the company signed up for a 50-user business license, which grew to 180 users in three months. That enthusiastic adoption spurred the company to negotiate a 450-user enterprise agreement, which also enables the firm to provide unlimited access to files for external users, such as contractors and client owners.
Initially, the use of Box storage was going to be limited to data related to current projects, but that strategy was scratched in the face of user backlash.
"We were going to save our older projects in our data center and not go through the process of migrating them," says Babcock. "But we had requests from the operational groups saying, 'No, we want that data.'"
As a result, Babcock and her team are steadily migrating about 4 terabytes (TB) of data from that aging document management system. Meanwhile, independent of that migration, her last check of the Box dashboard indicated that 500 Mb of data are being added each day.
Babcock says it's hard to know what other types of data Weitz might one day opt to store in the cloud, but one thing's certain: "Any system we're looking to replace now," she says, "we're looking into whether it integrates with Box."
When that time comes, using cloud storage will enable Weitz to provision — or deprovision — storage resources in seconds, as business needs change.
Washington Trust Bank's plans to roll a virtual desktop infrastructure (VDI) deployment into a migration to Windows 7 had a practical and seemingly sound component: The bank would support the VDI with its existing physical storage.
With 5TB of physical storage allocated to nearly 450 knowledge workers targeted for virtual desktops, the Spokane, WA-based regional bank had plenty of room to accommodate data. But what Washington Trust quickly learned was that its storage setup lacked the required input/output capabilities — a shortcoming that would require an additional $100,000 in storage hardware to address, says Chris Green, Vice President of IT Infrastructure Systems. Adding costs of more than $200 per user to a project with a total budget of $600 to $700 per user wasn't feasible.
"I thought it might kill the project," Green recalls.
Online research of the issue led Green to Atlantis Computing software, which optimizes storage hardware to support virtual environments. When tests of the company's software showed huge reductions in the volume of processing loads running through the bank's new virtual environment, Washington Trust negotiated a 450-user license and moved into production.
The bank immediately registered an 80 percent reduction in input/output processes per second. Subsequently, there was a 50 percent reduction in the drain on the bank's storage processors.
As a result, Washington Trust now has a VDI that's running faster and more efficiently, and it also has been able to free up storage resources for other systems. All of this cost the bank just $75 per user, or about one-third of what it might have spent on additional storage hardware.
"Not spending that money either increases your profitability or allows you to take advantage of other opportunities," Green points out. "Our engineers are freed up to work with our business units to move the needle, rather than having to manage all those desktops."
The onrush of the big data era has forced many data-rich organizations to optimize their storage environments to contend with the volume, velocity and variety of data coming at them. Then there are others, such as the University of North Texas, that unwittingly readied themselves for big data even before the term was coined.
Back in early 2007, the huge Dallas-Fort Worth school moved from Novell GroupWise to a Microsoft architecture running Exchange, and deployed PeopleSoft with new Oracle databases on the back end. These moves more than doubled the school's storage requirements from 14TB to nearly 30TB, and accommodating that increase (which was sizable at the time) was going to require a huge investment in additional storage hardware, says Monty Slayton, the university's IT manager.
And it wasn't just space the school needed. Its storage systems at the time relied on a single class of high-end drives designed to store frequently accessed data. It had no data progression capabilities that would allow older data to be stored on less expensive hardware.
"There was no thought given to the way data aged," Slayton recalls.
The school decided to invest in a storage platform that could grow modularly to accommodate the anticipated mushrooming of data. It chose to go with arrays from Compellent Technologies (since acquired by Dell), which also offered automated tiering of data.
The timing of the decision was fortuitous, as the anticipated exponential growth of data arrived on schedule. Steady growth in enrollment pushed the school's student population to nearly 36,000 last fall.
In addition, regulations progressively lengthened the time during which schools are required to retain student and faculty data. Finally, new kinds of unstructured data, such as social media feeds, placed additional demands on storage resources. As if that wasn't enough of a challenge, the school opened a new South Dallas campus, which created additional data storage needs.
But what really tested the school's storage environment was the massive growth in the use of video. Whether it's the increasingly fast-growing collection of HD video of athletic practices and games, or the larger role video plays in the school's marketing campaigns, video has steadily pushed each department's storage requirements upward.
"The size of the files and the density of things have increased exponentially," says Slayton. "It's rare that anyone asks for less than 1TB of data."
The growing assortment — and volume — of data has tested the university's Compellent arrays. In the few years since they were deployed, the school has upgraded to faster and more powerful controllers, speedier fibre-channel storage networks, and a larger number and wider array of disks.
Today, the school manages nearly 2 petabytes of tiered storage. Some 80 percent of that is third-tier long-term storage, while just 5 percent consists of first-tier high-availability disks.
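Those percentages translate directly into capacities. A quick sketch follows, assuming exactly 2 petabytes of total capacity and lumping everything that is neither first- nor third-tier into a single remainder, which the article does not break out.

```python
# Tier breakdown implied by the figures above: ~2 PB total,
# 80% third-tier (long-term) and 5% first-tier (high-availability).
total_pb = 2.0
tiers = {
    "tier 1 (high-availability)": 0.05,
    "tier 3 (long-term)": 0.80,
    "other tiers (remainder)": 1.0 - 0.05 - 0.80,  # assumption: whatever is left over
}
for name, share in tiers.items():
    print(f"{name}: {share * total_pb:.2f} PB")
# -> tier 1: 0.10 PB, tier 3: 1.60 PB, remainder: 0.30 PB
```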
While the university's data needs at the moment might not reflect every definition of big data, they're substantial enough to be termed "explosive, expansive data growth" in the school's IT circles. "To me," says Slayton, "they're kind of the same thing."
Security Issues Unique to Cloud Computing
Excerpted from CyberDefense Magazine Report by David Amsler
Chances are good that your organization is hosting at least some services through a cloud provider. And if you're not yet, you're thinking about it. These environments introduce some new security issues that you need to incorporate into your security plans.
The virtualization technologies used by cloud computing hosting providers mean that as well as managing your own virtual servers and your own staff, you need to make sure the hosting provider's policies for managing their physical servers and staff are acceptable.
Security issues to keep in mind include:
Privileged User Access
When data is stored in your own, on-premises data center, you can literally put it behind lock and key - and you control the key. But when you outsource to a cloud provider, you're bypassing all of your physical, logical, and personnel controls and handing your data over to a remote entity you've probably never met or spoken to. When put that way, it certainly sounds scary. It should. Consider Dropbox's repeated breaches and the recent Evernote breach.
Not all cloud providers are created equal. Do some due diligence to verify a prospective cloud provider will be a good caretaker for your sensitive data. This can be hard to do when the whole point of cloud services is that there's no physical presence, but the most important thing you can do is carefully review their terms of service. If the provider doesn't include their full terms of service on their website, contact them and ask for a copy.
Don't wait until it's time to put pen to paper and sign a contract to review them. If there is anything that is vague or doesn't comply with your own security policy, contact the provider and ask for clarification. As the Dropbox incident makes clear, cloud service providers are realizing they need to be more transparent and specific about their internal security policies and practices. If you don't get the answer(s) you seek, move on to the next cloud provider.
Regulatory Compliance
Your enterprise remains responsible for compliance requirements regardless of whether you're the one hosting the data. Sarbanes-Oxley, HIPAA, and other laws hold many organizations responsible for an exacting level of data monitoring and archiving. Be aware of the regulatory responsibilities that affect your organization and your data, confirm that any potential cloud provider can comply with them, and have a way of auditing that compliance.
For example, ask your provider how they prove that deleted data is truly unrecoverable. Cloud providers should not be afraid to submit to audits and security certifications to show they can hold up their end of the bargain. Amazon Web Services (AWS), for instance, has an extensive list of certifications and third-party attestations.
Data Location and Segregation
Any cloud provider worth its salted hash will have multiple data centers for redundancy. Some large providers have data centers in multiple countries. If regulations or your own security policy prohibits offshoring data, make sure your chosen cloud provider has a way to keep your data within US borders.
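On AWS, for example, the storage region is an explicit parameter at creation time, so a US-only policy can be enforced in code rather than by convention. The sketch below uses the boto3 SDK; the bucket name is a placeholder, and boto3 itself postdates this article, though the region model is the same.

```python
import boto3

# Pin an S3 bucket to a specific US region; objects stored in it stay in
# that region unless you explicitly replicate them elsewhere.
# "example-us-only-bucket" is a placeholder name.
s3 = boto3.client("s3", region_name="us-west-2")
s3.create_bucket(
    Bucket="example-us-only-bucket",
    CreateBucketConfiguration={"LocationConstraint": "us-west-2"},
)
```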
Most cloud providers use virtualization technologies that will likely store your sensitive data on a physical server or storage device along with data from multiple other customers. Although there have been no documented instances of someone with access to one virtual server being able to "escape" to the hypervisor and then access other virtual servers, the risk remains. One way to eliminate this threat is to encrypt your data before it leaves your organization. There are a number of options for simple file encryption but as Gartner explains in a June 2008 report, "Encryption accidents can make data totally unusable, and even normal encryption can complicate availability."
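One way to put that advice into practice is simple symmetric encryption on your own machines before anything is uploaded, with the key never leaving your premises. Below is a minimal sketch using the Python cryptography package's Fernet recipe; the file names are placeholders, and key management and rotation are deliberately out of scope.

```python
from cryptography.fernet import Fernet

# Generate the key once and keep it on-premises (never upload it alongside
# the data). Fernet provides authenticated symmetric encryption (AES + HMAC).
key = Fernet.generate_key()
fernet = Fernet(key)

# Encrypt a local file before handing it to any cloud storage provider.
with open("quarterly_report.xlsx", "rb") as f:   # placeholder path
    ciphertext = fernet.encrypt(f.read())

with open("quarterly_report.xlsx.enc", "wb") as f:
    f.write(ciphertext)                          # this is what gets uploaded

# Later, decrypt on-premises with the same key.
plaintext = fernet.decrypt(ciphertext)
```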
Monitoring and Investigative Support
Intrusion detection systems (IDS), firewalls, proxies, and packet capture devices all work on physical networks. Moving to a cloud hosting provider may mean you're unable to monitor and restrict network traffic. This can make identifying and investigating security incidents difficult if not impossible - especially if the Infrastructure-as-a-Service (IaaS) and Platform-as-a-Service (PaaS) providers are different entities, as is sometimes the case.
Although there are a few security products that support virtual networking, it is still a relatively new segment of the security market. Because these virtual appliances are so new, not all cloud providers have implemented them. Gartner suggests that "If you cannot get a contractual commitment to support specific forms of investigation, along with evidence that the vendor has already successfully supported such activities, then your only safe assumption is that investigation and discovery requests will be impossible."
There's another aspect of monitoring and investigative support unique to cloud computing: Under current US law, law enforcement agencies often don't need a warrant to monitor or search any data given to a third party. Your cloud service provider may not inform you if a law enforcement agency requests access to your data stored on their servers. And if your virtual server happens to reside on the same physical server as some other customer that's operating an illegitimate business, there are no guarantees that your virtual server won't be accidentally taken by a law enforcement agency when seizing the other customer's server. This has happened multiple times already.
In light of these significant security factors, it is imperative for prospective cloud users to thoroughly gather their business or mission requirements - particularly for national defense enterprises with unique security constraints. Even cloud providers meeting basic eligibility criteria to support government agencies may vary in capabilities and experience supporting different types of demanding organizations. Because most cloud infrastructures are fundamentally about economies of scale and availability, customers need to drive the security discussion and team with capable providers in a shared approach to protection.
Proposed Computer Fraud Law Could Make Bad Rule Worse
Excerpted from Daily Online Examiner Report by Wendy Davis
It's hard to imagine, but the worst law in technology could soon become even, well, worse, thanks to a group of GOP lawmakers who are reportedly trying to revive an amendment first proposed in 2011.
The current Computer Fraud and Abuse Act, which dates to 1984, makes it a crime for people to exceed their authorized access to a computer. That law is already under attack from some lawmakers, who say that the concept of "authorized access" is so broad that the law could transform nearly everyone who goes online into a criminal.
Already, people have been prosecuted on the theory that they exceeded authorized access by violating the terms of service of private companies - such as by lying when creating a MySpace account. In the most famous recent case, open information activist Aaron Swartz - who committed suicide earlier this year - was about to face trial for allegedly violating the computer fraud law by using the Massachusetts Institute of Technology's server in order to download academic papers.
Judges have sided with defendants in some prior prosecutions based on violating terms of service, but reformers say that the law should state that disregarding a private organization's terms of use isn't a federal crime.
Unfortunately, this proposal now being floated would do the exact opposite. The bill would make it even easier to prosecute people for exceeding their "authorized access" to a computer, and also would increase the penalties, according to an analysis by the Center for Democracy & Technology (CDT).
Hopefully, the proposal goes nowhere fast. Meanwhile, a separate bill unveiled recently by Senator Ron Wyden (D-OR) and Congresswoman Zoe Lofgren (D-CA) deserves serious consideration. Their proposal would clarify once and for all that people who violate a private organization's terms of use don't commit computer fraud.
Coming Events of Interest
2013 NAB Show - April 4th-11th in Las Vegas, NV. Every industry employs audio and video to communicate, educate and entertain. They all come together at NAB Show for creative inspiration and next-generation technologies to help breathe new life into their content. NAB Show is a must-attend event if you want to future-proof your career and your business.
CLOUD COMPUTING CONFERENCE at NAB Show - April 8th-9th in Las Vegas, NV. Explore the new ways cloud-based solutions are providing increased reliability and security for content distribution. From collaboration and post-production to storage, delivery, and analytics, decision makers responsible for accomplishing their content-related missions will find this a must-attend event.
Digital Hollywood Spring - April 29th-May 2nd in Marina Del Rey, CA. The premier entertainment and technology conference. The conference where everything you do, everything you say, everything you see means business.
CLOUD COMPUTING EAST 2013 - May 19th-21st in Boston, MA. CCE:2013 will focus on three major sectors, GOVERNMENT, HEALTHCARE, and FINANCIAL SERVICES, whose use of cloud-based technologies is revolutionizing business processes, increasing efficiency and streamlining costs.
P2P 2013: IEEE International Conference on Peer-to-Peer Computing - September 9th-11th in Trento, Italy. The IEEE P2P Conference is a forum to present and discuss all aspects of mostly decentralized, large-scale distributed systems and applications. This forum furthers the state-of-the-art in the design and analysis of large-scale distributed applications and systems.
CLOUD COMPUTING WEST 2013 — October 27th-29th in Las Vegas, NV. Three conference tracks will zero in on the latest advances in applying cloud-based solutions to all aspects of high-value entertainment content production, storage, and delivery; the impact of cloud services on broadband network management and economics; and evaluating and investing in cloud computing services providers.