March 4, 2013
Volume XLII, Issue 12
Cloud Continues to Move into the Mainstream
Excerpted from the Wall St. Journal Report by Steve Rosenbush & Michael Hickins
Royal Dutch Shell, the largest company in the world by revenue, could afford to build all the data centers that it desires. But like many other major corporations, it has enthusiastically embraced cloud computing, which has been introduced at multiple layers of its information technology (IT), from routine applications to the very core of the energy giant's global IT infrastructure.
Large organizations are adopting cloud technology for many of the same reasons smaller organizations have been attracted to the idea of renting rather than owning computing resources — greater flexibility and cost savings, resiliency in the event of natural disasters, and the ability to try new technologies without having to commit to them. And while they acknowledge concerns about cybersecurity, they believe they can safely pursue their cloud strategies while taking appropriate precautions.
"For every application, we look at whether the cloud can be used, and at that moment we look at all aspects, including security. When all criteria are met we will launch on the cloud and therefore we believe that the cloud is secure enough for a number of scenarios," said Johan Krebber, Shell group IT architect and lead architect for the Projects and Technology Business.
Joe AbiDaoud, CIO of Hudbay Minerals, says security is of course "a big deal," but notes that customers of cloud computing can take steps to ensure it. He says the company only works with reputable vendors that can provide the "level of service that's required." He also says he'll only work with applications that support single sign-on using Active Directory, which allows employees to access approved cloud services by signing on to a single, federated system.
While Shell maintains the majority of its IT in its own data centers, "We already have quite a lot of computer resources that we use in the cloud," Mr. Krebber says. Approximately three years ago, Shell started using Amazon Web Services' public cloud to host various types of applications.
In some cases, Shell used the service to explore emerging applications, such as predictive analytics, that interested leadership, without being obligated to commit significant resources. But in others it is using the service "to run our business applications," Mr. Krebber said. The most significant cloud-based business applications include software used for training management, business planning, and engineering design services.
In addition to Amazon, Google and Microsoft also allow third parties to lease the use of their computing infrastructure through an Internet-based connection. The on-demand cloud service is known as infrastructure-as-a-service (IaaS).
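In practice, renting that infrastructure happens through the provider's API rather than a purchase order. As a purely illustrative sketch (not from the article), here is how a short Python script using the boto3 AWS SDK might rent and then release a single server; the image ID and instance type are placeholders.

```python
# A minimal IaaS sketch using boto3, the AWS SDK for Python.
# The AMI ID and instance type below are placeholders, not real values.
import boto3

ec2 = boto3.resource("ec2", region_name="us-east-1")

# Rent a server on demand; billing starts when the instance launches.
instances = ec2.create_instances(
    ImageId="ami-XXXXXXXX",   # placeholder machine image
    InstanceType="t2.micro",  # placeholder instance size
    MinCount=1,
    MaxCount=1,
)
instance = instances[0]
instance.wait_until_running()
print("Running:", instance.id)

# Stop paying by releasing the resource when it is no longer needed.
instance.terminate()
```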
The cloud can in many cases lower the price of computing resources, but it also eliminates wasteful expenditure, because the company pays only for the computing resources it needs. Many "large scale systems," though, still "run inside" company-owned data centers, Mr. Krebber said.
Mr. AbiDaoud says cost "is definitely a factor" in considering cloud, but he warns, "you have to think what cost means." Renting cloud applications over a long period of time may eventually cost more than buying them outright, but Mr. AbiDaoud says other expenses, including the cost of upgrading purchased technology and of employing engineers to maintain it, should be factored in as well. "You have to think about total cost of ownership, the ability to support it," he said.
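As a purely hypothetical illustration of the total-cost-of-ownership point (the figures below are invented, not Hudbay's), a quick calculation shows how recurring upgrade and staffing costs can tip the balance:

```python
# A quick, purely hypothetical total-cost-of-ownership comparison; every
# figure below is invented for illustration, not drawn from Hudbay.
years = 5

# Renting: a recurring subscription fee, upgrades included by the vendor.
rent_per_year = 20_000
rent_total = rent_per_year * years

# Buying: license up front, plus recurring upgrades and support staff.
license_cost = 50_000
upgrades_per_year = 5_000
support_staff_per_year = 15_000
buy_total = license_cost + (upgrades_per_year + support_staff_per_year) * years

print(f"Rent for {years} years: ${rent_total:,}")  # $100,000
print(f"Buy for {years} years:  ${buy_total:,}")   # $150,000
```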
One advantage in using public cloud resources is that companies like Shell can experiment with analytic technology based on the open source Hadoop framework. Many executives believe this type of software can provide their companies with a significant competitive advantage, but there is still little consensus about how best to use it, and even fewer engineers and analysts with extensive experience using it in a corporate environment.
"In the case of the Hadoop architecture, we are in a small-scale proof of concept and pilot phase and we do that in a cloud-based environment," Krebber says. "We are running it in a public cloud environment to validate and to determine how best to run it inside Shell. Once we have decided that, we might still remain inside the cloud, depending on the amount of data we want to store in that environment."
The Hadoop trial is running on AWS.
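The article doesn't detail Shell's actual jobs, but a typical small-scale Hadoop pilot starts with something like the classic Hadoop Streaming word count sketched below; the script name and paths are placeholders.

```python
#!/usr/bin/env python
# Generic Hadoop Streaming word count, the classic proof-of-concept job.
# Not Shell's workload; only a sketch of the kind of job a pilot might run.
# Usage (paths and script name are placeholders):
#   hadoop jar hadoop-streaming.jar \
#     -input /data -output /counts \
#     -mapper "wordcount.py map" -reducer "wordcount.py reduce" \
#     -file wordcount.py
import sys

def mapper():
    # Emit (word, 1) for every word on stdin.
    for line in sys.stdin:
        for word in line.split():
            print(f"{word}\t1")

def reducer():
    # Input arrives sorted by key, so counts for a word are contiguous.
    current, total = None, 0
    for line in sys.stdin:
        word, count = line.rsplit("\t", 1)
        if word != current:
            if current is not None:
                print(f"{current}\t{total}")
            current, total = word, 0
        total += int(count)
    if current is not None:
        print(f"{current}\t{total}")

if __name__ == "__main__":
    mapper() if sys.argv[1] == "map" else reducer()
```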
Nissan Motor Co. is using Microsoft's IaaS service, Azure, to develop telematics, a term that describes the use of sensors to gather data from vehicles, which is wirelessly transmitted back to the company for analysis. The use of Azure, which provides computational power and business solutions, has allowed Nissan to cut the cost of its telematics program in half, according to its CIO, Celso Guiotoko.
Hudbay also uses public cloud infrastructure to reduce costs related to maintaining e-mail and document storage. Mr. AbiDaoud says Hudbay started rolling out Google Gmail in 2012. Now he plans to start promoting other Google apps, such as Hangout video conferencing, instant messaging, and Drive document storage and sharing. He says Google Drive replaces the need for all employees at the company to have a personal virtual drive maintained in the data center.
Cloud computing also allows companies to build redundancy and resiliency into their systems. CIO Alexander Pasik of IEEE says Hurricane Sandy taught the professional organization for technologists a lesson about the need for network redundancy — "no one ever thought we'd have a two-week outage," he says. The organization was closed for four days in the aftermath of the hurricane and "important applications were down."
IEEE uses Amazon templates for 50% of the websites IEEE maintains on behalf of member organizations, including chapter and conference websites. Those sites didn't go down during Sandy.
No one believes cloud computing will completely supplant traditional data centers run by individual corporations. But the flexibility it provides organizations to increase or decrease their use of computing and storage power on an as-needed basis is compelling enough that most agree its use will only increase.
In addition to freeing financial resources, cloud computing also allows organizations to reassign software engineers and other IT workers from mundane maintenance tasks to efforts that could directly affect business performance, such as developing new applications needed by their organizations to compete more effectively. Mr. AbiDaoud says he doesn't anticipate the corporate data center "completely disappearing." But he believes it is "going to shrink."
Report from CEO Marty Lafferty
The DCIA salutes the Cloud Security Alliance (CSA) for releasing its 2013 report on The Notorious Nine leading security threats that affect cloud computing.
CSA's list reflects an assessment by industry experts of risks specifically related to the nature of cloud computing technology, and we plan to address these in detail during the CLOUD COMPUTING CONFERENCE at the 2013 NAB Show.
Leading CSA's list are Data Breaches. A virtual machine (VM) could use side-channel timing information, for example, to extract private cryptographic keys in use by other VMs on the same server.
Key to preventing this risk for users of multitenant cloud service databases is proper design, so that a single flaw in one client's application cannot allow an attacker to access not only that client's data but other clients' data as well.
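As a minimal illustration of that design principle (a sketch with a hypothetical schema, not drawn from the CSA report), parameterized queries combined with mandatory tenant scoping keep one client's hostile input away from another client's rows:

```python
# Sketch of multitenant isolation: every query is parameterized (no string
# concatenation) and scoped to the caller's tenant, so a flaw in one
# client's input can't reach another's rows. Schema is hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (tenant_id TEXT, doc TEXT)")
conn.execute("INSERT INTO records VALUES ('acme', 'acme-secret')")
conn.execute("INSERT INTO records VALUES ('globex', 'globex-secret')")

def fetch_docs(tenant_id, search):
    # Placeholders (?) keep hostile input inert, and the mandatory
    # tenant_id predicate keeps each client inside its own partition.
    return conn.execute(
        "SELECT doc FROM records WHERE tenant_id = ? AND doc LIKE ?",
        (tenant_id, f"%{search}%"),
    ).fetchall()

# Even an injection-style search string returns nothing from other tenants.
print(fetch_docs("acme", "' OR '1'='1"))  # []
print(fetch_docs("acme", "secret"))       # [('acme-secret',)]
```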
In second place is Data Loss. The disappearance of valuable data into the ether could result from a malicious hack, a careless service provider, or a natural disaster disrupting a data center. Preventing data loss requires thoughtful attention to off-premises redundancy and intelligent use of encryption.
The third greatest security risk is Account Hijacking. Compromised account credentials can lead to falsified information, data manipulation, and redirection to illegitimate sites. Further, a compromised account may be converted into a launching pad for subsequent attacks. The key to defending against this threat is to protect credentials from being stolen by prohibiting the sharing of account credentials and leveraging strong two-factor authentication techniques.
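One widely used second factor is the time-based one-time password (TOTP, RFC 6238) behind most authenticator apps. The following is a minimal standard-library sketch of how such a code is derived; the secret shown is an example value, not a real credential.

```python
# Minimal sketch of a time-based one-time password (TOTP, RFC 6238).
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, interval=30, digits=6):
    """Derive the current one-time code from a shared base32 secret."""
    key = base64.b32decode(secret_b32)
    counter = int(time.time()) // interval          # 30-second time step
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# The server stores the same secret and simply recomputes the code, so a
# stolen password alone is no longer enough to hijack the account.
shared_secret = "JBSWY3DPEHPK3PXP"  # example secret, not a real credential
print(totp(shared_secret))
```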
Fourth on the list of threats are Insecure APIs. Application program interfaces are integral to cloud provisioning, management, orchestration, and monitoring. For third parties to build on APIs and inject add-on services, access to credentials is required, and this creates vulnerability. To combat this weakness, the security implications of each of these functions need to be understood, and confidentiality, integrity, and accountability measures factored into providing availability.
Fifth is Denial of Service. While DoS is not a new threat to Internet-based services, its impact can be exacerbated in situations where users rely on full-time access to cloud-based services. DoS outages can exponentially increase costs to customers who are billed for compute cycles and disk-space consumption, causing severe economic harm even if they don't entirely take down a service. Early detection is critical to damage prevention from DoS.
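As a toy sketch of early detection (an approach assumed here for illustration; the CSA report doesn't prescribe one), a sliding-window counter can flag a client whose request rate explodes before the metered bill does:

```python
# Toy early-detection sketch: flag a client whose request rate in a sliding
# window exceeds a budget. Window size and threshold are arbitrary values.
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 10
MAX_REQUESTS = 100  # per client per window

recent = defaultdict(deque)  # client_id -> timestamps of recent requests

def allow(client_id, now=None):
    """Return False (and alert) once a client exceeds the window budget."""
    now = now if now is not None else time.time()
    q = recent[client_id]
    q.append(now)
    while q and q[0] < now - WINDOW_SECONDS:
        q.popleft()  # drop requests that fell out of the window
    if len(q) > MAX_REQUESTS:
        print(f"ALERT: {client_id} sent {len(q)} requests in {WINDOW_SECONDS}s")
        return False
    return True

# Simulated burst: the alarm trips from the 101st request on.
for i in range(120):
    allow("client-42", now=1000.0 + i * 0.01)
```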
The sixth most challenging exposure can come from Malicious Insiders, who gain access to a network, system, or data for malevolent purposes. Good cloud service design must ensure that keys are securely kept with the customer and made available only at data-usage time to reduce this risk.
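A minimal sketch of that idea, using the third-party Python "cryptography" package (an illustrative choice, not one named by CSA): the customer generates and holds the key, and only ciphertext ever reaches the provider.

```python
# Sketch of customer-held keys: data is encrypted with a key the customer
# keeps, so provider staff see only ciphertext.
# Requires the third-party package: pip install cryptography
from cryptography.fernet import Fernet

# The key is generated and held on the customer's side, never stored
# alongside the data at the provider.
customer_key = Fernet.generate_key()
cipher = Fernet(customer_key)

ciphertext = cipher.encrypt(b"quarterly results, pre-release")
# Only ciphertext is uploaded; an insider with storage access learns nothing.
store_in_cloud = ciphertext

# At data-usage time the customer supplies the key and decrypts locally.
print(cipher.decrypt(store_in_cloud))
```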
Seventh on the list is Cloud Abuse, such as leveraging the greater power of cloud computing versus a stand-alone computer to break a difficult encryption key, launch a DoS attack, propagate malware, or disseminate infringing software. Cloud providers must define up-front what would constitute abuse and determine the best processes for identifying it as early as possible.
The eighth most severe security threat is Insufficient Due Diligence. Adopting cloud services can generate contractual issues with providers over liability and transparency. Operational and architectural issues can also be factors when migrating applications to the cloud. The remedy is to allocate sufficient resources for performing adequate due diligence before jumping into the cloud.
Finally, Shared Technology Vulnerabilities are the ninth greatest security threat. To offer scalability, cloud service providers share infrastructure, platforms, and applications to deliver their services, meaning individual components like CPU caches and GPUs are exposed beyond the purposes they were designed to serve in isolation.
If a hypervisor, a shared platform component, or an application is compromised, it can expose the entire environment to a potential breach. A defense-in-depth strategy, including compute, storage, network, application, and user security enforcement, as well as monitoring, provides the best means of prevention.
There are special discount codes for DCINFO readers to attend the 2013 NAB Show. The code for $100 off conference registration is EP35. And the code for FREE exhibit-only registration is EP04. Share wisely, and take care.
Move to the Cloud without Compromising Security
Excerpted from TechRepublic Report
You've heard why so many companies now store data in the cloud: access anytime, from any computer or device, and scalability. However, many businesses have valid concerns about giving up control of data.
Read how Citrix ShareFile resolves this issue with advanced security features throughout the entire file storage and transfer process.
Granular access permissions, comprehensive activity reports, encryption for data in transit and at rest, and mobile access management bring total control of data back to the business.
DDN Ushers in New Era of Hadoop Simplicity
Excerpted from SearchStorage Report by Sonia Lelii
High-performance-computing storage specialist DataDirect Networks (DDN) today unveiled hScaler, an appliance designed to speed the deployment of Apache Hadoop-based big data analytics.
The hScaler is built on DDN's Storage Fusion Architecture (SFA) 12K clustered network-attached storage (NAS) array. DDN claims the Hadoop storage and compute system can scale up to approximately 7 PB of capacity, uses 40 Gbps networking through InfiniBand, and performs at about 1.5 million IOPS. The hScaler runs the Hortonworks Data Platform Hadoop distribution.
Jean-Luc Chatelain, DDN's Executive Vice President of Strategy and Technology, said the goal with hScaler is to enable Hadoop adoption in large enterprises.
"The reality is enterprise Hadoop has not gotten into data centers. The majority of enterprises are not using Hadoop," Chatelain said. "Building a system for Hadoop is a science project. The customer has to cobble together a system the hard way. Some enterprise customers take up to six months to deploy a Hadoop infrastructure. Our appliance reduces implementation to about eight hours."
The hScaler storage and compute nodes can scale independently. An appliance supports at least two 4U SFA12K storage enclosures with 84 SAS drives per enclosure. A two-enclosure system includes up to 345 TB of usable capacity with 3 TB drives. The hScaler also supports solid-state drives. An hScaler can hold 32 compute nodes in a rack, with each server equipped with two eight-core CPUs and 64 GB of RAM.
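A back-of-envelope check of those figures (assuming, though the article doesn't say, that the raw-versus-usable gap is RAID parity and spares):

```python
# Back-of-envelope check on the capacity figures above. The gap between
# raw and usable capacity is presumably RAID parity and spares
# (an assumption; the article does not break it down).
enclosures = 2
drives_per_enclosure = 84
tb_per_drive = 3

raw_tb = enclosures * drives_per_enclosure * tb_per_drive
usable_tb = 345
print(f"raw: {raw_tb} TB, usable: {usable_tb} TB "
      f"({usable_tb / raw_tb:.0%} of raw)")  # raw: 504 TB ... (68% of raw)
```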
"The way you grow Hadoop is to add servers and direct-attached drives," Chatelain said. "The drawback is every time you need more compute, you have to add more drives. That leads to an imbalance of storage and compute. We take the approach that there is a more efficient way to do that. We have the flexibility to scale compute from the storage."
There are two nodes per appliance that run the Hadoop Distributed File System (HDFS) and the extract, transform and load (ETL) process that gathers data from the application servers. The system includes a NameNode that runs the software to orchestrate and manage Hadoop functions, supporting 12 cores and 128 GB of RAM per node. The NameNode manages the data copy and distribution process for Hadoop.
Each appliance has 48 ports for 10 Gigabit Ethernet switches, 44 ports for Gigabit Ethernet switches, and 36 ports for Mellanox Technologies InfiniBand switches.
Evan Quinn, Senior Principal Analyst for Data Management and Analytics at Enterprise Strategy Group, said most organizations follow the "do-it-yourself path" when designing a Hadoop system. They tend to start with a proof-of-concept implementation that works well but becomes more difficult when it is scaled for larger implementations. That leads companies to build server and storage farms for big data.
"First, they buy commodity hardware and it works well," Quinn said. "But when it grows to the enterprise level, they are back to the server- and storage-farm business that companies are trying to move away from. Hadoop is supposed to cost less, but at some point it becomes more expensive. So, now it's beginning to shift to appliances."
Quinn said integrated appliances are becoming the preferred approach for big data and Hadoop storage because "the original design of Hadoop with storage associated with each node is problematic."
DDN's hScaler will ship at the end of March. The company has not released pricing details. The system competes primarily with IBM's Netezza and Teradata Corp. appliances.
Mainstream storage vendors EMC and NetApp have also taken steps to integrate Hadoop with specific platforms.
Qualys & Verizon Deliver Cloud-Based IT Security
Qualys, a pioneer and leading provider of cloud security and compliance management solutions, and Verizon, a leading provider of global managed security solutions, today announced an agreement to expand their relationship to deliver new advanced cloud-based IT security and compliance management services.
These services include vulnerability management, web application scanning, PCI and policy compliance solutions to organizations around the world. Verizon enterprise clients can now benefit from the performance, security, and scalability of the QualysGuard Cloud Platform to secure and protect their IT assets and Web applications from cyber attacks and to automate compliance.
Verizon also will layer additional analytics with these managed security services by leveraging its proprietary risk models to provide clients with actionable information about their security risk posture.
"Qualys' cloud platform, commitment to innovation, and global reach have made them a leader in the security services space," said Cindy Bellefeuille Stanton, Director of Security Product Management for Verizon Enterprise Solutions. "By incorporating these solutions into our managed services and consulting offerings, our clients now have more choice and flexibility in meeting their compliance and security requirements — a growing area of focus for our clients."
"Verizon has built an impressive global managed security and consulting practice and together we will deliver leading edge IT security and compliance solutions to organizations around the world," said Philippe Courtot, Chairman and CEO, Qualys. "Our expanded relationship will pay big dividends for enterprises looking to strengthen their security defense."
The QualysGuard Cloud Platform and its integrated suite of security and compliance solutions help provide organizations of all sizes with a global view of their security and compliance posture, while reducing their total cost of ownership.
The QualysGuard Cloud Suite, which includes Vulnerability Management, Web Application Scanning, Malware Detection Service, Policy Compliance, PCI Compliance, and Qualys SECURE Seal, enables customers to identify their IT assets, collect and analyze large amounts of IT security data, discover and prioritize vulnerabilities and malware, recommend remediation actions, and verify the implementation of such actions.
Huawei Launches OpenStack-Powered Cloud
Excerpted from Network World Report by Brandon Butler
Chinese telecom giant Huawei today announced FusionCloud, an OpenStack-powered cloud aimed mostly at carrier providers.
Huawei joined OpenStack in October with subdued fanfare compared to other companies that have joined the open source cloud computing project. OpenStack was started by Rackspace and NASA in 2010 and has since grown to include dozens of major vendors, including IBM, Dell, HP, VMware and others, but Huawei is by far the largest telco-focused company to join OpenStack.
Huawei's FusionCloud, unveiled at the Mobile World Congress show in Barcelona, has a variety of components, including FusionSphere, an operating system for running cloud deployments, as well as FusionCube, a complementary converged infrastructure component that incorporates compute, storage, and networking.
"Telecom services require low latency, high reliability, and high performance," said Ren Zhipeng, General Manager, Cloud Computing Product Line at Huawei. "We know that in addition to satisfying customer needs and service requirements based on speed, a reliable and unified management platform is required to ensure the efficient operation of core services. With complete OpenStack compatibility, we are confident that FusionCloud represents the optimal cloud computing solution for carrier organizations." In addition to announcing the FusionCloud, Huawei says it has launched an internal cloud that serves its 70,000 active users.
A variety of other OpenStack-backed companies are in various stages of developing OpenStack-powered clouds, but Huawei today jumps to the forefront. Rackspace is the furthest along, with its OpenStack-powered cloud already in production. HP has its OpenStack-powered public cloud in a preview version, while an official from Dell said that the company's OpenStack-powered public cloud would be available late in 2013. Dell already has VMware-powered public cloud computing offerings.
The OpenStack community is gearing up for its next biannual summit in April, when members of the open source project will meet in Portland, OR, to discuss the newest release of the OpenStack code, named Grizzly.
BitTorrent Surf Boosts "Content Discoverability"
Excerpted from ZeroPaid Report by Jared Moya
Last month I mentioned BitTorrent's new Chrome extension BitTorrent Surf, and the company continues to experiment with exciting new content distribution models.
BitTorrent Surf allows you to search for AND download content right in the Chrome browser, without having to visit a BitTorrent tracker site OR launch a standalone BitTorrent client. BitTorrent Surf does both.
The move was part of more than a year-long process of experimentation to "create a more sustainable distribution model for the Internet's creators and fans."
"Within 365 days, we've been able to create a legitimate content ecosystem 85-petabytes-big connecting 16 innovative artists with 170 million activist listeners," notes the company.
Recently the company demoed an update to BitTorrent Surf that strengthens the ability of it and its content partners to create a scalable, sustainable distribution model by optimizing the BitTorrent platform for artist visibility.
In other words, the musicians, authors, filmmakers, and others who choose to partner with BitTorrent to deliver their content to the masses outside of standard distribution channels will rightly see that content prioritized in search results, helping to create a strong, standalone alternative for content creators.
"We've learned that BitTorrent users invest money and time to support artists," it adds. "We've learned that they want to see better content in the BitTorrent ecosystem. We've learned that artists who distribute work via BitTorrent create stronger connections with fans."
They do indeed.
Stay tuned.
Free Ad-Supported TV Shifts from Linear to the Internet
Excerpted from The Guardian Report by Colin Dixon
In the United States, television viewing is completely dominated by pay TV providers. With 90% of households relying on cable, satellite, or TelcoTV as their primary source of TV entertainment, one might be forgiven for thinking free TV was a dying industry.
New data from the NPD group, however, shows that free TV isn't a vanishing resource at all. It's just moving online. The company said that online TV streamers were mostly using PCs and that the importance of tablets and smart-phones was overblown. However, other data suggests this might not be the case.
According to NPD, 12% of US TV viewers said they had streamed shows for free in the previous three months. This makes free TV sites like Hulu and ABC.com about as popular as pay sites such as Netflix and Amazon Prime. What's more, the company also says that over half of the streaming TV viewers are in the 18- to 34-year-old demographic.
This is in line with Nielsen three-screen data from the second quarter of 2012. Nielsen found that the same group streamed the most video from the Internet. For example, 18- to 34-year-olds streamed 40% more than 35- to 49-year-olds. What is even more interesting is that the latter group watched 43% more traditional TV than the former. This lends some credence to NPD's postulation that the younger age group is turning from shorter video clips and toward longer-form TV shows.
The most popular site for streaming free TV is Hulu, which 43% of free TV streamers reported using. CBS.com was the next most popular destination (10%), followed by ABC.com (9%), and Fox.com and NBC.com (both with 4%). Generally, consumers were very happy with their experiences at these sites, with over 75% reporting they were likely to return to view more shows.
One factor affecting the use of some sites could be TV show accessibility. For example, the poor performance of Fox could be explained by the fact that last year the company made the decision to delay releasing new shows through Fox.com. A viewer wishing to watch for free must wait eight days after the first TV broadcast to catch the latest episode.
It is still possible to watch within the eight-day window, but a consumer must have a pay-TV subscription to do it and log in with their operator credentials. Given that 90% of US households have pay TV, this might, at first, seem to be no deterrent at all. However, other research from GfK indicates that just requiring a user to log in is sufficient deterrent to make them move on to other free sites.
Another statistic revealed by NPD was that 83% of TV streams were sent to the PC. This led NPD Group's Russ Crupnick to chide the industry for "lavishing" too much attention on new devices such as tablets and smart-phones. Is he right? Are companies such as Verizon and ABC wasting their time on mobile apps when they should be focusing on the PC?
New data from Conviva, a video streaming optimization company, seems to indicate that would be a mistake. According to Darren Feher, President and CEO of the company, a tier one US TV content provider is seeing a very different split between devices for its streams. In January, just 32% of video streams went to the PC: almost half went to mobile platforms like the iPad and iPhone. At least for this premium TV provider, mobility is a critical platform for delivery. Focusing just on the PC would result in a lot of very unhappy viewers.
Conviva's data seems to be corroborated by recent streaming data from the BBC in the UK. The broadcaster reported that, in October, nearly a quarter of all streaming requests came from mobile phones and tablets, while the PC accounted for just over half. In addition, the trajectories of platform usage are radically different. In just one year, PC requests had fallen from 65% to 52% while mobile platform requests had risen from 9% to 23%.
Further evidence that it would be dangerous to focus on the PC comes from Cisco. The company forecasts that mobile data consumption will grow at a startling 66% a year through 2017. By comparison, broadband data traffic is forecast to grow at a much more modest 26% a year. The largest component of mobile data traffic is video, consuming 66% of total bandwidth by 2017.
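To see what those compound rates imply, here is a two-line calculation over the four years of growth from 2013 to 2017:

```python
# What the forecast compound growth rates imply over the period
# (2013 through 2017, i.e., four years of growth).
mobile_rate, broadband_rate = 0.66, 0.26
years = 4

mobile_multiple = (1 + mobile_rate) ** years        # ~7.6x
broadband_multiple = (1 + broadband_rate) ** years  # ~2.5x
print(f"mobile traffic: {mobile_multiple:.1f}x, "
      f"broadband traffic: {broadband_multiple:.1f}x")
```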
For free TV broadcasters, the NPD data suggests reaching out through the Internet is an excellent strategy for engaging younger viewers. However, there is a mountain of data which suggests eschewing mobile platforms for the PC probably isn't the best way to go about it.
BrightTag Fuse Taps "The Cloud" for Multichannel Data
Excerpted from RTM Daily Report by Tyler Loechner
BrightTag this week announced Fuse, a multichannel real-time data provider. BrightTag seeks to break from the mold of traditional tag management systems by streamlining data from all channels — not just the web — via the cloud.
"Marketers today are forced to choose between day-old data locked in 'cold storage' or real-time data across a single channel," stated Mike Sands, president and CEO of BrightTag. "At a time when consumers are already interacting with brands across multiple channels, marketers need sophisticated tools to help them capitalize on cross-channel activities and take action instantly."
The new technology can match user profiles and cookies in real time across devices. That type of functionality and all-in-one data will make life easier for marketers. BrightTag claims that Fuse's on-demand data integration is an industry first.
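BrightTag doesn't disclose its matching method, but conceptually, cross-device matching links identifiers that appear alongside the same stable key, such as a hashed sign-in e-mail. A hypothetical sketch:

```python
# A conceptual sketch (not BrightTag's actual method, which the article
# doesn't detail) of cross-device matching: identifiers seen with the same
# stable key, here a hashed sign-in e-mail, are linked into one profile.
import hashlib
from collections import defaultdict

profiles = defaultdict(set)  # stable key -> device/cookie identifiers

def link(email, device_id):
    key = hashlib.sha256(email.lower().encode()).hexdigest()
    profiles[key].add(device_id)
    return key

# The same person signs in on a laptop browser and a phone app.
key = link("pat@example.com", "cookie:77af31")
link("pat@example.com", "idfa:9c2e-44b1")

print(profiles[key])  # both identifiers now resolve to one profile
```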
"Multichannel marketing integration is a critical opportunity for me," stated Joseph Yakuel, Cross Site Marketing Manager at Quidsi. "I need to be able to integrate mobile and desktop user interactions in order to drive successful marketing programs."
Cloud Computing Is Simplifying Things
Excerpted from Cloud Computing Journal Report by Pat Romanski
"You need two groups when dealing with cloud compliance," explained Rob LaMear IV, CEO and Founder of Fpweb, in this exclusive Q&A with Cloud Expo Conference Chair Jeremy Geelan. "First," LaMear continued, "you need a provider that is willing to operate transparently and work with you and your auditors. Most seasoned providers are well aware of this symbiotic relationship and are open to getting it out in the open early."
Cloud Computing Journal: The move to cloud isn't about saving money; it is about saving time. Agree or disagree?
Rob LaMear: Agree. Time is money. Focusing your team on strategic initiatives gives you a competitive advantage. You get to market faster and can deliver something truly special before your competitors. First one to market typically owns 70-80% of the market share. Think Apple.
Cloud Computing Journal: How should organizations tackle their regulatory and compliance concerns in the cloud? Who should they be asking / trusting for advice?
LaMear: You need two groups when dealing with cloud compliance. First, you need a provider who is willing to operate transparently and work with you and your auditors. Most seasoned providers are well aware of this symbiotic relationship and are open to getting it out in the open early. Second, you need a progressive competent audit team. The cloud can be architected and managed to meet most compliance concerns. The auditor should be engaged from day one so the provider, customer and auditor are all on the same page from the start. No surprises please.
Cloud Computing Journal: What does the emergence of open source clouds mean for the cloud ecosystem? How does the existence of OpenStack, CloudStack, OpenNebula, Eucalyptus and so on affect your own company?
LaMear: Like any technology in its infancy, there will be competing standards that will detract and slow down adoption. Cloud standards are necessary from an operations, governance and security perspective. We need relatively simple ways of connecting, securing and managing all the various public and private clouds. Moving forward, the standards will consolidate and adoption will accelerate.
Cloud Computing Journal: With SMBs, the two primary challenges they face moving to the cloud are always stated as being cost and trust: where is the industry on satisfying SMBs on both points simultaneously - further along than in 2011-12, or...?
LaMear: Not there yet for the SMBs. We are certainly progressing but I think we are still looking down the road to 2015 before SMBs will be able to easily glide in and out of clouds without having to worry about migration headaches, extensive consulting assistance or vendor lock-in.
Cloud Computing Journal: 2013 seems to be turning into a breakthrough year for Big Data. How much does the success of cloud computing have to do with that?
LaMear: Quite a bit. With true cloud computing available from a variety of sources, and more spawning daily, you can crunch Oracle, SQL and Hadoop to your heart's content without standing up big iron. You can fire up a Big Data project in the cloud, do the work and turn it off when finished. On-demand Big Data infrastructure and analysis is a game changer for large enterprises and is trickling down into the small and medium enterprise. Everyone needs better BI to compete globally.
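LaMear's fire-it-up-and-turn-it-off pattern maps directly onto on-demand cluster services. As one illustrative possibility (Amazon EMR via boto3, with placeholder names and paths, not something LaMear specifies), a cluster can be configured to terminate itself the moment its work completes:

```python
# Hedged sketch of the "fire it up, do the work, turn it off" pattern
# using boto3 and Amazon EMR. Names, paths, and the step are placeholders.
import boto3

emr = boto3.client("emr", region_name="us-east-1")

response = emr.run_job_flow(
    Name="one-off-analysis",
    ReleaseLabel="emr-6.15.0",  # placeholder release
    Instances={
        "MasterInstanceType": "m5.xlarge",
        "SlaveInstanceType": "m5.xlarge",
        "InstanceCount": 4,
        # The key line: the cluster shuts itself down when the work is
        # done, so billing stops automatically.
        "KeepJobFlowAliveWhenNoSteps": False,
    },
    Steps=[{
        "Name": "crunch",
        "ActionOnFailure": "TERMINATE_CLUSTER",
        "HadoopJarStep": {
            "Jar": "command-runner.jar",
            "Args": ["spark-submit", "s3://my-bucket/job.py"],  # placeholder
        },
    }],
    JobFlowRole="EMR_EC2_DefaultRole",
    ServiceRole="EMR_DefaultRole",
)
print("Cluster:", response["JobFlowId"])
```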
Cloud Computing Journal: What about the role of social: aside from the acronym itself - SMAC (for Social, Mobile, Analytics, Cloud) - are you seeing and/or anticipating major traction in this area?
LaMear: The SMAC down has begun. Folks are making decisions based on social. Most in the Far East don't have PCs; they have phones. What is measured is improved. And the cloud is making all this possible, and iteratively faster. Helping businesses be more successful is the name of the game, and SMAC is the avenue to get bigger results faster.
Cloud Computing Journal: To finish, just as real estate is always said to be about "location, location, location," what one word, repeated three times, would you say cloud computing is all about? (Example: Scalability, Scalability, Scalability)
LaMear: "Easy, Easy, Easy." Think technology at home or the enterprise and do these sound familiar? I don't have time. It's too hard. I just want it to work. Cloud computing is simplifying things.
Managing Big Data in the Cloud
Excerpted from Baseline Report by Bob Violino
Two of the hottest IT trends today are the move to cloud computing and the emergence of big data as a key initiative for leveraging information. For some enterprises, both of these trends are converging, as they try to manage and analyze big data in their cloud deployments.
"Our research with respect to the interaction between big data and cloud suggests that the dominant sentiment among developers is that big data is a natural component of the cloud," says Ben Hanley, Senior Analyst at research firm Evans Data. Companies are increasingly using cloud deployments to address big data and analytics needs, he says, adding, "We have observed significant growth with respect to the interaction between cloud and big data."
Geostellar, a Washington, DC company that provides computations of available renewable-energy resources for geographic locations, is involved in both the cloud and big data. The company has had to develop strategies — including the use of cloud services — to store, process, and move the petabytes of information in various formats that it processes and provides to customers.
The company didn't move to the cloud until about a year and a half ago. It started out by providing data to customers via hard drives. Later it implemented on-site virtualized servers and moved them into hosted environments, and then migrated to the cloud.
"All of the data we're processing has to be centralized in our operations center," says CEO David Levine, "because the various fields are so large, and it's much more efficient in terms of the proximity of dedicated CPUs and disk drives for reading and writing and processing configurations."
Before the company processes data internally, various sources ship raw data sets via hard drives sent by overnight delivery or some other means. "We take all these different data assets and create data structures, so when the customer looks up a particular property, he has the profile he needs," Levine explains. That applies regardless of whether it's weather patterns or available resources in the area being examined.
The data Geostellar collects isn't moved within the cloud because of its large size. "We've got these very large files — imagery, surface models, databases, etc. — and we have to aggregate all of this information," Levine says. "And people are still shipping that to us on hard drives because of the bandwidth."
Once processing of the data is complete, Geostellar streams it over to the cloud, and then customers can access and interact with the data from there. "We and customers can work with data in the cloud because we've already created all these interrelated structures," Levine says.
Over time, Geostellar has developed its process of gathering and analyzing large volumes of information, producing connected spatial-relational data sets and then moving the data from its data centers to the cloud.
The company now operates two separate infrastructures, a highly efficient processing system that includes solid-state hard drives and powerful, dedicated servers, and a virtualized, cloud-based environment used for managing the information it produces through computation. The cloud is critical for distributing and providing access to this data, Levine says.
"Probably the biggest benefit of the cloud is that it's much easier to manage capacity," he says. "You can stay ahead of whatever trends are happening." There's also resiliency in terms of long-term storage of the data.
The cost saving is another benefit. "It's the service provider's excess capacity we're using, and the memory is cheaper than if we had procured our own systems and set up our own nodes," Levine says.
Another organization using big data in the cloud is the Virginia Bioinformatics Institute (VBI), a research institute in Blacksburg, VA. VBI conducts genome analysis and DNA sequencing using about 100 terabytes of data that's collected each week from around the world.
"Our largest project is the downloading and reanalysis of every sequenced human genome to identify new biomarkers and drug targets, especially for cancer," says Skip Garner, Executive Director and Professor at VBI. "We are analyzing approximately 100 genomes per day, and these are all downloaded from the cloud."
Data generated from various scientific sources is downloaded and then analyzed on VBI servers. "Recently, it has become easier and more efficient to download what we need and not keep local copies, for it amounts to tens of petabytes," Garner says. "So the cloud has enabled us to download, use and throw away raw data to save space, and then download again if necessary."
The institute hasn't used non-cloud compute resources for the research work because its codes "are memory hogs, requiring servers with at least a terabyte of RAM," he explains.
Managing big data in the cloud does come with challenges, Garner points out. The big issues are security and intellectual property. For example, VBI has permission to download certain data sets, and, in those agreements, it must maintain control, allowing only certain people to have access to the data.
"We can be absolutely sure of where the data is when it is in our servers, and we are confident that we are adhering to the terms of agreements," Garner says. "That is not [the case] when data is in the cloud. So, currently, we do not put data in the cloud, we only download."
Downloading and using data from the cloud saves VBI a lot on storage costs, and the return on investment was "immediate", according to Garner.
As organizations approach big data, their first choice for compute and storage platforms should be the cloud, says Chris Smith, US Federal Chief Technology and Innovation Officer at New York, NY-based Accenture, a global management consulting company.
"Low cost, highly scalable and elastic capabilities are the right formula for implementing big data," Smith says. "In some cases, a big data solution in a highly secure environment may dictate an internal data center strategy, but most organizations are developing their own internal private clouds, and this is the right place for those specific solutions as well."
Organizations continue to adopt and implement private, public and hybrid clouds, "with these technologies having become mainstream choices for developing new capabilities," Smith says. "I expect to see increased and even more rapid adoption over the next 18 to 24 months."
As organizations increase the breadth and depth of business technology offerings in the cloud, Smith says, they need to ensure that they can manage information across multiple heterogeneous environments, in order to be able to clearly develop, analyze and articulate the state of business, as well as provide highly available, high-performing services that deliver value.
"A robust cloud brokering and orchestration capability that puts the organization in the driver's seat to maintain, deliver and innovate new and better services will be key for the enterprise," Smith says.
The cloud itself will continue to generate lots of data, says London, UK-based research firm Ovum. In "2013 Trends to Watch: Cloud Computing," the firm says that 2013 will see cloud computing continue to grow rapidly. Cloud computing in all its types — public, private, and hybrid — is building momentum, evolving fast, and becoming increasingly enterprise-grade, Ovum says.
Cloud computing services — and the social and mobile applications that cloud platforms underpin — are generating a lot of data, which, in turn, requires cloud services and applications to make sense of it, Ovum notes.
This trend is fueling other industry trends, such as the Internet of things (machine-to-machine communication and data processing), consumerization of IT and big data.
How to Add "Stickiness" to a Third-Party Cloud App
Excerpted from Total Telecom Report by Nick Wood
Telcos that want a simple way to win customers and generate revenue from cloud services could do worse than opting to resell an off-the-shelf product from a respected software provider; however, Japan's NEC said this method comes with a health warning.
"Operators sometimes ask us for killer apps from some big brands," said Manuel Gallo, Senior Manager of Business Development for EMEA, at NEC's cloud competence center. "When they say that we advise them to get some big names, but we also say don't rely on them too much."
His explanation was fairly straightforward.
"You won't build customer loyalty or retention from killer apps like Microsoft Office 365 because they're available everywhere and you will always find yourself challenged on price."
Instead, Gallo's advice to telcos that want to be successful in the cloud is to create sticky services that become an integral part of their customer's own business.
"I sort of call it forced loyalty," he joked. "By being deeply integrated in, for example, a restaurant's order and billing management systems, it will make them feel less inclined to migrate."
In its capacity as a carrier cloud services provider, NEC's brand is the one thing largely absent from its offering: the solutions it provides to operators are delivered on a white-label basis.
"The NEC brand is well-established in the carrier cloud [services] market, but we don't want to undermine the brands of our operator customers," said Gallo.
"There are sometimes internal concerns that our brand is not listed in cloud service rankings, but I always think, would I rather have the money or the brand? I'll have the money please."
AllJoyn P2P Software Framework Adds Audio Streaming
Excerpted from Engadget Report by Michael Gorman
Qualcomm revealed that it was expanding its AllJoyn software platform today with some new services designed to help create a network of connected devices. Essentially, these services take the AllJoyn P2P software framework and package it in a way that makes it easier for hardware makers to implement.
Qualcomm sees these new services enabling a kind of hub and spoke organization where myriad devices — from coffee makers to stereos — connect to a single Internet gateway. With such a framework in place, users can control those devices and receive notifications from them on a smart-phone or tablet.
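This is not the AllJoyn API itself (which targets C, C++, and Java), but the hub-and-spoke idea can be sketched conceptually in a few lines of Python: devices publish events to a gateway, and a phone subscribes to them.

```python
# A conceptual sketch of the hub-and-spoke pattern described above. This is
# NOT the AllJoyn API; it only illustrates the idea: devices publish events
# to a single gateway, and a phone subscribes for notifications.
from collections import defaultdict

class Gateway:
    """The single Internet gateway that appliances and phones connect to."""
    def __init__(self):
        self.subscribers = defaultdict(list)  # topic -> callbacks

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, message):
        for callback in self.subscribers[topic]:
            callback(topic, message)

hub = Gateway()

# The phone registers for notifications from any appliance.
hub.subscribe("washer/status", lambda t, m: print(f"[phone] {t}: {m}"))

# The washing machine publishes an event when the cycle ends.
hub.publish("washer/status", "laundry done")
```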
To get an idea of what AllJoyn can do, imagine a world where your washing machine sends you a text when the laundry's done, and you can tell your coffee maker to start brewing using your smart-phone. In addition to an appliance and gadget connectivity network, AllJoyn's also rolling out an open source, wireless audio streaming protocol.
Like AirPlay or Sonos' wireless technology, it allows users to stream music from mobile devices to any set of AllJoyn-enabled speakers.
But, unlike those closed competitors, AllJoyn's solution is open source and freely available to speaker and stereo manufacturers.
Intrigued? You can see an AllJoyn-enabled coffee maker and the AllJoyn-compatible DoubleTwist app do some music streaming in our video.
Where the Highest Paying Cloud Computing Jobs Are
Excerpted from Forbes Report by Louis Columbus
Using analytics to better understand the cloud computing job market is fascinating.
One of the most advanced companies in this area is Wanted Analytics, which aggregates job postings from over 500 job boards and maintains a database of over 600 million unique job listings.
They specialize in business intelligence for the talent marketplace, providing insights into how one company's salary range for a given position compares to competitors', and calculating how difficult it is to hire a given type of candidate. They've developed a unique Hiring Scale to accomplish this.
I recently had a chance to test-drive their analytics applications. Setting the parameters to analyze all cloud computing jobs that pay $100,000 a year or more, I ran several queries. Key takeaways include the following.
San Jose-Sunnyvale-Santa Clara, CA leads the metropolitan statistical areas (MSAs) with a salary range of $118K to $144K and one of the highest Hiring Scale scores, 81, meaning it is very difficult for employers to find candidates who are qualified for their open positions.
Bridgeport-Stamford-Norwalk, CT is next with a salary range of $117K to $143K and a Hiring Scale score of 75.
San Francisco-Oakland-Fremont, CA shows a salary range of $114K to $140K and a relatively high Hiring Scale score of 88. Salary ranges for cloud computing professionals charted by MSA are shown here.
Professional, Scientific and Technical Services (31%), Information Technologies (30%) and Manufacturing (12%) lead the top ten industries hiring cloud computing professionals in positions paying $100K or more. Wanted Analytics uses the NAICS taxonomy to organize this area of their database.
A total of 5,299 positions are open today for Computer Software Engineers, Applications and Architects, as shown here. What is surprising is the rapid increase in Marketing Managers (1,076 positions); Sales Representatives, Wholesale and Manufacturing, Technical and Scientific Products (576 positions); and Sales Engineers (452 positions).
Wanted Analytics uses the Standard Occupational Classification (SOC) taxonomy to organize this area of their database. The results are shown here.
Coming Events of Interest
European Cloud Computing Conference - March 7th in Brussels, Belgium. This 2nd annual event will provide a platform to hear from key policymakers and stakeholders, discuss opportunities offered by the technology, and examine steps to be taken so that Europe can fully take advantage of the benefits provided by the cloud.
2013 Symposium on Cloud and Services Computing - March 14th-15th in Tainan, Taiwan. The goal of SCC 2013 is to bring together, researchers, developers, government sectors, and industrial vendors that are interested in cloud and services computing.
2013 NAB Show - April 4th-11th in Las Vegas, NV. Every industry employs audio and video to communicate, educate and entertain. They all come together at NAB Show for creative inspiration and next-generation technologies to help breathe new life into their content. NAB Show is a must-attend event if you want to future-proof your career and your business.
CLOUD COMPUTING CONFERENCE at NAB Show - April 8th-9th in Las Vegas, NV. Learn the new ways cloud-based solutions have accomplished better reliability and security for content distribution. From collaboration and post-production to storage, delivery, and analytics, decision makers responsible for accomplishing their content-related missions will find this a must-attend event.
Digital Hollywood Spring - April 29th-May 2nd in Marina Del Rey, CA. The premier entertainment and technology conference. The conference where everything you do, everything you say, everything you see means business.
CLOUD COMPUTING EAST 2013 - May 20th-21st in Boston, MA. CCE:2013 will focus on three major sectors, GOVERNMENT, HEALTHCARE, and FINANCIAL SERVICES, whose use of cloud-based technologies is revolutionizing business processes, increasing efficiency and streamlining costs.
P2P 2013: IEEE International Conference on Peer-to-Peer Computing - September 9th-11th in Trento, Italy. The IEEE P2P Conference is a forum to present and discuss all aspects of mostly decentralized, large-scale distributed systems and applications. This forum furthers the state-of-the-art in the design and analysis of large-scale distributed applications and systems.
CLOUD COMPUTING WEST 2013 — October 27th-29th in Las Vegas, NV. Three conference tracks will zero in on the latest advances in applying cloud-based solutions to all aspects of high-value entertainment content production, storage, and delivery; the impact of cloud services on broadband network management and economics; and evaluating and investing in cloud computing services providers.