August 13, 2012
Volume XL, Issue 7
Kaltura Sponsors CLOUD COMPUTING WEST 2012
The DCIA and CCA proudly announce that Kaltura has signed on as a sponsor of the CLOUD COMPUTING WEST 2012 (CCW:2012) business leadership summit taking place November 8th-9th in Santa Monica, CA.
Kaltura provides the world's first and only Open Source Online Video Platform, which combines industry-leading media management tools with a framework for developing custom applications. In practice, this means that Kaltura provides its customers with flexible video solutions to cover all of their needs — both now and in the future.
But don't take Kaltura's word for it — listen to the industry. Over 150,000 web publishers, media companies, enterprises, educational institutions and service providers use Kaltura's flexible platform to enhance their websites, web services, and web platforms with advanced video functionalities.
When it comes to deployment, Kaltura is the only video platform on the market to offer a wide range of deployment options. Its full software-as-a-service (SaaS) solution includes video hosting, streaming, cross-platform delivery, transcoding, analytics, and support and maintenance services.
Kaltura's commercial software license allows customers and partners to self-host the software behind firewalls for self-use or reselling. Its commercial offerings are supported by expert professional services for strategic design and software development.
Kaltura also offers a 100% free, community-supported version of its software and source code, available from the Kaltura developer community.
And to top it all off, Kaltura allows its customers to develop their own solutions and choose their own bells and whistles — via either Kaltura's robust and open API or Kaltura Exchange, the first marketplace for media-centric applications.
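For readers wondering what calling such an API looks like in practice, here is a minimal sketch in Python. The endpoint path, parameter names, and session token below are illustrative assumptions rather than Kaltura's documented contract; the Kaltura developer community hosts the authoritative API reference.

```python
# Minimal sketch of querying a Kaltura-style video-platform API.
# The endpoint, parameters, and session token below are illustrative
# assumptions, not Kaltura's documented interface.
import json
import urllib.parse
import urllib.request

BASE_URL = "https://www.kaltura.com/api_v3/index.php"  # assumed endpoint

params = {
    "service": "media",          # assumed: the media service
    "action": "list",            # assumed: list media entries
    "ks": "YOUR_SESSION_TOKEN",  # hypothetical session key
    "format": 1,                 # assumed: 1 = JSON response
}

url = BASE_URL + "?" + urllib.parse.urlencode(params)
with urllib.request.urlopen(url) as resp:
    entries = json.load(resp)
print(entries)
```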
Kaltura urges DCINFO readers to try out its out-of-the-box applications — or develop original solutions and workflows using its offerings. With Kaltura, the sky is the limit.
CCW:2012 will feature three co-located conferences focusing on the impact of cloud-based solutions in the industry's fastest-moving and most strategically important areas: entertainment, broadband, and venture financing.
Kaltura will participate in a panel discussion at the Entertainment Content Delivery conference within CCW:2012.
CCW:2012 registration enables delegates to participate in any session of the three conferences being presented at CCW:2012 — ENTERTAINMENT CONTENT DELIVERY, NETWORK INFRASTRUCTURE, and INVESTING IN THE CLOUD.
At the end of the first full day of co-located conferences, attendees will be transported from the conference hotel in Santa Monica to Marina del Rey Harbor, where they will board a yacht for a sunset cruise and networking reception.
So register today to attend CCW:2012 and don't forget to add the Sunset Cruise to your conference registration. Registration to attend CCW:2012 includes access to all sessions, central exhibit hall with networking functions, luncheon, refreshment breaks, and conference materials. Early-bird registrations save $200.
Ci&T Predicts Major Advances in Cloud Computing for the Second Half of 2012
With the benefits of cloud computing widely known, the technology is in the midst of major advances as it firmly solidifies itself as a standard within the enterprise.
Ci&T, a provider of value-driven web and mobile application services and software product engineering, works closely with some of the world's largest and most progressive organizations to migrate their applications to the cloud.
With such a close view into how the cloud is being used in the real world, Ci&T identified four of the most exciting advances to cloud technology this year, which will ultimately shape cloud deployments in 2013 and beyond:
Beyond Gaming, Startups and Websites - Platform-as-a-Service (PaaS) in the Enterprise: While PaaS has been used in gaming and websites for years, IT innovators have recognized that it is an enterprise-ready technology. Large, multinational enterprises are adopting the architecture for its reliability and scalability. With its high availability and developer-friendly features, PaaS allows organizations to move forward with cloud migrations quickly, economically, and with minimal changes to existing IT infrastructure.
The Most Popular Kids in IT School - Big Data and the Cloud: It comes as no surprise that companies are drowning in data; the challenge is actually deriving meaning from it. While traditional business intelligence (BI) has been used to find meaning within data, it is often limited to specific points of view and to data volumes too complex to process. The cloud is taking enterprise data analysis to the next level. With effectively unlimited computational resources, companies can create and process huge datasets quickly and easily. Additionally, since companies pay only for the computational power they use, the cloud requires a significantly lower investment than a traditional BI approach (a back-of-the-envelope sketch of these economics follows these four trends).
IaaS and PaaS - Best of Both Worlds: Increasingly, organizations have been mixing infrastructure-as-a-service (IaaS) and PaaS to capture the benefits of both. Major cloud vendors have made moves to ensure that their IaaS and PaaS offerings are complementary. As a result, software architects are able to design the best architectures for their organizations, drawing the most-needed features from each service.
The Cloud and Application Development - Perfect Together: The cloud is the most promising new advancement in software development to come along in years. Today, progressive organizations are leveraging the cloud in application development due to its speed, cost-savings and increasing reliability and security. In the next several years, cloud offerings from the major players will dominate development, while standard platforms (Java, .NET, etc.) will become secondary.
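As promised above, here is a back-of-the-envelope sketch of the pay-for-what-you-use economics behind the Big Data trend. All prices and workload figures are hypothetical assumptions, chosen only to make the arithmetic concrete.

```python
# Back-of-the-envelope cost comparison: fixed BI cluster vs. elastic cloud.
# All prices and workload figures below are hypothetical assumptions.

HOURS_PER_MONTH = 730

# Fixed cluster: sized for peak load, paid for around the clock.
fixed_nodes = 20
fixed_cost_per_node_hour = 0.50          # hypothetical amortized $/node-hour
fixed_monthly = fixed_nodes * fixed_cost_per_node_hour * HOURS_PER_MONTH

# Elastic cloud: the same 20 nodes, rented only while the nightly job runs.
job_hours_per_day = 3
cloud_cost_per_node_hour = 0.80          # hypothetical on-demand $/node-hour
cloud_monthly = fixed_nodes * cloud_cost_per_node_hour * job_hours_per_day * 30

print(f"Fixed cluster: ${fixed_monthly:,.0f}/month")   # $7,300
print(f"Elastic cloud: ${cloud_monthly:,.0f}/month")   # $1,440
```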
Report from CEO Marty Lafferty
As noted last week, even though President Obama strongly urged the US Congress to pass the Cybersecurity Act of 2012, the Democratic-controlled Senate could not agree on regulatory provisions and privacy protections, and failed to pass the measure before its summer recess.
Congressional action on cybersecurity strategy therefore is now likely delayed until after the election.
The original draft of the Senate bill contained mandatory security precautions, especially for "critical infrastructure" businesses, such as power plants and water-treatment facilities. The final version, however, made such precautions voluntary, in order not to burden businesses with unnecessary, unproven, and potentially ineffective procedures.
In addition, public advocacy groups voiced concerns that, as initially proposed, the legislation would give unchecked power to the National Security Agency (NSA) to investigate consumer activity online without express authorization or accountability.
And in the end, Congress couldn't reach a compromise on what now stands as the most recent instance of that age-old debate between security and rights.
Lawmakers basically stalled out on the cybersecurity issue.
The White House, however, hasn't ruled out issuing an executive order to demonstrate its insistence on a government-backed strengthening of the nation's defenses against cyberattacks.
"In the wake of Congressional inaction and Republican stall tactics, unfortunately, we will continue to be hamstrung by outdated and inadequate statutory authorities that the legislation would have fixed," said White House Press Secretary Jay Carney.
"Moving forward, the President is determined to do absolutely everything we can to better protect our nation against today's cyberthreats, and we will do that," he added.
The administration had already sent officials to testify at seventeen Congressional hearings and presented more than one hundred briefings to underscore the priority it places on federal action on this issue.
If President Obama issues an order on cybersecurity, it wouldn't be the first time that his administration has resorted to such executive action to bypass Congress. In fact, the President could enact many of the core provisions of the Cybersecurity Act through executive order.
In addition, many companies managing vital computer systems are already heavily regulated; and agencies that monitor their work could conceivably require that they now meet new cybersecurity standards — even without requiring specific legislative authority.
Furthermore, the Office of Management and Budget (OMB)'s in-process security standards for federal computer systems — which encompass cloud computing — could also be applied in the private sector.
And the Federal Communications Commission (FCC) has already set up a voluntary system for companies to share information about cyberthreats with one another.
A cybersecurity executive order, however, would surely be met with outrage by the US Chamber of Commerce, which lobbied strongly against the legislation, as well as by Republican interests on the Hill and others.
Opponents already accuse Obama of making unlawful power grabs with such actions on other matters, where the President similarly did not get his way with lawmakers.
And unlike Presidents Reagan and Clinton before him, Barack Obama has not demonstrated great ability to negotiate with Congress when it does not agree with him. An executive order on cybersecurity would not help that reputation. Share wisely, and take care.
Mobile Demand Driving Growth in the Cloud
Excerpted from Sys-Con Media Report by Pat Romanski
"Increasing mobile demand will drive a lot of the growth in cloud computing," stated James Strayer, VP of Product Management and Marketing at Racemi, in this exclusive Q&A with Cloud Expo Conference Chair Jeremy Geelan.
"But," Strayer continued, "there are still people running Windows NT 4 out there and there are probably even old IBM System/360s out there in production still."
Cloud Computing Journal: Just having the enterprise data is good. Extracting meaningful information out of this data is priceless. Agree or disagree?
James Strayer: Agree. I am often amazed at how many businesses don't even have the data yet, so getting it is a great first step. But the value is limited unless you can build useful information and models from it.
Cloud Computing Journal: Forrester's James Staten: "Not everything will move to the cloud as there are many business processes, data sets and workflows that require specific hardware or proprietary solutions that can't take advantage of cloud economics. For this reason we'll likely still have mainframes 20 years from now." Agree or disagree?
Strayer: Agree. Increasing mobile demand will drive a lot of the growth in cloud, but there are still people running Windows NT 4 out there and there are probably even old IBM System/360s out there in production still.
Cloud Computing Journal: The price of cloud computing will go up - so will the demand. Agree or disagree?
Strayer: Disagree on price, agree on demand. Disagree on price because it is relatively easy to spin up a public cloud these days. With readily available colo space, free hypervisors like the open source Xen and KVM, as well as open source "cloud in a box" platforms like OpenStack and CloudStack, there will likely continue to be low-cost alternatives for price-conscious consumers. That said, higher-performance enterprise-class clouds and managed clouds will likely go up moderately in price. We've seen heavy investment from big players in this space as a way to increase margin and differentiate from AWS, as unmanaged public cloud providers face growing price pressures.
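To illustrate how accessible the building blocks Strayer mentions have become, below is a minimal sketch that boots a transient guest on a free KVM hypervisor through libvirt's Python bindings. The disk image path and VM sizing are placeholder assumptions; platforms like OpenStack and CloudStack layer scheduling, networking, and multi-tenancy on top of calls much like this one.

```python
# Minimal sketch: boot a transient KVM guest through libvirt's Python
# bindings (pip install libvirt-python). The disk image path and sizing
# below are placeholder assumptions.
import libvirt

DOMAIN_XML = """
<domain type='kvm'>
  <name>demo-node</name>
  <memory unit='MiB'>1024</memory>
  <vcpu>1</vcpu>
  <os><type arch='x86_64'>hvm</type></os>
  <devices>
    <disk type='file' device='disk'>
      <source file='/var/lib/libvirt/images/demo.qcow2'/>
      <target dev='vda' bus='virtio'/>
      <driver name='qemu' type='qcow2'/>
    </disk>
  </devices>
</domain>
"""

conn = libvirt.open("qemu:///system")   # connect to the local hypervisor
dom = conn.createXML(DOMAIN_XML, 0)     # create and start a transient VM
print("Started:", dom.name())
conn.close()
```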
Cloud Computing Journal: Rackspace is reporting an 80% growth from cloud computing, Amazon continues to innovate and make great strides, and Microsoft, Dell, and other big players are positioning themselves as big leaders. Are you expecting in the next 18 months to see the bottom fall out and scores of cloud providers failing or getting gobbled up by bigger players? Or what?
Strayer: We'll see typical consolidation in the emerging market. Big players will consume other players and many smaller cloud providers that can't compete due to market saturation will fold.
Cloud Computing Journal: Please name one thing that - despite what we all may have heard or read - you are certain is NOT going to happen in the future, with Cloud and Big Data?
Strayer: It will not solve world peace or end hunger.
NASA Uses Amazon's Cloud Computing in Mars Landing Mission
Excerpted from LA Times Report by Andrea Chang
Although it boasts having "Earth's biggest selection," Amazon's reach has stretched to Mars.
Better known for being an e-commerce giant, Amazon has become a major player in cloud computing, with NASA's Jet Propulsion Laboratory (JPL) using the company's Amazon Web Services (AWS) to capture and store images and metadata collected from the Mars Exploration Rover and Mars Science Laboratory missions.
With so much large-scale data processing to be done, JPL is leading the way in the adoption of cloud computing in the federal government, said Khawaja Shams, Manager for Data Services at La Canada Flintridge-based JPL.
"At this point, JPL's data centers are filled to capacity, so we're looking for ways to cost effectively expand the computational horsepower that we have at our disposal," he said. "Cloud computing is giving us that opportunity."
Using AWS's cloud to operate the mars.jpl.nasa.gov website, Shams noted, enables JPL to get images, videos and developments to the public quickly, without having to build and operate the infrastructure in-house.
According to Amazon, AWS enabled JPL to construct a scalable Web infrastructure in only two to three weeks instead of months.
"With unrelenting goals to get the data out to the public, NASA/JPL prepared to service hundreds of gigabits/second of traffic for hundreds of thousands of concurrent viewers," Amazon said.
The mission will continue to use AWS to automate the analysis of images from the planet, giving scientists more time to identify potential hazards or areas of particular scientific interest, Amazon said.
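Amazon's description doesn't include implementation details, but the basic publish-to-cloud pattern it describes (push an image to cloud storage once, then let the public fetch it from AWS rather than from JPL's own data centers) can be sketched with the boto S3 bindings of the era. The bucket and key names here are hypothetical, not JPL's actual infrastructure.

```python
# Sketch of the publish-to-cloud pattern described above, using the
# boto S3 bindings of the era (pip install boto). Bucket and key names
# are hypothetical, not JPL's actual infrastructure.
import boto
from boto.s3.key import Key

conn = boto.connect_s3()                    # credentials from environment
bucket = conn.create_bucket("mars-rover-images-demo")

key = Key(bucket)
key.key = "msl/sol-0001/navcam-001.jpg"     # hypothetical object path
key.set_contents_from_filename("navcam-001.jpg")
key.set_acl("public-read")                  # serve directly to the public

print("Published:", key.generate_url(expires_in=0, query_auth=False))
```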
Shams noted that JPL has partnered with several providers of cloud computing in the past, including Microsoft, Google and Lockheed, evaluating each option for specific projects and needs. JPL's strategy, he said, "is to find the right cloud for the right job."
AWS began offering cloud computing to businesses in 2006 and today its platform powers hundreds of thousands of businesses in 190 countries. Companies that use AWS include Yelp, Netflix, and Pinterest.
Among AWS's benefits, the company says, is low cost (companies pay as they go, with no upfront expenses or long-term commitments); instant elasticity; an open and flexible platform; and security.
Online Video is Causing Seismic Shifts in Internet Traffic
Excerpted from Streaming Media Magazine Report by Troy Dreier
The topology of the Internet is changing, and online video is at the heart of the change. At the recent Content Delivery Summit in New York, NY, Craig Labovitz, co-founder of DeepField Networks, presented the results of years of research.
"It's really been a remarkable transition over just the last five years," noted Labovitz. "The Internet is evolving in ways that look nothing like the fundamental architecture -- how the networks interconnect, where the POPs are located, how the traffic is distributed across a backbone is completely different and changing rapidly."
What we're seeing isn't just a change to recent networking architecture, but to a standard that began long before the Internet.
"For 150 years of telephony, of telegraph, the architecture of the network basically looked the same: very hierarchical. You had regional networks — small little regional networks — you had them feeding up into larger postal telephone and telegraph (PTT), national networks. And the way traffic flowed through the network, the way interconnections worked, the way that money worked was all very similar," explained Labovitz. "It all flowed one direction and traffic went down the other. But things over the last five years really began to change quickly."
What we're seeing is an entirely new model, one that is flat where the previous one was top-down.
"The old hierarchical Internet is gone. Today, we're increasingly moving to a very flat, dense, highly interconnected network where, in fact, most of the traffic isn't flowing up along a tree to reach the tier 1s and back down. Most of the traffic today is interchange between what we've been calling the hyper-giants," said Labovitz.
For the full study results, download Labovitz's presentation.
Online Video Streams as Popular as TV during Olympics
Excerpted from IP&TV News Report by Jamie Beach
The London 2012 Olympic Games are driving an unprecedented phenomenon: online video streams of the sporting action are proving just as popular with UK viewers as broadcasts on conventional TV platforms, according to content delivery network (CDN) operator Level 3 Communications.
In a report compiled for Level 3 by Redshift Research, over half (58%) of the 2,000 UK consumers polled said they planned to watch sport online during the 2012 Olympics — the same proportion as those planning to watch some events on terrestrial and satellite TV respectively.
Even more importantly, the London Olympics are expected to act as a catalyst for lasting change in viewer behavior: 58 per cent of consumers said they would watch sport online during the Olympics, compared to 49 per cent who watch sport online as part of their normal viewing diet.
When quizzed on which technologies have had the biggest impact on sport viewing, connected devices such as mobile phones/smart-phones (16%) and tablets (14%) together made up over 30% of responses, with interactive services cited by a further 16% of respondents.
Significantly, one in five respondents stated that a poor online streaming experience affects their enjoyment of sport viewing, with this figure rising to one in three (33%) in the 18-24 age group.
The report also attempts to divine what changes are ahead in sports broadcasting, and predicts these will include more immersive and social experiences, such as stereoscopic 3D television, super high-definition (4K) TV, and the overlay of contextual information and social opportunities onto a broadcast.
Among its conclusions, the study highlights a disconnect between growing online consumption patterns and current broadband infrastructure. Level 3 observes that while the Internet was designed for users to request and send roughly equal amounts of traffic, today's traffic runs at roughly 100:1 downstream to users.
The CDN provider thus recommends a "philosophical change" in the economics of the Internet and the technology behind it: 'all-you-can-eat' services may need to be scaled back as companies charge for services and invest in necessary infrastructure.
The report can be downloaded here.
Cloud Underlies Enhanced Verizon Fleet Services
Excerpted from TeleCompetitor Report by Joan Engebretson
A cloud approach is key to two new service enhancements announced this week from Verizon Enterprise Solutions, the unit created last year to oversee business, government, and wholesale customers across the carrier's wireless and wireline assets.
The enhanced services include Fleet Control, which enables enterprise customers to keep closer tabs on fleet vehicles, and Field Force Manager, which Verizon Enterprise Solutions Director of Transportation Solutions Abdul Abdullah referred to in an interview as a "simple powerful mobile business productivity solution."
Verizon's Fleet Control offering provides functionality such as turn-by-turn directions, route optimization, and engine diagnostics, as well as tracking driver fitness and health, explained Abdullah.
The software is now available on an enterprise-grade wireless tablet device that supports bar code scanning, mag stripe reading, and signature capture. With the software running on the tablet, drivers can capture proof of delivery data in real time, Abdullah said.
An enterprise-grade tablet is sometimes referred to as a "blank slate" because it doesn't come pre-loaded with all sorts of consumer applications that may not be appropriate for an enterprise user, Abdullah explained.
Abdullah declined to provide pricing on Fleet Control, noting that individual customers typically require some level of customization.
Field Force Manager also has been made available on new devices, including various smart-phones and tablets, Abdullah explained. Previously the software ran on flip phones, which in some cases couldn't take a photo, Abdullah said.
In addition, the Field Force Manager software itself has been enhanced so that it has "better integration via web services," Abdullah said. For example, he said the software easily integrates with QuickBooks and ADP to support payroll applications, with information about the hours a driver works flowing through to the appropriate applications (a hypothetical sketch of such a hand-off appears below).
Support for signature capture and bar code scanning also has been added to Field Force Manager, Abdullah explained.
Enterprises can use Field Force Manager for between $15 and $25 per device per month, including the airtime usage the offering requires, Abdullah said. Pricing for individual devices varies by device but is in line with what customers pay for those devices when purchasing traditional service contracts.
If a driver will be using the smart-phone or tablet to support other applications such as accessing the Internet, the enterprise customer will also need to purchase a plan to cover the usage charges associated with those applications.
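On the integration point Abdullah raised, Verizon hasn't published the interface, so the following is a purely hypothetical sketch of the kind of web-service hand-off described: posting a driver's logged hours as JSON to a payroll endpoint. Every name in it (endpoint, fields, token) is invented for illustration and reflects neither Verizon's nor QuickBooks' nor ADP's actual APIs.

```python
# Purely hypothetical sketch of an hours-to-payroll web-service hand-off.
# The endpoint, field names, and token are invented for illustration and
# do not reflect Verizon's, QuickBooks', or ADP's actual interfaces.
import json
import urllib.request

record = {
    "driver_id": "D-1042",
    "date": "2012-08-10",
    "hours_worked": 8.5,
    "source": "field-force-manager",
}

req = urllib.request.Request(
    "https://payroll.example.com/api/timesheets",    # hypothetical endpoint
    data=json.dumps(record).encode("utf-8"),
    headers={"Content-Type": "application/json",
             "Authorization": "Bearer YOUR_TOKEN"},  # hypothetical auth
)
with urllib.request.urlopen(req) as resp:
    print("Payroll service responded:", resp.status)
```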
Verizon has been vigorously pursuing opportunities in the cloud market, launching cloud-based services as diverse as infrastructure-as-a-service, telematics and unified communications.
And at least two key Verizon acquisitions were motivated, at least in part, by the desire to pursue cloud opportunities — including the company's 2011 purchases of CloudSwitch and Terremark.
Google Ventures Invests In Cloud Signature Platform
DocuSign, the Global Standard for eSignature, announced that Google Ventures has joined in the company's most recent financing round, bringing the total Series D funding to $55.7 million. Google Ventures joins investment lead Kleiner Perkins Caufield & Byers, along with Accel Partners, Comcast Ventures, SAP Ventures, and a large global institutional investor in this round. The company will use the funds to accelerate growth of the DocuSign Global Network via increased customer-focused R&D, deeper vertical industry solutions, and faster international expansion.
DocuSign's secure, cloud-based platform helps consumers and businesses of all sizes and industries to complete transactions faster by eliminating the hassles, costs, and lack of security inherent in printing, faxing, scanning, and overnighting documents to capture information, payments and signatures. Companies use DocuSign to create better customer experiences and save money by automating and streamlining their business processes.
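The announcement describes the workflow rather than the wire format, but the core eSignature pattern (upload a document, name the signers, let the service collect the signatures) can be sketched as below. The endpoint and payload fields are hypothetical illustrations, not DocuSign's actual API; its developer documentation defines the real interface.

```python
# Hypothetical sketch of requesting a signature via an eSignature-style
# REST API. The endpoint and payload fields are invented for illustration
# and are not DocuSign's actual API.
import base64
import json
import urllib.request

with open("contract.pdf", "rb") as f:
    document_b64 = base64.b64encode(f.read()).decode("ascii")

envelope = {
    "subject": "Please sign the attached contract",
    "documents": [{"name": "contract.pdf", "content": document_b64}],
    "signers": [{"name": "Pat Example", "email": "pat@example.com"}],
}

req = urllib.request.Request(
    "https://esign.example.com/api/envelopes",   # hypothetical endpoint
    data=json.dumps(envelope).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print("Envelope created:", json.load(resp))
```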
"The investment by Google Ventures highlights DocuSign's value as a tech disruptor across the web and mobile platforms, from consumers to global enterprises," said Keith Krach, DocuSign Chairman and CEO. "DocuSign has become the global standard for eSignature by building a viral network of more than 20 million users that attracts 60,000 new users every day."
"Electronic signatures are being rapidly adopted by enterprises, small businesses and consumers worldwide due to their convenience, security and ease of use," said Karim Faris, Partner, Google Ventures. "DocuSign's market momentum, deep technology, and strong team attracted us to them and we are excited to be working with the company as they scale their business worldwide extending their reach to hundreds millions of consumers."
Global enterprises, business departments, individual professionals and consumers are standardizing on DocuSign. Today, that network includes 20 million users who have DocuSigned more than 150 million documents in 188 countries — including employees at 90 percent of the Fortune 500.
DocuSign is used to finish business faster across nearly every industry — from financial services, insurance, technology, healthcare, manufacturing, communications, real estate, and consumer goods to higher education and others — as well as every business department, including sales, finance, operations, procurement, HR/staffing, legal, and customer support.
DDN Announces Rapid Uptake of Award-Winning Technology in Australia
DataDirect Networks (DDN), the leader in massively scalable storage, this week announced the broad adoption of DDN technology within Australia's Super Science Initiative and the selection of its massively scalable parallel file storage technology by many of the Initiative's largest organizations.
Recent high-scale deployments include sales to the Commonwealth Scientific and Industrial Research Organization (CSIRO), the Australian Bureau of Meteorology, the Australian National University, the University of Western Australia (UWA) and the Victorian Life Sciences Computation Initiative (VLSCI). This year alone more than 20 petabytes of high-performance, high-efficiency DDN SFA storage have been purchased by these internationally renowned research institutions as the region makes significant investments into its data-intensive computing infrastructure.
"As we enter an era marked by new technologies that enable researchers and data scientists to extract value from massive amounts of information, the work our customers do promises to change the way society lives and works," said Justin Glen, DDN Director of Sales for Australia and New Zealand.
"With more than a decade of experience in resolving challenges at massive scale, DDN is happy to play a central role in Australia's success as a center of Big Data-driven innovation."
Customers that have recently purchased DDN data-intensive storage include:
In partnership with IBM, VLSCI at the University of Melbourne (currently home to the fastest computer in the southern hemisphere) investigates genomics, biology, and other life sciences to improve diagnostics, find new drug targets, refine treatments, and further the understanding of major diseases.
CSIRO is Australia's national science agency and one of the largest and most diverse research agencies in the world. CSIRO facilitates research in the areas of climate change, energy, environment, mineral resources, and other scientific disciplines.
"CSIRO uses DDN for large-scale storage of scientific data sets," said David Toll, Chief Information Officer of CSIRO. "The performance and density of DDN's storage has allowed our scientists to do things that were not possible before."
In partnership with SGI, UWA supports Big Data research in disciplines including geoscience, nanoscience, and radio astronomy.

The National Computational Infrastructure (NCI), hosted at the Australian National University (ANU), is deploying DDN storage to power its new Fujitsu petaflop system. When deployed, the DDN storage subsystem will have a capacity of over 12 petabytes and will be capable of achieving over 120 GB/sec of bandwidth through its file system to power one of the world's fastest supercomputers.
"NCI will be using DDN storage to provide Australian researcher with world-class high-end computing services," said Professor Lindsay Botten, Director of the National Computational Infrastructure at ANU.
In partnership with IBM, DDN has also recently deployed a large system at the Australian Bureau of Meteorology to support the bureau's work in developing and delivering regular forecasts and warnings that cover the Australian region and Antarctic territory.
Internet Archive Partners with BitTorrent to Serve 1 Petabyte of Favorites
Excerpted from International Business Times Report by Valli Ramanathan
The Internet Archive, a non-profit project established to act as an online library for digital information, has to date made nearly a petabyte of content available via the BitTorrent file-sharing protocol, ComputerWorld reported.
The Internet Archive is offering 1.5 million torrents via BitTorrent, including live music concerts, the Prelinger movie collection, the LibriVox audiobook collection, feature films, old-time radio, more than 1.2 million books, and "all new uploads from patrons who are into community collections," a blog post on the Internet Archive stated.
The Internet Archive intends to continue offering the content it serves via BitTorrent, ComputerWorld added.
The California-based non-profit organization set up the site to store Internet images, video, audio and web pages.
"I supported the original creation of BitTorrent because I believe in building technology to make it easy for communities to share what they have. The archive is helping people to understand that BitTorrent isn't just for ephemeral or dodgy items that disappear from view in a short time," John Gilmore, Founder, Electronic Frontier Foundation (EFF) stated in the non-profit's blog post.
BitTorrent is the fastest way to download items from the archive because the BitTorrent client downloads simultaneously from two different archive servers located in two different data centers, and from other archive users who have downloaded these torrents already, the site stated.
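The dual-source downloading described above relies on web seeding (BEP 19): each torrent the archive publishes can list plain HTTP mirrors that clients draw from alongside ordinary peers. As a minimal sketch, the bare-bones bencode parser below (written here for illustration) reads that "url-list" field out of a downloaded .torrent file; the filename is a placeholder.

```python
# Minimal bencode parser, written here for illustration, used to read the
# "url-list" web-seed field (BEP 19) from a .torrent file. The filename
# below is a placeholder.

def bdecode(data, i=0):
    """Decode one bencoded value from bytes, returning (value, next_index)."""
    c = data[i:i + 1]
    if c == b"i":                          # integer: i<digits>e
        end = data.index(b"e", i)
        return int(data[i + 1:end]), end + 1
    if c in (b"l", b"d"):                  # list or dict: l...e / d...e
        i += 1
        items = []
        while data[i:i + 1] != b"e":
            value, i = bdecode(data, i)
            items.append(value)
        if c == b"l":
            return items, i + 1
        return dict(zip(items[::2], items[1::2])), i + 1
    sep = data.index(b":", i)              # byte string: <length>:<bytes>
    length = int(data[i:sep])
    return data[sep + 1:sep + 1 + length], sep + 1 + length


with open("archive-item.torrent", "rb") as f:
    meta, _ = bdecode(f.read())

# Web seeds: HTTP servers the client may download from alongside peers.
seeds = meta.get(b"url-list", [])
if isinstance(seeds, bytes):               # BEP 19 allows a single URL string
    seeds = [seeds]
for seed in seeds:
    print(seed.decode("utf-8"))
```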
The Internet Archive has web pages hosting varying content. For instance, the archive's Wayback Machine shows what websites looked like in the past, such as what The New York Times website looked like in the year 2000.
The archive also showcases television coverage of important news events, including 9/11: it hosts over 3,000 hours of 9/11 coverage comprising 20 channels of international television news.
Microsoft Azure to Host China ISP's New Online TV Service
Excerpted from CNET News Report by Charlie Osborne
This week, technology giant Microsoft and Chinese Internet service provider PPTV signed a partnership deal that will bring a new web TV service to Microsoft's Windows Azure platform.
The deal, Microsoft's first cloud-based collaboration with a local Chinese new media company, will see PPTV's Asia TV Networks (ATN) launched using Azure as its core infrastructure, while the two companies have also agreed to explore further opportunities to work together in online TV and other areas.
ATN is intended as a "local showcase" for subscription online television, according to PPTV. Content can be uploaded by global content providers, and then can be licensed to service providers with a revenue sharing model.
Chuang Tao, chief executive of PPTV, said, "With Windows Azure, we can offer scale and flexibility not found anywhere else. We can also avoid unpredictable resource and demand fluctuations globally. PPTV ATN can thus pass on these benefits to our partners, and in the end to our global customers. This is a new competitive edge for our business."
The signing ceremony took place in Shanghai, attended by Tao; Zhengyi Liu, the deputy district head of Shanghai Pudong New Area District; and Ya-Qin Zhang, Microsoft Corporate Vice President and Chairman of Microsoft Asia-Pacific R&D Group.
PPTV is currently the largest online television service operating in China, featuring television shows, sports, entertainment, and news content.
Allot Grabs Oversi for $16 Million
Excerpted from CED Magazine Report by Brian Santo
Allot Communications said it is acquiring Oversi Networks, which specializes in media caching and content delivery solutions for Internet video and peer-to-peer (P2P) traffic, for $16 million upfront.
Allot could end up paying as much as $5 million more, contingent on performance.
Allot expects the Oversi product portfolio to complement its Allot Service Gateway; the combined product set will enable both video optimization and caching, allowing fixed and mobile service providers to more effectively manage the increasing video traffic on their networks.
Allot's fixed line customers report that video traffic currently uses more than 50 percent of bandwidth and is rapidly increasing.
On the wireless side, according to Allot's latest Global MobileTrends report, video now represents 42 percent of mobile data traffic worldwide.
Oversi's caching and acceleration solutions are designed to help both fixed and mobile service providers relieve the network congestion associated with Internet video traffic. Its technology identifies popular content and caches it at the edge of the network, with the aim of saving bandwidth and minimizing delays in video delivery to improve subscribers' quality of experience (QoE) for video.
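Oversi's technology is proprietary, but the core idea the paragraph describes (keep recently popular objects close to subscribers and answer repeat requests from the edge instead of the backbone) can be sketched with a simple least-recently-used cache:

```python
# Toy sketch of the edge-caching idea described above: serve repeat
# requests for popular content locally, evicting the least-recently-used
# item when the cache fills. Real systems like Oversi's add popularity
# detection, cache hierarchies, and protocol awareness on top.
from collections import OrderedDict

class EdgeCache:
    def __init__(self, capacity, fetch_from_origin):
        self.capacity = capacity
        self.fetch = fetch_from_origin   # callable: content_id -> bytes
        self.store = OrderedDict()       # content_id -> bytes, in LRU order

    def get(self, content_id):
        if content_id in self.store:
            self.store.move_to_end(content_id)   # mark as recently used
            return self.store[content_id]        # cache hit: no backbone trip
        data = self.fetch(content_id)            # cache miss: go upstream once
        self.store[content_id] = data
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)       # evict least-recently-used
        return data

# Usage: cache = EdgeCache(1000, fetch_from_origin=download_video_chunk)
```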
Rami Hadar, Allot's President and CEO, said, "Over the past year, we have seen an increase in customer interest in video caching solutions, which made clear to us the critical need for Allot to own a leading solution in this growing market instead of reselling a third-party offering."
Allot also announced Q2 2012 earnings this week. Total revenues for the second quarter of 2012 reached $26.4 million, a 43 percent increase from the $18.5 million reported for the second quarter of 2011 and a 9 percent increase from the $24.2 million reported for the first quarter of 2012.
Cloud Services Sector Grows in IT Outsourcing, Gartner Says
Excerpted from Silicon Republic Report
Worldwide spending for IT outsourcing services is on track to reach $251.7 billion this year, a 2.1 per cent increase from 2011 spending of $246.6 billion, Gartner reports.
Cloud compute services form the fastest-growing segment within IT outsourcing services, expected to grow 48.7 per cent this year to $5.0 billion, up from $3.4 billion in 2011.
"Today, cloud compute services primarily provide automation of basic functions. As next-generation business applications come to market and existing applications are migrated to use automated operations and monitoring, increased value in terms of service consistency, agility and personnel reduction will be delivered," said Gregor Petri, Research Director at Gartner.
"Continued privacy and compliance concerns may, however, negatively impact growth in some regions, especially if providers are slow in bringing localized solutions to market."
Data center outsourcing (DCO) represented 34.5 per cent of the market in 2011, but the segment will decline 1 per cent in 2012, Gartner said.
"The data center outsourcing market is at a major tipping point, where various data center processing systems will gradually be replaced by new delivery models through 2016," said Bryan Britz, Research Director at Gartner.
"These new services enable providers to address new categories of clients, extending DCO from traditional large organizations into small or mid-size businesses," Britz added.
The application outsourcing (AO) segment is expected to reach $40.7 billion, a 2 per cent increase from 2011 spending of $39.9 billion. This growth reflects enterprises' needs to manage extensive legacy application environments and their commercial off-the-shelf packages that run the business.
"Change is afoot in the AO market," Britz said. "The burdens of managing the legacy portfolio, along with the limitations of IT budgets, have shifted the enterprise buyers to be cautious and favor a more evolutionary approach to other application services, such as software-as-a-service (SaaS).
"New applications will largely be packaged and/or SaaS-deployed in order to extend and modernize the portfolio in an incremental manner. While custom applications will remain 'core' for many organizations, the trend in the next few years to SaaS enablement in the cloud will reflect in the growth of the AO outlook."
While there will be some impact from the ongoing business slowdown due to sovereign-debt issues in Europe and slowing exports in China, Gartner expects the IT outsourcing services market in the emerging Asia/Pacific region to represent the highest growth of all regions.
Spending on ITO in the Asia/Pacific region will grow 1 per cent in US dollars in 2012 and exceed 2.5 per cent growth in 2013.
How Is Big Data Faring in the Enterprise?
Excerpted from ZDNet Report by Dion Hinchcliffe
It's certainly one of the hottest new buzzwords in technology, yet the meaning of big data typically depends on whom you ask. Even so, it's clear that big data, an important reformulation of how we store and process our digital information, continues to make a big splash as a major IT trend of this half-decade.
Certainly the market estimates are optimistic, with Deloitte recently pegging the size of the market at between $1.3 billion and $1.5 billion this year, while IDC forecasts the industry will reach a whopping $16.9 billion by 2015.
But these large numbers tend to obscure the fundamental changes that currently seem to be taking place under the rubric of big data.
The first of these is the data-first ethos that's embodied by trying to tap into and process ground truth (by seeking out the best raw data) and then deriving insight from what is uncovered (domain-specific business intelligence), rather than trying to find data to support one's already-completed strategic decision making.
One of the better known examples of data-first thinking is the famous "Moneyball" story, as told in the 2003 book by Michael Lewis, relating the story of how the Oakland A's bucked tradition and switched to heavy data analysis to identify their highest performers, with considerable success.
Though only one data point, this story — and a growing list of others — is leading many to believe that data-first thinking may be the solution to many long-standing problems, helping to combat everything from crime and disease to pollution and poverty. It's also perhaps the key to resolving somewhat more mundane challenges in our businesses as well.
The second major change is the shift away from the relational data model as the definitive standard for processing information, the first such shift in over a generation. To be sure, the growing adoption of emerging platforms such as Hadoop and NoSQL-style databases in customer-facing technology is still most prevalent in web start-ups and consumer services.
Yet the petabytes and exabytes of today's data volumes in many business contexts practically demand technologies that scale well in the face of exponentially growing datasets and shrinking processing windows.
For a variety of reasons too numerous to enumerate here, the relational model has at long last encountered both a serious challenge to its hegemony and real challengers that can frequently do better at handling today's data volumes and types. And though many organizations will continue to use relational technology to build some of their big data solutions, it's no longer the only option, particularly as unstructured data is now growing much faster than classical structured data.
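For readers who haven't touched the newer platforms, the canonical illustration of the Hadoop model is a streaming word count: a mapper emits key/value pairs, the framework sorts and groups them across the cluster, and a reducer aggregates each key. A minimal sketch of the two Python scripts follows; Hadoop Streaming pipes data through them via stdin and stdout, and the same pair also runs locally as `cat input.txt | python mapper.py | sort | python reducer.py`.

```python
#!/usr/bin/env python
# mapper.py -- emit "<word>\t1" for every word on stdin. Hadoop Streaming
# runs copies of this script in parallel, one per input split.
import sys

for line in sys.stdin:
    for word in line.split():
        print("%s\t1" % word.lower())
```

```python
#!/usr/bin/env python
# reducer.py -- input arrives sorted by key, so each word's counts are
# contiguous and can be summed with constant memory.
import sys

current, count = None, 0
for line in sys.stdin:
    word, _, n = line.rstrip("\n").partition("\t")
    if word != current:
        if current is not None:
            print("%s\t%d" % (current, count))
        current, count = word, 0
    count += int(n)
if current is not None:
    print("%s\t%d" % (current, count))
```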
The third change is the move towards making big data a more operational component of the way organizations work and how externally-facing products function. While data scientists are often required to get the best outcomes, the results of their work are often applications or data appliances that are usable by just about anyone.
Just like Google enabled the layperson to query the entire contents of the web with a few keywords, the next generation of enterprise big data seems to be about connecting workers with the data landscape of their organizations in a way that doesn't typically require IT wizards in white robes.
Thus, business solutions based on big data technology must be a readily approachable end-user technology for the average line worker in order to have a sustained and meaningful business impact.
Let's take a look at what organizations are actually reporting when it comes to big data implementation and usage today. Looking at a broad cross section of companies both large and small, the O'Reilly Strata Conference survey published a useful breakdown this year of what its attendees were doing with the technology: 18% already had a big data solution; 28% had no plans at this time; 22% planned to have a big data solution in six months; 17% planned to have a big data solution in 12 months; and 15% planned to have a big data solution in two years.
Admittedly, attendees of this particular conference were more likely than average to be adopters of big data, so these numbers are a little optimistic, even given that big data is a big tent for a great many technologies that handle large data volumes and analytics.
However, the story becomes even more interesting when we look at specific sectors. For example, the insurance industry recently reported that 15-20% of insurers are actively preparing big data solutions.
Government, one of the larger potential beneficiaries of big data according to the seminal McKinsey report on the subject, is itself experiencing relatively slow adoption, with a recent survey of public sector CIOs and IT managers reporting it will take three years to start processing their data this way. If we look at function instead of industry, we can see that sales processes are likely poised to be revolutionized by big data.
A recent analysis by CSO Insights reveals that 71% of companies expect big data to have a significant impact on sales, even though only 16% are currently using it, a gap that many organizations will clearly want to close.
Acting on the large set of changes that big data entails, however, will clearly happen incrementally, yet broadly, in most companies.
Technology, process, infrastructure, and management all have to be put into place, along with the hiring of data scientists who understand your business (or can learn it). Add to that still-esoteric concepts such as DevOps, which marries the operational aspects of big data with the development aspects in order to solve business problems quickly by combining data-first analysis with just-in-time R&D and deployment.
In addition, companies will also have to deliver on a big data "stack" in the enterprise. This stack will invariably consist of the following components, designed out of a conglomerate of open source software, commercial applications, on-premises and cloud infrastructure, combined with data from just about everywhere:
Technology. In general, these seem to be breaking down into three major families, two of which are new and one of which is legacy. There are Hadoop and its variants, the NoSQL family, and relational databases which have added big data features.
Infrastructure & Development. This includes Infrastructure-as-a-Service (IaaS), Software-as-a-Service (SaaS), Data-as-a-Service (DaaS), Open APIs, DevOps, and data scientists, the latter which craft solutions from an array of internal and external components from this palette.
Big Data Applications. This list of popular application models for big data includes business intelligence, social analytics, decision support, visualization and modeling, behavioral prediction, and business process optimization (BPO), but there are many others.
Domain-Specific Solutions. Once the big data technology, infrastructure, and applications are in place, businesses must focus their efforts on extracting industry-specific value from them. Top industries and/or functions for big data (the ones most likely to benefit) include marketing, R&D, scientific/technical/engineering/mathematics (STEM), health care, financial services, retail, and insurance.
Big Data-Powered Business Processes. To be useful, big data solutions must then be incorporated into an organization's business processes including operations, line of business, and support functions. In particular, the high-value and common business processes will provide the largest ROI.
To summarize: it's still early days for this era's growing data deluge. In fact, one of the best quotes of the year about big data comes from Ben Werther, who recently observed that we're still "in the pre-industrial age of big data."
Most organizations aren't yet doing it at scale, but the writing is on the wall that significant competitive advantage can be had for those that want it. As I predicted earlier this year, social analytics will be one particularly bright spot in big data this year, and organizations already have a good array of tools and vendors to pick from.
Ultimately, the biggest challenge will be in integrating big data effectively into updated and revised business processes. Thus again, change itself will be the largest overall obstacle, as technology outpaces the ability of most organizations to absorb it.
This will likely push big data into the cloud as most organizations look for strategies to speed adoption, further hastening the cloud migration of so much of IT. This may not be a bad thing.
Coming Events of Interest
ICOMM 2012 Mobile Expo — September 14th-15th in New Delhi, India. The 7th annual ICOMM International Mobile Show is supported by the Government of India, MSME, DIT, NSIC, CCPIT China, and several other domestic and international associations. On display: new technologies, new products, mobile phones, tablets, electronics goods, and business opportunities.
ITU Telecom World 2012 - October 14th-18th in Dubai, UAE. ITUTW is the most influential ICT platform for networking, knowledge exchange, and action. It features a corporate-neutral agenda where the challenges and opportunities of connecting the transformed world are up for debate; where industry experts, political influencers and thought leaders gather in one place.
CLOUD COMPUTING WEST 2012 - November 8th-9th in Santa Monica, CA. CCW:2012 will zero in on the latest advances in applying cloud-based solutions to all aspects of high-value entertainment content production, storage, and delivery; the impact of cloud services on broadband network management and economics; and evaluating and investing in cloud computing services providers.
Third International Workshop on Knowledge Discovery Using Cloud and Distributed Computing Platforms - December 10th in Brussels, Belgium. Researchers, developers, and practitioners from academia, government, and industry will discuss emerging trends in cloud computing technologies, programming models, data mining, knowledge discovery, and software services.