July 23, 2012
Volume XL, Issue 4
CLOUD COMPUTING WEST 2012 Adds Networking Reception Cruise
Attendees of the Cloud Computing West 2012 (CCW:2012) business strategy leadership summit taking place in Santa Monica, CA on Thursday November 8th and Friday morning November 9th are in for a treat.
At the end of the first full day of three co-located conferences focusing on "Revolutionizing Entertainment Delivery, Network Infrastructure, and Investing in the Cloud," attendees will be transported from the conference hotel in Santa Monica to Marina del Rey Harbor, where they will board a yacht for a sunset cruise and networking reception.
After the first day of informative conference sessions, senior business and IT decision-makers from end-user enterprises; cloud infrastructure providers, vendors, and value-added resellers (VARs); and technology investors and entrepreneurs will have the opportunity to unwind in one of Southern California's most beautiful settings.
So register today to attend CCW:2012 and don't forget to add the Sunset Cruise to your conference registration. Registration to attend CCW:2012 includes access to all sessions, central exhibit hall with networking functions, luncheon, refreshment breaks, and conference materials. Early-bird registrations save $200.
Opportunities to participate include multiple levels of sponsorship and exhibition, as well as a variety of speaking roles, from keynotes to panels to roundtable discussions.
For more information, please contact Don Buford, Executive Director at the CCA, or Marty Lafferty, CEO at the DCIA.
EARNINGS PREVIEW: Cloud Computing Bright Spot for Software Cos
Excerpted from Wall Street Journal Report by Nathalie Tadena
While information technology (IT) spending remains muted amid a challenging global macroeconomic environment, businesses' continued investment in cloud services remains a bright growth spot for major software companies.
Earlier this month, technology research firm Gartner reported worldwide IT spending is on pace to reach $3.6 trillion this year, a 3% increase from last year. Gartner reported there has been little change in business confidence and consumer sentiment in the past quarter and noted the short-term outlook calls for continued cautious IT spending.
Analysts polled by Thomson Reuters expect IBM to report a profit of $3.42 a share and revenue of $26.31 billion. In the same period last year, the company posted a profit of $3 a share, or $3.09 a share in operating earnings, which exclude retirement-plan costs and amortization, on revenue of $26.67 billion.
IBM has benefited from its push toward higher-margin, complex businesses such as business analytics, and away from crowded fields where companies can compete only on price. While that bet has been paying off, worries have emerged that the macroeconomic environment is hurting spending on technology, and the company's revenue has been lighter than expected for the past couple of quarters.
The hardware division has seen two consecutive quarters of revenue declines and the company has warned that the division would likely have a tough comparison in the second quarter. In a recent note, UBS said it expects IBM to achieve its 2015 road map, but noted revenue growth could continue to disappoint as services and Unix technology sales come under pressure.
Analysts project Microsoft to report fiscal fourth-quarter earnings of 62 cents a share and revenue of $18.14 billion. A year earlier, Microsoft reported earnings of 69 cents a share and revenue of $17.37 billion.
Microsoft has been transitioning customers to cloud services and its forthcoming Windows 8 operating system, which is designed to work on traditional PCs and mobile devices and is critical to the company's future as it looks to better compete in the fast-growing mobile market.
However, Microsoft still needs sales of its flagship products to remain robust. Continued strength in sales of its Office suite of products to corporate customers has helped cushion the company from the impact of slumping personal computer sales.
The software heavyweight disclosed earlier this month that it will record a $6.2 billion goodwill write-down in its online services division in the quarter, connected to its 2007 acquisition of online-advertising agency aQuantive for $6.3 billion.
Microsoft said that while the online-services division has been improving, the company's expectations for its future growth and profitability are lower than previous estimates, which led to the write-down.
Analysts expect VMware to post a per-share profit of 66 cents and revenue of $1.11 billion. In the same period of last year, the company recorded earnings of 51 cents a share, or 55 cents excluding stock-based compensation, tax adjustments and other items. VMware reported revenue of $921.2 million last year.
VMware, which is majority-owned by storage vendor EMC, dominates the market for virtualization software, which allows users to run multiple computers' operations on a single machine, a key step in cloud computing. Customers that had turned to the company for software to virtualize their information systems are now buying software to build applications and run their enterprise.
The company's core virtualization software, vSphere, has continued to perform well in competition with software from Microsoft, Oracle, and open-source products. However, Wunderlich Securities, which recently cut its estimates and price targets on VMware, noted the company experienced "numerous late-quarter push-outs" in the latest period.
The firm continues to see VMware as a "key element in the deployment and management of next-generation data center infrastructure."
Wall Street predicts Citrix will record earnings of 59 cents a share and revenue of $613 million. A year earlier, Citrix posted a per-share profit of 43 cents, or 57 cents excluding stock-based compensation and other items, and revenue of $530.8 million.
Citrix, which competes with VMware in the virtualization space, has posted double-digit revenue gains over the past two years as its desktop-solutions business, which includes XenApp and XenDesktop, has seen its growth accelerate due to the increasing importance of desktop virtualization.
In a recent note, Nomura said Citrix appears to have had a good quarter in the educational segment in North America, noting large educational deals helped out its results.
Report from CEO Marty Lafferty
Five US Senate co-sponsors of cybersecurity legislation introduced the newly revised Cybersecurity Act of 2012 on Thursday in a well-meaning effort to protect national and economic security - including life-sustaining services - from increasingly troubling cyber attacks.
The latest version includes elements of a voluntary program outlined in a compromise framework drafted by a bipartisan group of Senators led by Sheldon Whitehouse (D-RI) and Jon Kyl (R-AZ).
Please click here for a summary of the revised bill, which would:
Establish a multi-agency National Cybersecurity Council (NCC) - chaired by the Secretary of Homeland Security - to lead cybersecurity efforts, including assessing the risks and vulnerabilities of critical infrastructure systems.
Allow private industry groups to develop and recommend to the Council voluntary cybersecurity practices to mitigate identified cyber risks. The standards would be reviewed and approved, modified or supplemented as necessary by the Council to address the risks.
Allow owners of critical infrastructure to participate in a voluntary cybersecurity program. Owners could join the program by showing either through self-certification or a third-party assessment that they are meeting the voluntary cybersecurity practices. Owners who join the program would be eligible for benefits including liability protections, expedited security clearances, and priority assistance on cyber issues.
Create no new regulators and provide no new authority for an agency to adopt standards that are not otherwise authorized by law. Current industry regulators would continue to oversee their industry sectors.
Permit the private sector and the federal government to share threats, incidents, best practices, and fixes, while preserving the civil liberties and privacy of users.
Require designated critical infrastructure - those systems which, if attacked, could cause catastrophic consequences - to report significant cyber incidents.
Require the government to improve the security of federal civilian cyber networks through reform of the Federal Information Security Management Act (FISMA).
In addition, provisions that improve privacy protection have been added to:
Keep the data in the hands of civilian agencies (as opposed to the National Security Agency [NSA]);
Restrict the government's use of the information to cybersecurity issues and the prevention of immediate physical harm;
Require annual reporting on the data's use;
Let Americans sue the government for abuse; and
Not undermine potential benefits of Net Neutrality ideals.
The Senators stressed that the revised Cybersecurity Act of 2012 does not affect copyrighted information on the Internet and thus in no way resembles the Stop Online Piracy Act (SOPA) or the Protect Intellectual Property Act (PIPA).
The focus of the revised Cybersecurity Act is to improve the security of systems that control the essential services that keep the nation running - for instance, power, water, and transportation networks.
In an opinion piece published by the Wall Street Journal on Thursday, President Obama urged the Senate to pass this revised bill.
The American Civil Liberties Union (ACLU) had previously argued that the information-sharing section of the bill would increase the flow of Americans' personal information to the military and NSA. But Michelle Richardson, a legislative counsel at the ACLU's Washington office, noted in a blog post that the revised bill would "ensure that companies who share cybersecurity information with the government give it directly to civilian agencies, and not to military agencies" like the NSA.
Demand Progress, which is not yet satisfied with the measure, has created a letter-writing campaign to the Senate to urge additional pro-privacy protection language. Share wisely, and take care.
Cloud Computing Market Size - Facts and Trends
Excerpted from CloudTweaks Report by Rick Blaisdell
Although estimates of the overall cloud market size vary considerably, the consensus is that cloud computing is growing rapidly. Market Research Media, cited in a Bloomberg report, says the cloud market will reach $270 billion in 2020. Forrester is less optimistic, predicting last year that the market will hit about $55 billion by 2014 and $241 billion by 2020.
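For context, Forrester's own pair of figures implies a compound annual growth rate of roughly (241/55)^(1/6) - 1 ≈ 28% a year between 2014 and 2020.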
So, what are the trends in the cloud market?
Software-as-a-service (SaaS) offers more growth opportunities than any other segment and will retain its position as a leading segment in cloud computing. Gartner tracks ten different categories of SaaS applications in its latest forecast, with CRM; ERP; and Web conferencing, teaming platforms, and social software suites being the three largest in terms of global revenue growth.
Forrester, Gartner, IDC and others have predicted significant growth in Supply Chain Management (SCM).
Infrastructure-as-a-service (IaaS) will see rapid growth in the next few years, but Forrester expects dynamic infrastructure services to perform better than IaaS in the long term. Studies conducted by Gartner also found that PaaS is the fastest-growing segment of the five included in its public cloud forecast.
According to the Gartner study, growth rates for Platform-as-a-service (PaaS) subsegments include: application development (22%), database management systems (48.5%), business intelligence platforms (38.9%), and application infrastructure and middleware (26.5%), with the last subsegment expected to be the largest revenue source in PaaS for the next four years.
However, all segments of the cloud computing market will be influenced by the overall state of the economy and global demand for IT services. Cloud computing is an attractive, growing market, particularly for SMB customers, but it offers significant potential for organizations of all sizes. Many managers are accordingly eager to implement cloud computing services and products, while others remain slower to adopt cloud-based solutions.
Fun Facts about Cloud Computing
Excerpted from FormTek Report by Dick Weisinger
Charles Babcock, editor at InformationWeek, recently included "5 Cloud Facts" as part of his cover story investigating the economics of cloud computing. Those facts came from a suite of surveys and studies done at InformationWeek:
25 percent of IT shops are using either cloud servers or storage as part of an infrastructure-as-a-service (IaaS) strategy.
11 percent of companies have a major cloud implementation.
26 percent of companies are planning a major cloud implementation.
20 percent of companies have a formal policy in place for evaluating cloud services.
38 percent of companies have expressed concerns about unanticipated cost overruns from using cloud services.
Some Fun Facts about Cloud Computing posted by Todd Nielsen at Wired Magazine's CloudLine include:
90 percent of Microsoft's 2011 R&D budget was spent on cloud computing strategy and products.
The cloud computing market is expected to reach $241 billion by 2020.
Cloud providers have increased personnel from nil in 2007 to over 550,000 in 2010.
As part of the "cloud-first" policy, 48 percent of US government agencies have begun using the cloud.
30 percent of small and mid-size businesses (SMBs) used cloud software in 2011.
Intel has a List of Cloud Facts too that includes the following:
More than 2% of electricity in the US is consumed by data centers.
Without dramatic efficiency improvements, powering data centers by 2015 is expected to require the equivalent of about 45 new coal plants with today's technology.
A new server is added to the cloud for every 600 smart-phones or 120 tablets.
ISPs Improve Delivery on Advertised and Actual Broadband Speeds
Excerpted from Broadcasting & Cable Report by John Eggerton
Internet service providers (ISPs) are delivering faster broadband speeds and delivering more fully on their promises of advertised speeds, and subscribers are benefiting from both. That is the upbeat takeaway from highlights of the FCC's second annual residential wireline broadband advertised/actual speed survey, done in cooperation with the nation's largest ISPs.
The report's bottom line is that ISPs have delivered significant improvements in only a year in key areas, including more accurate promises of performance. According to a source familiar with the FCC's planned unveiling of the study at a public meeting Thursday, as of April 2012, participating ISPs were delivering 96% of advertised download speeds during peak periods, up from 87% in March 2011.
The 2012 study found that improvements in delivering on those ISP-advertised speeds were due to improvements in network performance rather than "adjustment to the speed of tiers offered."
The study found that ISPs did not just improve their ability to deliver on what they advertised, but improved the actual bottom-line speeds customers were getting.
Because ISPs are doing a better job of meeting or beating advertised speeds, says the FCC report, consumers' actual speeds are up by almost 38%.
The average actual speed in March 2011 was 10.6 Mbps. In April 2012, that figure was 14.6 Mbps, and speeds are increasing at a faster rate.
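As a quick check of the arithmetic behind that headline figure: (14.6 - 10.6) / 10.6 ≈ 0.377, or just under a 38% gain in average actual speed over the thirteen months.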
Other takeaways: Faster speeds are resulting in greater overall consumption. Seven of the participating ISPs are offering speeds of 50 Mbps or greater, and four are offering at least 100 Mbps.
Next steps for the FCC, says the study, to keep that ball rolling: continue to encourage boosts in speed and capacity, expand to include new technology, and continue dialog with stakeholders.
DDN & ATG Enable Intel Orgs to Harness Big Data & Cloud Collaboration
Excerpted from Sys-Con Media Report
DataDirect Networks (DDN), the leader in massively scalable storage, today announced that its award-winning Web Object Scaler (WOS) cloud storage appliance is now available for purchase through the NASA SEWP IV contract, a Government-Wide Acquisition Contract (GWAC), via YottaStor, a division of Alliance Technology Group (ATG).
Sold as an integrated system within YottaStor's mobile computing and big data storage system called YottaDrive, the combined solution is ideal for government agencies that are challenged by the need to capture, store, process, and distribute massive volumes of intelligence and surveillance sensor data.
Designed to scale to over an Exabyte of global storage capacity and trillions of objects, with global data distribution and latency-optimized global access, WOS is built to enable multi-site, global organizations to connect to and collaborate on data without the bottlenecks and overhead associated with traditional file access.
DDN's WOS is used as the underlying global namespace that connects YottaDrive multi-Petabyte "Enterprise Thumb Drives" into a connected and global big data namespace — breaking down data collaboration barriers between distributed government facilities and agencies.
With performance that is also significantly higher than conventional file storage solutions, the YottaDrive architecture featuring DDN WOS appliances can capture, store and distribute any sensor data type and resolution to enable rapid decision making at any scale.
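The contrast with traditional file access is easiest to see in miniature. The toy Python sketch below shows the flat, ID-addressed access pattern that object stores of this kind use in place of hierarchical file paths; every name and method in it is a hypothetical illustration, not DDN's actual WOS API.

```python
# Toy sketch of an ID-addressed object store (all names hypothetical;
# this is NOT DDN's actual WOS API). Objects live in a flat namespace
# and are fetched by ID from any replica site, rather than through a
# hierarchical file path on one server.
import uuid

class ToyObjectStore:
    def __init__(self, sites):
        # One dict per site stands in for geographically dispersed nodes.
        self.sites = {name: {} for name in sites}

    def put(self, data: bytes, replicas: int = 2) -> str:
        """Store an object on `replicas` sites; return its object ID."""
        oid = str(uuid.uuid4())
        for site in list(self.sites)[:replicas]:
            self.sites[site][oid] = data
        return oid

    def get(self, oid: str) -> bytes:
        """Fetch from whichever site holds the object (a real system
        would pick the lowest-latency replica; this toy takes the first)."""
        for objects in self.sites.values():
            if oid in objects:
                return objects[oid]
        raise KeyError(oid)

store = ToyObjectStore(["us-east", "eu-west", "apac"])
oid = store.put(b"surveillance sensor frame 0001")
assert store.get(oid) == b"surveillance sensor frame 0001"
```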
"The Big Data era is not defined by the ability to store data, but by the ability to quickly extract actionable insight from that data," said Jeff Denworth, Vice President of Marketing, DDN.
"Our WOS solution, as a foundation of YottaStor's YottaDrive system, provides a cost-effective way for analysts and agencies to leverage a secure, geographically dispersed, high-performance big data storage cloud to maximize the value of their intelligence and surveillance data."
WOS achieves significant cost effectiveness through ease of management, economical acquisition expenses, 99% storage efficiency at the disk level, and reduced requirements for cooling and power. This improved efficiency reduces the total cost of hyperscale data management relative to traditional file-system data centers.
The YottaDrive mobile computing and big data storage system is currently available in three configurations (2.5, 5, or 10 Petabytes), with pricing starting at 8 cents per GB per month and available as low as 5 cents per GB per month.
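Assuming decimal units (1 PB = 1,000,000 GB), the entry configuration works out to roughly 2,500,000 GB x $0.08 ≈ $200,000 per month at the starting rate, or about $125,000 per month at the 5-cent rate.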
"Our customers' ability to acquire, exploit and store massive amounts of data is just emerging. The YottaDrive delivers cost per Gigabyte over 90% lower than what many of the federal agencies we work with are paying today," according to Mike Dillard, former Chairman, CIA Information Policy Board and Executive Director of YottaStor.
"This new challenge is fueled by the massive amounts of content being generated by new technologies like sensors, medical devices, and full-motion cameras as integral parts of the business process. The YottaDrive is a game-changer for highly secure environments dealing with data sets so large that they require a combination of localized storage and processing at the unit level. The YottaDrive is engineered to deliver the economics and performance of the public cloud behind the agency firewall."
Department of Defense Maps its Cloud Strategy
Excerpted from Sci-Tech Today Report by Jennifer LeClaire
The US Department of Defense (DoD) has released a cloud computing strategy all its own, aimed at moving current network applications from a duplicative, cumbersome, and costly set of application silos to an end-state designed to create a more agile, secure, and cost-effective service environment that can rapidly respond to changing needs.
DoD CIO Teri Takai said the government agency is moving to an enterprise cloud environment that provides tangible benefits across the department by supporting the delivery of the joint information environment, from the continental United States to the warfighter at the tactical edge.
"This strategy lays the groundwork, as part of the Joint Information Environment (JIE) framework, for achieving cloud adoption within the Department," Takai said. "It focuses on the creation of department core data centers, enterprise cloud infrastructure, and sustainment of cloud services."
As part of the Defense Department's cloud computing strategy, the government has named the Defense Information Systems Agency (DISA) as the enterprise cloud service broker to help maintain mission assurance and information interoperability within this new strategy.
"The Defense Department is committed to accelerating the adoption of cloud computing and providing a secure, resilient enterprise cloud environment," Takai said. "This strategy will align with all department-wide information technology efficiency initiatives, federal data center consolidation and cloud computing efforts. The result of the strategy will be improved mission effectiveness, increased IT efficiencies, and enhanced cybersecurity."
Charles King, principal analyst at Pund-IT, told us he's not surprised the Defense Department has developed a cloud computing strategy.
The MeriTalk Cloud Computing Exchange, a community of federal cloud leaders, estimates that federal agencies are already saving $5.5 billion a year via cloud implementations and that those savings could balloon to $12 billion as the cloud gains momentum in the federal government. MeriTalk also discovered that Defense Department respondents have a rosier picture of the cloud and its impact, estimating that IT budgets will decrease.
"For some time now, there's been a drumbeat around the value that cloud computing offers to businesses, and the DoD's DISA announcement proves that public sector institutions intend to embrace the cloud, as well," King said.
"That's all to the good, and the decision to name DISA as the DoD's cloud service broker is a smart one since it should help ensure that cloud strategies and implementations will follow a common set of rules and guidelines, making successful outcomes more likely."
Huawei Cloud Storage Passes Performance Tests of CERN
Excerpted from Telecom Tiger Report
Huawei, a leading global information and communications technology (ICT) solutions provider, on Wednesday said that its cloud storage system has passed the performance test of the European Organization for Nuclear Research (CERN).
CERN is the world's largest particle physics laboratory. The CERN data center, also known as the Worldwide LHC Computing Grid (WLCG) Tier-0, is at the core of a global computing resource which enables the storage and analysis of more than 20 PB of Large Hadron Collider (LHC) data per year.
CERN openlab was created to develop innovative and advanced IT systems for use by the LHC community by bringing together the efforts of science and industry. The scalability of its storage system is important for CERN as the laboratory faces the ever-increasing demands of its physics users. This massive data growth has prompted CERN to evaluate new storage technologies.
Huawei, with industry-leading storage experts and significant technical experience, is committed to the development of innovative storage solutions in the field of next-generation cloud storage. By joining CERN openlab as a contributing member, Huawei focuses on investigating the applicability of new storage techniques and architectures to the processing of high-energy physics data from the LHC experiments.
In early 2012, Huawei's cloud storage system was delivered to the CERN site, and within three months the installation and benchmark performance evaluation were completed. Huawei's cloud storage demonstrated excellent data writing and reading performance in large-scale data environments, along with horizontal scalability.
The system also provides self-healing, intelligent maintenance, which significantly reduces maintenance costs and effectively enhances the storage system's availability and reliability. The test validation results demonstrate that this innovative hardware and software architecture complies with the laboratory's mass-storage requirements.
"CERN is hitting the technology limits for resource-intensive simulations and analysis. Our collaboration with Huawei shows an exciting new approach, where their novel architecture extends the capabilities in preparation for the Exascale data rates and volumes we expect in the future." said Bob Jones, head of CERN openlab.
"Establishing the link with CERN openlab gave us a fantastic opportunity to further develop our cloud storage products, and proved their worth in the extreme scientific research and mass data environment." James Hughes, Chief Architect of Cloud Storage, Huawei said.
Microsoft Revamps Office, Looks to the Cloud
Excerpted from Reuters Report by Edwin Chan and Noel Randewich
Microsoft unveiled a new version of its Office suite tailored for tablets and other touch-screen devices, in the company's largest-ever overhaul of the workplace software it relies on for much of its profit.
The revamped Office, touted by Microsoft Chief Executive Officer (CEO) Steve Ballmer on Monday as the most ambitious version so far, takes advantage of cloud computing and is designed for use with the upcoming Windows 8 operating system.
It makes use of cloud computing by storing documents and settings on the Internet by default, and is compatible with touch-screens widely used in tablets. It also incorporates Skype, the video-calling service Microsoft bought for $8.5 billion in 2011.
The latest version of Office comes as Apple and Google make inroads into the workplace, long Microsoft's stronghold. Office is Microsoft's single-biggest profit driver.
"The Office that we'll talk about and show you today is the first round of Office that's designed from the get-go to be a service," Ballmer said at a news conference. "This is the most ambitious release of Microsoft Office that we've ever done."
Microsoft has a lot riding on the 15th version of Office. Windows is one of the world's biggest computing platforms, and the Office applications — Word, Excel, PowerPoint, and other tools — are used by more than 1 billion people around the world.
The world's largest software company has been slow to adapt to a boom in mobile devices and cloud computing.
Microsoft is hoping corporate IT managers will fork over the cash to upgrade internal software just when global tech spending is looking shakier than it has in years. The 2013 picture is uncertain, but budgets are expected to tighten amid Europe's economic crisis and a deceleration in the Chinese economy.
The company last updated Office in 2010, when it incorporated online versions for the first time. The full version of Office 15 is expected to be available in early 2013.
Cloud computing refers to a growing trend toward providing software, storage and other services from remote data centers over the web instead of relying on software or data installed on individual PCs.
"Your modern Office thinks cloud first. That's what it means to have Office as a service," Ballmer said, adding that a preview version of the software is now available online.
Documents in the sleeker-looking Office can be marked up by writing on a touch screen with a stylus. The suite will be compatible with tablets that use Windows 8 — due for release in October.
Screens appear less cluttered with icons and menus than in current Office applications.
Microsoft did not say whether it planned to launch versions of Office compatible with Apple's iPad or tablets running Google's Android platform.
The software package is now integrated with Skype, the voice-over-IP service, letting users collaborate on documents through video conferences.
While past versions of Office saved documents on PCs' hard drives, the new Office uses Microsoft's online "SkyDrive" service for default storage. Documents can be shared on Facebook or published as blogs.
"The Windows 8 launch is right around the corner, and we have a lot to do ... In a sense, it feels to us a lot like 1995," Ballmer said, referring to Microsoft's Windows 95 operating system which was a significant step forward at the time.
"We have the most exciting, vibrant version of Windows in years," Ballmer said. He did not disclose pricing plans for the new Office.
Google has been pushing hard to persuade Office users to switch to Google Docs, an applications suite running on Google's servers and accessible on the cloud and mobile devices. Apple has also been trumpeting the ability of its iPad — the dominant tablet in the market today — to perform clerical duties.
"This puts Microsoft even farther ahead of Apple in terms of product richness. But it still leaves the door open to competition from Google pursuing a strategy that's cross-device, cross-platform - and is free," said Sarah Rotman Epps, an analyst at Forrester.
Oracle Adds More Apps to its Cloud with Skire Acquisition
Excerpted from eWeek Report by Chris Preimesberger
Oracle, which has been on an acquisition tear lately, said July 19th that it has bought most of the assets of business software developer Skire, which makes a suite of capital program management and facilities management applications.
Skire's applications, which Oracle will add to its growing collection of choices to be made available through its new Oracle Cloud, can also be used in on-premise deployments. Terms of the transaction were not revealed by Oracle.
Skire's software consists of a set of management and governance tools that span all project phases from planning and building to operations, enabling companies to better manage their capital and construction programs.
By combining Skire with its Primavera products, Oracle intends to create a full lifecycle Enterprise Project Portfolio Management (EPPM) platform that provides a comprehensive offering from capital planning and construction to operations and maintenance for owners and operators, contractors and subcontractors, the company said.
The full suite of apps is designed to help organizations manage their projects with more predictability and financial control, improving profitability and operational efficiency.
Cisco Buys Software Security Firm Virtuata
Excerpted from Wall Street Journal Report by Ben Rubin
Cisco Systems said it bought Virtuata, a privately held security software firm, giving the highly acquisitive networking company new tools related to cloud computing and data center infrastructure.
Cisco is in the midst of a turnaround after restructuring last year to focus on core product areas such as routing and switching gear that shuttle data between computers. Cisco has said it is benefiting from telecommunications and other companies' need for more robust networks to support mobile and cloud computing.
The company said its newest deal "is well-aligned to our strategic goals to develop innovative virtualization, cloud and security technologies, while also cultivating top talent."
Financial details of the deal weren't disclosed. The Virtuata team, based in Milpitas, CA, will join Cisco's Data Center Group.
Historically one of Silicon Valley's most active buyers, Cisco has focused more recently on acquiring start-ups and small companies. It recently agreed to acquire Truviso, a real-time network data analysis and reporting software maker, and in March, said it would buy ClearAccess, a maker of customer-premise equipment management software.
The company earlier this year also agreed to acquire video-software maker NDS Group Ltd. for $4 billion, the company's biggest deal in more than two years and a reflection of Cisco's focus on video.
In May, Cisco said its fiscal third-quarter earnings rose, its second-straight quarter of improved earnings, as the company's margins and revenue continued to strengthen.
Virtustream Aligns with SafeNet
Excerpted from Cloud Computing Journal Report by Liz McMillan
Virtustream and SafeNet on Tuesday announced that they have entered into an agreement that will incorporate SafeNet's market-leading authentication solutions into Virtustream's enterprise cloud platform, xStream.
"Enterprises looking to deploy the cloud want both enterprise-grade performance and security, while still benefiting from the scalability and economics of multi-tenant virtualization technology," said Dr. Shaw Chuang, Executive Vice President of Engineering, Virtustream. "Incorporating SafeNet's authentication platform brings another best-of-breed offering to our platform and lays the foundation for further extensive security capabilities."
The rapid growth and evolution of the cloud market is generating new vulnerabilities, and evolving business requirements are calling for strong authentication for enterprise cloud solutions. The incorporation of SafeNet Authentication Manager into xStream brings secure cloud access from tablets and mobile devices via two-factor identity authentication. The new capability provides a hybrid identity cloud model and centrally-managed authentication services, extending the foundation for enterprise-level authentication, authorization, and auditing on the xStream platform.
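As general background on what a second factor adds beyond a password, the Python sketch below implements a standard RFC 6238 time-based one-time password (TOTP), the mechanism behind many authenticator tokens. It illustrates the generic technique only and assumes nothing about the internals of SafeNet Authentication Manager.

```python
# Generic RFC 6238 time-based one-time password (TOTP), shown only to
# illustrate how a second authentication factor works in principle.
# This is not SafeNet's implementation.
import hashlib
import hmac
import struct
import time

def totp(secret: bytes, interval: int = 30, digits: int = 6) -> str:
    """Derive a short-lived code from a shared secret and the clock."""
    counter = int(time.time()) // interval  # 30-second time step
    msg = struct.pack(">Q", counter)        # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F              # RFC 4226 dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Server and device share `secret` at enrollment; both can compute the
# same code independently, so presenting it proves device possession.
print(totp(b"secret-provisioned-at-enrollment"))
```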
"We're excited to have Virtustream leverage our data protection solutions to the xStream enterprise cloud," said Chen Arbel, Director, Business Development at SafeNet. "SafeNet's Authentication Manager enables customers such as Virtustream to easily extend their security infrastructure to keep pace with new and emerging issues, ensuring that data remains protected, whether it is in the enterprise or in the public or private cloud."
Virtustream is also a member of the SafeNet System Integrators and Alliance Partners program, through which the two companies will work together to offer their integrated solutions to meet the growing cloud and security challenges of enterprises worldwide.
xStream is the industry's first cloud solution to use Virtustream uVM (Micro-VM) technology to provide efficiency that extends significantly beyond traditional virtualization, and enables a truly consumption-based pricing model.
The xStream platform supports complex enterprise IT systems, providing companies with a robust, secure cloud environment built from the ground up for enterprise applications.
The Virtustream cloud platform enables customers to move both legacy and Web-scale applications to the cloud, offering industry-leading application-level SLAs in a highly secure cloud with built-in data protection and disaster recovery so that they may benefit from compelling cloud economics.
The Battle for BitTorrent's Soul
Excerpted from WebDev360 Report by Elliot Bentley
We've just passed the 11th anniversary of the introduction of BitTorrent, a file-sharing protocol which has become the scourge of the media industries. Bram Cohen's creation made it easier than ever to share huge files across the Internet, making infringement of films and TV shows as common as music infringement.
In 2004, Cohen founded BitTorrent, Inc., which has remained the respectable face of the technology for the past eight years by promoting legal sources of torrent content. However, it has faced a constant uphill battle, and recent developments have highlighted the technology's split personality - and the ideological battle between the company and the community for BitTorrent's soul.
Boxopus made headlines at the end of last month with a service able to download the content of torrent files on its own servers and send it directly to your Dropbox. By handling the download process "in the cloud," Boxopus promised to make it simpler and safer.
Just two days later, Dropbox blocked Boxopus - presumably fearing that their service would become known as a harbor for infringers if Boxopus took off. The project is now looking into other services to connect to instead, likely Google Drive if a recent user survey is anything to go by. While free in beta, Boxopus plans to charge for bandwidth use in the future.
Customers are essentially paying for the ability to download a torrent from anywhere, with the promise of anonymity.
Regardless of whether it's effective, Boxopus is yet another example of time and money being sunk into making BitTorrent more resilient to shutdown by the authorities. This version of the technology's future is of the ultimate infringement tool: safe to use and forever active.
BitTorrent's other future, and the one promoted by BitTorrent, Inc., is expanding the technology beyond unauthorized downloading of copyrighted materials. After all, there are plenty of lawful uses already.
BitTorrent as a protocol has been adopted by other companies needing to distribute large files: Facebook uses it to push updates out to its servers, while Blizzard (developer of World of Warcraft and Diablo III) uses a proprietary BitTorrent client to patch its games.
Yet despite these high-profile uses, BitTorrent is still a dirty word, synonymous with infringement. No wonder the company that shares its name with the protocol is looking to clean up its reputation.
If you want to see its vision of BitTorrent's future, just take a look through a couple of pages of its blog: lawfully distributed movies and albums, BitTorrent-based back-up services, and the recently-announced Torque. Most indicative of its future direction is a new desktop app for sending large files between individuals using BitTorrent technology, released in January with the working title "Share." As the pitch reads:
Have you ever been stuck trying to send an HD home movie to a friend over the Internet? Or a batch of high-resolution photos? How about longer smart-phone videos?
It's not easy. You can try a complicated FTP service. Or pay big fees for a file sharing or cloud service. Or dramatically reduce the size, quality or length of your creation to send via e-mail or social networks.
Sending large files to a friend via BitTorrent makes perfect sense, and it's a great use of the technology. BitTorrent, Inc. sees this as such an important use of its protocol that it intends to incorporate it into uTorrent and other clients. There's potential for piracy, sure, but it's no greater than with standard e-mails.
It seems like a no-brainer - so why, after 11 years, has it only now come to the official app? Perhaps because BitTorrent's primary users have no interest. A comment by "mynameishare" on TorrentFreak's coverage states: "Vuze had sharing with friends, no one wanted it."
So what can we make of BitTorrent, Inc.'s latest development? Torque is a browser plugin which uses the open-source btapp.js to provide a backend for web-based BitTorrent apps.
It's a smart move, considering the increasing trend towards users performing everything in the browser. (The other recent software trend is of course towards Android and iOS, but BitTorrent clients are already available on the former and aren't allowed on the latter.)
Using a browser plugin seems a bit archaic in the world of HTML5 and jQuery, but once installed it works beautifully. One initial example of the technology at work is OneClick, a Chrome extension that promises to integrate torrents identically to regular downloads and tidy away their P2P aspect. Unfortunately we were unable to install the current version of the plugin, but essentially it promises to make downloading public torrent files easier than ever.
On the opposite side of the spectrum is the second Torque experiment, PaddleOver, which works similarly to the desktop app Share in that it allows private sharing of files between individuals. Once the Torque plugin is installed, PaddleOver works brilliantly, delivering files straight to the download folder with a simple drag-and-drop interface.
Both were developed by Patrick Williams, Engineering Lead at BitTorrent, Inc. - suggesting that even BitTorrent, Inc. itself is unsure which way it's taking the company.
What is BitTorrent? Is it a way for independent creatives to distribute their work for free? Is it a flag for infringers to rally beneath? Is it a company working to innovate the world of file-sharing? Or is it just a neutral protocol for distributing large files?
Perhaps the infringers have already won this battle. It's rumored that BitTorrent, Inc. is planning to rebrand itself as "Gyre" - which, if true, is likely an admission that the brand they have spent eight years building is toxic when it comes to anything outside of unauthorized file sharing.
In that case, BitTorrent will truly be a technology of two halves: Gyre, the legal, corporate side, promoting free private file-sharing; and on the other side BitTorrent, a tool for unauthorized sharing of copyrighted media - and the occasional Linux distribution.
How Will the Web Monetize in 2020?
Excerpted from Techcrunch Report by Murthy Nukala
On a whim, I recently purchased a Logitech Keyboard Case for my new iPad. Instead of being just a novelty accessory, the keyboard has fundamentally changed my device usage: I'm using my iPad-plus-keyboard in meetings; I use my laptop-plus-docking station at my desk; and I use my iPhone when I'm on the go. My online behavior, once centered on one device, has now been fragmented across at least three devices.
I suspect that I am not alone, and that this is the shape of things to come.
The web is on the cusp of massive change: By 2020, the number of global Internet users is expected to quadruple to 4 billion, and most of these new users will come online using multiple devices. Additionally, existing usage will move significantly from the monolithic computer to mobile, as smart phones, tablets, smart TVs, and who knows what other devices further permeate both work and home life.
As the online population booms and usage becomes more and more fragmented across multiple devices, the key question is: How will all this multi-device traffic be monetized?
Chris Dixon recently blogged that most mobile apps currently fit into one of four categories:
Time Wasters, such as Angry Birds and Plants vs. Zombies, primarily provide entertainment value.
Core Utilities are those apps on the home screen of your phone, e.g., camera, phone, contacts, texting, calendar.
Episodic Utilities, such as OpenTable, Uber and Hipmunk, are extremely useful in certain situations. Sometimes, I'm in the mood to find a new Thai restaurant. Sometimes, I want to know the score of a specific baseball game.
Notification-Driven Apps do just that — notify you. Notifications can be scheduled, for example, when something you want to buy goes on sale or when your PS3 is turned on at home.
Broadly speaking, these four categories also happen to describe how the web will be used and monetized in 2020.
Entertainment will be monetized by the content itself (e.g., paying for a game, or for a subscription to watch a show), by advertising that is broadcast in nature, or both. Social ads on Facebook may not drive response as much as other online channels, but Facebook's wide reach and high traffic will make it a natural venue for brand-awareness campaigns (e.g., opening night for a blockbuster movie).
Core usage will be sold as stand-alone apps or subscription services (e.g., your monthly cell phone plan). Few people want to be advertised to while they are using a device for core usage.
Episodic usage will mostly be monetized by advertising, which will be more targeted in nature (as opposed to broadcast). I use the term "advertising" broadly — e.g., restaurants that take reservations via the OpenTable app are essentially advertising on a per reservation basis.
Notifications will be monetized via a combination of advertising, subscriptions and freemiums — all depending on the nature of the notification. Personal productivity notifications — like the app that reminds you to take a break every hour — will probably be free and most likely show ads with the option to upgrade to ad-free. Notifications that are based on some sort of purchase intent — like sale alerts — will be monetized by advertising. Apps that provide an ongoing service — like the app that notifies you when your PS3 at home is turned on — will either be purchased outright or be subscribed to on a monthly basis.
So, outside of entertainment and core usage, the bulk of the web will still be monetized the way it is today — via advertising.
How will online advertising be different in 2020? The proliferation of devices fundamentally changes the online advertising equation.
Historically, we have primarily seen two types of online advertising: paid search and display. Paid search has been denominated in keywords. Display has fundamentally been denominated in cookies — impressions, clicks, unique users, reach, frequency, returning users, etc.
The biggest challenge to the online advertising industry of 2020 is the proliferation of devices. There is no "uber-cookie" that can track a user's behavior across all devices she might own — and privacy advocates would be up in arms if one existed. Without an uber-cookie, it seems like the online advertising market is set to become highly fragmented by device. Fragmenting the online advertising market by device might seem like a natural outcome, but it's not in anyone's interests. Greater fragmentation means greater advertiser inefficiency, which means online advertiser spend doesn't reach its full potential.
Is there a way to avoid fragmenting the online advertising marketplace by device? We only need to look at the paid search market for the right solution.
With episodic usage, we have a certain objective or intent in a situation, which the device or app helps us achieve.
When we issue a query to a search engine, that query has a certain intent associated with it, and the search results page and ads on it try to help us achieve our intent. As such, episodic usage is often "query-like." For example, if I'm in my Urbanspoon application browsing for Thai restaurants in San Francisco, that behavior is the functional equivalent of going to my favorite search engine and entering the query "Thai restaurants in San Francisco."
Additionally, if my device is geo-enabled, my location may help to provide further context around my behavior. My online behavior or the query is the explicit intent whereas my implicit intent is represented by geography, what device I'm using, what time of day it is, etc.
With paid search today, consumer intent is primarily explicit — it's stated in a query — whereas with app usage and geo-enabled devices, intent is both explicit and implicit.
Explicit intent in queries is monetized in search by selling keywords. Keywords alone, however, can't monetize implicit intent because implicit intent isn't query based. For example, a user who queries "big apple" on a smart phone while in New York City likely has a different intent than a user who queries "big apple" within a Yelp app while in North Carolina. The first person is probably looking for tours of New York City; the other person is looking for reviews of a local pizzeria. Explicit intent provides the subject; implicit intent provides the context.
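To make the explicit/implicit distinction concrete, here is a toy Python sketch of an intent record combining a stated subject with contextual signals. The class and the two-rule resolver are invented for illustration, built around the article's own "big apple" example.

```python
# Toy illustration of an intent record: explicit subject plus implicit
# context. The class and resolver are hypothetical, not any ad
# platform's actual model.
from dataclasses import dataclass

@dataclass(frozen=True)
class Intent:
    subject: str   # explicit intent: the query or in-app behavior
    device: str    # implicit context follows
    location: str
    hour: int

def resolve(intent: Intent) -> str:
    """The same explicit subject maps to different commercial intents
    depending on implicit context."""
    if intent.subject == "big apple":
        if "New York" in intent.location:
            return "tours of New York City"
        return "reviews of a local pizzeria"
    return "unresolved: " + intent.subject

print(resolve(Intent("big apple", "smart phone", "New York, NY", 14)))
print(resolve(Intent("big apple", "tablet", "Raleigh, NC", 19)))
```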
So keywords alone can't prevent device-based fragmentation. Nor can cookies.
Regardless of whether I'm using a smart phone, tablet, laptop, desktop, or smart TV, browsing for Thai restaurants in San Francisco via Urbanspoon always has the exact same consumer intent associated with it.
In order to avoid device-based fragmentation in online advertising, a new unit of trade is needed — one that is neither keyword-based nor cookie-based, but rather intent-based. If consumer intent becomes the unit of trade, which device, app or channel (search, display, or social) a consumer is using no longer matters. An advertiser might simply buy users with an intent to buy basketball shoes or an intent to travel to Europe. Device, app and channel might only determine which type of ad to serve.
If an intent-based unit of trade can be used to monetize cross-device traffic, everyone wins. Advertisers and agencies will be able to maximize the reach and return of their campaigns, and they will benefit from cross-channel economies of scale. Publishers will monetize traffic better. And consumers will see ads that are more relevant to their intent.
Google has nearly a $200 billion valuation, not because queries are valuable unto themselves, but because queries are rich with consumer intent. If the rest of the industry can monetize intents across a multitude of devices as well as Google has monetized intent in search, then online eCPMs of 2020 will be on par with search eCPMs of 2012.
Coming Events of Interest
ICOMM 2012 Mobile Expo — September 14th-15th in New Delhi, India. The 7th annual ICOMM International Mobile Show is supported by Government of India, MSME, DIT, NSIC, CCPIT China and several other domestic and international associations. New technologies, new products, mobile phones, tablets, electronics goods, and business opportunities.
ITU Telecom World 2012 - October 14th-18th in Dubai, UAE. ITUTW is the most influential ICT platform for networking, knowledge exchange, and action. It features a corporate-neutral agenda where the challenges and opportunities of connecting the transformed world are up for debate; where industry experts, political influencers and thought leaders gather in one place.
CLOUD COMPUTING WEST 2012 - November 8th-9th in Santa Monica, CA. CCW:2012 will zero in on the latest advances in applying cloud-based solutions to all aspects of high-value entertainment content production, storage, and delivery; the impact of cloud services on broadband network management and economics; and evaluating and investing in cloud computing services providers.
Third International Workshop on Knowledge Discovery Using Cloud and Distributed Computing Platforms - December 10th in Brussels, Belgium. Researchers, developers, and practitioners from academia, government, and industry will discuss emerging trends in cloud computing technologies, programming models, data mining, knowledge discovery, and software services.