November 19, 2012
Volume XLI, Issue 9
CCW:2012 Online and CCE:2013 Opportunities
Audio recordings of keynotes and panel discussions as well as keynote slide-show presentations from CLOUD COMPUTING WEST 2012 (CCW:2012) are now available online here. CCW:2012, which took place earlier this month in Santa Monica, CA, focused on entertainment content delivery, broadband network infrastructure, and investing in the cloud.
The Distributed Computing Industry Association (DCIA) again thanks its strategic partner, the Cloud Computing Association (CCA), for bringing together such an impressive array of exhibitors and event sponsors for this first joint activity. These included Aspera, DataDirect Networks, Extreme Reach, Oracle, SAP, CSC Leasing Company, i3m3 Solutions, Kaltura, Scayl, Sequencia, Unicorn Media, Hertz NeverLost, MEDIAmobz, Moses & Singer, and Scenios. CCW:2012 could not have taken place without them.
Together, the DCIA and CCA will next present CLOUD COMPUTING EAST (CCE:2013), a strategic summit for business leaders, in Boston, MA on Monday May 20th and Tuesday May 21st, 2013.
CCE:2013 will feature three co-located conferences that will thoroughly examine the impact of cloud-based services and technologies on three major sectors: GOVERNMENT, HEALTHCARE, and FINANCIAL SERVICES.
Companies interested in exhibiting at CCE:2013 should contact CCA Executive Director Don Buford. An attractive range of sponsorship opportunities is available, including the unprecedented ability for HEALTHCARE participants to extend their reach to a fourth co-located event, The New Clinical Research Conference.
The CCE:2013 speaking faculty will comprise more than 100 thought leaders who will conduct a timely exploration of the various roles cloud computing now plays in streamlining government, revolutionizing healthcare, and ensuring the secure and safe functioning of financial services at all levels.
If you'd like to speak at this seminal industry event, please contact DCIA CEO Marty Lafferty. The DCIA is interested in representatives of cloud-computing solutions providers that serve these three key sectors, customers in these sectors who use cloud services, and industry analysts who cover this space.
Government Involvement in Cloud Computing & Security
Excerpted from GovPlace Report
Government cloud computing is being used in agencies around the world, but it is also the government's job to create and enforce regulations surrounding the technology. Government agencies are working to implement cloud-based services for their own benefit while also attempting to develop guidelines for organizations in the United States. According to FierceTelecom, the federal government is working to help enterprises answer difficult questions about the technology, ensuring that cloud-based services are safe to use, reliable, and standardized.
The news source reported that the US government is trying to create a set of guidelines and frameworks for cloud computing, attempting to define the technology so that the general public can understand it. The government is also responsible for working with nonprofits, as well as school systems, higher education institutions, and corporations to implement cloud computing strategies that can save costs and enhance reliability. Please click here for the full report.
Healthcare Cloud Computing Continues to Grow
Excerpted from Evolution1 Report
More and more of today's healthcare organizations, including hospitals and medical research facilities, are starting to see the value and cost savings in migrating to cloud-based operations. In this article from Healthcare IT News featuring Todd Reynolds, Chief Technology Officer at Evolution1, we see more examples of the growing trend.
"With its ease of installment, functional versatility, cost effectiveness and seemingly limitless capacity, cloud computing is taking the healthcare IT landscape by storm. There are many different deployments happening at facilities across the industry as providers search for ways to improve their computing power, inject vitality into established systems and utilize the cloud's potential for clinical, financial and administrative purposes."
Read the full story here.
Financial Services Firms Look to Cloud for Big Data Solutions
Excerpted from LiveTrading News Report
Cloud computing continues to be heralded as holding many of the answers to taming the exponential rise of "Big Data." But issues of compliance, reliability, and security persist as some financial services firms remain reluctant to move into cloud computing, which is the practice of using a network of remote servers hosted on the Internet to store, manage, and process data, rather than on a local server.
With the amount of data that firms must use and hold escalating at a rapid rate, cloud computing, an increasingly mature technology that has now been around for a few years, is becoming an attractive option for many firms in dealing with these Big Data needs. Please click here for the full report.
Report from CEO Marty Lafferty
The DCIA is proud to announce the agenda and speakers for our upcoming INTELLIGENCE IN THE CLOUD Workshop being conducted in partnership with the National Association of Broadcasters (NAB) on Tuesday December 4th in Washington, DC.
Space is limited, so please click here to register now.
At IITC, cloud-computing experts from the private and public sectors will come together to discuss ways that cloud services are evolving to meet a variety of needs while satisfying outside-the-wall concerns for mission-critical content.
Don't miss this opportunity to network and learn from government and commercial professionals.
Our opening keynote will feature Dr. Suzanne Yoakum-Stover, Executive Director, Institute for Modern Intelligence, who will address What Does Cloud Computing Bring to the Intelligence Community? What's a working definition of "cloud computing" and related terminology for this workshop? What are the unique attributes of cloud computing that should benefit the missions of military and government agencies charged with managing large quantities of sensitive multimedia data?
In Securing Multimedia Traffic on Broadband Data Networks, Verizon Communications will offer observations from the perspective of a major broadband network operator on issues associated with securely storing and distributing multimedia data among multiple users of different clouds, with anecdotal examples from the private sector to illustrate how related hurdles have been overcome and lessons learned.
Next, Scott Campbell, Industry Principal, Media & Telecom, SAP America and Marlyn Zelkowitz, Global Director, Public Services Cloud Solutions, SAP America will discuss Business Intelligence & Mobile Device Management. What lessons can be learned from private sector business intelligence (BI) and mobile device management (MDM) activities in the cloud that have value to related military and government initiatives? SAP's Sybase acquisition and its work with Gartner and T-Mobile on real-time analysis combined with predictive analytics solutions provide important insights.
In Data Collaboration & Analysis, speaker Randy Kreiser, Chief Storage Architect, Federal Division, DataDirect Networks (DDN) will share DDN's real-world experiences in optimizing every step involved in gathering, analyzing, and sharing visual data — including full-motion video and imagery. What can be gleaned from its NRL "large data" program that will be useful to additional deployments by other branches of the military and in government agencies?
Next, a panel discussion will address Identifying Common Challenges. Moderator John Bordner, WISC Enterprises, will lead a group of government and private sector professionals, including Mark Andrew Eick, Institute for Modern Intelligence, in exploring common challenges and ways to leverage the learning from commercial experience to date in ongoing government projects.
During lunch, in a break from our primary focus on operational and technological issues, a special session on Legal Considerations led by Lawrence Freedman, Partner, Edwards Wildman Palmer will highlight late-breaking policy and regulatory developments in the cybersecurity and data privacy arenas and what to expect from lawmakers on these matters in 2013.
In Cloud 101 for Multimedia Assets, speaker Allan McLennan, President, The PADEM Group will present PADEM's recent IBC tutorial, which addressed the business models and investment opportunities surrounding the growth, ease, and security of cloud-based distribution of media, analytics, and services, especially in the over-the-top (OTT) video and connected-TV space. How do the underlying case studies relate to the problems that must be overcome for Military & Government implementation of cloud solutions for critical visual data?
Michelle Munson, CEO, President, and Co-Founder, Aspera will discuss Collaboration, Transfer, Storage, Access. With its numerous cloud deployments for media, collaborative research, and government agencies for gathering intelligence data from the field faster and more reliably than previously, what insights can Aspera provide regarding key technologies needed for real-time collaboration, transfer, storage, and access?
In Managing Your Cloud for Security and Compliance, speaker Jeff Reich, Chief Risk Officer, Layered Tech will outline the essential characteristics of cloud computing that can make the cloud dynamic, agile, and responsive. Sometimes, traditional controls become ineffective. Layered Tech shares experiences gained from its cloud offering that provides security with guaranteed compliance. Some essential characteristics had to be interpreted in order to allow important process controls to work.
Sean Jennings, VP, Solutions Architecture, Virtustream will provide guidance on the key question of Private, Hybrid, or Public: Which Cloud for What Purpose? He will cover lessons learned from XStream deployments as a secure, high-performance cloud software solution, appliance, and managed service; how uVM™ technology enables granular accountability and works with existing hardware and virtualization software to run private clouds, manage hybrid clouds, and securely operate public clouds; and how cloud staging and network services have helped overcome obstacles in transitioning assets to the cloud and co-locating non-cloud physical assets and applications.
In Secure Interactive Broadcasting, speaker Greg Parker, CEO, DarpaTV will offer lessons learned from a proprietary digital broadcasting network that enables live, secure, interactive broadcasting with two-way video and audio interaction among program hosts and audience members. Viewers can watch and participate in a show (via video, voice, text, surveys, polls, etc.) or create their own shows through real-time cloud-based collaboration. A large multinational firm used DarpaTV for secure interactive CEO and CFO broadcasts to all employees around the world, allowing workers to video-call into the show to ask questions and take part in live polls on certain topics, and also to hold secure, interactive multimedia broadcasts (inserting images, video clips, and audio segments) with its customers and suppliers for product development and production.
Eric Klinker, CEO, BitTorrent will provide valuable insights into What, not Where: What the Cloud Changes Next. The BitTorrent ecosystem is one of the world's largest clouds. BitTorrent was designed as a replacement for HTTP, and already moves more content than all of the websites in the world combined. What are the advantages of a distributed cloud like BitTorrent? And what does this mean for the future of the Internet? Klinker will uncover some of the far-reaching strategic considerations, technical oversight, and infrastructure needs associated with managing sensitive multimedia data in the cloud.
And finally, Prabhat Kumar, Managing Partner, i3m3 Solutions; David Sterling, Partner, i3m3 Solutions; and Vic Winkler, Author, "Securing the Cloud" will discuss Putting It All Together and Where Do We Go from Here in a panel that we will moderate. What were the most salient points and most important takeaways from each of the workshop's earlier sessions? What critical questions remain relating to an IP and security framework? How can IP Multimedia Subsystem (IMS) platforms enhance hybrid cloud and legacy TDM environments? And finally, how can security frameworks best be applied to cloud environments?
We hope to see you at what promises to be an extremely valuable and instructive workshop. Share wisely, and take care.
How Cloud Computing Helped Obama Win the Election
Excerpted from Forbes Report by Reuven Cohen
Jeff Barr, a web services evangelist at Amazon, has written an interesting blog post on how Amazon's cloud helped Barack Obama win the election.
In the post, Barr says: "Imagine setting up the technology infrastructure needed to power a dynamic, billion-dollar organization under strict time limits using volunteer labor, with traffic peaking for just one day, and then shutting everything down shortly thereafter. The words 'mission critical' definitely apply here. With the opportunity to lead the United States as the prize, the stakes were high."
He goes on to outline how "the campaign used AWS to avoid an IT investment that would have run into the tens of millions of dollars. Along the way they built and ran more than 200 applications on AWS, scaled to support millions of users. One of these apps, the campaign call tool, supported 7,000 concurrent users and placed over two million calls on the last four days of the campaign."
The post further outlines the technology and architectures used within the campaign:
A database running on Amazon RDS served as the primary registry of voter file information. This database integrated data from a number of sources including www.barackobama.com and donor information from the finance team in order to give the campaign managers a dynamic, fully-integrated view of what was going on.
Alongside this database ran an analytics system on EC2 Cluster Compute instances, while another database cluster ran LevelDB on a trio of High-Memory Quadruple Extra Large instances.
This array of databases allowed campaign workers to target and segment prospective voters, shift marketing resources based on near real-time feedback on the effectiveness of certain ads, and drive a donation system that collected over one billion dollars (making it the 30th largest e-commerce site in the world).
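The voter-targeting capability described above can be illustrated with a toy sketch. Nothing below comes from the campaign's actual systems: the schema, the in-memory SQLite stand-in for the RDS database, and the query are all hypothetical, meant only to show what "target and segment prospective voters" might look like in practice.

```python
import sqlite3

# Hypothetical, simplified stand-in for a voter registry: an in-memory
# SQLite table illustrating how workers might segment voters.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE voters (
        voter_id INTEGER PRIMARY KEY,
        state TEXT,
        contacted INTEGER,      -- 1 if already reached by the call tool
        donation_total REAL     -- integrated from the finance team's data
    )
""")
conn.executemany(
    "INSERT INTO voters VALUES (?, ?, ?, ?)",
    [(1, "OH", 0, 0.0), (2, "OH", 1, 25.0), (3, "FL", 0, 100.0)],
)

# Segment: uncontacted voters in one battleground state, ordered by prior
# engagement, so phone-bank resources go where they matter most.
segment = conn.execute(
    "SELECT voter_id FROM voters "
    "WHERE state = 'OH' AND contacted = 0 "
    "ORDER BY donation_total DESC"
).fetchall()
print(segment)  # → [(1,)]
```

The point of integrating donor, web, and contact data into one registry is exactly this kind of cross-source query: segmentation criteria can combine fields that originally lived in separate systems.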
The applications run by the Obama campaign are comparable in scope and complexity to those seen in the largest enterprises and data-rich startups. For example:
Massive data modeling using Vertica and Elastic MapReduce.
Multi-channel media management via TV, print, web, mobile, radio and email using dynamic production, targeting, retargeting, and multi-variant testing like you'd find in a sophisticated digital media agency.
Social coordination and collaboration of volunteers, donors, and supporters.
Massive transaction processing.
Voter abuse prevention and protection, including capture of incoming incidents and dispatch of volunteers.
A rich information delivery system for campaign news, polls, information on the issues, voter registration, and more.
Read the complete post here.
5 Reasons Why Carriers Must Embrace Cloud Computing
Excerpted from Insurance Technology Report by Dave Shively
Cloud-based solutions are allowing traditional carriers to close the technology gap and get back into the game. Not only do they empower carriers to quickly get on par with their new entrant competitors, they are also able to keep pace as technology evolves.
The insurance industry has been remarkably slower than other industries in adopting cloud computing and software-as-a-service (SaaS) models. Insurance carriers continue to rely on legacy core system applications to conduct business rather than leveraging new cloud-based solutions, due to concerns about losing sensitive customer information and the costly repercussions that would follow.
Instead of replacing their legacy systems outright, carriers have chosen to plaster new applications in front of their outdated systems, typically selecting functionality and replacing it piece by piece.
However, that approach has now reached the point where carriers must start addressing core system issues in order to remain competitive. The costs of these replacements have spiraled out of control while failing to deliver the expected transformations, leaving carriers with expensive, obsolete technology that can no longer be supported.
Despite the concerns about data security, well-executed cloud services can help businesses securely improve customer satisfaction, financial performance, and operational efficiency. Cloud technologies are increasingly seen as business-enabling technologies that can help businesses not only bring out new business solutions with relatively low initial investment, but also maintain and sustain current business and IT operations at relatively low operational cost. Below are five reasons why carriers must make the transition from legacy applications to cloud-based platforms:
1. Deliver Business Agility
Not only is the cloud a less expensive alternative, it also allows carriers to start up new solutions much more quickly than traditional in-house development or third-party package implementation. With the adoption of cloud computing models, organizations can deliver solutions in months rather than years. This model allows for not only business continuity, but also allows for both business providers and their customers to be more timely and profitable.
2. Reduce Operational IT Costs
As discussed earlier, carriers are now at the point where they have replaced many of their legacy-based applications and are using only a small portion of their initial mainframe capacity. In the end, legacy technologies end up being the more expensive and ultimately more complex option to support in the long run. Big carriers were the first to adopt cloud and SaaS-based solutions. However, given the current economic climate and the cost benefits a cloud-based solution can deliver, smaller players are quickly jumping on the bandwagon, as they can now afford core solutions that were previously not an option, resulting in an even more competitive marketplace and a leveled playing field.
3. Provide Scalable Solutions
Cloud-based IT infrastructure can be easily outsourced, drastically cutting project cycles and standardizing IT operations, resulting in an overall reduced IT effort. The sophisticated monitoring and management solutions offered by many cloud computing providers allow IT departments to effectively and efficiently manage the cloud services. With extra cycles available, IT departments can focus on strategic business solutions and scale them quickly.
4. Leverage New Technologies and Applications
Cloud solutions are enabling new carriers to enter the insurance markets and deliver solutions online without the baggage of legacy technologies. They can operate as virtual organizations that are primarily online based. They can also easily integrate new strategic platforms such as mobile into their cloud-based solutions. Configurable cloud-based solutions allow carriers to modify the behavior of their systems within a matter of weeks as compared to legacy carriers that can take months or even years to implement these same changes. In an already highly competitive economic climate, legacy carriers can be at an extreme disadvantage.
5. Replace legacy systems safely and securely
Even though carriers have had concerns about confidential customer information being compromised through adopting the cloud, the truth is that with recent advances in virtualization and security technologies, cloud applications are very secure. These applications have now matured to the point that they are even more secure in a virtualized cloud environment than on legacy platforms.
With technology no longer a barrier, carriers can leverage their core business strengths to take back customers and win market share.
DDN: $100 Million Investment in Exascale Computing
DataDirect Networks (DDN), the world leader in massively scalable storage, today announced the establishment of a $100 million investment in its research and development efforts, specifically directed at resolving key challenges to achieving Exascale levels of performance in scientific computing.
"Data-intensive computing impacts individuals, organizations, industries and governments by enabling the creation of valuable information based on massive volumes of highly complex data," said Alex Bouzari, CEO and Co-Founder, DDN. "Significant investment is required to allow researchers to address challenges such as the design of new materials needed for better electric car batteries, the improvement of multi-physics models for more accurate severe weather modeling, and the development of high-resolution cosmological simulations to help understand dark matter and the universe around us. With today's announcement, DDN is establishing a clear direction for our Exascale computing agenda and reaffirms DDN's continued central role in the future of supercomputing."
Exascale computing refers to a computer system capable of reaching performance exceeding one quintillion computer operations per second. This level of computing capability is expected to arrive around 2018 and will represent a thousandfold increase over current state-of-the-art technology. To achieve this level of scalability, it is widely understood that radical innovation will be required to ensure applications can effectively scale across massive infrastructure that is highly resilient, power-efficient, and affordable.
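The arithmetic behind the "thousandfold" figure is easy to check, assuming the petaflop-class baseline of the first petascale systems (a scale reached around 2008) is the generation being compared against:

```python
# Quick arithmetic behind the article's "thousandfold" claim.
# Assumption: one petaflop is taken as the baseline generation.
EXA = 10**18   # one quintillion operations per second (exascale)
PETA = 10**15  # one quadrillion operations per second (petascale)

print(EXA // PETA)  # → 1000
```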
Powering over 60 of the world's top 100 supercomputers (as ranked by the June 2012 Top500 list), DDN is at the forefront of supercomputing and Big Data storage scaling efforts. Throughout DDN's history, the company has successfully commercialized technology which is proven at the highest levels of computing as data-intensive computing challenges become democratized across the enterprise.
"With its investment in Exascale, DDN is signaling its intention to remain at the leading edge of high performance computing," said Addison Snell, CEO of Intersect360 Research. "HPC technologies are starting to become mainstream with the advent of Big Data, and now there is huge market incentive for HPC leaders like DDN to develop next-generation technologies for scalable and efficient data-intensive computing."
The new investments by DDN represent a substantial percentage of DDN's engineering resources and will be directed towards technology challenges which become critical at Exascale proportions, including:
I/O Acceleration: New file system, middleware and storage tiering methods will be required to eliminate scalability barriers associated with conventional methods of file, object and database access in order to achieve 1,000x scalability, TB/s performance and million-way application CPU parallelism.
Converged Infrastructure: The convergence of computing, storage and networking technologies will give rise to intelligent and accelerated data storage infrastructures which can co-locate pre-processing and post-processing routines natively within the storage infrastructure to enable applications to access data with increased acuity.
Information Value Extraction: Leveraging converged infrastructures, DDN R&D efforts will support the development of scalable data analytics environments to extract actionable insights from vast volumes of unstructured data.
Energy and Data Center Efficiency: With the emergence of storage-class memory and software tools, infrastructures can be built with fewer components than today's disk-based technologies. These initiatives will not only significantly reduce hardware acquisition costs but also make data centers much more space- and power-efficient by reducing storage footprint by more than 75%.
Telefonica Standardizes on Mediaroom IPTV
Excerpted from CED Magazine Report by Brian Santo
Telefonica Digital is rolling out IP video services based on Microsoft's Mediaroom across an unspecified number of countries included in its vast international reach.
The global operator has already begun deploying the technology in three countries — Brazil, Chile, and Spain.
Telefonica cut a long-term deal to continue to base its new Global Video Platform (GVP) on Microsoft TV technology. The two tested the IPTV platform last year with the introduction of Movistar Imagenio on Microsoft's Xbox 360. That service includes 12 linear channels focused on sports, including the Spanish First Soccer League (Liga BBVA) under Canal+ Liga Channel.
Alcatel-Lucent is managing the network deployment. GVP will deliver TV services over both managed (IPTV) and unmanaged (over-the-top) networks to a range of consumer devices.
Mediaroom provides a range of advanced features, including time-shifting and multi-screen, among others. Devices covered include set-top boxes, Xbox 360s, tablets, and smartphones.
Earlier this month, Telefonica launched its IPTV service in Brazil, following a similar launch in Chile in October. Both services, Vivo TV Fibra in Brazil and Movistar IPTV in Chile, are the first IPTV deployments that take advantage of the new GVP platform.
Beyond those countries, Telefonica said only that it anticipates extending the deployment of TV services based on the GVP to a number of its other operating businesses over the next few years.
"Video is a fast-growing market, and we already play a leading role in delivering pay-TV services to customers in Europe and Latin America. This new platform allows us to reflect the deep and rapid changes happening in this market. It offers the ease and convenience of a global, convergent platform while maintaining flexibility over content for our local businesses. Most important, it allows us to meet customer demands for access to video content on an ever-expanding range of devices," said Vivek Dev, Director of Digital Services at Telefonica Digital.
BitTorrent Inks Deals with 20 TV Set Makers
Excerpted from Multichannel News Report by Todd Spangler
Consumers using Internet-connected TV sets from some 20 manufacturers will soon be able to stream video content — including both authorized and unauthorized material — downloaded via embedded BitTorrent software, with the click of a remote.
BitTorrent has reached deals with 20 consumer-electronics manufacturers, according to CEO Eric Klinker.
Most of BitTorrent's deals are for TV models that will launch — as early as this holiday season — in Europe and Asia, Klinker said.
"You may not see them as much in the US," he said.
That's because for many Internet-connected HDTVs marketed in the US, the manufacturers already have deals with streaming-video providers such as Netflix, according to Klinker. "We are competing with the Netflixes and Hulus for space on the television," Klinker said.
Klinker denied that CE manufacturers are wary of associating with BitTorrent because the file-sharing application is widely known to be used to access infringing content. He noted that about 2 million titles of legal content are available in the BitTorrent universe, including movies, music, and books.
Asked how many unauthorized files are available via BitTorrent clients, Klinker said, "We have no idea. It's like asking Google's web browser Chrome how much pornography there is on the Internet." His point was that liability for illegal activities rests with users, as is the case with web browsers.
Previously, BitTorrent announced a deal with Vestel, a Turkish TV manufacturer that was showing what it called "the world's first BitTorrent certified TV" at the IFA show in Berlin last year.
BitTorrent, founded in 2004, is backed by venture-capital firms Accel Partners, DCM and DAG Ventures. According to Klinker, about two-thirds of BitTorrent's 100 employees are engineers.
The San Francisco-based company generates revenue from advertising (through syndicated search with partners that include Microsoft Bing and Ask.com); a premium, ad-free version of BitTorrent; and licensing deals such as those with TV manufacturers.
BitTorrent clients once consumed as much as 40% of global Internet traffic, but the company has seen its share of overall usage decline as video-streaming services like Netflix have taken off.
In addition, the company claims, its introduction of a new protocol into the BitTorrent client — designed to give priority to other applications — has reduced the amount of bandwidth the application's estimated 150 million-plus users consume in aggregate.
BitTorrent's share of usage declined from 19.2% of peak-period aggregate traffic in 2010 to 12.0% in 2012, according to bandwidth-management equipment vendor Sandvine. At the same time, BitTorrent traffic increased 40% from 2011 to 2012, the vendor found. By 2015, BitTorrent will shrink to less than 10% of total traffic, Sandvine predicts.
BitTorrent and other file-sharing applications were at the center of a brouhaha that emerged in late 2007, when Comcast was accused of "blocking" the protocols of peer-to-peer (P2P) applications.
The MSO said it was merely delaying P2P packets, but the Federal Communications Commission (FCC) ordered it to cease the practice, as the agency said Comcast violated its open Internet principles. A federal appeals court ruled that the FCC overstepped its authority in that case, but Comcast had already moved to a protocol-agnostic bandwidth-management technique; separately, the FCC enacted network-neutrality rules at the end of 2011.
Klinker said that in response to the network-neutrality issues, BitTorrent implemented a new best-effort protocol in clients starting in February 2010, which senses congestion on the network. "It essentially lets BitTorrent get out of the way of every other application," he said.
BitTorrent, along with Microsoft, worked within the Internet Engineering Task Force to document the "uTP" protocol, and the IETF has now approved the protocol as a draft, Klinker said.
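The congestion-sensing idea Klinker describes was taken up by the IETF as LEDBAT (low extra delay background transport). As a rough illustration only, and not BitTorrent's actual implementation, a delay-based sender can grow its transmission window while measured queuing delay stays under a target and back off once delay climbs above it, which is how background transfers "get out of the way" of interactive traffic:

```python
# Minimal sketch of the delay-based backoff principle behind uTP/LEDBAT.
# Illustrative only; parameter values and the linear controller are
# simplifications, not the BitTorrent client's real code.

TARGET_MS = 100.0  # LEDBAT targets a bounded queuing delay (100 ms)
GAIN = 1.0         # how strongly the window reacts per update

def update_window(cwnd, base_delay_ms, current_delay_ms, min_cwnd=1.0):
    """Grow the window while queuing delay is under target, shrink above."""
    queuing_delay = current_delay_ms - base_delay_ms
    off_target = (TARGET_MS - queuing_delay) / TARGET_MS
    cwnd += GAIN * off_target  # positive below target, negative above
    return max(cwnd, min_cwnd)

cwnd = 10.0
cwnd = update_window(cwnd, base_delay_ms=20.0, current_delay_ms=40.0)
print(round(cwnd, 2))  # little queuing, so the window grows → 10.8
cwnd = update_window(cwnd, base_delay_ms=20.0, current_delay_ms=220.0)
print(round(cwnd, 2))  # heavy queuing, so the sender backs off → 9.8
```

The design choice worth noting is that loss-based TCP only reacts after router buffers overflow, while a delay-based sender reacts as soon as buffers begin to fill, yielding bandwidth to foreground applications before they ever see loss.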
AOL Partners with Kaltura, Extends Video Reach
Excerpted from Online Media Daily Report by Gavin O'Malley
The AOL On Network on Wednesday announced a partnership with online video platform Kaltura.
The deal makes around 420,000 premium videos within the AOL On Network accessible from Kaltura's open-source video platform, which is used by 150,000 companies, including Best Buy, HBO, and TMZ.
"It opens up places where The AOL On Network's content can live that have traditionally been against the two-player experience," says Frank Besteiro, head of business development for the AOL On Network. "As a media company, our thoughts are on mass distribution for our content, and that means making our content player agnostic."
As a result of the partnership, Kaltura's publishers will be able to access and search the AOL video catalog, then add videos to their own accounts.
Regarding the business arrangements of deals, Besteiro said: "We will set up individual agreements with partners interested in taking our content through their Kaltura players. We can sell, or they can sell; both options are available to partners."
Despite an explosion in online video, Kaltura co-founder Michal Tsur said quality content remains in high demand. "The goal of working with AOL is to provide a solution for publishers that have views but lack actual content," Tsur said.
Added Besteiro: "Mass video views are the objective for this partnership. The ability to drive video views of our content across all platforms and devices is what every media company is looking to achieve."
Kaltura recently launched new multi-platform functionality with support for iOS, Android, Xbox, and Google TV, extending the reach of AOL content to users of mobile devices and connected TVs.
Launched just this past April, The AOL On Network brings AOL's entire video offering under one umbrella, and reaches more than 75 million unique visitors per month, according to AOL, citing comScore data.
Cost-Efficient Cloud Computing System Bridges Gap
Excerpted from The Guardian Report by Sean Nam
Developing countries are leveraging the growing adoption of "cloud computing" — the use of consumer devices to access remote computer and information resources — to expand their economic role in an increasingly global economy, according to a report by UCSD researchers.
Dean of International Relations and Pacific Studies Peter F. Cowhey and Senior Fellow at the Institute on Global Conflict and Cooperation Michael Kleeman released the report, Unlocking the Benefits of Cloud Computing for Emerging Economies - A Policy Overview, documenting the benefits of cloud computing. They also touched on the communication policy implications of the technology.
Cloud computing is the use of computer resources that are delivered over a network. Gmail, Dropbox, and Flickr are examples of services that use cloud computing. Cloud computing has become a viable and expandable tool because computers become twice as powerful every 18 months at the same price, the cost of storing information keeps falling, and broadband becomes faster as more and more people gain access to it.
The decreasing cost of cloud computing has allowed Third World countries to utilize the cloud in the same way developed countries have in the past. Developing countries such as India, Mexico, and South Africa have seen increases in Internet adoption rates as access becomes more affordable. Cloud computing increases commerce in these countries by facilitating their entrance into the global market.
The results of the report showed that cloud computing also has created more jobs by reducing initial entry-level costs for new products and businesses. It also found that cloud computing services allow small- and medium-sized businesses to expand while sustaining quality products and services.
The analysis compares the cost of cloud computing with that of local computing, and demonstrates how accessible broadband can close the gap between the economies of the poorest and richest countries.
"Whenever ICT capabilities change radically there are broad societal implications. The cloud democratizes ICT capabilities for emerging countries. So, there will be a continuing need to reassess its evolving impact," Cowhey said.
New Report Measures Success Factors in Cloud Computing
Excerpted from VAR Guy Report by Christopher Tozzi
Why do some cloud projects succeed and others fail? That's a question undoubtedly on the minds of many IT admins and business executives. And it's also the focus of a recent IT Process Institute (ITPI) report. I recently chatted with ITPI Managing Director Kurt Milne, who helped prepare the report, to get his views on how to deploy cloud computing effectively in the business environment. Here's what he had to say.
Milne, who has 20 years' experience in the IT industry and co-authored a book on building private clouds, knows this subject well. Meanwhile, the ITPI, founded in 2005, is also well-established as a consulting resource for the IT community, counting VMware, Red Hat, and Symantec among its customers.
The ITPI's full report on success factors in cloud computing, which was released publicly November 14th, is based on a survey of 143 organizations in a number of channels that have created cloud environments. A few, according to Milne, were large Fortune 50 enterprises, but most were smaller companies or startups.
Regardless of the character of the company in question, however, Milne identified several key pieces of advice based on the study of successful cloud deployments:
Think about a cloud project in business terms, rather than technical IT ones. By keeping the ultimate commercial goals of the project in mind, organizations ensure that they do not simply build a cloud that geeks will find cool but that has little use in advancing the interests of the company itself.
Pay attention to users, and involve them in cloud projects during all stages, from design to production deployment. Too often, Milne said, companies adopt a "build it and they will come" approach, expecting users to be attracted to new cloud resources automatically. But if the cloud doesn't reflect users' needs, they're unlikely to use it — which means, again, that organizations run the risk of building expensive cloud environments that are state of the art from an IT admin's perspective, but of little value in the eyes of the people they were intended to benefit.
Along similar lines, encourage integration between IT and business staff. While there is arguably a "cultural mismatch" between these two groups — IT people are not always great about considering business needs and users, and non-technical staff rarely understand the intricacies of the code behind the cloud — efforts to bring them together lead to cloud projects that are effective on both the technical and business ends.
These perspectives and many more are discussed in greater detail in the full report. Interested readers can also download a free whitepaper summarizing the findings.
Winners in the Cloud Revolution
Excerpted from ZDNet Report by Jack Clark
The growth of cloud computing will affect the economics of the IT industry as much as its technology.
The huge scale at which the hardware and software of the cloud operates is having a significant effect on not only the types of technology being used in industry, but which companies stand to benefit from its rise.
A round of interviews with executives within cloud companies, hardware vendors, and analysts underlines the trend towards the growing influence of the cloud over the structure of the IT industry, and identifies the likely winners and losers among IT suppliers.
This article does not focus on cloud providers, such as Google, Microsoft, and Amazon, but on the companies that stand to benefit most from the changes to the IT ecosystem due to their rise.
The growth in commodity hardware, and the shift in infrastructure intelligence up from chip level to distributed software systems, gives an advantage to companies comfortable managing large-scale heterogeneous IT.
Virtualization expert VMware is in a good position as its technology cares little for the hardware it sits on top of. Furthermore, it has the software-defined networking (SDN) technology it acquired from Nicira, which could help it appeal to a new breed of companies with different attitudes toward how networks should work.
Big Switch Networks stands to benefit as well. The company, whose co-founder helped develop the OpenFlow networking protocol, has begun selling software tools that let companies spend less on network hardware by moving functions away from proprietary gear and onto low-cost commodity servers.
Another is Cloudera, which sells a commercial distribution of the open-source Hadoop data-analysis system. Cloudera's customers include Morgan Stanley, Monsanto, and government intelligence agencies.
Many of the software technologies generating much of the cloud interest are open source (Puppet, Chef, Hadoop, Linux), and so to win in this area companies need to help steward the development of the technology while concentrating on enterprise-grade support — a potent source of revenue in open software.
For this reason companies like Linux stewards Canonical and Red Hat also stand to benefit, along with companies backing strong open-source projects and making them into commercial products — like cloud infrastructure software maker Joyent via the Node.js runtime, or Rackspace via OpenStack.
In the same way that Hadoop can be run locally or from a cloud service, such as Amazon Web Services' Elastic MapReduce technology, all companies that deal in scalable software have the potential to sell to the whole spread of the IT market, ranging from individual developers up to companies with on-premise commodified hardware and, perhaps, to the cloud providers themselves.
Asian manufacturing companies such as Quanta, Wistron, and Foxconn, which help organizations build their own equipment, stand to win out as well.
In the trade these companies are known as Original Design Manufacturers (ODMs), from their heritage of making low-end equipment for the mass market, such as notebooks. These days they make low-cost server, storage, and network gear as well.
Cloud companies like Google, Facebook and Amazon have started to go direct to ODMs to get their kit built, sidestepping traditional vendors like HP, Dell and IBM.
Along with this, many of these ODMs work closely with Facebook's Open Compute Project — a trans-industry scheme to create low-cost server designs. Companies such as Wistron already have designs available and are preparing to sell them to companies large and small.
As more and more interest grows in commodity data center hardware, these companies will benefit.
Colocation and hosting providers such as Equinix, Telstra, and TeleCity with large data centers in key locations are also likely to win.
This is because as companies move to the cloud some will maintain a private cloud presence. Typically they will put their cloud in a data center operated by such hosting providers. These providers can then offer dedicated links to cloud operators as one-stop provision for both public and private cloud. The move to cloud in general will benefit those with such options.
Put together, these winners suggest that a major change is afoot in the IT industry as clouds build their own bespoke hardware and smaller companies look to save on kit by adding software capabilities. The companies with the greatest chance of winning are either those designed to serve huge infrastructure operators, such as the Asian ODMs, or software companies that can help IT buyers trim their hardware budgets by giving them more control through software.
Coming Events of Interest
INTELLIGENCE IN THE CLOUD - December 4th in Washington, DC. This workshop continues the NAB's series of programs developed for military and government professionals to demonstrate how advances in the commercial industries can benefit the military and government sectors. The atmosphere for the workshop is interactive with attendee participation welcome.
Third International Workshop on Knowledge Discovery Using Cloud and Distributed Computing Platforms - December 10th in Brussels, Belgium. Researchers, developers, and practitioners from academia, government, and industry will discuss emerging trends in cloud computing technologies, programming models, data mining, knowledge discovery, and software services.
2013 International CES - January 8th-11th in Las Vegas, NV. With more than four decades of success, the International Consumer Electronics Show (CES) reaches across global markets, connects the industry and enables CE innovations to grow and thrive. The International CES is owned and produced by the Consumer Electronics Association (CEA), the preeminent trade association promoting growth in the $195 billion US consumer electronics industry.
CONTENT IN THE CLOUD at CES - January 9th in Las Vegas, NV. Gain a deeper understanding of the impact of cloud-delivered content on specific segments and industries, including consumers, telecom, media, and CE manufacturers.
2013 Symposium on Cloud and Services Computing - March 14th-15th in Tainan, Taiwan. The goal of SCC 2013 is to bring together researchers, developers, government sectors, and industrial vendors that are interested in cloud and services computing.
NAB Show 2013 - April 4th-11th in Las Vegas, NV. Every industry employs audio and video to communicate, educate and entertain. They all come together at NAB Show for creative inspiration and next-generation technologies to help breathe new life into their content. NAB Show is a must-attend event if you want to future-proof your career and your business.
CLOUD COMPUTING CONFERENCE at NAB - April 8th-9th in Las Vegas, NV. Discover the new ways cloud-based solutions have achieved better reliability and security for content distribution. From collaboration and post-production to storage, delivery, and analytics, decision makers responsible for accomplishing their content-related missions will find this a must-attend event.
CLOUD COMPUTING EAST 2013 - May 20th-21st in Boston, MA. CCE:2013 will focus on three major sectors, GOVERNMENT, HEALTHCARE, and FINANCIAL SERVICES, whose use of cloud-based technologies is revolutionizing business processes, increasing efficiency and streamlining costs.