Distributed Computing Industry
Weekly Newsletter

Cloud Computing Expo

In This Issue

Partners & Sponsors

CSC Leasing

Dancing on a Cloud

DataDirect Networks

Extreme Reach

Hertz Neverlost

Kaltura

Moses & Singer

SAP

Scayl

Scenios

Sequencia

Unicorn Media

Digital Watermarking Alliance

MusicDish Network

Digital Music News

Cloud News

CloudCoverTV

P2P Safety

Clouderati

gCLOUD

hCLOUD

fCLOUD

Industry News

Data Bank

Techno Features

Anti-Piracy

February 11, 2013
Volume XLII, Issue 9


Mobile & Cloud Computing Triple Micro Server Shipments

Excerpted from DigiTimes Report by Joseph Tsai

Driven by booming demand for new data center services for mobile platforms and cloud computing, shipments of micro servers are expected to more than triple in 2013, according to an IHS iSuppli Compute Platforms Topical report.

Shipments of micro servers are forecast to reach 291,000 units in 2013, up 230% from 88,000 units in 2012. Shipments commenced in 2011 with just 19,000 units; by the end of 2016, they are expected to reach some 1.2 million units.

The penetration of micro servers compared to total server shipments amounted to a negligible 0.2% in 2011. But by 2016, the machines will claim a penetration rate of more than 10% — a fifty-fold jump.
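These projections are internally consistent; the quick sketch below re-derives them from the IHS iSuppli numbers quoted above:

```python
# Re-derive the IHS iSuppli micro server growth figures quoted above.
shipments = {2011: 19_000, 2012: 88_000, 2013: 291_000, 2016: 1_200_000}

# Year-over-year growth from 2012 to 2013: 291,000 / 88,000 - 1.
yoy_growth = shipments[2013] / shipments[2012] - 1
print(f"2012 to 2013 growth: {yoy_growth:.0%}")  # roughly the 230% cited

# Penetration rises from 0.2% of all servers (2011) to over 10% (2016).
fold_jump = 0.10 / 0.002
print(f"Penetration jump: {fold_jump:.0f}x")  # the fifty-fold jump
```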

Micro servers are general-purpose computers, housing single or multiple low-power microprocessors and usually consuming less than 45W in a single motherboard. The machines employ shared infrastructure such as power, cooling, and cabling with other similar devices, allowing for an extremely dense configuration when micro servers are cascaded together.

"Micro servers provide a solution to the challenge of increasing data-center usage driven by mobile platforms," said Peter Lin, Senior Analyst for Compute Platforms at IHS.

"With cloud computing and data centers in high demand in order to serve more smart-phones, tablets, and mobile PCs online, specific aspects of server design are becoming increasingly important, including maintenance, expandability, energy efficiency, and low cost. Such factors are among the advantages delivered by micro servers compared to higher-end machines like mainframes, supercomputers, and enterprise servers — all of which emphasize performance and reliability instead."

Micro servers are not the only type of server that will experience rapid expansion in 2013 and the years to come. Other high-growth segments of the server market are cloud servers, blade servers, and virtualization servers.

The distinction of fastest-growing server segment, however, belongs solely to micro servers.

The compound annual growth rate (CAGR) for micro servers from 2011 to 2016 stands at 130% — higher than that of the entire server market by a factor of 26. Shipments will rise by double- and even triple-digit percentages for each year during the period.
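The CAGR figure can be reproduced directly from the 2011 and 2016 shipment numbers above. A minimal check:

```python
# Compound annual growth rate from 19,000 units (2011) to 1.2 million (2016).
start_units, end_units = 19_000, 1_200_000
years = 2016 - 2011

cagr = (end_units / start_units) ** (1 / years) - 1
print(f"Micro server CAGR, 2011-2016: {cagr:.0%}")  # about 129%, in line with the 130% cited
```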

Given the dazzling outlook for micro servers, makers with strong product portfolios of the machines will be well-positioned during the next five years — as will their component suppliers and contract manufacturers.

A slew of hardware providers are in line to reap benefits, including microprocessor vendors like Intel, ARM, and AMD; server brand vendors such as Dell and Hewlett-Packard (HP); and server original development manufacturers (ODM) including Taiwan-based firms Quanta Computer and Wistron.

Among software providers, the list of potential beneficiaries from the micro server boom extends to Microsoft, Red Hat, Citrix, and Oracle. For the group of application or service providers that offer micro servers to the public, entities like Amazon, eBay, Google, and Yahoo are foremost.

The most aggressive bid for the micro server space comes from Intel and ARM.

Intel first unveiled the micro server concept and reference design in 2009, ostensibly to block rival ARM from entering the field.

ARM, the leader for many years in the mobile world with smart-phone and tablet chips because of the low-power design of its central processing units, has been just as eager to enter the server arena — dominated by x86 chip architecture from the likes of Intel and AMD.

ARM faces an uphill battle, as the majority of server software is written for x86 architecture. Shifting from x86 to ARM will also be difficult for legacy products.

ARM, however, is gaining greater support from software and OS vendors, which could potentially put pressure on Intel in the coming years.

Report from CEO Marty Lafferty

US Federal Communications Commission (FCC) Commissioner Robert McDowell spoke for many of us in the technology sector when he urgently warned Congress this week that the free and open Internet is under attack — and inaction is not an option.

The Commissioner's statement of this clear and present danger was published in Fighting for Internet Freedom: Dubai and Beyond.

McDowell explained bluntly that the US must take action to stop the UN agency from gaining further governance power over the Internet, as it intends to do at its 2014 Plenipotentiary Conference (PP-14).

PP-14 will be held in Busan, South Korea from October 20th through November 7th next year.

As previously reported here, what happened at December's WCIT-12 seriously threatens to end the era of an international consensus to keep inter-governmental hands off of the Internet.

The UN debacle prompted widespread outrage, an unprecedented unanimous US House of Representatives vote in opposition to it, and a collective refusal from 55 member states to sign on to the ITU's potentially very damaging treaty.

While not signing onto the Dubai treaty changes was an important act, it's clearly not enough given what the ITU is planning next.

As a result of its 89-55 vote, the ITU gained an unprecedented foothold of authority over the economics and content of key aspects of the Internet and it plans further exploitation.

McDowell said, "Internet freedom's foes around the globe are working hard to exploit a treaty negotiation that dwarfs the importance of the WCIT by orders of magnitude. In 2014, the ITU will conduct what is literally a constitutional convention that will define the ITU's mission for years to come."

The ITU is attempting to legitimize, under international law, foreign government inspections of the content of Internet communications to assess whether they should be censored by governments under flimsy pretexts such as network congestion.

Decades of consensus on the meaning of crucial treaty definitions that were universally understood to insulate Internet service providers (ISPs), as well as Internet content and application providers, from intergovernmental control, are under direct threat of being upended.

On January 11th, ITU Secretary-General Hamadoun Touré released the fourth and final ITU/WTPF-13 report outlining groundwork for Internet governance and regulatory topics.

This month, the ITU is preparing additional Internet governance plans at the World Telecommunication/ICT Policy Forum (WTPF) meeting, which will continue in May in advance of PP-14.

The ITU/WTPF-13 report explicitly includes the creation of "global principles for the governance and use of the Internet."

It also redefines the multi-stakeholder definition of Internet governance by granting governments — now defined by ITU as underrepresented multi-stakeholders — far more Internet governance power.

It changes basic definitions so the ITU will have unrestricted jurisdiction over the Internet. It allows foreign phone companies to charge global content and application providers internationally mandated fees (ultimately to be paid by all Internet consumers) with the goal of generating revenue for foreign government treasuries.

It subjects cybersecurity and data privacy to international control, including the creation of an international "registry" of Internet addresses that could track every Internet-connected device in the world. It imposes unprecedented economic regulations of rates, terms, and conditions for currently unregulated Internet traffic peering arrangements.

It establishes ITU dominion over important non-profit, private sector, multi-stakeholder functions, such as administering domain names like .org and .com web addresses. It subsumes into the ITU functions of multi-stakeholder Internet engineering groups that set technical standards to allow the net to work.

As the Commissioner warned, we must waste no time in fighting to prevent further governmental expansion into the Internet's affairs at the upcoming ITU Plenipotentiary in 2014.

In 2014, let's not look back with regret that we did not do enough. Now is the time to act. Support the US Congress in telling the world that we will be resolute and stand strong for Internet freedom. All nations should join us. Share wisely, and take care.

Domino's Pizza Cloud Computing: Food for Thought

Excerpted from V3.Co.UK Report by Dan Worth

The lure of a takeaway pizza is well-known and thanks to technology it's becoming ever easier to order whenever the mood takes you.

Smart-phones and tablets let us place orders from any location via dedicated apps and one firm that's seen the rise perhaps better than any other is Domino's Pizza.

It now sees over 50 percent of its orders placed online, with 20 percent coming from mobile applications on Android and iOS devices, according to the firm's Chief Information Officer Colin Rees.

Rees was speaking at Cloud Expo in London last week and explained how the firm has embraced the benefits of cloud computing to ensure it can cope with the growing demand placed on its ordering systems.

Domino's, understandably, has fairly predictable demand windows, which revolve around a peak between 5 PM and 9 PM, particularly on Fridays, Saturdays, and Tuesdays, when it runs its two-for-$10 deal.

As a result, the firm has increased its capacity during these periods to ensure it meets demand, but it can also ask for even more from its provider whenever necessary.

This means during the quiet periods it's not paying for unnecessary extra capacity. However, this doesn't mean it can't react to events and boost capacity as necessary.

"Before Christmas when the weather was bad we knew sales would increase and we would need more capacity," Rees explained.

"So we doubled the processors we were utilizing in the servers overnight and could prove that the increase was necessary given the sales we processed, which would have proven too great if we'd left the capacity levels as they were the day before."
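Domino's pre-provisions for known peak windows rather than auto-scaling. That policy can be sketched as a simple schedule lookup; the window times follow the article, but the function, baseline, and multipliers below are hypothetical, invented purely for illustration:

```python
from datetime import datetime

# Hypothetical capacity plan: the peak window is 5 PM to 9 PM, with extra
# headroom on Fridays, Saturdays, and Tuesdays (per the article); the
# baseline and multipliers themselves are invented for illustration.
PEAK_HOURS = range(17, 21)   # 5 PM up to 9 PM
PEAK_DAYS = {1, 4, 5}        # Tue, Fri, Sat (Monday == 0)

def capacity_multiplier(when: datetime, baseline: int = 4) -> int:
    """Return how many server instances to provision at a given time."""
    if when.hour in PEAK_HOURS:
        # Double again on the busiest days, as Domino's did before Christmas.
        return baseline * (4 if when.weekday() in PEAK_DAYS else 2)
    return baseline

print(capacity_multiplier(datetime(2013, 2, 8, 18)))   # Friday 6 PM peak
print(capacity_multiplier(datetime(2013, 2, 11, 10)))  # Monday mid-morning lull
```

During quiet periods the plan falls back to the baseline, which mirrors the article's point that the firm is not paying for unnecessary capacity off-peak.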

The firm's experiences are likely to mirror those of many similar-sized firms considering whether their business could operate in a similar manner.

Domino's does not use an auto-scaling cloud service to do this, but instead works out what it will need whenever necessary with help from a cloud consultancy firm called Capacitas.

The firm, which acts as an independent advisor, helps Domino's assess how much capacity it will need so it doesn't pay for unnecessary usage.

Danny Quilton, the Co-Founder of Capacitas, said operating in this way is more cost-effective and lower-risk for firms that want to ensure their costs don't spiral out of control.

"Because software is always changing and being enhanced with new features and capabilities and whenever you do that there's a risk the software will become less efficient and become more resource hungry," he told V3.

"This means you need more computing capacity and so the danger is you have a computing state that runs away with large amounts of capacity and you quickly lose control of costs."

As such, Capacitas advises the firm on its software, making sure it's running as efficiently as possible and ensuring costs are kept down.

However, while both Domino's and Capacitas were upbeat about the system they are using, Quocirca analyst Clive Longbottom raised some issues.

"All that Domino's is doing is hosting its system on an external platform, which may save it money in the short term, but doesn't give it any flexibility moving forward," he said.

"True cloud would allow for a contract that allows some burst capabilities within the contract, with only a need to renegotiate if the level is maintained above an agreed amount for a period of time."

Nevertheless, Domino's appears satisfied with its current cloud model and it could offer some food for thought for other firms seeing a similar rise.

Cloud Computing for SMEs — Time to Follow the Herd

Excerpted from The Guardian Report by Chris Harding

The benefits of cloud computing for small-to-medium enterprises (SMEs) are numerous, but you should also beware of potential pitfalls.

The latest market forecasts for cloud computing are predicting 30% annual growth in the industry, as more and more people adopt the latest technology to store information in a virtual space. But cloud computing isn't just for data: you can also use it to run applications and software remotely, without being tied to one computer.

For a small business, outsourcing IT to the cloud lowers the need for specialist skills and frees managers to concentrate on their core business. It may cost slightly more than in-house IT, but this cost is often outweighed, as the cloud can enable a small company to take a "big company" approach to problems, increasing efficiency.

The farming industry is a good example. Most farmers run relatively small businesses, but they still have to deal with data processing such as accounts and payroll. There is also a surprising amount of specialized information processing which is usually done by hand.

For example, a dairy farmer owning 300 milking cows can spend 90 minutes per day creating manual data records and calculating production metrics. There is a cloud service specifically designed for dairy farmers that can relieve the farmer of that task. Similar cloud services are also available for poultry and arable farmers, and a multitude of other sectors.

The first level of cloud services is called infrastructure-as-a-service (IaaS). It works by providing virtual hardware, such as computers, raw processors, storage, and so on. Instead of being physically based in an office, employees can access their data via the Internet.

The second level, known as platform-as-a-service (PaaS), provides all the resources necessary for small business owners to create their own software and programs. Usually this will include an operating system, programming environment, database, and web server. This can save you the cost of storing and investing in the hardware and software which would otherwise be necessary.

The third level available is software-as-a-service (SaaS), which provides you with software and programs that are available and ready to use. You can run them remotely, without having to go through lengthy installation processes and worry about how your hardware will cope with the application.

While IaaS and PaaS will have some value to businesses large enough to have their own computer installations, it is SaaS, with its ready access to applications, that provides the most value to small businesses.

When it comes to embracing cloud computing, the main worry on most people's minds is security. By accessing cloud services over the Internet, you are sharing them with people and organizations that you don't know, possibly including business competitors.

Is your data safe? The answer, in most cases, and with reputable cloud suppliers, is yes. Indeed, the level of security achieved by a good cloud supplier, with in-house experts that follow the latest developments, is generally much better than most small businesses can manage.

The next issue that comes up is control, as your data is held on someone else's system. It can be hard to get your head around the fact that you don't know which computer — or even which country — it is stored on. So to what extent do you still own it?

It's an area where you need to tread carefully. You may be subject to different laws depending on where the data is held — particularly data containing personal information, such as employee records. You may be in breach of contract with your customers if data is disclosed or withheld by your cloud supplier, and you could face a damaged reputation in such cases.

Different cloud suppliers have different contractual terms which you should be aware of as they might impact your ability to fulfill your legal, contractual, and moral obligations.

Finally, some people are concerned by the fact that it is difficult to "mix and match" cloud services. Cloud isn't like the Internet in this respect. You can plug in a new Internet router or change your Internet service provider (ISP) without too much hassle because the Internet is based on a few simple standards.

This is not currently the case for software programs running in the cloud. If you use a cloud service for your records or accounts, you are likely to find it more difficult to change to another similar service. And, if you use more than one cloud service, making them work together will be much harder than making a new router work with your ISP.

Cloud can provide real benefits to small businesses, increasing efficiency, improving time management, enabling remote working, and saving physical space, but there are also pitfalls to avoid. You should choose your cloud provider the same way you would choose any other supplier and ask a few crucial questions:

Is the cloud service provider stable and trustworthy, with a reputation to lose? Are the conditions of contract reasonable and fair to you? Will it benefit your business?

Look before you leap when considering a cloud service, but if the answer to these questions is yes, don't be afraid to take the plunge.

Cloud Computing & Managed Services to Drive IT Growth in 2013

Excerpted from MSP Mentor Report by CJ Arlotta

Demands are always counterbalanced with concerns, especially with regard to cloud computing, big data, and mobility. CompTIA, a non-profit association for the IT industry, recently released its IT Industry Outlook 2013, revealing a strong demand for technology, while also acknowledging caution in the industry. Here are the details.

CompTIA's forecasts project a growth rate of 3 percent for the global IT industry in 2013. The forecast for the US market, however, is slightly lower at 2.9 percent on the low end, potentially reaching 4.9 percent. Managed services providers (MSPs) should concern themselves with the following trends driving growth in 2013:

Technology trends to watch — CompTIA reported that technology will continue to transition from a supporting tool to a strategic driver. Mobility as we know it will become a way of life for both employees and customers. Companies will begin taking cloud computing for granted. And the "big data" trend will force companies to review data practices, opening an opportunity for MSPs to reevaluate data opportunities with clients.

IT channel trends to watch — Managed services (nothing but good news) will be upping the game. Specialization will cause those in the IT industry to tackle growing markets. Convergence will create "strong bedfellows," according to the company.

CompTIA's report was based on an online survey of 518 IT industry companies conducted in late December 2012. For growth estimates, the organization used a consensus forecasting model. More can be found on CompTIA and the organization's initiatives on its website.

Good-Bye PC Maker Dell and Hello Cloud Company Dell

Excerpted from InfoWorld Report by Ted Samson

In May, "Mad Money"'s Jim Cramer declared Dell's stock dead, citing the company's inability to compete with the likes of Apple in the ever-shrinking laptop and PC market. That very day, the company's stock dropped significantly — and continued to do so for most of the remainder of 2012.

That price drop may be one of the best things to happen to Dell in recent memory in that it has put company founder Michael Dell in a position to inexpensively bring his company private. By going private, Dell removes itself from the scrutiny of shortsighted, consumer-fixated Wall Street analysts who are sometimes too dazzled by the new and shiny to see beyond the next quarter.

The truth of the matter is Dell has slowly been reinventing itself over the past few years, making the transition from Dell, maker of low-priced PCs, to Dell, cloud-focused enterprise hardware and services company. Just take a look at the 15 companies Dell has acquired in the past three years or so alone, and its road map becomes eminently clear.

Starting in late 2009, Dell acquired Perot Systems, an IT services provider. In February 2010, the company snagged Exanet, a developer of NAS software. Next, it grabbed data-center management company Scalent, followed by Ocarina, purveyor of storage-deduplication wares. It wrapped up 2010 with the acquisition of cloud-integration company Boomi and storage hardware maker Compellent.

Onto 2011: On January 4th of that year Dell closed a deal to acquire SecureWorks, a security-service provider. That summer, it snagged RNA Networks, which specialized in software networking. Soon after, the company inked a deal to pick up Force10, maker of Ethernet switches.

Finally, Dell grabbed six companies last year — five in April alone: Make Technologies and Clerity Systems, both services-modernization companies; SonicWall, AppAssure, and Quest — all enterprise security companies; and Wyse, a thin-client specialist.

Now are those the acquisitions of a company that's looking to take on Apple in the premium PC, laptop, or tablet space? Or a company that wants to go head-to-head in a battle it just can't win against the likes of Asus and Acer on cheapo PCs?

No, those are the acquisitions of a company that aims to become an end-to-end provider of hardware and services for business customers large and small seeking to smoothly and securely migrate their business processes to the cloud; who are considering jettisoning PCs in favor of cloud-friendly thin clients (a la Project Ophelia); and who are struggling to get a handle on BYOD with all of its security and management headaches.

Microsoft evidently sees value in Dell's path, as it is reportedly contemplating chipping in a couple billion dollars to help Dell go private. That investment could be win-win for both companies. Microsoft needs a hardware partner to serve as a vehicle for its own cloud ambitions, and Dell needs to pad its lackluster software arsenal.

Simply going private isn't going to save Dell — the company will have to do some serious restructuring and rebranding as it undergoes its transformation. Going private will, however, give Dell the time and breathing room it needs, without having to explain to impatient shareholders every quarter why it hasn't built a better iPad.

Cisco Simplifies Hybrid Cloud Computing

Excerpted from IT Business Edge Report by Michael Vizard

As cloud computing continues to evolve, the relationship between the network and virtual machine (VM) software is starting to fuse.

Case in point: a series of new offerings announced today by Cisco, including the Cisco Nexus 1000V InterCloud for tying enterprise networks more closely to third-party cloud service providers; a new Cisco Nexus 6000 Series Switch that Cisco says is now the industry's highest-density 40 Gigabit Layer 2/Layer 3 fixed switch; and an extensible programmable controller that makes it simpler to extend and scale policies across an extended network.

According to Craig Huitema, Director of Data Center and Cloud Network Management for Cisco, while IT organizations now live in a world that consists of many clouds, they need to be able to consistently apply policies. At the same time, if a cloud service provider expects a customer to use its service, it has to find a way to seamlessly participate in that network.

The Cisco Nexus 1000V InterCloud addresses that latter issue by allowing cloud service providers to deploy a virtual switch inside the data center of a customer that essentially extends the cloud service provider's network fabric out to that customer. Because it's a virtual switch, the Cisco Nexus 1000V InterCloud makes it easier for cloud service providers to participate in a hybrid cloud computing environment, says Huitema.

Cisco also announced today two versions of the Cisco Nexus 6000 Series Switches, which Huitema says offer three times the port density of competitors. The Nexus 6004, for example, can be configured with 384 10GbE ports or 96 40GbE ports. In total, Huitema says the Cisco 6004 can support up to 75,000 virtual machines on a single switch.

In addition, Cisco unveiled a new version of the Cisco ONE Software Controller, which is intended to make the network programmable. It supports Java and RESTful APIs alongside both proprietary Cisco and emerging industry-standard OpenFlow agents. Huitema says the goal is to be able to manage an extended network of cloud applications running on top of a Cisco software-defined network (SDN) using, for example, an OpenStack-compatible management application.

While everyone agrees that cloud computing by definition will be hybrid, a lot of people don't always appreciate how complex hybrid cloud computing really is to manage. Cisco is clearly starting to address those issues at a network level that, hopefully, will soon serve to hide a lot of complexities associated with actually trying to manage a hybrid cloud computing environment.

DataDirect Networks Expands Worldwide HPC Footprint

Over two-thirds of the world's 100 fastest supercomputers are powered by DataDirect Networks (DDN), according to the latest rankings published by Top500.

With over 170 systems on the world-renowned listing, DDN's award-winning products saw adoption among high-performance computing (HPC) systems on the Top500 grow 20 percent year-over-year as of the November 2012 list.

Delivering the highest level of platform scalability and performance efficiency, DDN products enable HPC clusters to accelerate discovery and minimize the cost of computing.

DDN supports more supercomputers and HPC clusters than any other storage vendor, and provides more total storage bandwidth to the Top 500 than all other storage vendors combined.

Notable new DDN customers in the Top500 list include No. 6, Germany's Leibniz Rechenzentrum; No. 17, the Tokyo Institute of Technology; No. 23, the U.K.'s University of Edinburgh; and No. 24, the National Computational Infrastructure of Australian National University.

DDN is also the only vendor with Top500 customers on every continent that had Top500 sites on the November list.

"The global topic of HPC is more vital than ever as 'Big Data' computing has democratized much of the requirements and corresponding technologies that DDN has been investing in for over a year. At the highest end of scale — whether in HPC, in web scale computing environments, or with high-scale Big Data analytics — our solutions, experience, and global partner network have established DDN as the de facto standard for scalable and efficient data-intensive computing infrastructure," said Erwan Menard, Chief Operating Officer, DDN.

"We continue to invest heavily in our massively scalable storage portfolio of products to exceed the expectations of our customers all around the world — making a difference today and bridging the market to the exascale era."

IBM Simplifies Big Data and Cloud Computing Adoption

Making it easier for organizations to quickly adopt and deploy big data and cloud computing solutions, IBM today announced major advances to its PureSystems family of expert integrated systems.

Now, organizations challenged by limited IT skills and resources can quickly comb through massive volumes of data and uncover critical trends that can dramatically impact their business.

The new PureSystems models also help to remove the complexity of developing cloud-based services by making it easier to provision, deploy, and manage a secure cloud environment.

Together, these moves by IBM further extend its leadership in big data and next generation computing environments such as cloud computing, while opening up new opportunities within growth markets and with organizations such as managed service providers (MSPs).

Across all industries and geographies, organizations of various sizes are being challenged to find simpler and faster ways to analyze massive amounts of data and better meet client needs.

According to IDC, the market for big data technology and services will reach $16.9 billion by 2015, up from $3.2 billion in 2010. At the same time, an IBM study found that almost three-fourths of leaders surveyed indicated their companies had piloted, adopted, or substantially implemented cloud in their organizations — and 90 percent expect to have done so in three years.

While the demand is high, many organizations do not have the resources or skills to embrace it.

Today's news includes PureData System for Analytics to capitalize on big data opportunities; a smaller PureApplication System to accelerate cloud deployments for a broader range of organizations; PureApplication System on POWER7+ to ease management of transaction and analytics applications in the cloud; additional options for MSPs across the PureSystems family including flexible financing options and specific MSP Editions to support new services models; and SmartCloud Desktop Infrastructure to ease management of virtual desktop solutions.

The new IBM PureData System for Analytics, powered by Netezza technology, features 50 percent greater data capacity per rack and is able to crunch data 3x faster, making this system a top performer, while also addressing the challenges of big data.

The IBM PureData System for Analytics is designed to assist organizations with managing more data while maintaining efficiency in the data center — a major concern for clients of all sizes.

With IBM PureData System for Analytics, physicians can analyze patient information faster and retailers can better gain insight into customer behavior. The New York Stock Exchange (NYSE) relies on PureData System for Analytics to handle an enormous volume of data in its trading systems and identify and investigate trading anomalies faster and easier.

"NYSE needs to store and analyze seven years of historical data and be able to search through approximately one terabyte of data per day, which amounts to hundreds of terabytes in total," said Emile Werr, head of product development, NYSE Big Data Group and global head of Enterprise Data Architecture and Identity Access Management for NYSE Euronext.

"The PureData System for Analytics powered by Netezza provides the scalability, simplicity, and performance critical in being able to analyze our big data to deliver results eight hours faster than on the previous solution, which in our world is a game changer when you look at the impact on businesses every second that passes."

The Nielsen Company, a leading global information and measurement company, provides clients with a comprehensive understanding of consumers and their behavior, leveraging Netezza technology to deliver complex analytic capabilities.

"Recently, Nielsen tested two major competitors with their latest products to tackle our highly complex analytic workload," said John Naduvathusseril, Chief Data Architect, the Nielsen Company.

"Both vendors did not match up on consistent performance, simplicity, data refresh speed and overall performance of our reporting needs. Other vendors require customization, which we cannot sustain and they still did not deliver the kind of performance as the PureData System for Analytics."

The IBM PureData System for Analytics is powered by Netezza technology. It is a strategic part of the IBM Big Data Platform, an integrated architecture that is intended to help organizations achieve Smarter Analytics by leveraging workload optimized systems that work together to tackle advanced analytics.

To help assess big data competency, visit: www.unlockingbigdata.com.

Google & Mozilla Create P2P Video Conferencing Clients

Excerpted from The Inquirer Report by Lawrence Latif

Software developers Google and Mozilla have shown off video conferencing using WebRTC between the Chrome and Firefox web browsers.

Last year Mozilla showed off WebRTC by demonstrating video conferencing on its Firefox web browser linked to its Social API.

Now Google has joined Mozilla in demonstrating peer-to-peer (P2P) video conferencing between the Firefox and Chrome web browsers using WebRTC.

Mozilla said WebRTC's PeerConnection interface allows applications to make P2P audio and video connections without having to install plug-ins. Google's Chrome 25 web browser, which is now in beta, implements WebRTC, allowing users to set up connections with Firefox and other WebRTC-enabled applications.

Google said all audio and video connections are encrypted, and use open codecs: Opus for audio and VP8 for video.
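The connection setup described above follows WebRTC's offer/answer handshake: the caller creates an offer, both sides exchange session descriptions over a signaling channel of the application's choosing, and the callee replies with an answer. The sketch below is a plain-Python simulation of that flow, not the browser API; the class and method names merely mirror RTCPeerConnection's createOffer, createAnswer, setLocalDescription, and setRemoteDescription, and the description strings stand in for real SDP blobs.

```python
# Conceptual simulation of WebRTC's offer/answer signaling handshake.
# NOT the browser RTCPeerConnection API: names are illustrative only.

class Peer:
    def __init__(self, name):
        self.name = name
        self.local_description = None
        self.remote_description = None

    def create_offer(self):
        # In a real browser this would be an SDP blob describing
        # supported codecs (e.g., Opus, VP8) and transport candidates.
        return f"offer-from-{self.name}"

    def create_answer(self):
        # An answer can only be created once an offer has been received.
        assert self.remote_description is not None
        return f"answer-from-{self.name}"

    def set_local_description(self, desc):
        self.local_description = desc

    def set_remote_description(self, desc):
        self.remote_description = desc

    def connected(self):
        # Both sides must hold a local and a remote description.
        return (self.local_description is not None
                and self.remote_description is not None)


def handshake(caller, callee):
    # How the offer and answer travel (the signaling channel) is left
    # to the application; WebRTC only standardizes the descriptions.
    offer = caller.create_offer()
    caller.set_local_description(offer)
    callee.set_remote_description(offer)

    answer = callee.create_answer()
    callee.set_local_description(answer)
    caller.set_remote_description(answer)


chrome = Peer("chrome")
firefox = Peer("firefox")
handshake(chrome, firefox)
print(chrome.connected(), firefox.connected())  # True True
```

Once both descriptions are in place on each side, the real API would negotiate encrypted media streams directly between the peers, with no plug-in required.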

Both Google and Mozilla are working with the W3C and IETF to develop WebRTC, and although it is still a work in progress, the applications being built to demonstrate its capabilities show that WebRTC could mean an end to downloading plug-ins and add-ons.

WebRTC developers are getting greater access to low-level hardware capabilities, allowing the development of rich applications without the need to download applications and plug-ins. While Google and Mozilla claim this will make media-rich applications easier to use, it will place greater emphasis on web browser security in order to sandbox such applications.

Microsoft & Huawei to Sell Windows Smart-Phones in Africa

Excerpted from NY Times Report by Kevin O'Brien

Microsoft, taking aim at the world's fastest-growing smart-phone market, said on Monday that it would team up with Huawei of China to sell a low-cost Windows smart-phone in Africa.

The phone, called the Huawei 4Afrika Windows Phone, will cost $150 and initially be sold in seven countries. Microsoft's Windows Phone software is fourth among smart-phone operating systems, with just 2 percent of the worldwide market in September, according to Canalys, a research firm in Reading, England.

"Microsoft is a small player in smart-phones and it needs as many partners as it can get," said Pete Cunningham, an analyst at Canalys. "And Africa is one of Huawei's strongest markets outside of China."

Microsoft's choice of Huawei, a leading maker of mobile networking equipment for African operators, does not detract from Microsoft's commitment to Nokia, which is relying on Windows Phone software to lift its new line of smart-phones and return the company to profitability.

Fernando de Sousa, the general manager for Microsoft Africa, said that in the next few months, Microsoft and Nokia planned to introduce two new Windows phones for the African market.

Africa is the world's fastest-growing region for smart-phones, with an average sales growth of 43 percent a year since 2000, according to the GSM Association, an industry trade group based in London.

In sub-Saharan Africa alone, 10 percent of the 445 million cell-phone users have smart-phones, but that is expected to increase rapidly as operators expand high-speed networks.

By 2017, most consumers in South Africa will be using smart-phones, up from 20 percent last year, according to the GSM Association. In Nigeria, the continent's most populous country, the outlook for sustained growth is even greater: smart-phone penetration is projected to reach just 30 percent by 2017, leaving substantial room for expansion.

The World Bank says that roughly a quarter of the one billion people on the continent are middle-class wage earners, the target group that Microsoft will try to reach with the Huawei phone, Mr. de Sousa said.

"Africans are generally quite conscious of brand, quality, and image," he said. "We are being very clear that we are not going to be building something cheap for this market. What we want to deliver is real quality innovation at an affordable price. Compared to some smart-phones that cost $600 here, this is very affordable."

Microsoft plans to introduce the Huawei 4Afrika phone on Tuesday at events in Lagos, Cairo, Nairobi, Johannesburg, and Abidjan, Ivory Coast. It will also be sold in Morocco and Angola.

The phone, which will run the Windows Phone 8 operating system, will be sold with applications designed for African consumers. Some apps give easy access to African soccer results. Others, like in Nigeria, focus on the country's entertainment and film industries. An application developed in Egypt allows a woman who feels she is being harassed to alert the authorities to her location with one touch of her phone.

By targeting Africa, Microsoft is trying to build on momentum it recently gained through its partnership with Nokia. The company sold 4.4 million Lumia Windows smart-phones in the fourth quarter of last year, up from 2.9 million the previous quarter.

In November, the Microsoft chief executive, Steve Ballmer, said Microsoft had sold four times as many Windows phones at that point as it had a year earlier. A month later, Microsoft said sales of Windows phones over the holidays were five times those of a year ago.

Combined, Google's Android and Apple's iOS operating systems run about seven in 10 smart-phones worldwide, with BlackBerry at 15 percent. But by 2016, Canalys expects Windows to overtake BlackBerry to become the No. 3 operating system, with a 15 percent share, compared with 5 percent for BlackBerry.

Microsoft is not alone in its focus on Africa. Samsung, the largest seller of smart-phones and cell-phones, has recently expanded the less expensive range of Galaxy smart-phones to market in Africa and other emerging markets, said Anshul Gupta, an analyst at Gartner in Mumbai.

Mr. Gupta said there was pent-up demand among African consumers for a smart-phone costing $100 or less. He said several smaller Chinese phone makers, including TCL, ZTE and Lenovo, were working on developing simpler smart-phones that sold for $50.

Smooth BPM: The Silver Lining of Cloud Computing

Excerpted from CloudTweaks Report by Art Landro

Businesses rely on the implementation of processes. Cloud-based software makes those processes easier to manage and alleviates the issues companies face in trying to improve them, particularly when it comes to prototyping and modeling. A question many businesses are trying to address is: How can the cloud ensure smooth business process management (BPM)?

The speed of getting started is a huge benefit of bringing cloud technology to BPM. Typically, BPM-in-the-cloud providers offer this capability "as-a-service," meaning that companies can start with BPM without the need to install and set up the software themselves.

The price point to enter BPM through the cloud is usually lower thanks to the "pay-for-use" subscription model. Companies can "sample" BPM to see if it is right for them. Finally, it is easier to orchestrate applications and data that reside in the cloud, so running BPM in the cloud makes processes more efficient.

BPM is converging with platform-as-a-service (PaaS), combining the benefits of application development and process support in an integrated cloud model. It allows companies to build smart process apps that are highly flexible and tailored to serve the end user with a cloud-based solution that eliminates traditional IT/business productivity challenges.

There are three questions to answer when it comes to BPM and the cloud: Does a business have data and services in the cloud that processes must work with? Does it want to execute processes in the cloud, and if so, how does it include existing data and systems that aren't in the cloud? Is the cloud really suitable for an organization's needs today, tomorrow, or somewhere in the future?

Clay Richardson from Forrester has spoken about the "mess of many." Trying to create enterprise-wide business processes across different business units and systems was hard enough when everything was inside an organization. When businesses start to move data and systems into the cloud, their process challenges very quickly double.

Thus the "mess of many" — enterprise processes across on-premise systems as well as applications and software in the cloud.

BPM has always been used to improve business processes within an organization. As businesses move to the cloud it is crucial to maintain these consistent enterprise-wide processes when, for example, a CRM system runs in the cloud but ERP or HR systems run on-premise.

This idea of hybrid processes spanning on-premise systems and the cloud is where the majority of companies are right now. "Cloud-only" processes do exist, but most companies today run a mix of both.

A further impact of cloud technology on BPM is the idea of "mash apps." The concept of "mashing up" process information with other data from both on-premise and the cloud to create process-centric composite applications is becoming as important as the end product for BPM.

The rapid rise of cloud computing and readily available free web-based business applications mean more business users are deploying and using technology solutions without the IT department's involvement, one of the main attractions of "mash apps."
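The "mash app" idea above can be made concrete with a small sketch: join process status held in an on-premise system with customer data from a cloud service into one process-centric composite view. This is an illustrative toy, not any vendor's product; the source names (an in-house ERP, a cloud CRM) and all records are hypothetical.

```python
# Illustrative "mash app" sketch: merge process data from a hypothetical
# on-premise ERP with records from a hypothetical cloud CRM, keyed by
# order ID, into one composite view. All names and data are made up.

on_premise_erp = {  # order status from an in-house ERP system
    "order-1001": {"status": "shipped"},
    "order-1002": {"status": "pending"},
}

cloud_crm = {  # customer records from a cloud CRM service
    "order-1001": {"customer": "Acme Corp"},
    "order-1002": {"customer": "Globex"},
}

def mash(erp, crm):
    """Build a process-centric composite record for each order
    that appears in both sources."""
    composite = {}
    for order_id in erp.keys() & crm.keys():
        composite[order_id] = {**erp[order_id], **crm[order_id]}
    return composite

view = mash(on_premise_erp, cloud_crm)
print(view["order-1001"])  # {'status': 'shipped', 'customer': 'Acme Corp'}
```

The point of the sketch is the shape of the problem, not the code: the composite view only works if both the on-premise and cloud sources agree on a shared key, which is exactly the kind of cross-system process discipline BPM is meant to provide.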

Why should companies consider vendors with a cloud proposition instead of traditional on-premise suppliers?

The easy answer to this is that if a business isn't sure about BPM, it can try it out in the cloud before making a commitment. Organizations often start in the cloud and then move back on-premise, and vice-versa. This flexibility is important to consider together with the idea of a hybrid model.

There has been a noticeable rise in a new approach to a BPM appliance where the whole offering comes "in a box," often delivered as a cloud-based PaaS.

There are real benefits from cloud and BPM: quick start, no IT hassle and focus on business value; pay-as-you go subscription model; high degree of collaboration such as collaborative modeling; and orchestration of cloud services.

However, to get these benefits, it is important that companies ask themselves the right, honest questions. Navigating BPM and the cloud and making correct, pragmatic choices ensures an organization is future-proofed, can get started quickly, and can take the hybrid approach to avoid that "mess of many" problem.

Big Data & Security: A Market Survey

More than 60 percent of IT professionals believe they will be the target of a cyberattack in the next six months.

Although point defense technologies such as firewalls and anti-virus/anti-malware help produce valuable information about events in and around the organization, enterprises need to leverage the volumes of data generated across the company to efficiently identify and respond to today's new threats.

Please click here for more. 

Coming Events of Interest

2013 Symposium on Cloud and Services Computing - March 14th-15th in Tainan, Taiwan. The goal of SCC 2013 is to bring together researchers, developers, government sectors, and industrial vendors interested in cloud and services computing.

NAB Show 2013 - April 4th-11th in Las Vegas, NV. Every industry employs audio and video to communicate, educate and entertain. They all come together at NAB Show for creative inspiration and next-generation technologies to help breathe new life into their content. NAB Show is a must-attend event if you want to future-proof your career and your business.

CLOUD COMPUTING CONFERENCE at NAB - April 8th-9th in Las Vegas, NV. Learn the new ways cloud-based solutions have accomplished better reliability and security for content distribution. From collaboration and post-production to storage, delivery, and analytics, decision makers responsible for accomplishing their content-related missions will find this a must-attend event.

Digital Hollywood Spring - April 29th-May 2nd in Marina Del Rey, CA. The premier entertainment and technology conference. The conference where everything you do, everything you say, everything you see means business.

CLOUD COMPUTING EAST 2013 - May 20th-21st in Boston, MA. CCE:2013 will focus on three major sectors, GOVERNMENT, HEALTHCARE, and FINANCIAL SERVICES, whose use of cloud-based technologies is revolutionizing business processes, increasing efficiency and streamlining costs.

P2P 2013: IEEE International Conference on Peer-to-Peer Computing - September 9th-11th in Trento, Italy. The IEEE P2P Conference is a forum to present and discuss all aspects of mostly decentralized, large-scale distributed systems and applications. This forum furthers the state-of-the-art in the design and analysis of large-scale distributed applications and systems.

CLOUD COMPUTING WEST 2013 — October 27th-29th in Las Vegas, NV. Three conference tracks will zero in on the latest advances in applying cloud-based solutions to all aspects of high-value entertainment content production, storage, and delivery; the impact of cloud services on broadband network management and economics; and evaluating and investing in cloud computing services providers.

Copyright 2008 Distributed Computing Industry Association
This page last updated February 10, 2013
Privacy Policy