Distributed Computing Industry
Weekly Newsletter


August 20, 2012
Volume XL, Issue 8


Sequencia Sponsors CLOUD COMPUTING WEST 2012

The DCIA and CCA proudly announce that Sequencia has signed on as a sponsor of the CLOUD COMPUTING WEST 2012 (CCW:2012) business leadership summit taking place November 8th-9th in Santa Monica, CA.

Sequencia is a premier provider of cloud solutions for service providers and enterprises. The company engineers and executes cloud solutions so that its clients can deliver IT services successfully via the cloud.

The company's system experts were responsible for designing and engineering one of the first network hypervisors for the cloud through their work at LineSider Technologies, as well as enabling the delivery of some of the industry's first Compute-as-a-Service (CaaS) clouds.

The firm believes that there currently exists an IT service void related to the intelligent design of cloud system and solution architectures. Sequencia was founded to fill this void by providing client-specific, end-to-end cloud solutions for both service providers and enterprises.

Sequencia believes cloud = IT-as-a-Service (ITaaS), and therefore, it focuses on cloud-enabled IT services.

The company provides a carefully managed framework for ITaaS transformation and enablement, and its methodology focuses on facilitating the full transition of business IT to the operational and cost benefits of IT services running in the cloud (private, public, or hybrid).

Sequencia is technology agnostic and non-prescriptive. Unlike many of its competitors, the company does not have an overarching approach to cloud that fits all clients.

It approaches each client's needs with a unique, modular service-management framework, designing cloud systems to integrate the very best technology for their business through each phase of the development process.

CCW:2012 will feature three co-located conferences focusing on the impact of cloud-based solutions in the industry's fastest-moving and most strategically important areas: entertainment, broadband, and venture financing.

Sequencia will participate in a panel discussion at the Network Infrastructure conference within CCW:2012.

CCW:2012 registration enables delegates to participate in any session of the three conferences being presented at CCW:2012 — ENTERTAINMENT CONTENT DELIVERY, NETWORK INFRASTRUCTURE, and INVESTING IN THE CLOUD.

At the end of the first full day of co-located conferences, attendees will be transported from the conference hotel in Santa Monica to Marina del Rey Harbor, where they will board a yacht for a sunset cruise and networking reception.

So register today to attend CCW:2012 and don't forget to add the Sunset Cruise to your conference registration. Registration to attend CCW:2012 includes access to all sessions, central exhibit hall with networking functions, luncheon, refreshment breaks, and conference materials. Early-bird registrations save $200.

Cloud Brings Foreign IT Spending to US

Excerpted from ComputerWorld Report by Patrick Thibodeau

US-based corporations and government agencies have been shipping application development work to offshore information technology (IT) services providers for years.

Now, thanks to cloud computing, foreign companies are starting to bring their business to providers of data center services located in this country.

Consider Grupo Posadas, a large hotel company in Mexico that today relies on five data centers to support more than 17,000 guest rooms in over 100 hotels. Grupo Posadas IT personnel run three of those data centers; the other two are run by outsourcing partners.

Later this year, most of the company's IT capability will be moved to a data center in Texas run by Savvis, a hosted services provider based in Town and Country, MO, said Grupo Posadas CIO Leopoldo Toro Bala.

The US data center will provide cloud-based infrastructure and managed database services, according to Toro Bala.

By moving some operations to Texas, the Posadas IT group will have more time to focus on developing systems like mobile and social networking tools that could help the business grow, he added.

"Our IT strategy is aligned to our growth, and our growth means that we need to be flexible and agile," he said.

The shift to the cloud will not affect IT costs. Instead, it will provide capabilities that will help streamline deployments of new IT systems, said Toro Bala. Previously, implementing a new system often required new equipment that could take months to deploy.

Cloud computing makes it possible to deploy new services in a matter of weeks. "That is the type of capability that we were lacking -- that agility," said Toro Bala.

Meanwhile, as US providers of cloud-based services start to attract foreign customers, some countries are enacting laws to protect their domestic providers, and some foreign companies are overseeing so-called FUD (fear, uncertainty and doubt) campaigns designed to raise questions about the security of US data centers, said Daniel Castro, an analyst at the Information Technology and Innovation Foundation (ITIF).

For instance, ads by Deutsche Telekom and other companies claim that their cloud products are more secure than those of US vendors because US companies have to comply with laws such as the Patriot Act, executives from industry groups and tech vendors told a US House of Representatives subcommittee during a hearing late last month.

"We commonly see almost absurd positioning of what the Patriot Act permits," said Justin Freeman, Corporate Counsel of Rackspace, a provider of hosted services.

Such marketing efforts, said Castro, represent a significant threat to US providers of cloud-based services.

"The potential market for cloud computing is very large, and the US right now is the country that stands to gain the most from it," said Castro, who also testified at the hearing.

Castro said most countries have laws that are similar to the Patriot Act, and some, including Canada and Australia, allow businesses to turn over data voluntarily to government agencies. A US company would violate its terms of service if it did that, he said.

Concerns about a lack of security or privacy in US data centers didn't affect the outcome of the outsourcing decision at Grupo Posadas, which has a long history of working with US IT companies, said Toro Bala.

Report from CEO Marty Lafferty

On Wednesday, the United Nations' International Telecommunication Union (ITU) called for public consultation on its draft Internet regulation proposal to revise a major global treaty in December at the World Conference on International Telecommunications (WCIT) in Dubai.

The DCIA strongly urges DCINFO readers to express opposition to this ill-conceived, restrictive, and extremely damaging proposition between now and November 3rd, the closing date for comments.

The ITU wants the upcoming WCIT to enable government bureaucracies to define procedures for the provision and operation of international telecommunications networks around the world.

Without question, this would drastically curtail the continuing advancement of Internet-based businesses, including cloud computing, as well as severely harm Internet freedom.

To cite but one example of a troubling provision in the ITU scheme: granting all national authorities the right to impose taxes on all incoming and outgoing telecommunications traffic, as well as Internet traffic termination fees.

If the ITU amendments are ratified as written, the ongoing and smoothly proceeding transition to IPv6 would come to a grinding halt.

In June, we commended the US House Energy & Commerce Committee for its leadership and bipartisan passage of a resolution opposing the UN's attempt to assert and impose such unprecedented governmental regulation over the Internet.

The Internet's current multi-stakeholder governance model fosters continuing investment and innovation absent heavy-handed regulatory controls.

Beyond the substantial growth that the Internet and related distributed computing technologies are contributing to the global economy, unprecedented advances in political freedom can also be attributed to the current model.

About this pending showdown, Congresswoman Mary Bono Mack noted that, "In many ways, we're facing a referendum on the future of the Internet. Giving the UN unprecedented power over web content and infrastructure is the quickest way for the Internet to become a wasteland of unfilled hopes, dreams, and opportunities."

At WCIT, the International Telecommunications Regulations, an international treaty developed nearly 25 years ago to deal with the global telephone and telegraph systems of the time, will be opened for revision.

And while any amended treaty would only be binding in the US if ratified by the Senate, the implications of currently proposed changes, if adopted elsewhere around the world, would have profoundly damaging effects on the operation of the Internet everywhere.

The secret drafting of ITU proposals in preparation for WCIT has been widely and rightly criticized by public interest groups for a serious lack of transparency. But our concerns go deeper than that.

If the ITU is successful in taking power over the Internet with the proposed amendments, such technologically valuable activities as the current flexibility of Internet-connected devices to perform as both clients and servers would be jeopardized.

Certain communications among devices would be hampered based on jurisdictional considerations and governmental security intervention measures, including repressive surveillance of Internet users and sanctioned censorship of the Internet.

Eli Dourado, a researcher at George Mason University, raised this key concern: "Who benefits from increased ITU oversight of the Internet? Certainly not ordinary users in foreign countries, who would then be censored and spied upon by their governments with full international approval. The winners would be autocratic regimes, not their subjects."

We join the Internet Society, representing engineering groups that develop and maintain core Internet technologies, in objecting to the ITU proposition on principle and as a practical matter.

Independent organizations including the Society, as well as the Internet Corporation for Assigned Names and Numbers and the World Wide Web Consortium, already deal much more effectively than the ITU possibly could with such fundamental tasks as network and domain name registrations, allowing the Internet to develop and evolve with relatively fast responses to changes in technology, business practices, and consumer behavior.

The negative economic impacts of the proposed treaty changes on expansion of Internet-based services as well as job creation without a doubt would be devastating. Please join us opposing them. Share wisely, and take care.

Rackspace CEO Bets the (Server) Farm on "Open Cloud"

Excerpted from The Street Report by James Rogers

Only three years ago, companies were nervous about the potential security and management risks of sending critical data and services into the "cloud." Today, the cloud-computing industry has developed so rapidly that Rackspace has put its faith in open-source cloud technology.

Rackspace developed OpenStack, an open-source cloud operating system, in 2010, and is now cranking up its efforts around the technology. Earlier this month, the San Antonio, TX-based firm announced OpenStack-powered versions of its flagship Cloud Server offering. Rackspace also unveiled a Cloud Databases offering and a Control Panel product for managing the services, both built on OpenStack.

"We're working hard to build an open-cloud company," Rackspace CEO Lanham Napier said. Rackspace, he added, will launch additional open-cloud services during its third and fourth fiscal quarters.

Rackspace, which is one of "The Street's" top cloud stocks for 2012, reported robust second-quarter results earlier this week, although Napier's focus is firmly on the open cloud.

"We still have hard work ahead of us to meet our roadmap, but we're confident that we will get it done," he said, noting that high-performance storage, monitoring, and backup products are scheduled for the third quarter. A cloud network product will make its debut the following quarter.

Open-source technology lets Rackspace quickly scale its own cloud infrastructure, according to the CEO, significantly boosting efficiency. "In the long run, I believe that there's the opportunity to improve our economics because the platform is more capable," he explained. "We will absolutely share a lot of that savings with our customers."

Rackspace, however, does not offer guidance, so investors are unable to measure the strategy's impact on the company's financials.

Nonetheless, Canaccord Genuity analyst Greg Miller sees open-source cloud computing as a positive for Rackspace. The strategy opens the door to much larger applications and customers that could generate higher margins for Rackspace, he said in a note, but added that no meaningful revenue is expected until the first quarter of 2013.

OpenStack, which supports both public and private clouds, also has the backing of over 180 companies, including heavy hitters HP, Dell, and Citrix.

In a public cloud such as Amazon's S3 and EC2 offerings, customers access shared services like storage and server power from third-party companies, whereas private clouds can be run at customers' own sites or on dedicated resources at a third-party location.

Tech research firm IDC estimates that revenue from public IT cloud services alone exceeded $21.5 billion in 2010 and will reach $72.9 billion in 2015, growing four times faster than the IT market as a whole.

Another cloud specialist, Proofpoint, says customers are clearly becoming more comfortable with the cloud model. Whereas Rackspace focuses on cloud hosting, Proofpoint touts specialized security services, according to CEO Gary Steele.

"There are things that we can do as a cloud-based service that are impossible to run on-premise such as big data processing," he said in an interview. "Think about non-tech organizations — the financial services and health-care organizations — do they want to build these systems and hire all the staff to run them? Probably not."

Spotify, Pandora Spur US Digital Music Sales Past CDs

Excerpted from Bloomberg News Report by Andy Fixmer

The CD's reign as the music industry's biggest US revenue source will end this year, eclipsed by downloads and newer streaming services such as peer-to-peer (P2P) based music-sharing service Spotify and Pandora, according to a researcher.

US digital music sales will rise to $3.4 billion this year, exceeding the $3.38 billion in revenue from CDs and vinyl, Boston, MA-based Strategy Analytics said this week. Globally, digital music will surpass physical purchases in 2015, the company said.

Record companies are licensing artists' catalogs to streaming services as CD purchases shrink. Sales of digital tracks and albums will rise 6.7 percent this year, while streaming revenue will climb 28 percent, Strategy Analytics said. Together they account for 41 percent of US music sales, compared with 22 percent worldwide.

"Streaming music services such as Spotify and Pandora will be the key growth drivers over the next five years as usage and spending grow rapidly," Ed Barton, Director of Digital Media at Strategy Analytics, said. "The industry will be hoping that digital can rebuild the US music market to something approaching its former stature."

Pandora, based in Oakland, CA, gained 4.4 percent to $9.71 at the close in New York. The company, which only serves US listeners, has declined 3 percent this year. Spotify, the closely held London-based service, streams music in 15 countries, including the US since July 2011.

US sales of CDs and vinyl will decline 9 percent in 2012, a slower rate of decline than the rest of the world, Strategy Analytics said. Total US recorded music spending this year will rise $134 million, or 2.1 percent, to about $6.38 billion, according to the researcher.

Vivendi's Universal Music Group, the world's biggest record company, is seeking European regulatory approval for its proposed $1.9 billion acquisition of EMI Group's recorded music business from Citigroup. A Sony-led investor group purchased EMI's music publishing in April for $2.2 billion.

Sony Music is the second-biggest record company, followed by billionaire Len Blavatnik's Warner Music Group.

Meanwhile, RIAA member payments are down 44 percent in just two years.

HP Delivers Innovations to Power Cloud Computing

HP this week announced new virtualization software solutions that simplify, automate, and secure the movement of virtual machines (VMs) and data in cloud environments, providing clients with increased agility for addressing dynamic market opportunities.

In legacy infrastructures, moving a VM workload such as a Microsoft Exchange application between data centers requires hundreds of complex, manual device-level configuration steps. Data mobility is similarly challenging because storage requires a physical deployment that consumes floor space, power, cooling, and infrastructure investment, which locks clients into a proprietary environment.

Three new HP Converged Infrastructure solutions help to eliminate these issues, delivering increased agility and simplicity.

HP Ethernet Virtual Interconnect (EVI) is the industry's first solution to simplify and speed the interconnectivity of up to eight geographically dispersed data centers in minutes, eliminating manual configuration tasks.

HP Multitenant Device Context (MDC) software delivers increased security for multitenant cloud applications, eliminating the commingling of data from different applications or departments.

And HP StoreVirtual Storage Appliance (VSA) increases flexibility by allowing clients to easily build storage pools on any x86 server using VMware vSphere or Windows Server Hyper-V hypervisors. As a result, clients can move data across heterogeneous servers, hypervisors, and data center locations to better meet organization needs, while reducing overall cost and complexity.

"Clients are seeking a more flexible virtualized environment that enables mobility of virtual machines and data services across the enterprise," said Bethany Mayer, Senior Vice President and General Manager, Networking, HP.

"HP's unique virtualization technologies meet that need, helping clients to significantly increase their agility and innovate at a pace their organizations demand."

Connecting geographically dispersed data centers optimizes workload mobility and disaster recovery. Typically, this connectivity requires months of error-prone and costly manual network redesign and reconfiguration.

HP EVI is the industry's only Data Center Interconnect overlay technology that enables "single touch" connection of up to eight data centers around the world from one location. Clients can optimize server and storage resources to perform workload mobility or disaster recovery by linking the software in HP EVI from one data center to another.

With HP MDC software, clients can create multiple, secure and isolated functions for organizational departments such as finance, human resources and engineering, while reducing the number of networking devices in the data center by 75 percent. Combining HP EVI with HP MDC simplifies data-center interconnectivity while reducing total cost of ownership by 56 percent with a single management platform.

"We needed a centralized, consolidated data-center infrastructure that delivered long-term operational efficiencies to accommodate our network of more than 100 agencies," said Jason Cohen, Global Chief Information Officer at Diversified Agency Services, a global marketing services and specialty communication company. "HP delivered exactly that, and the new Data Center Interconnect additions will ensure business continuity by distributing applications and compute resources for our enterprise private cloud."

To address flexibility and cost, many organizations have moved from physical storage deployments to VSAs and VM-based data services to support new application workloads and rapid data growth. However, most VSA software is proprietary, supporting only a single vendor's hardware and hypervisor.

HP StoreVirtual VSA is the first software-based virtual storage appliance to operate on any x86 server, while connecting to a range of third-party external storage solutions and supporting a mix of VMware vSphere and Windows Server Hyper-V hypervisors simultaneously. This heterogeneous, hardware-agnostic solution for delivering shared storage in virtualized environments eliminates vendor lock-in, reduces infrastructure cost by up to 60 percent and increases utilization without additional investment.

"The ability to reduce distance limitation in networking and to simplify storage infrastructure is important for organizations using VMware vMotion in VMware vSphere," said Gary Green, Vice President, Global Strategic Alliances, VMware. "We are pleased to continue to partner with HP to enable our customers to address IT challenges with better resource utilization and optimization for workload mobility across data centers."

HP Ethernet Virtual Interconnect and HP Multitenant Device Context will be available worldwide this fall as software upgrades for the HP FlexFabric core switches. HP StoreVirtual VSA will be available worldwide in September with a starting U.S. list price of $700 per license when purchased in multi-license packs.

Additional information about the new networking and storage solutions is available here.

IBM Moves Mainframe into Business Continuity Cloud

Excerpted from ComputerWorld Report by Robert Mitchell

Would you entrust your mainframe to the cloud? Perhaps not for production, but IBM is hoping to gain customers for its cloud-based disaster recovery services by offering support for virtual mainframes. Currently, IBM offers cloud-based backup and disaster recovery services for the AIX, Windows, and Linux platforms.

"We're moving away from just backup to a replication environment in the cloud for all critical servers" — including the mainframe, said Rich Cocchiara, Distinguished Engineer and Chief Technology Officer for Business Continuity Recovery Services, during a recent one-on-one meeting at ComputerWorld's offices.

"What cloud is doing is bringing the price down," perhaps to the point where more organizations may be willing to give up building or owning their own backup data centers. Instead of paying the capital expense of creating a backup data center, IT pays for access to virtual machines, as well as for the data backups and storage of the information.

The idea is to orchestrate end-to-end backups such that an entire data center — both front-end and back-end systems — can fail over quickly and reliably into a cloud-based backup data center.

As systems spin up, connections between them are reestablished and everything, hopefully, comes back online without a hiccup. "The toughest part is the synching of the data and the referential integrity you have to have," Cocchiara said.
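To make the orchestration idea concrete, here is a minimal Python sketch of the kind of ordered failover runbook described above: back-end tiers are checked for replica consistency and brought up first, and dependent tiers then reconnect to them. The class and helper names are invented for illustration and are not part of IBM's service.

# Hypothetical failover-orchestration sketch; not IBM's API.
from dataclasses import dataclass, field

@dataclass
class Tier:
    name: str                                  # e.g., "database", "application", "web front end"
    depends_on: list = field(default_factory=list)

def replicas_in_sync(tier: Tier) -> bool:
    """Placeholder for the hard part Cocchiara mentions: confirming that
    replicated data is current and referentially consistent before cutover."""
    return True                                # assume the replication checkpoint validated

def start_in_cloud(tier: Tier) -> None:
    print(f"spinning up {tier.name} in the backup data center")

def reconnect(tier: Tier) -> None:
    for dep in tier.depends_on:
        print(f"re-pointing {tier.name} at {dep.name}")

def fail_over(tiers: list) -> None:
    # Back-end tiers come up first so that the tiers depending on them can reconnect.
    for tier in tiers:
        if not all(replicas_in_sync(t) for t in [tier] + tier.depends_on):
            raise RuntimeError(f"{tier.name}: replicas not consistent, aborting failover")
        start_in_cloud(tier)
        reconnect(tier)

db = Tier("database")
app = Tier("application", depends_on=[db])
web = Tier("web front end", depends_on=[app])
fail_over([db, app, web])                      # ordered: data first, then the tiers that need it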

There are two key differences between IBM's cloud-based disaster recovery approach and its traditional service offerings: the virtualized backup resources can be either online or spun up very quickly; and customers can provision them on demand, without making a call to IBM.

But you can't do that with IBM's mainframe disaster recovery services just yet. Today, the client must call and request that the backup mainframe be brought online, and an IBM staff person must create a new mainframe partition for the customer.

Cocchiara said IBM is working on mechanisms to let the user initiate the failover process without calling, and for IBM to automate that provisioning process on the back end. "In the near future you'll probably see that," he said.

When it comes to their mainframes, enterprise IT tends to be fairly conservative. Will it trust the cloud with its mainframe operations?

Maybe, eventually. But many organizations also get value from their physical backup data centers by using those systems for noncritical functions such as application development when they are not in use for failover or disaster recovery testing.

That's fine, said Cocchiara, as long as administrators resist the temptation to put production applications in the backup data center. "If you do that you end up with two production data centers and no real backup," he said.

IBM is also working on using analytics to anticipate outages and automatically fail over systems when the probability of imminent failure is high.

"We want to say, 'You're going to have an event. Let's start the failover process now and if it occurs you'll be ready.'" This application of analytics to risk, Cocchiara said, "is like a tornado early warning system that gets the business into a safe location before the event hits."

A Round of Applause for Adobe's Creative Cloud

Excerpted from PC World Report by John Dvorak

Anyone who reads me regularly knows that I am not a huge fan of cloud computing and its implications. I'm even less enamored by the idea of paying a monthly fee to use my word processor.

That said, I must admit that Adobe may have found the sweet spot. I actually like what the company is doing with its new Creative Cloud. It's less cloud computing than other architectures; it's designed to fast-track people into the newest products rather than having them spend about the same amount of money to ride the Adobe roller coaster.

Ride the Adobe roller coaster? It's when you buy Adobe Creative Suite 2, skip CS3, and get CS4. Or, skip CS4 also and spring for CS5. Now, you have so many new features that you wind up behind the curve.

With Creative Cloud, for $49.99 a month, you're always up-to-date. The system was unveiled in April, and new components have been added since then. Everything in the $2,599 Master Collection is included, plus free websites and other cloud-only services, including Muse, a fascinating web development tool.

As an added incentive, anyone who has a copy of any Adobe product from CS3 or higher can get a year of these programs for $29.99 a month through August. Students will always have access to the $29.99 deal.

While I would love this to be $9 a month forever, Adobe's products have never been cheap and they tend to be for professional users rather than casual users. Some people will never touch Illustrator and will only want Photoshop, so access to the master collection is a waste of money for them. Therefore, Photoshop will be available for $19.99 a month, although this deal is difficult to locate on the Adobe website.

With the Creative Cloud, Adobe thinks it can both make more money in the long run and satisfy its users. As much as I hate to admit it, I agree with the company on this.

To begin, these programs are not in the cloud. If you need to use InDesign, you must download it from the cloud and install it on your computer for good, upgrading only when a new version comes out. You run it natively and the cloud keeps a synced version of your files. You download all the components as you need them and should not have to repeat the process except when downloading to a second machine, which is allowed for the same user.

The only difference between this code and the standalone code is that this code calls home every month to make sure you are paying your bill. You should have seen this mechanism coming once software vendors began using authentication codes.
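As a rough illustration of the "calls home every month" behavior Dvorak describes, a subscription check might look something like the Python sketch below. The endpoint, grace period, and function names are invented for the example; this is not Adobe's actual licensing mechanism.

# Purely illustrative subscription "call home" check; not Adobe's code.
import time
import urllib.request

VALIDATION_URL = "https://licensing.example.com/check"    # hypothetical endpoint
GRACE_SECONDS = 30 * 24 * 3600                             # roughly one month between checks

def subscription_active(account_id: str, last_ok: float) -> bool:
    """Return True if the locally installed app may keep running."""
    if time.time() - last_ok < GRACE_SECONDS:
        return True                        # validated recently; no need to call home yet
    try:
        with urllib.request.urlopen(f"{VALIDATION_URL}?account={account_id}",
                                    timeout=10) as resp:
            return resp.status == 200      # server confirms the bill is being paid
    except OSError:
        return False                       # offline past the grace window: lock out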

Overall, this will make the products cheaper for the serious users and it will bring in new users who are forced to use old code because they cannot afford $2,600 for the full package.

Would this convert someone who uses the budget Photoshop Elements? I doubt it, but it is quite tempting for the Photoshop-only user who might want to use programs like Lightroom, Illustrator, or the font packages.

I was recently briefed on this new package and was thoroughly impressed with the hands-on experience. As I play with some of the new components, I'll report on the most interesting. In the meantime, I have no qualms about recommending this product. It's a winner.

BitTorrent Launches OneHash: Torrent Web Streaming

Excerpted from Ghacks Technology News Report by Martin Brinkmann

Remember BitTorrent Torque? Torque moves BitTorrent technology into the browser so that web developers can use the benefits of BitTorrent in their web applications.

Back then, a handful of demo apps were released to demonstrate the possibilities. There was One click, for instance, a plug-in for Google Chrome that turned torrent downloads into regular browser downloads.

Today, OneHash has been added to the list of demo applications. It requires the Torque plug-in, which you can download from the official website or when you visit a page that needs it. You can install the plug-in while the browser is running and use it immediately, without a restart.

OneHash essentially takes media distributed as a torrent and makes it available as a web stream that you can watch or listen to in your browser. You can either visit the project's homepage and paste in a torrent link, magnet link, or info hash right there, or check out one of the featured pages first to get a feel for OneHash.

Once you load a page using OneHash, you will notice that all media included in the torrent distribution is listed with play buttons on the page. Depending on the torrent, this may be just one video or audio file, or many of them; even mixed content is supported. The web app connects to the swarm and starts downloading the files. You will notice that play times appear over time, and that availability depends largely on the popularity of the torrent and your computer's connection.

The download status is displayed as a percentage on the tab in Google Chrome, and perhaps in other browsers as well. OneHash prioritizes files within the torrent so that individual audio or video files become available sooner. You can start playing the first media files while the remaining files are still downloading.
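To illustrate the prioritization idea, here is a simplified Python sketch of how a streaming client can map one file in a multi-file torrent onto piece indices and request those pieces roughly in playback order, so the beginning of the file arrives first. The piece-picker interface is invented for illustration and is not the Torque plug-in API.

# Illustrative streaming prioritization; "picker" is a hypothetical piece-selection object.
def pieces_for_file(file_offset: int, file_length: int, piece_length: int) -> range:
    """Map a file's byte range onto the torrent's piece indices."""
    first = file_offset // piece_length
    last = (file_offset + file_length - 1) // piece_length
    return range(first, last + 1)

def prioritize_for_streaming(picker, file_offset: int, file_length: int,
                             piece_length: int) -> None:
    """Request the selected file's pieces roughly in order, so the player can
    start as soon as the first pieces (and any container headers) are present."""
    for rank, index in enumerate(pieces_for_file(file_offset, file_length, piece_length)):
        picker.set_priority(index, max(7 - rank, 1))   # front of the file gets top priority

# Example (values invented): a 40 MB video starting 100 MB into the torrent, 256 KB pieces:
#   prioritize_for_streaming(picker, 100 * 2**20, 40 * 2**20, 256 * 2**10)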

OneHash in its current state is a prototype that may have its quirks. I sometimes needed to refresh the page before it picked up the already downloaded files so that I could start playing them in the web browser. Later, though, files seemed to be downloaded as I streamed them, and they stayed on the PC even after I closed the browser.

At its core, OneHash is a torrent web-streaming service that you can use to listen to music or watch videos right in your browser without an installed BitTorrent client. But it could become more than that, for example a way for artists to stream live concerts to an audience. The core benefit here is that bandwidth is distributed among all listeners, which in turn should reduce bandwidth costs for the artist significantly.

OneHash is not the first web app to make torrent video or music files available in your web browser. Back in 2008, we covered Bitlet, a now-defunct service that let you play torrent music files on the web.

Avnet Technology Solutions Expands Off-Premise Cloud Training Offerings

Excerpted from eChannelLine Report by Mark Cox

Avnet Technology Solutions has announced a new training framework, offered through its recently launched Avnet Cloud Solutions group, that will be available in the US and Canada.

The new framework is focused on the off-premise cloud computing market. It includes fundamental, advanced and mastery training related to developing a cloud practice, specializing in high-growth off-premise cloud workloads and gaining in-depth expertise in off-premise cloud offerings.

This announcement follows Avnet's launch in May of Avnet Cloud Solutions, a new organization specifically focused on off-premise cloud solutions for VAR, MSP, and ISV partners in the US and Canada, the same markets covered by the new training framework.

At that time, Avnet announced that a number of cloud providers would be named to provide those offerings, and it named the first, Savvis, a global enterprise-class cloud and managed services provider. Now it has announced another, ITpreneurs, which works in collaboration with CompTIA and the Cloud Credential Council to offer a vendor-neutral cloud certification program.

"This announcement builds on the one in May and strengthens our training and enablement value propositions," said Tim FitzGerald, Vice President of the Avnet Cloud Solutions, Avnet Technology Solutions, Americas. "They provide content and training for sales and solutions architects, which is essential in today's fast changing cloud marketplace."

This curriculum is specific to off-premise cloud computing, FitzGerald said.

"Transitioning from an on-premise business to the adoption of off-premise cloud services is complex," he said. "Things like understanding the consumption of workload as a service versus capital expenditure are things that partners need to focus on to increase their competency in a massively changing world. This type of training can help them recommend the right consumption workload.

"It's more in depth, broad-based training, that makes them more valuable to their customers."

Qualified Avnet partners will be able to access the ITpreneurs cloud training program at a special rate.

The Cloud Practices training course available through ITpreneurs includes fundamental, advanced, and mastery coursework focused on the business principles, sales skills, and technical knowledge needed to grow a cloud practice. Initially, this courseware will feature multiple levels of training related to cloud computing, ITIL, and virtualization to help partners build a baseline of knowledge and attain industry certifications.

Avnet also offers an Avnet-developed training course on Cloud Workloads, which helps partners specialize in the high-growth off-premise cloud workloads best aligned with their business strategies and core competencies.

It is designed to help partners understand end-customer business drivers, off-premise cloud solutions, the supplier landscape, technical knowledge and adjacent professional services opportunities.

A further Avnet-developed training course on Cloud Offerings provides offering-specific instruction and deep-dive technical training, as well as online sales tools and resources to speed the selling process. It explores features and benefits, qualifying, scoping and quoting.

The new training through ITpreneurs is available now.

NTU Scientists Unveil Multiscreen Social TV Viewing Experience

Excerpted from CNET News Report by Jacqueline Seng

Scientists from the School of Computer Engineering at Singapore's Nanyang Technological University (NTU) have unveiled "Social Cloud TV," essentially a multiscreen mobile TV experience.

Developed by a research team headed by Assistant Professor Wen Yonggang, "Social Cloud TV" also lets users chat — using video, voice, or text — with their friends on the platform, as well as share their content on social-networking sites such as Facebook, Twitter, and Google+.

They can access content that is stored locally or in the cloud, delivered via over-the-top (OTT) services, or shared using a web browser on a smart TV or mobile device. Assistant Professor Wen and his team developed the backend processes — such as a compression algorithm and media transcoding — so that content is optimized for each device and screen size.
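The report does not describe the transcoding pipeline itself, but the general idea of optimizing content for each device and screen size can be sketched as a simple profile selection. The renditions and thresholds below are invented for illustration and are not taken from the NTU system.

# Illustrative per-device rendition selection; values are invented.
PROFILES = [                     # (height in pixels, video bitrate in kbit/s)
    (2160, 15000),
    (1080, 5000),
    (720, 2500),
    (480, 1200),
    (360, 700),
]

def pick_profile(screen_height: int, downlink_kbps: int):
    """Return the largest rendition that fits both the screen and the link."""
    for height, bitrate in PROFILES:
        if height <= screen_height and bitrate <= downlink_kbps:
            return height, bitrate
    return PROFILES[-1]          # fall back to the smallest rendition

print(pick_profile(screen_height=1080, downlink_kbps=4000))   # -> (720, 2500)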

For example, the same video that you're watching on your tablet can be "thrown" to your TV when you get home. Similarly, when you leave the house, you can "pull" content from your TV to your tablet or smartphone. Besides content, you can transfer chat sessions with your friends, too.

Assistant Professor Wen declined to comment on how a video session is migrated to or from a mobile device, as the patent is currently pending. However, he did mention that it will be extremely user-friendly, so that even the elderly can do it.

For now, the platform requires an Android app to run on mobile devices, but there are plans to develop for other operating systems such as iOS.

With discussions underway with a Singapore telco and a handful of international vendors to commercialize the idea, consumers can expect to see this in homes in one to two years' time.

We've seen similar multiscreen setups at CommunicAsia before — first with telecommunications infrastructure provider Ericsson's unified multiscreen TV solution last year and then Singapore telco StarHub's TV Anywhere service using OTT technology this year — but such services have not really taken off in Asia yet.

Wen's platform allows for more social engagement, which may give it an added edge over the competition.

How Big Data Became so Big

Excerpted from NY Times Report by Steve Lohr

This has been the crossover year for Big Data — as a concept, as a term and, yes, as a marketing tool. Big Data has sprung from the confines of technology circles into the mainstream.

First, here are a few, well, data points: Big Data was a featured topic this year at the World Economic Forum in Davos, Switzerland, with a report titled "Big Data, Big Impact." In March, the federal government announced $200 million in research programs for Big Data computing.

Rick Smolan, creator of the "Day in the Life" photography series, has a new project in the works, called "The Human Face of Big Data."

The New York Times has adopted the term in headlines like "The Age of Big Data" and "Big Data on Campus."

And a sure sign that Big Data has arrived came just last month, when it became grist for satire in the "Dilbert" comic strip by Scott Adams. "It comes from everywhere. It knows all," one frame reads, and the next concludes that "its name is Big Data."

The Big Data story is the making of a meme. And two vital ingredients seem to be at work here. The first is that the term itself is not too technical, yet is catchy and vaguely evocative. The second is that behind the term is an evolving set of technologies with great promise, and some pitfalls.

Big Data is a shorthand label that typically means applying the tools of artificial intelligence, like machine learning, to vast new troves of data beyond that captured in standard databases. The new data sources include web-browsing data trails, social network communications, sensor data, and surveillance data.

The combination of the data deluge and clever software algorithms opens the door to new business opportunities. Google and Facebook, for example, are Big Data companies. The Watson computer from IBM that beat human "Jeopardy" champions last year was a triumph of Big Data computing.

In theory, Big Data could improve decision-making in fields from business to medicine, allowing decisions to be based increasingly on data and analysis rather than intuition and experience.

"The term itself is vague, but it is getting at something that is real," says Jon Kleinberg, a computer scientist at Cornell University. "Big Data is a tagline for a process that has the potential to transform everything."

Rising piles of data have long been a challenge. In the late 19th century, census takers struggled with how to count and categorize the rapidly growing United States population. An innovative breakthrough came in time for the 1890 census, when the population reached 63 million.

The data-taming tool proved to be machine-readable punched cards, invented by Herman Hollerith; these cards were the bedrock technology of the company that became IBM.

So the term Big Data is a rhetorical nod to the reality that "big" is a fast-moving target when it comes to data. The year 2008, according to several computer scientists and industry executives, was when the term "Big Data" began gaining currency in tech circles.

Wired Magazine published an article that cogently presented the opportunities and implications of the modern data deluge.

This new style of computing, Wired declared, was the beginning of the Petabyte Age. It was an excellent magazine piece, but the "petabyte" label was too technical to be a mainstream hit — and inevitably, petabytes of data will give way to even bigger bytes: exabytes, zettabytes, and yottabytes.

Many scientists and engineers at first sneered that Big Data was a marketing term. But good marketing is distilled and effective communication, a valuable skill in any field.

For example, the mathematician John McCarthy made up the term "artificial intelligence" in 1955, when writing a pitch for a Rockefeller Foundation grant. His deft turn of phrase was a masterstroke of aspirational marketing.

In late 2008, Big Data was embraced by a group of the nation's leading computer science researchers, the Computing Community Consortium, a collaboration of the government's National Science Foundation and the Computing Research Association, which represents academic and corporate researchers. The computing consortium published an influential white paper, "Big-Data Computing: Creating Revolutionary Breakthroughs in Commerce, Science and Society."

Its authors were three prominent computer scientists, Randal E. Bryant of Carnegie Mellon University, Randy H. Katz of the University of California, Berkeley, and Edward D. Lazowska of the University of Washington.

Their endorsement lent intellectual credibility to Big Data. Rod A. Smith, an IBM Technical Fellow and Vice President for Emerging Internet Technologies, says he likes the term because it nudges people's thinking up from the machinery of data-handling or precise measures of the volume of data.

"Big Data is really about new uses and new insights, not so much the data itself," Mr. Smith says.

IBM adopted Big Data in its marketing, especially after it resonated with customers. In 2008, Mr. Smith's team put up a Web site to explain the Big Data theme, and the site has since been greatly expanded. In 2011, the company introduced a Twitter hashtag, #IBMbigdata. IBM has a Big Data newsletter, and in January it published an e-book, "Understanding Big Data."

Since its founding in 1976, SAS Institute, the largest privately held software company in the world, has made software that sifts through databases, looking for nuggets of value. SAS, based in Cary, NC, has seen many a marketing term in its field, including "data mining," "business intelligence" and "data analytics."

At first, Jim Davis, Chief Marketing Officer at SAS, viewed Big Data as part of another cycle of industry phrasemaking.

"I scoffed at it initially," Mr. Davis recalls, noting that SAS's big corporate customers, like banks and insurance companies, had been mining huge amounts of data for decades.

But Big Data seeks to tap all that Web data outside corporate databases as well. And as SAS's technology has moved to exploit these Internet-era data assets, its marketing has changed, too. Last year, SAS started adopting Big Data and "Big Data analytics," along with a term it has been using for years, "high-performance analytics." In May, the company appointed a vice president for Big Data, Paul Kent.

"We had to hop on the bandwagon," Mr. Davis says.

It may seem like marketing gold, but Big Data also carries a darker connotation, as a linguistic cousin to the likes of Big Brother, Big Oil, and Big Government.

"If only inadvertently, it does have a sinister flavor to it," says Fred R. Shapiro, editor of the Yale Book of Quotations.

Big Data's enthusiasts say the rewards far outweigh the risks. Still, smart technologies that promise to observe, record and make inferences about human behavior as never before should prompt some second thoughts — both from the people building those technologies and from the people using them.

Coming Events of Interest

ICOMM 2012 Mobile Expo — September 14th-15th in New Delhi, India. The 7th annual ICOMM International Mobile Show is supported by the Government of India, MSME, DIT, NSIC, CCPIT China, and several other domestic and international associations. New technologies, new products, mobile phones, tablets, electronic goods, and business opportunities.

ITU Telecom World 2012 - October 14th-18th in Dubai, UAE. ITUTW is the most influential ICT platform for networking, knowledge exchange, and action. It features a corporate-neutral agenda where the challenges and opportunities of connecting the transformed world are up for debate; where industry experts, political influencers and thought leaders gather in one place.

CLOUD COMPUTING WEST 2012 - November 8th-9th in Santa Monica, CA. CCW:2012 will zero in on the latest advances in applying cloud-based solutions to all aspects of high-value entertainment content production, storage, and delivery; the impact of cloud services on broadband network management and economics; and evaluating and investing in cloud computing services providers.

Third International Workshop on Knowledge Discovery Using Cloud and Distributed Computing Platforms - December 10th in Brussels, Belgium. Researchers, developers, and practitioners from academia, government, and industry will discuss emerging trends in cloud computing technologies, programming models, data mining, knowledge discovery, and software services.

Copyright 2008 Distributed Computing Industry Association