Distributed Computing Industry
Weekly Newsletter

In This Issue

P2P Safety

P2P Leaders

P2PTV Guide

P2P Networking

Industry News

Data Bank

Techno Features

Anti-Piracy

September 13, 2010
Volume XXXII, Issue 3


Mainline BitTorrent Client Offers Application Support 

Excerpted from The Inquirer Report by Lawrence Latif

Industry-leading BitTorrent has updated its "Mainline" BitTorrent client, bringing with it support for in-client applications.

The capability has been enabled in a beta version of the upcoming Mainline 7.1 client, which the company says will be available within two weeks. Third-party JavaScript applications will allow users to access games, media, and music all within the confines of the client. A similar system has been available in the popular uTorrent client, also maintained by BitTorrent.

While the Mainline client doesn't enjoy the same popularity as uTorrent or Vuze, it is regarded as the benchmark "vanilla" client. Once open-source, for the past three years it has been closed-source and its feature-set has remained fairly conservative. The same can be said of its timeline for adopting in-client applications, and the move seems to be part of a wider initiative within the company.

In a bid to underscore the importance of in-client applications, BitTorrent's CEO Eric Klinker said, "This is our first step as part of a broader strategic effort to bridge the divide between creators and consumers. Many software and gaming companies are finding success with a 'freemium' philosophy, and with over 14 million BitTorrent Mainline users, apps are an ideal distribution platform for this new generation of developers and creators."

The firm also announced application program interfaces (APIs) for developers to conjure up new applications, with a software development kit (SDK) and applications released by BitTorrent coming under a three-clause Berkeley Software Distribution (BSD) license. It confirmed that new applications will be able to interact with other web services to grab data.

With in-client applications, BitTorrent is undertaking the final push to see that its protocol is used for more than just file sharing.

The Future of P2P in Game Delivery: Pillar or Pariah?

Excerpted from Solid State Networks Blog by CEO Rick Buonincontri

NOTE: This is the second in a series of posts on P2P technology and game publishing. See the first post, User Engagement: A Simple and Intuitive P2P Participation Policy and Solid State Networks' main P2P Best Practices page. Please subscribe to SSN's news feed or check back soon for more. - Ed

As an early developer of commercial P2P solutions for games, Solid State Networks has often had to work to educate customers, partners and regulators about how the technology actually works, how it can/should be implemented, etc.

Still, "P2P technology" is commonly confused with the attributes of its implementation which can, in turn, result in unintended consequences. This happened in 2009 when the US House of Representatives introduced HR 1319, a bill also known as the "Informed P2P User Act."

This bill, as originally introduced, would have required all P2P applications to comply with a restrictive set of rules for the stated purpose of preventing the inadvertent sharing of personally identifiable information (PII) by any user of any P2P network. At the time, we thought a more appropriate name for this bill would have been The Uninformed P2P User Act.

Fortunately, through the tireless efforts of the Distributed Computing Industry Association (DCIA), led by its CEO Marty Lafferty, along with input from key DCIA Member companies including Solid State Networks, subsequent drafts of the bill included language that excludes certain P2P implementations that are clearly not capable of sharing PII.

These applications, such as those used for content delivery, serve a very different function than the file-sharing applications that have motivated the bill's sponsors to pursue this legislation.

As a result of the efforts from the DCIA and Congressional and FTC staff, commercial P2P delivery solutions will not be unduly impacted by HR 1319 should the bill (which has since passed in the House) become law.

HR 1319 was indeed a close call and should be something of a wake-up call for both P2P vendors and the game publishers that benefit from P2P technology. Game publishers, whether you are using a commercial P2P solution or rolling your own P2P implementation, should be on the lookout for things that might impact your ability to utilize P2P technology in the future.

There are still many open questions with respect to the use of P2P technology for game delivery (or any type of content files) to and among consumers that need to be understood for P2P to mature as a widely adopted consumer technology, such as:

Who owns the resources that power a P2P network? Who has the right to use these resources?

How can a player's resources be controlled and used? How should they be used?

How are decisions made concerning the use of these resources? Who should make those decisions?

What role should players have in how their resources are utilized?

What are all of the benefits of P2P? How are those benefits distributed among publishers, P2P vendors and players?

What are the costs of P2P? How are those costs distributed among publishers, P2P vendors and players?

What are the requirements for disclosure? Are the minimum legal requirements enough?

What risks and potential liabilities are associated with implementing P2P for game delivery? How can those risks be mitigated?

Today, Solid State Networks is publishing the first edition of The Game Publisher's Guide to P2P Delivery. This guide is designed to help you understand P2P technologies and their various implementations so that you can formulate your own answers to these questions.

Solid State Networks will be updating this document as necessary to reflect changes in technologies and in the games industry. To get the document, please click here and use the simple request form. Solid State Networks will e-mail you a link to it.

Report from CEO Marty Lafferty

We are encouraged by the call-to-action this week from UK Music, essentially asking for a truce in that nation between the music industry and the telecommunications and software industries with whom it has been in conflict for the past decade.

Steps towards resolving fundamental issues regarding licensing, content protection, and digital distribution should serve as a powerful wake-up call to other regions of the world as well.

For more than seven years, the Distributed Computing Industry Association (DCIA) has been seeking this kind of detente and working to advance practical solutions for the core concerns that have kept content rights holders, starting with the music industry, from harnessing the benefits of distributed computing technologies for the delivery and monetization of their works.

UK Music is an umbrella organization representing the collective interests of the UK's commercial music industry, from artists, musicians, songwriters and composers, to major and independent record labels, managers, music publishers, studio producers and collecting societies.

The DCIA, meanwhile, is an international trade organization focused on commercial development of peer-to-peer (P2P), cloud computing, file-sharing, and related technologies. The DCIA facilitates a number of working groups, focused on solving specific problems associated with critical industry activities, ranging from digital rights management (DRM) to bandwidth utilization.

UK Music's members include the Association of Independent Music (AIM), the British Academy of Songwriters, Composers & Authors (BASCA), the British Recorded Music Industry (BPI), PRS for Music, the Music Managers Forum (MMF), the Music Producers Guild (MPG), the Music Publishers Association (MPA), the Musicians Union (MU), and Phonographic Performance Limited (PPL).

DCIA Member companies include industry-leading software developers and distributors, Internet service providers (ISPs), content rights-holders, and service-and-support firms. There is a complete listing of DCIA Members with links to their corporate websites on the DCIA primary website homepage.

The core goals of UK Music are to promote awareness and understanding of the interests of the UK music industry at all levels; the value of music to society, culture, and the economy; intellectual property (IP) rights and how they protect and promote creativity; and the opportunities and challenges for music creators in the digital age.

The DCIA's bylaws serve as the framework to address vital issues surrounding sanctioned distribution of copyrighted content, including: (i) establishing business and technical standards to commercialize distributed Internet computer systems, while protecting the interests of stakeholders; (ii) encouraging the voluntary adoption of those standards in affected industry categories; and (iii) shaping public policy and promoting consumer awareness.

Feargal Sharkey, UK Music's CEO, made the subject remarks on Thursday - with a decidedly jingoistic slant - at the Britain's Digital Future conference sponsored by the Westminster Forum in London.

He said that the music and technology industries must finally set aside their differences and come together around a common goal: the ascendancy of British music to a position of world dominance. Perhaps such patriotic zeal is what's been missing from what has been a longstanding inter-industry dispute around the world.

In any case, we hope that Sharkey's competitive energy will motivate other content industry representatives in other regions also to move towards collaboration rather than confrontation with telecommunications providers and software developers.

This marked a change from Sharkey's previously stated view, which typically has been to oppose Internet users copying and redistributing music tracks by means of file-sharing applications. But now Sharkey says he's recognized that these relatively new delivery technologies and the creative content that they help music fans access are in fact inseparable.

"The optimum solution is to create a workable marketplace where the time and effort musicians put into producing work is properly remunerated and protected, and it's time creative industries and Internet service providers (ISPs) sat down together and, for once, had a grown up conversation about how to do this," he said.

"Our future is now totally dependent, totally entwined, totally symbiotic."

There's no question that Sharkey's call-to-arms for British music and technology producers was characterized by a sharp pro-UK edge: "By 2020, we want to rival the United States as the largest source of repertoire and artistry in the world."

"In short," he said, "We want to be number one."

If arousing nationalistic fervor will serve to inspire industries that have come to oppose one another to make the necessary changes to work together, then that is welcome, and should be emulated wherever it can be.

Along with his motivational thrust, Sharkey also called for a "reality check." By that, he meant that rather than being obsessed with the short-term goals of subscriber and user acquisition, broadband providers and software distributors should consider the sustainability of their business models.

"It's now imperative that we all look forward and move towards what I hope will be a very bright and successful future, where the British music industry can in ten years time become not only number one, but in doing so take British technology companies with us."

We believe a thorough examination of this area will yield the most valuable results to date since the advent of the so-called digital realm. Every participant in the distribution chain - from content rights-holder to ISP to client or website - stands to generate more revenue through a collaborative approach than by maintaining the current status quo.

While content providers suffer from both leakage and untenable enforcement expenses to combat copyright infringement, broadband network operators for the most part do not yet earn value-added revenue by having their services directly associated with content delivery and protection. Meanwhile, consumer-facing applications and the vast majority of other content-centric web presences are not yet even licensed. We support Sharkey's appeal for a change from that to "the ultimate solution," which will be a true digital music marketplace.

Also of note at this conference were comments made by Ian Livingstone, President of gaming firm Eidos, who argued that more needs to be done to highlight the gaming industry.

"Gaming is bigger than other creative industries. Sales will hit $90 billion by 2015, but it is still viewed with suspicion as a genuine market or career opportunity," he said. "Game production requires a vast number of highly skilled people, so universities and schools need to teach these skills," he said. Share wisely, and take care.

Ping: Why all the Fuss? No One I know Uses iTunes Any More

Excerpted from Telegraph Report by Milo Yiannopoulos

Apple's new social network for music, Ping, has taken an almighty battering, even from Apple fans.

But all the time, I've been thinking: do people actually still use iTunes? Why? Along with most of my friends, I abandoned Apple's music store in favor of subscription P2P streaming service Spotify a long time ago. And those of us on Spotify aren't impressed with a few half-baked attempts at social music discovery.

We've been nosing around our friends' playlists for months, checking out what our co-workers are listening to and unearthing embarrassing guilty pleasures buried deep within their music collections (let's be honest, the best bit about social software is the opportunity for a good snoop).

We've also had the excellent ShareMyPlaylists.com, a third-party solution that now lists over 25,000 playlists and enables you to follow people and artists. There are only 20,000-odd registered users on the site so far, but it's growing fast and the community is very active. It's a great way to discover new music and founder Kieron Donoghue tells me there's a relaunch with a new design in the pipeline at the moment.

Ping, on the other hand, seems to be designed to coax you into purchases of mainstream artists' tracks. But it's an open secret in the technology community that, despite creating beautiful, industry-leading MP3 players and presiding over the largest electronic music store in the world, Apple's CEO Steve Jobs has shocking taste in music. Perhaps that explains the confused look on a million fanboys' faces when they first logged into Ping, and were presented with recommended artists like Lady Gaga, Jack Johnson, and Coldplay. Cue thousands of snooty tweets from snobby "alternative" fanboys about talentless, pre-packaged mainstream artists.

So while a social network based around music discovery - and, ultimately, purchase - is a no-brainer for Apple, they need to rethink the focus. It can't be about selling me albums from Jobs's own favorite artists - heaven forbid - nor simply promoting bands with whom Apple has advertising relationships. A social network for music needs to be about the artists I and my friends like. That's what "social" means. In other words, "social network" is not another way of saying "shop". Perhaps that's why Apple has been strangely silent about social until now: it hasn't a clue how a system like that works.

There's a clue in Apple's own website copy that suggests even they know Ping isn't good enough: "Now your music is more social," it says. Not social, but more social. Well, I'm sorry, but it's going to take a lot more than a perfunctory nod toward networking - one that doesn't even allow me to use Facebook Connect to find my friends - to lure me back to that clunking, over-featured resource-hog, iTunes. Sorry, Steve. Must try harder.

A Start-Up That Will Change Your Life

Excerpted from Time Magazine Report by Gary Moskowitz

Despite co-founding Spotify, a P2P music-streaming service, Swedish-born Daniel Ek loves vinyl records. But don't be fooled. Ek is intent on making music more accessible and mobile, and his iPhone and digital-music library are never far from his side.

"People amass more music now than ever, but it's not about ownership. It's about accessibility," he says. "We're not selling tracks; we're selling access."

Spotify is a small downloadable application that allows access to a massive database (8 million tracks and 200 million user-generated playlists so far) of streamable music. Spotify aggregates content from rights holders, and that music is then made available through free, ad-funded or subscription-based downloads.

Ek's goal is to give his 500,000 subscribers all the music they want at a nominal fee - $15 a month in the UK. Some rights holders, like AC/DC, have balked, but services like Pandora have already proved the concept. "It took Facebook at least five years. Before that, everyone said it would never work," Ek says. "It takes time. Now look at Facebook."

Google TV Revealed: One Screen to Rule Them All

Excerpted from Wired News Report by Eliot van Buskirk

Google gave a live demonstration of Google TV at Berlin's IFA Tuesday, and CEO Eric Schmidt promised it would be a couch potato's dream come true.

"Once you have Google television, you're going to be very busy," Schmidt said. "It's going to ruin your evening."

Google TV is the search giant's bid to bring the web to the biggest screen in the house in a big way, something TV viewers and web surfers (often the same person) have tended to resist as distinctly different experiences. But as the Internet becomes a more viable delivery system for the kind of content we associate with the Barcaloungers and TV sets, Google, Apple, and others are trying to get a piece of that action as well.

Google TV is essentially an interface, blurring the distinction between the programming you get from your cable or satellite provider and search - Google's bread-and-butter. It is set to launch on a Sony HDTV, a Sony Blu-ray player and a Logitech set-top box in the United States this fall (other countries to follow), each with its own "incredible" remote control. After tapping a search button, a single box appears for searching the web, live television programming, recorded shows, on-demand programming, pay TV, online video clips and more.

"You would never want to buy a computer without an Internet browser these days," said Google TV Product Marketing Manager Brittany Bohnet. "Soon, you're never going to want to buy a TV without an Internet browser."

Google, which demonstrated the service on a generic Logitech box with a Dish Network DVR, is working on custom remote control hardware for the Chrome-powered Google TV in conjunction with Sony and Logitech that will likely include - as did Bohnet's wired demonstration keyboard - a full QWERTY keyboard, a pointing mechanism and television-specific buttons (volume, etc.).

But you'll also be able to use your Google-powered Android phone - or even an iPhone - as a Google TV remote. In addition to being convenient, this adds the ability to control the set using voice commands. Screaming at your TV may still have the same ineffectual result, but now, at least, you could say a channel name and your television would switch.

Perhaps more to the point: Google believes that keyboard-less search reduces the friction for web-based inquiries that it thinks people want to make concurrently with watching TV - and on the same screen instead of the tablet you have on your coffee table or the smart-phone in your pocket. In Bohnet's example, someone watching a show about Ferraris can price them via web search, although shopping is not (yet) directly incorporated into shows.

A search for "Star Trek" would reveal television episodes from your cable/satellite provider, video clips from the show on the web, websites about "Star Trek," and the option to pay your cable or satellite provider for on-demand versions of the movies. Wherever "Star Tre"k is - whether online or in your television provider's offerings - Google TV promises to find it. This is a powerful new way of searching, and Google does search well.

For Google's own YouTube videos, Google's hardware partners include a graphics accelerator to help render HD videos from YouTube. According to the company, more YouTube users are choosing to upload HD video these days, so if you're a heavy YouTube user, Google TV promises to make those videos look as good as they possibly can on a large screen.

Looking even further out, things like Google TV could make it more palatable to get mainstream TV shows over the Internet - just as we have become used to streaming and downloading movies. It should be noted that, perhaps not coincidentally, Google's YouTube itself plans on offering mainstream movie rentals this year. It's also no surprise that Google's own YouTube already works pretty well on Google TV via a new version of "LeanBack," which defaults to the high-definition versions of YouTube videos.

Any hope that this device will allow you to "cut the cord" and free yourself from monthly cable or satellite bills is, for now, unfounded, because it only works to its full extent if you're paying for cable or satellite. When the day arrives that Hulu can deliver TV shows straight to your TV, who will need them?

Though they are not exactly comparable, Google TV invites comparisons to Apple TV, which was slimmed down, simplified and cut to $99 in an upgrade announced last week. Apple TV doesn't actually contain much TV - only Fox and ABC signed on to offer 99-cent TV show rentals - and Google said nothing at IFA about any content deals. The Wall Street Journal reported three weeks ago that talks with the networks were not going as well as Google had hoped.

It will also not act like a DVR, so for pausing and fast-forwarding you'll still need your cable company's box (or the vastly superior TiVo).

That said, Google announced at IFA that it plans to add Android apps in "early 2011." Apple TV is not running iOS, so it can't - one of the five reasons we found it still boring.

What Google won't do is get into the content creation business. "There's a line that we decided not to cross," said Schmidt. "We want to work with content providers; and we're very unlikely to do any content production."

Accedo and Widevine Partner to Provide OTT Video Services

Accedo Broadband, the leading provider of TV app stores and applications, and Widevine, a provider of digital entertainment solutions, this week announced a partnership to deliver over-the-top (OTT) video services. Widevine's adaptive streaming, virtual DVD controls and digital rights management (DRM) platform will be available for integration with Accedo's application solutions. With this partnership, operators will be able to deliver the highest-quality viewing experience to their customers using Accedo applications.

The Widevine platform is deployed by major Internet content services and large cable, satellite and telecommunication companies launching over-the-top and TV Everywhere strategies. The company's software platform optimizes the entertainment experience for content delivered over any network to any device. The solution is natively supported in nearly all major brands and types of network connected consumer electronics including televisions, Blu-ray players, mobile devices, gaming systems and more.

"With the proliferation of TV devices with over the top delivery of premium video content, DRM solutions are more important than ever," commented Michael Lantz, CEO, Accedo Broadband. "We are excited about our new partnership with Widevine, which is enabling us to provide our customers with one of the market leading content protection solutions in Connected TV apps."

"Accedo is one of the global leaders in providing apps for IPTV and Connected TV. We are very happy to be able to jointly offer proven solutions to the market," commented Brian Baker, CEO, Widevine. "We are already working together on several great next-generation video services for OTT distribution and we look forward to bringing the partnership to the next level."

Accedo and Widevine will demonstrate Connected TV applications for Blu-ray players and TVs on their respective stands at IBC in Amsterdam. Widevine and Accedo can be found in the IPTV Zone, in stands 314 and 703 respectively.

Accedo Broadband is the leading provider of applications for IPTV and Connected TV. Accedo Broadband provides the largest available application store for IPTV and Connected TV containing, for example, IPTV games, quizzes, puzzles, video art, music, karaoke, lifestyle, niche sports, weather, social media and communication services. Accedo's Funspot gaming service is the most widely deployed IPTV gaming service in the world.

Accedo is a privately held company founded by telecom and media entrepreneurs Michael Lantz and Fredrik Andersson primarily backed by Swedish VC Industrifonden. Accedo Broadband is headquartered in Stockholm, Sweden with branch offices in London, San Francisco and Hong Kong.

Adobe Flash Pushes into the Enterprise with P2P, IP Multicast

Excerpted from NewTeeVee Report by Ryan Lawler

Adobe announced the latest version of its Flash Media Server (FMS) today, with new features aimed squarely at making it the streaming server of choice for enterprise webcasts and other communications.

Previous updates to Adobe's Flash Media Server - like FMS 3.5, which introduced HTTP streaming - were focused mainly on making streaming servers better suited for use by media companies. But the latest update is aimed at attracting potential new customers on the enterprise side of things.

The biggest additions to FMS 4 are the availability of IP multicast as well as Adobe's proprietary Real Time Media Flow Protocol (RTMFP) P2P technology. With IP multicast, Adobe is enabling enterprises to deliver live events behind the corporate firewall with a single stream, rather than delivering a separate stream for each user or connection. With its P2P-based RTMFP technology, enterprises can dramatically reduce bandwidth costs for large-scale events.
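The bandwidth argument is easy to see with back-of-envelope arithmetic. The sketch below is illustrative only - the bitrate, audience size, and function names are invented here, not drawn from Adobe's products: with unicast, the origin's outbound bandwidth grows linearly with the audience, while multicast holds it roughly constant.

```python
def server_bandwidth_mbps(bitrate_mbps, viewers, multicast=False):
    """Bandwidth the origin server must emit for a live event.

    With unicast, every viewer needs a separate copy of the stream;
    with IP multicast, the origin sends one copy and the network
    replicates it to each subscribed receiver.
    """
    return bitrate_mbps if multicast else bitrate_mbps * viewers

# Hypothetical all-hands webcast: a 2 Mbps stream to 5,000 employees.
unicast = server_bandwidth_mbps(2.0, 5000)            # 10,000 Mbps
mcast = server_bandwidth_mbps(2.0, 5000, True)        # 2 Mbps
print(f"unicast: {unicast} Mbps, multicast: {mcast} Mbps")
```

The same arithmetic explains the appeal of P2P on the open Internet, where routers generally won't forward multicast: peers, rather than the network, do the replication.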

Not all versions of the new Flash Media Server will have these new features. In fact, with the launch of FMS 4, Adobe is breaking out its streaming server into three different products, each with different capabilities. The Flash Media Streaming Server 4 is Adobe's basic offering, with live and on-demand streaming, as well as RTMPE content protection, priced at $995.

The Flash Media Interactive Server 4, which costs $4,500, takes that one step further, with support for HTTP streaming, IP multicast and multi-user capabilities. The Flash Media Enterprise Server 4, meanwhile, offers all of those features plus RTMFP, enabling enterprises to blend its IP multicast and peer-to-peer delivery capabilities to increase the efficiency of video delivery behind the firewall and across the broader Internet. (Adobe didn't provide a price for the Flash Media Enterprise Server.)

Adding IP multicast will allow Adobe Flash to finally compete with Microsoft's Windows Media for enterprise webcasting behind the firewall. Before this announcement, Windows Media (and to a lesser extent, Real Player) acted as a de facto solution for webcasting, because it had multicast capabilities behind the firewall that Adobe didn't.

The addition also gives Adobe a new addressable market in the enterprise, which becomes important as media companies and content delivery networks that once relied on FMS for streaming services have transitioned to more scalable and less pricey commodity implementations of HTTP-based Flash streaming. Akamai, for instance, rolled out Flash streaming without Adobe FMS servers with the introduction of its HTTP-based Akamai HD Network last September.

Cloud Computing: The Invisible Revolution 

Excerpted from OS News Report by David Adams

I attended VM World last week, and as you might imagine, it was "cloud computing" this and "cloud computing" that the whole time. The hype factor for the cloud is in overdrive right now. But is it warranted? A lot of people, even tech-oriented ones, outside of the data center sysadmin types, wonder what all the hype is about. I've come to believe that cloud computing is a major computing revolution, but for most computing users, it's an invisible one.

Geek ambivalence about cloud computing is interesting, because it's not as if it's a new phenomenon. In tech years, the idea has been around for ages. But part of the problem is that the actual definition of cloud computing isn't all that easy to pin down. And marketers have been fond of labeling things cloud computing when the connection is only peripheral - a shameless ploy to capitalize on a hot trend. But in a nutshell, here's what I think is the essence of the cloud computing concept.

There are two technological innovations that, available together, make cloud computing possible: ubiquitous Internet access and advanced virtualization technology. With virtualization, a "server" doesn't have to be a physical machine. In the olden days, if you wanted a server, you had to procure a physical machine, or access to one. If you thought your needs would scale up, you would get a more powerful machine than you currently needed, just in case. And once you came close to outgrowing that machine, you would need to either get a new machine and migrate your system over, or scale out to more machines, by spreading components, such as a database, onto their own servers, or doing load balancing between two. System administrators were constantly trying to strike a balance between paying for capacity that would never be used and dealing with problems or outages caused by usage spikes due to not scaling out quickly enough. And scaling out was sometimes very hard. Moving a running, mission-critical system from an old server to a new, faster one was no picnic.

Virtualization made it possible to decouple the "server" from the server hardware. So if you needed more capacity (processor cycles, memory, or storage) you could scale out your server to new hardware, even if that meant moving to a new data center in a different part of the world, without all the fuss. And the ubiquitous network made it easier for the people who used these services to access them even if IT managers started to aggregate them into centralized "cloud" data centers. So this meant that a small startup could order a server from Amazon and never have to worry that they didn't order one that would be powerful enough if they hit the big time. A researcher would be able to build a system to crunch some numbers for three weeks, then just delete it when the calculation was done. And a large company's IT managers could start to decommission the various server boxes that were spread out in closets in offices around the country, and instead provision instances from their centrally-managed "private cloud".
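The decoupling described above can be sketched as a toy model - every name and capacity below is invented for illustration, and real hypervisors do vastly more. The point is that a "server" becomes a record that a scheduler can place on, and later migrate between, physical hosts without the guest noticing.

```python
class Host:
    """A physical machine with a fixed amount of RAM (in GB)."""
    def __init__(self, name, ram_gb):
        self.name, self.ram_gb = name, ram_gb
        self.vms = []

    def free_ram(self):
        return self.ram_gb - sum(vm["ram_gb"] for vm in self.vms)

def place(vm, hosts):
    """First-fit placement: run the VM on any host with enough free RAM."""
    for host in hosts:
        if host.free_ram() >= vm["ram_gb"]:
            host.vms.append(vm)
            return host
    raise RuntimeError("cluster out of capacity - time to buy hardware")

def migrate(vm, src, dst):
    """Move a 'server' to different hardware; the guest OS never notices."""
    if dst.free_ram() < vm["ram_gb"]:
        raise RuntimeError("destination host lacks capacity")
    src.vms.remove(vm)
    dst.vms.append(vm)

hosts = [Host("rack1-a", 32), Host("rack1-b", 32)]
web = {"name": "web01", "ram_gb": 8}
db = {"name": "db01", "ram_gb": 16}
place(web, hosts)                  # lands on rack1-a
place(db, hosts)                   # also fits on rack1-a
migrate(db, hosts[0], hosts[1])    # drain rack1-a for a hardware upgrade
```

In the pre-virtualization world, that last line would have been a weekend of downtime and a migration plan; in a cloud data center it is routine.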

I think the reason that so many geeks don't really understand what the big deal is over cloud computing is that unless you're running a big data center, you're not really the one who's reaping the direct benefit of cloud computing. I blame the marketing, to some extent. We hear about various cool web services, like Evernote or Dropbox, or even "operating systems" that depend on "the cloud", such as ChromeOS or eyeOS. (By the way, use Evernote and Dropbox.) But from the point of view of the end user, cloud computing is just a fancy word for web hosting. If I'm using Dropbox, I don't really care if the storage is on the cloud or on a big old-fashioned storage server somewhere. As long as the service is reliable, it doesn't matter to me. But it sure matters to the poor sap who has to maintain the Dropbox servers. Cloud computing makes it much easier for the company to grow as it needs to, even change hosting providers if necessary, without disrupting my service and without wasting money on unused capacity "just in case."

I guess the other big recipient of the value of cloud computing is the accountant (which would be another reason why the geeks wouldn't really get it, unless you're an accounting geek). Another buzzword that's commonly associated with cloud computing is "utility computing," which basically means that you pay for computing resources as a metered service, just like you would electricity. For the CFOs of the world, it means that you don't spend a bunch of money on hardware that you may or may not be extracting full value out of. I think it's safe to say that most large companies only end up using a small percentage of the computing resources that they have sitting in the racks and on the desks in their buildings, so from an efficiency standpoint, it's better to pay for what you use, even if theoretically you're paying a higher rate for each unit of potential processor cycle. The old way wastes time, money, and electricity.
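The CFO's arithmetic is easy to make concrete. All of the numbers below are made up for illustration, but they show how metered billing can win even at a higher unit rate, as long as average utilization is low:

```python
# Utility-computing back-of-the-envelope, with entirely hypothetical numbers.
# Owned server: you pay for 100% of capacity whether you use it or not.
owned_monthly = 1000.0          # amortized hardware + power + admin, $/month
utilization = 0.10              # fraction of capacity actually used

# Metered cloud: a higher rate per unit of capacity, but only for what you use.
metered_rate_multiplier = 3.0   # assume the cloud charges 3x per unit
metered_monthly = owned_monthly * metered_rate_multiplier * utilization

print(f"owned:   ${owned_monthly:.2f}/month")
print(f"metered: ${metered_monthly:.2f}/month")  # cheaper despite the 3x rate
```

With these assumed figures the metered bill comes to $300 a month against $1,000 for the idle-most-of-the-time box, which is the "pay for what you use" efficiency argument in miniature.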

So this is OSNews, and we primarily concern ourselves with operating systems here. Where do OSes fit into this new world? Well, virtual servers are still servers, and each and every one still needs an OS. What we've done is insert a "sub OS" called a (Type 1) hypervisor under the regular OS, and that hypervisor allows one or more servers to run one or more OSes or OS instances. You could have one OS instance spread across multiple physical machines, or hundreds of OS instances on one machine. A Type 2 hypervisor allows a guest OS to run inside a host OS, which is also useful, but is used for a very different purpose. Depending on the platform, a VM can be moved from one type of hypervisor to another, so you might configure a new server in a VM running as a guest on your laptop, then transfer it to run on a "bare metal" hypervisor in a cloud hosting environment when you launch to production.

One aspect of the OS world that's made more complicated by cloud computing is licensing, and Microsoft in particular is in a kind of difficult position. One of the advantages of cloud computing is that you can turn server instances on and off willy-nilly, as you need them. You can copy an entire instance from one datacenter to another, or clone one and make a derivative. But when you have to deal with whether the OS and software on that VM you're moving around is properly licensed, it adds a whole layer of unwelcome complexity to the mix. That's one reason why Linux, MySQL, and other open source software have been very popular in cloud environments.

If you buy cloud hosting from Amazon, they just build the Windows license into the fee if you order a Windows instance. But if you're using a lot of capacity at Amazon, you end up getting kind of a bad deal, and you'd be better off buying your Windows server licenses at retail and installing them yourself on Amazon's VM.

And virtualization technology is getting bundled with operating systems more and more. Microsoft has its own hypervisor, which is included with Windows Server 2008. It's just one of the commercial virtualization platforms that's available today.

Another reason why cloud computing is an invisible revolution is that a lot of what's happening lately is in the arena of the "private cloud". OSNews' trip to VMworld was sponsored by HP, which is putting a huge amount of effort into helping its enterprise customers replace their current infrastructure, which could easily be described as "servers here, servers there, servers we don't know about, we can't keep track of them all, and we certainly can't properly maintain them all". And one of the reasons why the server situation is so chaotic at big companies is that when someone needs a new server for something, and they contact IT, they get the runaround, or they're told they'll have to wait six months. So a lot of the innovation recently is around helping big companies set up a centralized data center where the whole thing is a private cloud, and when someone in some branch office needs a new server, one can be provisioned with a few keystrokes.
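The "few keystrokes" self-service model amounts to drawing from a shared capacity pool instead of procuring a physical box. Here is a minimal sketch with an entirely hypothetical `CapacityPool` class (no real product's API), just to show the mechanic:

```python
class CapacityPool:
    """Hypothetical private-cloud pool: branch offices draw VMs from shared capacity."""
    def __init__(self, total_cpus):
        self.free_cpus = total_cpus
        self.allocations = {}   # requester -> cpus granted

    def request_server(self, requester, cpus):
        """Provision instantly if capacity exists -- no six-month wait."""
        if cpus > self.free_cpus:
            raise RuntimeError("pool exhausted; time to add hardware to the datacenter")
        self.free_cpus -= cpus
        self.allocations[requester] = self.allocations.get(requester, 0) + cpus
        return f"vm for {requester}: {cpus} cpus"

    def release(self, requester, cpus):
        """Return capacity to the pool when a server is decommissioned."""
        self.allocations[requester] -= cpus
        self.free_cpus += cpus

pool = CapacityPool(total_cpus=128)
pool.request_server("branch-office", cpus=8)   # provisioned with a few keystrokes
```

The chaos of "servers here, servers there, servers we don't know about" disappears because every allocation is recorded centrally at the moment it is granted.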

The people ordering the new servers don't even need to know it's a cloud. They don't care. All they know is that suddenly their IT people are getting things done a lot quicker. So again, to the outsider, it just looks like regular old hosting, or regular old IT provisioning.

So what about the so-called cloud OS? Where does that fit in? I'm afraid a lot of that is marketing hype, because for the user of a cloud OS, it doesn't really matter whether the apps and storage they're accessing over the network are stored in a cloud or on a regular old server. But the reason that it's meaningful is that it would be impractical for any company to offer a server-based desktop user experience without having cloud computing backing them up on the server side. It would just be too difficult to deal with the elasticity of demand from the users without the flexibility that comes from virtualization.

I think the reason for the marketing hype is that people are inherently wary about their "computer" not really existing inside the actual case that's on their lap or under their desk. Both novice and advanced computer users are nervous, though for different reasons. For some reason, the idea that their computer exists "in the cloud" is just inherently less scary than "it's on our server, in a rack, in our datacenter, in California". Though in reality there's barely any distinction. And until "the cloud" becomes the only way that anyone hosts anything (at some point, after all, movie studios stopped advertising that their films were "in color!"), I think marketers will keep making the distinction.

But don't let the hype and misdirection distract you from the real issue. We're in the midst of a huge revolution. And one of the reasons that a lot of people fail to appreciate the big deal is precisely why it's a big deal: for the end user, the move to cloud computing is supposed to be transparent and painless. Even the programmers and power users and other geeks using these systems are just working like they always used to work, and that's the whole point.

MIT Study Suggests Social Networks Influence Behavior

Excerpted from Online Media Daily Report by Laurie Sullivan

Research conducted at the Massachusetts Institute of Technology (MIT) could provide insights for technologists designing the next wave of social networks.

The two-year study spearheaded by Damon Centola, an assistant professor at the MIT Sloan School of Management, suggests that events on social networks can change behavior for health-related practices, but digging deeper into the findings appears to reveal much more.

The study set out to determine whether linking the same people in groups through social networks and the Web could affect diffusion dynamics, depending on how people connect with each other. It turns out that it does. Dynamics make a significant difference in how information spreads, according to Centola.

To study the difference that a social network makes, Centola developed and ran a series of experiments using an Internet-based health community where people could rate information and share it with friends. Two groups were created, each relying on the same people to influence events, but the friends connected to each other in different ways.

The 1,528 people in the study had anonymous online profiles and a series of health interests. Centola matched them with other participants who shared the same interests, calling them "health buddies." Participants received e-mail updates notifying them about activities of their health buddies. When friends of friends clicked on the form, their friends would be notified, too. Centola says that how far and fast the information spread depended on how people were connected.

Overall, 54% of people in clustered networks registered for the health forum, compared with 38% in the networks oriented around longer ties. The rate of adoption in the clustered networks proved to be four times as fast. People were more likely to participate regularly in the health forum if they had more health buddies who registered. Only 15% of forum participants with one friend in the forum returned to it, but more than 30% of subjects with two friends returned to it, and more than 40% with three friends in the forum made repeat visits. It's clear from this study that influencers affect the spread of information - something that Forrester Research analysts Josh Bernoff and Ted Schadler explain in the book titled "Empowered: Unleash Your Employees, Energize Your Customers, and Transform Your Business."

Structure makes the difference: longer ties that lead across the social space make the world smaller and make it easier to spread information faster, Centola says. The study finds that the structure of the population makes a difference in the way information spreads.

While connections spread information, does the spread of information actually influence behavior? "The networks that make information spread more quickly actually make behavior spread more slowly," Centola says.

The findings have important implications for the health industry and could influence the way that social networks like Facebook, LinkedIn, and other online communities design future features. They provide insights on targeting different groups of people in advertising campaigns and on how certain types of populations, depending on their structure, could become more of an influence in friends and family networks. These networks could become more effective in the adoption of certain products.

While Centola did not design the study for marketers, it's easy to see how a cluster of connected networks or communities could facilitate the spread of a new product adoption. It also identifies how marketers could target specific clusters of the population.

Syniverse Technologies Achieves New Milestone in Messaging Traffic 

Excerpted from TMCnet Report by Rajani Baburajan

Syniverse Technologies, a company that provides technology and business services for the telecommunications industry, announced it has attained a new messaging traffic milestone in the second quarter of 2010.

Consumers worldwide are turning to their mobile devices with even greater frequency to text and tweet with friends and family, share pictures and video, manage their financial accounts, and interact with their favorite brands, company officials stated.

On average, Syniverse moved a total of 1.6 billion mobile messages per day in the second quarter of 2010, an increase of 23 percent from a year ago.

The peer-to-peer or "P2P" messaging volumes, which include both SMS and MMS, were up 23 percent quarter-over-quarter. While MMS traffic continued with an explosive 202 percent growth rate year over year, SMS traffic achieved 22 percent year-over-year gains, the company noted.

"With SMS now enabled on nearly 100 percent of the world's handsets - both smart devices as well as inexpensive feature phones - all types of mobile providers as well as new entrants to mobile are motivated to take advantage of its broad reach," Tony Holcombe, President and CEO, Syniverse, said.

According to Holcombe, the company added more than 15 new SMS customers across the globe in the second quarter alone.

With more than 500 million messaging-enabled mobile devices currently active, Syniverse has observed an interest in messaging within Latin America over the past few months.

Portio Research's "Mobile Messaging Futures 2010-2014" report also supports this trend.

The report forecasts that the SMS traffic in Latin America will grow from 250.2 billion messages in 2009 to 403.7 billion in 2014, while MMS traffic is expected to grow from 1.8 billion to 5.7 billion over the same period.

The company reported that it has signed almost 20 messaging contracts in Latin America this year. One of these is with Open Mobile, a long-time Syniverse roaming and networking customer.

"Syniverse's network and roaming solutions have served as the enabler of Open Mobile's services for years," Frank Bell, president and COO, Open Mobile, said. "When we decided to strengthen our mobile messaging capabilities, we knew that Syniverse was the right partner to engage because of its continued track record of superior customer service, reliability and flexibility."

Earlier this year Syniverse Technologies announced it has expanded its relationship with RealNetworks, a provider of digital entertainment services to consumers, to use that company's intercarrier SMS platform to support Syniverse's P2P messaging interoperability solutions, TMCnet reported.

Coming Events of Interest

NY Games Conference - September 21st in New York, NY. The most influential decision-makers in the digital media industry gather to network, do deals, and share ideas about the future of games and connected entertainment. Now in its 3rd year, this show features lively debate on timely cutting-edge business topics.

M2M Evolution Conference - October 4th-6th in Los Angeles, CA. Machine-to-machine (M2M) embraces the any-to-any strategy of the Internet today. "M2M: Transformers on the Net" showcases the solutions, and examines the data strategies and technological requirements that enterprises and carriers need to capitalize on a market segment that is estimated to grow to $300 Billion in the year ahead.

Digital Content Monetization 2010 - October 4th-7th in New York, NY. DCM 2010 is a rights-holder focused event exploring how media and entertainment owners can develop sustainable digital content monetization strategies.

Digital Music Forum West - October 6th-7th in Los Angeles, CA. Over 300 of the most influential decision-makers in the music industry gather in Los Angeles each year for this incredible 2-day deal-makers forum to network, do deals, and share ideas about the business.

Digital Hollywood Fall - October 18th-21st in Santa Monica, CA. Digital Hollywood is the premier entertainment and technology conference in the country covering the convergence of entertainment, the web, television, and technology.

P2P Streaming Workshop - October 29th in Firenze, Italy. ACM Multimedia presents this workshop on advanced video streaming techniques for P2P networks and social networking. The focus will be on novel contributions on all aspects of P2P-based video coding, streaming, and content distribution, which is informed by social networks.

Streaming Media West - November 2nd-3rd in Los Angeles, CA. The number-one place to come see, learn, and discuss what is taking place with all forms of online video business models and technology. Content owners, viral video creators, online marketers, enterprise corporations, broadcast professionals, ad agencies, educators, and others all come to Streaming Media West.

Fifth International Conference on P2P, Parallel, Grid, Cloud, and Internet Computing - November 4th-6th in Fukuoka, Japan. The aim of this conference is to present innovative research results, methods and development techniques from both theoretical and practical perspectives related to P2P, grid, cloud and Internet computing. A number of workshops will take place.

Copyright 2008 Distributed Computing Industry Association
This page last updated September 19, 2010