June 14, 2010
Volume XXXI, Issue 2
Internet Video to Hit 193 Million Users in the US by 2014
Excerpted from MediaDailyNews Report by Wayne Friedman
The number of US Internet video users will continue to climb in the next two-to-three years, growing at around 8-to-9%. And more of their viewing is coming from longer premium TV and movie content.
According to Internet researcher eMarketer, by 2014, 77% of all US Internet users - 193.1 million - will be watching some video content monthly. This is up from 67% of all US users, or 147.5 million.
Looking at October 2009, eMarketer said 20% of its respondents streamed at least one movie, while 13% downloaded a film. Those figures were up from six months earlier, when 14% of respondents streamed a movie and 7% downloaded a film.
The credit goes to the continued gains made by premium TV provider Hulu, which is second only to YouTube in overall video streams. Another strong factor is the increase in Internet-capable TV sets that allow viewers to screen video in a so-called "lean-back" environment.
Online viewing is growing across all demographics, with young viewers still leading the way. eMarketer says the biggest users are viewers aged 18-24, with 25-34 year-olds and teens not far behind.
The online video research cites data showing that 29% of all viewers under 25 get most - if not all - of their video online, versus 8% for the entire TV/video population.
Experts Debate Net Neutrality, BitTorrent Offers Technical Solution
Excerpted from BroadbandBreakfast Report by Lindsey Sutphin
Telecommunications experts on a Thursday panel took on the prickly issue of how the Federal Communications Commission (FCC) can best regulate high-speed Internet access.
At an event hosted by the Information Technology and Innovation Foundation (ITIF), presenters gave their opinions and predictions about broadband. All presenters advocated some type of Congressional action as a part of the solution to regulation issues.
Steven Teplitz, Senior Vice President at Time Warner Cable, was critical of the FCC's proposed "third way" solution, which would recognize only the transmission component of broadband as a telecommunications service.
He said Title I, which classifies Internet service providers as information service providers, was unsound, and that Title II, which classifies firms as telecommunications carriers, carried too much baggage - problems the third-way proposal would share. Instead, he encouraged a "fourth way," in which Congress would set up the policy framework and legal authority for regulation.
James Speta, a professor at Northwestern School of Law, said, "I think that the fact that the arguments are uncertain pushes me and others in the direction of a Congressional resolution because certainty is something this market can help with."
He also criticized the FCC's plan to regulate the Internet, citing the change in the Commission over the years and the possibility that a subsequent Commission could change the plan, therefore making it unstable and unreliable.
Free State Foundation President Randolph May also was critical of the FCC, saying: "It makes me want to put a big sign on the 8th floor where the FCC is, that reads 'When you're in a hole, the first thing to do is to stop digging.'"
He thought the FCC should use a "light touch" approach to Internet regulation, and said if the FCC needs more authority for Internet regulation, then Congress should define a policy framework for the regulation.
"Circumscribed market-oriented rule would provide the FCC with a principal basis for fact-based complaints alleging Internet service providers (ISPs) are not acting competitively, and at the same time causing consumers harm," said May. "Using antitrust-like jurisprudence that incorporates rigorous economic analysis, the Commission would focus on specific allegations that were causing consumers harm."
While most presenters focused on the debate over the FCC's legal authority, BitTorrent CEO Eric Klinker focused on a technical solution to the net neutrality argument. He talked about the development of the uTorrent Transport Protocol (uTP), which was designed to relieve Internet congestion by letting other Internet traffic take precedence. The technology, which he compared to cars pulling over to the side of the road to let ambulances go by, is a market solution that may not require as much drastic change to the Internet as other proposals, he said.
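The yield-to-other-traffic behavior Klinker describes is typically achieved with delay-based congestion control: the sender watches how much queuing delay it is adding and backs off before competing flows suffer. The sketch below is a simplified illustration of that idea, not BitTorrent's actual uTP implementation; the constants and function names are invented for clarity.

```python
# Minimal sketch of delay-based congestion control in the spirit of uTP/LEDBAT.
# Not BitTorrent's implementation; constants and names are illustrative only.

TARGET_DELAY_MS = 100.0   # queuing delay the sender tries to stay below
GAIN = 1.0                # how aggressively the window reacts
MIN_WINDOW = 2            # packets
MAX_WINDOW = 1000         # packets


def update_window(window: float, base_delay_ms: float, current_delay_ms: float) -> float:
    """Shrink the send window as queuing delay builds, grow it when the path is idle.

    base_delay_ms    - lowest one-way delay observed (estimate of the empty-queue delay)
    current_delay_ms - most recent one-way delay measurement
    """
    queuing_delay = current_delay_ms - base_delay_ms
    # Positive when there is headroom, negative when we are adding queuing delay.
    off_target = (TARGET_DELAY_MS - queuing_delay) / TARGET_DELAY_MS
    window += GAIN * off_target
    return max(MIN_WINDOW, min(MAX_WINDOW, window))


if __name__ == "__main__":
    window = 10.0
    # As measured delay rises above the 100 ms target the window backs off,
    # yielding bandwidth to other traffic; when delay falls, it grows again.
    for delay in [60, 90, 140, 220, 180, 110, 70]:
        window = update_window(window, base_delay_ms=50, current_delay_ms=delay)
        print(f"delay={delay}ms -> window={window:.1f} packets")
```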
Richard Bennett, research fellow with ITIF, lauded the technical approach presented by Klinker, saying it allocates network resources among Internet applications in a manner that satisfies more people.
He concluded that regulators have problems with the Internet because it does not act in the same manner as other communication technologies, and that it needs its own set of regulations instead of being placed within traditional classifications.
Report from CEO Marty Lafferty
In a week marked by escalating divisiveness among US governmental bodies regarding net neutrality, during which, for example, Congressman John Culberson (R-TX) threatened to ban the Federal Communications Commission (FCC) from using public funds in any new attempt to regulate the Internet, the DCIA welcomed the establishment of the Broadband Internet Technical Advisory Group (BITAG).
The central purpose of this new and timely initiative is to foster collaboration among technologically and professionally qualified participants on a full range of critical issues involved in network management that can affect the experience of end-users and Internet-based businesses.
It is not intended to usurp federal agencies, but rather to support them in defining the new roles that they need in order to enable the development and administration of beneficial regulations.
As is generally the case with such voluntary industry working groups, BITAG can benefit greatly from having well-crafted regulations act as an enforcement backstop.
BITAG should also work well alongside technical standards-setting bodies like the Internet Engineering Task Force (IETF), which encompasses BITAG's issues as well as broader concerns, and the DCIA-sponsored P4P Working Group (P4PWG), which is focused on a vital subset of BITAG issues.
It will also address related concerns that matter greatly to networked-device manufacturers and online content developers. Initial participants in this new coalition include such major players as AT&T, Cisco Systems, Comcast, Google, Intel, Microsoft, and Verizon, among others.
The DCIA believes that just as important as these directly affected key entities within the private sector, consumer interests should also be reflected in a meaningful way in BITAG. For this new effort to be as valuable as it can be, BITAG needs to strongly represent the voice of the end-user as well as relevant industry perspectives in its processes.
As we noted immediately after the appeals court ruled in April that the FCC lacks the requisite authority to regulate Internet activities, it should be very helpful at this juncture for industry participants and observers to come together through an initiative like BITAG.
We need to inform lawmakers in depth about technological developments in this space, explain how ongoing innovation continually improves network management processes for the benefit of consumers, and defuse policy disputes by preemptively advancing technological solutions and business practices.
The DCIA believes that the Internet has represented the greatest positive example of human progress over the past several decades - with enormous unrealized potential still ahead - while its many remarkable advances have been the least influenced in any helpful way by regulatory activity.
We urge officials to exploit BITAG as a new-age tool to complement their efforts in responding to network management issues that, if unresolved, could pose potential problems for their ultimate constituents, consumers.
We are hopeful that the basic modus operandi for BITAG's role, with federal agencies as proactive observers, will be as follows: 1) carefully identify the relevant parties to Internet-related disputes; 2) empower them to work out optimal technological and business-practice solutions; and 3) hold them accountable for doing so. The threat of bringing legal measures and new regulations to bear if parties fail to achieve acceptable results under such an MO will provide ample incentive for them to produce.
Since 2007, in parallel with the FCC's attempt to control network management practices that was overturned by the court, the DCIA has sponsored and facilitated the P4PWG. The intention of establishing this working group, which owes its existence to Verizon, Pando Networks, and Yale University, was to formulate an approach to P2P network traffic management as a joint optimization problem. Both Comcast and BitTorrent contributed substantially to the P4PWG and currently participate in leadership roles.
The objective of certain participating ISPs, for example, was to minimize network resource utilization by P2P services. The objective of certain participating P2P software firms, conversely, was to maximize throughput. The joint objective of both ISPs and P2P software developers was to protect and improve their customers' experience.
P4P itself was defined as a set of business practices and integrated network topology awareness models designed to optimize ISP network resources and enable P2P based content payload acceleration. P4P was successfully field-tested by a number of large and small companies and Yale University; and key results of these trials were published.
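As a rough illustration of what "integrated network topology awareness" can mean in practice, the sketch below shows a tracker preferring peers that an ISP-supplied cost map marks as nearby. It is a simplified example under assumed data structures, not the P4PWG's published interface; the partition (PID) names and cost values are hypothetical.

```python
# Illustrative sketch of locality-aware peer selection: a tracker prefers peers
# with a low ISP-provided "distance" so traffic stays on cheaper, closer paths.
# The cost map and PIDs below are hypothetical, not a P4PWG specification.

from typing import Dict, List, Tuple

# Hypothetical cost map an ISP might publish via a P4P-style interface:
# lower values mean cheaper paths between network partitions (PIDs).
PID_DISTANCE: Dict[Tuple[str, str], int] = {
    ("pid-east", "pid-east"): 0,
    ("pid-east", "pid-west"): 50,
    ("pid-east", "pid-transit"): 90,
}


def rank_peers(requester_pid: str, candidates: List[Tuple[str, str]],
               max_peers: int = 3) -> List[str]:
    """Return peer addresses sorted by ISP-reported distance from the requester.

    candidates is a list of (peer_address, peer_pid) pairs; unknown pairs fall
    back to a high default cost.
    """
    def cost(peer_pid: str) -> int:
        return PID_DISTANCE.get((requester_pid, peer_pid),
                                PID_DISTANCE.get((peer_pid, requester_pid), 100))

    ranked = sorted(candidates, key=lambda c: cost(c[1]))
    return [addr for addr, _ in ranked[:max_peers]]


if __name__ == "__main__":
    peers = [("10.0.0.5", "pid-west"), ("10.0.0.9", "pid-east"),
             ("10.0.0.7", "pid-transit"), ("10.0.0.2", "pid-east")]
    # Peers in the requester's own partition are handed out first.
    print(rank_peers("pid-east", peers))
```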
One lesson from this process was that it would not be inappropriate for ISPs to receive reasonable compensation from content providers using P2P for the services and delivery enhancements that ISPs may offer to them through capabilities like P4P. Alternative, flexible financial arrangements may assist ISPs by providing the appropriate financial incentives to add significant capacity for such services in better alignment with traffic demands.
In the second half of 2010, P4P will continue to provide a way to solve a pending bandwidth crisis before it becomes a serious threat, for both live streaming and file downloads, and to offer a means to collaboratively and cooperatively address future capacity concerns. New commercial products motivated by the P4PWG's work are being introduced, and the pace of progress is expected to accelerate.
There is now the potential to have carrier-grade P2P with P4P, which in turn will open opportunities for innovative new services, based on the certainty that the fastest path from point A to point B on a network is via P4P-enhanced P2P.
Benefits to consumers include faster payload delivery, higher quality of service (QoS), and potential assurances of not being subject to service interruptions or degradation. In short, P4P enables content distribution that is more efficient for both the consumer and the network operator compared to alternative architectures.
It is through such initiatives that we will help to continually improve the experience of end-users accessing the applications and content of their choice over the Internet.
We look forward to providing facilitator Dale Hatfield, Adjunct Professor at the University of Colorado, with distributed computing industry input regarding the operational and organizational structure of BITAG. It is important to establish effective working processes on a number of critical fronts to identify and address problems as they arise and before they erupt into conflicts that can put broadband network operators and Internet-based software providers seriously at odds with one another.
Arguably, a consensus around these processes for addressing problems in an area as fast-moving and complex as the cutting-edge of Internet technology implementation, with ever-evolving consumer expectations regarding usage and performance parameters, is more important than agreeing to condone the most attractive current practices.
Change comes so quickly that BITAG itself clearly needs to be "future-focused" to achieve its full potential value. In that regard, we urge DCINFO readers to peruse the Cloud Computing Bill of Rights: 2010 Edition. Share wisely, and take care.
Cloud Computing Is an Irreversible Trend
Excerpted from Information Week Report by Srikanth RP
The first company in the cloud computing arena to hit $1 billion in revenues, Salesforce.com showed the world that a cloud computing play was truly sustainable. Srikanth RP caught up with Jeremy Cooper, VP, Marketing, APJ, Salesforce.com to understand his perspective on how the cloud is transforming enterprise information technology (IT).
SRP: How do you see the evolution of the cloud, and how different do you think the 'cloud' will be in the next two years?
JC: In any era, businesses and technologies have to be constantly efficient. If the 1980s saw the transition from mainframes to client-server architecture, then this decade will see the transition from on-premise software to the cloud.
This can be seen from the fact that today, both applications and platforms are moving to the cloud. Today, the cloud environment makes it possible to run even complex business processes. Over time, as more enterprises start migrating to the cloud, I expect data centers in end-user companies to drastically reduce in size.
The focus of the IT function too will change. From managing IT infrastructure, the IT function will focus on driving innovation. Cloud computing is an irreversible trend, and for enterprises it is not a question of 'if they will adopt cloud computing technologies,' but rather 'when they will adopt or hop on to the cloud.'
SRP: What do you think will be the impact of cloud delivery models on independent software vendors (ISVs)?
JC: The global applications market has been traditionally dominated by the big software vendors.
However, with the emergence of cloud computing platforms, even smaller companies have an equal chance of competing with the big players. For example, entrepreneurs can quickly create applications using the Force.com platform - without the hassles of investing in infrastructure - and then reach out to a vast global customer base using our AppExchange marketplace.
Today, by using a simplified programming model, developers can build and test applications five times faster, and at about half the cost of traditional software platforms. To match this, existing vendors that have built their revenues on the traditional way of delivering software will now need to innovate to cost-effectively deliver applications that are also faster to deploy.
With the availability of easy-to-use application development tools for business people, customers will no longer be patient with tools that require deep technical skills.
SRP: Your company's most recent launch, Chatter, is being viewed as a competitor to Microsoft SharePoint and IBM Lotus Notes. What do you believe will be your core differentiators over the competition?
JC: There is a need for a social collaboration platform that is simple to deploy and use, and that is our primary goal in Chatter. Similar to social networking websites, users can decide to follow people, documents, and applications. For example, users can post status updates to share their communications with respect to a sales deal. Every connected entity will automatically receive updates whenever any changes have been made. Chatter will also allow more than 100,000 applications built on Force.com to have social collaboration features.
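The follow-and-update behavior Cooper describes maps onto a familiar publish-subscribe pattern; the sketch below illustrates that general pattern only and is not Chatter's actual data model or API.

```python
# Minimal sketch of the follow-and-feed pattern described above; a generic
# illustration, not Salesforce Chatter's data model or API.

from collections import defaultdict
from typing import Dict, List, Set


class Feed:
    def __init__(self) -> None:
        self.followers: Dict[str, Set[str]] = defaultdict(set)    # entity -> users
        self.timelines: Dict[str, List[str]] = defaultdict(list)  # user -> updates

    def follow(self, user: str, entity: str) -> None:
        """A user chooses to follow a person, document, or record."""
        self.followers[entity].add(user)

    def post_update(self, entity: str, message: str) -> None:
        """When the entity changes, every follower's timeline gets the update."""
        for user in self.followers[entity]:
            self.timelines[user].append(f"{entity}: {message}")


if __name__ == "__main__":
    feed = Feed()
    feed.follow("alice", "deal-4711")
    feed.follow("bob", "deal-4711")
    feed.post_update("deal-4711", "Stage moved to Negotiation")
    print(feed.timelines["alice"])  # ['deal-4711: Stage moved to Negotiation']
```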
Enterprises can also use Chatter to analyze conversations about specific products or campaigns in social networking websites such as Twitter and Facebook. Most importantly, unlike the competition, where users have to spend a considerable amount of time, energy and investment in configuring and implementing the software, we can automatically deliver Chatter as a service, and help enterprise users quickly use enterprise social networking functionalities.
Net Neutrality Group Signals Cooling of Hostilities
Excerpted from CNET News Report by Declan McCullagh
A new industry effort that bypasses Washington politicians and regulators indicates that a cooling of hostilities over net neutrality rules is underway.
Longtime political rivals including AT&T, Google, Comcast, Verizon, and Microsoft, among others, announced Tuesday they had joined together to form a technical advisory group to "develop consensus on broadband network management practices or other related technical issues that can affect users' Internet experience," including applications and devices.
The formal name of the effort is the Broadband Internet Technical Advisory Group (BITAG), which will be chaired by Dale Hatfield of the University of Colorado at Boulder, a former chief technologist of the Federal Communications Commission (FCC).
Tuesday's announcement was, in retrospect, almost inevitable. After a majority of the US Congress told the Democrat-controlled FCC not to slap strict net neutrality rules on broadband providers, there was little chance of new regulations. And in an election year dominated by discussions of jobs, the economy, and health care, regulating broadband providers is hardly a Congressional priority.
Supporters of net neutrality say that new Internet regulations or laws are necessary to prevent broadband providers from restricting content or prioritizing one type of traffic over another. Broadband providers and many conservative and free-market groups, on the other hand, say that some of the proposed regulations would choke off new innovations and could even require awarding e-mail spam and telemedicine identical priority.
If Congress does return to the topic in 2011, it's difficult to predict what might happen, and whether the Google-eBay-Amazon axis would prevail over broadband providers. Which is why both sides appear to have decided that having a series of informal discussions - far away from the halls of the FCC and Capitol Hill - might be more productive.
Adam Thierer, President of the free-market Progress and Freedom Foundation (PFF), called the BITAG a way to de-politicize "Internet engineering issues by offering an independent forum for parties to have technical disputes mediated and resolved - without government involvement or onerous rulemakings."
The plan is for BITAG to "function as a neutral, expert technical forum and promote a greater consensus around technical practices within the Internet community," Hatfield said. Among the factors that will be considered: whether a practice is commonplace, whether alternative technical approaches are available, and whether a technical practice is aimed at specific content, applications, or companies.
This is in part a reference to Comcast's controversial throttling of some BitTorrent transfers during periods of network congestion, which led to the FCC declaring the practice to be illegal. Comcast sued, and a federal appeals court in April unanimously sided with the broadband provider. By then, however, Comcast and BitTorrent had long since reached a peace accord.
For the last few years, liberal advocacy groups including Free Press and Public Knowledge had enjoyed a close alliance with Google and other web companies on the topic. (Some money has changed hands: Public Knowledge acknowledges receiving funds from Google, but won't reveal how much, and says Google's rivals also give undisclosed sums, which could be larger or smaller.)
Coalitions remain influential only if they can limit defections. For these advocacy groups, the danger is that their corporate allies might conclude the BITAG's work is sufficient and withdraw support for new laws and regulations, making their enactment much less likely.
And in fact, Free Press responded on Tuesday by claiming "this or any other voluntary effort is not a substitute for the government setting basic rules of the road for the Internet" and "there must be a separate FCC rulemaking process." Public Knowledge, too, said BITAG is "not a substitute for FCC rules and enforcement procedures."
In theory, many Democrats favor net neutrality. President Obama recently reiterated through a spokesman that he remains "committed" to the idea, as have some Democratic committee chairmen.
But theory doesn't always mesh with political practice. Rank-and-file Dems are clamoring for net neutrality about as much as Bush-era Republicans were clamoring for limited government: it's a valuable talking point, but if Silicon Valley has reached a working detente with broadband providers, well, there may be no need to do anything hasty.
Octoshape Selected as a Red Herring Top 100 Europe Tech Start-Up
Red Herring announced its Top 100 Award in recognition of the leading private companies from Europe, celebrating these start-ups' innovations and technologies across their respective industries.
The Octoshape Infinite HD and Infinite Edge technologies are pioneering the CDN video delivery space. Octoshape applies P2P technology to uniquely enable the highest quality video experience attainable over the Internet today, offering the longest viewer engagement times, while providing the cost control necessary to usher in the next evolution of TV quality video delivery over the Internet globally.
Red Herring's Top 100 Europe list has become a mark of distinction for identifying promising new companies and entrepreneurs. Red Herring editors were among the first to recognize that companies such as Facebook, Twitter, Google, Yahoo, Skype, Salesforce.com, YouTube, and eBay would change the way we live and work.
"Choosing the companies with the strongest potential was by no means a small feat," said Alex Vieux, Publisher and CEO of Red Herring. "After rigorous contemplation and discussion, we narrowed our list down from hundreds of candidates from across Europe to the Top 100 Winners. We believe Octoshape embodies the vision, drive and innovation that define a successful entrepreneurial venture. Octoshape should be proud of its accomplishment, as the competition was very strong."
"Octoshape is proud to be recognized by Red Herring," said Stephen Alstrup, CEO of Octoshape, "Our explosive growth in the video delivery space has been fueled by our innovation and dedication to consumer excellence."
Ingram Micro Teams Up with Cloud Providers
Excerpted from SearchCloudComputing Report
$29 billion monster tech distributor Ingram Micro (IM) has partnered with cloud providers Salesforce.com, Amazon, and Rackspace to make it easy for the managed service providers (MSPs) and value-added resellers (VARs) that piggyback on Ingram's distribution system to get cloud to their customers.
The new line-up is called the Cloud Conduit, and IM says it wants a piece of the pie as distributed, pay-as-you-go services take off.
"Cloud computing introduces a new playing field for solution providers and MSPs. With the advent of cloud computing comes tremendous opportunity for our channel partners to add high value managed solutions and services from the cloud into their services portfolio and ultimately earn more business," said IM.
In a jolly mix-up of marketing messaging, Rackspace Cloud earned a spot in IM's Seismic Services Division, so partners can apparently sell clouds from underground.
Niklas Zennstrom of Atomico
Excerpted from Times Online Report by James Ashton
The sun is streaming through the windows of Niklas Zennstrom's London office. It feels almost like California, where the Swedish internet entrepreneur might have settled a few years ago had he not faced the threat of litigation.
"I was toying with the idea of going to California, but it was just heating up too much," Zennstrom admits with a mischievous smile. Silicon Valley's loss is London's gain.
A decade ago, his internet file-sharing service Kazaa was being chased through the courts by record companies and film studios that accused him of abetting online infringement, so Zennstrom had no choice but to put down deeper roots on this side of the Atlantic.
He ruffled a lot of feathers with Kazaa, but that was only a starter. Zennstrom and Janus Friis, his Danish business partner, then sent a shiver down the spine of traditional telecom companies with the success of Skype, the Internet telephony service. It was sold for $2.6 billion in 2005 to eBay, the online auction site, and Zennstrom became a legend.
He is a big fish in Europe's digital pond - and is about to get bigger with Atomico, an investment fund that is raising $165 million to hunt out the Next Big Thing.
Zennstrom looks like a tall, slightly awkward Bill Gates. But although he has gone for the American preppy look today - blue blazer, light chinos and striped shirt - some think the 44-year-old Swede is the best chance Europe has of putting its stamp on a global digital economy dominated by American giants such as Google, Twitter, and Facebook.
"There are few people here who have the same stature as the big players in the valley," says Brent Hoberman, Co-Founder of Lastminute.com, who has invested alongside Zennstrom in a number of start-ups.
Zennstrom is trying to encourage other dotcom entrepreneurs to follow in his footsteps. Is he a digital pioneer or just a disruptive force?
"Disruptive is a tricky word for people," says Zennstrom. "Transformative is better. It is about creating sustainable competitive advantages - not creating trouble for the sake of it."
Zennstrom wants to correct the problem he found when he was trying to raise money to set up Skype in 2003, at a time when the dotcom crash was fresh in investors' minds. "We were just asking for a million to give up a third of the company. Nobody wanted to do it." At the same time, he thinks entrepreneurs should be dissuaded from selling out too early. He has long-term plans for Rdio, his latest venture, and surprisingly Zennstrom has been greeted with open arms by the entertainment firms that were baying for his blood not long ago.
Rdio will license songs from all the big music groups and charge subscribers $5 or $10 a month for unlimited access to a catalog of 5 million songs streamed onto smart-phones, a move that could pose a threat to the iPod. It has been launched in America before Spotify, the P2P streaming music service, and has plans to come to Europe.
Kazaa's long-running legal battle with the music industry was settled four years ago with an agreement on compensation. Zennstrom makes that dispute sound like a few sessions of marriage guidance. "When you have a good fight and you settle, through that process you actually gain mutual respect for one another," he says.
Despite his status, his involvement in a venture is no guarantee of success. "Sometimes it goes big, sometimes it is a complete write-off," he says, referring to Joost, his stab at Internet television that flopped when broadcast networks failed to get behind it.
Zennstrom almost failed to sell Skype. Meg Whitman, eBay's chief executive, jetted halfway around the world to Estonia to seal the deal with him, only to find he was still in London. "That was not disrespect, that was a missed flight," he giggles.
It was not enough to put off Whitman. Thanks to the faulty logic that online auctions would somehow be improved if buyers and sellers Skyped each other in the closing stages, eBay paid $2.6 billion for a company with 54 million members but projected annual revenues of only $60 million.
"It was not Christmas, but we thought it was a strategic window," says Zennstrom, who had been looking for a buyer since Microsoft and AOL began making advances into Internet telephony.
Last year, Skype had revenues of $716 million and 560 million registered users. It has been profitable for the past three years.
Its progress could not mask the fact that eBay had made an expensive mistake. Last year it put Skype back up for sale, and Zennstrom and Friis were keen to get back in.
They gave themselves leverage by suing eBay for overstepping the terms of a license to Skype's underlying P2P technology that also underpinned Kazaa and Joost. The fact that they still owned it came as a shock to eBay investors, but Zennstrom says firmly: "It was very clear all along."
A consortium led by Silver Lake Partners was ready to buy Skype from eBay. Zennstrom and Friis muscled their way in by taking a 14% stake in the consortium and calling off the lawyers.
It has been suggested that Skype could grow by going on to mobile phones and pushing into the business market. Could it be floated on the stock market later this year? "There have been some rumors, but I'm not commenting," says Zennstrom.
He was born in the Swedish university city of Uppsala. Both his parents were teachers, and they instilled in him the importance of getting a good education. Sweden in 1991 was not the entrepreneurial hub that it became later. "People were not really aware that you could go out, raise money and start your own company. It wasn't really on the agenda," says Zennstrom.
He was about to join a management consultancy when his father showed him a newspaper advertisement. It was for trainees for Kinnevik, a paper and mining conglomerate that had ambitions to break into telecom and media. The service it was planning to launch, Tele2, became the main competitor to Sweden's national phone company, giving Zennstrom an early lesson in how to disrupt the status quo. It was there he met Friis.
Sweden never really recovered from the dotcom crash, says Zennstrom, while Britain has a much more entrepreneurial vibe. One thing that could stop this is the proposed increase in capital-gains tax from 18% to 40% or 50% that is expected to be in George Osborne's June 22 budget.
"If I were not already here today and I decided to move somewhere, maybe I would move to Switzerland or maybe to America," says Zennstrom. "Britain was really attractive and it is less attractive now. You are starting to see a bit of a tipping point. Wonder why he doesn't just withdraw to his yacht - sailing is his passion - but people who have enjoyed Zennstrom's level of success always want to see if they can carry it off again. Is that what motivates him to continue?
"There are two reasons," he says. "To make money, but also to make an impact, to change the world in some way or another. If you have those two objectives, then you are in good shape. People who only want to make money are not going to have a good time."
The founder of Atomico wakes at 7 AM at his house in Kensington, west London, and breakfasts on yogurt. He heads straight to the gym and then catches a taxi to the office.
"I don't spend so much time at my desk," says Niklas Zennstrom. "I chat to colleagues or meet portfolio companies. There are not many reporting lines here. From a management point of view it is quite easy." Atomico reviews 50 potential investments a week.
In the evening Zennstrom takes calls from America. He works late less frequently than he used to and rarely goes to business dinners or parties, preferring to be home by 8 PM to eat with his wife.
Niklas Zennstrom's passion is sailing, a legacy of his childhood, when he spent months at the summer house that his grandfather built on the Stockholm archipelago.
"If you are growing up there, it is difficult not to sail," he says. Now he is making up for time lost when he took a break to build his businesses. Zennstrom will race his 72ft yacht, Ran 2, in the Newport-Bermuda race later this month. He won the Fastnet race a year ago.
His charity, Zennstrom Philanthropies, run with his wife Catherine, backs human-rights and climate-change causes. Even the Baltic Sea has suffered since he was a boy, he says. "It has changed. We used to go fishing and now it is just algae and no fish. It is a tragedy."
TeliaSonera Launches Spotify P2P Music Service for TV
Excerpted from Wall Street Journal Report by Gustav Sandstrom
Swedish telecommunications operator TeliaSonera said Thursday it is launching Spotify's P2P music streaming platform through its digital-TV services in Sweden and Finland.
From June, Spotify will be available to around 120,000 of TeliaSonera's digital-TV customers, it added.
Spotify's platform gives users access to music streamed over the Internet from a number of record labels, either free of charge or for a monthly fee without advertising and at a higher bit rate. The service currently has some seven million users in markets including Sweden, Finland, and the UK.
"Bringing Spotify into the living room is a natural step to make our music service available to people wherever they happen to be," said Spotify Chief Executive Daniel Ek.
TeliaSonera in 2009 signed an agreement with Spotify to become the only operator in Sweden to market its services and to participate in its development.
What Is Cloud Computing, and Can it be Trusted?
Excerpted from Network World Report by Jon Brodkin
Everyone in the IT industry is talking about cloud computing, but there is still confusion about what the cloud is, how it should be used, and what problems and challenges it might introduce. This FAQ will answer some key questions about cloud computing.
What is cloud computing?
Gartner defines cloud computing as "a style of computing in which massively scalable IT-related capabilities are provided 'as a service' using Internet technologies to multiple external customers." Beyond the Gartner definition, clouds are marked by self-service interfaces that let customers acquire resources at any time and get rid of them the instant they are no longer needed.
The cloud is not really a technology by itself. Rather, it is an approach to building IT services that harnesses the rapidly increasing horsepower of servers as well as virtualization technologies that combine many servers into large computing pools and divide single servers into multiple virtual machines that can be spun up and powered down at will.
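As a concrete illustration of that acquire-and-release model, the sketch below shows what a self-service interface might look like from a customer's point of view. CloudClient and its methods are hypothetical stand-ins, not any real provider's API.

```python
# Hypothetical sketch of the self-service, acquire-and-release pattern the
# definition above describes; CloudClient and its methods are invented for
# illustration and do not correspond to any real provider's API.

class CloudClient:
    """Stand-in for a provider's self-service interface."""

    def __init__(self) -> None:
        self._next_id = 0
        self._running = set()

    def provision_vm(self, cpu: int, ram_gb: int) -> str:
        """Acquire a virtual machine from the shared pool at any time."""
        self._next_id += 1
        vm_id = f"vm-{self._next_id}"
        self._running.add(vm_id)
        print(f"provisioned {vm_id} ({cpu} vCPU, {ram_gb} GB RAM)")
        return vm_id

    def release_vm(self, vm_id: str) -> None:
        """Release it the moment it is no longer needed; billing stops here."""
        self._running.discard(vm_id)
        print(f"released {vm_id}")


if __name__ == "__main__":
    cloud = CloudClient()
    vm = cloud.provision_vm(cpu=2, ram_gb=4)   # scale up for a burst of work
    # ... run the workload ...
    cloud.release_vm(vm)                        # scale back down at will
```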
How is cloud computing different from utility, on-demand, and grid computing?
Cloud by its nature is "on-demand" and includes attributes previously associated with utility and grid models. Grid computing is the ability to harness large collections of independent compute resources to perform large tasks, and utility computing is metered consumption of IT services, says Kristof Kloeckner, the cloud computing software chief at IBM. The coming together of these attributes is making the cloud today's most "exciting IT delivery paradigm," he says.
Fundamentally, the phrase cloud computing is interchangeable with utility computing, says Nicholas Carr, author of "The Big Switch" and "Does IT Matter?" The word "cloud" doesn't really communicate what cloud computing is, while the word "utility" at least offers a real-world analogy, he says. "However you want to deal with the semantics, I think grid computing, utility computing, and cloud computing are all part of the same trend," Carr says.
Carr is not alone in thinking cloud is not the best word to describe today's transition to web-based IT delivery models. Cloud computing might best be viewed as a series of "online business services," says IDC analyst Frank Gens.
What is a public cloud?
Naturally, a public cloud is a service that anyone can tap into with a network connection and a credit card. "Public clouds are shared infrastructures with pay-as-you-go economics," explains Forrester analyst James Staten in an April report. "Public clouds are easily accessible, multi-tenant virtualized infrastructures that are managed via a self-service portal."
What is a private cloud?
A private cloud attempts to mimic the delivery models of public cloud vendors but does so entirely within the firewall for the benefit of an enterprise's users. A private cloud would be highly virtualized, stringing together mass quantities of IT infrastructure into one or a few easily managed logical resource pools.
Like public clouds, delivery of private cloud services would typically be done through a web interface with self-service and chargeback attributes. "Private clouds give you many of the benefits of cloud computing, but it's privately owned and managed, the access may be limited to your own enterprise or a section of your value chain," Kloeckner says. "It does drive efficiency; it does force standardization and best practices."
The largest enterprises are interested in private clouds because public clouds are not yet scalable and reliable enough to justify transferring all of their IT resources to cloud vendors, Carr says.
"A lot of this is a scale game," Carr says. "If you're General Electric, you've got an enormous amount of IT scale within your own company. And at this stage the smart thing for you to do is probably to rebuild your own internal IT around a cloud architecture because the public cloud isn't of a scale at this point and of a reliability and everything where GE could say 'we're closing down all our data centers and moving to the cloud.'"
Is cloud computing the same as software-as-a-service?
You might say software-as-a-service (SaaS) kicked off the whole push toward cloud computing by demonstrating that IT services could be easily made available over the web. While SaaS vendors originally did not use the word cloud to describe their offerings, analysts now consider SaaS to be one of several subsets of the cloud computing market.
What types of services are available via the cloud computing model?
Public cloud services are breaking down into three broad categories: SaaS, infrastructure-as-a-service (IaaS), and platform-as-a-service (PaaS). SaaS is well known and consists of software applications delivered over the web. IaaS refers to remotely accessible server and storage capacity, while PaaS is a compute-and-software platform that lets developers build and deploy web applications on a hosted infrastructure.
How do vendors charge for these services?
SaaS vendors have long boasted of selling software on a pay-as-you-go, as-needed basis, preventing the kind of lock-in inherent in long-term licensing deals for on-premises software. Cloud infrastructure providers like Amazon are doing the same. For example, Amazon's Elastic Compute Cloud charges for per-hour usage of virtualized server capacity. A small Linux server costs 10 cents an hour, while the largest Windows server costs $1.20 an hour.
Storage clouds are priced similarly. Nirvanix's cloud storage platform has prices starting at 25 cents per gigabyte of storage each month, with additional charges for each upload and download.
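Plugging the prices quoted above into a quick back-of-the-envelope calculation shows how pay-as-you-go billing adds up. The figures are the 2010 rates cited in this article; the helper functions are illustrative only, and per-request upload/download fees are omitted.

```python
# Quick cost sketch using the published prices quoted above (2010 figures):
# $0.10/hour for a small Linux instance, $1.20/hour for the largest Windows
# instance, and $0.25 per GB-month of storage (upload/download fees omitted).

HOURS_PER_MONTH = 730  # average hours in a month


def monthly_compute_cost(hourly_rate: float, hours: float = HOURS_PER_MONTH) -> float:
    return hourly_rate * hours


def monthly_storage_cost(gigabytes: float, rate_per_gb: float = 0.25) -> float:
    return gigabytes * rate_per_gb


if __name__ == "__main__":
    small_linux = monthly_compute_cost(0.10)           # ~$73/month if left running
    big_windows = monthly_compute_cost(1.20)           # ~$876/month if left running
    part_time = monthly_compute_cost(0.10, hours=100)  # $10 - pay only for what you use
    storage = monthly_storage_cost(500)                # 500 GB -> $125/month
    print(f"small Linux, always on: ${small_linux:.2f}")
    print(f"largest Windows, always on: ${big_windows:.2f}")
    print(f"small Linux, 100 hours: ${part_time:.2f}")
    print(f"500 GB storage: ${storage:.2f}")
```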
What types of applications can run in the cloud?
Technically, you can put any application in the cloud. But that doesn't mean it's a good idea. For example, there's little reason to run a desktop disk defragmentation or systems analysis tool in the cloud, because you want the application sitting on the desktop, dedicated to the system with little to no latency, says Pund-IT analyst Charles King.
More importantly, regulatory and compliance concerns prevent enterprises from putting certain applications in the cloud, particularly those involving sensitive customer data.
IDC surveys show the top uses of the cloud as being IT management, collaboration, personal and business applications, application development and deployment, and server and storage capacity.
Can applications move from one cloud to another?
Yes, but that doesn't mean it will be easy. Services have popped up to move applications from one cloud platform to another (such as from Amazon to GoGrid) and from internal data centers to the cloud. But going forward, cloud vendors will have to adopt standards-based technologies in order to ensure true interoperability, according to several industry groups. The recently released Open Cloud Manifesto supports interoperability of data and applications, while the Open Cloud Consortium is promoting open frameworks that will let clouds operated by different entities work seamlessly together. The goal is to move applications from one cloud to another without having to rewrite them.
How does traditional software licensing apply in the cloud world?
Vendors and customers alike are struggling with the question of how software licensing policies should be adapted to the cloud. Packaged software vendors require up-front payments, and make customers pay for 100% of the software's capabilities even if they use only 25% or 50%, Gens says. This model does not take advantage of the flexibility of cloud services.
Oracle and IBM have devised equivalency tables that explain how their software is licensed for the Amazon cloud, but most observers seem to agree that software vendors haven't done enough to adapt their licensing to the cloud.
The financial services company ING, which is examining many cloud services, has cited licensing as its biggest concern. "I haven't seen any vendor with flexibility in software licensing to match the flexibility of cloud providers," says ING's Alan Boehme, the company's senior vice president and head of IT strategy and enterprise architecture. "This is a tough one because it's a business model change. It could take quite some time."
What types of service-level agreements are cloud vendors providing?
Cloud vendors typically guarantee at least 99% uptime, but the ways in which that is calculated and enforced differ significantly. Amazon EC2 promises to make "commercially reasonable efforts" to ensure 99.95% uptime. But uptime is calculated on a yearly basis, so if Amazon falls below that percentage for just a week or a month, there's no penalty or service credit.
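The arithmetic behind that caveat is worth spelling out: the same uptime percentage translates into very different downtime allowances depending on the measurement window. A quick sketch, using generic formulas rather than any vendor's actual SLA terms:

```python
# Why the measurement window matters: the same 99.95% target allows very
# different amounts of downtime when judged per month versus per year.

HOURS_PER_YEAR = 365 * 24            # 8,760
HOURS_PER_MONTH = HOURS_PER_YEAR / 12


def allowed_downtime_hours(uptime_pct: float, window_hours: float) -> float:
    return window_hours * (1 - uptime_pct / 100)


if __name__ == "__main__":
    yearly = allowed_downtime_hours(99.95, HOURS_PER_YEAR)    # ~4.4 hours/year
    monthly = allowed_downtime_hours(99.95, HOURS_PER_MONTH)  # ~22 minutes/month
    print(f"99.95% over a year allows {yearly:.1f} hours of downtime")
    print(f"99.95% over a month allows {monthly * 60:.0f} minutes of downtime")
    # A several-hour outage in a single month blows a monthly target but can
    # still leave the provider inside a yearly one, with no credit owed.
```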
GoGrid promises 100% uptime in its SLA. But as any lawyer will point out, you have to pay attention to the legalese. GoGrid's SLA includes this difficult-to-interpret phrase: "Individual servers will deliver 100% uptime as monitored within the GoGrid network by GoGrid monitoring systems. Only failures due to known GoGrid problems in the hardware and hypervisor layers delivering individual servers constitute failures and so are not covered by this SLA."
Attorney David Snead, who recently spoke about legal issues in cloud computing at Sys-Con's Cloud Computing Conference & Expo in New York City, says Amazon has significant downtime but makes it difficult for customers to obtain service credits.
"Amazon won't stand behind its product," Snead said. "The reality is, they're not making any guarantees."
How can I make sure my data is safe?
Data safety in the cloud is not a trivial concern. Online storage vendors such as The Linkup and Carbonite have lost data and were unable to recover it for customers. There is also the danger that sensitive data could fall into the wrong hands. Before signing up with any cloud vendor, customers should demand information about data security practices, scrutinize SLAs, and make sure they have the ability to encrypt data both in transit and at rest.
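One way to act on that last point is to encrypt data on the client before it is ever uploaded, so the vendor only ever holds ciphertext. The sketch below uses the open source "cryptography" package's Fernet recipe as one possible approach; it is illustrative, not a recommendation of any particular tool, and the upload step is a placeholder.

```python
# One possible safeguard: encrypt locally before upload, using the third-party
# "cryptography" package (pip install cryptography). Upload/download are
# placeholders; key management is deliberately left outside the provider.

from cryptography.fernet import Fernet


def encrypt_for_upload(plaintext: bytes, key: bytes) -> bytes:
    """Encrypt locally so the provider only ever stores ciphertext."""
    return Fernet(key).encrypt(plaintext)


def decrypt_after_download(ciphertext: bytes, key: bytes) -> bytes:
    """Only someone holding the customer-managed key can read the data."""
    return Fernet(key).decrypt(ciphertext)


if __name__ == "__main__":
    key = Fernet.generate_key()          # keep this key outside the cloud provider
    record = b"sensitive customer record"
    blob = encrypt_for_upload(record, key)
    # upload(blob)  ... later ...  blob = download()
    assert decrypt_after_download(blob, key) == record
    print("round-trip OK; the provider never saw plaintext")
```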
How can I make sure that my applications run with the same level of performance if I go with a cloud vendor?
Before choosing a cloud vendor, do your due diligence by examining the SLA to understand what it guarantees and what it doesn't, and scour through any publicly accessible availability data. Amazon, for example, maintains a Service Health Dashboard that shows current and historical uptime status of its various services.
There will always be some network latency with a cloud service, possibly making it slower than an application that runs in your local data center. But a new crop of third-party vendors, such as RightScale, is building services on top of the cloud to make sure applications can scale and perform well.
By and large, the performance hit related to latency "is pretty negligible these days," says RightScale CTO Thorsten von Eicken. The largest enterprises are distributed throughout the country or world, he notes, so many users will experience a latency-caused performance hit whether an application is running in the cloud or in the corporate data center.
uTorrent Goes 1.0 for Mac OS X
Excerpted from Tuaw Report by Sang Tang
uTorrent has recently been updated to version 1.0 (Windows users are up to version 2.0.2), its first major point release since going beta on Mac OS X. After living most of its life on Windows, the popular BitTorrent client went beta on the Mac in late 2008 and has seen a host of updates since then.
This might be a good time to consider trying uTorrent. In Tuaw tests, the app launches faster than Transmission and occupies a smaller footprint - both its DMG and the app itself are smaller than Transmission's. Whether or not you end up switching to uTorrent, however, is another matter, as Transmission is comparable in its feature set. This really boils down to personal preference.
While BitTorrent apps live, and do whatever they want, freely on Mac OS X, the story is different on the iPhone. Apple has kept a tight lid on BitTorrent-related apps (such as BitTorrent client controllers) on the iPhone, noting that "this category of applications is often used for the purpose of infringing third-party rights."
uTorrent 1.0 is available as a free download at the uTorrent website.
Veoh Founder Dmitry Shapiro Joins MySpace Music as CTO
Excerpted from TechCrunch Report by Leena Rao
Dmitry Shapiro, the Founder and CEO of shuttered P2PTV site Veoh, is joining MySpace Music as Chief Technology Officer (CTO). Shapiro will report directly to Courtney Holt, President of MySpace Music.
In his new role, Shapiro will be "responsible for all aspects of technical developments for the MySpace Music platform", including new versions of artist profiles and tools as well as the overall music experience on MySpace.
Shapiro is known for founding Veoh, whose assets were recently sold to Israeli start-up Qlipso. Veoh had a troubled history, facing copyright litigation from Universal Music Group (UMG).
While Veoh won a summary judgment in its favor last year, the lawsuit proved too costly and distracting, and the start-up was forced to file for bankruptcy in February.
Prior to founding Veoh, Shapiro was the Founder and CEO of P2P network security company Akonix Systems. Interestingly, Shapiro is an angel investor and Chairman of Irata Labs, the company that built the Twitter game Spymaster. MySpace's parent News Corp just bought Irata Labs in April.
MySpace has seen an exodus of talent over the past few months, so it's nice to see a key hire at the social network for a change.
Spanish Judges Compare File Sharing to Lending Libraries
Excerpted from Billboard Business News Report by Howell Llewellyn
Spain's anti-piracy campaigners have suffered a new blow following yet another court ruling that clears a website offering links to protected cultural content of violating intellectual property law.
Three judges closed the five-year case by arguing that file sharing was comparable to the "loan or sale of books," and therefore no offense had been committed.
The case against the file-sharing site CVCDGO began five years ago when Spanish police raided a house in Madrid, dismantled the site, and arrested four people.
The three judges at Madrid Provincial Court - Maria Riera Ocariz, Eduardo Jesus Gutierrez Gomez, and Francisco Cucala Campillo - ruled that "since ancient times there has been the loan or sale of books, movies, music, and more. The difference now is mainly in the medium used - previously it was paper or analog media and now everything is in a digital format which allows a much faster exchange of a higher quality and also with global reach through the Internet."
In common with most file-sharing cases in Spain recently, the court found that since the site did not host the actual copyright files and generated no profit directly from any infringements of copyright, the presence of advertising on the site did not constitute a crime.
The decision has saddened audiovisual rights collecting society Egeda and Columbia Tristar, which alerted police to the site in 2005, claiming it allowed users to download films over file-sharing networks. The advertising-financed CVCDGO claimed in 2005 that it had received 11 million visits since launching in 2004, and it still had servers located in San Diego, CA following the raids in Madrid, Malaga, and Seville.
Coming Events of Interest
Digital Media Conference East - June 25th in McLean, VA. The Washington Post calls this Digital Media Wire flagship event "a confab of powerful communicators and content providers in the region." This conference explores the current state of digital media and the directions in which the industry is heading. The DCIA will present a "Content in the Cloud" panel discussion.
Distributed Computing & Grid Technologies - June 28th - July 3rd in Dubna, Russia. This fourth international conference on this subject, also known as GRID2010, will be held by the Laboratory of Information Technologies at the Joint Institute for Nuclear Research and is focused on the use of grid-technologies in various areas of science, education, industry and business.
NY Games Conference - September 21st in New York, NY. The most influential decision-makers in the digital media industry gather to network, do deals, and share ideas about the future of games and connected entertainment. Now in its 3rd year, this show features lively debate on timely cutting-edge business topics.
Digital Content Monetization 2010 - October 4th-7th in New York, NY. DCM 2010 is a rights-holder focused event exploring how media and entertainment owners can develop sustainable digital content monetization strategies.
Digital Music Forum West - October 6th-7th in Los Angeles, CA. Over 300 of the most influential decision-makers in the music industry gather in Los Angeles each year for this incredible 2-day deal-makers forum to network, do deals, and share ideas about the business.
Digital Hollywood Fall - October 18th-21st in Santa Monica, CA. Digital Hollywood is the premier entertainment and technology conference in the country covering the convergence of entertainment, the web, television, and technology.