February 15, 2010
Volume XXIX, Issue 9
Save While You Learn and Network at P2PCMC & MSNY
Sign up now for both the P2P & CLOUD MARKET CONFERENCE and Media Summit New York (MSNY) and save $479 compared to individual delegate registration rates.
If you already plan to attend the P2P & CLOUD MARKET CONFERENCE, this means you can add MSNY for $596. Or if you already plan to attend MSNY, it means you can add the P2P & CLOUD MARKET CONFERENCE for only $120.
Either way, you can register at the deeply discounted rate of just $995 for both of these outstanding events by clicking here or calling 410-476-7964.
This cost includes the continental breakfast, conference luncheon, refreshment breaks, and VIP networking cocktail reception as well as all keynotes and panel sessions at the P2P & CLOUD MARKET CONFERENCE - in addition to all events and amenities at MSNY.
With this very attractive pre-registration combined-conference discount, you can advance your knowledge of new developments in the media marketplace and the most cutting-edge technologies being adopted for it.
The P2P & CLOUD MARKET CONFERENCE will explore marketing strategies, business models, case studies, and future opportunities related to peer-to-peer (P2P) and cloud-based commercial offerings.
MSNY is the premier international conference on media, broadband, advertising, television, cable and satellite, mobile, publishing, radio, magazines, news and print media and marketing.
Twitter Turns To BitTorrent to Streamline Server Updates
Excerpted from The Next Web Report by Matt Brian
Social networking behemoth Twitter is harnessing BitTorrent technology to deploy server updates across its vast network.
According to BitTorrent CEO Eric Klinker, the micro-blogging site has crafted its own BitTorrent-inspired technology to roll out large-scale server deployments, cutting down the length of certain operations as well as minimizing site downtime and susceptibility to security breaches.
The technology, code-named Murder (as in a collection of crows), is an open-source, custom-coded set of scripts by Twitter employee Larry Gadea, which incorporates the BitTorrent client BitTornado to speedily transfer data across large sets of servers.
The scripts, when utilized, operate in much the same way as downloading a file from a popular torrent site that uses a tracker. Here, though, Twitter runs one of the scripts to create a self-contained server on one machine, a "seeder" - the first Twitter server the other servers liaise with to get pieces of data. The other machines are known as "peers" - the servers to which Twitter wishes to distribute the files.
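Murder's actual scripts build on BitTornado and are Twitter's own, but the swarm principle described above can be pictured with a toy sketch (purely illustrative, not Twitter's code): once any server holds a piece of the payload, it can pass that piece along to other servers, so the seeder stops being a bottleneck and distribution time grows far more slowly than fleet size.

```python
# Toy sketch of swarm-style deployment (illustrative; not Twitter's Murder):
# one seeder starts with every piece of the deploy artifact, and each round
# any server that already holds a piece can pass one to another server.
import random

PIECES = set(range(64))          # the deploy artifact, split into 64 pieces
servers = {f"web{i:03d}": set() for i in range(200)}
servers["seeder"] = set(PIECES)  # the first server the others liaise with

rounds = 0
while any(have != PIECES for have in servers.values()):
    rounds += 1
    names = list(servers)
    random.shuffle(names)
    for name in names:
        have = servers[name]
        if not have:
            continue                      # nothing to share yet
        peer = random.choice(names)       # pick any other server
        if peer == name:
            continue
        missing = have - servers[peer]
        if missing:
            # send one piece the peer does not have yet
            servers[peer].add(random.choice(sorted(missing)))

print(f"{len(servers)} servers fully deployed after {rounds} rounds")
```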
It is thought that Murder will allow Twitter to complete tasks that previously took minutes to execute in a matter of seconds.
Klinker also indicates that Twitter will be releasing more information about the project and its performance in the near future.
Abacast Announces Record Revenues, Appoints Board Members
Abacast, a provider of streaming solutions for the online radio and video industries, this week announced two new Board Members and record revenue for 2009.
Tim Napoleon, President of AllDigital, joins Abacast's Board, bringing with him a wealth of business and technical experience in the streaming media and digital advertising space. Most recently, he was the Chief Strategist, Media and Entertainment, for Akamai Technologies, working with enterprise and media companies such as Apple, MLB, XM Satellite, MTV, NBC, and News Corp.
John Kesler, VP of Business Development at Emmis Indiana Broadcasting, a division of Indianapolis-based Emmis Communications, is also joining Abacast's Board. He has been in the terrestrial radio industry since 1976 and has held a variety of executive positions.
"Tim's experience with delivery and monetization of multiple types of streaming media will benefit Abacast greatly as we move our online radio, advertising, and video platforms forward," said Dan Huntington, Abacast Chairman of the Board. "John's extensive background and experience in the terrestrial radio industry will help us as we evolve Abacast's radio streaming and platform solutions."
Separately, Abacast announced 18% revenue growth and record revenues for 2009, its fourth consecutive year of double-digit top-line growth.
"With terrestrial radio industry revenues down 20-to-25% in 2009, we are pleased with the double-digit growth we achieved last year and will continue to focus on return-on-investment (ROI) for our online radio customers," said Rob Green, Abacast Interim CEO and Director.
Abacast is a commercial-quality hybrid content delivery network (CDN), offering the most options in the industry to distribute and monetize rich media for the online radio, online video, and enterprise markets.
Report from CEO Marty Lafferty
If you have one event to attend to celebrate the end of winter, make it the P2P & CLOUD MARKET CONFERENCE.
This first-ever special event is scheduled for Tuesday, March 9th at the Princeton Club of New York and is being held in conjunction with the Media Summit New York (MSNY).
In the first weeks of February 2010, we have already experienced groundbreaking news, ranging from P2P being deployed in Akamai's mainstream content delivery network (CDN) to speed the delivery and improve the technical quality of National Football League (NFL) video content, to BitTorrent being harnessed by Twitter to streamline server upgrades across its vast network.
Meanwhile, interest in cloud computing continues to soar - some 3,233% since 2007 - with start-ups like Makara now making it easier to get traditional software applications onto the cloud infrastructure and Heroku helping companies deploy and manage their cloud applications. At the same time, the more established players in this space like Amazon EC2 and Salesforce.com are breaking records.
Our keynotes will include John Waclawsky, Member Services, DCIA; Rick Kurnit, Partner, Frankfurt Kurnit Klein & Selz; Mike Saxon, SVP, Technology, Media and Telecom, Harris Interactive; Nicholas Butterworth, CEO, HD Cloud; Zeeshan Zaidi, COO, LimeWire; Kumar Subramanian, CEO, MediaMelon; Michael Papish, CEO, MediaUnbound; Nicholas Longano, CEO, MusicMogul; Robert Levitan, CEO, Pando Networks; and Lydia Parnes, Partner, Wilson Sonsini Goodrich & Rosati.
The day-long March 9, 2010 conference features keynotes and panels of industry leaders from the forefront of innovation. There will be a continental breakfast, conference luncheon, refreshment breaks, and VIP networking cocktail reception.
P2P & CLOUD MARKET STRATEGIES will address such questions as: How do market strategies differ for using P2P or cloud computing to distribute consumer entertainment versus corporate enterprise data? What characteristics are required for software companies to succeed in key market segments? Should software companies concurrently pursue multiple strategies? How do live P2P streaming and wide-area cloud deployments impact major market segments? What unique market attributes can yield new opportunities for monetization?
Panelists will include Simon Applebaum, Host & Producer, Tomorrow Will Be Televised; Ian Donahue, President, RedThorne Media; Jason Henderson, Games Product Manager, Verizon Communications; David Johnson, Of Counsel, Jeffer Mangels Butler & Marmaro; Mark Mackenzie, VP, Digital Media Ventures, Alliance Bernstein; and Mike Tedesco, VP, Product Development and Technology, World Wrestling Entertainment.
P2P & CLOUD BUSINESS MODELS will zero in on such issues as: Has any alternative business model - paid-download, subscription, or advertising-supported - yet proven to be the most promising in the consumer sector? Have any more innovative approaches been attempted? What are the most advanced approaches for P2P content protection? What wholesale content and enterprise business models are coming into play? How can users, at both the consumer and corporate levels, navigate among P2P and cloud service offerings?
Panelists will include Mick Bass, VP, Alliance Management, Ascent Media; Vincent Hsieh, CEO, Aleric; Steve Lerner, Practice Director, CDNs & Management, RampRate; Dan Schnapp, Partner, Hughes Hubbard & Reed; and Greg Stephens, Director/VP, Songwriters Association of Canada (SAC).
P2P & CLOUD CASE STUDIES will ask: What techniques have proven best so far for monetizing the enormous traffic that P2P generates? What successes in cloud computing have been achieved in the wholesale entertainment and enterprise categories? What has been the relative worth of the different formats and interactivity that this channel supports? What case studies from related businesses can be applied to P2P and cloud computing, and how?
Panelists will include Melike Amjarv, Independent Producer; Tom Chernaik, Principal, DigComm; Norman Henderson, VP of Business Development, Asankya; Steve Mannel, Media & Communications Industry Director, Salesforce.com; and Chuck Stormon, VP, Strategic Accounts & Alliances, PacketExchange.
P2P & CLOUD FUTURE OPPORTUNITIES will ask: What can the industry do to ensure that the benefits of P4P and similar mechanisms are applied to the distribution of copyrighted works? How can participants at various levels of this channel gain the support of rights holders? Which identification techniques (e.g., watermarking and/or fingerprinting) should be used to protect content and enhance the ecosystem? What new solutions will impact P2P and cloud computing software developers and distributors to the greatest degree?
Panelists will include Scott Campbell, CEO, Virtually Atomic; Lawrence Low, VP of Product Management & Strategy, BayTSP; Rich Moreno, Principal, Sivoo; Graham Oakes, Chairman, Digital Watermarking Alliance (DWA); Neerav Shah, VP of Business Development, Verimatrix; and David Ulmer, Senior Director, Multimedia, Motorola.
P2P & CLOUD MARKET CONFERENCE early registration rates, which offer substantial savings, end March 2nd. For more information, please visit www.dcia.info/activities.
Registration can be done online here or by calling 410-476-7964. For sponsor packages and speaker information, please contact Karen Kaplowitz, DCIA Member Services, at 888-890-4240. Share wisely and take care.
Ipswitch Adds Enterprise P2P File-Sharing Capability
Excerpted from SC Magazine Report
Ipswitch has launched a new WS_FTP server that enables P2P file sharing.
It claimed that WS_FTP Server 7.5 will give businesses the ability to encrypt, control, authenticate, and manage files moving inside and outside of their organizations.
Other features allow organizations to quickly and securely share files through Microsoft Outlook or a web browser, oversee and manage all file-sharing activities internally and externally, and eliminate the "file attachment" burden from the e-mail server.
Jonathan Lampe, Vice President of Product Management at Ipswitch File Transfer, said, "We are setting a new standard for information sharing by giving organizations the visibility and controls they need, while making it easier than ever for employees to securely share files of any size - both internally and externally."
Where Was the Super Bowl Cloud Computing Ad?
Excerpted from Silicon Angle Report by John Casaretto
If 2010 is the year of cloud computing, you would think someone would have doubled down on a thirty-second advertisement during what turned out to be the most watched telecast in history.
Sure, the audience might be a little broad, but it is still a heck of a way to get the word out. Need I remind you of the numerous dry, pointless, and consistent ads from Cisco, IBM, and such?
I'm not saying we have to get all "GoDaddy" with the ad, but there could have been something. I mean it is the super technology that will be affecting everyone, right? Right?
It's more than likely that the audience would have no clue what they're watching. What we don't need is the common person walking away from the Super Bowl more confused about a topic that confuses the majority of the very people in the technology business selling and using the stuff.
The message of cloud computing, grid computing, and utility computing is not ready for the masses. Until a major company steps up and proclaims through a very clear offering or message, "THIS IS THE END OF SOFTWARE AND COMPUTING AS YOU KNOW IT", it is clearly not ready for that level of awareness.
That being said, it may not be until the next Super Bowl that the mass-audience opportunity presents itself again. We have a long way to go within our own community before we can get to that point.
P2P Service Voddler Gets Another $3.5 Million in Funding
Excerpted from paidContent Report by Ingrid Lunden
Swedish P2PTV streaming service Voddler, still in beta, has picked up an additional $3.5 million in funding. This round comes from Eqvitec Partners and brings the total amount that Voddler has received to $20.2 million. Hadar Cars, a Partner at Eqvitec, will join the Board of Voddler in conjunction with the closing. To coincide with the announcement, Voddler is also extending its beta program to Norway.
Voddler already has 420,000 users in Sweden, and the plan is to extend that to other countries in Europe by the end of this summer. Voddler says it has more than 30 patents covering its delivery technology, which is built on a P2P distribution system to speed up delivery of video content.
"It's very safe for the movie companies," says the spokesperson. "We can take back the segments of content stored with our users at any time."
Voddler currently distributes films from 15 studios, including Columbia, Disney, DreamWorks, MGM, Paramount, Sony, and Touchstone. With lots of initial interest in the service from both users and the industry, the main challenge for Voddler will be monetizing its content. The company says it is mulling over a subscription model.
Currently between 90 and 95% of the content is free-to-view, with films running 2-to-4 minutes of pre-roll ads. The remainder of Voddler's movies are offered on a per-day rental model costing between $2.50 and $5.00 per title. Most of the content is in English with a choice of languages for subtitles, and films that "are delivered in HD are sent out by us that way, too," says the spokesperson.
This round of funding follows another that closed last year. In October 2009, Voddler filed SEC documents stating it had raised $2.2 million from backers that included Deseven Capital. Then, in November 2009, it picked up an additional $4.6 million from private backers.
Demystifying Cloud Computing
Excerpted from CIO Insight Report by Kevin Smilie
It's tough to tell the difference between true cloud computing and the smoke created by hype and buzz. Here's how to clear the air.
As chief information officers (CIOs) consider new corporate computing options, they often find themselves awash in cloud metaphors thanks to the overwhelming desire to achieve the promise of cloud computing.
The potential for cloud computing is compelling. For business, it promises faster access to technology and better alignment to demand. That offers agility, which can deliver significant competitive advantage. For example, a retailer can use the vast capacity of the cloud to quickly analyze consumer behavior and respond with pricing changes, different inventory levels or new advertising - even when its own server capacity is fully taxed. That can make the difference in a quarter's financial results.
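To make that burst-capacity pattern concrete, here is a minimal sketch using the boto library against Amazon EC2 (the tooling, AMI ID, and instance counts are assumptions for illustration; the article itself names no specific provider or API):

```python
# Minimal sketch: burst extra capacity on Amazon EC2, then release it.
# All identifiers below are placeholders; boto reads AWS credentials
# from the environment.
import boto

conn = boto.connect_ec2()

# Spin up extra capacity in minutes for the burst workload
# (e.g., the retailer's pricing analysis)...
reservation = conn.run_instances(
    'ami-12345678',              # placeholder image ID
    min_count=1, max_count=4,
    instance_type='m1.small')
instances = reservation.instances

# ... run the analysis on the new instances ...

# ...and turn it off as soon as it is no longer needed, leaving no
# residual capital asset or operating cost.
conn.terminate_instances([i.id for i in instances])
```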
Cloud computing has the potential to make that extra computing capacity available in minutes or hours and provide the flexibility to turn it off as soon as it's no longer needed without the residual capital asset and operating costs. The problem is that it's hard to tell the difference between clouds and smoke. There is a lot of hype in the market. CIOs have heard many of the promises before with utility computing.
But things are different this time. We've already adopted cloud services into our personal lives with technologies such as the iPhone, and that's taken place as CIOs redefine the functions that really need to be done within the walls of the enterprise.
Bottom line: We've seen how cloud computing has benefited consumers, and many CIOs are ready to try it within the enterprise. Small and midsize businesses (SMBs) are already adopting cloud computing, since they lack the "advantage" of complex legacy environments that constrain their larger competitors.
Services such as Salesforce.com and Google Apps are well-established within the SMB market and are rapidly expanding.
CIOs of large companies are beginning to adopt certain proven services while piloting cloud computing services for broader uses, such as development and test platforms, as they seek to understand the new service delivery options available to them.
So, what do CIOs need to know to tell the difference between a real cloud and the smoke of marketing hype? Consider three things:
First, decide how to best harness the cloud for your business needs. Develop a plan that fits cloud computing into your IT service delivery model. That means understanding when technology services must remain within the organization and when they can be shifted outside. Consider the complexity and integration of your application portfolio. And understand both your local regulatory environment and your company's willingness to move services and data outside your firewall.
Second, recognize the limits of cloud computing. Understand the market and what it can deliver today. Compare your current costs to the price of available services. Compare available service levels to your needs. Recognize the constraints of your current agreements and what you can do to remove them.
And third, set expectations for what the cloud will do for your company and when. Partner with your customers to progressively introduce new services and gain their support. Determine how cloud computing will be incorporated into your IT governance and get agreement among your corporate leadership.
Cloud computing offers real advantage to companies that can see through the smoke and get a clear view of the new business technology landscape.
Smart Grids May Utilize P2P Networks
Excerpted from Smart Meters Report
While utility companies may see lower profits after establishing a global smart grid that uses and manages power more efficiently, other companies responsible for applications, devices, data, and building the new infrastructure stand to reap significant financial rewards.
Cisco is one of those companies poised to be a major player in smart energy. Phil Smith, Vice President of Technology for Cisco in Europe, notes, "We think routing and switching and apps are a great opportunity. And Cisco hasn't done too badly there. But the whole metering structure and the way the information is measured is not in place either. There needs to be a much flatter structure that allows energy to be fed in and out of the grid on a much more peer-to-peer basis that would allow storage and micro-generation."
The cable television industry is currently developing P2P distribution systems to free up bandwidth and improve delivery efficiency. Based on the technology made famous by Napster, P2P systems connect a file supplier with a recipient who wants to download it. Despite the legal issues prompted by users' copyright infringements, the technology itself is a highly efficient distribution method.
"The grid itself is about real-time management that delivers energy when it's cheapest or most available from a certain areas," explains Smith. "If you look at utilities companies, they are built in a very traditional model that looks like what networks used to be. In the early days, most networks started in the middle. Utilities are the same - they are built on power generation. It's shared down through a hierarchy through a plug in the wall. Clearly the girds today are not set up to deal with the challenges of today."
The potential fiscal impact of smart grids is significant. The United States Department of Energy (DoE) estimated that establishing a smart grid would result in $117 billion in savings over the next 20 years. Research from the UK's Department of Energy and Climate Change concludes the international development of smart grids is creating an exploding global market, estimated to be worth $50 billion over the next five years.
"Clearly it's an attractive modernization opportunity," acknowledges Smith. "But I think unless you create a structure you can't have a proper smart grid because if you can't share properly, you can't do all the things you need to do. You don't have to build more power stations, as that's expensive. We can see it evolving through a real-time infrastructure and putting in distributed intelligence at the demand and supply ends."
Is Opera 10.5 the "Fastest Browser on Earth?"
Excerpted from Webmonkey Report by Scott Gilbertson
Opera has released the first beta for Opera 10.5, boasting that it's "the fastest browser on Earth." We took a copy for a spin and found that it is indeed snappy, besting Safari 4 and Firefox 3.6 in our informal testing.
At the moment the Opera 10.5 beta is available for Windows only; the Mac and Linux versions of Opera 10.5 remain alpha releases, though Opera assures Webmonkey that beta releases for both are in the works.
Part of the reason for the delay on other platforms may be Opera 10.5's focus on tightly integrating with Windows 7. The Opera 10.5 beta takes advantage of all the Aero Glass effects in Windows 7 and integrates nicely with Aero Peek, Jump Lists, and other Win 7-specific features.
Opera 10.5 also looks significantly different, having eliminated the menu bar in favor of a new "Opera menu," which looks and behaves much like the single-button menus found in Microsoft Office. The Opera menu is unobtrusive, hanging down like an inverted tab on the far left of your window, and saves considerable screen real estate, making it very nice for netbooks. If it's not to your liking, you can turn the old menu back on by clicking "show menu bar."
Aside from the revamped look of Opera 10.5, the big news in the beta release is speed. Opera is calling the beta "the fastest browser on Earth" - a bold claim, but one that, at least partially, lives up to the hype.
In our informal testing, Opera recorded the fastest start-up times of any browser in Windows 7, besting even Chrome by just a hair. When it comes to page rendering times the new Carakan rendering engine and the new Vega graphics engine in Opera 10.5 clearly speed things up, but as for the fastest browser on Earth, well, it's hard to say.
Certainly Opera 10.5 is significantly faster than the current, official version of Opera, and can hold its own with any other browser out there. Opera 10.5 consistently beat Safari's page rendering times, but against Firefox and Chrome the results were a bit more of a mixed bag - sometimes Opera came out on top, other times not.
However, at this point all four browsers are so close in terms of speed that the real differentiating factor is the feature set. And it's here that Opera really shines with nice Windows 7 features as well as plenty of extras, including everything from a BitTorrent client to Opera's Unite web server tools (not part of the beta release, but no doubt set to arrive before Opera 10.5 is finished).
The Opera 10.5 beta also has some small but very useful new features, like the new URL bar search field. Part of Opera's new URL search functionality is lifted from Firefox's Awesomebar - allowing you to search your history and bookmarks as you type - but Opera goes a little beyond Firefox by letting you search the actual content of pages you've visited, and it integrates your search engine plug-ins.
Opera 10.5 also sees the browser continuing its pioneering support for web standards with more HTML5 support (including the video tag using the Ogg Theora codec) and CSS 3 (transitions and transforms are now supported).
Last but not least, Opera catches up to other browsers by adding a private browsing mode.
Although this release is still a beta, we found it to be plenty stable in our testing and the speed boost definitely makes it worth a download if you're an Opera fan (and using Windows). Mac and Linux users will have to wait, but we'll be sure to let you know when those versions are available.
Distributed Computing Links PCs to Map the Milky Way
At this very moment, tens of thousands of home computers around the world are quietly working together to solve the largest and most basic mysteries of our galaxy.
Enthusiastic and inquisitive volunteers from Africa to Australia are donating the computing power of everything from decade-old desktops to sleek new netbooks to help computer scientists and astronomers at Rensselaer Polytechnic Institute (RPI) map the shape of our Milky Way galaxy. Just this month, the collected computing power of these humble home computers surpassed one petaflop, a computing speed that exceeds that of the world's second-fastest supercomputer.
The project, MilkyWay@Home, uses the Berkeley Open Infrastructure for Network Computing (BOINC) platform, which is widely known for the SETI@home project used to search for signs of extraterrestrial life. Today, MilkyWay@Home has outgrown even this famous project in terms of speed, making it the fastest computing project on the BOINC platform and perhaps the second-fastest public distributed computing program ever in operation - just behind Folding@home.
The interdisciplinary team behind MilkyWay@Home, which ranges from professors to undergraduates, began formal development under the BOINC platform in July 2006 and worked tirelessly to build a volunteer base from the ground up to grow its computational power.
Each user participating in the project signs up his or her computer and offers up a percentage of the machine's operating power that will be dedicated to calculations related to the project. For the MilkyWay@Home project, this means that each personal computer is using data gathered about a very small section of the galaxy to map its shape, density, and movement.
In particular, computers donating processing power to MilkyWay@Home are looking at how the different dwarf galaxies that make up the larger Milky Way have been moved and stretched following their mergers with the larger galaxy millions of years ago. This is done by studying each dwarf's stellar stream. Their calculations are providing new details on the overall shape and density of dark matter in the Milky Way galaxy, which is largely unknown.
The galactic computing project had very humble beginnings, according to Heidi Newberg, Associate Professor of Physics, Applied Physics, and Astronomy at RPI. Her personal research to map the three-dimensional distribution of stars and matter in the Milky Way using data from the extensive Sloan Digital Sky Survey could not find the best model to map even a small section of a single galactic star stream in any reasonable amount of time.
"I was a researcher sitting in my office with a very big computational problem to solve and very little personal computational power or time at my fingertips," Newberg said. "Working with the MilkyWay@Home platform, I now have the opportunity to use a massive computational resource that I simply could not have as a single faculty researcher, working on a single research problem."
Before taking the research to BOINC, Newberg worked with Malik Magdon-Ismail, Associate Professor of Computer Science, to create a stronger and faster algorithm for her project. Together they greatly increased the computational efficiency and set the groundwork for what would become the much larger MilkyWay@Home project.
"Scientists always need additional computing power," Newberg said. "The massive amounts of data out there make it so that no amount of computational power is ever enough." Thus, her work quickly exceeded the limits of laboratory computers and the collaboration to create MilkyWay@Home formally began in 2006 with the assistance of the Claire and Roland Schmitt Distinguished Professor of Computer Science Boleslaw Szymanski; Associate Professor of Computer Science Carlos Varela; Postdoctoral Research Assistant Travis Desell; as well as other graduate and undergraduate students at RPI.
With this extensive collaboration, leaps and bounds have been made toward the astrophysical goals of the project, but important discoveries have also been made along the way in computational science, creating algorithms that make the extremely distributed and diverse MilkyWay@Home system work well even with volunteered computers that can be highly unreliable.
"When you use a supercomputer, all the processors are the same and in the same location, so they are producing the same results at the same time," Varela said. "With an extremely distributed system, like we have with MilkyWay@Home, we are working with many different operating systems that are located all over the globe. To work with such asynchronous results we developed entirely new algorithms to process work as it arrives in the system." This makes data from even the slowest of computers still useful to the project, according to Varela. "Even the slowest computer can help if it is working on the correct problem in the search."
In total, nine articles have been published and multiple public talks have been given regarding the computer science discoveries made during the creation of the project, and many more are expected as the refined algorithms are utilized for other scientific problems. Collaboration has already begun to develop a DNA@Home platform to find gene regulation sites on human DNA. Collaborations have also started with biophysicists and chemists on two other BOINC projects at Rensselaer to understand protein folding and to design new drugs and materials.
In addition to important discoveries in computer science and astronomy, the researchers said the project is also making important strides in efforts to include the public in scientific discovery. Since the project began, more than 45,000 individual users from 169 countries have donated computational power to the effort. Currently, approximately 17,000 users are active in the system.
"This is truly public science," said Desell, who began working on the project as a graduate student and has seen the project through its entire evolution. "This is a really unique opportunity to get people interested in science while also allowing us to create a strong computing resource for RPI research." All of the research, results, data, and even source code are made public and regularly updated for volunteers on the main MilkyWay@Home website.
Desell cites the public nature and regular communication as important components of the project's success. "They are not just sitting back and allowing the computer to do the work," he says, referencing that volunteers have made donations for equipment as well as made their own improvements to the underlying algorithms that greatly increased computational speed. Varela jokes, "We may end up with a paper with 17,000 authors."
In addition to the volunteers, others within RPI and outside of the Institute have been involved in the project. Some of these collaborators include Rensselaer graduate students Matthew Newby, Anthony Waters, and Nathan Cole; and SETI@home creator David Anderson at Berkeley. The research was funded primarily by the National Science Foundation (NSF), with donations of equipment by ATI, IBM, and NVIDIA.
Urlin Offers a New P2P File-Sharing Client
Internet users today look for high-quality digital content sharing networks. Urlin intends to become a new standard for delivering high-quality content over the Internet.
The Urlin P2P client is designed to bring maximum convenience to the user. Easy content search and high-speed content delivery are key features of Urlin. With the Urlin client, any network user can search, share, and download multiple files quickly. It even allows people downloading a file to upload parts of it at the same time.
Urlin can be used for distribution of very large files, very popular files, and files available for free, since it is a lot cheaper, faster, and more efficient to distribute files using Urlin than via a regular download. All one needs is a computer with an Internet connection and the Urlin client.
Urlin P2P has many unique features that should make it a favorite among users of file-sharing tools. Using it does not require any tracker registration, and the user's download/upload ratio is not used. The powerful integrated meta-search returns the most relevant file and torrent results. The client allows multiple simultaneous downloads, which "quick-resume" in case of interrupted transfers, and has a configurable integrated IP-filter along with intelligent bandwidth, queue, and speed management.
The Urlin client is small in size and easy to install. The set-up wizard provides easy installation and configuration. Users can choose any of the 12 built-in skins depending on their choice of color and theme. The client provides anonymity for file-sharing users. The Urlin client has support for localization and, with a language file present, will automatically switch to your system language. If your language isn't available, you can easily add your own, or edit other existing translations to improve them.
The Urlin client also has many features that help its search function yield only relevant file and torrent results. The release wizard pane contains built-in release wizards for movies, animations, music, images, games, software, and books. The built-in IP-filter tool is an easy-to-use packet filter that can block incoming and outgoing IP packets; you can create and delete your own IP-filters and enable or disable them as needed.
The search pane offers separate views for movie, music, software, and game searches, with fields for entering queries. Urlin is a technology development company that specializes in state-of-the-art data compression and effective content delivery technology. Its Urlin client, a next-generation P2P file-sharing software program, is designed to bring maximum convenience, easy search, and high-speed content delivery to the user.
Cacaoweb Leverages P2P for Megavideo
In its latest move, cacaoweb has added a feature to remove time limits and advertising from the Megavideo platform. Megavideo is a website and platform that has been providing the latest TV shows and movies for free via P2P streaming.
The Megavideo website is hosted in the US and has seen significant growth over the last two years. cacaoweb now effectively removes the previous limit that allowed users to view only 72 minutes of content, as well as advertising from Megavideo content. Users simply need to install the lightweight cacaoweb plug-in on their computers to experience these innovative cacaoweb features.
These non-trivial features were achieved by using cacaoweb's open-source P2P network: when a user watches a video, cacaoweb downloads parts of the video from other peers.
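That peer-assisted fetch can be pictured with a small sketch (illustrative only; cacaoweb has not published its protocol in this level of detail): the player requests different segments of the same video from several peers in parallel and reassembles them in playback order.

```python
# Illustrative sketch of peer-assisted video fetching (not cacaoweb's code):
# request different segments from different peers in parallel, then
# reassemble them in playback order.
from concurrent.futures import ThreadPoolExecutor

def fetch_segment(peer, index):
    """Stand-in for a network request to one peer for one video segment."""
    return f"<segment {index} from {peer}>".encode()

peers = ["peer-a", "peer-b", "peer-c"]
segments = range(12)

with ThreadPoolExecutor(max_workers=len(peers)) as pool:
    # Round-robin the segment requests across the available peers.
    futures = [pool.submit(fetch_segment, peers[i % len(peers)], i)
               for i in segments]
    video = b"".join(f.result() for f in futures)  # in-order reassembly

print(video[:60])
```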
According to Gerard Lantau, Chief Technical Officer (CTO) of cacaoweb, "This new feature of cacaoweb brings us closer to our long-term goal of providing end-users with free and anonymous access to all multimedia content in native quality. It will bring more users and exposure to the cacaoweb platform. TV show and movie websites like Joohoo or cacaoTV have already made the move to cacaoweb. We hope that more video platforms will join us in the future."
cacaoweb has provided an application program interface (API) and integration guide for webmasters using Megavideo-delivered content in their websites. By integrating cacaoweb, they allow their users to watch this content with no time limits or advertising interruptions. Additionally, webmasters are able to place their own advertising adjacent to the videos, generating an effective new revenue stream.
Sensitive Information Retrieved from File-Sharing Networks
Excerpted from Help Net Security Report
Security researchers Larry Pesce and Mick Douglas demonstrated on Friday - at this year's ShmooCon security conference in Washington, DC - the amazing variety of sensitive information that people send out over file-sharing networks, without a thought as to what would happen if such information fell into the wrong hands.
Using search terms such as word, doctor, health, passwd, password, lease, license, passport, and visa; file names like password.txt, TaxReturn.pdf, passport.jpg, visa.jpg, license.jpg, and signons2.txt; and a myriad of file extensions, they managed to get their hands on tax forms containing the complete personal information of taxpayers, IRS forms with identification numbers on them, driver's licenses and passports, event schedules (names, hotel room numbers, performance dates and locations), financial retirement plans, and even information about a student who offered to help US forces in Iraq and is currently in hiding for fear of torture and death!
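The kind of sweep the researchers ran can be approximated in a few lines of code. Here is a minimal sketch that scans a local shared folder for file names matching the risky patterns above (the folder path and the exact pattern list are illustrative):

```python
# Minimal sketch: flag files in a shared folder whose names match the
# sensitive patterns the researchers searched for. Illustrative only.
import fnmatch
import os

SENSITIVE = ['*password*', '*passwd*', '*passport*', '*visa*',
             '*license*', '*lease*', '*taxreturn*', 'signons2.txt']

def risky_files(shared_root):
    """Yield paths under shared_root whose names match a risky pattern."""
    for dirpath, _dirnames, filenames in os.walk(shared_root):
        for name in filenames:
            lowered = name.lower()
            if any(fnmatch.fnmatch(lowered, pat) for pat in SENSITIVE):
                yield os.path.join(dirpath, name)

for path in risky_files(os.path.expanduser('~/Shared')):
    print('potentially sensitive:', path)
```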
The conclusion? Security awareness is still nonexistent among the typical low-level users, and the process of education must be continued for as long as it takes to make everybody aware of the dangers of sharing sensitive and/or personal information through insecure channels.
Network World reports that the two researchers also presented the Cactus Project, whose purpose is to help organizations carry out this kind of research and impose changes to improve security when it comes to file sharing on the Gnutella-based network.
Spotify Does Damage Control on Warner Music Group
Excerpted from Digital Music News Report
Warner Music chief Edgar Bronfman all but dismissed Spotify as a licensing possibility in the US on Tuesday, and that is stirring speculation of a broader, worldwide catalog pull.
Not so, according to Spotify. The company is now attempting to tamp down speculation of a broader divorce outside of the US. "To be clear, WMG is not pulling out of Spotify. Media is taking things out of context," Spotify tweeted. "So don't worry-be happy :)"
Bronfman seemed to be referring to a new licensing deal in the United States, and existing deals are probably okay. Then again, Bronfman is Chairman of the global enterprise, and a broader pullout is certainly not out of the question. In pockets of Europe, Spotify is showing important progress on premium, though the broader percentages are still discouraging.
At Midem, Spotify chief Daniel Ek shared a premium subscriber number of 250,000, a number that approaches just 4% of the broader user base.
Labels Admit Damages Are Out of Proportion
Excerpted from TechDirt Report by Mike Masnick
Slashdot points us to the news that, as was widely expected, the record labels have opted for a third trial of Jammie Thomas-Rasset, rather than accept the reduced award of $2,250 per song, as set by the judge. Not surprisingly, the labels are doing this because they disagree with the precedent of a judge changing the jury award, and the new trial is limited solely to the damages question.
But, honestly, the whole thing is a bit weird. If the judge can reduce the older jury award, and a new jury sets a higher rate, can the judge just reduce it again, and we go through this entire process for the fourth time?
The Slashdot post, written by Ray Beckerman, claims that the labels "could only win a verdict that is equal to, or less than, $54,000" in the new trial, but I'm not sure why he says that.
Is it because the judge would reduce it again? This is not at all clear. Still, the actual filing from the RIAA's lawyers has some interesting claims: "Plaintiffs find it impossible to accept a remittitur that could be read to set a new standard for statutory damages - essentially capping those damages at three times the minimum statutory amount of $750 (or $2,250) for noncommercial individuals. This far-reaching determination is contrary to the law and creates a statutory scheme that Congress did not intend or enact."
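For context, those figures line up with the remitted award itself: the judge's $2,250 per song is three times the $750 statutory minimum, and applied to the 24 songs at issue in the case it yields Beckerman's $54,000. A quick, purely illustrative check:

```python
# Quick check of the remittitur arithmetic reported in the case.
STATUTORY_MINIMUM = 750        # dollars per song, set by statute
MULTIPLIER = 3                 # the judge's cap: treble the minimum
SONGS_AT_ISSUE = 24            # songs in the Thomas-Rasset case

per_song = STATUTORY_MINIMUM * MULTIPLIER      # $2,250
total = per_song * SONGS_AT_ISSUE              # $54,000
print(f"${per_song:,} per song -> ${total:,} total")
```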
It's a bit of a stretch to claim that this would be a cap on "any" unauthorized noncommercial file distribution. Any court still has the right to take into account the specific circumstances to make sure the award is proportionate to the rights being violated. The labels' lawyers are stretching what the judge said here.
What the judge was doing here was recognizing that the amount the jury awarded was clearly out of proportion to the actual infringement.
Woe is the RIAA. The award for unauthorized sharing of a $1 song - one that might help promote its artists and help them make more money, if only the RIAA were to adapt to a changing marketplace - might "only" be 2,250 times the market price of the song? Cry me a river. Even more ridiculous is the claim that it is some undue burden on the RIAA that it might have to actually sue over all of the songs someone distributed in an unauthorized manner, rather than just selecting a handful as it does now.
This is a major issue. Technically, the RIAA has been able to just pick a couple dozen songs and sue over those, knowing that the totally disproportionate statutory damages will "cover" the rest. But does that seem right to anyone?
The idea is that, rather than proving the actual harm done by the actual distribution, the RIAA is allowed to just pick a "sampling" and, without presenting any actual evidence of the wider damage or the wider distribution of more files, get back many times the price.
It seems perfectly reasonable to expect that the RIAA should have to actually sue over everything it claims was infringed, rather than being able to just pick a handful, knowing that the totally out-of-proportion statutory damages will "cover" the rest.
In fact, the paragraph above is effectively the RIAA admitting that it knows the statutory damages are out of proportion, but believes this is "fair" because the RIAA is too cheap and too lazy to actually sue people for what it claims they infringed.
Coming Events of Interest
paidContent 2010 - February 19th in New York, NY. Join paidContent.org for its first namesake conference with senior business leaders representing publishers, content technology companies, investors, analysts, and leading members of the press and blogging community to discuss the most pressing business issue of our day.
10th Annual Digital Music Forum - February 24th-25th in New York, NY. The only event in the United States that brings together the top music, technology, and policy leaders for high-level discussions and debate, intimate meetings, and unrivaled networking about the future of digital music.
P2P & CLOUD MARKET CONFERENCE - March 9th in New York, NY. Strategies to fulfill the multi-billion dollar revenue potential of the P2P and cloud computing channel for the distribution of entertainment content. Case studies of sponsorships, cross-promotion, interactive advertising, and exciting new hybrid business models.
Media Summit New York - March 10th-11th in New York, NY. MSNY is the premier international conference on media, broadband, advertising, television, cable & satellite, mobile, publishing, radio, magazines, news & print media, and marketing.
DDEX Open Meeting & Workshop - March 11th-12th in Paris, France. The open meeting features an update on DDEX's standards, case studies on implementations, and an explanation of DDEX's work-plan for 2010, and the workshop focuses on "Identification Standards and Metadata in the Music Industry," and is being held with the assistance of CISAC and IFPI.
Cloud Computing Congress - March 16th in London, England. A practical guide to cloud computing for business - the value proposition and the impact on the IT function. Building and managing applications in the cloud - how to manage and control applications and resources in the cloud environment. Security, testing, and management of cloud infrastructures.
Cloud Expo - April 19th-21st in New York, NY. Co-located with the 8th International Virtualization Conference & Expo at the Jacob Javits Convention Center in New York City, with more than 5,000 delegates and over 100 sponsors and exhibitors participating in the conference.