January 30, 2012
Volume XXXVIII, Issue 3
Rendering the Cloud Ready for Hollywood
Excerpted from TelecomTV Report by Guy Daniels
Those clever animation chaps at Pixar have turned their attention to the cloud, and how on-demand services could help the movie post-production business with image rendering.
Rendering is the stage in the post-production process where the layers of effects and graphics applied to individual "frames" of digital footage (whether building on top of existing footage or starting from scratch) are processed to create a new "flat" output. As CGI and VFX artists beaver away on their high-powered computer systems, much of their work is held locally in cache, being processed into playable images on the fly. This needs locking down - or rendering - to make the footage playable elsewhere.
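To make the flattening idea concrete, here is a minimal sketch - assuming simple alpha-over compositing of RGBA layers, written in TypeScript - of how a stack of working layers gets baked into a single flat frame. It is purely illustrative and bears no relation to RenderMan's actual pipeline; the pixel values and blend rule are invented for the example.

```typescript
// Minimal sketch of "flattening" stacked RGBA layers into one frame using the
// standard alpha-over operator. A real renderer such as RenderMan does far more
// (shading, ray tracing, motion blur), but the idea of baking many working
// layers into one playable image is the same.

type Pixel = { r: number; g: number; b: number; a: number }; // channels in [0, 1]
type Layer = Pixel[];                                        // one pixel per position

// Composite `top` over `bottom` for a single pixel (Porter-Duff "over", straight alpha).
function over(top: Pixel, bottom: Pixel): Pixel {
  const a = top.a + bottom.a * (1 - top.a);
  const blend = (t: number, b: number) =>
    a === 0 ? 0 : (t * top.a + b * bottom.a * (1 - top.a)) / a;
  return { r: blend(top.r, bottom.r), g: blend(top.g, bottom.g), b: blend(top.b, bottom.b), a };
}

// Flatten a stack of layers (index 0 = background) into one output layer.
function flatten(layers: Layer[]): Layer {
  return layers.reduce((acc, layer) => acc.map((px, i) => over(layer[i], px)));
}

// Example: a half-transparent green effect layer over a solid red background.
const background: Layer = [{ r: 1, g: 0, b: 0, a: 1 }];
const effect: Layer = [{ r: 0, g: 1, b: 0, a: 0.5 }];
console.log(flatten([background, effect])); // one "rendered" pixel: { r: 0.5, g: 0.5, b: 0, a: 1 }
```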
The rendering process is a huge undertaking and involves considerable investment in data-crunching hardware. The alternative has been to physically farm-out the work to specialist companies located nearby, but it is still a costly undertaking.
Which is why Pixar Animation Studios has created its RenderMan On Demand service.
According to Pixar, RenderMan On Demand is "an online rendering service for Pixar's RenderMan, offering immediate access to the power of the cloud's vastly scalable computing resources without the expense of building and running your own render farm."
The managed service is accessible directly through an interface developed and administered by cloud services provider GreenButton, and is now available on Microsoft's Windows Azure, with Linux availability due later in 2012. Pixar gives the following additional details:
"For two decades, Pixar's Academy Award-winning RenderMan has led the revolution in rendering visual effects and animation, and is the standard for creating the outstanding levels of visual photorealism that audiences expect. RenderMan On Demand is designed to provide studios and individual artists with immediate access to a professionally configured rendering service. The initial phase is targeted at small to medium-sized studios based on Microsoft Windows environments. Over the next two years, it will be expanded into a comprehensive solution for users of all levels and studios of all sizes."
The service is purchased in hourly units on a pay-for-use basis. The price includes rendering time, uploads, downloads, and storage charges, and units can be pre-purchased from a browser. Pricing depends on the number of processing cores and the amount of RAM required.
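As a back-of-the-envelope illustration of that pricing model, the sketch below totals up a hypothetical job. Every rate in it is an assumed placeholder, not Pixar's or GreenButton's actual pricing.

```typescript
// Hypothetical cost estimator for an hourly, pay-for-use rendering service.
// The rate table is invented for illustration only; real RenderMan On Demand
// pricing depends on the core count and RAM tier selected.

interface RenderJob {
  cores: number;           // processing cores reserved
  ramGb: number;           // RAM per rendering node, in GB
  hours: number;           // estimated render hours
  transferGb: number;      // uploads plus downloads, in GB
  storageGbMonths: number; // interim storage, in GB-months
}

const RATES = {
  perCoreHour: 0.10,       // USD per core-hour (assumed)
  perGbRamHour: 0.01,      // USD per GB of RAM per hour (assumed)
  perGbTransfer: 0.12,     // USD per GB moved in or out (assumed)
  perGbMonthStorage: 0.10, // USD per GB-month of storage (assumed)
};

function estimateCost(job: RenderJob): number {
  const compute = job.hours * (job.cores * RATES.perCoreHour + job.ramGb * RATES.perGbRamHour);
  const transfer = job.transferGb * RATES.perGbTransfer;
  const storage = job.storageGbMonths * RATES.perGbMonthStorage;
  return compute + transfer + storage;
}

// 64 cores with 128 GB RAM for one hour, 50 GB of assets moved, 50 GB stored for a month:
console.log(estimateCost({ cores: 64, ramGb: 128, hours: 1, transferGb: 50, storageGbMonths: 50 }).toFixed(2));
```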
Nicolas Chaverou, Project Manager for the Golaem Crowd "crowd simulation" system, is already a happy convert:
"Working within a tight deadline has always been difficult especially when rendering animation at the very last minute. In spite of the time difference, the process was very straightforward, asset upload and distribution on the Cloud, and 54 minutes of Cloud Rendering later it was in a wrap instead of the 20 days it would have otherwise required."
MIT Researchers Develop Compression Algorithm
Excerpted from Around the Net Report
Massachusetts Institute of Technology (MIT) researchers presented a new algorithm this week at the Symposium on Discrete Algorithms (SODA) that could be used to compress the large video files transmitted wirelessly to smart-phones, without draining battery life or consuming an overabundance of bandwidth.
The algorithm builds on the Fourier transform, a concept described as fundamental to the information sciences. Ubiquitous in signal processing, the Fourier transform is also used to compress image and audio files, solve differential equations, price stock options, and more.
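The compression angle can be shown with a toy example: transform a signal, keep only its few largest frequency coefficients, and reconstruct. The sketch below uses a plain O(n²) discrete Fourier transform for clarity - it does not implement MIT's faster sparse algorithm - and the test signal is invented for the example.

```typescript
// Toy Fourier-based compression: transform, keep the k largest coefficients,
// discard the rest, then reconstruct. Signals dominated by a few frequencies
// (the "sparse" case the MIT work targets) survive the truncation almost intact.

type Complex = { re: number; im: number };

function dft(signal: number[]): Complex[] {
  const n = signal.length;
  return signal.map((_, k) => {
    let re = 0, im = 0;
    for (let t = 0; t < n; t++) {
      const angle = (-2 * Math.PI * k * t) / n;
      re += signal[t] * Math.cos(angle);
      im += signal[t] * Math.sin(angle);
    }
    return { re, im };
  });
}

function inverseDft(coeffs: Complex[]): number[] {
  const n = coeffs.length;
  return coeffs.map((_, t) => {
    let sum = 0;
    for (let k = 0; k < n; k++) {
      const angle = (2 * Math.PI * k * t) / n;
      sum += coeffs[k].re * Math.cos(angle) - coeffs[k].im * Math.sin(angle);
    }
    return sum / n; // real part only; fine for real-valued input signals
  });
}

// "Compress" by zeroing all but the k coefficients with the largest magnitude.
function keepLargest(coeffs: Complex[], k: number): Complex[] {
  const keep = coeffs
    .map((c, i) => ({ i, mag: Math.hypot(c.re, c.im) }))
    .sort((a, b) => b.mag - a.mag)
    .slice(0, k)
    .map((x) => x.i);
  return coeffs.map((c, i) => (keep.includes(i) ? c : { re: 0, im: 0 }));
}

// A signal made of two tones: keeping 4 of 64 coefficients reconstructs it almost exactly.
const n = 64;
const signal = Array.from({ length: n }, (_, t) =>
  Math.sin((2 * Math.PI * 3 * t) / n) + 0.5 * Math.cos((2 * Math.PI * 7 * t) / n)
);
const approx = inverseDft(keepLargest(dft(signal), 4));
console.log(Math.max(...approx.map((v, t) => Math.abs(v - signal[t])))); // tiny reconstruction error
```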
Read the whole story at Massachusetts Institute of Technology.
Report from CEO Marty Lafferty
The DCIA is excited to announce the CLOUD COMPUTING CONFERENCE, a new full-day event-track within the 2012 NAB Show in Las Vegas, NV, taking place on Monday April 16th at the Las Vegas Convention Center.
Our CLOUD COMPUTING CONFERENCE at NAB will demonstrate the new ways cloud-based solutions are providing increased reliability and security for content distribution.
From collaboration during production, to post-production and formatting, to interim storage, delivery and playback on fixed and mobile devices, to viewership measurement and analytics, cloud computing is having an enormous impact on video delivery.
Topics will include cloud privacy, reliability, and security issues; advanced capabilities, new features and cost advantages; the impact on consumer electronics and telecommunications industries; and the years ahead for cloud computing.
The DCIA plans to announce our impressive line-up of industry-leading speakers for this major industry event next week. Sponsorship opportunities at the new CLOUD COMPUTING CONFERENCE and exhibiting opportunities at the new CLOUD COMPUTING PAVILION on the show floor are still available.
The first-ever CLOUD COMPUTING CONFERENCE at NAB will gather senior executives from media and enterprise companies to explore this rapidly emerging technology that promises to help manage proliferating devices, improve scalability for IT solutions, and deliver higher speeds, better workflow, and efficient storage solutions.
If IPTV or online delivery is in your current or future operating plans, you won't want to miss these discussions focused on cloud-delivered content and its impact on consumers, television manufacturers, telecom industries, and the media.
For information on sponsorship opportunities, please contact advertising@nab.org.
If you are a cloud services provider to digital media, NAB Show's new CLOUD COMPUTING PAVILION offers an affordable and professionally-produced turnkey package to showcase your cloud solutions.
Become an exhibitor and position your innovative technology and ideas squarely in the center of this multi-billion dollar marketplace. Contact exhibits@nab.org for more information.
Here's an overview of what to expect at the CLOUD COMPUTING CONFERENCE from our series of keynote addresses and panel discussions through the day on Monday April 16th.
We plan to open with an overview of the advanced capabilities, new features, and cost advantages that cloud computing is bringing to the entire audio/video (A/V) ecosystem. Consider this a crash course in how cloud computing is being applied throughout the creation and distribution chain for television and radio programming, motion pictures, corporate A/V production, and user-generated content (UGC).
Then we'll step back to examine the pitfalls. First, what are the privacy issues, reliability questions, and security concerns raised by implementing cloud computing solutions in this space?
For consumers, creators, rights-holders, software providers, broadband network operators, and related cloud services vendors, migrating to the cloud poses different but inter-related hurdles to overcome. And second, how is the distributed computing industry addressing these problems?
From there, we plan to delve into specifics regarding the current status and what's just around the corner with cloud-computing solutions deployments at key stages in the A/V ecosystem -- from content origin to delivery performance measurement and analysis.
Our first set of keynotes and panel sessions in this section will examine audio/video pre-production, production, and post-production clouds. Software-as-a-Service (SaaS), Platform-as-a-Service (PaaS), and even Infrastructure-as-a-Service (IaaS) solutions are being brought to bear to improve virtually every key aspect of file-based workflow for A/V content.
The next set of discussions will explore what has been the most publicized area for the implementation of cloud computing for A/V: storage and delivery. We'll go beyond the hype to provide delegates with a far deeper understanding of the technology, policy/rights considerations, and economics behind such concepts as "cloud media lockers" and the newest peer-assisted hybrid delivery solutions in cloud computing.
And our third set of keynotes and panels will highlight one of the relatively unsung areas of cloud computing deployment that promises to yield enormous value - arguably as much if not more than each of the two preceding areas - in the fullness of time: measurement and analysis. For marketers, sponsors, and advertisers, the ability to access "dashboards" that provide anonymized listener and viewer behavior at an unprecedented level of detail, in real-time, will alone bring revolutionary changes to programming, scheduling, and sell-through services. Combining these with aggregated demographic and psychographic data, audience flow trends, and additional behavior information will vastly improve the productivity of current revenue streams and efficiently guide the way to new ones.
We will also invest time to discuss the implications of cloud computing deployments in the A/V ecosystem for the consumer electronics (CE) and telecommunications industries. The impacts on these sectors range from the predictable to the totally unexpected.
Finally, we'll close out a full day's immersion in all matters "cloud" by forecasting what's still to come in the years ahead for cloud computing. The fact is that we're still at an early stage of realizing the full potential of this emerging technology, and attendees should benefit tremendously from a glimpse of what industry experts see coming.
Sign-up today for the NAB Show and the CLOUD COMPUTING CONFERENCE. Share wisely, and take care.
Videology Measures Offline Segments of In-Stream Videos
Excerpted from Online Media Daily Report by Gavin O'Malley
Can you accurately measure the impact of online video advertising on offline consumer purchases?
Videology is going to try. The ad platform, formerly known as TidalTV, is entering into dual partnerships with I-Behavior, a database marketing and behavioral targeting services provider, and Kantar Shopcom, which runs a database containing information on 231 million consumers across 270 CPG, retail, travel, lodging, and services categories.
The goal is to help marketers reach users based on their demographic makeup or in-store activity, explained Kevin Haley, Chief Scientist at Videology.
"What advertisers really want to know is if their advertising moves soap off the shelves," says Haley. He says the ability to provide advertisers with ongoing, offline ROI measurement should have a "significant impact on advertising strategies within the digital video space."
With the three-way partnership, advertisers can target offline purchase-based segments across Videology's in-stream video network of more than 80 million consumers, Haley promised.
Meanwhile, given the volume of data that will result from the enterprise, Haley sees an opportunity for analysis of purchase behavior at the brand level, including increases in sales volume, frequency of purchase and retail penetration.
Launched in late 2007, Videology was known to the world as TidalTV until earlier this month. The name change was meant to convey a more video- and technology-heavy image.
To coincide with the renaming, Videology recently debuted a sell-side platform to complement the capabilities currently offered to media agencies.
Videology is competing for a share of a vastly expanding and competitive market. eMarketer estimates that by 2015, 76% of Web users -- or 195.5 million people -- will be watching online video each month. In the same period, the research firm predicts online video advertising spending will surge from $1.97 billion to $5.71 billion.
YouTube Streams Four Billion Videos Daily
Excerpted from Reuters Report by Alexei Oreskovic
YouTube, Google's video website, is streaming 4 billion online videos every day, a 25 percent increase in the past eight months, according to the company.
The jump in video views comes as Google pushes YouTube beyond the personal computer, with versions of the site that work on smart-phones and televisions, and as the company steps up efforts to offer more professional-grade content on the site.
According to the company, roughly 60 hours of video is now uploaded to YouTube every minute, compared with the 48 hours of video uploaded per minute in May.
YouTube, which Google acquired for $1.65 billion in 2006, represents one of Google's key opportunities to generate new sources of revenue outside its traditional Internet search advertising business.
Last week, Google said that its business running graphical "display" ads - many of which are integrated alongside YouTube videos - was generating $5 billion in revenue on an annualized run rate basis.
Still, most of the 4 billion videos that YouTube now streams worldwide every day do not make money. Three billion YouTube videos a week are monetized, according to the company.
YouTube recently redesigned its website to more prominently showcase specialized "channels" organized around different types of content. In October, YouTube announced that it had struck 100 original video programming deals with media partners including Madonna and Jay-Z. Thomson Reuters and YouTube recently announced a partnership to create a Reuters TV channel for the website.
The Future of Web TV: Strength in Numbers
Excerpted from Online Video Insider Report by Paul Kontonis
Fueled by content, I left CES 2012 completely energized and inspired. Why? Because every technology provider and hardware manufacturer highlighted their product features and innovations through content - specifically, video content.
Listening to Tom Hanks talk about his new Yahoo web series - and how the format presents limitless opportunities - only added to my excitement.
All indicators are pointing to this being the year for web video: Netflix's original content deals, YouTube investing over $100 million in original Web series, Tom Hanks partnering with Yahoo on original programming - the list goes on. Add in highly anticipated original content slates from CBS, Sony's Crackle, Michael Eisner's Vuguru, and you have the foundation for a game-changing year.
The challenge for the buying side of the industry and advertisers remains how to find all the new content while it's still available for investment. The television upfronts have long been the major sales driver for the broadcast and cable industries. Over the years, various Web video platforms and portals have attempted to create this same momentum. With all the recent innovations and investments in the space, this is going to be the year of the original web television upfronts - and I say, bring them on!
Leading the unifying charge has been the International Academy of Web Television (IAWTV). With its inaugural awards at CES, the IAWTV has taken the major step of establishing an industry awards platform to recognize and celebrate the best in Web television.
You may ask - what do awards really bring to the table, aside from a shiny trophy? But as Hactivision recently wrote, awards are a necessary step in establishing "norms and values for audiences. They adjudicate quality, innovation, and diversity, and help raise awareness." The IAWTV awards, with the help of YouTube and Yahoo, highlighted the most celebrated content within an international community of creators, distributors, studios and networks.
Most important, the awards helped contribute to the growing sense of unity that has become so important to the web video industry. It is through unity that the marketplace will continue to develop and deals will flow more easily. Currently our industry is hindered by a lack of sales and analysis norms. How does an advertiser find and evaluate content opportunities when there are so many to choose from? With the continuing unity of creators, brands, and publishers, I anticipate a video content matchmaking platform that will give advertisers a marketplace to discover, engage, and package scalable content.
The scale and reach of original Web video content will only continue to expand. With all of the announcements that came out of CES (connected televisions, Sony's bet on GoogleTV, immersive tablets, and smart appliances all seamlessly serving up and sharing content), video is expected to be the killer app. This will add to the growth of Web video audiences -- and even more important, the audiences for original web video. Look for an original web series to achieve TV-like scale and reach for a single episode within a TV-like time frame.
Ultimately, unity means strength in numbers - and that's how we will grow the web television industry beyond just projections. This is the year that web series become the main product, and not just an extension of TV. Our numbers - and this industry - are growing stronger every day.
Distributors Are Making Their Mark on the Cloud
Excerpted from CRN Report by Scott Campbell
Solution providers aren't the only ones seeking to capitalize on cloud computing. Distributors are making their mark in the cloud as well, investing heavily in their own cloud programs and tools with one goal in mind: staying relevant to solution providers as the technology landscape shifts underneath them yet again.
As more businesses adopt cloud technology, some channel observers feel distributors run the risk of disintermediation as sales of on-premise hardware and software lose share to off-premise solutions. Solution providers are forging relationships directly with cloud vendors, and the need for the so-called middleman will disappear. Or so the theory goes.
It's a battle distributors have faced before. In the past few decades, distributors have seen obstacles posed by the Internet, the direct model, ever-shrinking product margins, and the commoditization of technology itself. Each time, distributors met the challenge. They automated processes, slashed their own costs and built or bought resources when necessary. And they've remained relevant. Case in point: The world's largest distributor, Ingram Micro, expects to close fiscal 2011 with more than $35 billion in revenue, its biggest year ever.
But cloud computing feels different. In the past, distributors evolved with a slight tweak of their business model, in some cases as simple as charging for services they basically had been giving away before (think logistics, tech support, integration services).
Providing cloud solutions is an entirely different business model than distributors - and many solution providers, for that matter - are used to. One doesn't just flip a switch and start offering hosted applications or backup from across the country for a monthly recurring revenue stream all while constantly monitoring a customer's network for irregularities. It takes the right resources, business processes and execution to successfully sell cloud solutions and add enough value to make solution providers - and end users - care.
The ability to add value - and convince solution providers that they are, indeed, adding value - has become an imperative for distributors, particularly as a whole flock of new cloud-only solution providers has emerged.
Cumulus Global, a $1.2 million Westborough, MA-based solution provider, is the kind of fast-growing cloud provider distributors want to get close to: last year it doubled its sales over 2010. But when a cloud vendor asked Cumulus Global CEO Allen Falcon to contact Ingram Micro for a new purchasing process, Falcon brushed off the request.
"Most of our relationships are with cloud vendors directly. One or two of our vendors have gone to distributors for the purchasing process, but at this point it's made it more cumbersome to order those products and services," Falcon said. "To me, distributors still appear as nothing other than purchasing mechanisms. They're still looking for their role in the overall process."
For distributors to add value, they will have to come up with better pricing, better purchasing and deal registration processes, and marketing development, Falcon said. And that's something he said he has not seen.
Please click here for the full report.
Music and Movie Fans - BitTorrent Is Coming Back
Excerpted from Technorati Report by Dan Reyes
Music and movie fans, rejoice! BitTorrent on your TV is coming soon. Some people may have heard about BitTorrent before, but they probably don't know much about it. To give you an idea of what it is: BitTorrent allows consumers to discover, play, share, and move all types of high-quality personal media in the comfort of their living rooms.
Furthermore, BitTorrent is also an ideal way for people to access their huge personal media libraries and play high-quality content on any screen, at any time. At its core, it is a peer-to-peer (P2P) file-sharing protocol for sending huge files quickly over the Internet.
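One concrete piece of how the protocol moves huge files quickly: the file is split into fixed-size pieces, each piece is hashed (BitTorrent uses SHA-1), and peers fetch different pieces in parallel while verifying each one against the hash list. The sketch below shows just that hashing and verification step in TypeScript on Node; the piece size and file name are illustrative, and this is not BitTorrent Inc.'s code.

```typescript
// Minimal sketch of how a file is prepared for swarm distribution: split it
// into fixed-size pieces and record a SHA-1 hash for each piece, so peers can
// download pieces from many sources in parallel and verify every one.

import { createHash } from "node:crypto";
import { readFileSync } from "node:fs";

const PIECE_LENGTH = 256 * 1024; // 256 KiB pieces (a common choice)

function pieceHashes(data: Buffer, pieceLength: number): string[] {
  const hashes: string[] = [];
  for (let offset = 0; offset < data.length; offset += pieceLength) {
    const piece = data.subarray(offset, offset + pieceLength);
    hashes.push(createHash("sha1").update(piece).digest("hex"));
  }
  return hashes;
}

// Verify a piece received from a peer before accepting it.
function verifyPiece(piece: Buffer, expectedHash: string): boolean {
  return createHash("sha1").update(piece).digest("hex") === expectedHash;
}

const data = readFileSync("home-movie.mp4"); // hypothetical local file
const hashes = pieceHashes(data, PIECE_LENGTH);
console.log(`${hashes.length} pieces, first hash: ${hashes[0]}`);
```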
BitTorrent recently partnered with companies in Asia, Europe, and Russia to roll out a series of "BitTorrent Certified Devices" that include Blu-ray players, set-top boxes (STBs), media adapters and TVs.
In addition, BitTorrent supports content sharing among multiple devices on the same home network. Best of all, the service offers free downloads, and its client for Windows and Macs is fast, easy to use, and packed with features. Simply put, BitTorrent delivers the world's content to you.
Telefonica Buys into Cloud Specialist Joyent
Excerpted from TelecomTV One Report by Ian Scales
Telefonica Digital has become a major backer of (and strategic investor in) Joyent, the company behind Node.js, thought by many observers to be the key to driving development in the cloud "back end."
Joyent has just completed an $85 million investment round that will see Telefonica Digital (the new Telefonica next-gen services arm) put skin in the Joyent game and gain access to Joyent's 'technology expertise' as well as its technology. Joyent has a range of cloud products and development avenues and is in the process of building a web-scale competitor to Amazon's cloud services.
It has facilities in five data centers in the US, divided into pods to isolate possible failures. But the Joyent development that is capturing the tech world's attention now is Node.js - the "js" stands for JavaScript, the technology buried in most browsers to do the complicated local processing that browsers are increasingly called on to do within the Web 2.0 environment. Node.js is about setting JavaScript to work on the server side - in the cloud. So real cloud applications, where processes are handed off from the client to the cloud (and back again), can be built by one developer with one family of tools. Node.js is an open source project 'stewarded' by Joyent and already has backing from the likes of Microsoft.
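As a minimal sketch of what "JavaScript on the server side" means in practice, a few lines of TypeScript running on Node's built-in http module can accept work handed off from a browser and return the result. This is a generic illustration, not Joyent's stack or API.

```typescript
// Minimal Node.js HTTP server: the same language family used in the browser
// now handles requests in the cloud. Generic illustration only.

import { createServer } from "node:http";

const server = createServer((req, res) => {
  // Work handed off from the client is done here, server-side,
  // then the result is returned to the browser as JSON.
  const result = { path: req.url, servedAt: new Date().toISOString() };
  res.writeHead(200, { "Content-Type": "application/json" });
  res.end(JSON.stringify(result));
});

server.listen(3000, () => console.log("Listening on http://localhost:3000"));
```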
Mobile Usage Soars for Internet, Ad Forecast to Hit $2.6 Billion
Excerpted from Online Media Daily Report by Laurie Sullivan
Tablets have become the consumer's fourth screen, especially among those with smart-phones. Techies with smart-phones continue to use tablets at a higher rate. Those in the United States - at 17% - are among the highest, followed by Japan at 11% and the United Kingdom at 10%, according to Google. The data appears to fall in line with AdWords tools allowing marketers to add WiFi ad targeting.
Google also added the ability to target by mobile operating system in AdWords.
The research - which Google conducted in two phases during 2011, in January and February followed by September and October - finds consumers shifting from feature phones to smart-phones for Internet access. In fact, they use smart-phones more than desktop or laptop computers in the US, UK, Germany, France, and Japan.
Germany had the biggest increase in smart-phone owners using their device for daily Internet access, jumping from 39% to 49%. Japan had the highest percentage accessing the Internet daily on their smart-phone, at 88%. A little more than two-thirds of smart-phone users in the US - and more than half of smart-phone users in the UK - access the mobile Internet daily.
Research firm eMarketer estimates mobile advertising spending in the US reached $1.45 billion in 2011, up 89% from $769.6 million in 2010. This year, US mobile ad spending will grow 80% to $2.61 billion.
The revised US mobile growth forecast of 47% to $1.8 billion in 2012 - up from $1.2 billion last year - reflects a stream of new market data from major advertising publishers and research firms, as well as better-than-expected performance from Google.
Google's share of overall US mobile ad revenue rose to 51.7%, or about $750 million, in 2011. The company isn't the only one to see success in mobile. Apple's iAd platform, an ad network, generated slightly more than $90 million in revenue last year to take a 6.4% share of overall US mobile ad revenue. Millennial earned $90.9 million, for a 6.3% share.
As more marketers explore and launch mobile ad campaigns based on the increased use of smart-phones and tablets by consumers, do the ads impact purchases? A mobile study of 1,300 respondents, conducted between December 26th and January 11th by digital marketing firm InsightExpress, sheds light on ad recall and perception.
Men ages 18 to 29 are more likely to notice mobile ads, are more positive toward them, and are more likely to consider them new and different compared with traditional and digital ads.
Of the men who participated in the study, 32% said they use their mobile phones more often than a computer Internet connection or a trip to a store when purchasing items, versus 12% of women in the same age group. Men are also more likely to search for an item on mobile: about 65% searched for a product in a nearby store using their phone.
In general, young men use mobile more for information-gathering than women. While 59% of men use their mobile phone to find better prices on items, 49% use their mobile phone to search for an item to find reviews, and 41% use their mobile phone to take a picture or send it to someone.
Octoshape and Crunchfish Solidify Mobile Partnership
Octoshape and Crunchfish are partnering to integrate Octoshape Infinite HD technologies into Crunchfish mobile application development platforms. The combination of Crunchfish's unique mobile app design and Octoshape's high quality and efficient global content distribution technology sets the stage for rapid monetization of broadcast quality content on all connected devices. The first apps for iPhone, iPad and Android were showcased at CES in Las Vegas this year.
"When we started Crunchfish AB we had the dream of making technology that made a difference in people's lives," said Paul Cronholm, Co-founder and CEO at Crunchfish AB. "The collaboration with Octoshape realizes improved media experiences for people all over the globe on any device. We at Crunchfish are proud to do this together with Octoshape."
In today's connected device market, media companies and content aggregators face stiff challenges. Obstacles on the network side include providing high quality video experiences for which people are willing to pay. Hurdles on the application development side include providing consistent UI and consumer experiences across a diverse range of devices. The partnership between Octoshape and Crunchfish resolves these issues for broadcasters by combining the best-in-class video distribution technology from Octoshape with the innovative application development experience of Crunchfish.
"We find ourselves at the beginning of an explosion in global media distribution via connected devices," said Michael Koehn Milland, CEO of Octoshape. "We are pleased to partner with Crunchfish to help our customers rapidly develop compelling and consistent user experiences to their device audiences."
Streaming media innovator Octoshape provides the enabling technology required for content owners to deliver online video over best-effort public networks to the largest audiences and with the highest-quality viewing experience. The company is writing the next chapter of content delivery. The Octoshape approach is more scalable and affordable than traditional CDN schemes, while providing feature-rich, high-quality viewing to the largest of audiences.
Crunchfish is a Swedish corporation that is passionate about mobile device innovation. Its focus is on the user experience as well as new ways to interact with machines, combining strong graphical design with early results from its research and development teams.
Cloud Computing Bridges the Old and the New
Excerpted from IT Business Edge Report by Arthur Cole
A plethora of applications are being considered for the cloud, but it may take at least another year before cloud computing goes mainstream in the enterprise.
Even though most enterprises have attained a significant level of comfort with cloud technologies over the past few years, there are still a lot of unknowns, or at least uncomfortable truths, about the cloud itself.
Probably the most significant is its ultimate relationship with legacy infrastructure. Does the cloud truly represent a new kind of IT in which data resources are delivered and consumed on a utility model, or should it merely provide an adjunct service to supplement owned-and-operated systems?
To companies like Cloudscaling, the former has the most appeal in terms of driving enterprise data architectures to new levels of productivity, although it seems that the latter is most in vogue right now because it lies more easily within the comfort zones of most CIOs. As CTO Randy Bias pointed out, platforms like Amazon Web Services (AWS) can only rise to their full potential when executives stop viewing them through the lens of traditional enterprise computing and start seeing them as an entirely new form of IT. Only then can you shift your focus away from simply virtualizing and managing resources and delve into truly game-changing concepts like infinite scalability and lowest cost-per-compute scenarios.
In this light, it would seem that the enterprise industry has come to a fork in the road, er cloud, says ZDNet's Phil Wainewright. Basically, do you simply want to retrofit the cloud to suit the needs of your existing data infrastructure, or do you want to take a leap into the unknown where both the risks and rewards are substantial? Clearly, most enterprises are pursuing the safer alternative through private clouds, even though, in Wainewright's view, these will fail to provide the kind of resource flexibility needed to handle a rapidly changing data universe.
But where Wainewright sees a fork, I tend to view it as one side of a divided highway. True, private and public clouds are different animals, but there is no reason why they can't work in tandem. If we go back to AWS as an example, you'll note that the company recently released a new storage gateway designed to connect on-premises software appliances to cloud-based applications and data. While it is limited to mirroring applications and asynchronous uploading, it nonetheless represents another step in the drive to integrate internal and external architectures so they operate as a unified environment.
Many of the new data management suites are already working under this assumption. Gale Technologies, for one, recently released GaleForce 6.0, which provides broad support for physical, virtual and cloud platforms with an eye toward orchestrating them as a single environment. The goal is to allow enterprises to leverage new and legacy systems for infrastructure service delivery, enabling both broad scalability and efficient resource utilization across mixed-platform infrastructures.
The point is, the cloud is not an either/or proposition. Just as it's short-sighted to view the cloud simply as an extension of traditional resources, so too is it wrong to pitch decades' worth of investment in internal infrastructure just because something new comes along. There's no reason why cloud architectures cannot retain the look and feel of traditional data environments even as legacy infrastructure becomes more cloud-like.
Quantum Tech Could Secure the Cloud Through Blind Data Processing
Excerpted from TechNewsWorld by Richard Adhikari
A group of scientists has shown the potential for quantum computers in a cloud-based system to provide a new level of security using so-called blind computing.
Researchers led by the University of Vienna's Stefanie Barz have demonstrated the possibility of using quantum computing to unconditionally secure cloud computing.
An artist's rendition of blind quantum computing, courtesy of the University of Waterloo's Institute for Quantum Computing.
The scientists' work, written up in the journal Science, essentially demonstrates double-blind cryptography.
It consists of an optical implementation of blind quantum computing, Barz told TechNewsWorld. The researchers used lasers, optical fibers, lenses, crystals, mirrors and polarization analyzers to conduct the demo.
The methodology demonstrated by Barz's team could be used in "factoring very large numbers into their prime factors, which is useful for cracking RSA-type encryption; ordering lists such as a Google search; and quantum simulations," Andrew Cleland, a physics professor at the University of California in Santa Barbara, told TechNewsWorld.
The idea behind blind quantum computing is that the computer processing data doesn't know anything about the input, the computation it performs on that input or the resulting output.
In conventional schemes, by contrast, the computations - in this experiment's case, measurements - are known to the quantum computer, so it knows what algorithm it's running.
The methodology Barz's team used has the client preparing qubits "in a state only known to himself, and tailoring the measurement instruction to the state of the qubits," Barz said. "The server does not know the state of the qubits and thus cannot interpret the measurement instructions. The server gets zeroes and ones as outcomes, but cannot interpret the values, whereas the client can."
In addition to providing greater security, blind quantum computing might help cut costs for law enforcement agencies, which need to store vast amounts of data.
"A number of law enforcement agencies have been researching cloud computing as a way to reduce the costs of maintaining vast quantities of digital evidence, but security has been a major consideration," Darren Hayes, CIS program chair at Pace University, told TechNewsWorld. "Quantum security may quell their fears."
The researchers "created photons in a so-called 'blind' state and entangled them to a blind cluster state," Barz said.
The photons were created by pumping a crystal with a blue laser beam. The crystal converted some of the incoming photons to red photons. The process is called "spontaneous parametric down-conversion," Barz said. The resulting red photons were entangled in polarization because of the setting the researchers used.
The researchers then calculated measurement instructions for the Deutsch-Jozsa and Grover's algorithms, made the corresponding measurements on the blind cluster state, and checked the outcome.
Deutsch-Jozsa is a deterministic quantum algorithm. Grover's algorithm is used for searching an unsorted database. It's probabilistic, meaning it gives the most probable answer. Repeating the algorithm increases the probability of getting the right answer by winnowing down the field of probable answers.
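For context, the query counts behind those two algorithms are standard textbook results, not findings of the Vienna experiment itself:

```latex
\[
\begin{aligned}
&\text{Deutsch--Jozsa: } f:\{0,1\}^n \to \{0,1\} \text{ promised constant or balanced}
  \;\Rightarrow\; 1 \text{ quantum query vs. } 2^{n-1}+1 \text{ classical queries (worst case)};\\
&\text{Grover: one marked item among } N
  \;\Rightarrow\; O(\sqrt{N}) \text{ quantum queries vs. } \Theta(N) \text{ classically};\\
&\text{repetition: if each run succeeds with probability } p,\;
  \Pr[\text{all } r \text{ independent runs fail}] = (1-p)^r.
\end{aligned}
\]
```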
In measurement-based quantum computation, you begin with qubits in a given fixed entangled state and apply measurements to designated qubits in sequence, according to Richard Jozsa, one of the authors of the Deutsch-Jozsa algorithm.
The basis of a measurement that's selected may depend on the results of earlier measurements. The final result is determined from the classical data of all the measurement outcomes.
The researchers used one-way quantum computing, also known as the "cluster model." In this, the resource, or original, state is destroyed by the measurements.
Getting blind-based quantum security into the real world "is a highly complex task," Barz said.
It will take 10 to 20 years to have a lab demonstration of a quantum computer "sufficiently powerful to do useful things," UC Santa Barbara's Cleland suggested.
Perhaps we might see results sooner rather than later.
"I was at a briefing a few months ago on this very subject, and scientists at HP felt they would have something working out and productized in the next decade," Rob Enderle, Principal Analyst at the Enderle Group, told TechNewsWorld.
Quantum computers "can decrypt any non-quantum method near-instantly, in theory, rendering all existing forms of encryption obsolete," Enderle pointed out. "This will make the concerns surrounding Iran's nuclear efforts seem trivial by comparison if a foreign country gets there first."
It's Not Over Yet: Why You Should Still Be Concerned About SOPA & PIPA
Excerpted from Search Insider Report by Rob Garner
I've been closely monitoring the SOPA and PIPA bills for months, and it seems astounding that they are seriously being considered for passage. I watched the House hearing in mid-December with shock and awe: "awe" that this might be the undoing of the Internet, and "shock" at those who were moving forward in admitted ignorance of what they were debating. Adding to the "awesomeness" of the hearing, Congressman Melvin Watt (D-NC) went on a five-minute spiel about how they should avoid any discussions of "who's been bought off, or not," because it wouldn't serve any valuable purpose. That statement alone spoke volumes about what was at stake.
Both Senator Harry Reid and SOPA sponsor Congressman Lamar Smith were saying "what's the big deal, we have bipartisan support?" Support between two parties who normally can't agree on anything - yet somehow crashing the US Internet is a cause for quickly joining hands around the firewall and singing a couple of rounds of Kumbaya.
You can either be "for" the open Internet, or not - there is no in-between
I support an open Internet because I believe it benefits not just business, but our society as a whole. Clearly, there is a monumental shift that has occurred in the last 17 years, and will continue to occur. Based on statements by Senators Harry Reid and Al Franken, former Senator Christopher Dodd, and Congressman Lamar Smith, this is not the last time you will see a power struggle to take over the US Internet.
This is "for-real" this time folks, a classic clash between old media and new media. Of all of the supporters for these bills, I'm particularly shocked at Al Franken's. He has been a voice for net neutrality in the past, but he supported this bill, and he also suggested that Google's algorithm should be regulated by the government. Let me make the crux of this column clear: You are either for an open Internet, or you are not. There is no in- between. Those who say the threat is overhyped are not aware of the facts.
Here is an outline of the basic threats of legislation that will inevitably be rewritten, and resubmitted:
Equal access for users. In an open Internet, content will not be denied simply because someone disagrees with a point of view; under these bills, those with a dissenting opinion could easily sabotage an entire domain and its contents.
Protection of long-tail economies. With the rise of the commercial Internet in the mid-1990s, long-tail economies that did not previously exist arose in a new and meaningful way. An open and equal playing field ensures that our economy will continue to thrive and remain competitive in the networked society we have created. For example, a shutdown of Yahoo would also take down Yahoo Shopping and all of the small businesses that legally market a wide variety of goods and services there. It is simply a bad idea to shut down a city and its businesses because of one bad actor hawking counterfeits on the corner.
User-generated content. SOPA and PIPA would make it much riskier for sites to host any form of user-generated content. This includes small individual bloggers and webmasters, all the way up to Google, Facebook, eBay, and Twitter.
Linking. Have link rot on your site and haven't checked your links in a while? In a world where SOPA/PIPA are law, you could be in violation, and have your entire business and domain yanked down, with the burden of proof being on you.
DNS system breakage. Bottom line, it was a bad idea, and no one had thought about the consequences. Little Business or Big Business, you can't screw around with the DNS like this when transactional data is being sent over the network.
Real-time interaction. Seriously, would you want to start a real-time user-generated content play in a world where any idiot could knowingly or unknowingly sabotage your business or community? I didn't think so.
Domain business takedowns without due process. To have your domain taken down entirely, all that would be needed is a claim by a copyright holder. These bills basically give the US IP lobby eminent domain to bulldoze your website until further notice, or until your burden of proof has been met.
Free speech. SOPA and PIPA are tantamount to a totalitarian firewall that Americans typically fight against.
Again, all of the above is what a bipartisan group in Congress, along with the MPAA and RIAA, wants to weaken, if not totally destroy.
As a US citizen, I would be interested if anyone in Congress or their staff would like to respond here in the comments of this article - you would be reaching a good audience of US citizens, and of people abroad interested in the US Internet, by doing so.
Door Opens for Issa-Wyden Online Piracy Bill
Excerpted from the Hill Report by Gautham Nagesh
The collapse in support for two anti-piracy bills last week leaves the door ajar for movement on alternative legislation offered by Senator Ron Wyden (D-OR) and House Oversight Committee Chairman Darrell Issa (R-CA).
The shelving of the Stop Online Piracy Act (SOPA) and the Protect Intellectual Property Act (PIPA) on Friday after massive online protests might have changed permanently the landscape of lobbying on tech issues.
Hollywood, the recording industry and the US Chamber of Commerce - some of K Street's most prominent interests - were drubbed in the debate by tech companies harnessing the power of Internet users. Senate Majority Leader Harry Reid (D-NV) was forced to back away from a vote this week, while House Judiciary Committee Chairman Lamar Smith (R-TX) said he would "seek wider agreement" on his bill.
Issa and Wyden's OPEN Act, which seeks to stop the transfer of money to foreign websites whose primary purpose is copyright infringement or counterfeiting, is likely to get the full scrutiny of policymakers and the tech world in the coming weeks, although whether it can move forward remains in serious doubt.
After the storm of news surrounding the online piracy bills, this week will be lighter for tech policy observers.
A highlight comes Saturday, with Data Privacy Day, as consumer privacy and security take center stage. The Commerce Department and Federal Trade Commission are likely to release their separate reports on how to improve consumer privacy by the end of the month.
Consumer privacy legislation gained some momentum in the Senate last year but still appears to be a long shot to come to the floor in an election year. The most likely movement in Congress would be on a national data breach notification law to supplant the current patchwork of state regulations.
A December 2010 draft of the Federal Trade Commission (FTC) staff report pushed browser manufacturers to create an add-on that would allow consumers to opt out of having their actions tracked online. A green paper from Commerce proposed the recognition of a new baseline of consumer privacy rights, as well as a new federal data breach standard. Both stopped short of calling for new consumer privacy laws.
On Thursday, FTC Commissioner Julie Brill will headline a day of privacy events at George Washington Law School and deliver a keynote that will likely address the FTC's role as the de facto privacy regulator within the government at present. Also scheduled to appear are Erin Egan, Facebook's Chief Privacy Officer; Bob Quinn, AT&T's Chief Privacy Officer; Rick Buck, Head of Privacy at eBay; and David Hoffman, Intel's Global Privacy Officer.
Coming Events of Interest
Cloud Connect - February 13th-16th in Santa Clara, CA. The premier technology event for cloud computing, features the latest technologies, platforms, strategies, and innovations within cloud computing.
Cloud Computing Imperative 2012 - March 12th-13th in Dubai, UAE. Strategies to implement IaaS, PaaS, SaaS, and XaaS. Plan the shift of IT responsibilities, get fresh perspective on managing project budgets, build a strong ROI for cloud computing, understand the shift from managed services to the cloud, master the cloud infrastructure and see cloud security from a hacker's perspective.
2012 NAB Show - April 14th-19th in Las Vegas, NV. From Broadcasting to Broader-casting, the NAB Show has evolved over the last eight decades to continually lead this ever-changing industry. From creation to consumption, the NAB Show has proudly served as the incubator for excellence – helping to breathe life into content everywhere.
CLOUD COMPUTING CONFERENCE at NAB - April 16th in Las Vegas, NV. Don't miss this full-day conference focusing on the impact of cloud computing solutions on all aspects of production, storage, and delivery of television programming and video.