Distributed Computing Industry
Weekly Newsletter


April 2, 2012
Volume XXXVIII, Issue 12


Huawei Among World's Top Tech Innovators

Excerpted from San Francisco Chronicle Report by Andrew Ross

The Jinqiao Export Zone in the Pudong district of Shanghai plays host to the R&D centers of numerous multinational corporations, including GE, GM, and Siemens. One of the zone's selling points, it tells prospective clients, is that it represents the country's move up the value chain from "Made in China" to "Developed in China."

Occupying 2 1/2 million square feet is a private, employee-owned Chinese company that has moved far up the value chain and symbolizes what reformers here would like the country's economic system to be: innovative and more genuinely market-driven.

DCIA Member company Huawei Technologies started out in 1988 as a small-time distributor of PBX systems in southern China. It is now not just the world's largest supplier of telecom equipment, but an increasingly ambitious player in high-tech sectors ranging from network servers to mobile telephony.

On display inside the company's glass-encased space-age building are a bevy of sophisticated products: data management centers, solar-powered small footprint base stations, cloud-based software services, videoconferencing systems, smart phones and, most recently, a tablet aimed at the business market.

"Huawei is building some of the best, most innovative and fastest equipment in the industry," Fortune Magazine said last year.

Operating in 140 countries, the company recorded $32 billion in sales last year and looks to triple that number in the next 10 years. The Shanghai center, where much of the mobile phone R&D occurs, employs 10,000 people out of a global workforce of 140,000.

It operates 23 R&D centers worldwide, including one in Santa Clara, CA, where 700 researchers focus on long-term research in photonics, optics, and advanced Long Term Evolution (LTE) devices. Huawei installed the world's first LTE network in Oslo in 2009.

Rated the fifth most innovative company in the world by Fast Company magazine in 2010 (behind Facebook, Amazon, Apple, and Google), Huawei is where the Chinese government would like its economy to be. Innovation in technology is a key component of the country's five-year plan for "rebalancing" the economy and "higher quality growth."

That cannot be achieved without greater involvement of the private sector, say analysts, including the World Bank and the Development Research Center, a Chinese government agency.

Looking at how far Huawei has come, that may not be a bad bet.

In addition to its 700 employees in Santa Clara, another 1,000 are employed elsewhere in the United States, where Huawei spent $230 million on R&D alone last year. Earlier this year, it awarded $6 billion worth of contracts to three California companies, including advanced semiconductor maker Avago Technologies, in San Jose, CA.

Ross Gan, Huawei's head of corporate communications, said 98.58 percent of the company is owned by its employees, who can purchase stock after two years of employment. The rest belongs to the company's founder and CEO, Ren Zhengfei, he said.

DataDirect Networks Launches Big Data Appliances

Excerpted from HostingTec News Report

DCIA Member company DataDirect Networks (DDN), a leader in massively scalable storage, announced the availability of the DDN SFA10K-M and SFA10K-ME Big Data appliances, which will enable organizations worldwide to start small and grow their Big Data infrastructure incrementally while harnessing the award-winning performance of DDN's storage fusion architecture (SFA) technology.

"One of the key attributes of today's Big Data is the lack of predictability of data growth across customer environments," said Alex Bouzari, CEO of DDN. "Today's announcement of our SFA10K-M and SFA10K-ME platforms is an important step in furthering the democratization of Big Data and enabling our customers to resolve Big Data unpredictability by starting small and growing incrementally into massively scalable Big Data technology."

The newest additions to DDN's SFA10K family include:

SFA10K-ME: Featuring DDN's In-Storage Processing technology, the SFA10K-ME is built with high-speed server virtualization technology to co-locate Big Data applications close to the data and eliminate the need for separate storage processing and networking. SFA10K-ME configurations, featuring InfiniBand or Ethernet connectivity, can optionally be configured to support DDN's GridScaler and ExaScaler parallel file system software, scaling file system performance beyond 1TB/s and volume capacity beyond 10 petabytes.

SFA10K-M: A high-speed block storage appliance featuring InfiniBand or Fibre Channel connectivity, the SFA10K-M can be configured to deliver up to 10GB/s in only 20U, supports up to 720TB of capacity, and can be scaled out with any of DDN's file storage offerings as a modular Big Data building block.

DDN's SFA10K-M and SFA10K-ME systems are available both through DDN's sales force and the company's global partner network. With a reduced configuration, the system offers up to 40% lower entry pricing and a 57% smaller form factor than previous DDN SFA10K configurations. To support the product introduction, DDN is announcing a three-month introductory price for the SFA10K-M, which starts at $100,000 US.

"The 'Big' in 'Big Data' is a relative term, like the 'High' in 'High Performance Computing.' Big Data is about the growth in data across multiple industries, causing organizations to address high-performance data management issues they haven't had to cope with before," said Addison Snell, CEO of Intersect360 Research. "Not every Big Data problem is in petabytes. That's why it makes sense for DDN to extend its high-performance storage lines into the midrange, to lower the entry point for Big Data management."

For a three-month promotional period between April 1st and June 30th, DDN will offer SFA10K-M pricing to end-users starting at $100,000. The promotional $100,000 entry-level configuration consists of: 1 x SFA10K-M Storage Array Bundle w/ 8 x InfiniBand ports, 1 x 60-slot 4U SS7000 enclosure, 6Gb cables, 16GB of Mirrored Cache, Battery, SFPs, and SFA OS Licenses for 4 RAID Processors, including SATAssure and RAID levels 1, 5, and 6. Installation, annual support, and ExaScaler or GridScaler add-on software are sold separately.

Report from CEO Marty Lafferty

The DCIA is proud to announce new speakers for our inaugural CLOUD COMPUTING CONFERENCE at the 2012 NAB Show.

From creation to consumption, the NAB Show proudly serves as the incubator for excellence - helping to breathe life into content everywhere.

This second DCIA "Conference within NAB" is scheduled for Monday, April 16th, in N232 of the North Hall at the Las Vegas Convention Center.

The NAB Show has evolved over the last eight decades to continually lead its ever-changing industry. While solutions have changed to keep pace with consumer habits and technologies, aspirations to produce and deliver memorable content have remained constant.

In addition, the DCIA will exhibit in N3222M at the CLOUD COMPUTING PAVILION, a first-ever special section of the NAB Exhibit Floor totally dedicated to cloud computing.

The NAB Show will be attended by 90,000+ media and entertainment professionals from over 150 countries. More than $18.8 billion in purchasing power will be represented onsite. 1,500+ companies spread over 745,000 net square feet will exhibit. There will be more than 500 skill-building sessions; and 1,300+ members of the press will cover the event.

For more information or to register, please click here.

This very timely conference will demonstrate how software developers are addressing two major concerns with respect to cloud-based solutions for audio/video (A/V) delivery - reliability and security.

Experts will provide insights into how cloud computing impacts each stage of the content distribution chain, from collaboration to storage and delivery all the way through analytics.

Sponsoring companies include Aspera, Avid, Chyron, Front Porch Digital, and Rackspace.

The agenda for the conference, which begins at 10:30 AM and continues until 6:00 PM PT, will open with "Latest Trends in Cloud Computing Solutions for the A/V Ecosystem" and then move to "Key Pitfalls Associated with Cloud Computing in High-Value Content Implementations."

The conference will then explore "Various Ways that Cloud Computing Is Being Applied to the Content Creation Process - from Pre- to Post-Production," followed by "Alternative Approaches for Implementing Cloud Storage of Content Catalogs and Libraries and Leveraging Cloud-Based Distribution," and then, "New Levels of Media Performance Data Enabled by Cloud Computing - and Impact on Other Sectors."

The agenda will close out with "Navigating the Current Cloud Environment and Planning for What's Next," and finally, "Disruptive Effects of Cloud Computing Will Continue."

Keynote speakers for each of the above areas, respectively, include Bill Kallman, CEO, Scayl; Jim Burger, Member, Dow Lohnes; Jonathan King, SVP, Joyent; Shahi Ghanem, SVP, Strategy & Marketing, BitTorrent; Scott Brown, GM & SVP Strategic Partnerships, Octoshape; Jean-Luc Chatelain, EVP, Strategy & Technology, DataDirect Networks; and James Hughes, VP and Cloud Storage Architect, Huawei.

The first panel will explore "Advanced Capabilities, New Features, Cost Advantages of Cloud Computing Solutions" with Mike Alexenko, Senior Director of Market Development, Cloud & Mobility, G-Technology; Scott Campbell, Principal, Media, Entertainment, and Telecoms, SAP; David Frerichs, Strategic Consultant, Pioneer Corporation; David Hassoun, Founder, RealEyes Media; AJ McGowan, CTO, Unicorn Media; Samir Mittal, CTO, Rimage; Michelle Munson, CEO, President, and Co-founder, Aspera; and Robert Stevenson, EVP, Interactive Entertainment, Gaikai.

The second panel will examine "Privacy Issues, Reliability Questions, Security Concerns in the Cloud Computing Space" with Dave Asprey, VP, Cloud Security, Trend Micro; Tom Mulally, Consultant, Numagic Consulting; Graham Oakes, Chairman, Digital Watermarking Alliance (DWA); Rajan Samtani, SVP, Sales & Marketing, Peer Media Technologies; Dan Schnapp, Partner & Chairman of New Media, Entertainment & Technology, Hughes, Hubbard & Reed; Yangbin Wang, CEO, Vobile; Marvin Wheeler, Chairman, Open Data Center Alliance (ODCA); and Vic Winkler, Author, "Securing the Cloud."

The third panel will focus on "A/V Pre-Production, Production, Post-Production Clouds" with Tony Cahill, Chief Engineer, CET Universe; Guillermo Chialvo, Technology Manager, Radio Mitre; Gerald Hensley, VP, Worldwide Entertainment Sales, Rovi Corporation; Chris Kantrowitz, CEO, Gobbler; Ajay Malhotra, EVP, North America, Prime Focus Technologies; Todd Martin, SVP, Strategic Solutions Group, Chyron; Kirk Punches, VP, Business Development, Sorenson Media; and Jostein Svendsen, CEO, WeVideo.

The fourth panel will cover "Cloud Media Storage & Delivery" with Bang Chang, VP, Server and Storage, SeaChange International; Stephen Condon, VP, Global Marketing Communications, Limelight Networks; Gianluca Ferremi, VP Sales & Marketing, Motive Television; Corey Halverson, Product Director, Media Business Solutions, Akamai; Kshitij Kumar, SVP, Mobile Video, Concurrent; Kyle Okamoto, Sr. Mgr. Product and Portfolio Mgt., Verizon Digital Media Services; and Mark Taylor, VP, Media and IP Services, Level 3.

The fifth panel will address "Cloud Measurement, Analytics, Implications" with Sean Barger, CEO, Equilibrium / EQ Network; Thomas Coughlin, President, Coughlin Associates; Steve Hawley, Principal Analyst & Consultant, TVStrategies; Jonathan Hurd, Director, Altman Vilandrie & Co.; Monica Ricci, Dir. of Product Marketing, CSG International; John Schiela, President, Phoenix Marketing International (PMI); Nick Strauss, Director of Sales, Verizon Digital Media Services; and Mike West, CTO, GenosTV.

The sixth panel will forecast the "Years Ahead for Cloud Computing" with Saul Berman, Lead Partner, IBM Global Business Services; Ian Donahue, President, RedThorne Media; Chris Haddad, VP, Technology Evangelism, WSO2; Wayne Josel, Counsel, Media & Entertainment, Hughes, Hubbard & Reed; Steve Mannel, Senior Director, Media & Communications, Salesforce.com; James Mitchell, CEO & Founder, Strategic Blue; David Sterling, Partner, i3m3 Solutions; and Chuck Stormon, CEO, Attend.

Moderators will include Adam Marcus, Technology Advisor, DCIA; Brian Campanotti, CTO, Front Porch Digital; and Sari Lafferty, Business Affairs, DCIA. Share wisely, and take care.

Transcoding: In-House or in the Cloud?

Excerpted from Streaming Media Report by Troy Dreier

"A question I get reasonably frequently is 'What is a book company, what is a bookshop doing running a cloud computing business?' said Matt Wood, Technical Evangelist Europe for Amazon Web Services. Wood was speaking at the recent Streaming Media Europe conference in London, explaining what Amazon can offer companies that need video transcoding.

There are three parts to Amazon's business nowadays, Wood said: commerce (books and much more), an e-commerce platform, and cloud computing services. How that last part developed was the result of some innovative thinking.

"Really, the story goes back all the way to when Amazon was founded," said Wood. Amazon was always built out to be a platform and we wanted to make our skills and expertise that we built up over a decade or so running e-commerce services on a global scale and allow some of our customers access, so they didn't have to go through the same pain points that we did. So we allow programmatic access to our catalog, to our metadata, to merchants that wanted to sell up on our platform. And we saw a surprising amount of innovation happening. We saw people being able to take our data, take our platform, take our services, and build really innovative new products, unexpectedly innovative new products, to be perfectly honest."

That's how the e-commerce platform developed. Web services followed the same idea:

"About five years ago, we had a blinding flash of the obvious: in addition to building up all these custom operations and custom procedures, and building up this massive global infrastructure to run our e-commerce site, what would happen if we opened that up to the same developers? What if we extended our API access right back to our data centers?" said Wood.

The result was Amazon Web Services, which allows companies to have their video transcoding done in the cloud so they don't need to invest in software or hardware to do the job. Most companies have spiky transcoding needs, said Wood, meaning that they need transcoding at certain times but not others.

To hear more about Amazon Web Services, as well as in-house options, click here for a video.

For companies that only need video transcoding occasionally, cloud computing makes sense. Amazon Web Services provides scalable performance on demand.
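As an illustration of why spiky workloads favor on-demand capacity, the toy comparison below contrasts owning a transcoding server year-round with renting capacity only for the hours actually used. Every number in it is a made-up assumption chosen for illustration, not AWS pricing.

    # Toy utilization economics for spiky transcoding demand. All figures are
    # hypothetical assumptions for illustration, not vendor pricing.
    hours_needed_per_month = 40          # transcoding only runs around content releases
    on_demand_rate = 2.00                # assumed cost per machine-hour in the cloud
    owned_server_monthly_cost = 600.0    # assumed amortized hardware + power + admin

    cloud_cost = hours_needed_per_month * on_demand_rate
    print(f"cloud: ${cloud_cost:.2f}/month vs owned: ${owned_server_monthly_cost:.2f}/month")
    # With only 40 busy hours a month, paying per hour is far cheaper than
    # keeping a dedicated machine idle the rest of the time.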

Abacast Customers Use Cloud-Based Clarity To See More Business 

Excerpted from All Access Music Group Report

Cloud-based advertising management software and services firm Abacast reports that customers are seeing significantly increased CPMs and are able to close more business when using the listener geo-targeting capabilities available in the cloud-based Clarity Digital Radio System. In aggregate, its broadcasting customers are seeing an average increase of 50% for geo-targeted CPMs versus non-targeted CPMs.

"As more and more advertisers base the success of their campaigns on efficiencies, it is important that we be able to offer targeting by DMA," ESPN Audio Sales Manager Tom Fitzgerald said. "With DMA targeting we can help our clients cut waste and change messaging to become more relevant to the local listener."

"Our goals with the Clarity system are to help online radio customers maximize their streaming profits and to lower their digital risk," Abacast CEO Rob Green said. "Clarity's geo-targeting capabilities directly support these two goals by increasing the value of their digital inventory and by increasing their breadth of offerings."

Clarity geo-targeting is currently being used by a select group of Abacast broadcaster customers, with general availability in Q2 2012.

Fascinating Facebook Facts

Excerpted from Baseline Report by Dennis McCafferty

What did anyone ever do at work before Facebook? How did we keep track of our cousin's daughter's field hockey teams or know what our old high school math teacher was listening to on Spotify?

Not to mention the hit to worker productivity dished out by FarmVille, Words With Friends, or Mafia Wars. But love it or hate it, Facebook has become a part of our social fabric, bringing the relationships of nearly a billion people online.

It's changed the way we communicate and how we interact, and it cannot be ignored. And Facebook's co-founder and CEO, techno-wunderkind Mark Zuckerberg, has become a media sensation.

How many 27-year-old entrepreneurs have feature films about them written by Aaron Sorkin and directed by David Fincher? (We're guessing just this one.) But how much do you really know about the company that knows everything about you? Read on to beef up your Facebook knowledge.

Social Media, Cloud Fragmentation Driving Tech M&A

Excerpted from eWeek Report by Nathan Eddy

Positive spending growth, large cash balances and a predisposition toward mergers and acquisitions are likely to keep the technology industry in the top spot for mergers and acquisitions (M&A) activity in 2012, according to the latest report from PwC.

Innovation and time-to-market concerns continue to fuel M&A activity in the tech sector, according to PwC's "2012 U.S. Technology M&A report," which provides an analysis of 2011 tech deals and the outlook for 2012. The study cited increased activity among tech companies buying up players in the fragmented cloud and social media markets, which have moved to center stage with announced and completed initial public offerings (IPOs).

While spending forecasts have been reduced, the technology industry has generally bucked the negative trends the majority of industries have experienced in the last few years. Despite the unease in the stock market, profitable tech businesses managed to sock away cash during the year, increasing the coffers of the top 20 US technology companies to more than $300 billion in cash and marketable investments by year's end, more than the total transaction value of closed technology deals in the last three years combined, the report said.

Shortly after the close of 2011, IT research firm Gartner lowered its 2012 technology spending forecast to 3.7 percent, and analytics specialist Forrester followed suit, with a forecast of 5.2 percent. Gartner and Forrester estimates for technology spending in 2011 totaled 6.9 percent and 9.7 percent, respectively.

"The decrease in anticipated IT spend in 2012, just over half that of 2011, provides an interesting view of growth in technology businesses in the coming year," PwC's report said.

During the year, the market witnessed the public offerings of Groupon, Zynga, LinkedIn, Pandora and other tech-focused businesses. While overall IPO filings declined in terms of volume and value from 2010 to 2011 due to market volatility during the year, key technology players have not completely written off IPO ambitions. Not long after year-end, came the much-anticipated announcement of Facebook's IPO filing, which is expected to be one of the largest technology IPOs in history.

In addition, PwC noted that several companies announced major changes in 2011, which might signal more transitions to come. For example, Google announced an entrance into the mobile handset market in a big way, with the acquisition of Motorola Mobility.

In another significant shift last year, Hewlett-Packard announced the spin-off of its Personal Systems Group (PSG) business; HP later canceled the plan, and this month announced that it would combine its PSG and printer businesses.

"We expect a continuation of shifts in strategic direction resulting in sizable acquisitions in the coming year," the PwC report said.

With nearly 2 billion Internet users across the globe, the volume of data (both structured and unstructured) - in the form of e-mails, tweets, blogs, instant messages, videos, photos, and web pages - is growing exponentially. Information collected through social media platforms, websites, and mobile providers has the potential to provide businesses with details about consumer demographics and interests like never before.

"Businesses that can provide the software tools to collect, organize, analyze, and summarize findings from this data will become highly relevant in the near term and likely targets for a variety of businesses looking to put big data to work," the report said.

New US Research Will Aim at Flood of Digital Data

Excerpted from NY Times Report by Steve Lohr

The federal government is beginning a major research initiative in big data computing. The effort, which will be announced on Thursday, involves several government agencies and departments, and commitments for the programs total $200 million.

Administration officials compare the initiative to past government research support for high-speed networking and supercomputing centers, which have had an impact in areas like climate science and web browsing software.

"This is that level of importance," said Tom Kalil, Deputy Director of the White House Office of Science and Technology Policy. "The future of computing is not just big iron. It's big data."

Big data refers to the rising flood of digital data from many sources, including the web, biological and industrial sensors, video, e-mail, and social network communications. The emerging opportunity arises from combining these diverse data sources with improving computing tools to pinpoint profit-making opportunities, make scientific discoveries and predict crime waves, for example.

"Data, in my view, is a transformative new currency for science, engineering, education, commerce, and government," said Farnam Jahanian, head of the National Science Foundation's computer and information science and engineering directorate. "Foundational research in data management and data analytics promise breakthrough discoveries and innovations across all disciplines."

On Thursday, the National Science Foundation will announce a joint program with the National Institutes of Health seeking new techniques and technologies for data management, data analysis and machine learning, which is a branch of artificial intelligence.

Other departments and agencies that will be announcing big data programs at a gathering on Thursday at the American Association for the Advancement of Science in Washington include the United States Geological Survey, the Defense Department, the Defense Advanced Research Projects Agency, and the Energy Department. These initiatives will mostly be seeking the best ideas from university and corporate researchers for collaborative projects.

The private sector is the leader in many applications of big data computing. Internet powers like Google and Facebook are masters at instantaneously mining Web data, click streams, search queries, and messages to finely target users for online advertisements. Many major software companies, including IBM, Microsoft, Oracle, SAP, and SAS Institute, and a growing band of start-ups, are focused on the opportunity in big data computing.

Still, there is an important complementary role for the government to play where the incentives for private investment are lacking, according to administration officials and computer scientists. Such areas, they say, include scientific discovery in fields like astronomy and physics, research into policy issues like privacy, and funding for research at universities, where the high-technology work force of the future is educated.

But for government departments and agencies promoting and mastering big data computing, there is self-interest as well.

"There is recognition by a broad range of federal agencies that further advances in big data management and analysis are critical to achieving their missions," said Edward Lazowska, a computer scientist at the University of Washington. "It doesn't matter whether the mission is national defense, energy efficiency, evidence-based health care, education or scientific discovery."

At the session on Thursday, there will be presentations by scientists who are experts in big data computing.

Astronomy is a pioneering discipline for the approach. The Sloan Digital Sky Survey has used digital sensors to scan distant galaxies from an optical telescope in New Mexico, collecting vast amounts of image data that are processed with powerful computers.

The resulting three-dimensional mapping has yielded a "visual representation of the evolution of the universe," said Alexander Szalay, a professor at Johns Hopkins University. He calls the digital sky program a "cosmic genome project."

At Stanford University, an intriguing big-data experiment in online education is under way. Last year, three computer science courses, including videos and assignments, were put online. Hundreds of thousands of students have registered and participated in the courses.

The courses generate huge amounts of data on how students learn, what teaching strategies work best and what models do not, said Daphne Koller, a professor at the Stanford Artificial Intelligence Laboratory.

In most education research, teaching methods are tested in small groups, comparing results in different classrooms, Ms. Koller explained. With small sample groups, research conclusions tend to be uncertain, she said, and results are often not available until tests at the end of school semesters.

But in an online class of 20,000 students, whose every mouse click is tracked in real time, the research can be more definitive and more immediate, Ms. Koller said.

"If 5,000 people had the same wrong answer, it's obvious a concept is not getting through, and you have a clear path that shows where students went wrong," she said.

That kind of data tracking in education, she said, provides "an opportunity no one has exploited yet."

Red Hat: What an Open Cloud Really Means

Excerpted from ComputerWorld Report by Brandon Butler

Open source cloud offerings have specific characteristics that provide benefits above and beyond proprietary offerings, two top officials at Red Hat said during a webinar today.

"Everyone's talking openness when it comes to cloud computing, because it's something users say they want," said Gordon Haff, Cloud Evangelist for Red Hat. "And providers are saying, 'Yeah, yeah, we're open.' But in fact, 'openness' has a narrow definition."

Cloud providers seem to have latched on to the idea of openness in the cloud. This month, for example, Amazon Web Services announced an agreement with Eucalyptus, which is an open source provider of private cloud systems, that will allow for easier connection of open-source private clouds with AWS' public cloud offerings.

Meanwhile, the OpenStack movement is advancing. Started two years ago by Rackspace and NASA, it has now grown to include more than 150 companies and more than 2,000 developers. Plus, some major backers have signed on to the project, including HP and Dell. Meanwhile, some start-ups, such as Piston Cloud, are already selling open-source cloud solutions based on the OpenStack software.

But Haff and CIO Lee Congdon said true open projects have a number of characteristics that users should look for. For example, open clouds are fundamentally based on open source code developed and supported by a viable independent community that is not run by a single vendor. Open clouds are also unencumbered by patents and intellectual property (IP) restrictions, and they have extensible, open application program interfaces (APIs) that are shared within the community, giving users the ability to move across various cloud environments easily.

These characteristics create a series of benefits that Congdon said make open clouds superior to proprietary offerings. "The power of open source is about being able to have collaboration, the sharing of resources and rolling back out into the community," Congdon said. In open source projects, there are "multiple ways to solve a problem," which gives users the freedom and flexibility to scale their systems and quickly adapt to enterprise needs, all while running on legacy hardware systems.

Still, Congdon said there are concerns with implementing cloud strategies, particularly in the software-as-a-service (SaaS) market, where he said there can be a risk of vendor lock-in. "SaaS is great, it's made a number of important advances, but lock-in is a real concern for many SaaS solutions, so you have to be careful," he said. Consider whether that vendor is stable, whether it will be acquired, and if so, what that could mean for the application.

Those concerns haven't stopped Red Hat from embracing the cloud internally though. Congdon said the company has about a third of its applications in a SaaS-based cloud environment.

"We don't know yet where it's going to make sense to stop moving applications to the cloud," he said. Red Hat will continue to use products and services internally so it can be used as a reference customer, he said. "We're going to be pushing a lot of stuff into public cloud environments in the coming years and we'll be partnering with business users on when and where that makes sense," Congdon said. "So we're just getting started."

Autodesk CEO Pushes "Democratization" of Technology 

Excerpted from ReadWriteWeb Report by Fredric Paul

Most people think of Autodesk as the maker of AutoCAD, the design software of choice for architects, engineers, and other design professionals - typically running on high-powered workstations. So why is Autodesk CEO Carl Bass so hung up on the "democratization" of technology - spreading technology to cloud computing platforms and mobile devices?

At the company's media summit in San Francisco this morning, Bass told a crowd of journalists, analysts and customers gathered in the company's slick design gallery that the combination of mobile devices, cloud computing, and social collaboration is more profound than the shift to PCs.

Bass sees the world changing from a PC-centric model where workers promise to "e-mail you that file when I get back to the office," to an environment where mobile devices and the cloud make wherever you are the computing center of the world.

It's already happening, he claimed, citing a list of impressive usage figures:

2 million unique visitors a month to Autodesk 360, the company's cloud offering.

30 files a minute uploaded to AutoCAD WS, the company's cloud-based AutoCAD editor.

10 million downloads of SketchBook in 2 years, now averaging 150,000 per week on PC and mobile platforms.

13 million unique visitors - more than Pinterest - to the company's Instructables community.

21 million unique visitors a month to Pixlr, its online photo editor.

On the low end, naysayers like to denigrate the importance of mobile products, Bass said, calling them "juvenile" "toys." But he pointed out that "consumers by night are often professionals by day."

He also claimed that professionals can do serious work on today's portable devices. "I think we're underestimating these small devices in the work that we do. They can run serious apps" for engineers and other demanding users, and they are getting more powerful all the time.

Meanwhile, on the high end, the cloud lets anyone take advantage of analyses that used to require dedicating expensive workstations for days at a time. Now, "You can do it in the cloud in an hour," he said.

The cloud, Bass added, "is an infinitely scalable resource," limited only by how much you're willing to pay. For urgent jobs, you can pay more and get it done faster. Other tasks can be done more cheaply over time. And that raises a fundamental question: "What would you do differently if you could compute answers faster?"

Autodesk may be a bit ahead of its time. The vast majority of serious design work is still being done at powerful workstations, just as it has been for a while. But Bass couldn't be more correct about the trends. It's hard to argue that more and more computing tasks won't keep moving away from the desktop. Big, data-intensive jobs will move to the cloud, while smaller, more UI-focused tasks are going mobile.

There will always be some things best done sitting at your computer. But the number of those things is clearly shrinking, not growing.

Apple TV Set - Not Yet

Excerpted from Around the Net in Online Marketing Report by Gavin O'Malley

Ready or not for an Apple TV, the market might have to wait longer than expected for the supposedly transformative device. According to Asian research group CLSA, the big release won't occur until 2013.

"We continue to view Apple TV hardware as a 2013 event," reads the research brief first reported by Business Insider.

Either way, "most clients agree that a TV is coming," writes BI. "The critical question becomes how the video providers fit into the equation and how Apple's offering would/could differ from current TVs beyond iOS/icloud."

"At this moment, the Apple television is technically fictional," Fast Company reminds us. "However, numerous pieces of information have suggested a late-2012 time window for the arrival of the iTV and it logically makes sense: Apple's iPad was its last industry-shaking device, and that was a whole two years ago. Slackers!"

Regarding the report from the Hong Kong research firm, Fast Company calls it "probable." In fact, "Looking at what Apple's up to right now it would seem like a late-2012 launch would be odd."

Still, "CLSA does not have a particularly prolific track record - the only previous report we have on record from the firm is an incorrect one from June 2011 claiming that an LTE-equipped iPad would launch ahead of the 2011 holiday season," MacRumors reports. "The new iPad with LTE support of course didn't launch until earlier this month."

As CNET reports, "Rumors have flown over the last couple of years that an Apple TV set will debut in the near future. At the beginning of January, sources were speculating a 50-inch screen, and at the end of January Apple was reportedly speaking to major TV component suppliers."

Meanwhile, in January, USA Today quoted Apple co-founder Steve Wozniak as saying: "I do expect Apple to make an attempt to get into the TV set business, since I expect the living room to remain a center for family entertainment, and that touches on all areas of consumer products that Apple is already making."

Broadband Players Turn to Caching 

Excerpted from TeleCompetitor Report by John Engebretson

Caching, a technology originally deployed by Internet service providers (ISPs) in the 1990s, is seeing renewed interest - but this time it's broadband service providers that are deploying the technology, as an announcement expected today from Frontier Communications illustrates.

The idea of caching is to retain popular web content in a cache near the end-user to eliminate the need to repeatedly send the same content over wide area connections. In the 1990s, ISPs began deploying caching technology to minimize their costs. Today broadband providers are installing it even closer to the end user to minimize their traffic, which has grown tremendously with the advent of over-the-top video.
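To make the mechanism concrete, here is a minimal sketch (in Python, with hypothetical names such as fetch_from_origin) of the hit/miss logic such a cache implements: repeated requests for the same object are answered locally, and the wide-area link is used only on a miss. A production cache adds expiry rules, object-size limits, and full HTTP semantics, but the flow is the same.

    # Illustrative sketch of the caching idea described above: popular objects are
    # kept near the end-user, and the wide-area link is used only on a cache miss.
    from collections import OrderedDict

    class EdgeCache:
        def __init__(self, max_items=1000):
            self.max_items = max_items
            self.store = OrderedDict()           # URL -> content, ordered by recency

        def get(self, url, fetch_from_origin):
            if url in self.store:
                self.store.move_to_end(url)      # cache hit: no backhaul traffic
                return self.store[url]
            content = fetch_from_origin(url)     # cache miss: one wide-area fetch
            self.store[url] = content
            if len(self.store) > self.max_items:
                self.store.popitem(last=False)   # evict the least recently used item
            return content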

Frontier plans to announce that it is using a suite of products from BTI Systems, including BTI's WideCast caching solution. As Frontier Senior Vice President of Engineering and Technology Michael Golob explained in an interview, Frontier already has extensively deployed BTI's packet optical networking equipment, which he said "scales down well" in rural markets, such as those the carrier acquired from Verizon.

In isolated markets where Frontier has to purchase circuits from other carriers for backhaul connectivity to Internet points of presence, the carrier also has deployed the WideCast offering, which is installed on BTI's packet optical networking platform.

Golob pointed to the example of Moab, UT - an isolated Frontier community nearly 300 miles from Salt Lake City. Internet traffic to and from Moab must be backhauled to an Internet point of presence (PoP) in Salt Lake City, and Frontier must rely on circuits leased from several other carriers for that route. "The average cost is about $230 a megabit," said Golob.

By using WideCast, Frontier has been able to reduce backhaul traffic - and costs - by about 25% while also improving the end-user experience, Golob said. Although Golob did not have any metrics to illustrate the improvement in the end user experience, he said, "The response time to get a movie or website is faster and the feedback we get from customers is that our performance is better than competitors in the marketplace."

Not every caching vendor uses the same approach as BTI. Rather than making it part of a packet optical networking platform, some vendors have implemented caching on a stand-alone server - an approach Golob said is similar to what content delivery networks such as Akamai and Limelight use. But Golob said the cost-effectiveness of that approach for an individual market has not yet been proven.

Frontier will likely explore the possibility of a server-based approach next year, said Golob.

CiRBA Secures $15 Million to Fuel Growth

Excerpted from SYS-CON Media Report by Pat Romanski

CiRBA on Tuesday announced it has raised $15 million in its third round of institutional funding led by Tandem Expansion Fund. Existing investors Sigma Partners and Edgestone Capital Partners also participated in the deal. The growth round of financing follows a series of enterprise-wide sales to some of the largest financial institutions in the world. These enterprise customers helped CiRBA achieve record software bookings in 2011.

Chris Legg, Managing Partner, Tandem Expansion Fund, noted that "CiRBA is far ahead of other software products both in terms of capability and customers with enterprise deals of up to 100,000 servers. We feel we have made a great investment and look forward to supporting their growth."

CiRBA software provides intelligent control for infrastructure. It controls workload placements, resource allocations, and capacity reservations, and it enables automation by distributing intelligent commands to existing management solutions, optimizing infrastructure and reducing operational risk.

IT organizations are facing a new level of complexity in these dynamic environments. Without continuous predictive intelligence to control and automate infrastructure management, IT organizations will be unable to deliver promised efficiencies and will introduce operational risk.

Is Hadoop The New Tape?

Excerpted from CNET News Report by John Webster

Is Hadoop a cool new petabyte storage array? Or is it already as obsolete as yesteryear's tape drives? Some think Hadoop is merely a bridge to some better, future platform.

I attended GigaOM's Structure: Data 2012 conference in New York City last week. This is the second one I've attended, and I'm now a confirmed advocate of this event. Om Malik brings together people who, in one way or another, represent much of the creative thinking around so-called big data. I got the feeling that I could strike up a conversation with anyone there and learn something new.

I noticed at least two major differences between the Structure: Data event I attended last year and this year's version. Last year, most if not all of the exhibiting vendors represented the xSQL community (mySQL, NoSQL, etc.). Much more diversity was on display this year.

Hadoop vendors were there in force - no surprise there. Hadoop is an open-source-developed, distributed computing platform commonly used for data-intensive analytics and business intelligence applications. But, I was surprised by the number of storage vendors making the case for using shared storage in big-data analytics architectures that are typically classified as "shared nothing." Shared storage mixes with shared nothing like oil mixes with water.

So in keeping with the theme of storage in big-data analytics, here are a few choice comments I heard during sessions and on the show floor that relate to storage and Hadoop:

"Hadoop is a revamp of how we store and access data."

This one got me thinking about Hadoop as a storage device. One of the presenters mentioned that if you put 1 terabyte of RAM into each of 1,000 Hadoop data nodes in a single cluster you would have, in aggregate, 1 petabyte of very high-performance storage that's built on open-source software and commodity hardware. And, there's more to this story.

Hadoop has an embedded, distributed file system, as do some scale-out network-attached storage (NAS) implementations. Data protection is built in. It's not RAID (redundant array of independent disks), but Hadoop does maintain multiple copies of data (typically three) across data nodes. So, should you as an IT administrator evaluate Hadoop on the basis of it being a storage device? I think you should.
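A quick back-of-the-envelope calculation, using only the figures quoted above (1,000 data nodes, 1 terabyte each, and Hadoop's typical three copies of each block), shows how aggregate capacity relates to usable capacity. The snippet below is purely illustrative.

    # Back-of-the-envelope arithmetic for the scenario above; the node count,
    # per-node capacity, and 3x replication come from the article, everything
    # else is illustration.
    nodes = 1000
    per_node_tb = 1.0          # 1 TB per data node
    replication = 3            # Hadoop typically keeps three copies of each block

    raw_pb = nodes * per_node_tb / 1000.0   # aggregate raw capacity, in PB
    usable_pb = raw_pb / replication        # capacity left after 3x replication

    print(f"raw: {raw_pb:.2f} PB, usable with {replication}x copies: {usable_pb:.2f} PB")
    # raw: 1.00 PB, usable with 3x copies: 0.33 PB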

"Hadoop is not about real time."

The Hadoop community prefers to use disk storage embedded in each data node rather than large, centralized, shared storage (NAS, SAN), for two reasons: 1. Speed. Using DAS (direct-attached storage) in each data node reduces overall cluster latency. Users get closer to having results available to them in real time. 2. Cost. DAS is inexpensive. NAS and SAN (storage area network) are perceived to be not so.

But, if your Hadoop cluster isn't about real time, and you want your Hadoop environment to have some of the features that shared storage brings to the table - like dynamic capacity expansion, snapshotting, and data deduplication - then shared storage is worth considering.

"The big elephant doesn't move through the little pipes especially well."

Getting data into and out of Hadoop from remote locations is a problem that has been identified by service providers. Storage system and networking developers have been working out the "data here, data there" issues for years. Distributed storage architectures are emerging that could address this problem.

"Hadoop is the new tape."

Yikes. We go from Hadoop being the cool new storage thing to yesterday's toast in the span of a single blog post. Here's how I interpret this comment. There's a debate going on within the Hadoop community regarding the need for better responsiveness from Hadoop developers. Known issues with Apache Hadoop need to be addressed more quickly.

The user learning curve needs to be shortened. There are other knocks too. All of which leads some people to believe that Hadoop is merely a bridge to some better, future platform. Me? I'm in the definite maybe camp. I do see an opportunity for Hadoop implementations with applications built on top that would address the elongated user learning curve.

More to come from this event in future blog posts. Here's another comment I could start one with: "We don't actually need big data. What we need is small data."

Buffalo LinkStation Pro Duo 2TB: An Affordable, Solid Performer

Excerpted from PCWorld by Jon Jacobi

The $380 Buffalo LinkStation Pro Duo streams media well - and it performs well, too, thanks to its 1.6GHz CPU. Software features include the usual sharing, DLNA/iTunes media serving, and Mac Time Machine support, but you also get rarer perks such as BitTorrent downloads, Squeezebox streaming, backup from PCs using the five supplied licenses to Nova Backup, and website serving.

The LinkStation Pro Duo is a basic two-bay NAS box. It comes with a single USB 2.0 port on the back, but no USB 3.0 or eSATA. The unit also lacks a quick-copy button. The two drive bays are easy to reach when you pop off part of the front cover, but they aren't hot-swappable.

In its default RAID 0 configuration, the LinkStation Pro Duo read our 10GB large file at 68.5 megabytes per second. That's speedy enough to handle streaming even 1080p video, though it's only slightly faster than the slowest box we tested for our roundup of 11 NAS boxes. The other performance numbers are lower-echelon as well, though adequate for home use. The box wrote our 10GB mix of files and folders at 27.5 MBps, read them at 40.6 MBps, and wrote the large 10GB file at 45.6 MBps.

RAID 0 increases performance slightly and uses the full capacity of the box's drives, but it leaves all of the data on the box at risk. Drive failure happens, so keep the LinkStation Pro Duo backed up, or consider reconfiguring it to RAID 1 mirroring to build in data redundancy.
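For readers weighing that choice, here is a small illustrative calculation of the capacity/redundancy tradeoff, assuming the two-bay box holds two 1TB drives; the drive sizes are an assumption for illustration, not a Buffalo specification.

    # RAID 0 vs. RAID 1 tradeoff discussed above, assuming two 1TB drives
    # (an assumption for illustration).
    drives_tb = [1.0, 1.0]

    raid0_capacity = sum(drives_tb)   # striping: full capacity, no redundancy
    raid1_capacity = min(drives_tb)   # mirroring: half capacity, survives one drive failure

    print(f"RAID 0: {raid0_capacity} TB usable, 0 drive failures tolerated")
    print(f"RAID 1: {raid1_capacity} TB usable, 1 drive failure tolerated")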

The LinkStation Pro Duo's operating system is generally well-organized, but the language and help are often terse, technical, and not particularly easy to follow. You can puzzle stuff out fairly quickly if you have the tech chops, but we noticed some very loose ends in the interface and in what it describes.

The 2TB LinkStation Pro Duo's street price of approximately $380 puts it in direct competition with the Western Digital My Book Live Duo 4TB. The latter offers twice the capacity for the same price, as well as faster (by 20 MBps) large-file read performance. That makes the LinkStation Pro Duo a tough sell, even though it has some attractive software features that the My Book Live Duo lacks.

Coming Events of Interest

2012 NAB Show - April 14th-19th in Las Vegas, NV. From Broadcasting to Broader-casting, the NAB Show has evolved over the last eight decades to continually lead this ever-changing industry. From creation to consumption, the NAB Show has proudly served as the incubator for excellence - helping to breathe life into content everywhere. 

CLOUD COMPUTING CONFERENCE at NAB - April 16th in Las Vegas, NV. Don't miss this full-day conference focusing on the impact of cloud computing solutions on all aspects of production, storage, and delivery of television programming and video.

Cloud Computing World Forum - May 8th in Johannesburg, South Africa. The Cloud Computing World Forum Africa is the only place to discuss the latest topics in cloud, including security, mobile, applications, communications, virtualization, CRM and much, much more.

Cloud Expo - June 11th-14th in New York, NY. Two unstoppable enterprise IT trends, Cloud Computing and Big Data, will converge in New York at the tenth annual Cloud Expo, being held at the Javits Convention Center. The event will feature a vast selection of technical and strategic General Sessions, Industry Keynotes, Power Panels, Breakout Sessions, and a bustling Expo Floor.

IEEE 32nd International Conference on Distributed Computing - June 18th-21st in Taipa, Macao. ICDCS brings together scientists and engineers in industry, academia, and government: Cloud Computing Systems, Algorithms and Theory, Distributed OS and Middleware, Data Management and Data Centers, Network/Web/P2P Protocols and Applications, Fault Tolerance and Dependability, Wireless, Mobile, Sensor, and Ubiquitous Computing, Security and Privacy.

Cloud Management Summit - June 19th in Mountain View, CA. A forum for corporate decision-makers to learn about how to manage today's public, private, and hybrid clouds using the latest cloud solutions and strategies aimed at addressing their application management, access control, performance management, helpdesk, security, storage, and service management requirements on-premise and in the cloud.

2012 Creative Storage Conference - June 26th in Culver City, CA. In association with key industry sponsors, CS2012 is finalizing a series of technology, application, and trend sessions that will feature distinguished experts from the professional media and entertainment industries.
