July 30, 2012
Volume XL, Issue 5
DataDirect Networks Sponsors CLOUD COMPUTING WEST 2012
The DCIA and CCA proudly announce that DataDirect Networks (DDN) has signed on as a sponsor of the CLOUD COMPUTING WEST 2012 (CCW:2012) business leadership summit taking place November 8th-9th in Santa Monica, CA.
DDN is the world's largest privately held data storage infrastructure provider. With a unique and exacting focus on the requirements of today's massive unstructured data generators, DDN has built a comprehensive product portfolio for "big data" applications, optimized for the world's most data-intensive environments including high performance computing, life science research, web and cloud content, professional media, homeland security, intelligence, and more.
In many ways, DDN has been mastering big data challenges before the term "big data" was even invented. By supporting the requirements of the world's largest file storage systems and demanding applications for over ten years, DDN has developed both domain expertise and an unfair advantage in today's new data explosion.
In partnership with a worldwide network of resellers and integrators, DDN is ideally positioned to capitalize on its first mover advantage in the "big data age." As big data continues to become more prevalent and democratized across all industries — by businesses looking to capture, analyze, and derive insights from new data generators — DDN will continue to enable organizations to maximize the value of information everywhere.
CCW:2012 will feature three co-located conferences focusing on the impact of cloud-based solutions in the industry's fastest-moving and most strategically important areas: entertainment, broadband, and venture financing.
DDN will keynote the "Cloud Storage" session at the Entertainment Content Delivery conference, and the "Big Data" session at the Investing in the Cloud conference.
CCW:2012 registration enables delegates to participate in any session of the three conferences being presented at CCW:2012 — ENTERTAINMENT CONTENT DELIVERY, NETWORK INFRASTRUCTURE, and INVESTING IN THE CLOUD.
At the end of the first full day of co-located conferences, attendees will be transported from the conference hotel in Santa Monica to Marina del Rey Harbor, where they will board a yacht for a sunset cruise and networking reception.
So register today to attend CCW:2012 and don't forget to add the Sunset Cruise to your conference registration. Registration to attend CCW:2012 includes access to all sessions, central exhibit hall with networking functions, luncheon, refreshment breaks, and conference materials. Early-bird registrations save $200.
The Age of Distributed Computing & Innovative Web Solutions
Excerpted from Sooper Article Report by James Paul
The ever-expanding Internet has become an essential source of entertainment, business, and communications. It links millions of computers, and anybody with a basic personal computer (PC) and basic communication equipment can take part in it. Running 24x7x365 on top of the Internet is the World Wide Web, or simply the web.
If the Internet is like the electric power running through the wires of your home, the web is the appliances that plug into it: the light bulbs and fans (the applications). The web's ability to deliver multimedia content at a keystroke or a click of the mouse is nothing short of an eighth wonder of the world.
This is truly the age of distributed computing, network services, and innovative web solutions. As the Net evolves, new ways of computing become possible. Network computing is one such development, promising greater compatibility and efficiency than personal computing.
Modern network services are geared toward replacing the PC (the "fat client") with the network computer, or "thin client." Unlike a fat client (which needs large amounts of processing power, disk space, and data servers), a thin client needs less memory and no disk storage, and so is much cheaper.
Instead of storing application programs and data on a local hard disk, the thin client simply downloads programs from a central server into its RAM as needed. This makes computing cheaper than ever before.
Network computing has been made possible because the computer industry has agreed on hardware and software standards, including a new language, Java. Whether the present Internet infrastructure (phone lines, cables, etc.) has the capacity, speed, and consistency required to support network computing remains to be seen.
The Internet and the web are constantly evolving. In the future, heavy computers, wires, and cables may become redundant, and network services will only need to maintain a few central servers instead of countless personal machines. Already, we are witnessing innovative web-based solutions (virus protection programs, for example) that solve many PC problems which previously required manual installation.
The scenes depicted in sci-fi and futuristic movies may one day come true, as we all will be connected via networked computers. Every human activity will be recorded and accounted for, reducing crime and human misery. While some experts maintain that this will create a new set of problems, the future looks optimistic. Let us wait and watch.
Report from CEO Marty Lafferty
The US Senate voted 84-11 this week to proceed to debate, amend, and vote on the newly revised Cybersecurity Act of 2012 (S. 3414).
Once Senate Majority Leader Harry Reid (D-NV) agreed to an open amendment process, Republican opponents consented to support a cloture vote that would end debate and bring the bill to the floor. The bill, as currently constituted, provides for voluntary cybersecurity standards, but also authorizes agencies that regulate critical infrastructure — like the Federal Communications Commission (FCC) — to codify and enforce those standards.
The measure also allows for information sharing among industry players, with anti-trust and liability carve-outs, and gives Internet service providers (ISPs) authority to monitor traffic and take countermeasures.
The Obama Administration expressed its strong support for the measure, while offering a few amendments of its own. "The Administration strongly supports Senate passage of S. 3414," said the Office of Management and Budget (OMB).
"While lacking some of the key provisions of earlier bills, the revised legislation will provide important tools to strengthen the Nation's response to cybersecurity risks. The legislation also reflects many of the priorities included in the Administration's legislative proposal."
The White House also signaled its opposition to any changes that would weaken the bill's privacy protections.
"The Administration particularly appreciates the bill's strong protections for privacy and civil liberties, and would not support amendments that weaken these protections. The Administration agrees that it is essential that the collection, use, and disclosure of such information remain closely tied to the purposes of detecting and mitigating cybersecurity threats, while still allowing law enforcement to investigate and prosecute serious crimes. All entities — public and private — must be accountable for how they handle such data."
As the week progressed, Democratic majority backers of the bill asserted that cybersecurity guidelines would be voluntary, while Republicans said such guidelines were voluntary in name only.
But the White House said it would not support amendments "reducing the Federal Government's existing roles and responsibilities in coordinating and endorsing the outcome-based cybersecurity practices; weakening the statutory authorities of the Department of Homeland Security to accomplish its critical infrastructure protection mission; or substantially expanding the narrowly-tailored liability protections for private sector entities."
Republicans argued that the private sector needs broad liability protections from anti-trust concerns over competitive issues arising from sharing info with the government or each other. But Democrats argued that "overly broad" industry immunity "would undermine the very trust that the bill seeks to strengthen."
OMB also said it had concerns about provisions "purporting to prescribe the Executive branch's responsibilities in coordinating with foreign governments and conducting diplomatic negotiations," and wanted it to be made clear that the President "has exclusive constitutional authority to conduct diplomacy."
It also sought clarification on "protection of intelligence sources and methods, as well as information sharing and policy coordination."
Demand Progress, which is still not satisfied with the measure, continued its letter-writing campaign to the Senate to urge additional pro-privacy protection language.
Meanwhile, the release of the Declaration of Internet Freedom has sparked a vigorous global discussion about the role of the Internet in our lives — and what users can do to help keep it free and open in the face of continuous threats.
Interested parties from all around the world are organizing related Internet Freedom Events, such as "translate-a-thons" to create and distribute as many translations of the Declaration as possible, as well as happy hours, "series of tubes" tubing excursions, beach parties, presentations, art shows, and, naturally this season, barbecues. Please click here for photos. Share wisely, and take care.
Digital's Big Four Jockey for Dominance, Reshaping Industry
Excerpted from eMarketer Report
A significant portion of the digital experience now rests in the hands of four companies — Amazon, Apple, Facebook, and Google. "Other than content creation, it's difficult to imagine any aspect of today's digital landscape where at least one of the Big Four fails to play a prominent, if not defining, role," said eMarketer in the new report The Changing Digital Landscape: Key Trends Marketers Need to Know.
"Their clashes are reshaping the digital landscape, affecting hardware, software, services, the delivery and sale of content, advertising, and commerce."
All four companies are not competing equally in all of these realms, of course. But by a combination of necessity and design, "all have expanded far beyond their core competencies in an effort to strengthen their appeal to users, solidify their standing with marketers and maintain their easily eroded relevance in the fast-moving digital economy," said eMarketer.
For example, with Android, Google initially took on Apple in the smart-phone arena. Google introduced its own Nexus One smart-phone at the beginning of 2010. Although the device itself was a commercial failure, it accomplished Google's larger goal of jumpstarting the Android platform. And now Google is moving into device manufacturing, introducing its own Nexus 7 tablet.
By comparison, Facebook, as a platform that sits across every device and operating system, finds itself at a competitive disadvantage. The persistent rumors of a Facebook-developed phone reflect this current liability.
"Expansionary moves on the part of the Big Four reflect a corresponding evolution in the digital world: Tight, increasingly verticalized integration of hardware, software, content, services, advertising, and commerce has become table stakes," said eMarketer. "Competing effectively now requires direct control over many, if not most of these assets, and enough leverage in areas of strength to compensate for areas of weakness."
Overall, the influence the Big Four wield over the digital experience is broad-reaching. "In many ways, it is their world; everyone else — consumers and marketers alike — simply play (or work) in it."
The full report, "The Changing Digital Landscape: Key Trends Marketers Need to Know," also answers these key questions: Is the future of digital media open or closed? How is the ascent of smart devices redefining the computing—and marketing—landscape? And to what extent have platforms become a dominant force in mediating the digital media experience for both consumers and marketers?
Cloud Computing: The Transformational Cloud
Excerpted from Cloud Computing Journal Report by Pat Romanski
"Cloud solutions are great for lots of workloads but the cloud is not perfect for everything," noted Robert Miggins, Senior Vice President of Business Development for PEER1 Hosting, in this exclusive Q&A with Cloud Expo Conference Chair Jeremy Geelan. And Miggins continued, "We have many examples of highly skilled clients that are staying with traditional hosting or pursuing hybrid solutions for various reasons."
Cloud Computing Journal: Just having the enterprise data is good. Extracting meaningful information out of this data is priceless. Agree or disagree?
Robert Miggins: I strongly agree. What good is data without putting it to use to make good decisions?
Cloud Computing Journal: Forrester's James Staten said, "Not everything will move to the cloud as there are many business processes, data sets, and workflows that require specific hardware or proprietary solutions that can't take advantage of cloud economics. For this reason we'll likely still have mainframes 20 years from now." Agree or disagree?
Miggins: AGREE - cloud solutions are great for lots of workloads but the cloud is not perfect for everything. We have many examples of highly skilled clients who are staying with traditional hosting or pursuing hybrid solutions for various reasons. And that's why PEER1 Hosting is a full solution provider - because there are lots of ways to help businesses with their IT and hosting and the cloud is just one of those ways.
Cloud Computing Journal: The price of cloud computing will go up - so will the demand. Agree or disagree?
Miggins: Disagree - cloud solutions, like the cost of hardware itself, will continue to decline - however, they will not drop off as aggressively as server hardware costs.
Cloud Computing Journal: Rackspace is reporting an 80% growth from cloud computing, Amazon continues to innovate and make great strides, and Microsoft, Dell and other big players are positioning themselves as big leaders. Are you expecting in the next 18 months to see the bottom fall out and scores of cloud providers failing or getting gobbled up by bigger players? Or what?
Miggins: Maybe so, but there will also be lots of new start-ups entering the market. Cloud is so transformational that there will continue to be lots of ways to innovate.
Cloud Computing Journal: Please name one thing that - despite what we all may have heard or read - you are certain is NOT going to happen in the future, with Cloud and Big Data?
Miggins: Datasets are not getting any smaller. We can all agree that the amount of data stored in the cloud will continue to grow.
Huawei Knocks Off Ericsson as World's Biggest Telecom Vendor
Excerpted from GigaOM Report by Kevin Fitchard
Huawei reported 2012 half-year revenues this week that make it the largest telecom infrastructure maker in the world — a title formerly held by Ericsson. The two, however, are neck and neck, and a new contract or a currency fluctuation could see them change places once again.
Huawei on Tuesday reported revenues of $16.1 billion for the first six months of 2012. That would seem like a perfectly ordinary quarter for the giant and growing Chinese telecom vendor, but there is something particularly significant about this earnings report.
According to some quick calculations made by Light Reading's Ray Le Maistre, Huawei's sales have surpassed Ericsson's, making the privately held company the largest telco infrastructure maker in the world.
Sweden's Ericsson brought in $15.25 billion in the first half of the year, putting it $850 million shy of its Chinese rival. That may seem like a lot, but exchange-rate swings between the Chinese yuan and the Swedish krona have a big impact.
The two also have different portfolios. Ericsson is still by far the largest cellular infrastructure maker in the world, while Huawei has sizable handset and enterprise businesses. Ericsson no longer has the revenues from its handset joint venture with Sony, but it did get a big sales boost this year from its recent acquisition of network systems vendor Telcordia.
Ericsson would surely argue it sells more actual telecom network gear than Huawei, but one thing is certain: this race isn't over. Both companies are growing despite the poor global economy, and as they continue to land more contracts and currencies continue to fluctuate, they likely will keep leapfrogging one another. Huawei and Ericsson are both well ahead of their next closest competitors, Alcatel-Lucent and Nokia Siemens Networks.
The amazing thing is that Huawei has risen to global network prominence despite having almost no presence in the US, which, along with China, is one of the two most important infrastructure markets in the world.
Huawei has some handset deals to sell carrier-rebranded smart-phones — the biggest of which is for T-Mobile's next generation of MyTouch phones — but it doesn't have a single major network equipment contract to its name in the US.
Meanwhile Ericsson has its fingers in every major network build of the Big 4 carriers — and most of the smaller contracts as well.
Huawei attributes this to ingrained prejudice in US government circles against a Chinese vendor building the country's sensitive communications networks. As a privately held company, Huawei lacks the transparency of its competitors, which all trade publicly on major global exchanges.
Alleged links between Huawei and China's People's Liberation Army have led the US government to block government contracts and acquisitions of domestic companies. Huawei has denied such links and has even invited a US investigation to assuage any security concerns.
In a recent interview, Huawei External Affairs VP in the US Bill Plummer said those sinister perceptions of Huawei have cost it a huge amount of business in the US, even though European and Canadian carriers haven't shied away from dealing with the vendor. Plummer said Huawei was on the verge of becoming the third supplier in Sprint's LTE contract and CDMA network overhaul, but politics got in the way (the contract went to Samsung).
"We were the most competitive offering for Sprint in terms of technology and total cost of ownership, but non-market forces dictated the result," he said.
How Telefonica Harnesses Cloud for OTT Services
Excerpted from TelecomTV Report
Telefonica Digital is following up its over the top (OTT) messaging service TU Me with other innovative services, all based on the TU Core platform developed by BlueVia.
TU Me, which was developed in just 100 days, stores users' content in the cloud and, in addition to texting and instant messaging, also facilitates voice calls, photo sharing, and location sharing.
Please click here for a video featuring Jamie Finn, Director of Communications Products, Telefonica Digital.
I Want My Cloud TV!
Excerpted from Multichannel News Report by Todd Spangler
TV Everywhere observers, take note: About one-third (35%) of adult broadband users consider having remote "cloud-based" access to their favorite TV shows to be highly valuable, ranking it a 6 or 7 on a seven-point scale (with 1 being "of no value at all" and 7 being "of great value"), according to a new study by The Diffusion Group (TDG).
That's roughly in line with the share of consumers who said they would find it highly valuable to have access to other media "in the cloud," including movies (37.6%), music (38.6%), and photos (41.6%), TDG found in its survey of 2,000 adult broadband users conducted in the second quarter of 2012.
However, overall, personal cloud-based media services are at a very nascent stage, according to TDG.
Only 9% of Internet users say they use some form of cloud-based media service or application (music, photos or video).
Only 4% of all respondents said they currently have access to TV shows in the cloud — which may indicate that many pay-TV subscribers don't know what is actually available to them online.
Small Ops Say "Dropping Channels Works"
Excerpted from Multichannel News Report by Mark Robichaux
Small operators have to develop a "thick skin and iron will" to drop the channels that demand exorbitant rate increases.
That's one strategy advocated on a panel at the Independent Show in Orlando, FL for small- and mid-sized cable operators. Although many small multiple system operators (MSOs) are fearful about dropping a major network, panelists said it's a tactic that at certain times makes sense to deploy.
On the panel devoted to "Programming Reform: What's Next As DC Starts to Move," John Higginbotham, Superintendent for the Franklin Plant Board, a municipal utility serving 17,000 subs in central Kentucky, said the company has had several "blackouts" of channels during retransmission-consent disputes. The longest was five months in 2005; in 2008 the utility dropped three networks, one for 12 days.
"Our company took a 4,500% increase on the retransmission fee for the in-market affiliate," he said. "It's crazy."
"You gotta have thick skin and iron will and take the beat-down from the customer," said Higginbotham. "I know it's tough to say," he added. "Dropping has worked. You don't want to do it, but sometimes you have to."
Other panelists seemed to agree - as long as consumers are "educated" about how the channels are pulled in retransmission scraps or normal network carriage fights. After the DirecTV-Viacom dispute, "we're seeing that consumers understand the perspective of the distributor more," said Tonya Rutherford, Assistant General Counsel for Business and Legal Affairs at Verizon. "Consumers get it."
"Content is king, but not all content is 'must-see,'" said John Bergmayer, Senior Staff Attorney for Public Knowledge. "And you can't just expect more and more every year because distributors will ultimately start to look elsewhere."
Almost everyone agreed that the state of the economy will affect customers' ability to withstand the big rate increases passed on from programming hikes. Likewise, this consumer frustration will be the primary driver of relief, most likely in the form of an overhaul of the current Telecom Act.
Indeed, the panel opened with a video montage of legislators in various hearings complaining about retransmission-consent blackouts, the effects on consumers and the limited scope of FCC authority. All of the panelists agreed that Congress is likely to act on a broad telecom rewrite by next year.
Another possible solution to retransmission disputes could be technological. Aereo, the upstart broadcast TV service that is currently in a legal fight with broadcasters, was held up as a potential solution for small cable operators.
"If Aereo can do this lawfully, and give subscribers an option to receive the over the air channel without going through the entire retranmission process, why couldn't a cable operator set up the same technological platform at the very least supply the subscriber at home their own antenna to receive free over-the-air TV as God and Congress intended?" asked Barbara Esbin, Senior Counsel at Cinnamon Mueller, a law firm helping small operators.
"It's an extremely promising example of increasing consumer choice, decreasing costs to consumers and making the provision of access to broadcast signals more frictionless, like it was in the beginning."
Four Tips for Managing Large Video in the Cloud
Excerpted from Streaming Media Report
Bandwidth is a problem for high bit-rate video. Cloud-based transcoding has enormous advantages over on-premise transcoding: better ROI, faster speeds, and massive scalability. But professional video content is often stored at 30-100 Mbps (or more), resulting in very large files. Conventional wisdom holds that these files are too large to transfer over the public Internet.
This problem becomes even worse when considering the size of an entire content library. If a publisher creates two hours of high bit-rate 50 Mbps video each day, they will have a library of 32,000 GB after two years. What happens if it becomes necessary to transcode the entire library for a new mobile device or a new resolution? Even though a scalable transcoding system can transcode 32,000 GB of content in just a few hours, moving that content over the public Internet at 100 Mbps would take over 30 days. Fortunately, there are solutions to these problems, and major media organizations like Netflix and PBS are embracing cloud-based services.
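For a rough sense of those numbers, here is a back-of-the-envelope sketch of the arithmetic, using only the figures quoted above (two hours per day at 50 Mbps, moved over a 100 Mbps link):

# Back-of-the-envelope estimate of library size and transfer time,
# using the figures quoted in the article (2 hours/day at 50 Mbps, 100 Mbps link).
BITRATE_MBPS = 50      # mezzanine bit-rate
HOURS_PER_DAY = 2      # new content produced each day
DAYS = 2 * 365         # two years of production
LINK_MBPS = 100        # public Internet connection used for transfer

# Library size in gigabytes (decimal units: 8,000 megabits per gigabyte).
daily_gb = BITRATE_MBPS * HOURS_PER_DAY * 3600 / 8000
library_gb = daily_gb * DAYS
print(f"Library after two years: {library_gb:,.0f} GB")        # roughly 32,000 GB

# Time to push the whole library through a 100 Mbps connection.
transfer_days = library_gb * 8000 / LINK_MBPS / 86400
print(f"Transfer time at {LINK_MBPS} Mbps: {transfer_days:.1f} days")  # roughly 30 days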
We will discuss four techniques used by major publishers to eliminate these bandwidth bottlenecks and efficiently transcode video in the cloud.
The easiest way to eliminate bandwidth bottlenecks is to locate hosting and transcoding together. For example, if your transcoding system is run on Amazon EC2, and you archive your video with Amazon S3, you have free, near-instant transfer between storage and processing. (This isn't always possible, so if your storage and transcoding are in separate places, the next point will help.) To eliminate bandwidth bottlenecks, store video close to transcoding.
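As a minimal sketch of that co-location idea, assuming the archive lives in Amazon S3 and the transcoder runs on an EC2 instance in the same region (the bucket, key, and local paths below are placeholders, not names from the article):

import boto3

# Hypothetical sketch: pull a mezzanine file from S3 onto the EC2 instance that
# will transcode it. When the bucket and the instance are in the same region,
# this transfer stays inside AWS instead of crossing the public Internet.
s3 = boto3.client("s3")
s3.download_file("my-video-archive", "masters/mezzanine.mov", "/tmp/mezzanine.mov")

# ... transcode /tmp/mezzanine.mov locally, then push the renditions back:
# s3.upload_file("/tmp/out_720p.mp4", "my-video-archive", "renditions/out_720p.mp4")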
Second, use accelerated file transfer. When transferring files over long distances, standard TCP-based transfer protocols like FTP and HTTP significantly under-utilize the available bandwidth.
For example, a 100 Mbps connection may actually only achieve 10 Mbps over TCP, given a small amount of latency and packet loss. This is due to the design of the TCP protocol, which scales back bandwidth utilization when it thinks the network is congested. This is useful for general Internet traffic, because it ensures that everyone has fair access to limited bandwidth.
But it is counter-productive when transferring large files over a dedicated connection. When it is necessary to transfer high bit-rate content over the Internet, use accelerated file transfer technology. Aspera and other providers offer UDP-based transfer protocols, which perform significantly better than TCP over most network conditions.
If Aspera or other UDP-based file transfer technologies aren't an option, consider transferring files via multiple TCP connections to make up for some of the inefficiencies of TCP.
To maximize bandwidth utilization, use file transfer technologies like Aspera, UDP, or multiple TCP connections.
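As one hedged illustration of the "multiple TCP connections" fallback, the sketch below splits a single large file into byte ranges and fetches them over parallel HTTP connections; the URL, connection count, and output path are placeholders, and the server must support HTTP Range requests:

from concurrent.futures import ThreadPoolExecutor
import requests

URL = "https://example.com/mezzanine.mov"   # placeholder source file
OUT = "mezzanine.mov"
CONNECTIONS = 8                             # parallel TCP connections

# Ask the server how large the file is, then split it into byte ranges.
total = int(requests.head(URL, allow_redirects=True).headers["Content-Length"])
chunk = total // CONNECTIONS

def fetch(i):
    start = i * chunk
    end = total - 1 if i == CONNECTIONS - 1 else start + chunk - 1
    resp = requests.get(URL, headers={"Range": f"bytes={start}-{end}"})
    resp.raise_for_status()
    return start, resp.content

# Download all ranges in parallel, then write them back in order.
with ThreadPoolExecutor(max_workers=CONNECTIONS) as pool:
    parts = list(pool.map(fetch, range(CONNECTIONS)))

with open(OUT, "wb") as f:
    for start, data in sorted(parts):
        f.seek(start)
        f.write(data)

Dedicated acceleration products such as Aspera achieve the same end more robustly with UDP-based protocols; this is only the do-it-yourself variant mentioned above.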
Third, transfer once, encode many times. For video to be viewable on multiple devices over various connection speeds, different video resolutions, bit-rates, and codecs are needed. Many web and mobile publishers create 10-20 versions of each file.
So when doing high-volume encoding, it is important that a file is only transferred once, and each transcode is then performed in parallel. When using this approach, you can effectively divide the transfer time by the number of encodes to determine the net transfer time. For example, if transfer takes 6 minutes, but you perform 10 transcodes in the cloud, the net transfer required for each transcode is only 36 seconds.
To achieve maximum efficiency, transfer a high quality file to the cloud only once, and then perform multiple encodes in parallel.
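A minimal sketch of that pattern, assuming the mezzanine has already been transferred to the cloud machine and ffmpeg is available there (the rendition ladder and filenames below are illustrative, not a preset from the article):

from concurrent.futures import ProcessPoolExecutor
import subprocess

SOURCE = "mezzanine.mov"            # transferred to the cloud exactly once

RENDITIONS = [                      # (output name, height, video bit-rate)
    ("out_1080p.mp4", 1080, "6000k"),
    ("out_720p.mp4",   720, "3000k"),
    ("out_480p.mp4",   480, "1500k"),
    ("out_360p.mp4",   360,  "800k"),
]

def transcode(job):
    out, height, bitrate = job
    cmd = [
        "ffmpeg", "-y", "-i", SOURCE,
        "-vf", f"scale=-2:{height}",
        "-c:v", "libx264", "-b:v", bitrate,
        "-c:a", "aac", "-b:a", "128k",
        out,
    ]
    subprocess.run(cmd, check=True)  # one rendition per worker process
    return out

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:
        for finished in pool.map(transcode, RENDITIONS):
            print("finished", finished)

In a real cloud transcoding service each rendition would typically run on a separate machine rather than a separate local process, but the economics are the same: one ingest transfer amortized across every output.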
Fourth, syndicate from the cloud after transcoding. Whether you transcode in the cloud or on-premise, some bandwidth is required. In one case, a high bit-rate mezzanine file is sent to the cloud for transcoding. In the other, when transcoding on-premise, several transcoded files are sent directly to a CDN, publishing platform, or partners like iTunes or Hulu.
Both cases require outbound bandwidth, and in many cases, syndicating from the cloud requires less overall bandwidth than syndicating from an on-premise system. For example, it is not uncommon for broadcast video to be syndicated at high bitrates. If a broadcaster uses a 100 Mbps mezzanine format, and then syndicates that content to five partners at 50 Mbps, it is clearly more efficient to only send the original file out of the network for transcoding, and let the transcoding system handle the other transfers.
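The outbound-bandwidth comparison behind that example is simple arithmetic (per hour of content, using the bit-rates quoted above):

# Outbound traffic per hour of content: sending a 100 Mbps mezzanine to the
# cloud once versus pushing five 50 Mbps syndication copies out yourself.
MEZZANINE_MBPS = 100
PARTNER_MBPS = 50
PARTNERS = 5
SECONDS = 3600

from_cloud_gb = MEZZANINE_MBPS * SECONDS / 8000               # ~45 GB leaves the network
from_on_premise_gb = PARTNER_MBPS * PARTNERS * SECONDS / 8000 # ~112.5 GB leaves the network

print(f"Syndicate from the cloud:  {from_cloud_gb:.1f} GB outbound")
print(f"Syndicate from on-premise: {from_on_premise_gb:.1f} GB outbound")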
Not everyone syndicates high bit-rate content, of course. But even when encoding low bit-rate web and mobile video, multiple small files add up.
While transferring high bit-rate video can be a challenge, the correct approach to cloud transcoding can mitigate these problems.
High-volume publishers should follow these four basic guidelines: 1) Store content in the cloud; 2) Use accelerated file transfer technology; 3) Ingest each file once to a parallel cloud transcoding system; and 4) Syndicate directly from the cloud.
By implementing these recommendations, media companies of all types can offload video processing to the cloud, and realize the benefits of scale, flexibility, and ROI provided by cloud transcoding.
Sponsored Software Helps Artists Profit from BitTorrent
Excerpted from BBC News Report
A new scheme aims to enable artists to profit from users who download their work for nothing via BitTorrent.
Typically music obtained via BitTorrent downloads is copyright infringing and, according to the record industry, costs it millions in lost revenue.
The experiment, devised by the makers of the popular file-sharing software uTorrent, will encourage users to install sponsored applications.
Both the artist and BitTorrent would receive a cut of the proceeds.
The idea is being trialed by American hip-hop producer DJ Shadow with his latest release Total Breakdown: Hidden Transmissions From The MPC Era 1992-1996.
When users download the release, it will ask them whether they want to install RealPlayer, a media player which was widely used during the late 1990s and early 2000s, but has since dipped in popularity compared with Apple's iTunes and Windows Media Player.
However, users can still listen to the downloaded music without installing the sponsored software, which works only on PCs.
"We believe we can make digital distribution even more viable for creators and fans," said BitTorrent chief executive Eric Klinker.
"So, beginning now, we'll be testing new ways to drive profitability for creators while delivering even more meaningful media experiences for our users."
Mr. Klinker added that there were other schemes his company was considering to try to help artists make money out of the business.
The company already offers a paid version of its client uTorrent. The free version has more than 150 million users worldwide.
"New business models built on top of the BitTorrent ecosystem are the future of content," Mr Klinker said.
"This is where fans are. It's time to bring artists, film-makers, and game developers into that conversation in meaningful ways, too."
Spotify Marks its First Anniversary in the US with 13 Billion Listens
Excerpted from Engadget Report by Jon Fingas
They grow up so fast, don't they? Spotify's US launch was just over a year ago, and the streaming music outlet wants us to know just how big its baby is getting.
Americans listened to more than 13 billion tracks on the service in the first 365 days, and shared 27,834,742 of them.
Not surprisingly, just over half of that socializing went through Facebook. Spotify is likewise flaunting 2,700 years' worth of time spent skulking around its app platform.
Don't feel any pangs of regret if you forgot to buy something for Spotify's birthday, by the way: the company isn't holding any grudges and says you'll "love" what it has gift-wrapped for year two.
We're hoping that involves more free radio stations and fewer holdout musicians.
AppFog Lets You Pick Your Cloud, (Almost) Any Cloud
Excerpted from GigaOM Report by Barb Darrow
For companies wanting to put their workloads on a public cloud without having to sweat the details, AppFog has a bold proposition.
AppFog's platform-as-a-service (PaaS), available as of late Wednesday, abstracts out the tweaking and tuning of cloud servers, databases, and storage. And, if you want to run your work on Amazon and then move it to, say, Rackspace, or Microsoft Windows Azure, or the HP Cloud, you can do so with the click of a button, according to AppFog CEO Lucas Carlson.
The Portland, OR-based company, which started out as a PHP-specific PaaS called PHPFog, has broadened and adjusted its strategy in the past year, adding support for Java, .NET, Node, and other popular languages, and restructuring its foundation atop standard Cloud Foundry technology.
That means it can run across the major public clouds, now supporting the aforementioned Amazon, Rackspace, Microsoft, and HP offerings with more to come. "We will be adding them like mad — we'll have an all SSD cloud soon," Carlson said.
"We become your front-end to cloud. We took a standard Cloud Foundry API and delivered that across all the public clouds," Carlson said. The resulting PaaS has been put through its paces by 5,000 beta testers including the City of New York and 40,000 developers, he said.
This is a tall order. But so far, Matthew Knight, founder and CEO of Merchpin, a beta tester for the past four months, is impressed. Merchpin ran its e-commerce app on Amazon's infrastructure before AppFog but got bogged down with the infrastructure fussing it had to do. "We were building our application and also having to deal with maintaining our servers. It was a pain. AppFog fixes that," he said.
That, plus its tight integration with MongoDB, Merchpin's database of choice, makes implementation and deployment extremely easy. He said that AppFog will cost more than Amazon alone, but only in dollars. "When you factor in man hours, it's much less expensive," he said.
Companies can go to AppFog's site to set up a free account with 2 GB of RAM. Yes, it distills out all the other confusing pricing units listed by the public cloud providers. No need to worry about instances or storage type or database choice. AppFog prices on RAM requirement only.
Monthly plans with additional RAM are available: 4 GB for $100, 16 GB for $380, and 32 GB for $720. AppFog will bill the customer for the entire infrastructure stack, including the backend cloud, giving it pretty good account control.
Merchpin ran its application on Amazon before and after moving to AppFog so Knight has not tested its promised easy push-button cloud migrations. But, if it lives up to its billing, it would interest many companies looking into multi-cloud solutions, a trend that Carlson has done his best to promote. If AppFog really can move applications from cloud to cloud as advertised, it will be a huge draw.
Cloud Computing Needs Better Contracts - EC
Excerpted from Reuters Report by Claire Davenport
The European Commission (EC) wants cloud computing firms to improve contracts they offer customers in a drive aimed at averting costly legal disputes, allaying privacy concerns, and boosting an industry which can offer huge savings to users.
Buying computer hardware can be a drain for new and small companies, and huge savings can be made adopting cloud storage — using networks to connect remotely to servers elsewhere, possibly on a different continent.
But security and data privacy are major concerns, the EC said in a policy paper intended to encourage the technology.
A lack of trust is cited by many surveys as one of the main reasons companies do not embrace the cloud.
"The complexity and uncertainty of the legal framework for cloud services providers means that they often issue complex contracts or agreements with extensive disclaimers," the EC said in the paper, obtained by Reuters and expected to be published after the summer break.
"Contracts often do not accept liability for data integrity, confidentiality, or service continuity."
The EC said in the paper, which is not binding, that it wanted to help the industry develop model agreements on issues such as which country's laws apply in a legal dispute between a service provider and a customer.
Data in the cloud is often stored or processed in two or more data centers, to ensure access even in the case of heavy network traffic.
And services are sometimes provided through a chain of firms with different tasks in various countries, making legal action by a dissatisfied client difficult.
Customers and policymakers want to ensure that wherever the data is, it has the same level of protection as the location where the client uses it, and that companies are held accountable.
The Commission will also look into whether binding laws will be needed for cloud services.
European Union regulators say they have not found cloud vendors to be forthcoming enough about what they will do for customers if a service is disrupted or data stolen.
"They have an attitude of take it, or leave it. You want it cheap then do it large-scale and we cannot tell you where the data goes," European Data Protection Supervisor Peter Hustinx said.
Hosts of data (under EU law, typically companies that provide Internet access, or search engines that host but do not create content such as websites) will not be liable in a legal row, the draft policy said.
Big cloud firms like Google and Microsoft are trying to lure more customers and are buying new sites for data centers to serve clients' rising dependence on faster and cheaper Internet access.
Google has recently bought land in Hong Kong, Singapore, and Taiwan to build new centers.
In June, Google announced that it would also offer model contracts.
Microsoft said it offers such model clauses to more than 500,000 users, sometimes including free audits for small and medium-sized businesses that cannot afford an auditing firm.
To make cloud computing a safe part of everyday life, the EC said it will work with the World Trade Organization (WTO), the Organization for Economic Co-operation and Development (OECD), and the United Nations to develop a common legal basis.
The paper singled out Japan and the United States as two countries with which it wanted to develop more common standards.
The United States has the most evolved cloud computing market and Japan has made strides driven by its need for faster disaster recovery systems after events such as an earthquake or tsunami.
Best Practices for Architecting Distributed Computing & Systems
Excerpted from Smashwords Report by James Seymour
When attempting the creation, testing, and/or implementation of any system, strict and thorough planning and documentation are imperative from the start. This document presents best practices for architecting distributed computing and systems. These best practices have been used successfully at corporations around the world.
The author has applied these best practices with consistent success for over 27 years.
Coming Events of Interest
ICOMM 2012 Mobile Expo — September 14th-15th in New Delhi, India. The 7th annual ICOMM International Mobile Show is supported by the Government of India, MSME, DIT, NSIC, CCPIT China, and several other domestic and international associations. It will feature new technologies, new products, mobile phones, tablets, electronic goods, and business opportunities.
ITU Telecom World 2012 - October 14th-18th in Dubai, UAE. ITUTW is the most influential ICT platform for networking, knowledge exchange, and action. It features a corporate-neutral agenda where the challenges and opportunities of connecting the transformed world are up for debate; where industry experts, political influencers and thought leaders gather in one place.
CLOUD COMPUTING WEST 2012 - November 8th-9th in Santa Monica, CA. CCW:2012 will zero in on the latest advances in applying cloud-based solutions to all aspects of high-value entertainment content production, storage, and delivery; the impact of cloud services on broadband network management and economics; and evaluating and investing in cloud computing services providers.
Third International Workshop on Knowledge Discovery Using Cloud and Distributed Computing Platforms - December 10th in Brussels, Belgium. Researchers, developers, and practitioners from academia, government, and industry will discuss emerging trends in cloud computing technologies, programming models, data mining, knowledge discovery, and software services.