Distributed Computing Industry
Weekly Newsletter

Partners & Sponsors

ABI Research

Acolyst

Amazon Web Services

Apptix

Aspiryon

Axios Systems

Clear Government Solutions

CSC Leasing Company

CyrusOne

FalconStor

General Dynamics Information Technology

IBM

NetApp

Oracle

QinetiQ

SoftServe

TrendMicro

VeriStor

VirtualQube

May 26, 2014
Volume XLVIII, Issue 4


Half of World's Population Headed for the Cloud

Excerpted from Fierce CIO Report by David Weldon

Nearly half of the world's population will be using cloud computing services in some fashion by 2018, according to a new study by Juniper Research.

In its report, Cloud Computing--Consumer Markets: Strategies and Forecasts 2014-2018, the research firm estimates that there are currently 2.4 billion people using cloud-based services in one form or another. That number is expected to rise to 3.6 billion by 2018, according to the study.

"The analysts said that advances in cloud storage, music, and games would propel this figure over the next few years," says an article at Cloud Pro. "It is said that the bulk of revenues in consumer cloud services would be from streaming services, while cloud storage services find attracting paying subscribers a struggle."

"Governance for all" is more than an end goal written in a plan; it's a strategy that unites IT and business content owners. Now you can learn to create a governance strategy to suit all your needs.

Meanwhile, a KPMG report on the future of investment banking has concluded that cloud computing is one of the "most disruptive forces in business in the past 20 years," says an article at Computer Weekly. The report advises banking institutions to move away from traditional methods of doing business and embrace cloud and other new technologies.

"Cloud computing continues to change the game," the KPMG research says. "Banks that continue to use outdated legacy systems will find it increasingly difficult to create and launch new services, to provide access to a mobile workforce and to accommodate geographically dispersed customers and partners as well or as quickly as their competitors who are operating in the field."

The KPMG report warns that "the stakes are high," and that the leading firms going forward will be those that can adapt their business mix and operating models to embrace new technologies.

Washington Welcomes the Cloud

Excerpted from Datacenter Dynamics Report by Chris Drake

Recent developments point to increasing dynamism and growth in the market for federal government cloud services in the US.

In mid-March it was reported that Microsoft is nearing the commercial launch of its Azure US Government Cloud. When it becomes commercially available in 2015, the platform will provide Infrastructure-as-a-Service (IaaS) and Platform-as-a-Service (PaaS) offerings to US government agencies.

In late March, Cisco announced plans to spend $1 billion over the next two years on developing its own cloud service portfolio for large corporations and government bodies. Cisco Cloud Services will be delivered via a network of channel partners and will target enterprise and government customers looking to deploy hybrid clouds.

Also in late March, cloud market giant Amazon Web Services (AWS) announced it had been awarded a new level of security certification from the US Department of Defense (DoD). The additional accreditation means that Amazon will be able to target even more government agencies with its cloud computing services.

Moves made by Microsoft, Cisco and Amazon to create government cloud offerings — and acquire new certifications for existing offerings — reflect the strong potential for growth that is attributed to this market.

Although spending on cloud services by federal agencies still represents a small proportion of total IT spend, it is predicted to rise significantly over the next few years.

There are 'push' and 'pull' factors driving this trend. One push factor is the Federal Data Center Consolidation Initiative (FDCCI), established in 2010 to reverse the historic expansion of federal data centers. The FDCCI's goal is to consolidate at least 800 government data centers by 2015.

The government also wants to reduce the cost of data center hardware, software and operations, shift IT investments to more efficient computing platforms, promote 'green technologies' and increase IT security.

IT consolidation is also seen as increasing operational efficiency, while shared services are expected not only to cut costs but to boost business process efficiency.

The growth potential of the government cloud market is reflected in the rising number of providers targeting this sector. In addition to Amazon, Microsoft and Cisco, major cloud service providers include technology vendors such as IBM, HP, Dell, Oracle and VMware. Telecoms network operators such as AT&T, Verizon, and CenturyLink — the latter two provide government cloud services through their Terremark and Savvis businesses respectively — are also targeting the market.

Content delivery network service provider Akamai also offers cloud-based services to federal government agencies, including the DoD, NASA, and the Securities and Exchange Commission (SEC). The services it offers include traffic management and content delivery, website acceleration, HD streaming, storage, and DNS security.

Akamai is one of 11 cloud service providers which have, to date, been awarded Federal Risk and Authorization Management Program (FedRAMP) approval. FedRAMP is a US government-wide program that standardizes the approach to security assessment, authorization and continuous monitoring for cloud products and services.

Other companies that hold FedRAMP accreditation include AWS, AT&T, HP, IBM, Oracle, Lockheed Martin and Microsoft. The latter's Azure public cloud service received FedRAMP accreditation last September. This accreditation will be extended to Microsoft's new government cloud once it becomes operational next year.

The FedRAMP accreditation awarded to AWS in May 2013 is thought to have helped it beat IBM to a ten-year, $600 million cloud computing contract with the Central Intelligence Agency (CIA). Some have described this contract win as a watershed moment that marks the start of a competitive and dynamic market for government cloud services. AWS will build a version of its public cloud inside the CIA's data centers, and this will operate as a private cloud.

AWS first announced the launch of its public GovCloud offering in 2011. However, AWS' focus on the market for private cloud services makes sense, given how busy the market for public cloud services has since become. In addition, there are major opportunities for private cloud service providers to win lucrative business from government institutions — if they get their strategies right.

Although private cloud investments by government agencies will rise more rapidly, public cloud services will continue to be important for web hosting and low-risk data storage. There is also a growing market among federal government institutions for hybrid cloud environments which combine private and public architectures.

Report from CEO Marty Lafferty

While the controversy over the decision of the Federal Communications Commission (FCC) to proceed with public debate of its Notice of Proposed Rulemaking (NPRM) to Protect and Promote the Open Internet continues, the FCC this week indicated that the agency would also examine peering arrangements.

Netflix's recently concluded agreement with Comcast to pay the Internet service provider (ISP) not to slow video streaming traffic to its subscribers underscores the reasons for our concern regarding such arrangements.

DCINFO readers will recall that the FCC voted 3-2 along party lines on May 15th to open its NPRM to comments for four months.

When the vote was taken, FCC Chairman Tom Wheeler said, "What we're dealing with today is a proposal, not a final rule. We are asking for specific comment on different approaches to accomplish the same goal, an open Internet."

This means that you have until July 15th to submit your initial comments on the proposal to the Commission, and until September 10th to follow-up with a reply to the first round.

By way of background, the Democratic Commissioners were lukewarm on this action: Commissioner Jessica Rosenworcel said, "I would have preferred a delay — I think we moved too fast;" Commissioner Mignon Clyburn said she, too, had misgivings.

Dissenting Commissioner Michael O'Rielly added, "The premise for imposing Net Neutrality rules is fundamentally flawed and rests on a faulty foundation of make-believe statutory authority. I have serious concerns that this ill-advised item will create damaging uncertainty and head the Commission down a slippery slope of regulation."

The two opposing Republican Commissioners said that the NPRM exceeds the agency's legal authority; that there has been no evidence of actual harm or violation of Net Neutrality principles; and that elected members of Congress, not regulatory appointees, should decide the issue.

The proposed rules were limited to the so-called last mile between ISPs and consumers, and excluded operators of backbone transport networks that connect various parts of the Internet's central infrastructure.

Nevertheless, ISPs, including Verizon Communications, which brought the successful court challenge to the Commission's previous open Internet rules, warned against subjecting broadband to strict oversight.

"For the FCC to impose 1930s utility regulation on the Internet would lead to years of legal and regulatory uncertainty and would jeopardize investment and innovation in broadband," the company said.

On the other hand, public advocacy groups, like the Consumers Union, said the proposal did not go far enough.

"The agency's plan still appears to go against the principles of ensuring an open Internet. The proposal could negatively impact consumer prices, choices, and access to the Internet, as well as free speech and innovation," it said.

In our April 28th edition, we offered a supportive metaphor for permitting a fast lane to be introduced on the Internet for those content providers and consumers willing to pay extra for it: package delivery in the physical world.

FedEx, UPS, and the US Postal Service each offer tiered services with different delivery speeds and levels of protection, and these systems are not perceived as broken or grossly unfair to the public.

However, the rules did not address the issue of interconnection or the agreements under which content providers compensate operators for faster access.

Wheeler's interpretation had been that the scope of FCC rulemaking was limited to the last leg of the network that reaches the consumer.

Using the physical package delivery model, this would mean that rules to ensure equitable treatment would only apply to the final truck-roll, leaving out the jet, rail, and/or truck transport of a package preceding its final delivery.

This week's comments by Chairman Tom Wheeler before a US House of Representatives Energy and Commerce Subcommittee oversight hearing — that the agency "needs to be looking at and will be looking at" peering arrangements — represent a significant change from his previously stated position that these did not fall under the FCC's Net Neutrality jurisdiction.

Agreements between ISPs and the operators of Internet backbone systems for the exchange of traffic generally do not involve payment, because equivalent amounts of traffic are usually going both ways. Comcast's charge to Netflix for a direct link to Comcast's high-speed Internet service shows why open Internet rules, to be effective, need to cover end-to-end Internet access.

"Given some of the most recent actions out of the Commission," including the issuance of Net Neutrality rules and regulations governing coming auctions of airwaves for wireless broadband, "I fear that we may be headed into some rough waters," Subcommittee Chairman Greg Walden (R-OR) said.

Congresswoman Anna Eshoo (D-CA) added, "I don't want this to become an auction, selling off the best in bits and pieces where some pay for faster lanes, while others cannot pay and get stuck in a slow lane, which would unravel the values that have been the hallmark and the bulwark of the Internet."

For the DCIA's part, we're seeking a common ground among our Member companies that include the range of players most affected here: broadband network operators, major cloud computing solutions providers, content rights-holders, and web-based start-up companies.

Together these organizations have accomplished unprecedented advances in communications technologies and the businesses built upon them, largely without heavy-handed government intervention, and in some ways thanks to its absence.

We will respond formally by July 15th, and meanwhile welcome your views, which you may submit to info@dcia.info for consideration. Share wisely, and take care.

Wheeler Says FCC Needs More Engineers

Excerpted from TV Technology Report by Leslie Stimson

FCC Chairman Tom Wheeler believes the FCC could use more engineers — and economists. He said so during testimony today before the House Subcommittee on Communications and Technology.

It's been six months since Wheeler last faced lawmakers; he was the sole witness at the FCC oversight hearing.

The engineer comment came during an exchange between the Chairman and Congressman Peter Welch (D-VT), who simply asked what the Commission needs to do its job. "Our IT infrastructure is worthy of being in the Smithsonian," Wheeler replied, noting the agency has computers "that have known risks."

The Commission can't provide an easy way for the public or industry to accomplish some tasks online, said the Chairman, "because our IT system isn't up to it." That's when he said: "We do need more engineers, and economists, too." 

His definition of engineer includes IT, as it does at the station level these days. In previous testimony about the latest budget request, Wheeler told lawmakers the agency has more than 200 "relic IT systems" that are costing the agency more to service than they would to replace over the long term.

The Commission asked for a total of 1,790 "Full Time Equivalent" positions for FY 2015, which includes an additional 10 such positions for IT programming. Broadcasters would like the agency to have more technology-related personnel too, believing that would help some decision-making affecting the industry go more smoothly. The agency's recent open Internet proposal dominated much of the discussion at the hearing.

Cloud Computing Adoption Continues

Excerpted from Cloud Tweaks Report by Mojgan Afshari

Nowadays, many companies are changing their overall information technology strategies to embrace cloud computing in order to open up business opportunities. There are numerous definitions of cloud computing. Simply speaking, the term "cloud computing" comes from network diagrams in which cloud shapes are used to describe certain types of networks.

Computing performed across more than one computer via a network, or services delivered from a host computer via a network, can be considered cloud computing. Through different types of devices, such as PCs and smartphones, users can access services and computing resources in the cloud.

According to the National Institute of Standards and Technology (NIST), ''Cloud computing is a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction.''
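
The "rapid provisioning and release" in NIST's definition is concrete enough to sketch in code. The fragment below is a hypothetical illustration, not part of the article: it assumes the AWS boto3 Python library, already-configured credentials, and a placeholder machine image ID.

    # Minimal sketch of on-demand provisioning, assuming boto3 is
    # installed and AWS credentials are already configured.
    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    # One API call provisions a server from the shared pool...
    response = ec2.run_instances(
        ImageId="ami-00000000",   # placeholder image ID
        InstanceType="t2.micro",
        MinCount=1,
        MaxCount=1,
    )
    instance_id = response["Instances"][0]["InstanceId"]
    print("Provisioned:", instance_id)

    # ...and one call releases it, with no service provider interaction.
    ec2.terminate_instances(InstanceIds=[instance_id])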

Cloud computing represents a convergence of two major trends (IT efficiency and business agility) in information technology.

The term IT efficiency refers to making better use of computing resources through highly scalable hardware and software.

Business agility, meanwhile, is the ability of a business to use computational tools rapidly and to adapt quickly and cost-efficiently in response to changes in the business environment.

Cloud computing can remove traditional boundaries between businesses, make the whole organization more agile and responsive, help enterprises scale their services, enhance industrial competitiveness, reduce operational costs and the total cost of computing, and decrease energy consumption.

It would seem that cloud computing can provide new opportunities for innovation by allowing companies to focus on business rather than be stifled by changes in technology.

Although this new technology can help organizations achieve business efficiencies, evidence indicates that not all companies intend to adopt cloud-based solutions. Borgman and his colleagues conducted a study on the factors influencing cloud computing adoption and found that security and privacy, identity management standards, and the need for sharing and collaboration in today's highly competitive world all have a positive effect on the use and adoption of cloud computing.

A data breach is a security incident in which a company or government agency loses sensitive, protected, or confidential data. Cloud computing involves storing data and computing in a shared multi-user environment, which heightens security concerns. Privacy-enhancing techniques, monitoring mechanisms, authentication, encryption, and attention to the security of data in the cloud environment are good ways to enhance cloud security and minimize risk.
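
Of the safeguards just listed, client-side encryption is the easiest to sketch. The example below is a minimal illustration under assumptions of my own (the Python cryptography library and a deliberately simplified key store), not a recipe from the article:

    # Minimal sketch of encrypting data before it enters a shared
    # cloud environment, using the Python "cryptography" library.
    # Real deployments would keep the key in an HSM or key service.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()          # secret; never leaves your side
    fernet = Fernet(key)

    plaintext = b"sensitive customer record"
    ciphertext = fernet.encrypt(plaintext)   # safe to store in the cloud

    # After fetching the ciphertext back from the provider:
    assert fernet.decrypt(ciphertext) == plaintext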

Scholars have suggested several strategies to help decision makers improve cloud security:

1. Ensure effective governance, risk and compliance processes exist

2. Audit operational and business processes

3. Manage people, roles and identities

4. Ensure proper protection of data and information

5. Enforce privacy policies

6. Assess the security provisions for cloud applications

7. Ensure cloud networks and connections are secure

In sum, cloud computing has become ubiquitous in recent years. Studies indicated that factors such as the relative advantage of cloud computing (such as improving the quality of business operations, performing tasks more quickly, increasing productivity, cost savings, and providing new business opportunities), ease of use and convenience in using the cloud infrastructure, privacy and security, and the reliability of cloud providers can affect the adoption of cloud computing.

Hence, decision makers should systematically evaluate these factors before adopting a cloud computing environment.

Verizon Eyeing Cloud-Based Video Jukebox

Excerpted from Home Media Magazine Report by Erik Gruenwedel

Verizon is working on creating a virtual video "jukebox" that would allow consumers to access content from a cloud-based platform, CEO Lowell McAdam told an investor group.

Speaking May 20th at the JP Morgan Global Technology, Media and Telecom confab in Boston, McAdam said Verizon's purchase of over-the-top video platform OnCue from Intel didn't signal the telecom's entry into a linear TV content rights battlefield for a streaming service.

"We don't think that model is particularly attractive because of the overall content cost," he said.

Instead, McAdam envisions OnCue as a cloud-based option with consumers accessing varying types of unbundled content — including short-form video. The executive said the platform would not bundle third-party content channels, although Verizon is in talks with broadcasters about enabling a-la-carte access to select programming.

Indeed, McAdam said OnCue would act as a virtual video jukebox of sorts, melding aspects of Redbox Instant [Verizon is a co-owner] with transactional VOD and subscription streaming functionality that a consumer could "pull down from the cloud" on demand, with a much broader array of content — including YouTube-type short-form content.

"I think that is a very attractive model for us. But it can't be the bundled 10 channels together and force users to take it over-the-top, the way they have done in their current linear model," McAdam said.

He also envisions transitioning FiOS TV from video provider to broadband facilitator — the fastest-growing segment of the multichannel video programming distribution market. Why? Verizon recently leveraged its ISP prowess by inking an interconnection deal with Netflix to give Netflix subscribers on FiOS faster streaming speeds.

FiOS ended the most-recent fiscal period with 5.3 million video subscribers and 6.2 million Internet subs.

The telecom would like to broaden that Internet prowess through a wider FiOS broadband footprint, much in the way AT&T is seeking to do by acquiring DirecTV.

"We will have those broadband pipes, and if you think about it, the more traffic that goes into the home, the better for us, because we've got the technology that's future proof, that's easy for us," McAdam said.

Mobile & Cloud Reign as Tech's Top Moneymakers

Excerpted from InfoWorld Report by Serdar Yegulalp

According to a survey conducted by US audit, tax, and advisory company KPMG, tech executives in the United States report that mobile and cloud aren't only the top revenue drivers in their companies, but are also exceeding the revenue forecasts made the previous year.

KPMG's survey tapped 100 C-level and senior-level executives in the tech industry; 74 percent of them are from companies that earn $1 billion or more in yearly revenue, and the rest are from companies earning between $100 million and $1 billion.

For all those companies, 53 percent say mobile revenues topped their 2013 forecasts, and 46 percent claimed their cloud revenues for the same year were above expectations. But in the future, the biggest driver of growth for those companies isn't expected to be mobile or cloud, but data and analytics (D&A), which 51 percent of the respondents ranked as their top growth driver.

Mobile and cloud came in this year at 41 percent and 40 percent respectively, with security at 28 percent and consumerization of IT and Internet of Things (IoT) both at 19 percent.

The changes from last year's survey hint at how attitudes have shifted on each of these topics. In 2013, D&A placed third behind cloud computing and mobile, at 33 percent. Security, now at 28 percent, has nearly doubled in importance since last year's survey.

The author of the report, Gary Matuszak, Global Chair of KPMG Technology, Media, and Telecommunications, claims that IoT is emerging as the next major revenue driver for such companies in the years to come.

At the very least, it has become more prominent as a discrete subject in the minds of those surveyed; in last year's survey, IoT didn't even make the list of major revenue drivers at all. But it isn't clear how the term was defined for the sake of the survey (or if the term was simply used as-is), since IoT is a fluid concept that can be applied generically to many different applications, as InfoWorld's Galen Gruman has noted.

Another change from last year: Respondents don't expect revenues to rise as dramatically as before. Last year, 79 percent believed revenues would increase; this year, it's "about 8 out of 10." But the amount of the increase is believed to be on the order of 1 to 5 percent this time around, rather than the 6 percent or greater increase predicted by most of them last year.

The survey also revealed some other topical winners and losers in the eyes of the executives. Automotive technology was a big winner, as 60 percent of respondents had plans to invest in some variety of it. When it came to crowdsourcing systems, though, only 25 percent said they planned to invest in it, while 55 percent said a flat-out no to any such investments and 20 percent were still on the fence. Finally, most of those profiled -- 86 percent -- cited the need to "create competitive differentiation" as the main driver of change in their business.

IoT & Cloud Computing Threats Redefine Security

Excerpted from PCWorld Report by Fred O'Connor

A printer that connects to the web may pose as great a risk to enterprise security as an OS vulnerability, yet companies worry about the latter and too often ignore the former, said a CTO during a discussion at the Massachusetts Institute of Technology (MIT).

With more devices gaining web connectivity as part of the Internet of Things (IoT) movement, hackers have greater opportunities to exploit weaknesses, said Patrick Gilmore, CTO of data-center and telecommunications service provider the Markley Group. The people who write software for printers may not be worried about security, he said.

"No one talks about what if your printer is hacked and every document your CEO printed is posted to a blog," he said.

The session, part of the MIT Sloan CIO Symposium Wednesday, covered a range of security issues, including cloud computing, emerging threats and data security.

Companies using cloud services should review what conditions would allow a provider to cut off a customer's service, said Rob May, CEO and co-founder of Backupify, which backs up data stored in cloud applications to a separate cloud system.

"You have a responsibility to protect your data. You can't outsource all your security to a cloud vendor," he said.

A Backupify customer that uses Gmail approached the company about securing its data if Google terminated its email account, May said. The customer works in a controversial business, he said, and presented a scenario in which Google would drop the business as a client after people protested the company's service providers. The company asked Backupify how quickly it could migrate its email data to Microsoft Outlook if such a situation occurred, May said.

Cloud customers need to ask better questions when considering Web services, Gilmore said.

Instead of inquiring about a cloud provider's physical and technical security measures, customers ask about pricing and backup procedures, he said. Physical plans are especially important, he said, since cloud data is ultimately stored in hardware and some vendors throw out hard drives instead of destroying them.

The challenge for security teams is in balancing the need to share data to achieve corporate goals while maintaining security procedures, said Mark Morrison, senior vice president and chief information security officer at financial service firm State Street Corporation.

State Street is moving to risk-based security management to counteract emerging threats, Morrison said. Security is no longer "if we do these five things we are somehow magically secure," he said, adding that companies can no longer simply follow a checklist that includes basic security measures like establishing a firewall.

"You've got to realize prevention isn't going to be your sole protection anymore," he said.

For example, Morrison is looking to deter cyberattackers by making the investment required to wage an attack cost more than the return.

"If you can increase their costs, hopefully they'll move on," he said.

New threats are coming from nation states that include cyberattacks as part of their defense plans. In some instances, these countries are funding attackers and using them as "cybermercenaries," he said.

Morrison is also looking to increase the use of two-factor authentication and decrease reliance on passwords.

"Password are a complete waste of time," he said. "They are the equivalent of signing the back of a credit card."

Passwords need to be 14 or 16 characters long to offer protection, he said, so people write them down to remember them, which places them at risk of being misused.
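
To make Morrison's preferred direction concrete, here is a minimal sketch of a time-based one-time password (TOTP) second factor. It assumes the third-party pyotp library and illustrates the general technique, not State Street's implementation:

    # Minimal TOTP two-factor sketch, assuming the pyotp library.
    # The secret is enrolled once per user, usually via a QR code
    # scanned into an authenticator app.
    import pyotp

    secret = pyotp.random_base32()   # stored server-side at enrollment
    totp = pyotp.TOTP(secret)

    code = totp.now()                # what the user's app displays
    print("Current code:", code)

    # At login, check the submitted code alongside (or instead of)
    # a long password.
    assert totp.verify(code)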

Trying to control employees' use of USB-equipped devices to transfer data is another ineffective security measure, Gilmore said.

Identifying USB devices is challenging, he said, noting that the technology is found in common items like pens and watches.

"Data is ubiquitous, easy to transfer," he said. "How do you keep them from using USB? You don't."

Instead, companies should implement policies that make workers not want to steal data, and consider how to contain damage if information is leaked.

Businesses hiring IT security professionals should find candidates who think like the enemy, since cyberscofflaws don't follow rules.

"You need to have people that think like attackers," he said.

Organizations Unprepared for Technology Shifts 

IBM today released preliminary findings from a study of 750 global organizations, revealing that less than 10 percent are fully prepared to address the proliferation of cloud computing, analytics, mobile devices, and social media.

Addressing this challenge head-on, IBM is unveiling new systems, software and capabilities designed to help organizations create smarter infrastructures that yield faster access to Big Data insights through the cloud and improved business performance.

"Big Data is the transformative force driving every element of our clients' computing infrastructure -- starting with environments of traditional applications blended with the new requirements of social, mobile and analytic workloads that demand faster access at massive scale," said Tom Rosamilia, Senior Vice President of IBM Systems & Technology Group and IBM Integrated Supply Chain. "The continued advances of our portfolio provide clients with a fast and easy way to close the gap between their data, the business decisions they have to make, and the mandate to use information to provide more personalized experiences for their customers."

The IBM preliminary findings revealed that 70 percent of organizations recognize that IT infrastructure plays a significant role in enabling competitive advantage or generating revenue.

Building on last week's Software Defined Storage launch, in which IBM announced new software that enables organizations to access any data from any device and from anywhere in the world, the company today announced new and enhanced capabilities across its storage portfolio.

Advances in its Storwize, XIV, tape library and Flash storage products can optimize storage for large-scale cloud deployments through virtualization, real-time compression, easy-tiering and mirroring, and provide clients fast access to information.

In related storage news, scientists at IBM Research - Zurich, in cooperation with the FUJIFILM Corporation of Japan, announced they have demonstrated 85.9 billion bits per square inch, a new record in areal data density on low-cost linear magnetic particulate tape -- a significant update to one of the computer industry's most resilient, reliable and affordable data storage technologies for Big Data.

Clients are seeking greater speed, agility, and resiliency for Big Data, analytics, and large-scale virtualization in dynamic cloud environments.

IBM Global Financing helps clients acquire IBM solutions with a single financing solution to better manage their cloud and Big Data infrastructure, and accelerate business transformation. Financing programs and offerings help clients better match the benefits of reduced up-front payments and faster return on investment within existing budget commitments. Credit-qualified clients may obtain zero percent loans or Fair Market Value leasing and loans with customized payment plans. IBM Global Asset Recovery Services provides buyback and disposal services for removal of older IT equipment.

Top 3 Tips for Moving to the Cloud

Excerpted from BizReport Report by Kristina Knight

The cloud is beckoning many more small and medium sized businesses (SMBs): plenty of space for files and data, easy access, and security that still allows workers access when they are out of the office. But how do you know if the cloud - and the cloud hosting partner - you've chosen is right?

Here are ComputerSupport's top 3 tips for SMBs moving into the cloud:

First, what do you need?

"Cloud computing increases collaboration by allowing all employees - wherever they are - to sync up and work on documents and shared apps simultaneously," writes ComputerSupport. "Since cloud computing is much faster to deploy, businesses have minimal project start-up costs and predictable ongoing operating expenses. They would also remove any upfront CAPEX costs associated with hardware maintenance."

One tip to cut costs - use the cloud provider's resources and infrastructure to decrease the costs in moving your company to the cloud.

Second, are your systems simple to manage? The cloud allows businesses to access applications and programs remotely, but make sure employees know how to do it.

Finally, have you studied the provider?

There are hundreds of cloud service providers out there - things to look at to determine if a provider is right for your business include:

1. Will they keep your business secure? Assess each provider's security capabilities in key areas such as anti-malware protection, data eradication, encryption mechanisms, government and industry regulations, identity management, and physical security compliance.

2. Does the provider offer new technology and tech options that sync with your needs?

3. Do they stand up - is there a history of downtime or outages, and what about data recovery options and protection?

Mobile Cloud Technology Can Change Lives

Excerpted from HostReview Report by Vikram Kumar

Today the computer is one of the most important technologies, one that people cannot live without. The good thing about computers nowadays is that they have become so portable that people can take them anywhere they go.

The availability of this technology keeps everyone connected to one another and to the data needed for everyday life. One of the most portable gadgets in today's modern world is the smartphone. It is through mobile devices that people can access games, photos, movies, music, and more, even while traveling abroad. As you travel, you do not have to stay away from your family, your friends, or the data you need, because the mobile cloud is here. Taking advantage of this technology would be impossible without tools and services that make it easy to use, and one of the best developments in Internet technology is the cloud.

Cloud computing makes it possible to store data or information on an Internet-based server rather than on a local storage device like a hard drive or flash drive. Users can then access their data from any device or computer they authorize. You might find a service that requires you to install a program or software, but most cloud computing services simply use a portal that is accessible from any web browser.

The best examples of cloud-based services are music, video, and photo sharing websites. These sites give people the ability to back up their photos, music, or videos and share them with the rest of the world through social networking websites like Facebook or Twitter. There is no need to zip files, use complex compression formats, or print pictures to be mailed to your loved ones. All you need to do is upload your files and give your family and friends permission to view them. And this is not limited to the computer, because the mobile cloud makes sharing files from a phone just as easy.

Cloud computing services also make the virtual office possible. Some companies and organizations use a cloud office through a mobile application: an office suite that you can access anytime and anywhere. As long as your smartphone is connected to the Internet and you have the application, the files you need are available every time you need them. You can get this kind of technology from Toronto cloud providers, and there are many things you can do just as if you were using desktop-based software.

The system is also closely tied to social networking, which makes your cyber life easier. Many social networking sites are taking advantage of cloud technology, especially on mobile devices, with their own applications that people can access anytime and anywhere over an Internet or WiFi connection. Toronto cloud providers are doing their best to fulfill the demand for this technology as it continuously evolves. The cloud has made seemingly impossible things possible, because people now trust that they can store files using a storage device they never see.

Everything Is Distributed

Excerpted from O'Reilly Radar Report by Courtney Nash

In September 2007, Jean Bookout, 76, was driving her Toyota Camry down an unfamiliar road in Oklahoma, with her friend Barbara Schwarz seated next to her on the passenger side. Suddenly, the Camry began to accelerate on its own. 

Bookout tried hitting the brakes, then applying the emergency brake, but the car continued to accelerate. The car eventually collided with an embankment, injuring Bookout and killing Schwarz. In a subsequent legal case, lawyers for Toyota pointed to the most common of culprits in these types of accidents: human error. "Sometimes people make mistakes while driving their cars," one of the lawyers claimed. Bookout was older, the road was unfamiliar, these tragic things happen.

However, a recently concluded product liability case against Toyota has turned up a very different cause: a stack overflow error in Toyota's software for the Camry. This is noteworthy for two reasons: first, the oft-cited culprit in accidents — human error — proved not to be the cause (a problematic premise in its own right), and second, it demonstrates how we have definitively crossed a threshold from software failures causing minor annoyances or (potentially large) corporate revenue losses into the realm of human safety.

It might be easy to dismiss this case as something minor: a fairly vanilla software bug that (so far) appears to be contained to a specific car model. But the extrapolation is far more interesting. Consider the self-driving car, development for which is well underway already. We take out the purported culprit for so many accidents, human error, and the premise is that a self-driving car is, in many respects, safer than a traditional car. But what happens if a failure that's completely out of the car's control occurs? What if the data feed that's helping the car to recognize stop lights fails? What if Google Maps tells it to do something stupid that turns out to be dangerous?

We have reached a point in software development where we can no longer understand, see, or control all the component parts, both technical and social/organizational—they are increasingly complex and distributed. The business of software itself has become a distributed, complex system. How do we develop and manage systems that are too large to understand, too complex to control, and that fail in unpredictable ways?

Distributed systems once were the territory of computer science PhDs and software architects tucked off in a corner somewhere. That's no longer the case. Just because you write code on a laptop and don't have to care about message passing and lockouts doesn't mean you don't have to worry about distributed systems. How many API calls to external services are you making? Is your code going to end up on desktop sites and mobile devices — do you even know all the possible devices? What do you know now about the network constraints that may be present when your app is actually run? Do you know what your bottlenecks will be at a certain level of scale?

One thing we know from classic distributed computing theory is that distributed systems fail more often, and the failures often tend to be partial in nature. Such failures are not just harder to diagnose and predict; they're likely to be not reproducible — a given third-party data feed goes down or you get screwed by a router in a town you've never even heard of before. You're always fighting the intermittent failure, so is this a losing battle?
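
One practical consequence: every call to a third-party service should assume the intermittent failure will happen. The sketch below shows the common timeout-plus-retry pattern, using Python's requests library; the endpoint and retry budget are illustrative assumptions, not a prescription from the article:

    # Defensive call to an external service: a hard timeout plus
    # exponential backoff with jitter. The endpoint is a placeholder.
    import random
    import time

    import requests

    def fetch_with_retries(url, attempts=4, base_delay=0.5):
        for attempt in range(attempts):
            try:
                resp = requests.get(url, timeout=2.0)  # never wait forever
                resp.raise_for_status()
                return resp.json()
            except requests.RequestException:
                if attempt == attempts - 1:
                    raise  # retry budget exhausted; surface the failure
                # Back off exponentially, with jitter so many clients
                # don't retry in lockstep.
                time.sleep(base_delay * (2 ** attempt) * random.uniform(0.5, 1.5))

    data = fetch_with_retries("https://api.example.com/feed")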

The solution to grappling with complex distributed systems is not simply more testing, or Agile processes. It's not DevOps, or continuous delivery. No one single thing or approach could prevent something like the Toyota incident from happening again. In fact, it's almost a given that something like that will happen again. The answer is to embrace that failures of an unthinkable variety are possible — a vast sea of unknown unknowns — and to change how we think about the systems we are building, not to mention the systems within which we already operate.

Okay, so anyone who writes or deploys software needs to think more like a distributed systems engineer. But what does that even mean? In reality, it boils down to moving past a single-computer mode of thinking. Until very recently, we've been able to rely on a computer being a relatively deterministic thing. You write code that runs on one machine, you can make assumptions about what, say, the memory lookup is. But nothing really runs on one computer any more—the cloud is the computer now. It's akin to a living system, something that is constantly changing, especially as companies move toward continuous delivery as the new normal.

So, you have to start by assuming the system in which your software runs will fail. Then you need hypotheses about why and how, and ways to collect data on those hypotheses. This isn't just saying "we need more testing," however. The traditional nature of testing presumes you can delineate all the cases that require testing, which is fundamentally impossible in distributed systems. (That's not to say that testing isn't important, but it isn't a panacea, either.) When you're in a distributed environment and most of the failure modes are things you can't predict in advance and can't test for, monitoring is the only way to understand your application's behavior.

If we take the living-organism-as-complex-system metaphor a bit further, it's one thing to diagnose what caused a stroke after the fact versus to catch it early in the process of happening. Sure, you can look at the data retrospectively and see the signs were there, but what you want is an early warning system, a way to see the failure as it's starting, and intervene as quickly as possible. Digging through averaged historical time series data only tells you what went wrong, that one time. 

And in dealing with distributed systems, you've got plenty more to worry about than just pinging a server to see if it's up. There's been an explosion in tools and technologies around measurement and monitoring, and I'll avoid getting into the weeds on that here, but what matters is that, along with becoming intimately familiar with how histograms are generally preferable to averages when it comes to looking at your application and system data, developers can no longer think of monitoring as purely the domain of the embattled system administrator.
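
The histogram-over-average point deserves a tiny worked example. With the hypothetical latency sample below, the mean looks healthy while the 99th percentile exposes the slow requests users actually feel:

    # Sketch: averages hide tails that percentiles reveal.
    # 99 fast requests and one pathological one (hypothetical data).
    import statistics

    latencies_ms = [20] * 99 + [2000]

    cuts = statistics.quantiles(latencies_ms, n=100)  # 99 cut points

    print(f"mean: {statistics.mean(latencies_ms):.1f} ms")  # 39.8 -- looks fine
    print(f"p50:  {cuts[49]:.1f} ms")                       # 20.0
    print(f"p99:  {cuts[98]:.1f} ms")                       # ~1980 -- the real pain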

There are no complex software systems without people. Any discussion of distributed systems and managing complexity ultimately must acknowledge the roles people play in the systems we design and run. Humans are an integral part of the complex systems we create, and we are largely responsible for both their variability and their resilience (or lack thereof). As designers, builders, and operators of complex systems, we are influenced by a risk-averse culture, whether we know it or not. In trying to avoid failures (in processes, products, or large systems), we have primarily leaned toward exhaustive requirements and creating tight couplings in order to have "control," but this often leads to brittle systems that are in fact more prone to break or fail.

And when they do fail, we seek blame. We ruthlessly hunt down the so-called "cause" of the failure—a process that is often, in reality, more about assuaging psychological guilt and unease than uncovering why things really happened the way they did and avoiding the same outcome in the future. Such activities typically result in more controls, engendering increased brittleness in the system. The reality is that most large failures are the result of a string of micro-failures leading up to the final event. There is no root cause. We'd do better to stop looking for one, but trying to do so is fighting a steep uphill battle against cultural expectations and strong, deeply ingrained psychological instincts.

The processes and methodologies that worked adequately in the '80s, but were already crumbling in the '90s, have completely collapsed. We're now exploring new territory, new models for building, deploying, and maintaining software — and, indeed, organizations themselves. We will continue to develop these topics in future Radar posts, and, of course, at our Velocity Conferences in Santa Clara, Beijing, New York, and Barcelona.

Cloud Can Bring Voice of the Customer Back into Manufacturing

Excerpted from Forbes Report by Louis Columbus

In contrast to SAP's cautionary tale, including the abrupt departures of Vishal Sikka and Shaun Price and continued attempts to recapture its customers' cadence, Salesforce shows how powerful galvanizing customer-facing systems are in redefining manufacturing today. The revolution that manufacturers want starts in their front offices, focusing on making every relationship count. Trust, not transactions, matters most in manufacturing today. With a mobile-first application development strategy that includes deep contextual customer intelligence, Salesforce shows potential to lead a revolution in manufacturing with its Salesforce1 platform.

The recent Salesforce announcement that it is focusing on delivering solutions that solve specific industries' issues is noteworthy. It provides a blueprint for helping manufacturers navigate away from an inward-centric, myopic transaction mindset to one that embraces customers' rapidly changing needs. The solutions, built on the Salesforce1 platform, promise speed and the ability to anticipate changes in customer business models and a quickening cadence. Salesforce is not a previous or current client of mine; I listened to its announcement from London today free over the Internet.

Bottom Line: Manufacturers' investments in people, processes and technologies can only deliver long-term value if the strong catalyst of customer trust fuels them long-term.

The first step in breaking down the silos that keep manufacturers isolated from their customers is getting manufacturing engineers, managers and executives out from behind their desks and out on sales calls. Investing time regularly to hear from customers firsthand regarding their needs, requirements, preferences and wants delivers invaluable insights. Many manufacturers who are market leaders, like GE for example, quickly redefine internal manufacturing processes and strategies based on the insights gained from customer and prospect visits.

Second, customer-driven manufacturers who have taken the bold step of redefining their manufacturing workflows, processes and strategies to stay in step with customer demands have entirely different metrics that drive a much higher level of urgency for customer-defined results. What's fascinating about visiting manufacturing centers both in North America and abroad is how openly they tell you, without saying a word, whether they are customer centric or not. From flat screen monitors that flash real-time production, quality and customer-driven performance metrics in many discrete manufacturers to the chalkboards in smaller Asian factories that list customer requirements in detail, you get immersed quickly in what customer success looks like in each production center.

Third, in manufacturing operations that believe in taking action on the voice of their customers, it's very common to see much more configurable shop floor operations and greater visibility across the entire production value chain. In high performance manufacturers who have been doing this for years, the value chain itself shrinks along with costs, and the customer wins with higher quality products at lower prices.

Salesforce is orchestrating over a hundred of its development partners, including Apttus, ServiceMax and others, to deliver manufacturing solutions, and previewed some of the industry-specific work at the Salesforce1 World Tour in London this week. The collaborative effort shows how Salesforce's secure, scalable cloud platform has the potential to revolutionize manufacturing by capturing orders more accurately and automating the inquiry-to-order process across all selling channels.

While many enterprise software vendors have attempted to do this, what makes this approach noteworthy are the following key take-aways:

Manufacturers and the customers they serve are relying more on mobile devices than ever before, making usability and responsive design a must-have. The need to stay in step with the accelerating cadence of their customers, channel partners, distributors and dealers is starting to force many manufacturers to take a mobile-first application deployment approach. Designing in usability for mobile devices first leads to a much more intuitive, responsive design and higher adoption of the apps. As customers increase the cadence and speed of how and where they buy, mobile-first development stands a much better chance of keeping up than any previous approach.

Providing greater visibility into supply chains, inventory, manufacturing and fulfillment is essential to win and keep customer loyalty. Today customers expect total visibility and real-time responsiveness from every company they deal with. Manufacturers have to embrace this new reality and align systems and processes to make it part of their DNA, part of how they do business every day.

Using cloud-based integration and mobility to free up valuable data locked in ERP systems has the potential to make complex manufacturing more customer-driven. Gaining access to, and getting customer context around, years of data generated by ERP systems can streamline even the most complex manufacturing workflows. Apttus, for example, is using the Salesforce1 platform to enable more intuitive inquiry-to-order workflows that drive greater sales, integrating ERP data in the process. What's also noteworthy is how these solutions integrate with CAD systems, dynamically creating the schematics needed for engineered products.

Unifying customer data, analytics and reporting across all front-office and manufacturing operations is a strong catalyst for continual selling and profit gains. Having a very clear, contextual view of customers' requirements through every phase of the inquiry-to-order and quote-to-cash processes can save manufacturers days and weeks of lost productivity.

Tackling the toughest of all manufacturing systems challenges: getting people to use a new system. The approach that Apttus and Salesforce are driving provides manufacturers with flexibility in defining the user experience and will definitely help solve this challenge. What's noteworthy about their announcement this week in London is how well orchestrated the partner strategy is across the value chains of each manufacturing segment.

A single, scalable, secure system of engagement that is customer driven, not transaction driven. Legacy ERP systems excel at being the system of record for transactions. The trouble is that manufacturers need to know more than just transaction data; they need contextual customer intelligence, including why customers are buying today.

Coming Events of Interest

2014 Akamai Government Forum — June 5th in Washington, DC. The 2014 Akamai Government Forum will present a new path forward for the future of cloud-based solutions and cybersecurity in federal government. The event will be produced by Government Executive Media Group, the government and defense division of Atlantic Media.

Enterprise Apps World — June 17th-18th in London, England. EAW is a two day show, co-hosted with Cloud World Forum, that will look at all the implications of going mobile in the workplace and how enterprise apps can help.

Silicon Valley Innovation Summit — July 29th-30th in Mountain View, CA. AlwaysOn's 12th annual SVIS is a two-day executive gathering that highlights the significant economic, political, and commercial trends affecting the global technology industries. SVIS features the most innovative companies, eminent technologists, influential investors, and journalists in keynote presentations, panel debates, and private company CEO showcases.

International Conference on Internet and Distributed Computing Systems — September 22nd in Calabria, Italy. The IDCS 2014 conference is the sixth in its series to promote research in diverse fields related to Internet and distributed computing systems. The emergence of the web as a ubiquitous platform for innovations has laid the foundation for the rapid growth of the Internet.

CLOUD DEVELOPERS SUMMIT & EXPO 2014 — October 1st-2nd in Austin, TX. CDSE:2014 will feature co-located instructional workshops and conference sessions on six tracks facilitated by more than one hundred industry-leading speakers and world-class technical trainers.

International Conference on Cloud Computing Research & Innovation — October 29th-30th in Singapore. ICCRI:2014 covers a wide range of research interests and innovative applications in cloud computing and related topics. The unique mix of R&D, end-user, and industry audience members promises interesting discussion, networking, and business opportunities in translational research & development.

Copyright 2008 Distributed Computing Industry Association