Distributed Computing Industry
Weekly Newsletter

Partners & Sponsors

ABI Research

Acolyst

Amazon Web Services

Apptix

Aspiryon

Axios Systems

Clear Government Solutions

CSC Leasing Company

CyrusOne

FalconStor

General Dynamics Information Technology

IBM

NetApp

Oracle

SoftServe

Trend Micro

VeriStor

VirtualQube

September 1, 2014
Volume XLIX, Issue 6


Paragon at CLOUD DEVELOPERS SUMMIT & EXPO

The DCIA & CCA are very pleased to welcome Paragon to the CLOUD DEVELOPERS SUMMIT & EXPO 2014 (CDSE:2014), featuring the top-ten cloud brands Amazon, Dell, Google, HP, IBM, Microsoft, NetSuite, Oracle, Rackspace, and SAP, among many other cloud-computing innovators.

Delegate registration at early-bird rates ends September 6th for CDSE:2014, which will take place in Austin, TX on October 1st and 2nd.

Business strategy and technical sessions covering the latest trends — Mobile Cloud, DevOps, and Big Data — as well as general interest cloud service topics will be featured along with a special focus on three economic sectors experiencing the most cloud adoption: Media & Entertainment, Government & Military, and Healthcare & Life Sciences.

Paragon is a CDSE:2014 Silver Sponsor.

Paragon provides client-centric, customized program management, information technology (IT), and logistics services to federal agencies of all sizes in every arena, helping them to manage processes, optimize operations, and increase their return on investment (ROI).

Among its many service offerings are cyber security planning and accreditation, representing the latest in security capabilities: security compliance preparation and accreditation following NIST standards and FedRAMP requirements.

Keynoting for Paragon will be Bob Littlejohn, Senior Security Engineer, Paragon Technology Group.

He has over 20 years of experience in information system security and accreditation with a variety of agencies, starting as a system security analyst with McDonnell Douglas, and including information security with Johnson Controls World Services, NCI Information Systems, Unisys, APPTIS, Newberry Group, and Booz Allen Hamilton.

Bob Littlejohn was responsible for certification and accreditation of systems for PEO STRI in Orlando, FL; the Air Force Weather Agency in Omaha, NE; US Transportation Command in southern IL; Sandia National Laboratories in Albuquerque, NM; and the Air Force Flight Test Center at Edwards AFB, CA. He also assisted with development of the US Coast Guard certification and accreditation strategy for Telecommunication and Information Systems Command in Springfield, VA.

Conducting a Paragon Workshop at CDSE:2014 will be Mark Clark, Systems Security Engineer, Paragon Technology Group.

Combating cyber threats is rewarding for him because it helps protect information from those who intend to exploit it in a harmful manner.

Mark Clark's most interesting tasks are those that require resolution of a non-routine security problem. 

"One of the biggest challenges security engineering faces is how to interpret and apply pre-existing security regulations to rapidly changing technology and cyber threats. Crafting unique solutions to mitigate these threats is rewarding," he says.

Paragon's participation exemplifies the two major offerings of CDSE:2014:

During the business conference at CDSE:2014, thirty-six highly focused strategic and technical keynotes, breakout panels, and Q&A sessions will thoroughly explore cloud computing solutions, and ample opportunities will be provided for one-on-one networking with the major players in this space.

At eighteen co-located CDSE:2014 instructional workshops and special seminars facilitated by industry leading speakers and world-class technical trainers, attendees will see, hear, learn, and master critical skills in sessions devoted to the unique challenges and opportunities for developers, programmers, and solutions architects.

Register now before early-bird rates expire.

Cloud Privacy & Data Security Webinar

The Distributed Computing Industry Association (DCIA) and Edwards Wildman Palmer will present "Cutting Edge Developments Affecting Cloud Companies in Privacy & Data Security," a one-hour webinar on Tuesday September 16th at 12:00 PM ET, featuring a discussion of key issues affecting privacy and security in data and high tech and a preview of what to expect at the CLOUD DEVELOPERS SUMMIT & EXPO 2014 (CDSE:2014).

Topics will include recent court decisions affecting data privacy, security protection and disclosure — what you need to know to survive; legislation, regulations and guidance that will further impact privacy, security and liability — how you should prepare now for coming changes; and practical suggestions for addressing the natural tension between providers and customers: why it's cost effective to take steps in advance to avoid future conflict.

Please click here to register by September 10th.

CEOs, CFOs, CTOs, CIOs, and in-house counsel of companies deploying cloud services as enterprise end-users, as well as executives in these roles at cloud service providers, should attend.

CLE credit has been applied for in NY, CA, IL, and RI. Although multiple participants are welcome to join this program, any person who seeks CLE credit for attendance must be logged-in individually and remain logged in throughout the duration of the program to receive credit.

Speakers will include Edwards Wildman's Michael Bennett, Lawrence Freedman, and Thomas Smedinghoff. The DCIA's Marty Lafferty will moderate.

Michael Bennett, a Partner in EWP's Chicago office, counsels clients on a variety of technology issues including big data, wireless communications, machine-to-machine wireless communications, in-bound and outbound sourcing, SaaS, PaaS, IaaS, and all aspects of cloud computing.

Lawrence Freedman is a Partner in the firm's Washington, DC office and formerly CEO of a communications/cloud company. Larry advises clients on a full range of strategic, contractual, and regulatory compliance issues, including data privacy and security issues, associated with the development and deployment of cloud computing strategies.

Thomas Smedinghoff is also a Partner in the Chicago office. He is internationally recognized for his leadership in addressing emerging legal issues regarding electronic transactions, identity management, privacy, information security, and online authentication issues from both a transactional and public policy perspective.

Marty Lafferty is CEO of the DCIA, a trade organization whose members include a range of companies involved in cloud computing and providing platforms for storage, transmission, and exhibition of content: software application developers, broadband network operators, and digital media rights holders. Prior to DCIA, Marty served in senior positions for some of the world's most innovative entertainment and technology companies.

Shekhar Gupta Joins DCIA Member Services

Please warmly welcome Shekhar Gupta to the senior management team of the Distributed Computing Industry Association (DCIA).

Shekhar brings extensive experience as an accomplished executive with a successful track record of launching and managing telecom networks and products to his leadership role in DCIA Member Services.

Shekhar has built and managed networks valued at as much as $100 million and has been very successful in creating new revenue streams for the companies he has served. He has also built new companies from the ground up to multimillion-dollar revenues. He holds over 40 patents.

He brings 18 years of experience in various roles from product management to engineering and planning to operations for Fortune 100 and start-up companies both domestically and internationally.

Shekhar currently works for healthcare company Helping Solution in Kansas City, KS, where he manages its social media and the development/deployment of Telehealth, Internet of Things (IoT), Big Data, and Cloud Products & Services.

Previously, Shekhar worked for ARRIS/Motorola/Google, where he led new products as well as hosted and consulting services in CCAP migration, BW Management & Optimization, QoE, and Connected Homes.

Prior to ARRIS, as Senior Director for OPNET Technologies, a network and application monitoring company, Shekhar managed its International Professional Services Group, where his accomplishments included acquiring the London Police and O2 as major customers in addition to adding AT&T domestically. He also reduced non-revenue-generating services to improve company net income.

Before OPNET, Shekhar worked for CenturyLink, the third-largest local telecom in the US, where he led its SDN, VoIP, IPTV, MDU/MTU, and Smart Home products, among others.

Shekhar has numerous industry certifications, a BS in Electrical Engineering from University of Nebraska, and a Doctor of Ophthalmology degree from India.

Report from CEO Marty Lafferty

The Digital Due Process (DDP) coalition, in which the DCIA is a participant, has for more than two years advocated reform of the seriously outdated Electronic Communications Privacy Act (ECPA) to protect data stored in the cloud.

Very significant progress has been made in recent months to garner support for HR 1852: The Email Privacy Act (EPA) in the US House of Representatives, and its companion in the US Senate, S 607: Electronic Communications Privacy Act Amendments Act (ECPAAA).

American lawmakers can show they take their constituents' privacy seriously — and that they can enact meaningful reform, which will level the playing field for the protection of electronic communications — by passing these bills.

These measures will require government agents to obtain warrants from a judge in order to force service providers to disclose private data they store in the cloud for their customers.

DCIA Member companies and leading private sector organizations are invited to sign on to the following letter that DDP will present to Congressional leaders this week.

If you'd like to sign on, please email me here.

We'll add your company or entity's name to the signatories, and send you a final copy for your records.

The Senate version is shown below — the House version will reference that there are now 260 co-sponsors, a majority of Members:

"We write to urge you to bring to the floor S. 607, the bipartisan Leahy-Lee bill updating the Electronic Communications Privacy Act (ECPA).

Updating ECPA would respond to the deeply held concerns of Americans about their privacy. S. 607 would make it clear that the warrant standard of the U.S. Constitution applies to private digital information just as it applies to physical property.

The Leahy-Lee bill would aid American companies seeking to innovate and compete globally. It would eliminate outdated discrepancies between the legal process for government access to data stored locally in one's home or office and the process for the same data stored with third parties in the Internet 'cloud.'

Consumers and businesses large and small are increasingly taking advantage of the efficiencies offered by web-based services. American companies have been leaders in this field. Yet ECPA, written in 1986, says that data stored in the cloud should be afforded less protection than data stored locally. Removing uncertainty about the standards for government access to data stored online will encourage consumers and companies, including those outside the U.S., to utilize these services.

S. 607 would not impede law enforcement. The U.S. Department of Justice already follows the warrant-for-content rule of S. 607. The only resistance to reform comes from civil regulatory agencies that want an exception allowing them to obtain the content of customer documents and communications directly from third party service providers.

That would expand government power; government regulators currently cannot compel service providers to disclose their customers' communications. It would prejudice the innovative services that we want to support, creating one procedure for data stored locally and a different one for data stored in the cloud. For these reasons, we oppose a carve-out for regulatory agencies or other rules that would treat private data differently depending on the type of technology used to store it.

S. 607 was approved by the Judiciary Committee last year. (H.R. 1852 is co-sponsored by over 260 Members, including a majority of the majority.) We urge you to bring it to the floor. We believe it would pass overwhelmingly, proving to Americans and the rest of the world that the U.S. legal system values privacy in the digital age."

With HR 1852 and S 607, Congress has the rare opportunity to update digital communications privacy for the 21st century by providing online communications the same privacy protections as offline communications, as guaranteed by the Fourth Amendment.

Feel free to contact me with questions. Share wisely, and take care.

Data Breaches in the Cloud: Who's Responsible?

Excerpted from GovTech Report by Jessica Hughes

The risk of a data breach in the cloud is multiplying: breaches there are now both costlier and more frequent, according to a recent study by the Ponemon Institute.

But this phenomenon, which is dubbed the cloud multiplier effect, can be mitigated by a strengthened security posture, according to Larry Ponemon, Chairman of the Ponemon Institute.

"It's funny, I'm a big believer in the cloud," Ponemon said. "I like cloud and I think cloud has improved quite a bit from a security perspective."

Cloud computing is not necessarily less secure, Ponemon said, but that is the perception among many of the study's respondents, who view on-premises data breaches as easier to control and less costly as a result.

"It's kind of a level of complexity you are adding because now you're relying on a third party to do the right steps," Ponemon said.

That many cloud environments are in fact secure is a sentiment echoed by several government CIOs, who commented on the security of their own cloud environments and the possibility of a breach within them.

"From what I know, and certainly from a mid-size city characteristic, the reputable cloud vendors have better security than we have," said Michael Armstrong, CIO of Corpus Christi, TX.

Still, there are notes of cloud security pessimism from IT officials and security practitioners throughout the study, Data Breach: The Cloud Multiplier Effect.

For instance, 66 percent of respondents said their organization's use of a cloud resource diminishes its ability to protect confidential or sensitive information.

The study compiled and commented on responses from 613 IT practitioners on questions related to cloud security, including who is responsible for a breach after it happens.

Here are the survey stats at a glance:

71 percent said they would not receive immediate notifications involving the loss or theft of customer data.

66 percent of respondents said their organization's use of cloud resources diminishes its ability to protect confidential or sensitive information.

62 percent said they believed the cloud services in use by their organizations are not thoroughly vetted for security before being used.

55 percent responded that they don't believe their IT leader is responsible for ensuring their information is secure.

51 percent said on-premises IT is equally or less secure than cloud-based services.

Although it's impossible to know the motivations of the study's respondents, Ponemon said he suspects their mixed view of cloud security is itself a mix of truth and perception.

Ponemon said he understands some of the study's negativity because he's seen data breach research on public cloud providers not taking the appropriate security steps and breaches occurring. But, he said, few of this study's respondents reported breaches of their own.

One security gamble when moving to the cloud is the data owner's loss of control. Ponemon said that when an organization owns its own data center, security staff can observe and control Internet traffic easily and configure firewalls to its liking. Visibility, he said, is a core issue to security.

"Not having that ability, that visibility, makes it very hard for the company that's entrusting the cloud provider to make sure that all these steps are being taken properly," Ponemon said.

Although organizations can mitigate or reduce the risk of a breach by vetting cloud provider security practices and conducting audits of the data stored in the cloud, the majority of companies are not conducting these practices, according to the study. One reason is that the procurement process may be happening outside of IT's purview, Ponemon said.

Pennsylvania, which is in the process of unifying its seven data centers with a Unisys hybrid cloud, included in its contract the flexibility to personally conduct audits or have regulatory agencies conduct them, said Tony Encinias, CIO of Pennsylvania. "We need to make sure we satisfy the requirements," he said, "and we also need to make sure that we're doing it smart."

But when organizations don't take extra steps to ensure the cloud is secure, this feeds into the cloud multiplier effect, Ponemon said. Also contributing to the effect: the number of mobile and other devices accessing the cloud; increased cloud dependency; and lack of visibility about what's in the cloud, which may put sensitive or confidential information at risk, according to the study.

Increasing the backup and storage of sensitive or confidential information in the cloud also tops a list of nine scenarios as most costly in a data breach, according to the study.

Corpus Christi's Armstrong said he believes the cloud multiplier effect can be diminished. "I think you can mitigate that by being very careful about where you put your stuff." He said he's especially careful about the arrangements he makes with core business applications in cloud environments.

Corpus Christi has had some major business applications in the cloud for five years, including the full Infor Lawson suite of applications in the Infor Business Cloud. Armstrong said there is sensitive information he won't store in the cloud now, though he will likely reconsider in the next decade since cloud vendors are getting more reliable and secure.

Although there is some distrust surrounding security in the cloud, 51 percent of respondents in the study answered that on-premises IT is equal to or less secure than cloud-based services.

"There are things that make the cloud very, very secure. You just have to be careful and have some vigilance," Ponemon said.

King County, WA, has platforms in place to cover the three areas of cloud computing: IaaS (infrastructure as a service) with Amazon and DLT; PaaS (platform as a service) with Microsoft CRM and Office 365; and SaaS (software as a service) with the county's prosecuting attorney's case management system.

Each cloud project was held to the county's security and audit requirements, and had to get clearance from the county's team, including risk managers, prosecuting attorneys, Health Insurance Portability and Accountability Act and criminal justice information services security specialists, an IT security officer, and procurement and contracts officers.

Bill Kehoe, CIO of King County, said he takes time to educate his staff about cloud environments and their risks. "I think you've got to be careful," he said. "You can't just throw your data into any cloud environment."

That's one reason why the county contracts with established cloud vendors, like Amazon and Microsoft, he said, adding that security staff, standard cloud architecture, security controls and diverse audits all figure into the security of larger cloud environments.

According to the study, there is a general feeling that outside forces, not internal security, are to be relied on to protect data in the cloud: 55 percent of practitioners responded that they don't believe their IT leader is responsible for ensuring their information is secure.

"I would submit that it's everyone's responsibility to ensure the safety of the data in the cloud." Encinias said. "It's the service provider's responsibility, it's the data owner's responsibility and, as the commonwealth of Pennsylvania CIO, I'm also definitely responsible."

With responsibility distributed, though, the terms and conditions of cloud services become more difficult to agree on, because the two parties must decide which will pay, and when, Encinias said.

Pennsylvania's recently executed contract with Unisys, which took four to five months to put together, states that the cloud provider must offer certain information during a breach and must also help facilitate mitigation. In the case of a breach, responsibility is declared after an analysis, Encinias said.

And once a breach occurs, everything circles back to indemnification, or the protection from having to pay for another's negligence, Corpus Christi's Armstrong said. Indemnification appears in contracts, but cloud users are also protected by regulatory penalties and laws.

"The element of risk that you bear is defined in your contract documents," Armstrong said, "so it really pays, whatever time it takes, to get that right."

King County's Kehoe said he's finding that who is responsible and to what tune varies depending on the cloud environment and what portion of the technology stack the vendor is responsible for. Since breaches are costly, he said this nuance is important for his staff to understand.

"The cloud is so new to government that our security, risk management and legal council need to better understand the risks and how the contracts need to be different in terms of indemnification language for each of the cloud environments," he said.

For instance, IaaS and PaaS can present challenges in parsing responsibility because risk and responsibility are more broadly shared, whereas with SaaS the vendor owns everything but the data.

Along with deciding who is responsible for a breach, there also are questions surrounding timely breach notifications from cloud service providers. The survey reports that 71 percent of respondents fear their provider would not immediately notify their organization in a loss or theft of customer data.

Timely notification can be a problem, Ponemon said, along with whether stolen or lost data is discovered by the cloud provider.

Notification of a breach is a contract element, and monitoring data and suspicious activity is the responsibility of the hosting company, Armstrong said. But a lot also depends on the relationship between the vendor and the purchaser.

"At some point you've got to develop a level of trust that they have your interest in mind and that they're going to do the right thing," he said. "If you selected a good partner, if there is a data breach, you'll be able to work through that and understand the root cause of it."

To help with the indemnification and communication questions, many governments, including Corpus Christi, are covering themselves with data breach insurance that protects governments from things like notification costs and federal penalties, which are levied before responsibility is declared.

"None of this stuff is really straightforward yet, so you've got to protect yourself," Armstrong said.

Indeed, the market for insurance is on the rise, with an adoption rate of about 30 percent, Ponemon said. Companies with good security practices are likelier to hold such insurance, according to another Ponemon Institute study quantifying the cost of a data breach.

Options like insurance can protect municipalities that may not have the right tools or resources in a data breach, Ponemon said.

And if going with cloud makes sense, Ponemon suggests organizations make the decision looking at the whole picture: "Make sure it's not just a cost decision, that it's based on cost, quality, the ability to deploy quickly. All of that good stuff should be determined in advance. Then I think it would be, in many cases, a big improvement for government organizations.

"The key," he added, "is if you're going to do cloud, just do it safely."

Sweeping BitTorrent Sync Update Streamlines Cloudless File-Sharing

Excerpted from PC World Report by Ian Paul

BitTorrent Sync offers an enticing promise: share any folders you want across all your devices using peer-to-peer (P2P) networking — no cloud necessary. And people are buying into that promise, according to BitTorrent, with more than 10 million installs of the app and 80 petabytes of data transferred.

Despite those promising numbers, however, early versions of Sync were a little confusing for non-power users.

That changes on Tuesday with the introduction of BitTorrent Sync 1.4 Beta for Windows, OS X, Linux, Android, iOS, and Windows Phone. The newest version of Sync is a giant leap forward in usability thanks to its improved sharing capabilities and much needed visual overhaul.

I've been using Sync 1.4 for a few days. Here are my initial impressions using the new app on Windows 8.1 and Android.

The first thing you notice when you fire up Sync 1.4 on Windows is how much better it looks than previous versions. The utilitarian, rudimentary tabbed interface is gone, replaced with a non-tabbed flat design in line with the Windows 8.1 aesthetic.

By default, the new Sync window shows two columns: folder names and sync status. Click on the settings cog in the upper right corner and select Preferences to add other columns with additional information such as receiving, sending, progress, path, size, date added, date synced, and peers.

Note also that the top of the general preferences window holds three key bits of information: your user name, the device name, and the device's fingerprint. All three are meant to help others identify you (more on that later). Most users probably won't bother with these settings, but the security conscious among you will want to take note.

The new Sync also adds more per-folder controls including a variety of syncing preferences, a list of peers sharing your files, a disconnect button, and access to each folder's archive. Hover over a folder and select the three vertical dots on the far right to access this menu.

As with previous versions of BitTorrent Sync, you can't share individual files—only folders. To add a folder to Sync 1.4, click the blue Add folder button in the top left corner of the app. This opens an Explorer window where you can choose the folder you want to add to Sync.

Once you've added a folder you're ready to share it with others.

Hover over any folder in Sync and you'll see a new sharing icon, which makes it far easier to move data between devices than the old secret code-sharing method. Click the Share icon and a new window pops up with the choice to email or copy an app-specific link that will automatically open Sync on the recipient's computer. For mobile devices you still scan a QR code as before.

By default, Sync offers read-only sharing, but you can click the Read & Write tab at the top of the sharing window to give someone full access to your folder.

BitTorrent has also added a right-click context menu to the Windows system for even easier sharing. Just navigate to the folder you want to share in Explorer, right-click the folder, and choose one of the menu options.

Let's say Jane wanted to email Bob a link to her Vacation Pictures folder. To start, she'd click the Share icon and then click the email icon in the window that appears.

This will automatically open Jane's default mail client with a boilerplate message that Jane can edit to meet her needs. The subject line includes the name of the folder she's syncing and an app-specific link in the body of the message.

Once Bob receives the email he just clicks the link in the message. If he has Sync 1.4 installed he'll be able to add the folder. If not, he'll be prompted to install Sync 1.4.
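
PC World doesn't spell out what's inside those app-specific links, but the general pattern is a custom URL scheme that the installed app registers to handle. The short Python sketch below is purely illustrative; the btsync:// scheme and the parameter names are invented for the example, not taken from BitTorrent's code.

from urllib.parse import urlparse, parse_qs

def parse_share_link(link: str) -> dict:
    parsed = urlparse(link)
    if parsed.scheme != "btsync":  # invented scheme, see note above
        raise ValueError("not a Sync share link")
    params = parse_qs(parsed.query)
    return {
        "folder": params["f"][0],            # folder being shared
        "key": params["k"][0],               # access key for that folder
        "mode": params.get("m", ["ro"])[0],  # read-only unless stated
    }

print(parse_share_link("btsync://share?f=Vacation%20Pictures&k=ABC123&m=ro"))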

Even better for Bob, Sync now automatically offers a default location to save the folder that Jane wants to share with him. Before version 1.4 it was up to Bob to figure out where to save the folder. That often ended up with some confusing naming conventions and poor choices for saving locations.

But Bob's journey isn't over yet. Once Bob accepts the link to Vacation Pictures and chooses a save location he won't start receiving data from Jane's PC right away.

By default, anyone who accepts a Sync link doesn't automatically start receiving files from the sender. Instead, the recipient has to be approved by the sender after they've accepted the sharing link.

In our example, Bob accepts the sharing link that Jane sent. Then Jane gets a notification letting her know that Bob wants to share her folder. Jane then approves that link to let Bob start syncing.

Jane could also make extra sure that Bob is who he claims to be by taking a deeper look at the approval request.

All requests appear at the top of the Sync dashboard, where Bob shows up under his chosen username, Sticky. If Jane wanted to, she could just click the green check mark to immediately share her vacation pics.

If she was a little more concerned about security, however, she could click on Bob's user name. A pop-up window would then appear showing Bob's device name, IP address, and Sync device fingerprint.

With Bob's fingerprint, Jane could call him up and read it back to him over the phone to make sure she's sharing her photos with the correct PC.
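
BitTorrent doesn't document here how that fingerprint is derived, but the standard technique is to hash the device's public key and format the digest so it's easy to read aloud. Here's a minimal Python sketch of the general idea; the key bytes and the four-character grouping are assumptions, not Sync's actual format.

import hashlib

def device_fingerprint(public_key: bytes) -> str:
    # Hash the key, then group the hex digest into 4-character
    # chunks so it can be read back over the phone.
    digest = hashlib.sha256(public_key).hexdigest().upper()
    return " ".join(digest[i:i + 4] for i in range(0, len(digest), 4))

# Both devices compute this locally; if Jane's copy of Bob's
# fingerprint matches what Bob reads out, she has the right PC.
print(device_fingerprint(b"hypothetical public key bytes"))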

For the initial sender, approvals trade a little convenience for added security, since sharing links travel over insecure channels like standard email. Anyone looking for even more security could turn to encryption tools like PGP or miniLock.

For the privacy conscious, it may be a little concerning that Sync displays your IP address to other users. On the other hand, Sync is designed for sharing files with people you know and trust, and the chances of data leakage to third-party snoops are minimized: all traffic between devices is secured with 128-bit AES encryption, and initial connections are protected with perfect forward secrecy.
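
The article doesn't detail Sync's wire protocol, but the core idea is ordinary symmetric encryption: once two devices agree on a session key, every chunk of file data is sealed before it crosses the network. Here's a rough Python sketch using the cryptography package's AES-GCM primitive; note that the key exchange that actually provides the forward secrecy is simplified here to a local random key.

import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Assumed for illustration: a 128-bit session key. In Sync this
# would come from the peers' initial handshake, not a local call.
session_key = AESGCM.generate_key(bit_length=128)
aead = AESGCM(session_key)

def encrypt_chunk(plaintext: bytes) -> bytes:
    nonce = os.urandom(12)  # must be unique for every message
    return nonce + aead.encrypt(nonce, plaintext, None)

def decrypt_chunk(blob: bytes) -> bytes:
    nonce, ciphertext = blob[:12], blob[12:]
    return aead.decrypt(nonce, ciphertext, None)

chunk = b"one block of vacation-photo data"
assert decrypt_chunk(encrypt_chunk(chunk)) == chunk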

There are also advanced sharing options.

Going back to Jane's initial sharing window, underneath the three basic sharing options is an Advanced tab that offers three crucial settings.

Click on Advanced and the first thing Jane sees is a check box set to make the sharing link expire in 3 days. If Bob doesn't use the link by then he won't be able to access Jane's photos. Jane could also adjust the time of expiry to any number of days she wants. Alternatively, she could uncheck the box and the link would never expire.

Underneath that box there's also a setting to restrict how many times the link can be used. If Jane was sending the same link to five people, she could limit the link's use to five times.

The last box "Peers I invite must be approved on this device" is the setting requiring approvals before sharing. To turn this setting off just uncheck the box. Also note that any folders shared with others under previous versions of Sync won't have approvals turned on by default.
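
Mechanically, an expiring, use-limited link is just a little bookkeeping attached to the shared folder's key. The sketch below is a hypothetical model rather than BitTorrent's code; it shows the two checks a redemption attempt would have to pass.

from dataclasses import dataclass, field
from datetime import datetime, timedelta
from typing import Optional

@dataclass
class ShareLink:
    # Defaults mirror the UI described above: 3-day expiry, no use cap.
    expires_at: Optional[datetime] = field(
        default_factory=lambda: datetime.now() + timedelta(days=3))
    max_uses: Optional[int] = None  # None means unlimited
    uses: int = 0

    def redeem(self) -> bool:
        if self.expires_at is not None and datetime.now() > self.expires_at:
            return False  # link has expired
        if self.max_uses is not None and self.uses >= self.max_uses:
            return False  # use limit reached
        self.uses += 1
        return True

# Jane sends the same link to five people and caps it at five uses.
link = ShareLink(max_uses=5)
assert all(link.redeem() for _ in range(5))
assert not link.redeem()  # a sixth redemption is refused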

Like its Windows counterpart, the Android version of Sync has also received a visual overhaul. For the most part, the new version is much better and works similarly, including the new sharing method.

The one thing I noticed that's a little different and somewhat non-obvious is sharing content (other than photos) from your Android device to a PC or other mobile device. There used to be an Add folder button in Sync 1.3, but that's gone with version 1.4.

To share a folder on your Android phone or tablet, click the menu icon in the upper right corner (the three vertical dots) and select New backup.

A new window pops up showing options to add the pictures, music, movies, or downloads folder to Sync. You can also tap Custom location to pick a specific folder in the file system. Once you've chosen a folder to share, you can send links around to other devices or friends just like with the PC version of Sync 1.4.

Going the other way, adding a folder from a PC to your smartphone is easy and uses the same QR code scanning method as previous versions of Sync. The major difference is that Android users no longer have to choose a save location, just like their iOS-using counterparts.

Sync is still officially in beta and some users may run into bugs or instability issues with 1.4. Nevertheless, Sync 1.4 is a solid update that is much easier to use and well within the grasp of everyday users.

Telefonica Group CIO: Using Cloud to Regain Ground Lost to OTT Players

Excerpted from Business Cloud News Report by Jonathan Brandon

With operations in 24 countries and over 120,000 employees globally, Telefonica Group, one of the largest telecommunications companies in the world, is looking to put information technology (IT) at the core of everything it does in order to compete globally in an industry currently in the throes of a digital revolution. Phil Jordan, Group Chief Information Officer at Telefonica, tells BCN that cloud is at the center of how the company plans to regain terrain lost to over-the-top (OTT) players, make its core mobile and fixed line operations more flexible and scalable, and enable it to provide next generation digital services.

"IT is a strategic differentiator for the company, but it hasn't always been perceived that way, even when I became CIO," says Jordan, who took up his current role at Telefonica in 2011. Context, he says, is essential to understanding why this perception proliferated within the company and throughout the telecoms sector more broadly, and why the operator is currently spearheading so many new digital initiatives.

"We're a business operating in an industry that failed to innovate fast enough, which has probably created the opportunity for the Vibers and the WhatsApps of the world, and I think we've failed to innovate in our own product because, like a lot of businesses, we felt we didn't need to because we were making lots of money," he explains. "You get complacent."

The situation is cause for concern, not just at Telefonica but for most telcos, which could be losing up to $386 billion cumulatively by 2018 to OTT players like Skype and WhatsApp according to research and consulting firm Ovum.

It is within this context that Telefonica has sought to become more digital, an effort complicated by its continued growth and a series of acquisitions. As one of the biggest and oldest telcos with lots of fixed line heritage, it has accumulated a very complex IT systems landscape over the years, which Jordan says the company is constantly trying to simplify.

"One of our big challenges as a federation of separate businesses is that we have to remove complexity. We've managed to overcomplicate the industry, our business, and our internal systems over the last 20 years," he says.

But the biggest challenge, he explains, is inextricably linked to what Jordan believes is the biggest opportunity for the operator: data.

"We've always had a tremendous amount of data, we've always been a big data company, but how do you derive insights from the data? Because of systems fragmentation, we've struggled to derive real insight and particularly global insight through the use of data. Getting a 360 degree view of our customers is actually a much bigger challenge for us than working out how to leverage analytics and big data systems."

Because of the way the industry grew, Jordan explains, and the lack of recognition of how important IT would become, "we've ended up with such fragmented systems that they don't really lend themselves well to forming that 360 degree customer view. But that's a problem because data is the new battleground and the future differentiator for our industry."

"We've gone from having almost 7,000 systems three years ago, down to 4,200 now, so we're slowly simplifying our estate."

The company's SaaS and virtualization strategies are central to this process. Telefonica is a large user of Office 365, SAP SuccessFactors and Salesforce among other big name cloud services. It deploys these services from a private cloud platform hosted in its massive data center in Spain, which is one of the largest in Europe, dwarfed only by Portugal Telecom's recently announced Covilha data center.

By centralizing these services the company is able to leverage its cloud platform and generate operational efficiencies through a shared services approach, while ensuring local standards and business processes can be maintained where necessary. And by purchasing commoditized solutions off the shelf, the company has enjoyed significant cost benefits.

The operator has also virtualized a number of its internal platforms, which allows the company to sweat its existing assets and make its data center resources more scalable and flexible.

By the end of this year Jordan's team will have virtualized about 40 per cent of the group's IT servers, and the company is now taking this approach to the most mission-critical system of all — the core network, through network function virtualization (NFV). Along with global CTO Enrique Blanco and his team the company is working on a proof of concept for a virtual radio access network (vRAN). Work on a virtual Evolved Packet Core, vIMS, vDNS and vDHCP is also set to conclude this year as the operator looks to virtualize 30 per cent of all new infrastructure by 2016.

Virtualizing these systems will allow Telefonica to deploy network assets in a way that allows them to be managed centrally and deployed globally, while making them more flexible, scalable, and less expensive to acquire and maintain than legacy networking hardware.

The operator has taken a slightly different approach with its business support and operational support systems. Telefonica recognizes the need to transform BSS and OSS in order to have these digital capabilities and a foundation for the future. But it's not consolidating these systems or putting one BSS or OSS across the group because it's just not practical or doable across many group companies, Jordan explains.

"We are doing greenfield BSS implementations in 14 separate countries at the moment, so we've accelerated beyond belief in pace and the urgency, and these are with the same standard processes and architecture, using three different vendors, heavily based on standards and reused on processes. So core BSS and OSS processes reused country to country, in clusters of the same technology."

The company's multimillion dollar investment in its global BSS overhaul has as much to do with eliminating data systems fragmentation and simplifying the back-end of the services it offers as with driving digital engagement with customers and readying the company for a future that, Jordan says, increasingly has digital services at its core.

"We must become more digital in our interactions with our customers. The new generation of Telefonica customers don't engage with us in the way they used to. Online is an important channel for us but a 'digital only' channel experience needs to be created in all the countries we operate in."

But Jordan says this is part of a broader strategy to drive digital services — particularly platform-based services — within Telefonica, and as a key component of its market offerings.

"I don't think we'll ever be able to innovate as fast as the model that now exists around the Internet and around the digital world, so adopting an open platform that innovators and entrepreneurs want to base their applications and their services on, and provide capabilities to innovate with, is central to where we want to go, and becoming a platform business, offering platform as a service, is a key element to our future."

The company's transformation into one primarily focused on digital services hasn't been easy, but it has made some bold changes to encourage its move in that direction, beyond what's already been discussed above of course.

Telefonica Digital, for instance, the London-headquartered independent division of the telco dealing primarily in digital services and data-driven marketing innovations, was closed down and re-integrated into the company in February this year. The move was aimed at cutting costs, but it was also intended to centralize the type of work it was doing back in Madrid, the company's headquarters, and weave the 'digital-first' approach into the broader group rather than keep it at arm's length.

Jordan says that there is no longer any doubt of the convergence of IT and telecoms, both in terms of the players competing to offer digital (cloud) services, and the technologies at their foundation. The company is going to continue working towards the virtualization of its network, having recently announced a partnership with Brocade to explore the capabilities of virtualized network appliances running on standard, x86 Intel-based hardware.

These kinds of innovations are central to the company's strategy of simplifying and improving the scalability and flexibility of its network, particularly as it looks to target the Internet of Things (IoT).

"We see the opportunity for monetizing data on networks through wearables. It allows us to sweat the assets, and M2M is a great way to schedule data outside human peaks. It's a great way to optimize our own networks, and we also want to build services around IoT. All these devices that will be embedded will need to talk to one another," he says.

"We need to build and aggregate services around this. But we need to monetize it because it's expensive to build networks at a time when these services are commoditized and easily disrupted by other people. It's a good challenge."

Jordan says that in terms of internal systems, what really excites him is wearables. He would like to see how the company could make use of wearable devices and cloud services in combination at Telefonica, particularly for field service engineers. It could help make their jobs safer, and help them work more efficiently.

He explains that if he had the choice to start from scratch and greenfield everything, he would put as much emphasis on the telecoms business model itself as the systems that help it run, though he admits this is more of a distant fantasy than anything.

"My greenfield dream would be: don't run twenty countries as independent companies; don't engage with independent regulators in each country. I think it could be radically simplified. Rather than 7,000 systems I think a group telco should be able to run on around 100," he says, adding that only then will telcos truly be able to keep up with OTT players.

"Imagine the agility and pace that you would gain if you could run on 100 systems. We still have mainframe in Telefonica."

Rising Cloud Service Adoption Opens Fresh Revenue Stream for Telcos

Excerpted from Business Daily Online Report by Ben Uzor

A fresh revenue opportunity is opening up for telecommunications companies in Nigeria as large and small corporates in various industry verticals seeking to cut down on Information Technology (IT) costs increasingly jump onboard the cloud computing train, market observers say.

Cloud computing refers to the use of computing resources (hardware and software) delivered as a service over the Internet.

Given the immense cost-saving and efficiency benefits brought about by cloud-based services, Nigerian firms are eager to pay for cloud technologies, which telcos can deliver to them efficiently.

A recent study released by World Wide Worx and Cisco shows that only 36 percent of Nigerian firms were using the cloud in 2013. A significant 44 percent of Nigerian firms, however, said they would indeed embrace the cloud in 2014, bringing the total to 80 percent.

"This is certainly a growth area for the telecoms industry. Virtually every telecoms operator is looking into data hosting centers," said Emmanuel Onyeje, chief operating officer, Zinox Technologies.

Globacom, the second national carrier, which already runs one data facility in Lagos, recently announced a second center, which will be one of Nigeria's largest data centers on completion.

Also, MTN Nigeria has deployed a high-grade data center with 500 square meters of collocation and hosting space in Lagos. By virtue of this infrastructure, the company is offering cloud-based services targeted at small and medium enterprises (SMEs) across various industry verticals in Nigeria.

The case is the same for Bharti Airtel, which opened its 1,858-square-meter Lagos data center in 2012 to cater to its subscriber base.

MainOne, an undersea cable operator, is also constructing a tier-3 data center in Lagos, tipped to be the largest in West Africa. The data center is expected to be completed in Q4 2014.

These projects, according to market observers, signify the growing presence of telecoms operators in the data center industry and the lure of attractive returns from providing infrastructure.

In 2014, the overall cloud services market revenue will reach $209 billion, according to MarketsandMarkets, a research company. Market observers are of the opinion that telcos are angling for a piece of the pie, with a view to making up for falling revenues from their voice services segment.

"We will continue to explore opportunities inherent in cloud computing to help in stimulating the economic growth and development of this great country," said Babatunde Osho, chief enterprise solutions officer, MTN Nigeria.

"The financial services sector such as all the major private banks and the Central Bank of Nigeria (CBN) are particularly interested in cloud computing as a means of reducing costs, by sharing services," said Dele Akinsade, director of developer platform for West, East and Central Africa, Microsoft.

Available statistics, however, show that banks, for instance, spent an estimated $900 million on IT infrastructure last year in a strategic attempt to improve customer satisfaction and market dominance.

Industry watchers say telcos see the opportunity and are positioning their respective networks to offer a myriad of cloud-based services, though infrastructural, security and regulatory concerns still remain a big issue.

"The government can support the cloud computing industry by developing a robust national cyber security policy," said Olusola Teniola, chief executive officer, IS Internet Solutions.

"Such regulation is a key building block for a cloud industry and would go a long way towards reassuring Nigerian businesses to host data onshore," Teniola said.

In the midst of implementing International Financial Reporting Standards (IFRS) in 2011, the apex bank announced plans that required lenders to maintain independent servers within Nigeria to manage their data.

Nigerian-owned Inlaks Computers has designed a number of private clouds for clients like FirstBank, UBA, and the CBN. The apex bank has also clearly stated its desire to cut banks' operating costs by pooling resources such as power and IT systems. Such moves are encouraging the growth of domestic cloud-hosting capacity.

Nigeria has a cloud computing market potential of $1 billion, according to industry analysts, but broadband infrastructure bottlenecks form a critical drawback hindering the steady adoption of cloud services.

Net Neutrality Vs. Free Speech

Excerpted from Wall St. Journal Report by Robert McDowell

As the Federal Communications Commission's September 15th deadline for public comment on its new net-neutrality rules approaches, the "open Internet" movement has taken an unexpected turn toward undermining free speech. Net-neutrality activists have long tried to sell the public on the need to protect people from Internet service providers blocking or degrading consumer access to websites and online applications. Now the cable industry has jumped on the net-neutrality bandwagon.

Historically, cable companies vehemently opposed new rules governing Internet network management. Why? Because nothing is broken in the Internet-access market that needs fixing, and existing laws could prevent or fix any future problems. Also, new net-neutrality rules would politicize business and engineering decisions and slow down lightning-fast developments in the Internet space to the sclerotic crawl of Washington bureaucrats.

Now, however, as a third misguided FCC attempt to implement net neutrality gains momentum, Time Warner Cable and the National Cable and Telecommunications Association are trying to drag Internet "edge" providers — including the websites of local broadcasters airing ABC, CBS, NBC and FOX — down with them into the regulatory abyss.

The "edge" of the Internet is where consumers go to get content such as movies from Amazon, streaming online TV shows from broadcasters and apps from companies like Uber. Thousands of start-ups sprout each year while billions in private risk capital is being plowed into a new economy that is providing unprecedented consumer freedom and benefits. Today's Internet blossomed precisely because the government kept its hands off of it.

Until now.

The FCC's attempt to turn the Internet into what amounts to a federally regulated public utility—all in the name of protecting consumers—has produced tortured logic among cable interests: If Internet service providers are going to be regulated, then websites that their subscribers watch—especially broadcasters' sites—should be regulated too.

According to comments filed with the FCC by Time Warner Cable and the National Cable and Telecommunications Association, broadcasters should not be allowed to take down or withhold the content they produce and own from online distribution even if subscribers have not paid for it—as a matter of federal law. In other words, edge providers should be forced to stream their online content no matter what. Such an overreach, of course, would lay waste to the economics of the Internet. It would also violate the First Amendment's prohibition against state-mandated, or forced, speech—the flip side of censorship.

It is possible that the cable companies figure that subjecting powerful broadcasters to anti-free speech rules will shift the political momentum in the FCC and among the public away from net neutrality. But cable's anti-free speech arguments play right into the hands of the net-neutrality crowd. They want to place the entire Internet ecosystem, physical networks, content and apps, in the hands of federal bureaucrats.

For instance, Columbia law professor Tim Wu, the architect of the movement who coined the term "net neutrality," testified before Congress in June that new rules should "capture" "media policy, social policy" and even FCC "oversight of the political process." His goal, and that of his myriad followers, is to have "FCC oversight of the Internet." Period.

Mr. Wu has outsize influence over regulators. He penned the first-ever net-neutrality conditions that were part of the FCC's approval of the AT&T-BellSouth merger in 2006. And he now appears to have a powerful industry ally, albeit perhaps unwittingly, in cable.

But America's cable companies should be careful what they wish for. History teaches us that once you invite regulators into your neighborhood to regulate your rival, it won't stop at the house across the street. Having cable argue for dragging edge-based content providers, like broadcasters or anyone else, into the morass only adds momentum to the net-neutrality effort.

Instead of sympathizing with its captors and helping to expand the dragnet of unnecessary regulations to every corner of the Internet, cable should flatly oppose new rules. The FCC has an unsuccessful track record in court after two similar power grabs in 2010 and this past January, so there's good reason to believe a hat trick of losses is in the making. Now is not the time to panic; it is the hour to persist in favor of a free Internet, and to begin preparing court appeals.

Coming Events of Interest

Cloud Connect China — September 16th-18th in Shanghai, China. This event brand was established in Silicon Valley (US) in 2008 and was first introduced into China last year, providing comprehensive cloud computing solutions through paid conferences and an exhibition.

International Conference on Internet and Distributed Computing Systems — September 22nd in Calabria, Italy. IDCS 2014 is the sixth conference in its series to promote research in diverse fields related to Internet and distributed computing systems. The emergence of the web as a ubiquitous platform for innovations has laid the foundation for the rapid growth of the Internet.

CLOUD DEVELOPERS SUMMIT & EXPO 2014 — October 1st-2nd in Austin, TX. CDSE:2014 will feature co-located instructional workshops and conference sessions on six tracks facilitated by more than one-hundred industry leading speakers and world-class technical trainers.

CloudComp 2014 — October 19th-21st in Guilin, China. The fifth annual international conference on cloud computing. The event is endorsed by the European Alliance for Innovation, a leading community-based organization devoted to the advancement of innovation in the field of ICT.

International Conference on Cloud Computing Research & Innovation — October 29th-30th in Singapore. ICCRI:2014 covers a wide range of research interests and innovative applications in cloud computing and related topics. The unique mix of R&D, end-user, and industry audience members promises interesting discussion, networking, and business opportunities in translational research & development. 

GOTO Berlin 2014 Conference — November 5th-7th in Berlin, Germany. GOTO Berlin is the enterprise software development conference designed for team leads, architects, and project managers, and is organized "for developers by developers". New technology and trends in a non-vendor forum.

PDCAT 2014 — December 9th-11th in Hong Kong. The 16th International Conference on Parallel and Distributed Computing, Applications and Technologies (PDCAT 2014) is a major forum for scientists, engineers, and practitioners throughout the world to present their latest research, results, ideas, developments and applications in all areas of parallel and distributed computing.

Storage Visions Conference — January 4th-5th in Las Vegas, NV. The fourteenth annual conference theme is: Storage with Intense Network Growth (SWING). Storage Visions Awards presented there cover significant products, services, and companies in many digital storage markets.

Copyright 2008 Distributed Computing Industry Association
This page last updated September 7, 2014