Distributed Computing Industry
Weekly Newsletter


September 15, 2014
Volume XLIX, Issue 8


Cloud Privacy & Data Security Webinar on Tuesday

The Distributed Computing Industry Association (DCIA) and Edwards Wildman Palmer will present "Cutting Edge Developments Affecting Cloud Companies in Privacy & Data Security," a one-hour webinar this Tuesday September 16th at 12:00 PM ET, featuring a discussion of key issues affecting privacy and security in data and high tech and a preview of what to expect at the CLOUD DEVELOPERS SUMMIT & EXPO 2014 (CDSE:2014).

Topics will include recent court decisions affecting data privacy, security protection and disclosure — what you need to know to survive; legislation, regulations and guidance that will further impact privacy, security and liability — how you should prepare now for coming changes; and practical suggestions for addressing the natural tension between providers and customers: why it's cost effective to take steps in advance to avoid future conflict.

Speakers will include Edwards Wildman's Lawrence Freedman, Thomas Smedinghoff, and Michael Bennett. The DCIA's Marty Lafferty will moderate.

Lawrence Freedman is a Partner in the firm's Washington, DC office and formerly the CEO of a communications/cloud company. Larry advises clients on a full range of strategic, contractual, and regulatory compliance issues, including data privacy and security issues, associated with the development and deployment of cloud computing strategies.

Thomas Smedinghoff is a Partner in the firm's Chicago office. He is internationally recognized for his leadership in addressing emerging legal issues regarding electronic transactions, identity management, privacy, information security, and online authentication, from both a transactional and public policy perspective.

Michael Bennett, a Partner in EWP's Chicago office, counsels clients on a variety of technology issues including big data, wireless communications, machine-to-machine wireless communications, in-bound and outbound sourcing, SaaS, PaaS, IaaS, and all aspects of cloud computing.

Marty Lafferty is CEO of the DCIA, a trade organization whose members include a range of companies involved in cloud computing and providing platforms for storage, transmission, and exhibition of content: software application developers, broadband network operators, and digital media rights holders. Prior to DCIA, Marty served in senior positions for some of the world's most innovative entertainment and technology companies.

Please click here to register.

CLOUD DEVELOPERS SUMMIT & EXPO in Two Weeks

The Cloud Computing Association (CCA) and the Distributed Computing Industry Association (DCIA) will present the CLOUD DEVELOPERS SUMMIT & EXPO 2014 (CDSE:2014) in Austin, TX on October 1st and 2nd at the Hilton Austin.

This year's summit and exposition will highlight the latest advances from the top-ten cloud brands: Amazon Web Services, Dell, Google, HP, IBM, Microsoft, NetSuite, Oracle, Rackspace, and SAP.

36 highly focused strategic and technical keynotes, breakout panels, and Q&A sessions will explore cloud computing solutions with ample opportunities for one-on-one networking.

Also featured will be innovators such as AD Vault, Adjacent Technologies, Aspera, Cerner, DigiCert, DirectTrust, Edwards Wildman, Front Porch Digital, Iron Mountain, JW Secure, Kinetic Concepts, Mediafly, Office of the National Coordinator for Health Information Technology (ONC), OnRamp, Paragon Technology Group, Prime Care Technologies, Qrhythm, RealEyes Media, Sense Corp., SoftServe, Talend, TechLabs, VDI Space, Xvand, and more.

18 co-located instructional workshops and special seminars will be facilitated by industry leading speakers and world-class technical trainers devoted to the unique challenges and opportunities for developers, programmers, and solutions architects.

Conference sponsors and exhibitors include IBM, SoftServe, Rackspace, Edwards Wildman, Paragon, Iron Mountain, and OutSystems.

The agenda will cover Mobile Cloud, DevOps, and Big Data as well as general interest cloud service topics with a special focus on 3 economic sectors experiencing the most cloud adoption: Media & Entertainment, Healthcare & Life Sciences, and Government & Military.

Please click here to register.

Report from CEO Marty Lafferty

Thank you to the more than eighty organizations that joined the Distributed Computing Industry Association (DCIA) in signing on to US Congressional letters that the Digital Due Process (DDP) coalition sent to Senate and House of Representatives leaders this week.

The DDP letters advocate reform of the seriously outdated Electronic Communications Privacy Act (ECPA) to protect data stored in the cloud.

Very significant progress has been made in recent months to garner support for HR 1852: The Email Privacy Act (EPA) in the House, and its companion in the Senate, S 607: Electronic Communications Privacy Act Amendments Act (ECPAAA), and we are now pressing for passage during this session of Congress.

The House letter reads as follows:

"We write to urge you to bring to the floor H.R. 1852, the bipartisan Yoder- ­Polis bill updating the Electronic Communications Privacy Act (ECPA).

Updating ECPA would respond to the deeply held concerns of Americans about their privacy.

H.R. 1852 would make it clear that the warrant standard of the U.S. Constitution applies to private digital information just as it applies to physical property.

The Yoder-Polis bill would aid American companies seeking to innovate and compete globally.

It would eliminate outdated discrepancies between the legal process for government access to data stored locally in one's home or office and the process for the same data stored with third parties in the Internet 'cloud.'

Consumers and businesses large and small are increasingly taking advantage of the efficiencies offered by web-based services.

American companies have been leaders in this field.

Yet ECPA, written in 1986, says that data stored in the cloud should be afforded less protection than data stored locally.

Removing uncertainty about the standards for government access to data stored online will encourage consumers and companies, including those outside the U.S., to utilize these services.

H.R. 1852 would not impede law enforcement.

The U.S. Department of Justice already follows the warrant-for-content rule of H.R. 1852.

The only resistance to reform comes from civil regulatory agencies that want an exception allowing them to obtain the content of customer documents and communications directly from third party service providers.

That would expand government power; government regulators currently cannot compel service providers to disclose their customers' communications.

It would prejudice the innovative services that we want to support, creating one procedure for data stored locally and a different one for data stored in the cloud.

For these reasons, we oppose a carve-out for regulatory agencies or other rules that would treat private data differently depending on the type of technology used to store it.

H.R. 1852 is co-sponsored by over 260 Members, including a majority of the majority.

We urge you to bring it to the floor.

We believe it would pass overwhelmingly, proving to Americans and the rest of the world that the U.S. legal system values privacy in the digital age.

Sincerely,

Adobe, ACT | The App Association, American Association of Law Libraries (AALL), American Civil Liberties Union (ACLU), American Library Association (ALA), Americans for Tax Reform, AOL, Apple, A Small Orange, Association of Research Libraries (ARL), Automattic, Autonet Mobile, Blacklight, Brennan Center for Justice at NYU Law School, BSA | The Software Alliance, Center for Democracy & Technology (CDT), Center for Financial Privacy and Human Rights (FPHR), Cheval Capital, CloudTech1, Code Guard, Coughlin Associates, Competitive Enterprise Institute (CEI), Computer & Communications Industry Association (CCIA), The Constitution Project, Council for Citizens Against Government Waste, Data Foundry, Digital Liberty, Direct Marketing Association (DMA), Disconnect, Discovery Institute, Distributed Computing Industry Association (DCIA), Dropbox, DuckDuckGo, Endurance International Group (EIG), Evernote, Electronic Frontier Foundation (EFF), Engine Advocacy, Facebook, Foursquare, FreedomWorks, Future of Privacy Forum (FPF), Gandi, Golden Frog, Google, Hewlett-Packard (HP), Information Technology Industry Council (ITI), The Internet Association, Intuit, Internet Infrastructure Coalition (i2Coalition), Kwaai Oak, Less Government, LinkedIn, Media Science International (MSI), Microsoft, NetChoice, New America's Open Technology Institute, Newspaper Association of America (NAA), Oracle, Peer1 Hosting, Personal, Rackspace, Records Preservation and Access Committee, R Street Institute, reddit, ScreenPlay, Servint, Software & Information Industry Association (SIIA), Symantec, Taxpayers Protection Alliance (TPA), Tech Assets, TechFreedom, TechNet, Tucows, Tumblr, Twitter, U.S. Chamber of Commerce, and Yahoo! Inc."

The Senate letter was presented to DCINFO readers here.

With HR 1852 and S 607, American lawmakers have the rare opportunity to update digital communications privacy for the 21st century by providing the same amount of privacy to data stored in the cloud as to information stored on premises.

This Wednesday September 17th is Constitution Day.

What a great opportunity to let your elected officials know that there is no better time to pass ECPA reform and affirm Americans' Fourth Amendment rights online.

By passing these measures, Congress can show it takes its constituents' privacy seriously — and that it can enact meaningful reform, which will level the playing field for the protection of electronic communications.

Share wisely, and take care.

US Telcos Shy Away from Benchmark Boost

Excerpted from ZDNet Report by Leon Spencer

US telecommunications giants AT&T and Verizon have called on the Federal Communications Commission (FCC) not to boost its definition of broadband to 10Mbps from its current benchmark of 4Mbps.

In submissions to the FCC's inquiry into advanced telecommunications deployment in the country, AT&T and Verizon have both told the Commission that the current data speed benchmark for broadband is adequate, given the online habits of most Americans.

"The Commission should not artificially narrow the definition of broadband to require certain capabilities (such as the ability to stream HD video to multiple users simultaneously), and should instead study the full range of services that consumers demand and the variety of services they are using to fill these varied needs," said AT&T in its comments to the inquiry, submitted late last week.

"The Commission should undertake a more rigorous, fact-based, and statutory analysis before determining what, if any, definitional revisions are warranted at this time. Even recognizing that the definition of broadband will evolve over time, the notice presents no record basis for a conclusion at this time that a service of less than 10Mbps is no longer 'advanced'," the company said.

The FCC has periodically raised the minimum standard for Internet service to be considered a broadband service, according to Ars Technica, determining in 2010 that the minimum speed to qualify as broadband should rise from 200Kbps to 4Mbps downstream and 1Mbps upstream.

However, according to AT&T, the current pace of the industry does not require a change in definition for what constitutes broadband.

"Given the pace at which the industry is investing in advanced capabilities, there is no present need to redefine 'advanced' capabilities and the proposed redefinition is not adequately supported," said AT&T in its submission.

If the FCC does amend the broadband definition benchmark, it would likely dramatically alter the broadband coverage rates that the country's service providers can claim, which, under the present definition, are relatively far-reaching.

As of 2012, 94 percent of Americans had access to fixed broadband services under the current definition benchmark, said AT&T, referencing the FCC's findings from that year. According to the company, 94 percent of the population also had access to mobile broadband as of 2012.

Meanwhile, Verizon argued in its submission that the FCC should make a point of keeping in mind the prevalence of wireless broadband as it considers raising the definition benchmark.

Verizon said that according to the National Telecommunications and Information Administration, which includes mobile broadband in its National Broadband Map, wireless broadband is now available to more than 98 percent of the population.

"As of the end of 2013, there were approximately 100 million 4G LTE subscribers in the US, which represents half of all worldwide 4G LTE connections," said Verizon in its comments, also submitted late last week.

Verizon also highlighted infrastructure development projects by AT&T and CenturyLink that have seen tens of millions of Americans hooked up to broadband — as it stands under the current definition.

"AT&T has stated that its proposed merger with DirecTV would enable the combined company to expand broadband deployment further still, 'to at least 15 million customer locations across 48 states, with most of those locations in under-served rural areas'," said Verizon.

"CenturyLink passes nearly 8 million homes with fiber to the node, and in August 2014 announced that symmetrical broadband speeds up to 1Gbps are now available to residential and business customers in select locations in 16 cities," the company said.

Saving Net Neutrality the BitTorrent Way

Excerpted from Bloomberg View Report by Leonid Bershidsky

Eric Klinker, the chief executive of BitTorrent, the file-sharing software company, has proposed -- only half-seriously -- an interesting solution to the net neutrality vs. bandwidth-hogging problem. Instead of having heavy traffic generators such as Netflix pay Internet providers for "fast lanes" to their customers, Klinker suggests that providers pay other companies to get into a "slow lane," shifting traffic to off-peak hours.

Klinker's idea is typical of net neutrality advocates' arguments: It is largely based on the essentially Communist argument that Internet providers should shut up about any extra charges because they have plenty of money, anyway. There is, however, a curious and potentially useful technical side to the proposal.

BitTorrent knows all about slow lanes. In 2011, it accounted for 13 percent of peak hour Internet traffic in North America. Now, according to the broadband-equipment firm Sandvine, its share is just 5.96 percent (which still makes it the third-biggest traffic generator after Netflix and YouTube). One explanation is that BitTorrent technology is widely used to share pirated movies and music, and in a world of cheap or even free streaming, that is an increasingly obsolete way to get one's hands on content. The other one is that, after Comcast shut off BitTorrent traffic in 2007 and 2008, the company made a conscious decision to stop being a peak-traffic hog.

It moved to the so-called Micro Transport Protocol, which shifts traffic to less congested times. A BitTorrent download will slow down when the network is relatively crowded and speed up when there's less demand for bandwidth. That affects not only pirate downloads but also legitimate uses of BitTorrent, including its popular Sync service -- cloud storage without the need for enormous data centers or the danger of government control.
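
The mechanism Klinker is pointing to is delay-based congestion control (the LEDBAT approach behind uTP): when queuing delay on the path rises, the transfer yields; when the path is idle, it speeds back up. Here is a minimal Python sketch of that behavior, with illustrative constants that are assumptions rather than BitTorrent's actual parameters:

# LEDBAT-style "yield to other traffic" rate control, sketched for
# illustration only; not BitTorrent's real implementation.
TARGET_DELAY_MS = 100.0   # queuing delay the transfer is willing to add
GAIN = 0.1                # how strongly the rate reacts to delay

def next_send_rate(current_rate_kbps, measured_delay_ms, base_delay_ms):
    """Raise the rate while the link is idle, back off as queues build."""
    queuing_delay = measured_delay_ms - base_delay_ms
    # Positive off_target means spare capacity; negative means the
    # transfer is adding more delay than intended, so it slows down
    # and foreground traffic (video calls, web pages) gets priority.
    off_target = (TARGET_DELAY_MS - queuing_delay) / TARGET_DELAY_MS
    new_rate = current_rate_kbps * (1.0 + GAIN * off_target)
    return max(new_rate, 10.0)  # keep a small floor so transfers finish

# Example: an idle evening link versus a congested peak hour.
print(next_send_rate(1000, measured_delay_ms=60, base_delay_ms=40))   # speeds up
print(next_send_rate(1000, measured_delay_ms=300, base_delay_ms=40))  # slows down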

So now Klinker says Internet providers should pay people who, like him, voluntarily renounce bandwidth hogging, just like electricity companies encourage people to use more power in off-peak hours. "This would relieve pressure on the network, yield a better experience for users and would be worth real money to the ISPs," he argues. "Additionally, there would be no unnatural pressure for the ISP to deliberately degrade the base service in order to manufacture demand for the priority service, as some have suggested might happen."

In fact, in the old days Internet providers did charge less for off-peak use. One can still find such plans in places including Nepal. Introducing "night tariffs" is a variation on the idea of broadband providers going back to the old practice of charging only for traffic used, which would be fair but inconvenient to users who want their bill to be predictable.

Klinker's proposal is only a rhetorical device: He's arguing that the problem of net congestion doesn't exist, so there's no need for Netflix to pay the providers or for providers to reward "slow lane" customers. "It's time to stop engaging in this kind of zero-sum thinking, pitting one user of the network against another," Klinker writes. "There's no scarcity." As with many net neutrality advocates, he refers to Internet providers' fat gross margins to prove his point.

It's true that bandwidth is not really scarce. According to the Federal Communications Commission's 2014 report on U.S. broadband performance, providers now routinely exceed their advertised speeds, which wasn't the case a year earlier. That kind of generosity would be impossible if there were a capacity shortage. The profits argument, however, remains ignorant or disingenuous. To make the increases in speed possible, providers have to maintain and upgrade their networks. The investment this requires does not affect gross margins. Comcast in 2013 reported capital expenditure of $8.5 billion, more than its net income for the year and about 13 percent of revenue.

It's wrong to expect Internet providers to keep paying for equipment that allows us to watch more video, play "heavy" games over the Internet or increase our use of the cloud. Now, they want heavy traffic generators to chip in, and the FCC appears to support them. Perhaps, however, Klinker is right about revisiting old traffic-based payment plans. I think it would be misleading to most consumers, but having a discussion about how we want to be charged would help figure out how much people really care about net neutrality.

If we truly want a neutral Internet in which all traffic is created equal, we should be happy to pay for our actual usage. That would be fair, and providers would not need to create fast or slow lanes. If, on the other hand, we want the convenience of a fixed payment, net neutrality is no more than a meaningless fetish.

How to Completely Decentralize the Internet

Excerpted from BitCoin Magazine Report by Andrew Wagner

Few technologies have been as socially disruptive as the Internet. Before computers, reaching a wide audience required control of printing or broadcasting centers. These have been replaced by home computers, which people worldwide are gaining access to at a phenomenal rate. Message boards, blogs, and other websites enabled the two-way flow of information on a massive scale, and through the use of liberating new innovations, we can decentralize the Internet completely.

Peer-to-peer (P2P) networks are the best example. Previously, files were distributed via dedicated machines designed to handle a massive number of requests; now, one can share files with just a few "peers," who share them with a few more people (and so forth), until anyone can gain access via a branching web of connections. Who has which chunks of data is tracked via "torrents," which clients like FrostWire use to upload and download the desired files. Without a central server to confiscate, it's impossible to find and remove "bad" content or evidence of corruption.
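
To make the chunk-tracking idea concrete, here is a minimal sketch of how a file can be cut into fixed-size pieces and fingerprinted, roughly the role a torrent's piece list plays; the piece size and structure are illustrative assumptions, not the BitTorrent specification.

import hashlib

PIECE_SIZE = 256 * 1024  # 256 KB pieces; real torrents pick a size per file

def piece_hashes(path):
    """Split a file into fixed-size pieces and hash each one.

    Peers can then advertise, exchange, and verify individual pieces
    instead of fetching the whole file from one central server.
    """
    hashes = []
    with open(path, "rb") as f:
        while True:
            piece = f.read(PIECE_SIZE)
            if not piece:
                break
            hashes.append(hashlib.sha1(piece).hexdigest())
    return hashes

# Example: print(piece_hashes("some_large_file.iso")[:3])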

If we use encryption and make these networks complex enough, it can be almost impossible to trace the source of data, rendering it very difficult to block access to "undesirable" news sites and content. This property of P2P networks is utilized by programs like Tor, which funnels data (most commonly webpages) through a long and confusing series of nodes. This makes finding and punishing those who break censorship laws a nightmare, resulting in the further erosion of central authority over online communication.

Now with the advent of cryptocurrencies like Bitcoin, we can use decentralized networks to send money, as well. Instead of music, video, or similar files, we send transactions that transfer ownership from one user to another. Unlike file-sharing networks, however, where ownership is not a concern, this required the advent of blockchain technology, which can keep track of who owns what without the need for an arbiter or judge. Besides that, the underlying concept and digital architecture are identical.

As new inventions allow more things to be transmitted over the Internet via P2P networks, the scope of this decentralization will only increase. Despite these improvements, however, most online traffic is still handled by central servers, and almost everyone on the World Wide Web uses the Domain Name System, which is controlled by an American non-profit corporation. The physical infrastructure is composed mostly of wires owned by monopolistic telecom businesses, and if we want to decentralize society any further, these shackles must be removed.

Cryptocurrency has already done a great job of decentralizing the domain name system. Using Namecoin, one can register .bit domain names directly on the blockchain; whoever possesses the private key in control of the domain name provides the IP address to which it forwards. This is a step in the right direction, but it only liberates one aspect of the web, ignoring key problems like how online data is stored and delivered. We need a decentralized digital architecture that can handle all online traffic, not just file downloads and domain name forwarding, or else our dreams of a decentralized Internet are just that.
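
As a concrete illustration of how a .bit lookup can work, the sketch below asks a local Namecoin node, over its JSON-RPC interface, for the record stored under a d/ name and reads the IP address it forwards to. The node URL, credentials, and record layout here are placeholder assumptions for the example, not a production resolver.

import base64
import json
import urllib.request

RPC_URL = "http://127.0.0.1:8336/"                            # local namecoind
RPC_AUTH = base64.b64encode(b"rpcuser:rpcpassword").decode()  # placeholder credentials

def resolve_bit_domain(name):
    """Return the IP recorded for d/<name>, if the record carries one.

    Whoever controls the name's private key controls this record, so
    there is no central registry to seize or rewrite.
    """
    payload = json.dumps({"method": "name_show", "params": ["d/" + name], "id": 1})
    req = urllib.request.Request(RPC_URL, data=payload.encode(),
                                 headers={"Content-Type": "application/json",
                                          "Authorization": "Basic " + RPC_AUTH})
    with urllib.request.urlopen(req) as resp:
        record = json.loads(resp.read())["result"]
    value = json.loads(record["value"])  # e.g. {"ip": "203.0.113.7"}
    return value.get("ip")

# Example (requires a running namecoind with RPC enabled):
# print(resolve_bit_domain("example"))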

P2P technology can solve this problem, as well: if it's possible to store and send files or money using a P2P network, it should be possible to do so with any type of information. Rather than using a central server to distribute things like web pages, application data, or files stored on the cloud, we can download that content in pieces from various computers on the P2P network. Constantly-updated copies of this data will be distributed across all of these peers in encrypted form, ensuring safe, accurate, and continuous access.

The first of these systems is called MaidSafe, and its development began years before Bitcoin went public. Anyone running this open source program becomes part of the SAFE Network, and some participants volunteer to become "vaults." All data on the SAFE Network is stored across these vaults in an encrypted format, which can only be decrypted using the private key that uploaded the data, or one to which permission has been granted. The network stores a total of exactly four full copies of this data at all times, and randomly assigns processing tasks, like determining and validating the locations of these chunks, to all of the nodes. Nearby nodes are organized into groups, which watch one another and will eject a node that misbehaves.
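
The storage behavior described above, encrypted chunks kept as a fixed number of copies on randomly chosen vaults, can be sketched generically as follows. This is not MaidSafe's self-encryption code; the XOR keystream is only a stand-in to show that vaults hold ciphertext they cannot read.

import hashlib
import os
import random

REPLICAS = 4  # the SAFE Network keeps four full copies of every chunk

def store_chunk(chunk, key, vaults):
    """Encrypt a chunk and place copies on randomly chosen vaults."""
    keystream = hashlib.sha256(key).digest()
    ciphertext = bytes(b ^ keystream[i % len(keystream)]
                       for i, b in enumerate(chunk))  # stand-in, not real crypto
    chunk_id = hashlib.sha256(ciphertext).hexdigest()
    chosen = random.sample(vaults, min(REPLICAS, len(vaults)))
    for vault in chosen:
        vault.setdefault("chunks", {})[chunk_id] = ciphertext
    return chunk_id, [v["name"] for v in chosen]

# Example: ten hypothetical vaults, one chunk, four copies.
vaults = [{"name": "vault-%d" % i} for i in range(10)]
print(store_chunk(b"some application data", os.urandom(32), vaults))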

Nodes are incentivized to become vaults by Safecoin, which is awarded in proportion to the resources they contribute to the network, most of which is storage space. You consume safecoins by using resources, and they can be used to purchase goods, services, or other digital currencies. Unlike cryptocurrencies, however, they are not based on a blockchain; account balances are stored on a ledger distributed across network vaults along with the rest of the network data. They can be exchanged via the Mastercoin protocol.

Theoretically, one could host or operate any type of website or application this way. One of the most notable applications to take advantage of this opportunity so far is the API Network, which provides a new means of distributing and calling APIs. For those who aren't yet familiar with APIs, you can learn about the process in one of our previous articles. Although its native coin—XAP—is stored on the Bitcoin blockchain via Mastercoin, the API Network uses the SAFE Network to store API data and call it upon request, which rewards XAP to the API provider. This decentralizes access to things like Google Maps, cryptocurrency price data, and various useful web apps.

Storj is a more recent open source platform, and the winner of the Texas Bitcoin Conference Hackathon. Like MaidSafe, it enables a P2P network that can store and transmit a wide variety of information. Nodes support the network by running the DriveShare application, which rewards users with Storjcoin X for storing encrypted chunks of data uploaded to the network. They operate on the Counterparty protocol on top of the Bitcoin blockchain, which allows them to be exchanged for other coins, used for commerce, or spent on other Storj applications.

The main application for which they're famous is called Metadisk, and it completely decentralizes cloud storage. Instead of being uploaded to a central server, files you want to store online are uploaded to the Storj network and stored by those running the DriveShare program. If you're not running DriveShare yourself, you'll have to earn Storjcoin from elsewhere to pay for this service; compared to competitors like Dropbox, however, the price is insignificant. As a bonus, your information cannot be accessed by third parties without your consent — something which Dropbox cannot claim.

Being a post-Satoshi platform, it should come as no surprise that Storj uses the blockchain to keep track of all this. Bitcoin 2.0 platforms like Counterparty allow one to embed more than just financial information in transactions, storing all kinds of data in blocks. Similar to how Namecoin can keep track of who owns what domain name, and projects like Ethereum can assign other property and assets, Storjcoin X stores information about who can access what data, and where it is at any time. Transactions are validated by Bitcoin miners who choose to register Counterparty transactions in return for a small fee, thus avoiding the problem of consensus.

These technologies will remove the need for centralized networks and servers. We now have a new way of thinking about how the Internet should work, and all of the protocols necessary to make that a reality. One thing that the open source and hacktivist communities cannot easily replace, however, is the physical infrastructure itself: the wires that carry the data sent between nodes are still owned by corporations and governments, which can monitor, restrict, or block your online activities. Even if you've managed to build your own cable lines, if you want to communicate with the rest of us, you have to go through one of the central hubs on which most of us rely — even if we're all using Storj and avoiding central servers.

The answer lies in mesh networking. The router that currently handles all of your online traffic operates under the assumption that it's part of a hierarchy: it forwards your requests to and from the machine one level above it, which routes them to and from either another nearby machine or an even bigger hub, which routes massive amounts of data. Instead, the protocols behind mesh networking assume the computers are all connected to each other — either directly or through other Internet users — without any hubs in between.
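
The difference is easy to see in a small sketch: with mesh routing, a request finds a path across whatever direct peer links exist, and no upstream hub is required. The topology below is invented for the example.

from collections import deque

def mesh_route(links, source, target):
    """Breadth-first search for a path over direct peer-to-peer links.

    Every participant both sends its own traffic and forwards for its
    neighbors; no single upstream hub has to exist or cooperate.
    """
    frontier = deque([[source]])
    seen = {source}
    while frontier:
        path = frontier.popleft()
        node = path[-1]
        if node == target:
            return path
        for neighbor in links.get(node, ()):
            if neighbor not in seen:
                seen.add(neighbor)
                frontier.append(path + [neighbor])
    return None  # the mesh is partitioned

# Hypothetical neighborhood: alice and dana have no direct link, but
# traffic can be relayed through the peers in between.
links = {
    "alice": ["bob"], "bob": ["alice", "carol"],
    "carol": ["bob", "dana"], "dana": ["carol"],
}
print(mesh_route(links, "alice", "dana"))  # ['alice', 'bob', 'carol', 'dana']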

No node is likely to be overwhelmed if we operate as a P2P network like MaidSafe or Storj. The main drawback is that this requires all users to carry other users' traffic, which costs computer resources and bandwidth. The best ideas for incentivizing participation so far would incorporate cryptocurrency, either by rewarding coins to those who route more traffic than they generate or by charging a fee to those who don't. The beauty of this solution is that it decentralizes the communications industry, which by its nature is prone to monopoly — conglomerates like Comcast, Verizon, or Shaw in North America would become obsolete.

Unfortunately, there are technical limitations to this. Laying cable lines is rather expensive; we would have to tear up concrete and pavement each time someone moved and lines needed to be relocated or upgraded. As wireless technology advances, however, the price of powerful WiFi routers will reach a point where they can effectively replace copper wires for the middle class in relatively urban areas. Instead of connecting to a modem installed by your Internet service provider (ISP), these routers connect directly to each other, or to long-range routers designed to reach across unpopulated terrain. Anyone connected to an ISP can act as a gateway, allowing others to reach content left behind in the legacy system.

Once all that has been accomplished, the only point of vulnerability is the manufacturer. One day, we will be able to 3D print our own wireless routers, using open source blueprints free from any intentional security vulnerabilities. For now, however, a wide selection of wireless routers fit for the job is already available online, for anyone dedicated enough to help get the meshnet started. Rumor has it that the meshnet in Seattle is well underway, but our meshnet project in Vancouver appears to have stalled; I'm hoping to start contributing as soon as it starts up again. We can free ourselves from the bindings of cable companies as well as the government.

Cloud Threat Intelligence: The Next Big Thing?

Excerpted from TechTarget Report by Rob Wright

With FireEye's announcement of a new threat analytics platform for Amazon Web Services (AWS), threat intelligence for the cloud is now becoming a reality. But will cloud-based threat analytics systems displace traditional security information and event management products and threat analytics systems in the near future?

Announced last week, FireEye's threat analytics platform (TAP) for AWS is the first of its kind because, according to Milpitas, CA-based FireEye, the product was built natively on Amazon's cloud and combines FireEye's threat intelligence with event monitoring and analytics across AWS as well as a client's on-premise IT environment.

"We wanted to build a threat analytics platform that CISOs thought was missing for the cloud," Grady Summers, Vice President of Strategic Solutions at FireEye, said. "It combines the intelligence you need to detect emerging or unknown threats with the speed with which you need to react and the context you need to understand the threats."

A key attribute of FireEye's TAP is that it's integrated with the AWS CloudTrail web service and can monitor all AWS API calls made on a customer's account. The TAP product analyzes all CloudTrail data for any anomalous behavior or potential threats. In addition, the TAP for AWS can also integrate, index and analyze all of an enterprise's internal data.
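
FireEye has not published TAP's internals, but the kind of CloudTrail data it consumes is straightforward to pull. The sketch below uses the standard boto3 client to fetch recent console-login events and flag failed attempts; the chosen event and the trivial "anomaly" rule are illustrative assumptions, not FireEye's analytics.

import json
from datetime import datetime, timedelta

import boto3

cloudtrail = boto3.client("cloudtrail")

def recent_failed_console_logins(hours=24):
    """Return (user, source IP) pairs for ConsoleLogin events that failed."""
    end = datetime.utcnow()
    start = end - timedelta(hours=hours)
    resp = cloudtrail.lookup_events(
        LookupAttributes=[{"AttributeKey": "EventName",
                           "AttributeValue": "ConsoleLogin"}],
        StartTime=start,
        EndTime=end,
    )
    suspicious = []
    for event in resp.get("Events", []):
        detail = json.loads(event["CloudTrailEvent"])
        outcome = (detail.get("responseElements") or {}).get("ConsoleLogin")
        if outcome != "Success":
            suspicious.append((event.get("Username"), detail.get("sourceIPAddress")))
    return suspicious

# Example (requires AWS credentials with CloudTrail read access):
# print(recent_failed_console_logins())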

"TAP can consume any type of data from the enterprise and apply that data to the threat analysis," Summers said.

But that enterprise data angle is a sticking point for many enterprises, according to Mike Rothman, analyst and President of Phoenix, AZ-based infosec consultancy Securosis.

"There are a lot of folks that have a big philosophical challenge with sending their enterprise data to the cloud," Rothman said, "even if it's for security."

Rothman said FireEye's TAP for AWS, like many other cloud-based security products, has the kind of architecture that addresses many of the questions about where enterprise data is stored, how it's used and how it's then disposed of. But he said many enterprises still prefer to keep their data on premises, no matter what safeguards or reassurances a vendor can offer.


FireEye, however, hopes that the benefits of its cloud-based TAP system, particularly the scalability of the cloud and the cost savings compared to on-premise software and hardware, will entice more enterprises to make that leap.

For example, Summers said, one of FireEye's customers is an entertainment company that sells tickets for events. It could no longer rely on legacy, on-premise security information and event management (SIEM) systems for threat analytics because of the hardware costs of keeping up with its regular spikes in web traffic and ordering. Summers said it made sense for the company to use a cloud model that could accommodate the fluctuations and growth in the enterprise data and threat intelligence that needed to be analyzed.

"The cloud's flexibility is very helpful in terms of dealing with the growing amount of threat intel and data out there," Summers said.

Those advantages, coupled with the CloudTrail integration, are good selling points for FireEye, Rothman said. "The FireEye TAP solution is important because companies need visibility for what they're doing in AWS," he said.

But businesses that have regulatory and compliance concerns -- which Rothman said are not always legitimate -- are unlikely to be swayed by those cloud advantages unless they have a sizable portion of their environment in the cloud already and have overcome the philosophical challenges.

"If a lot of an enterprise's computing is done in the cloud, then they'll go with a cloud-based security solution like this," Rothman said. "But if it's not, then I don't think the benefits will convince those enterprises to move to cloud security."

FireEye is optimistic, however, that the lure of cost-effective cloud services on AWS and an agile, scalable threat analytics solution will help convince customers to move away from legacy SIEM systems or at least add a cloud-based threat analytics solution to their defenses.

Summers said the majority of FireEye clients still have some kind of legacy SIEM systems in place, but that may be changing.

"The initial data we've seen this year indicates a big change coming," Summers said. "I think we're going to see more SIEM deployments in the cloud and more TAP solutions on top of those systems as well."

Rothman said it's too early to tell how enterprises will adopt threat analytics platforms in the cloud, but he agreed with FireEye that cloud-based security products and services are trending upward.

"Is this going to totally disrupt the SIEM market? I think it's going to take a couple years," he said. "But there's going to be an increasing amount of IT being done in the cloud, and that includes security."

Tech Trends that Will Change the Legal Industry

Excerpted from Sys-Con Media Report by Todd Scallan

From the way legal teams prepare for trial to how they communicate with clients and other professionals, technology is quickly becoming an influential part of the practice. However, even in today's technology-driven world, not all firms can label themselves tech-savvy. To help those firms integrate technology into their businesses, this article explores the top five technology trends presented in the American Bar Association's 2013 Tech Report to keep an eye on for the future of successful law firms and professionals.

Mobile Usage on the Rise: With 79 percent of small firms using smartphones for work purposes, and 45 percent using tablets, the recent mobility explosion has caused exponential growth in mobile usage in the legal industry. Mobile usage allows lawyers to be...well, mobile. With everything attorneys need from calendars to documents to emails at the tips of their fingers (or bottoms of their pockets), these devices now allow them to work on anything, anytime, anywhere.

Additionally, mobile devices are changing the way lawyers work on each case from start to finish. As Dallas-based lawyer Tom Mighell says in a 2013 Illinois Bar Journal article, "You can do client intake on an iPad and take notes on it when you're meeting, tally deadlines, review documents, review and take depositions, review transcripts, conduct jury selections and you have access to a number of trial presentation apps to help you present evidence in the courtroom." In other words, everything you need to be a successful lawyer can now live in the palm of your hand.

However, as mobile usage continues to grow in both large and small firms, so do the security concerns that surround each device, so make sure to establish mobile device security policies that determine how mobile devices can and should be used in the workplace.

Cloud Computing Set to Increase: Law firms are now gravitating toward cloud computing because it gives them the ability to access information and communicate anywhere, anytime - an advantage in a profession that requires employees to work outside of the office. In fact, 29 percent of solo and small firms reported using cloud computing, and nearly 15 percent of firms with more than 500 attorneys have reported utilizing some form of cloud computing as well. Law firms find the cloud beneficial because it doesn't require a large capital investment in hardware and infrastructure. However, security challenges regarding the cloud should always be top of mind for businesses.

Online Research Made Easy: In the past, lawyers had to physically spend hours in libraries conducting research; today is a completely different story. With many print resources going digital and the availability of e-books, more legal research can be conducted online - saving employees a significant amount of time.

Social Media: Networking has always been a big part of the legal profession, but the availability of numerous online communication channels will change the way attorneys connect with each other and with clients. Additionally, social media outlets act as free marketing tools, allowing small and large firms to spread the word out about their businesses to a larger audience, potentially bringing in more clients.

Additionally, courtroom coverage via social media and blogs will also change the legal industry, allowing attorneys and the public to follow trials online.

Data Protection and Business Continuity Planning: As technology use increases, so, too, does the need for data protection and business continuity planning. Law firms handle massive amounts of sensitive data, and any breach or loss of access can set a firm up for disaster. New technologies for Disaster Recovery, also called Recovery-as-a-Service, will help law firms implement a business continuity strategy with minimum complexity and time commitment. This gives firms peace of mind, while ensuring that valuable data and applications are protected and that employees can continue working even in the face of an IT outage or natural disaster.

What do these trends mean for your legal practice? Attorneys must educate themselves about how to secure data, thwart security threats and avoid practice interruption in case a disaster strikes. While technology provides great opportunities for the legal profession, it also brings risks that must be dealt with in order for firms to stay in compliance and maintain their business.

We've seen firsthand the impact that application downtime, be it due to a virus, a computer crash, or a natural disaster, has on firms that are not fully prepared. As you look at the technology trends for law firms and plan your IT budget accordingly, make sure you also follow these five simple steps to ensure your staff will remain productive, even if there is an event that could potentially disrupt your technology environment:

1. Create a disaster recovery plan: This could be as simple as a listing of all critical phone numbers and the contact information of all employees together with key processes to be followed in case the office needs to be evacuated.

2. Protect your critical data: Running out of the office with tape backups or hard drives under your arm because the sprinklers suddenly came alive is not good protection. Implement a data protection solution that securely stores your data off-site.

3. Ensure application continuity: Look for a solution that allows your employees to continue accessing key applications like your case management system, financials and CRM even if your office is unavailable.

4. Test frequently: Ensure your disaster recovery procedures are current and actually work by testing them at least quarterly. Involve different areas of the company and test not only steps for restoring data, but for application and server virtualization as well.

5. Make disaster recovery a priority: To ensure proper procedures and solutions will be implemented, make sure that disaster recovery is a top priority by partners and other senior staff. The cost of application downtime, which impacts billable hours and completing projects on time, is so great that making DR a priority should be top of mind.

As you begin using technology as an integral part of your practice, now is a good time to put your IT house in order and take proactive and preemptive steps towards avoiding practice interruption.

Apple Beefs Up iCloud Security

Excerpted from Top Tech News Report by Jennifer LeClaire

Tech giant Apple is taking seriously the iCloud hack that revealed naked selfies of various celebrities. The iPhone-maker plans to roll out new security measures to keep its users, whether celebrities or everyday Joes, safe.

Apple CEO Tim Cook told the Wall Street Journal the company will alert users via e-mail and push notifications when someone tries to change an account password, restore iCloud data to a new device, or log into an account from a device for the first time. Apple also plans to implement two-factor authentication, which would require hackers to have at least two pieces of information the user provided when signing up for the account, such as a one-time code, a password, or a long access key.
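
Apple has not detailed its implementation, but one common form of second factor is a time-based one-time password (TOTP, RFC 6238): the service and the user's device share a secret, and each independently derives a short-lived code from it. A minimal standard-library sketch, purely as an illustration of the idea:

import base64
import hashlib
import hmac
import struct
import time

def totp(shared_secret_b32, period=30, digits=6):
    """Derive the current time-based one-time password from a shared secret.

    A stolen password alone is no longer enough; an attacker would also
    need the device that holds this secret.
    """
    key = base64.b32decode(shared_secret_b32, casefold=True)
    counter = int(time.time()) // period
    mac = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F
    code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

# Example with a throwaway secret; server and device compute the same code.
print(totp("JBSWY3DPEHPK3PXP"))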

"When I step back from this terrible scenario that happened and say what more could we have done, I think about the awareness piece," he told the Journal. "I think we have a responsibility to ratchet that up. That's not really an engineering thing."

We caught up with Mike Davis, CTO at real-time endpoint threat detection firm CounterTack, to get his thoughts on Apple's moves. He told us it's great to see the company taking security more seriously than before. However, he added, what Apple is doing isn't enough.

"Apple, with its estimated 300 million-plus users, is not just a 'cloud service.' They have become like Facebook or LinkedIn in that they are critical to the identity of many users around the world," Davis said. "Your Apple ID allows you to save files, spend money and purchase applications, and even buy iTunes gift cards."

Indeed, your Apple ID is just as powerful as your bank ID in many cases, yet Davis argues Apple is taking the stance that its security is not as important as the security of a bank or other large financial institution. He said this could be because Apple is not under any regulatory or compliance requirements like banks and other institutions.

"If you asked my wife, an avid Apple fan, she would probably be more upset her Apple account was compromised than her bank account because she knows she has fraud protection in place with the bank, but has no such confidence with Apple because they don't communicate to her what they are doing to protect her," Davis said.

What Should Apple Really Do?

As Davis sees it, two-factor authentication is a good first step -- a step Apple should have taken a long time ago. He rightly pointed out that LinkedIn, Twitter, and thousands of other online cloud providers have had two-factor authentication for years. And he also pointed out that two-factor authentication won't prevent other attacks -- it only helps reduce the risk of one type of threat.

"The issue Tim alluded to really is the right issue Apple should be solving: awareness. Apple's approach to technology, the proverbial walled garden, is anathema to security in general as it focuses on 'less is more,' 'don't overload the user with too much information about what is happening,' and just 'make it work,'" Davis said. "Yet as a user you do want to know when your account is being used improperly, or by a device that shouldn't -- and you should know immediately, not just via an e-mail. Send me a phone call, a text, some immediate way so that e-mail doesn't get missed or tossed in spam."

Davis' conclusion: Apple has to step up and realize it is now a tier 1 cloud provider -- and even though the company is not under any regulatory requirements to secure customers' data, it must implement the security controls that other tier 1 providers have or else risk massive brand -- and ultimately revenue -- impact.

IBM Offers More Security to Cloud Computing

Excerpted from iStreet Research Report by Suchetana Technology

IBM has taken another step toward securing data and information with a new cloud service: IBM announced that SoftLayer will now provide customers with added security and hardware monitoring. The cloud will run on Intel Trusted Execution Technology (TXT).

Intel's technology is known for its reliability and strength. Workloads operate on attested, trusted hardware, which helps with compliance for large trusted resources across platforms and frameworks. IBM says the technology will be especially helpful for financial, healthcare, and government organizations.

The service will certify cloud workloads and exposures in line with audit and compliance policies. With it, IBM has again taken a large step toward securing large enterprises whose data is spread across many frameworks and platforms.

Encrypted Cloud Data? Control Your Own Keys

Excerpted from Network World Report by Linda Musthaler

With cloud computing there's no longer a question about whether you should encrypt data. That's a given. The question today is, who should manage and control the encryption keys?

Whether talking to an infrastructure provider like Amazon or Microsoft, or a SaaS provider, it's imperative to have the discussion about key control. The topic is more relevant than ever as more companies move regulated data into the cloud and as concerns about data privacy grow.

Protecting regulated data is top-of-mind in the US where regulations such as PCI and HIPAA dictate that third parties not be able to access an organization's sensitive data. Even if the data is strongly encrypted, it's a compliance compromise if a cloud service provider has access to a full key that can decrypt the information without the data owner's knowledge or permission.

European countries, especially Germany and France, are more concerned with data privacy. They are troubled by the fact that US-based cloud vendors can be subpoenaed by the US government to provide access to specific information, even if it resides outside the United States. Last April, Microsoft was ordered to hand over a customer's emails to US authorities, even though the data was held in a data center in Ireland. If Microsoft also held the data's encryption key, the vendor could be compelled to provide that to authorities as well.

When it comes to processing and storing data in the cloud, organizations need to control their own encryption keys. What's more, this ownership must be established before contracting for a cloud application or platform.

One key management and encryption company, Porticor, has an interesting way to address these issues. When we first introduced you to Porticor as a startup company in 2012, we mentioned the company uses a split-key approach to key management. This approach has gained a lot of traction in the past two years, with a significant partnership with HP validating the notion of a "safe deposit box" for encryption keys that puts the customer in control.

Porticor provides both encryption schemes and key management technology, but it is the latter that is the distinct service offering. Porticor's Virtual Private Data (VPD) solution is a cloud-based virtual appliance. The encryption engine and the key management function are software based and hosted in the cloud, allowing the solution to become part of the cloud infrastructure for platforms (e.g., AWS, VMware, HP Cloud Services, etc.) and for SaaS offerings.

According to Porticor CEO Gilad Parann-Nissany, the company has two customer segments. One is the end user organization that is deploying its applications on AWS or a similar cloud infrastructure. The other is SaaS providers who want to offer their customers a range of encryption schemes and, most importantly, the ability for those end customers to control their own keys.

In developing its Virtual Key Management Service, Porticor followed the principle of a bank safe deposit box. When data in the cloud is encrypted, the key is split such that Porticor holds one part of the key and the customer holds the other—the master key. As with a safe deposit box, the customer can't decrypt the data without the key held by Porticor, and Porticor can't decrypt the data without access to the customer's master key. The keys must pair to provide access to the encrypted data, thus putting the user in control of the data. To further enhance security, the keys themselves are encrypted by the customer's master key.
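
The safe-deposit-box idea reduces to a simple two-share split: neither share alone reveals the data key, and both must be combined to decrypt. The sketch below uses an XOR split purely to show the shape of the scheme; it is not Porticor's implementation, and a real service would also encrypt and manage the shares themselves.

import os

def split_key(data_key):
    """Split a data-encryption key into a customer share and a provider share.

    Either share alone is indistinguishable from random noise; only the
    XOR of both recovers the key, like the two keys of a deposit box.
    """
    customer_share = os.urandom(len(data_key))
    provider_share = bytes(a ^ b for a, b in zip(data_key, customer_share))
    return customer_share, provider_share

def recombine(customer_share, provider_share):
    return bytes(a ^ b for a, b in zip(customer_share, provider_share))

# Example: a 256-bit key protecting some disk or database volume.
data_key = os.urandom(32)
cust, prov = split_key(data_key)
assert recombine(cust, prov) == data_key      # both shares together work
assert cust != data_key and prov != data_key  # neither share alone is the key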

This solution has been designed to basically snap into cloud infrastructures, so it is apparently possible to bring up secure encrypted disks in a matter of minutes and entire database systems in a matter of hours. Porticor makes extensive use of APIs and offers RESTful APIs in order to integrate with cloud systems and applications.

In addition, Porticor's solution can work on multiple levels. For example, customers can encrypt a complete database or a complete file store, and at the same time they can get granular in order to encrypt a single field of an application. Porticor's customers often use these capabilities in tandem to address a specific need. This multi-level capability is especially important for SaaS providers that want to enable users to encrypt, say, a field containing a credit card number, but not necessarily the entire database. Moreover, different encryption schemes can be applied to each element that is being encrypted; for example, order-preserving encryption will be applied to the ZIP code field.
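
As a generic illustration of field-level encryption (not Porticor's product), the sketch below encrypts only the sensitive field of a record and leaves the rest usable for queries. It relies on the Fernet recipe from the Python cryptography package, and the locally generated key is a deliberate simplification of how a managed key service would work.

from cryptography.fernet import Fernet

field_key = Fernet.generate_key()  # in practice this key would be split and managed
cipher = Fernet(field_key)

def protect_order(order):
    """Encrypt just the card number; ZIP and amount stay in the clear."""
    protected = dict(order)
    protected["card_number"] = cipher.encrypt(order["card_number"].encode()).decode()
    return protected

def reveal_card(protected):
    """Decrypt the card number for a caller holding the field key."""
    return cipher.decrypt(protected["card_number"].encode()).decode()

order = {"card_number": "4111111111111111", "zip": "78701", "amount": 42.50}
stored = protect_order(order)
print(stored["zip"], stored["card_number"][:16] + "...")
print(reveal_card(stored))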

Porticor's encryption and key management approach received quite a boost when HP selected the vendor to partner with for its own cloud-based Atalla security solution. Porticor's technology has been integrated into the HP stack to provide secure cloud encryption. An HP cloud encryption customer can now automatically store their part of the encryption key — the master key — directly into a FIPS Level II compliant hardware security module that is part of the Atalla security system.

The imperative for encryption for data in the cloud grows stronger every day—for security, for compliance, for privacy, and for peace of mind. Organizations that are putting their data in the cloud need options in which they control the encryption keys. Porticor's cloud-based Virtual Private Data system addresses those needs at the infrastructure level to reduce complexity while providing strong security.

Coming Events of Interest

Cutting Edge Developments Affecting Cloud Companies in Privacy & Data Security — September 16th at 12:00 PM ET. One-hour webinar by the DCIA and Edwards Wildman Palmer featuring a discussion of key issues affecting privacy and security in data and high tech and a preview of what to expect at CDSE:2014.

Cloud Connect China — September 16th-18th in Shanghai, China. This event brand was established in Silicon Valley (US) in 2008. Last year it was introduced into China for the first time, providing comprehensive cloud computing coverage through paid conferences and an exhibition.

International Conference on Internet and Distributed Computing Systems — September 22nd in Calabria, Italy. The IDCS 2014 conference is the sixth in its series to promote research in diverse fields related to Internet and distributed computing systems. The emergence of the web as a ubiquitous platform for innovation has laid the foundation for the rapid growth of the Internet.

CLOUD DEVELOPERS SUMMIT & EXPO 2014 — October 1st-2nd in Austin, TX. CDSE:2014 will feature co-located instructional workshops and conference sessions on six tracks facilitated by more than one-hundred industry leading speakers and world-class technical trainers.

IEEE International Conference on Cloud Computing for Emerging Markets — October 15th-17th in Bangalore, India. The third annual CCEM will address the unique challenges and opportunities of cloud computing for emerging markets in a high-quality event that brings together industry, government, and academic leaders in cloud computing.

CloudComp 2014 — October 19th-21st in Guilin, China. The fifth annual international conference on cloud computing. The event is endorsed by the European Alliance for Innovation, a leading community-based organization devoted to the advancement of innovation in the field of ICT.

International Conference on Cloud Computing Research & Innovation — October 29th-30th in Singapore. ICCRI:2014 covers a wide range of research interests and innovative applications in cloud computing and related topics. The unique mix of R&D, end-user, and industry audience members promises interesting discussion, networking, and business opportunities in translational research & development. 

GOTO Berlin 2014 Conference — November 5th-7th in Berlin, Germany. GOTO Berlin is an enterprise software development conference designed for team leads, architects, and project managers, and is organized "for developers by developers." New technology and trends are presented in a non-vendor forum.

PDCAT 2014 — December 9th-11th in Hong Kong. The 16th International Conference on Parallel and Distributed Computing, Applications and Technologies (PDCAT 2014) is a major forum for scientists, engineers, and practitioners throughout the world to present their latest research, results, ideas, developments and applications in all areas of parallel and distributed computing.

Storage Visions Conference — January 4th-5th in Las Vegas, NV. The fourteenth annual conference theme is: Storage with Intense Network Growth (SWING). Storage Visions Awards presented there cover significant products, services, and companies in many digital storage markets.
