Distributed Computing Industry
Weekly Newsletter

Partners & Sponsors

ABI Research

Acolyst

Amazon Web Services

Apptix

Aspiryon

Axios Systems

Clear Government Solutions

CSC Leasing Company

CyrusOne

FalconStor

General Dynamics Information Technology

IBM

NetApp

Oracle

QinetiQ

SoftServe

Trend Micro

VeriStor

VirtualQube

May 19, 2014
Volume XLVIII, Issue 3


It's Official: Cloud Most Disruptive Force in 20 Years

Excerpted from ITProPortal Report by Emily Garthwaite

Cloud computing is one of the most disruptive forces in business in 20 years, according to a study by professional services firm KPMG.

The research, which investigated the future of investment banking, warned that financial institutions need to move away from tradition and start embracing new technologies.

"Investment banking has always been cyclical business," claims the report.

"Powerful forces are altering the investment banking landscape in a manner and degree never before witnessed. The 'old ways' of doing business need to change," it adds.

Such forces include new payment models like Google Wallet and the rise of mobile.

The research, however, highlighted the arrival of cloud computing as a particularly disruptive technology, claiming it "continues to change the game."

"Cloud computing is proving to be one of most disruptive forces in business in the past 20 years," it says.

"Banks that continue to use outdated legacy systems will find it increasingly difficult to create and launch new services, to provide access to a mobile workforce and to accommodate geographically dispersed customers and partners as well or as quickly as their competitors who are operating in the cloud," it adds.

As a result, KPMG advises those in the banking industry to review their operating models and the profitability of their business units, and to create industrialized processes to better leverage data.

"The firms that emerge as leaders will be those able to adapt their business mix and operation models, positioning themselves to continue to grow revenues through a relentless focus on serving the changing needs of their increasingly sophisticated and demanding corporate clients," claims the report.

Report from CEO Marty Lafferty

Thanks to each of our speakers, exhibitors, sponsors, and delegates who contributed to making CLOUD COMPUTING EAST 2014 (CCE:2014) an enormously valuable and stimulating event.

The strategic summit for business leaders and software developers focusing on the rapidly advancing gCLOUD (Government Cloud) and hCLOUD (Healthcare Cloud) concluded this past Friday afternoon in Washington, DC.

Please click here for the archival webpage from the event, which includes audio recordings of every session at the conference, keynote audio/visual presentations, the program brochure, and related material; and please click on these links for the conference photo album in Picasa and on Facebook.

On Thursday morning, CCE:2014 opening plenary keynotes provided insightful overviews of the state of cloud computing adoption for government and healthcare by Amazon Web Services' Steve Spano and IBM's Michelle Rudnicki.

Keynotes by Google's Seth Siciliano and Verizon's Igor Kantor followed, highlighting leading industry trends, the emergence of standards impacting the gCLOUD and hCLOUD, and lessons learned from obstacles that have been surmounted.

After a mid-morning networking break, we examined common regulatory frameworks and pending legislation affecting the government and healthcare sectors with Edwards Wildman's Larry Freedman and Wahab & Medenica's Kaiser Wahab.

We closed out the morning plenary with a panel discussion of outstanding issues still to be overcome for continued advancement with Microsoft's Yung Chou, Rackspace's Bret Piatt, Juniper Networks' Mark Belk, SAP America's Marlyn Zelkowitz, Canon's Thomas O'Neil, and MarkLogic's James Clippinger.

Following our conference luncheon, we zeroed in on cloud computing essentials that are specifically relevant to adoption of cloud solutions in the government and healthcare sectors.

gCLOUD service models were described by Clear Government Solutions' Christopher Grady while hCLOUD service models were explained by Dell Healthcare and Life Sciences' Tim Quigley.

NetApp's Kirk Kern defined public sector deployment models while healthcare deployment models were delineated by Oracle's Andrew Dietrich.

Then gCLOUD architectures were explored by Dell Services' Jeffrey Lush while Document Advantage Corporation's Dave Wiggins provided insights into hCLOUD architectures.

With these fundamentals covered, we advanced to roundtable discussions of gCLOUD management considerations adding WSO2's Chris Haddad, Virtustream's Sean Jennings, and NetApp's Rediet Tilahun; and hCLOUD management considerations with Level 3's Allen Bintz, NTP Software's Bruce Backa, and VeriStor Systems' Justin Linenkohl.

These conversations continued in our mid-afternoon networking break, after which we delved into basic practical concerns.

Data storage for government was the subject matter for CyrusOne's Stuart Dyer, while ServerCentral's Avi Freedman covered data storage for healthcare.

gCLOUD software applications were reviewed by the National Renewable Energy Laboratory's Debbie Brodt-Giles while hCLOUD software applications were analyzed by SoftServe's Roman Pavlyuk.

Workflow processes in the gCLOUD were examined by Unitas Global's Grant Kirkwood while hCLOUD workflow processes were scrutinized by VirtualQube's Scott Gorcester.

Our gCLOUD day-one closing panel added ASG Software's Brian Crowley, QinetiQ's Viswa Kumar, and General Dynamics Information Technology's Damian Whitham for a full discussion of implementation strategies.

Meanwhile, two sessions closed out our first-day exploration of the healthcare cloud: Apptix's Bob Finocchioli joined our hCLOUD implementation strategies panel, and then a special seminar on directed exchange introduced DirectTrust's David Kibbe, DigiCert's Scott Rea, Cerner's Andy Heeren, and ONC's Kory Mertz.

Our conference networking cocktail reception immediately followed, with special thanks to CCE:2014 sponsors and exhibitors, starting with platinum sponsors Amazon Web Services and IBM.

Gold sponsors included Acolyst, Aspiryon, CyrusOne, FalconStor, General Dynamics Information Technology, NetApp, Oracle, SoftServe, and VeriStor Systems.

Friday morning, we began with gCLOUD and hCLOUD business strategy tracks.

Vendor selection criteria for government cloud solutions were recommended by Kwaai Oak's Reza Rassool while StoAmigo's John Papadakis addressed this topic for the healthcare sector.

Two IBM executives offered their perspectives on task prioritization: Sal Vella for gCLOUD and Ramesh Menon for hCLOUD.

Performance metrics for the gCLOUD were outlined by Aspiryon's Andy Caldwell while Kwaai Oak's Reza Rassool summarized hCLOUD performance metrics.

Before our mid-morning networking break, two final panels in this program block explored economic considerations with IBM Federal's Victor Brown and DST's Mark Wells joining for the gCLOUD and CSC Leasing's Tom Mountcastle and BrightLine's Stephen Halbrook coming on board for the hCLOUD.

Our closing plenary session began with keynotes by Oracle's Brian Kracik, FalconStor's Chris Poelker, VeriStor Systems' Justin Linenkohl, and Trend Micro's Blake Sutherland on new advances in cloud application development as well as programming challenges and opportunities in the public and healthcare sectors.

We concluded the conference with a panel on final considerations for selecting, deploying, and evaluating cloud solutions, adding Dell's Michael Elliott, Oracle's Andrew Dietrich, and "Securing the Cloud" author Vic Winkler.

Please plan now to actively participate in the CLOUD DEVELOPERS SUMMIT & EXPO 2014 on October 1st and 2nd in Austin, TX.

The all-new format for this event — with hands-on instructional workshops as an important centerpiece — promises to be very exciting and not to be missed! Share wisely, and take care.

FalconStor's Chris Poelker Keynotes at CLOUD EAST 

Excerpted from the Wall St. Journal Report

FalconStor Software, a market leader in data protection and migration, is proud to have been a Gold Sponsor of CLOUD COMPUTING EAST 2014, which took place in Washington, DC from May 15-16. FalconStor's Vice President of Enterprise Solutions, Chris Poelker, gave a closing keynote speech on Friday, May 16th at 11 AM, offering his insights into how government and healthcare organizations can optimize IT for the cloud.

Chris Poelker is a highly regarded storage expert who recently served as deputy commissioner of the TechAmerica Foundation Commission on the Leadership Opportunity in US Deployment of the Cloud (CLOUD(2)). As part of his speech, Poelker demonstrated the methodology he developed for CLOUD(2) to help businesses determine which data and applications should be stored and accessed in the cloud. He also offered his views on how the software-defined data center combines the power of Artificial Intelligence (AI) with virtualization to optimize and automate IT.

"If you look closely at the five characteristics the National Institute for Standards and Technology (NIST) cites as essential in order to be considered a cloud solution, it's apparent that storage virtualization needs to be a key part of the offering. Storage virtualization is used to provide a level of intelligent abstraction between the actual storage hardware and the services provided by the cloud providers. The result is a more cost effective, agile, and automated IT infrastructure delivering the required services to the business as efficiently as possible," said Poelker. "Intelligent abstraction software is at the very foundation of what makes cloud storage viable."

Poelker has decades of experience in data management, including positions as storage architect at Hitachi Data Systems (HDS) and lead storage architect at Compaq. He also worked as an engineer and VMS consultant at Digital Equipment Corporation. Chris is the author of Storage Area Networks for Dummies and writes a regular blog for Computerworld on the topic of Intelligent Storage Networking.

Following his speech, Poelker served as a panel member on the topic of "Final Considerations for Selecting, Deploying, and Evaluating Cloud Solutions." Joining him on the panel were executives from Dell, Oracle, and Covata.

Attendees of CLOUD COMPUTING EAST 2014 were also invited to visit the FalconStor booth, where FalconStor demonstrated how its solutions enable storage virtualization.

Global Provider CyrusOne at CCE:2014

Excerpted from Wall St. Journal Report

Global colocation solutions provider CyrusOne sponsored and exhibited at the second annual CLOUD COMPUTING EAST conference, presented by the Cloud Computing Association (CCA) and the Distributed Computing Industry Association (DCIA). CLOUD COMPUTING EAST 2014 took place May 15th and 16th at the Doubletree by Hilton in downtown Washington, DC.

"Businesses are increasingly turning to enterprise colocation solutions providers like CyrusOne to help successfully enable cloud infrastructure systems often pursued in response to cost, regulatory and security issues," said Scott Brueggeman, CyrusOne Chief Marketing Officer. "Often referred to as 'The Sky for the Cloud,' our facilities are designed for optimizing power usage effectiveness and the CyrusOne National IX, which enables fast interconnection to an ecosystem of business partners, content providers, networks, carriers, Internet service providers, and Ethernet buyers and sellers."

Two sectors of the economy that are leading the way in adopting cloud-based IT solutions are government and healthcare. This year's CLOUD COMPUTING EAST conference focused on how these two major sectors use cloud-based technologies to revolutionize business processes, increase efficiency, and streamline costs. The conference's speaking faculty was made up of more than 50 thought leaders who brought broad industry knowledge, technological savvy, and strategic insight to the undertaking.

CyrusOne specializes in highly reliable enterprise data center services and colocation solutions, and it engineers its facilities to include the power-density infrastructure required to deliver excellent availability, including available 2N power redundancy architecture, the highest level possible.

Customers have access to the CyrusOne National IX, which marries low-cost robust connectivity with the massively scaled data centers that the company is known for by creating the first-ever data center platform that virtually links a dozen of CyrusOne's enterprise facilities in multiple metropolitan markets.

The CyrusOne National IX, coupled with the company's multiple dispersed locations and available 100 percent uptime, enables Fortune 1000 enterprises to implement cost-effective, multi-location data center platforms that can help manage their internal disaster recovery requirements and applicable regulatory or industry-specific requirements such as Sarbanes-Oxley, HIPAA, and PCI.

CyrusOne was also recently the first to receive multi-site data center certification from the Open-IX (OIX) Association. The OIX Association has a mandate to promote resilient interconnection in hub cities to facilitate a more resilient Internet.

With 25 carrier-neutral data center facilities across the United States, Europe, and Asia, CyrusOne provides customers with the flexibility and scale to match their specific growth needs. The company is renowned for exceptional service and for building enduring customer relationships and high customer satisfaction levels. CyrusOne's more than 625 customers include nine of the Fortune 20 companies and more than 130 of the Fortune 1000.

Hybrid Cloud: The New Normal for Federal IT

Excerpted from GCN Report by Anne Altman

Some might think we're still in the early days of cloud adoption with agencies focused on simple application hosting — moving email to the cloud — to meet mandates. However, the numbers tell a different story. Recent projections from International Data Corp. predict that federal cloud services spending alone will reach $1.7 billion by fiscal year 2014.

By 2017, the federal government will spend nearly $9 billion on cloud computing.

As cloud adoption increases and grows in complexity, agencies are finding that they accumulate more and more IT resources — applications, data, and services — that reside on different platforms. Whether by design or by accident, hybrid cloud is becoming the new normal for CIOs. It's no longer an argument of cloud versus traditional IT, or public versus private. It's all of the above.

On the surface, this kind of mixed IT environment may be viewed as a challenge. One could argue that more IT platforms can only lead to more complication, especially as CIOs continue to rely on older, more traditional IT systems to provide important citizen services via new systems of engagement. Viewed strategically, however, a hybrid approach can serve as a transformational asset for CIOs.

To start, a hybrid model allows agencies to use their existing infrastructure. Many agencies have significant IT investments, which do not need to be washed away in the race to the cloud. With a hybrid approach, CIOs can use the best resources for the task at hand, a way to connect data and applications to drive new value and respond faster to market changes.

The private sector has dealt with this issue over the past few years as they've adopted new cloud and mobile technologies — and they are seeing the benefits of a hybrid approach. According to IBM Center for Applied Insights research, 61 percent of surveyed private sector companies said they will have a hybrid IT environment by the end of this year. By 2015, we expect nearly 75 percent of large enterprises to have hybrid clouds.

Why the move? A hybrid approach provides unmatched choice and flexibility. Businesses are not forced into a single solution — they can pick the best application for the job, regardless of delivery platform. Data can be located wherever regulatory or security requirements dictate. This is a key benefit of a hybrid model for government agencies that must manage sensitive and secure data while keeping it accessible. Using the scalability of the public cloud, non-sensitive computing workloads can be placed where they fit best based on resource availability, operational costs, and a host of other factors. With a hybrid model, agencies can quickly move when situations change.

How would a hybrid cloud look for the federal government? Here's a hypothetical situation we may see some day. The Centers for Disease Control and Prevention is responsible for tracking the spread of epidemics, gathering information in a variety of forms, quantities, and volumes from various sources across many geographies. When a potential global threat emerges, CDC researchers need to tap into extra computing resources quickly to model the potential impact. Constantly maintaining a computing network of that scale isn't necessary in a hybrid model. The agency can maintain a private cloud with the everyday computing power it needs but then tap into public cloud resources as needed.
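
As a toy illustration of that placement logic — entirely hypothetical, with made-up names and thresholds rather than any agency's actual policy — the short Python sketch below keeps sensitive work on a private cloud and bursts non-sensitive overflow to public capacity:

# Illustrative sketch of hybrid-cloud workload placement. All names and
# thresholds are hypothetical; real policies weigh many more factors
# (regulation, cost, data gravity, latency).

PRIVATE_CAPACITY_UNITS = 100

def place_workload(demand_units, sensitive, private_in_use):
    """Return where to run a job under a simple hybrid policy."""
    private_free = PRIVATE_CAPACITY_UNITS - private_in_use
    if sensitive:
        # Sensitive data stays where regulation dictates: the private cloud.
        if demand_units > private_free:
            return "queue until private capacity frees up"
        return "private cloud"
    if demand_units <= private_free:
        return "private cloud"
    # Non-sensitive overflow bursts to public capacity, as in the
    # epidemic-modeling scenario above.
    return "public cloud (burst)"

print(place_workload(20, sensitive=True, private_in_use=50))   # private cloud
print(place_workload(80, sensitive=False, private_in_use=50))  # public cloud (burst)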

In addition, a hybrid model allows data to be shared seamlessly across platforms. Enterprises can more easily integrate next-generation application platforms — for mobile phones and all sorts of intelligent devices — with their existing systems. By assembling ready-to-use services on various cloud platforms, enterprises can innovate more rapidly and launch new digital products and services while building new apps for customers.

The hybrid model is growing in the government and will continue to take hold as agencies understand its unique benefits. The private sector's experience with hybrid clouds provides a strong roadmap for government agencies. However, the private sector has shown that the benefits of a hybrid approach cannot be realized by accident or without the right strategy. It requires a thoughtful approach and design.

Cloud Benefits for Healthcare Extend to Safety

Excerpted from Business Solutions Report by Megan Williams

By now you're aware of the benefits that cloud computing can bring you and your customers. From more affordable infrastructure to an answer to the BYOD question — the technology is being used to tackle problems in myriad ways. But can it go beyond just fixing tech woes?

In healthcare, it looks like it can, and at multiple levels. Overall though, the cloud addresses patient safety in one very important way: it helps get existing measures over the hurdle of lagging technology that characterizes so much of the healthcare industry.

This goes well beyond cost savings. Much of the risk that IT presents to patient safety centers on lost information, old information, and general mistakes. One of the most basic solutions is moving facilities from an infrastructure based on its own well-being to one focused on patient-centered needs. The flexibility (and yes, affordability) of cloud solutions gives facilities and other entities options they did not have before in coordinating patient care and in managing complex, sensitive data.

A more unified data system also cuts back on the time clinical staff spends accessing disparate systems, allowing them to spend more time addressing patient needs (a core element of patient safety).

The much-touted electronic health records have become the norm in modern healthcare, and their use is being encouraged from every direction — from government incentive programs to client curiosity. Beyond eliminating cumbersome paper trails, they are also seen as a direct contributor in the fight to improve patient safety. Through new Meaningful Use standards, they have been inextricably tied to improved patient results.

Getting them implemented, however, has proven to be a challenge. Cost, IT resource requirements, implementation difficulties, and scalability all stand as hurdles in the use of EHRs. Cloud computing — through its flexibility (great for small practices), automatic update capabilities (EHR, medical coding, and security standards are updated frequently), and simple implementation processes — addresses many provider concerns around the next generation of medical record keeping.

This touches on the EHR point a bit, but when you combine the cloud with insightful use of the data in patient records, you get results that change healthcare communities.

In North Carolina, for example, a 12-hospital, not-for-profit system decided to tackle its inpatient glycemic management problem, using the cloud to create its eGlycemic Management System. The system is integrated with EHRs, and it helps improve workflow and nursing efficiency. The results? The system saw a 79 percent improvement over the national average across eight hospitals. The initiative meant dramatic reductions in readmissions and length of patient stay, as well as patient safety improvements.

Paul Chidester, M.D., VP of medical affairs for the hospital system, says, "Not only has the system resulted in excellent and sustainable improvements in patient quality and safety for Sentara, but at this stage we have data indicating it has also successfully reduced avoidable readmissions and shortened length of stay by, on average, an entire day. We will be publishing a large outcomes study looking at the impact of Glytec's system on these metrics."

What does this all mean for solutions providers? Hospital CIOs and tech leaders can sometimes have a difficult time making a case connecting advances in technology to the deepest concerns of healthcare administrators and decision makers. A clear link between cloud computing and patient safety is bound to be a selling point for any of your customers.

Forget Fad: Cloud Is Real, Here, and Growing

Excerpted from GCN Report by William Jackson

The cloud is not a fad, and the government's commitment to cloud computing is not a fleeting thing. In fact, as the bedrock technology for the new era of IT, "it's definitely here to stay, and it's only going to get bigger," said OMB's Scott Renda.

Government is at the very beginning of the third major shift in computing paradigms with the cloud, following the ages of mainframe and client-server computing, said Renda, the Office of Management and Budget's cloud computing and federal data center consolidation portfolio manager.

The global market for cloud services is estimated to be $158 billion this year, growing to $244 billion by 2017. However, federal government spending on the cloud is currently only a small percentage of this, Renda said, with around $3 billion of its annual $80 billion IT budget spent on cloud services.

Government drivers for cloud computing include data center consolidation, mobile applications, social networking and big data. But the primary driver for most agencies, Renda said, is the administration's Cloud First policy, established in 2010, which makes the cloud the default choice for new IT projects if there is a secure offering available.

The security challenge is being met in part by the FedRAMP program, which certifies that cloud service providers meet a baseline set of security controls included in the Federal Information Security Management Act (FISMA).

"FedRAMP does not replace FISMA," Renda said. But by ensuring that a set of basic controls are in place, it streamlines the process of evaluating system security before the system receives an authorization to operate.

Despite the government's commitment to the cloud, "it is not a panacea," Renda said. "The cloud is a means to an end," and mission must drive the choice of services.

To date, 75 percent of government cloud spending has been on private clouds and about 20 percent on public clouds, which come under the FedRAMP certification program. The remaining 5 percent is on hybrid clouds. But, "hybrid clouds appear to be getting more steam," Renda said.

The government's move to the cloud will not take place overnight, said Jeff Lush, former Veterans Administration Chief Technology Officer and now CTO of Dell Federal Government Services. But the shift is appropriate for many kinds of government services, and agencies have an opportunity to use it to improve IT security.

For most legacy IT systems, security took a back seat during development and has never caught up, making government cybersecurity a high-risk area since before the turn of the century, according to the Government Accountability Office. Cloud First gives agencies a chance "to hit the reset button," Lush said, adopting services and platforms in which security has been built in from the beginning and certified as meeting minimum government requirements.

Cloud Has Passed Tipping Point, Now Mainstream

Excerpted from WhaTech Report

Hosting and cloud computing are now mainstream with 45 percent of organizations beyond the pilot phase and 32 percent having a formal cloud computing plan as part of their overall IT and business strategy. Those are the conclusions of a global survey of more than 2,000 IT decision-makers undertaken by 451 Research.

The study also found, based on responses, that on-premises private cloud adoption accounted for 26 percent of on-premises infrastructure spending in 2013 and that hosted private cloud would experience the highest rate of growth for off-premises infrastructure, accounting for 32 percent of hosted spending in the next 24 months.

Seventy-one percent of respondents use software as a service and 69 percent use hosted infrastructure services. These figures are set to rise to 85 and 83 percent, respectively, over the next two years. Thirty-seven percent of respondents presently use platform as a service, and a further 26 percent said they would do so in the next two years, taking the total to 63 percent. Respondents' average distribution of expenditure on hosted infrastructure services across traditional dedicated, hosted private cloud, and public cloud was 48, 28, and 24 percent, respectively, and in two years is expected to shift to 42, 32, and 25 percent.

However, only a small percentage (16 percent) of respondents indicated that their organizations had moved to broad implementation of production apps in the cloud. The majority were either at the initial implementation of production apps stage (29 percent) or running trials/pilot projects (27 percent). A further 27 percent were still at the evaluation stage. Most respondents still selectively target new apps for cloud computing, while 24 percent said that cloud was either their default platform for new applications or that they relied heavily on cloud for new projects.

According to Marco Limena, Vice President, Hosting Service Providers at Microsoft, which sponsored the survey, "Hosted private cloud is a gateway to hybrid cloud environments for many customers. With this momentum continuing to build, it's clear that we've reached a tipping point where most companies have moved beyond the discovery phase and are now moving forward with cloud deployments to deliver improved business results and capabilities."

Michelle Bailey, Senior Vice President, Digital Infrastructure and Data Strategy at 451 Research, added: "While cloud environments are significantly changing the way businesses operate today, one thing that hasn't changed is the importance of security. As a result, security has emerged as the primary, and potentially most lucrative, cloud opportunity for hosters."

BitCloud's Enterprise Sales and Marketing Manager, Jorge Villalpando, agrees. "Security is paramount for any cloud environment, but no one product can address security," he said. "Data security in the cloud is comprised of many layers in a complex environment."

Villalpando said businesses should ask cloud service providers a number of questions to assess the strength of their security. "Do they have a dedicated resource for security? Do they actually take security seriously? What is their historical performance? Are they transparent when it comes to reporting and data access control? And finally, how secure are their website and their customer facing portals?"

Bailey said that customers were prepared to pay a premium for security. "Our research shows that 60 percent of customers would pay their hosting service provider a 26 percent premium on average for security guarantees — and an additional 25 percent are already paying for such services."

The survey gathered responses from more than 2,000 IT decision-makers across small, mid-size and large organizations who are purchasing hosting services or software-as-a-service. They came from 11 countries: the United States, Canada, the United Kingdom, Germany, Russia, Japan, South Africa, Singapore, India, Australia, and Brazil. Australia accounted for six percent of responses, the US 35 percent.

A Tough Stretch for Tom Wheeler on Net Neutrality

Excerpted from the NY Times by Edward Wyatt

Tom Wheeler, the Chairman of the Federal Communications Commission, has had a tough month.

Two of his fellow Commissioners this week said he should delay introducing a new set of Net Neutrality rules, which he had scheduled for May 15th.

A group of 11 United States Senators told him Friday that rules allowing companies to pay an Internet service provider for express-lane access to consumers, as the rules are widely expected to do, would violate the principle of an open Internet.

And last week, a finger-wagging, tough-talking speech he gave to cable television executives received a lukewarm response and engendered questions about what some listeners perceived as a "father knows best" tone.

Mr. Wheeler has vowed to forge ahead.

Late Friday, the Open Technology Institute at the New America Foundation released a letter from Mr. Wheeler, in which he said that he would ask for public input on whether to classify broadband Internet service as a sort of public utility, a route that many consumer advocacy groups have pushed.

"We will specifically ask whether Title II or Section 706 of the Communications Act is the best way to address the matter of Internet openness," Mr. Wheeler wrote. Title II is the portion of the Communications Act that deals with "common carriers," the designation for telephone networks, which are subject to rate regulations and other strict oversight.

Section 706 is the portion of the Act that a federal appeals court said appeared to give the FCC the legal authority to prevent Internet service providers from favoring one company's content over another when routing it to customers.

But people who have seen Mr. Wheeler's current proposal have indicated that rules relying on Section 706 could still allow broadband companies to sell a fast lane to content companies. And in an FCC filing Friday, AT&T argued that even a Title II reclassification would not prevent broadband companies from instituting paid prioritization of Internet content.

Mr. Wheeler said paid prioritization would not be tolerated. "I will not allow some companies to force Internet users into a slow lane so that others with special privileges can have superior service," Mr. Wheeler wrote.

Reasons Web Giants Want FCC to Enforce Net Neutrality

Excerpted from eWeek Report by Don Reisinger

Some of the Internet's largest companies — including Amazon, Google, Microsoft, and Facebook — have issued a letter to the Federal Communications Commission (FCC) to urge its Chairman Tom Wheeler to support Network Neutrality.

The companies argue that the FCC's support for Net Neutrality would ensure equal web access for users while promoting competition among web application providers, a major concern for those in the government who support the idea of a free and open Internet.

It's not a surprise that major web companies are so concerned about Net Neutrality.

These are the companies that provide web access to millions of users around the world.

But they are also for-profit organizations concerned about keeping costs low and maximizing profits.

But if Internet service providers (ISPs), which are delivering web bandwidth to individual users as well as the big web application companies, get broad power to decide who will pay more for web bandwidth and who will pay less, perhaps based on the types of web services people want to access, then the concept of a free and open web goes out the window.

Seen from this point of view, it's clear why the big web services companies have a vested interest in maintaining a strong Net Neutrality policy.

Please click here for the slideshow.

Doubts Raised over FCC Open Internet Proposal

Excerpted from SmartBrief Report

The Federal Communications Commission's (FCC) plan to revise Open Internet rules seemed to satisfy few after the agency voted Thursday to open its proposal to debate.

Even the Democrats who supported Chairman Tom Wheeler's proposal seemed lukewarm to the plan, which would let content providers pay to send their data over a "fast lane" of bandwidth.

Verizon Communications and AT&T continue to take issue with a proposal that would reclassify broadband as a common-carrier service. Read more coverage in Broadcasting & Cable, Bloomberg Businessweek, The Wall Street Journal and The Verge.

Verizon Brings Thunder to the Cloud

Excerpted from Light Reading Report by Mitch Wagner

Verizon Communications is building out its cloud infrastructure to appeal to customers looking to take advantage of the benefits of the cloud without having to abandon tried-and-true enterprise infrastructure.

Familiarity is key to Verizon's strategy. It wants enterprise customers to be able to bring familiar applications to the cloud, rather than starting from scratch.

"Instead of saying 'ditch everything you know and go to the new stuff,' we'll run the new stuff but also give you the ability blend your environment and use some of the technologies that you already have," says John Considine, CTO, Verizon Terremark.

To that end, Verizon is building out its cloud infrastructure fabric to accommodate enterprise customers with hybrid cloud services, a program announced in beta in October. (See Why Verizon Needed a Cloud Reboot.)

Verizon's fabric architecture uses modules built in partnership with SeaMicro, which is owned by Advanced Micro Devices. Each module contains compute, storage, and SDN networking components on a building-block chassis that provides availability, reliability, and connectivity.

Networking switches, meanwhile, are provided by Arista Networks.

"These hardware components comprise the entire hardware infrastructure," Considine says. "Everything else is our own software and from partners, in terms of firewalls and other network apps. We do this to obtain global scalability and flexibility."

To support its services, Verizon has built data centers in seven locations: Culpeper, VA; Miami, FL; Denver, CO; Santa Clara, CA; Sao Paulo, Brazil; London; and Amsterdam.

Verizon wrote the base platform software in-house, using Linux and other open source components. The storage stack is ZFS; Verizon wrote its own SDN and orchestration software. On top of that, Verizon works with partners to provide enterprise applications to customers, including:

F5 Networks for virtualized traffic management and security policy management.

NetApp for virtualized storage.

Oracle for its 11g and 12c database software, as well as its Fusion middleware.

Cloudera for analytics and Hadoop management.

"Our customers can come in and say, 'Get me one of those.' It can be anything from an Oracle template to Hadoop, and run them in our cloud," Considine says.

Verizon's infrastructure investment is a differentiator for the company, says Heavy Reading analyst Caroline Chappell. "The big thing that is different about Verizon is that it has gone out and invested very heavily in a new architecture and cloud infrastructure. It very strongly believes that, as a cloud provider, you have to innovate and do your own thing, and control it — not just buy the standard cloud blocks from Cisco Systems and VMware."

The differentiator hasn't allowed Verizon to pull ahead in the marketplace — at least not yet, Chappell says. "But Verizon is definitely one of the big contenders, and the US cloud market is one of the most advanced and competitive in the world. Verizon is making a big play to be a major dominant player in that market and around the world," she says.

And so the operator continues to develop its service offerings. In April, Verizon announced its Secure Cloud Interconnect (SCI) service to connect enterprise private networks into Verizon's and other providers' clouds. It's part of Verizon's overall strategy of allowing enterprises to run applications locally where they make sense, and in the cloud where that makes sense, Considine says.

AT&T Buying DirecTV for $67.1 Billion 

Excerpted from MarketWatch Report by Michael Kitchen

AT&T said Sunday it will buy satellite-television provider DirecTV in a stock-and-cash deal worth $67.1 billion. Under the deal, approved by both firms' boards, DirecTV shareholders will receive $28.50 per share in cash, along with $66.50 per share in AT&T stock, with provisions adjusting the stock portion of the offer based on further movements in AT&T shares.

The total value "implies an adjusted enterprise value multiple of 7.7 times DIRECTV's 2014 estimated EBITDA," AT&T said in announcing the deal. AT&T said it would finance the deal "through a combination of cash on hand, sale of non-core assets, committed financing facilities and opportunistic debt market transactions."

It also said it would sell off its interest in America Movil in order to ease regulatory approval of the deal. The company also committed the new entity, for three years after closing, to upholding current Net Neutrality rules as established in 2010, "irrespective of whether the FCC re-establishes such protections for other industry participants following the DC Circuit Court of Appeals vacating those rules."

The US telecom said the deal would likely close within "approximately 12 months," and would be accretive to AT&T's earnings within 12 months after the close, adding that its 2014 outlook "remains largely unchanged" though the sale of the America Movil holdings would cut about 5 cents per share from earnings.

In touting the worth of the deal, AT&T said: "The transaction enables the combined company to offer consumers bundles that include video, high-speed broadband and mobile services using all of its sales channels — AT&T's 2,300 retail stores and thousands of authorized dealers and agents of both companies nationwide."

Inside BitTorrent Sync's Cloudless File Syncing

Excerpted from PC World Report by Ian Paul

Cloud services like OneDrive and Dropbox are dead-simple to set up and make multi-device file syncing an absolute breeze — but those services force you to stash your files on the company's third-party servers. For some people that's just not an option, or at least an option they'd rather do without.

But there's another easy alternative that lets you automatically sync files between devices even if you don't trust the cloud: BitTorrent Sync Beta.

What is BitTorrent Sync?

Harnessing the distribution power of the BitTorrent protocol, BitTorrent Sync Beta — which I'll call BTSync from here on out — lets you sync files across devices or share them with friends using peer-to-peer (P2P) file sharing technology. Files are also encrypted in transit, and since there's no real cloud service there are no storage limits beyond the hard drive size of your devices.

BTSync is also flexible. You can specify which folders to sync between devices. You can share documents with other BTSync users in read-only mode (meaning they can't change the file on your devices). Advanced users can even use BTSync to track document versioning.

But there are some downsides to BTSync. You can't sync specific files with specific devices. With BTSync, you either sync a folder among your devices or you don't. You also need to have your devices powered on and signed in for BTSync to work; the BitTorrent protocol relies on both devices being active to share files.

Perhaps most notably, the program isn't very newbie-friendly — though BitTorrent Sync chief Erik Pounds says the team is "laser-focused on making BitTorrent Sync drastically easier to use."

All that said, BitTorrent Sync is a terrific option for folks who want to sync data without relying on cloud services and third-party servers—if you know how to use it. Let's dig in.

Getting started.

Getting BTSync running is easy. Download the software from the BitTorrent website, then install it, keeping all the default options intact. When that's done, you'll see a "secret" — a long string of letters and numbers — for the initial Sync folder. Don't worry about it! You can retrieve it later.

BitTorrent Sync as it appears right after installation.

Once you're done with the installation, BTSync opens automatically. With BTSync installed on a single PC, you're essentially limited to adding local folders you'd like to have synced with other devices via My Sync > Add Folder.

Next, you need to install the software on other devices. BitTorrent Sync supports a wide range of mobile and desktop operating systems, but we'll focus on syncing with another PC first.

Syncing files with other PCs.

Syncing data across PCs relies on the "secret" code mentioned earlier. Let's say we want to share some videos that are on a PC called OFFICE with another PC called DEN. The videos on OFFICE are located at C:\Users\PCWorld\AwesomeVids.

Open BTSync on OFFICE, right-click the AwesomeVids folder, and select Copy secret from the contextual menu.

Now we have to get that secret code over to DEN — by email, snapping a picture of it on your phone, sneakernet, or even just writing it down and sticking it in your pocket. Treat the code with care—you wouldn't want someone else using it to gain access to your files.

To start syncing data from another computer you need to know the secret and decide where to save it.

On DEN, click the Add folder button in BTSync, then enter the secret from OFFICE into the Folder secret field in the pop-up window that appears. Next, choose a new folder where you want to save the data, then click OK and let BitTorrent Sync work its magic. Syncing time will vary depending on file size and the speed of your Internet connection(s), naturally.
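
Conceptually, the secret is just a high-entropy string that both identifies a folder on the peer-to-peer network and authorizes access to it — which is why possession of the string must be guarded. The Python sketch below is a toy model of that idea, not BitTorrent Sync's actual code or protocol (the real product also distinguishes full-access from read-only secrets, which this model skips):

# Toy model of secret-based folder sharing; not BitTorrent Sync's code.
import secrets
import string

ALPHABET = string.ascii_uppercase + string.digits

def make_secret(length=32):
    """Generate a high-entropy folder secret, like the one copied on OFFICE."""
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

# A registry standing in for the P2P network: whoever presents the secret
# gets the folder, which is why the article says to treat the code with care.
shared = {}

office_secret = make_secret()
shared[office_secret] = r"C:\Users\PCWorld\AwesomeVids"

def join_sync(secret, local_path):
    """What DEN does: present the secret and pick a local destination folder."""
    if secret not in shared:
        raise KeyError("unknown secret: no peer is sharing this folder")
    return f"syncing remote folder into {local_path}"

print(join_sync(office_secret, "/home/den/AwesomeVids"))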

Syncing files with Android and iOS devices

Syncing folders to mobile devices is easier.

Let's say we want to share the same AwesomeVids folder on the OFFICE computer with an Android tablet. Open the program on the PC, select the AwesomeVids folder, and then click the Connect mobile button. You'll see a QR code pop up in a new window.

Now, open BTSync on the tablet and tap the Add folder icon in the top right corner.

BitTorrent Sync on Android, taken on a Nexus 7.

On the next screen, tap Choose folder, then select where you want to save the data on your tablet and press the Choose folder button. You'll return to the Add folder screen. Tap Scan QR code, which will activate your camera. Take the picture!

If your tablet has only a front-facing camera, the QR code may be a little trickier to line up. You can manually enter the secret code in the "or enter secret here" field under the Scan QR code button, too—that field is automatically populated when you photograph the QR code. Once the code's there, tap Done.

BitTorrent Sync works similarly on iOS, but with one major difference: You don't have to choose where to save your synced data. Just snap the QR code and BTSync takes care of the rest.

Note that both apps also have a backup option. On iOS, you can sync your photos to your PC, while the Android app can back up any folder you'd like. The setup process here is basically the same as above.

Please click here for the full report.

Telefonica Deploys TOA Cloud-Based Solution

Excerpted from Telecoms Report by Jonathan Brandon

Telefonica has deployed field service management technology from TOA.

Vivo, the Brazilian subsidiary of Spanish telco Telefonica, said Tuesday that it has deployed TOA Technologies' cloud-based field service management application, which provides end-to-end service tracking for both employees and contractors.

"The deployment of this cloud-based tool to manage Telefonica's field technician workforce shows Telefonica's Commitment to its digital transformation. This is a pioneer project in the operating support systems area, and the focus has been on the quick Implementation and achievement of efficiency goals," said Enrique Blanco, global chief technology officer for Telefonica.

The company said it deployed the application and integrated it with its OSS in just six months. Blanco said that the project was completed in record time, in part because the SaaS solution was easier to deploy than on-premises equivalents.

The solution, which has already been deployed at Telefonica's Chilean outfit, is scheduled to be rolled out to the wider business over the coming months.

"The transformation of Telefonica OSS, With TOA Technologies' SaaS solution, is disruptive for two Reasons: first, using an SaaS for OSS implementation enables more flexibility and quick deployments, and second, the SaaS delivery allowed us to manage the Implementation in each country with a common roadmap and set of processes. These two points are critical for applying to Telefonica's OSS implementation at scale," said Juan Manuel Caro, OSS operation and global director in Telefonica.

Telefonica's embrace of the cloud to improve its own operations has been a strategic priority for the past couple of years. Earlier this year the Spanish telco unveiled comprehensive network function virtualization plans in a bid to make its network more flexible, particularly as it seeks to add more enterprise customers to its network and encourage uptake of its cloud services.

Octoshape Partners on OTT 4K Streaming

Excerpted from RapidTVNews Report by Michelle Clancy

Octoshape is partnering with Harmonic and AirTies to demonstrate over-the-top (OTT) 4K streaming, delivered over the public Internet to consumer set-top boxes (STBs).

The total solution includes Harmonic's ProMedia suite of multiscreen production and preparation applications, integrated with Octoshape's Infinite HD-M suite of stream acceleration and multicast technologies, and the AirTies HEVC/Octoshape-enabled STB. 

The platform includes adaptive bitrate, stream acceleration and a multicast suite of technologies, and targets telcos, operators and content providers to deliver UltraHD quality streams via best-efforts Internet. 

Quality of service is preserved over broadband utilizing the Octoshape stream acceleration and multicast technologies. Content is encoded and prepared by the Harmonic ProMedia solution using HEVC at 4K resolution, and then the video is delivered to the television via the AirTies/Octoshape enabled set-top box. 

"4K stream resolution over the public Internet represents a de-facto standard for the future of video streaming," said Bulent Celebi, Executive Chairman and Co-Founder of AirTies. "We are incredibly proud of our partnership with Octoshape and Harmonic, and this successful demonstration solidifies the industries' access to the highest quality solution available on the market today."

Why Cloud Should Be on Your Resume

Excerpted from Forbes Report by Joe McKendrick

Cloud computing may have been designed and initiated by tech professionals, but it's something in which everyone should be involved. It doesn't matter if you're in accounting or market research or field service — having knowledge and experience with the right online resources is a powerful capability to bring to any employer or client.

Cloud is an expanding world no longer limited to the technology folks — though it's benefiting them as well. I recently spoke with an accounting manager at a major financial services company who has taken a leadership position in his organization by successfully moving many of the company's operations to the cloud. For market research professionals, online Software-as-a-Service-based survey research tools provide a wealth of best practices and techniques that enable them to reach conclusions in the blink of an eye.

There are many reasons why cloud adds enormous value to anyone's resume. (And it's not even necessary to plug the word "cloud" in there!) Being well-versed in cloud doesn't necessarily suggest tech savvy, but it reflects a high degree of resourcefulness — you know where to turn to get the job done. Here are five reasons that stand out:

1) Professional and industry best practices are embedded in cloud services. Most of the discussion you hear about cloud concerns how cheap it is, and how easy it is to install. But its greatest advantage is its best-kept secret: cloud services are shaped by constant interaction and feedback from customers, whose best practices are built into them. Many leading cloud offerings include pre-designed industry templates that capture the latest thinking in processes and practices. Cloud helps you keep up to date with that thinking.

2) Everyone loves a self-starter. At the foundation of today's "do-it-yourself" economy is cloud — you can launch a multitude of ventures right off the cloud, from accounting services to supply chain mediators. Having experience with important cloud offerings specific to your profession or industries not only shows you can hit the ground running — but you already have been running.

3) Everyone wants an innovator. Cloud offers the ability to rapidly iterate and experiment with new ideas. Servers and applications for testing new marketing programs, new product formulations, or new promotional offers can be quickly spun up as testbeds for ideas at little or no cost, and just as quickly changed or adapted as required.

4) Everyone needs a polymath — someone with multiple skills, adept at building bridges. Cloud is all about discovery — figuring out new ways to take tired, calcified processes and make them work better. It means applying fresh thinking to old problems, just as mobile apps have streamlined the often-cumbersome process of procuring taxi cab rides, and mobile phone-based eye exams are making healthcare more accessible. If you're resourceful at applying cloud-based solutions to problems and opportunities, you may be able to contribute the fresh thinking needed to reshape parts of the business outside your domain.

5) Being cloud-savvy means you avail yourself of world-class education and training opportunities. For example, SAP is no longer a secret limited to employees of companies that are paying hundreds of thousands of dollars to get and install the software. Someone interested in learning more about SAP can study information on SAP's site, and even participate in independent training programs. For broader education in IT and beyond, massive open online courses have opened up courses from the world's leading universities to anyone who wants to participate — and at no charge, for the most part.

Coming Events of Interest

Enterprise Apps World — June 17th-18th in London, England. EAW is a two day show, co-hosted with Cloud World Forum, that will look at all the implications of going mobile in the workplace and how enterprise apps can help.

Silicon Valley Innovation Summit — July 29th-30th in Mountain View, CA. AlwaysOn's 12th annual SVIS is a two-day executive gathering that highlights the significant economic, political, and commercial trends affecting the global technology industries. SVIS features the most innovative companies, eminent technologists, influential investors, and journalists in keynote presentations, panel debates, and private company CEO showcases.

International Conference on Internet and Distributed Computing Systems — September 22nd in Calabria, Italy. IDCS 2014 is the sixth conference in its series promoting research in diverse fields related to Internet and distributed computing systems. The emergence of the web as a ubiquitous platform for innovations has laid the foundation for the rapid growth of the Internet.

CLOUD DEVELOPERS SUMMIT & EXPO 2014 — October 1st-2nd in Austin, TX. CDSE:2014 will feature co-located instructional workshops and conference sessions on six tracks facilitated by more than one-hundred industry leading speakers and world-class technical trainers.

International Conference on Cloud Computing Research & Innovation — October 29th-30th in Singapore. ICCRI:2014 covers a wide range of research interests and innovative applications in cloud computing and related topics. The unique mix of R&D, end-user, and industry audience members promises interesting discussion, networking, and business opportunities in translational research & development.

Copyright 2008 Distributed Computing Industry Association
This page last updated May 26, 2014
Privacy Policy