
Protecting the Network with Proactive Encryption Monitoring

Encryption technology is everywhere: in applications, data centers and other foundation infrastructure

Encryption is a key element of a complete security strategy. The 2013 Global Encryption Trends Study shows a steady increase in the use of encryption solutions over the past nine years. Thirty-five percent of organizations now have an encryption strategy applied consistently across the entire enterprise, up from 29 percent in 2012. The study showed that, for the first time, the main goal for most organizations in deploying encryption is mitigating the effects of data breaches. There is good reason for this shift: the latest Ponemon Institute research reveals that the average cost of a data breach is $3.5 million, up 15 percent from the previous year.

On the surface, the 35 percent figure seems like good news, until one realizes that 65 percent of organizations do not have an enterprise-wide encryption strategy. In addition, even a consistently applied strategy can lack visibility, management controls or remediation processes. This gives hackers the green light to attack as soon as they spot a vulnerability.

While organizations are moving in the right direction when it comes to encryption, much more needs to be done - and quickly. Encryption has come to be viewed as a commodity: organizations deploy it and assume they've taken the steps they need to maintain security. If breaches occur, it's rarely the fault of the software or the encryption protocol. The fault lies rather in the fact that encryption management is left in the domain of IT system administrators and has never been properly managed with access controls, monitoring or proactive data loss prevention.

Too Many Keys Spoil the Security
While recent high-profile vulnerabilities have exposed the need to manage encrypted networks better, it's important to understand that administrators can cause vulnerabilities as well. In the Secure Shell (SSH) data-in-transit protocol, key-based authentication is one of the more common methods used to gain access to critical information. Keys are easy to create and, at the most basic level, are simple text files that can be easily uploaded to the appropriate system. Each key is associated with an identity - either a person or a machine - and grants access to information assets and the ability to perform specific tasks, such as transferring a file or dropping a database, depending on the assigned authorizations. In the case of Secure Shell keys, those basic text files provide access to some of the most critical information within an organization.
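To make concrete how low the barrier is, here is a minimal sketch of creating and "deploying" such a key with stock OpenSSH tooling. The key comment and paths are illustrative, and a local stand-in directory takes the place of a remote host:

```shell
# Generate an Ed25519 key pair. An empty passphrase is used here purely
# for illustration; unattended machine keys are often created this way,
# which is exactly why they need central management.
keydir="$(mktemp -d)"
ssh-keygen -q -t ed25519 -N '' -f "$keydir/id_ed25519" -C "app-batch@example.com"

# "Deployment" is nothing more than appending the one-line public key to
# the target account's authorized_keys file.
mkdir -p "$keydir/.ssh"
cat "$keydir/id_ed25519.pub" >> "$keydir/.ssh/authorized_keys"
chmod 600 "$keydir/.ssh/authorized_keys"
```

In practice the append happens over the network, e.g. with `ssh-copy-id`, and from that moment the key grants whatever access the target account has - with no expiry date.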

A quick calculation will reveal that the number of keys assigned over the past decade to employees, contractors and applications can run up to a million or more for a single enterprise. In one example, a major bank with around 15,000 hosts had over 1.5 million keys circulating within its network environment. Around 10 percent of those keys - or 150,000 - provided high-level administrator access. This represents an astonishing number of open doors that no one was monitoring.
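An inventory like the bank's starts with nothing more exotic than collecting and counting authorized_keys entries per account. The sketch below runs against a locally constructed fixture; the directory layout, hostnames and key material are invented for illustration:

```shell
# Hypothetical audit directory where per-host home directories
# have been collected for inspection.
audit_root="$(mktemp -d)"

# Fixture: one host, two accounts, three keys total (illustrative data).
mkdir -p "$audit_root/host1/alice/.ssh" "$audit_root/host1/root/.ssh"
printf 'ssh-ed25519 AAAAC3Nza... alice@laptop\n' \
    > "$audit_root/host1/alice/.ssh/authorized_keys"
printf 'ssh-rsa AAAAB3Nza... backup@cron\nssh-rsa AAAAB3Nza... deploy@ci\n' \
    > "$audit_root/host1/root/.ssh/authorized_keys"

# Total keys across all accounts, and keys granting root access --
# the figure that mattered in the bank example above.
total_keys=$(cat "$audit_root"/*/*/.ssh/authorized_keys | grep -c '^ssh-')
root_keys=$(grep -c '^ssh-' "$audit_root"/host1/root/.ssh/authorized_keys)
```

Scaled to 15,000 real hosts, the same counting exercise is what surfaces the 1.5 million keys - and the 10 percent with administrator-level reach.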

It may seem impossible that such a security lapse could happen, but consider that encryption is often perceived merely as a tool. Because nothing on the surface appeared to be out of place, no processes were shut down and the problem went undetected.

Safety Hazards
Forgetting to keep track of keys is one problem; failing to remove them is another. System administrators and application developers will often deploy keys in order to readily gain access to systems they are working on. These keys grant a fairly high level of privilege and are often used across multiple systems, creating a one-to-many relationship. In many cases, employees or contractors who are terminated - or even simply reassigned to other tasks that no longer require the same access - continue to carry access via Secure Shell keys; the assumption is that terminating the account is enough. Unfortunately, this is not the case when Secure Shell keys are involved; the keys must also be removed or the access remains in place.
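Revoking a departed user's access means editing every authorized_keys file that carries one of their keys. A minimal sketch, again against an invented fixture, matching on the key comment (a real cleanup should match on fingerprints, since comments are only a convention):

```shell
# Fixture: one authorized_keys file with two keys (illustrative data).
sshdir="$(mktemp -d)"
printf 'ssh-ed25519 AAAAC3Nza... alice@corp\nssh-ed25519 AAAAC3Nza... bob@corp\n' \
    > "$sshdir/authorized_keys"

# Strip every key whose comment identifies the departed user, then
# replace the file.
departed="bob@corp"
grep -v " $departed\$" "$sshdir/authorized_keys" > "$sshdir/authorized_keys.new"
mv "$sshdir/authorized_keys.new" "$sshdir/authorized_keys"
```

The hard part is not the edit itself but knowing which of the potentially hundreds of thousands of files, across every host, need it - which is why disabling the account alone is never enough.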

SSH keys pose another threat as well: subverting privileged access management (PAM) systems. Many PAM solutions use a gateway or jump host that administrators log into to gain access to network assets. They connect with user directories to assign privileges, monitor user actions and record which actions have taken place. While this appears to be an airtight way to monitor administrators, it is remarkably easy for an administrator to log into the gateway, deploy a key and then log in using key authentication, thereby circumventing any PAM safeguards in place.
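One hardening measure against this particular bypass - a sketch, assuming the gateway runs OpenSSH's sshd - is to take authorized_keys out of users' home directories entirely, so an administrator cannot self-provision a key on the jump host:

```
# /etc/ssh/sshd_config (excerpt, gateway host)
# Read public keys only from a root-owned directory.
# Users can no longer append to ~/.ssh/authorized_keys to grant
# themselves key-based access around the PAM gateway.
AuthorizedKeysFile /etc/ssh/authorized_keys/%u
```

This closes the self-provisioning hole on the gateway itself, but it does nothing about keys already scattered across the hosts behind it.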

Too Clever for Their Own Good
Poorly monitored access is just one security hazard in encrypted environments. Conventional PAM solutions, which use gateways and focus only on interactive users, are designed to monitor administrator activities. Unfortunately, as mentioned earlier, they end up being fairly easy to work around. Additionally, encryption hides traffic from security operations and forensics teams just as it hides it from attackers. For this reason, encrypted traffic is rarely monitored and is allowed to flow freely in and out of the network environment. This creates obvious risks and largely negates security intelligence capabilities.

The Internet offers many articles on how to use Secure Shell to bypass corporate firewalls. This is a fairly common and clever workaround of policy that unfortunately creates a huge security risk. To eliminate this risk, the organization must decrypt and inspect the traffic.
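The bypass usually amounts to a single client command such as `ssh -D 1080 user@server.example.com` (server name illustrative), which turns any reachable SSH server into a SOCKS proxy through the firewall. For servers the organization itself controls, forwarding can be disabled server-side; the sketch below assumes OpenSSH, and does nothing about external servers - which is why perimeter inspection is still required:

```
# /etc/ssh/sshd_config (excerpt)
# Disable local, remote and dynamic port forwarding (-L, -R, -D).
AllowTcpForwarding no
# Disable tun/tap VPN-style tunnels.
PermitTunnel no
# Do not let remote forwardings bind non-loopback addresses.
GatewayPorts no
```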

Traffic Safety
Decrypting Secure Shell traffic requires an organization to use an inline proxy with access to the private keys - essentially a friendly man-in-the-middle - to decrypt the traffic without interfering with the network. When successfully deployed, 100 percent of encrypted traffic, for both interactive users and machine-to-machine (M2M) identities, can be monitored. Also, because this is done at the network level, it is not possible for malicious parties to execute a workaround. With this method, enterprises can proactively detect suspicious or out-of-policy traffic. This approach is called encrypted channel monitoring and represents the next generation in the evolution of PAM.

This kind of monitoring solves the issue of decrypting traffic at the perimeter and helps organizations move away from a gateway approach to PAM. At the same time, it prevents attackers from using the organization's own encryption technology against itself. In addition, an organization can use inline access controls and user profiling to control what activities a user can undertake. For example, policy controls can be enforced to forbid file transfers from certain critical systems. With the more advanced solutions, an organization can even block subchannels from running inside the encrypted tunnel, the preferred method of quickly exfiltrating data.
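With stock OpenSSH, part of such a policy can already be expressed per user group. The sketch below assumes a hypothetical `critical-ops` group and a hypothetical audit wrapper; it illustrates the idea but is not a substitute for full encrypted channel monitoring:

```
# /etc/ssh/sshd_config (excerpt)
Match Group critical-ops
    # Block tunneled subchannels inside the SSH session.
    AllowTcpForwarding no
    AllowAgentForwarding no
    X11Forwarding no
    # Force every session through an audited wrapper (hypothetical path).
    # Because the forced command replaces whatever the client requested,
    # this also blocks the sftp subsystem, i.e., file transfers.
    ForceCommand /usr/local/bin/audited-shell
```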

Encryption technologies are often set up without effective monitoring or proper access controls, which also blinds layered defenses. A major vulnerability could potentially compromise the entire server, which could in turn expose other areas of the network to subsequent attacks.

A Healthy Respect for Encryption
Encryption technology is everywhere: in applications, data centers and other foundation infrastructure. While it has been widely embraced, it has also often been abused, misused or neglected. Most organizations have not instituted centralized provisioning, encrypted channel monitoring and other best practices, even though the consequences of inadequate security can be severe. IT security staff may think conventional PAM is keeping their organizations safe, when commonly known workarounds are instead putting their data in jeopardy.

No one understands better than IT administrators how critical network security is. This understanding should spur security professionals to do all in their power to make their organizations' data as safe as possible. Given all that can go awry, they should examine their encrypted networks, enable layered defenses and put proactive monitoring in place if they have not yet done so. An all-inclusive encrypted channel monitoring strategy will go a long way toward securing the network.

More Stories By Jason Thompson

Jason Thompson is director of global marketing for SSH Communications Security. He brings more than 12 years of experience launching new, innovative solutions across a number of industry verticals. Prior to joining SSH, he worked at Q1 Labs, where he helped build awareness around security intelligence and holistic approaches to dealing with advanced threat vectors. Mr. Thompson holds a BA from Colorado State University and an MA from the University of North Carolina at Wilmington.

