
Switching the Locks: Who Has Copies of Your SSH Keys?

Organizations are constantly leaving themselves open to security breaches and noncompliance with federal regulations

Despite the recent flood of high-profile network breaches, hacking attempts are hardly new. In 1995, I was attending school in Helsinki when I discovered a password "sniffer" attack in our university network. In response, I wrote a program called the "secure shell" to safeguard information as it traveled from point to point within the network. This new program shielded all of our data and ensured that these kinds of attacks didn't jeopardize our logins.

This program, SSH, works by generating an encryption key pair - one key for the server and the other for the user's computer - and encrypting the data transferred between the two endpoints. Today, almost every major network environment - including those of large enterprises, financial institutions and governments - uses a version of SSH to protect data in transit and let administrators operate systems remotely. Organizations use SSH to encrypt everything from health records and logins to financial data and other personal information.
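The idea of two endpoints agreeing on an encryption key without ever sending it over the wire can be illustrated with a toy Diffie-Hellman computation. This is a sketch only: the numbers are deliberately small for readability, and real SSH key exchange uses standardized large groups plus authenticated host keys.

```python
# Toy Diffie-Hellman key agreement: two endpoints derive the same
# shared secret while transmitting only public values.
# (Illustration only - not how to implement real SSH key exchange.)

# Publicly agreed parameters (toy-sized; real groups are far larger)
p = 0xFFFFFFFB  # a prime modulus (2**32 - 5)
g = 5           # generator

# Each side keeps a private value secret...
server_private = 123456789
client_private = 987654321

# ...and sends only the derived public value over the network.
server_public = pow(g, server_private, p)
client_public = pow(g, client_private, p)

# Both sides now compute the identical shared secret independently,
# which can then be used to encrypt the data in transit.
server_secret = pow(client_public, server_private, p)
client_secret = pow(server_public, client_private, p)

assert server_secret == client_secret
print("shared secret established:", hex(server_secret))
```

An eavesdropper who sees only `server_public` and `client_public` cannot feasibly recover the shared secret, which is what lets SSH defeat the password-sniffing attacks described above.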

Management of Keys a Low Priority
Despite the fact that SSH keys safeguard extremely sensitive information, companies have been incredibly casual about managing SSH key generation, access and location throughout their network environments. It is as if a home security company made numerous copies of a person's house keys, scattered them in the street and never changed the lock. All that is needed to pick up one of these keys and use it to access encrypted data is interest, time and a little know-how.

Organizations constantly leave themselves open to security breaches and noncompliance with federal regulations by failing to be diligent about SSH key management. Many cannot control who creates keys, how many are created, or where the keys end up in the network after being issued - and those gaps expose them to network-wide attacks.

Swept Under the Rug
The issue has remained concealed within the IT department, shielded by its highly technical nature and by frequent organizational challenges. System administrators may not appreciate the full scope of the problem because they typically see only a small piece of their environment. On the business side, even if executives and managers recognize that there is an issue, they are usually too busy to evaluate its scope or possible implications.

SSH key mismanagement is as obscure as it is widespread. Through discussions with prominent governments, financial institutions and enterprises, we have determined that most companies have anywhere from eight to well over 100 SSH keys in their environments granting access to each Unix/Linux server. Some of these keys also permit high-level root access, leaving servers vulnerable to high-risk insiders. These insiders - anyone who has ever been given server access - can use mismanaged SSH keys to gain permanent access to production servers.

Mismanaged SSH Keys Give Viruses the Advantage
Each day, the probability increases of such a breach occurring. Attacks are becoming more prevalent and sophisticated, and news stories about network breaches are popping up daily. Using SSH keys as an attack vector in a virus is very easy, requiring only a few hundred lines of code. Once a virus secures successful entry, it can use mismanaged SSH keys to spread from server to server throughout the company.

Key-based access networks are so closely connected that it is extremely likely that a successful attack will travel through all organizational servers, especially if the virus also uses additional attack vectors to increase privileges to "root" after breaching a server. With the high number of keys being distributed, it is likely that the virus will infect nearly all servers within minutes, including disaster recovery and backup servers that are typically also managed using such keys.

In the worst-case scenario, a virus exploiting numerous attack vectors could spread Internet-wide and rapidly - and, combined with destructive payloads, could corrupt enormous quantities of data.

Industry Regulations Flouted
Organizations lacking proper SSH key management protocols are not only vulnerable to security breaches, they are also out of compliance with mandatory security requirements and laws. SOX, FISMA, PCI DSS and HIPAA all require controlling server access and being able to terminate that access. Additionally, companies may be disregarding internal security practices (in some cases, policies mandated by customers).

These risks are not created by the SSH protocol or its most commonly used implementations. Rather, they result from faulty processes around SSH keys, inadequate time and means to research the problem and develop solutions, a lack of understanding of the issue's implications, and the hesitancy of auditors to flag problems for which they have no solutions.

Clearly, the issue of improperly managed SSH keys cannot be glossed over forever. Without properly auditing, controlling, or terminating SSH key-based access to their IT systems and data, most healthcare providers, enterprises and government agencies are easy targets for an attacker.

Steps to Combat the Risks
Before a problem can be solved, it must be recognized as a legitimate issue. A remediation project may involve multiple IT teams and will require proper endorsement and support within the company.

There are multiple steps that make up the core of the remediation project:

  • Automating key setup and removal; eliminating manual work and human error, and reducing the number of administrators involved from hundreds to almost none.
  • Controlling what commands can be executed using each key and where the keys can be used from.
  • Enforcing proper protocols for establishing keys and other key operations.
  • Monitoring the environment in order to determine which keys are actively in use and removing keys that are no longer being used.
  • Rotating keys, i.e., switching out every authorized key (and corresponding identity keys) on a regular basis, so that any compromised (copied) keys stop working.
  • Unearthing all current trust-relationships (who has access to what).

The Future of Security
SSH remains the gold standard for data-in-transit security, but in the current threat landscape organizations must also address how SSH network access is managed.

Nearly all of the Fortune 500 and several prominent government agencies are inadvertently exposing themselves to major security threats from hackers or rogue employees because they continue to operate out of compliance. This problem cannot be solved overnight; it will take years and thousands of well-trained people to fully address. Responsibility must rest with the entire organization: time must be allotted, and proper management of SSH user keys must become a priority.

About the Author

Tatu Ylönen is the CEO and founder of SSH Communications Security. While working as a researcher at Helsinki University of Technology, he began working on a solution to combat a password-sniffing attack that targeted the university’s networks. What resulted was the development of the secure shell (SSH), a security technology that would quickly replace vulnerable rlogin, TELNET and rsh protocols as the gold standard for data-in-transit security.

Tatu has been a key driver in the emergence of security technology, including the SSH and SFTP protocols, and is co-author of globally recognized IETF standards. He has been with SSH since its inception in 1995, holding various roles including CEO, CTO and board member.

In October 2011 Tatu returned as chief executive officer of SSH Communications Security, bringing his experience as a network security innovator to SSH’s product line. He is charting an exciting new course for the future of the space that he invented.

Tatu holds a Master of Science degree from the Helsinki University of Technology.
