By Jonathan Lewis
September 30, 2013 07:45 AM EDT
Identity and access management (IAM) solutions provide governance and visibility capabilities that enable organizations to provision and control access to their applications, cloud infrastructure, servers, and both structured and unstructured data. Enterprise IAM deployments are generally effective in managing the identities assigned to interactive, human users. However, a typical enterprise often has a far greater number of identities assigned to the automated processes that drive much of the computing in large-scale data centers. As enterprises adopt more and more process automation, the number of non-human identities continues to grow while the number of identities assigned to human users remains relatively flat or even declines. The net result is that enterprise IAM deployments ignore the much larger set of identities that actually perform most of the enterprise's computing functions.
The vast majority of the identities enabling machine-to-machine (M2M) processes use Secure Shell for authentication and authorization, and for a secure, encrypted channel for M2M data transfers. For example, an automated process that retrieves server log data requires an authenticated and authorized connection to each server, plus a secure channel to move the log data to a centralized processing application. Secure Shell is ideal for these functions because:
- Public key-based authentication, as supported by Secure Shell, enables the process to present its credentials without requiring an interactive user to log in via username and password - or via any other interactive authentication process.
- The public key authentication process used by Secure Shell protects the login credentials themselves: the private Secure Shell user key is never sent over the network.
- Secure Shell provides facilities to define and limit what functions a process may perform under a Secure Shell authorization, meeting the "need to know, need to do" criteria of basic IAM governance.
- Finally, Secure Shell provides for confidentiality of data in transit. Communications over a Secure Shell channel are encrypted.
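These properties can be seen in a minimal sketch of provisioning such an authorization. The hostnames, account names, key path and command below are illustrative assumptions, not taken from any real deployment:

```shell
# Generate a passphrase-less key pair so the automated process can
# authenticate without interactive input. (KEYDIR stands in for a
# protected directory such as /opt/logfetch in a real deployment.)
KEYDIR=$(mktemp -d)
ssh-keygen -t ed25519 -N "" -f "$KEYDIR/id_logfetch" -q

# On each target server, the PUBLIC key is appended to the process
# account's authorized_keys file. The options pin the client host and
# the single command the key may run ("need to know, need to do") and
# disable interactive features:
#
#   from="10.0.1.5",command="/usr/local/bin/dump-logs",no-pty,no-port-forwarding ssh-ed25519 AAAA... logfetch@batch01

# The process then moves log data over the encrypted channel; the
# private key never leaves the client machine:
#
#   scp -i "$KEYDIR/id_logfetch" logsvc@server01:/var/log/app.log /data/logs/
```

Only the key-generation step is executable as shown; the authorized_keys entry and the transfer appear as comments because the servers involved are hypothetical.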
In spite of these advantages, there are significant gaps in IAM governance of identities that use Secure Shell. Typically, the provisioning of these identities is decentralized: identities may be assigned by application developers, application owners and process owners. This often leads to a lack of proper control and oversight over the creation of identities and their authorizations. Without central management and visibility, enterprises cannot be sure how many Secure Shell identities have been created, what those identities are authorized to do and which authorizations are in fact no longer needed. The scope and nature of this problem are not theoretical. The typical enterprise server has between 8 and 100 Secure Shell authorizations (i.e., public Secure Shell user keys). This adds up: a large enterprise may have over one million keys, which in turn establish an even greater number of unmanaged M2M trust relationships.
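A first step toward visibility is simply counting what is already deployed. The sketch below tallies entries in per-account authorized_keys files under a base directory; the BASE variable and account layout are assumptions, and a real scan would also honor any AuthorizedKeysFile override in sshd_config:

```shell
# Count Secure Shell authorizations (non-blank, non-comment lines in
# authorized_keys files) per account and in total. The line count also
# captures entries carrying options such as from="..." or command="...".
BASE=${BASE:-/}   # filesystem root by default; must end in "/"
total=0
for f in "${BASE}root/.ssh/authorized_keys" "${BASE}home"/*/.ssh/authorized_keys; do
    [ -f "$f" ] || continue
    n=$(grep -cvE '^[[:space:]]*(#|$)' "$f")
    printf '%s\t%d key(s)\n' "$f" "$n"
    total=$((total + n))
done
printf 'TOTAL\t%d authorization(s)\n' "$total"
```

Run across a server estate, the per-host totals make the "8 to 100 authorizations per server" figure directly verifiable for your own environment.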
The Challenge of Ubiquitous Encryption
While many in IT security use Secure Shell to securely access remote servers, most are surprised to discover that M2M communication makes up the majority of all Secure Shell traffic on their network - in some cases over 90%. The vast majority of Secure Shell trust relationships provide access to production servers and carry high-value payloads, including credit card information, healthcare records, national secrets, intellectual property and other highly critical information.
Shockingly, access to M2M encrypted channels via Secure Shell, which uses keys to authenticate a non-human user, almost always lacks proper IAM controls, creating a huge risk and compliance issue for most enterprises. Any interactive user who holds the proper credential - in the case of Secure Shell, a simple copy of the private key file - can hijack these uncontrolled M2M networks. This means that, in many cases, the most valuable information in the enterprise has the least protection from unauthorized access.
Most large organizations have between 100,000 and well over a million of these keys in their network environments. Even though the keys grant access to critical systems and servers, many have never been changed. More incredibly, many organizations have no process for approving and enforcing who can grant permanent access to servers using these keys. One study at a large bank with over one million keys in use found that 10 percent of those keys granted unlimited administrative ("root") access to production servers - a grave security risk.
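The "never been changed" observation is easy to spot-check. One rough proxy, sketched below under the assumption that keys live in the default per-account locations and that GNU find is available, is to look for authorized_keys files untouched for more than a year:

```shell
# List authorized_keys files not modified in over a year -- a rough
# indicator of authorizations that have never been rotated. The
# search roots are illustrative.
find /home /root -name authorized_keys -mtime +365 -print 2>/dev/null
```

File modification time is only a heuristic - an entry can be re-added to a fresh file - but it surfaces obvious candidates for rotation or revocation.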
The lack of security controls - coupled with the high value of the data Secure Shell protects - has made it a target for hackers. A recent IBM X-Force study found that most attacks against Linux/Unix servers use stolen or lost Secure Shell keys as a threat vector. Because many keys are deployed in one-to-many relationships, a single compromised key could have a cascading effect across a large swath of the network environment.
In an ironic twist, the very function that shields sensitive data in transit from prying eyes also prevents systems administrators from seeing whether information is being accessed improperly with a stolen Secure Shell key. All data-in-transit encryption, including Secure Shell, blinds layered security defense systems to malicious activity, whether it originates from a hacker, a trusted insider, a business partner or outsourced IT. This means that unless the enterprise has deployed encrypted channel monitoring, security operations and forensics teams cannot see what is happening inside the encrypted channel. Encrypted channel monitoring enables security intelligence and DLP solutions to inspect, store and - if need be - stop traffic to ensure that hackers or malicious insiders cannot use Secure Shell encryption to spirit away information in an undetectable and untraceable manner. This way, the network administrator can track what a user is doing inside the encrypted channel without exposing the data in the clear during transmission.
Evolving Standards to Include Other Authentication Methods
Driven both by hacker attacks and by security compliance mandates, many enterprises are bolstering interactive user authentication methods, including enforcing password strength, requiring periodic password changes and implementing two-factor authentication. These methods are designed to confound hacker attempts to access interactive accounts through brute-force attacks, lost or stolen passwords, or spoofed credentials. These approaches are now considered best practices and are enshrined in compliance requirements such as PCI, HIPAA, FISMA and SOX.
Compliance bodies are now updating their regulations to specifically include methods of authentication beyond user names and passwords - such as certificates and keys - in their regulatory language. This means that auditors will be required to flag instances where Secure Shell access is not being properly controlled. This is a natural progression for compliance mandates, arriving at a time when the market is beginning to recognize that strong standards are required to ensure the safety of the enterprise's most critical business information.
To provide the highest levels of security and accountability, it is in the organization's best interest to research, design and deploy an IAM strategy that includes processes designed specifically for M2M communications. A comprehensive, best practices-based IAM program that includes provisions for Secure Shell-based M2M security must address both the provisioning and intelligence aspects of IAM across large, complex and heterogeneous environments.
Best practices based Secure Shell key management enables strong authentication practices, including:
- Restricting root access to servers so that only the key manager can provision or revoke keys
- Automated key creation, rotation and removal
- Discovery and continuous monitoring of trust relationships and unauthorized key deployments and removals
- Enforcing proper key type and size, and the version of Secure Shell in use
- Controlling where each key can be used from and what commands can be executed using the key
- Monitoring traffic in encrypted channels
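The key type and size check in the list above can be done with stock OpenSSH tooling. The sketch below assumes a modern ssh-keygen whose -lf option prints one "bits fingerprint comment (type)" line per authorized_keys entry; the KEYFILE path is a hypothetical example:

```shell
# Flag weak entries in an authorized_keys file: RSA keys shorter than
# 2048 bits, and any DSA key. KEYFILE is an illustrative path.
KEYFILE=${KEYFILE:-/home/logsvc/.ssh/authorized_keys}
ssh-keygen -lf "$KEYFILE" |
    awk '(/\(RSA\)/ && $1 < 2048) || /\(DSA\)/ {print "WEAK:", $0}'
```

A full key-management deployment would apply such a policy centrally rather than per file, but the same fingerprint data is what discovery tooling works from.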
In an environment where ever-increasing numbers of users, devices and machines are connected to the Internet and the company network, ensuring that the enterprise's IAM strategy includes strong Secure Shell access controls for M2M communications is mission-critical. While ubiquitous encryption offers clear network security benefits, left unmanaged it can present a significant threat to the business. IT security, compliance and audit professionals must begin the process of addressing Secure Shell access control and governance issues. The absence of such controls creates security vulnerabilities and can cause an organization to run afoul of compliance mandates, resulting in the risk of fines and other liabilities. By critically examining the organization's Secure Shell environment, IT teams can uncover and address the M2M access control issues that lie beneath the surface.