Adaptive Two-Factor Authentication: Is It All It’s Cracked up to Be? | @CloudExpo #Cloud

Adaptive authentication works by granting users access using just their user name and password if they are in a trusted location

It's a given that employee access to corporate systems should be both as secure and as simple as possible. Until recently, however, time-strapped CIOs, under pressure from demanding staff and challenged with authenticating users all over the world on multiple devices, have been torn between the fatally flawed password and hard-token two-factor authentication (2FA) to keep their systems secure.

Recently, adaptive authentication has gained in popularity, as it reduces the time it takes to log in by verifying a user based on their location. But is this the best solution?

Adaptive authentication works by granting users access using just their user name and password if they are in a trusted location. Although in theory this process makes it easier for a user to authenticate their identity, there are several issues with this technology, which many may not realise.
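
In practice, that adaptive decision boils down to a simple branch: password alone from a trusted location, full 2FA from anywhere else. The Python sketch below is purely illustrative; the user store, trusted-location list and passcode check are hypothetical placeholders rather than any vendor's actual implementation.

# Minimal sketch of the adaptive-authentication branch described above.
# Everything here (user store, trusted-location set, OTP check) is a
# hypothetical placeholder, not a real product's API.

USERS = {"alice": {"password": "s3cret", "otp": "492817"}}
TRUSTED_LOCATIONS = {"head-office", "alice-home"}

def authenticate(user, password, location, otp=None):
    record = USERS.get(user)
    if record is None or record["password"] != password:
        return False                                     # first factor always required
    if location in TRUSTED_LOCATIONS:
        return True                                      # adaptive shortcut: password only
    return otp is not None and otp == record["otp"]      # full 2FA everywhere else

# Trusted location: password alone is enough
assert authenticate("alice", "s3cret", "head-office")
# Untrusted location: the one-time passcode is also required
assert not authenticate("alice", "s3cret", "hotel-wifi")
assert authenticate("alice", "s3cret", "hotel-wifi", otp="492817")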

From an end-user perspective, what matters is speed, stability and consistency in the log-in method. If a user attempts to verify their identity using adaptive authentication in a non-trusted location, they will be asked to complete the full 2FA process. This requires them to enter a code generated on a soft or hard token, depending on the technology used. If the user goes through the full process less than once a week, however, they are likely to run into complications - forgetting how the process works, or even forgetting to bring their hard token if one is required.

Despite this, a small number of users, usually those who are based at home but travel frequently, log in at multiple locations, multiple times a day. In this instance, adaptive authentication can prove its value, as a user will feel the benefit of a fast login process in trusted locations, whilst using the full 2FA process frequently enough to be familiar with it.

There are three ways of achieving adaptive authentication, and it's important for CIOs to consider the differences.

The first is to detect a user's geo position via their IP address. The process, called GeoIP, has its own issues. Internet service providers often change the IP addresses of private users to prevent them from running their own servers at home. This means that when an IP address is switched, a user's location could appear to be somewhere 200 miles away, flagging them as now being in an untrusted location. Meanwhile, the home of the user now assigned to the old IP address has suddenly become a trusted site.

Corporate offices and buildings will usually have just one external IP address and several internal addresses. These internal addresses are not visible externally, which means that once a corporate location is marked as trusted, anyone in that office is identified as a trusted user.
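
A minimal sketch of an IP-based trust check makes the problem easy to see: because the whole office sits behind one external address, trusting that address trusts everyone in the building. The addresses and ranges below are illustrative examples only, standing in for a real GeoIP database or trusted-site list.

# Hypothetical sketch of an IP-based trust check. The trusted range below
# stands in for a real GeoIP database or per-site configuration; as noted
# above, ISPs can reassign addresses at any time.

import ipaddress

# One external address (or small range) per trusted corporate site
TRUSTED_NETWORKS = [ipaddress.ip_network("203.0.113.0/29")]   # documentation range, illustrative

def is_trusted_ip(client_ip: str) -> bool:
    addr = ipaddress.ip_address(client_ip)
    return any(addr in net for net in TRUSTED_NETWORKS)

# Everyone behind the office NAT presents the same external address,
# so the whole office becomes "trusted" at once:
print(is_trusted_ip("203.0.113.4"))   # True  -> password only
print(is_trusted_ip("198.51.100.7"))  # False -> fall back to full 2FA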

The second method is to use GPS location. This method requires an app to be installed on a user's mobile device. Whilst this is far more precise, it means employers can track the location of their staff whenever the device is on, raising serious questions about privacy.
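
Conceptually, the check is just a distance calculation between the coordinates the app reports and a list of trusted sites. The sketch below uses made-up coordinates and an arbitrary radius, and is intended only to show why the same data that proves presence also exposes location.

# Rough sketch of a GPS-based check: the device app reports coordinates and
# the server compares them with trusted sites. Coordinates and radius are
# illustrative only.

from math import radians, sin, cos, asin, sqrt

TRUSTED_SITES = [(51.5074, -0.1278)]    # e.g. a London office (illustrative)
RADIUS_KM = 0.5                         # how close counts as "on site"

def haversine_km(lat1, lon1, lat2, lon2):
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))     # Earth radius ~6371 km

def is_trusted_position(lat, lon):
    return any(haversine_km(lat, lon, t_lat, t_lon) <= RADIUS_KM
               for t_lat, t_lon in TRUSTED_SITES)

# The precision is the privacy problem: the data that answers "is the user
# on site?" also reveals exactly where they are when they are not.
print(is_trusted_position(51.5076, -0.1280))  # True
print(is_trusted_position(48.8566, 2.3522))   # False (Paris)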

The future of adaptive authentication is to use the local base transceiver station's GSM Cell ID to identify the location of a user's mobile phone, and therefore verify their identity. With this method, neither the organisation nor the two-factor authentication security provider knows the location of the user. Instead, the security provider sends a request to the mobile operator asking whether the user's mobile phone is within a trusted cell. The operator simply comes back with a yes or a no, never revealing which cell the phone is in. If the answer is no, the user will be prompted to sign in using the full 2FA method.

Individuals already trust their mobile network provider to keep their location secure, and this way their location data never leaves its sphere of trust.
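
The privacy benefit comes from the shape of the query: the security provider only ever receives a yes-or-no answer. The sketch below simulates the operator side with a local lookup table purely for illustration; in reality the check happens inside the mobile operator's network and the raw cell ID never leaves it.

# Hypothetical sketch of the cell-ID check described above. The operator-side
# lookup is simulated locally; a real deployment would call the mobile
# operator and never see which cell the phone is actually in.

TRUSTED_CELLS = {"alice": {"234-15-0043-1187"}}        # per-user trusted cell IDs (illustrative)
OPERATOR_VIEW = {"alice": "234-15-0043-1187"}          # only the operator knows the current cell

def operator_is_in_trusted_cell(user: str, trusted_cells: set) -> bool:
    """Runs on the operator side: answers yes or no, never reveals the cell."""
    return OPERATOR_VIEW.get(user) in trusted_cells

def adaptive_login(user: str, password_ok: bool) -> str:
    if not password_ok:
        return "deny"
    if operator_is_in_trusted_cell(user, TRUSTED_CELLS.get(user, set())):
        return "allow"          # trusted cell: password only
    return "require-2fa"        # untrusted or unknown: full 2FA

print(adaptive_login("alice", password_ok=True))   # allow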

The ultimate solution for both the CIO and the end-user's needs is an authentication method that's so quick, simple and secure, there's no need for it to be adaptive. Near Field Communication (NFC)-based mobile authentication, for example, can securely transfer all the information required to enable a browser to start up, connect to the required URL, and then automatically enter the user ID, password and second-factor passcode in one seamless logon.
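
In outline, the tap hands the browser everything it needs in a single payload. The sketch below only models that data flow; the field names and helper function are hypothetical, and a real implementation would sit on the platform's NFC stack and protect the credentials in transit rather than pass them in plaintext.

# Hypothetical model of the data handed over during an NFC tap. Field names
# are illustrative; a real client would use the platform NFC stack and
# encrypt the payload rather than handle plaintext secrets.

from dataclasses import dataclass

@dataclass
class NfcLogonPayload:
    url: str            # where the browser should go
    user_id: str
    password: str
    passcode: str       # second-factor one-time code

def perform_logon(payload: NfcLogonPayload) -> str:
    # In a real client this would launch the browser at payload.url and
    # submit all three values in one seamless logon; here we just describe it.
    return (f"open {payload.url} and sign in as {payload.user_id} "
            f"with password + one-time code {payload.passcode}")

print(perform_logon(NfcLogonPayload(
    url="https://portal.example.com/login",
    user_id="alice",
    password="s3cret",
    passcode="492817",
)))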

NFC isn't just limited to mobile phones either. Wearable technology, highly personal in nature, can also be utilised, enabling users to authenticate using their smart watch by simply tapping their wrists against a corresponding device.

This effectively creates a solution that's even quicker than entering a simple user name and password. The CIO is then safe in the knowledge that their end points are covered, and the user is happy authenticating their way. So the question remains: is adaptive authentication as secure as it should be, and everything it's cracked up to be?

More: http://www.scmagazineuk.com/

More Stories By Steve Watts

Steve Watts is co-founder of SecurEnvoy. He brings 25 years of industry experience to his role at the helm of Sales & Marketing for SecurEnvoy. He founded the company with Andrew Kemshall in 2003 and still works tirelessly to grow the company in new and established markets. His particular value is market and partner strategy: he assisted in the development and design of the products, and designed the pricing strategy and recurring revenue model that have been so key to the business's growth and success.

Before starting SecurEnvoy, Steve set up nonstop IT, the UK's first IT security reseller, in 1994. Prior to setting out on his own, he worked as Sales Director at the networking and IT division of Comtec, having started his career in office solution sales in 1986.

Outside of work, Steve is a keen rugby fan. He also enjoys sailing, mountain biking, golf and skiing.
