
Encryption in Use Deep Dive

What you need to know to secure and control your data

Encryption in Use – Fact and Fiction
Risk-conscious enterprises across the globe have been reluctant to embrace the public cloud model. For many, compliance requirements are the source of the reluctance. For others, the major hurdle is the concern about ceding control of their data to a cloud service provider that does not accept liability for customer data. Conforming to data residency regulations when implementing a distributed services model presents a further complication. Even as these challenges to adoption loom large, the economics and productivity benefits of cloud-based services remain compelling. For these organizations to make the transition to the cloud, a range of elements must be in place, including continuous monitoring of the cloud service provider's data center, enforcement of appropriate service level agreements, data classification and definition of internal processes to manage cloud-based services. Encryption in use is a critical piece of this puzzle, since it provides a mechanism for the enterprise to extend its boundary of control to data stored and processed within the cloud service provider's environment. However, not all encryption in use is created equal, or equally secure; a generic, one-size-fits-all approach is likely to fall short of balancing security and functionality.

The Case for Encryption in Use
For almost as long as the field of information security has been in existence, encryption of data at rest and encryption of data in transit have served as cornerstone technologies to prevent access to sensitive, proprietary, confidential or regulated data. Both forms of encryption operate through the exchange and presentation of a combination of public and private keys that unlock the encrypted data. The great step forward for modern cryptography was the idea that the key used to encrypt your data could be made public, while the key used to decrypt it is kept private. The purpose of both is to ensure that only users or systems with access to the right key can access the data.
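To make the public/private key idea concrete, here is a minimal sketch using Python's cryptography package (an illustrative choice of library, not one named in this article): anyone holding the public key can encrypt, but only the private-key holder can decrypt.

    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives import hashes

    # The data owner generates a key pair: the public key can be shared
    # freely, while the private key never leaves the owner's control.
    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    public_key = private_key.public_key()

    oaep = padding.OAEP(
        mgf=padding.MGF1(algorithm=hashes.SHA256()),
        algorithm=hashes.SHA256(),
        label=None,
    )

    # Anyone with the public key can produce ciphertext...
    ciphertext = public_key.encrypt(b"confidential record", oaep)

    # ...but only the private-key holder can recover the plaintext.
    assert private_key.decrypt(ciphertext, oaep) == b"confidential record"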

Encryption in use provides functionality that is almost counter-intuitive to the purpose behind modern encryption for data at rest and data in transit: it works to ensure that the data remains in an encrypted state even as users interact with it, performing operations like search or sort. However, just like encryption for other states of data, encryption in use serves a clear need. Without encryption in use, organizations cannot retain ownership and control of their data stored and processed in a cloud-based service – whether that control is required to address security, compliance, data residency, privacy or governance needs. Encryption in use is similar to format-preserving encryption in that it is applied in real time, but it allows for a far broader range of cloud service functionality and feature support.

Encryption in use enables enterprises to independently secure their data stored and processed at cloud service providers – while holding on to the encryption keys. The ongoing revelations of government surveillance, supported by laws compelling cloud service providers to hand over customer data, highlight the challenge end users face in meeting their obligation to retain direct control of their cloud data. The recent set of recommendations from the Review Group on Intelligence and Communications Technologies appointed by the White House, focused on implementing better privacy protections, is only a first step toward revisiting these policies.

Because encryption in use is an emerging area, the technology can be easily misunderstood – or even misrepresented. Typically, encryption in use entails a gateway, or proxy, architecture. The user accesses the application via the gateway, whether the application server is in the cloud or on premises. The key to decrypt the data resides in the gateway (or in an integrated HSM), ensuring that data stored and processed at the server is persistently encrypted, even as the encryption remains entirely transparent to the user. Were the user to access the server directly, bypassing the gateway, the data would simply appear as a string of encrypted gibberish. As long as the gateway remains under the data owner's control, only authorized users can gain access to the data stored and processed at the cloud service provider, or other third party.
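A minimal sketch of that gateway pattern follows, again in Python; the field names and the Fernet primitive are illustrative assumptions rather than a description of any particular product. The essential property is that the key lives only at the gateway, so the copy the cloud provider stores is ciphertext.

    from cryptography.fernet import Fernet

    # Hypothetical gateway state: the key is generated and held on the
    # data owner's premises and is never sent to the cloud provider.
    gateway_key = Fernet.generate_key()
    cipher = Fernet(gateway_key)

    SENSITIVE_FIELDS = {"ssn", "salary"}  # illustrative field names

    def outbound(record: dict) -> dict:
        """Encrypt sensitive fields before the record leaves for the cloud."""
        return {k: cipher.encrypt(v.encode()).decode() if k in SENSITIVE_FIELDS else v
                for k, v in record.items()}

    def inbound(record: dict) -> dict:
        """Decrypt on the return leg so the user sees plaintext transparently."""
        return {k: cipher.decrypt(v.encode()).decode() if k in SENSITIVE_FIELDS else v
                for k, v in record.items()}

    cloud_copy = outbound({"name": "Ada", "ssn": "078-05-1120"})
    # The provider stores only gibberish for "ssn"; reading the server
    # directly, bypassing the gateway, yields nothing intelligible.
    assert inbound(cloud_copy)["ssn"] == "078-05-1120"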

In the event that the cloud service provider is required to hand over customer data in response to a government subpoena, it must meet its legal obligation. However, if encryption in use has been implemented, the service provider can only hand over encrypted gibberish. The request for data must then be directed to the entity that holds the encryption keys. Likewise, a rogue administrator, a hacker or a government entity that gained access to the user account would only be able to view unintelligible gibberish.

Not Some Kind of Magic
In order to deliver on the promise of encryption in use, the gateway must meet a robust set of requirements: comprehensive service functionality and watertight security based on a strong encryption scheme. In practical terms, this means that the entirety of the service's functional elements and behavior must be mapped, and that the encryption scheme must preserve functionality without compromising security. This is because the gateway must recreate the session for the cloud-facing leg and transpose encrypted data into the flow without disrupting functionality like search, sort and index. Otherwise, the user experience is degraded, and the cloud-based service's value proposition of improving productivity is undermined.

Vendors face another set of choices: take shortcuts that provide a superficial sense of security, or invest in extensive R&D work to deliver the optimal balance between functionality and strong security. For instance, vendors can opt to encrypt just a few data fields, out of hundreds or even a few thousand, encompassing only a narrow subset of the enterprise's information. Equally, they can choose to implement a cloud data encryption scheme that preserves features relying on referential integrity – such as sort, search and index – but that is easily reversible by attackers.

By way of illustration, if the scheme involves deterministically encrypting words into very short AES blocks, the encoding pattern is consistent enough for common attacks to yield clear text from what might appear to be encrypted text. A variety of iterative attacks, such as chosen-plaintext attacks, will yield clear text if the encryption relies on a simplistic and consistent encoding pattern. So while the data may appear to be encrypted, and fewer engineering resources are required to support application features and functionality, the data protection in place is barely skin deep.
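The toy example below, my own illustration rather than a scheme named in this article, shows why: with deterministic single-block AES-ECB, identical words always encrypt to identical ciphertexts, so an attacker who can submit chosen plaintexts simply builds a codebook and reads the "encrypted" data without ever touching the key (words are assumed to fit in one 16-byte block).

    import os
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    key = os.urandom(16)

    def det_encrypt(word: bytes) -> bytes:
        # The weak scheme described above: deterministic, per-word AES-ECB,
        # with each word zero-padded into a single 16-byte block.
        enc = Cipher(algorithms.AES(key), modes.ECB()).encryptor()
        return enc.update(word.ljust(16, b"\x00")) + enc.finalize()

    # The same plaintext always yields the same ciphertext...
    assert det_encrypt(b"merger") == det_encrypt(b"merger")

    # ...so a chosen-plaintext attacker builds a lookup table and then
    # "decrypts" intercepted values by simple comparison, without the key.
    codebook = {det_encrypt(w): w for w in (b"merger", b"layoff", b"audit")}
    intercepted = det_encrypt(b"merger")
    print(codebook[intercepted])  # b'merger'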

Encryption in use is not a kind of magic – it requires dedicated engineering expertise and collaboration among infrastructure, information security and encryption experts. And the encryption scheme must be tailored to the specific application or service to deliver the appropriate balance of security and functionality.

Another significant consideration is evaluating encryption in use in the context of a specific application or service. From the customer's perspective, it is appealing to use a single encryption platform for multiple applications; no customer wants to manage multiple appliances, management interfaces and vendors. The reality, however, is that striking an acceptable balance between security and functionality for any risk-conscious organization requires deep application knowledge and encryption-in-use expertise. Dig a little deeper into the degree of support for each application, or risk gambling on production readiness: how well a feature is supported is as critical as how many features are supported.

Evaluating Encryption in Use Claims
Can enterprises rely on a standard validation for encryption in use? Precisely because encryption in use is a new area, third-party validation is a critical requirement before it is implemented in production environments. Unfortunately, the current set of standard validation and certification tests have limited applicability.

The third-party validation most frequently cited by vendors in the space is FIPS 140-2 validation. As critical as FIPS 140-2 validation is as an evaluation benchmark – and it is specifically required under some federal procurement mandates – it has limitations for encryption in use.

Taking a step back, it's important to note the scope of FIPS validation. The process essentially verifies that the cryptographic algorithms are implemented according to their defined specifications. However, it does not provide any validation of how the platform uses the cryptographic module to support encryption in use.

For instance, FIPS validation doesn't outline a set of best practices for how to use the cryptographic module. Instead, it verifies that whenever the system invokes AES encryption, the module performs AES encryption according to the standard specification. FIPS validation is limited to the cryptographic modules used, not the overall integrity of the platform or the encryption scheme used in production environments. While FIPS validation is an important consideration, enterprises should be aware of its limitations as the sole third-party validation for encryption. By way of a real-world analogy, validation would demonstrate that a $500 bicycle lock is impervious to lock-picking attempts – but when that lock is used to chain a bike to a fire hydrant, it does nothing to stop a thief from simply lifting the bike over the hydrant and carrying it away.

Hopefully this has been useful in helping you to determine the right approach your organization can take to secure and maintain control of your data. I look forward to hearing any further points I might have missed.

More Stories By Elad Yoran

Elad is Chairman and CEO of the cloud encryption company Vaultive. His nearly 20 years in the cyber security industry span experience as an executive, consultant, investor, investment banker and serial entrepreneur. Elad's entrepreneurial experience includes Riptech, the pioneering provider of managed security services to governments and Fortune 500 corporations around the world, acquired by Symantec Corporation; Sentrigo, a leading provider of database security, recently acquired by McAfee; and MediaSentry, a provider of anti-piracy technology solutions to the motion picture, music and software industries, acquired by SafeNet. Elad has also served as Vice President, Global Business Development at Symantec and as Vice President at Broadview International (acquired by Jeffries), an investment bank focusing on mergers and acquisitions in the technology industry, where he led the firm's information security practice. Elad has been recognized as "Entrepreneur of the Year" by Ernst & Young.
