Encryption in Use – Fact and Fiction
By Elad Yoran
May 19, 2014 10:00 AM EDT
Risk-conscious enterprises across the globe have been reluctant to embrace the public cloud model. For many, compliance requirements are the source of the reluctance. For others, the major hurdle is ceding control of their data to a cloud service provider that accepts no liability for customer data. Conforming to data residency regulations while implementing a distributed services model presents a further complication. Even as these challenges to adoption loom large, the economics and productivity benefits of cloud-based services remain compelling.

For these organizations to make the transition to the cloud, a range of elements must be in place, including continuous monitoring of the cloud service provider’s data center, enforcement of appropriate service level agreements, data classification and definition of internal processes to manage cloud-based services. Encryption in use is a critical piece of this puzzle, since it provides a mechanism for enterprises to extend their boundary of control to data stored and processed within the cloud service provider’s environment. However, not all encryption in use is created equal, and a generic, one-size-fits-all approach is likely to fall short of balancing security and functionality.
The Case for Encryption in Use
For almost as long as the field of information security has existed, encryption of data at rest and encryption of data in transit have served as cornerstone technologies to prevent access to sensitive, proprietary, confidential or regulated data. Both forms of encryption rely on keys – symmetric keys, or combinations of public and private keys – that lock and unlock the data. The great step forward for modern cryptography was the idea that the key used to encrypt your data could be made public while the key used to decrypt it could be kept private. In both cases, the purpose is to ensure that only users or systems with access to the key can access the data.
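To make the public/private distinction concrete, here is a minimal sketch in Python using the third-party cryptography package (my choice of tooling for illustration; the article itself names none). Data encrypted with the public key can be recovered only by the holder of the private key.

```python
# Sketch: public-key encryption, assuming the Python "cryptography" package.
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

# The private key stays with the data owner; the public key can be shared.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

oaep = padding.OAEP(
    mgf=padding.MGF1(algorithm=hashes.SHA256()),
    algorithm=hashes.SHA256(),
    label=None,
)

# Anyone holding the public key can encrypt...
ciphertext = public_key.encrypt(b"confidential record", oaep)

# ...but only the private-key holder can decrypt.
plaintext = private_key.decrypt(ciphertext, oaep)
assert plaintext == b"confidential record"
```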
Encryption in use provides functionality that is almost counter-intuitive relative to the purpose of encryption for data at rest and data in transit: it ensures that the data remains in an encrypted state even as users interact with it, performing operations like search or sort. However, just like encryption for other states of data, encryption in use serves a clear need. Without it, organizations cannot retain ownership and control of their data stored and processed in a cloud-based service – whether control is required to address security, compliance, data residency, privacy or governance needs. Encryption in use is similar to format preserving encryption in that it is applied in real time, but it allows for a far broader range of cloud service functionality and feature support.
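As a hedged illustration of how a service can search data it cannot read, the sketch below uses deterministic encryption (AES-SIV from the Python cryptography package, an assumption chosen for illustration, not any particular vendor's scheme): the same plaintext always maps to the same ciphertext, so the server can match an encrypted query against encrypted records by simple byte comparison.

```python
# Sketch: equality search over deterministically encrypted values.
# AES-SIV used without a nonce is deterministic: equal plaintexts yield
# equal ciphertexts, so the server can match rows without decrypting.
from cryptography.hazmat.primitives.ciphers.aead import AESSIV

key = AESSIV.generate_key(512)   # stays in the gateway, never at the provider
siv = AESSIV(key)

def seal(value: str) -> bytes:
    return siv.encrypt(value.encode(), None)

# What the cloud provider stores: ciphertext only.
stored = [seal("alice@example.com"), seal("bob@example.com")]

# The gateway encrypts the query term the same way; the server merely
# compares opaque byte strings and still finds the matching record.
query = seal("bob@example.com")
matches = [row for row in stored if row == query]
assert len(matches) == 1
```

The trade-off is the one this article returns to below: the determinism that enables server-side matching is also what can leak patterns to anyone who observes the stored ciphertexts, so a production scheme needs considerably more care than this sketch.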
Encryption in use enables enterprises to independently secure their data stored and processed at cloud service providers – while holding on to the encryption keys. The ongoing revelations of government surveillance, supported by laws compelling cloud service providers to hand over customer data, highlight the challenge end users face in meeting their obligation to retain direct control of their cloud data. The recent recommendations on stronger privacy protections from the Review Group on Intelligence and Communications Technologies appointed by the White House are only a first step in revisiting these policies.
Because encryption in use is an emerging area, the technology can easily be misunderstood, or even misrepresented. Typically, encryption in use entails a gateway, or proxy, architecture. The user accesses the application via the gateway – whether the application server is in the cloud or on premise. The key to decrypt the data resides in the gateway (or in an integrated HSM), ensuring that data stored and processed at the server is persistently encrypted, even as the encryption remains entirely transparent to the user. Were the user to access the server directly, bypassing the gateway, the data would simply appear as a string of encrypted gibberish. As long as the gateway remains under the data owner’s control, only authorized users can gain access to the data stored and processed at the cloud service provider, or other third party.
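A heavily simplified sketch of that gateway pattern, assuming Python and the cryptography package's Fernet construction for brevity (the field names and record shapes are hypothetical): sensitive fields are encrypted on the way out and decrypted on the way back, so the provider only ever holds ciphertext.

```python
# Sketch of the gateway/proxy pattern. The key lives only in the gateway;
# the cloud service stores and returns ciphertext it cannot read.
from cryptography.fernet import Fernet

KEY = Fernet.generate_key()          # in practice: held in the gateway or an HSM
fernet = Fernet(KEY)

SENSITIVE_FIELDS = {"name", "ssn"}   # hypothetical field list

def outbound(record: dict) -> dict:
    """Encrypt sensitive fields before the request leaves the enterprise."""
    return {
        k: fernet.encrypt(v.encode()).decode() if k in SENSITIVE_FIELDS else v
        for k, v in record.items()
    }

def inbound(record: dict) -> dict:
    """Decrypt sensitive fields in responses before they reach the user."""
    return {
        k: fernet.decrypt(v.encode()).decode() if k in SENSITIVE_FIELDS else v
        for k, v in record.items()
    }

# What the provider sees: gibberish in the protected fields.
sent = outbound({"name": "Ada Lovelace", "ssn": "078-05-1120", "plan": "pro"})
assert sent["plan"] == "pro" and sent["ssn"] != "078-05-1120"
# What the authorized user sees after the gateway decrypts the response.
assert inbound(sent)["name"] == "Ada Lovelace"
```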
In the event that the cloud service provider is required to hand over customer data in response to a government subpoena, it must meet its legal obligation. However, if encryption in use has been implemented, the service provider can only hand over encrypted gibberish. The request for data must then be directed to the entity that holds the encryption keys. Likewise, a rogue administrator, a hacker or a government entity that gained access to the user account would only be able to view unintelligible gibberish.
Not Some Kind of Magic
In order to deliver on the promise of encryption in use, the gateway must meet a robust set of requirements: comprehensive service functionality and watertight security based on a strong encryption scheme. In practical terms, this means that the entirety of the service’s functional elements and behavior must be mapped, and that the encryption scheme must preserve functionality without compromising security. This is because the gateway must recreate the session for the cloud-facing leg and transpose encrypted data into the flow without disrupting functionality like search, sort and index. Otherwise, the user experience is degraded, and the productivity value proposition of the cloud-based service is undermined.
Vendors face another set of choices: take shortcuts that provide a superficial sense of security, or invest in extensive R&D work to deliver the optimal balance between functionality and strong security. For instance, vendors can opt to encrypt just a few data fields, out of hundreds or even a few thousand, covering only a narrow subset of the enterprise’s information. Equally, they can implement a cloud data encryption scheme that preserves features relying on referential integrity – such as sort, search and index – but that is easily reversible by attackers.
By way of illustration, if the scheme deterministically encrypts individual words into short, independent AES blocks, the encoding pattern is consistent enough for common attacks to recover clear text from what appears to be encrypted data. A variety of iterative attacks, such as chosen plaintext attacks, will yield clear text when the encryption relies on a simplistic, consistent encoding pattern. So while the data may appear to be encrypted, and fewer engineering resources are required to support application features and functionality, the data protection in place is barely skin deep.
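The weakness is easy to demonstrate. The hedged sketch below uses AES in ECB mode as the textbook deterministic construction (the article does not say which construction any given vendor uses): identical plaintexts produce identical ciphertexts, so an attacker who can submit chosen plaintexts can build a dictionary and read the "encrypted" column back out.

```python
# Sketch: why deterministic block encryption leaks patterns (AES-ECB as the
# textbook example). Identical plaintexts yield identical ciphertexts.
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

key = os.urandom(32)

def ecb_encrypt(word: bytes) -> bytes:
    # Pads each word to one 16-byte block; words longer than 16 bytes are
    # out of scope for this sketch.
    enc = Cipher(algorithms.AES(key), modes.ECB()).encryptor()
    return enc.update(word.ljust(16, b"\0")) + enc.finalize()

# An observer sees repeats without knowing the key...
assert ecb_encrypt(b"approved") == ecb_encrypt(b"approved")

# ...and a chosen-plaintext attacker can build a lookup table: encrypt
# every likely word through the system, then match stored ciphertexts.
dictionary = {ecb_encrypt(w): w for w in [b"approved", b"denied", b"pending"]}
stolen_ciphertext = ecb_encrypt(b"denied")        # as seen in the cloud DB
assert dictionary[stolen_ciphertext] == b"denied"
```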
Encryption in use is not a kind of magic – it requires dedicated engineering expertise, with collaboration between infrastructure, information security and encryption experts. And the encryption scheme must be tailored to a specific application or service to deliver the appropriate balance of security and functionality.
Another significant consideration is evaluating encryption in use in the context of a specific application or service. From the customer’s perspective, a single encryption platform for multiple applications is appealing: no customer wants to manage multiple appliances, management interfaces and vendors. The reality, however, is that striking an acceptable balance between security and functionality for any risk-conscious organization requires deep application knowledge along with encryption-in-use expertise. Prospective buyers should dig deeper into the degree of support for each application, or risk gambling on production readiness; the depth of support is as critical as its breadth.
Evaluating Encryption in Use Claims
Can enterprises rely on a standard validation for encryption in use? Precisely because encryption in use is a new area, third-party validation is a critical requirement before it is deployed in production environments. Unfortunately, the current set of standard validation and certification tests has limited applicability.
The third-party validation most frequently cited by vendors in the space is FIPS 140-2 validation. As critical as 140-2 validation is as an evaluation benchmark – and it is specifically required under some federal procurement mandates – it has limitations for encryption in use.
Taking a step back, it’s important to note the scope of FIPS validation. The process essentially verifies that the cryptographic algorithms are implemented according to their defined specifications. It does not, however, validate how the platform uses the cryptographic module to support encryption in use.
For instance, FIPS validation doesn’t prescribe a set of best practices for how to use the cryptographic module. Instead, it verifies that whenever the system invokes AES encryption, the module performs AES encryption according to the standard specification. FIPS validation is limited to the cryptographic modules used – not the overall integrity of the platform, or the encryption scheme used in production environments. While FIPS validation is an important consideration, enterprises should be aware of its limitations as the sole third-party validation for encryption. As a real-world analogy, validation would demonstrate that a $500 bicycle lock is impervious to any lock-picking attempt, but when the lock is used to chain a bike to a fire hydrant, it does nothing to stop a thief from simply lifting the bike over the hydrant and carrying it away.
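The bicycle-lock analogy can be made concrete in code. In the sketch below (my illustration, not part of any FIPS test suite), AES-GCM is a sound, validatable primitive, yet a platform-level decision to hard-code the nonce silently destroys confidentiality; module validation would pass, and the deployment would still be broken.

```python
# Sketch: a validated primitive (AES-GCM) misused at the platform level.
# Reusing a nonce with GCM leaks the XOR of the two plaintexts, even though
# the AES implementation itself is perfectly correct.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)
aes = AESGCM(key)

FIXED_NONCE = b"\x00" * 12        # the platform bug: a hard-coded nonce

c1 = aes.encrypt(FIXED_NONCE, b"wire $900 to acct A", None)
c2 = aes.encrypt(FIXED_NONCE, b"wire $111 to acct B", None)

# Keystream reuse: XOR of the ciphertexts equals XOR of the plaintexts, so
# knowing (or guessing) one message reveals the other, no key required.
xor = bytes(a ^ b for a, b in zip(c1, c2))
p1 = b"wire $900 to acct A"
recovered = bytes(a ^ b for a, b in zip(xor, p1))
assert recovered == b"wire $111 to acct B"
```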
Hopefully this has been useful in helping you to determine the right approach your organization can take to secure and maintain control of your data. I look forward to hearing any further points I might have missed.