By Gerry Grealish
January 10, 2014 10:45 AM EST
Many news organizations, including The Washington Post, are reporting that the latest documents leaked by former NSA contractor turned whistleblower Edward Snowden show the NSA is in the early stages of building a quantum computer that could possibly crack most types of encryption. The NSA does disclose on its website that it is working on quantum computing technology; however, the status of that research was previously unknown. According to these new documents, the agency is pursuing a "cryptologically useful quantum computer" as part of a research program called "Penetrating Hard Targets," with the goal of using it to crack encryption.
With headlines that scream "NSA Secretly Funding Code-Breaking Quantum Computer Research," it's easy to see why many executives and enterprises are anxious and perhaps starting to lose faith in internet communications and transactions. Encryption is used to protect medical, banking, business and government records around the world. But, as many of the articles in the media point out, the reality today is that quantum computing remains a theoretical research topic and is many years away from being a usable real-world technology. The Washington Post article quotes Scott Aaronson, an associate professor of electrical engineering and computer science at the Massachusetts Institute of Technology: "It seems improbable that the NSA could be that far ahead of the open world without anybody knowing it."
As cryptography expert Bruce Schneier said in a piece in USA Today, "I worry a lot more about poorly designed cryptographic products, software bugs, bad passwords, companies that collaborate with the NSA to leak all or part of the keys, and insecure computers and networks. Those are where the real vulnerabilities are, and where the NSA spends the bulk of its efforts."
Mr. Schneier's comments reaffirm the importance of never locking in, irrevocably, to a single encryption algorithm or technique. An organization that loses faith in the integrity of a specific encryption algorithm after it has become core to many or all of the systems it runs would be in a very difficult position. The systems used to protect information, such as cloud encryption gateways, need to be flexible enough to do their job regardless of which encryption algorithms are in use. This design approach gives organizations the flexibility to swap algorithms in and out over time, based on their preferences, without impacting the core capabilities of the solutions built on these encryption modules.
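To make the point concrete, here is a minimal sketch of what that kind of algorithm agility can look like in code. It assumes a hypothetical EncryptionModule interface and registry, and uses the open-source Python cryptography package purely for illustration; it is not a description of any particular gateway product.

from abc import ABC, abstractmethod
from typing import Dict, Type

from cryptography.fernet import Fernet


class EncryptionModule(ABC):
    """Interface the rest of the system codes against."""

    @abstractmethod
    def encrypt(self, plaintext: bytes) -> bytes: ...

    @abstractmethod
    def decrypt(self, ciphertext: bytes) -> bytes: ...


# Algorithms are chosen by name from configuration, never hard-coded.
MODULES: Dict[str, Type[EncryptionModule]] = {}


def register(name: str):
    def decorator(cls):
        MODULES[name] = cls
        return cls
    return decorator


@register("fernet")  # a well-reviewed AES-CBC + HMAC recipe from the cryptography package
class FernetModule(EncryptionModule):
    def __init__(self, key: bytes):
        self._fernet = Fernet(key)

    def encrypt(self, plaintext: bytes) -> bytes:
        return self._fernet.encrypt(plaintext)

    def decrypt(self, ciphertext: bytes) -> bytes:
        return self._fernet.decrypt(ciphertext)


def build_module(config: dict) -> EncryptionModule:
    """Instantiate whichever algorithm the deployment's configuration names."""
    return MODULES[config["algorithm"]](key=config["key"])


if __name__ == "__main__":
    module = build_module({"algorithm": "fernet", "key": Fernet.generate_key()})
    protected = module.encrypt(b"patient record 4711")
    assert module.decrypt(protected) == b"patient record 4711"

If "fernet" ever falls out of favor, a new class is registered under a new name and the configuration value changes; the callers that encrypt and decrypt records are untouched.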
Even though quantum computers are years away, today's news is an important reminder that if you are using encryption to protect sensitive data, you should be using the strongest, most well-vetted techniques available. Bodies such as the National Institute of Standards and Technology (NIST) publish standards such as the Federal Information Processing Standards (FIPS) for use across the United States federal government. FIPS 140-2 is an information technology security accreditation program for validating that cryptographic modules produced by private-sector companies meet well-defined security requirements. When protecting sensitive and private information, organizations should look for strong, industry-acknowledged encryption approaches that meet accredited standards such as FIPS 140-2 and that have well-documented, third-party, peer-reviewed security proofs.
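As one illustration of "well-vetted," the short sketch below uses AES-256-GCM, a NIST-approved authenticated-encryption mode (SP 800-38D), via the widely reviewed Python cryptography package. One assumption to flag: whether the underlying crypto library build is actually FIPS 140-2 validated depends on the deployment environment, not on this code.

import os

from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# AES-256-GCM: authenticated encryption, so tampering is detected on decrypt.
key = AESGCM.generate_key(bit_length=256)
aead = AESGCM(key)

nonce = os.urandom(12)                 # unique 96-bit nonce per message
record = b"example sensitive record"
associated = b"record-id:42"           # authenticated but not encrypted

ciphertext = aead.encrypt(nonce, record, associated)

# Decryption verifies the authentication tag; a modified ciphertext raises InvalidTag.
assert aead.decrypt(nonce, ciphertext, associated) == record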
Also, enterprises should strongly consider the inherent strength of an alternative data protection technique known as tokenization, which has no keys to crack. Tokenization is a process by which a sensitive data field, such as a primary account number (PAN) from a credit or debit card, is replaced with a surrogate value called a token. De-tokenization is the reverse process of redeeming a token for its associated original value. While there are various approaches to creating tokens, they are typically just randomly generated values that have no mathematical relation to the original data field. The inherent security of tokenization is that it is nearly impossible to determine the original value of the sensitive data field from the surrogate token value alone. If a criminal gained access to the token (in a cloud environment, for example), there is no "quantum computer" that could ever decipher it back into its original form.
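Here is a minimal sketch of how a token vault works, assuming an in-memory store and random tokens generated with Python's secrets module. The class and field names are illustrative, and a production system would persist the vault, keep it inside the customer's network and enforce strict access controls.

import secrets


class TokenVault:
    """Maps random surrogate tokens back to the original sensitive values."""

    def __init__(self):
        self._token_to_value = {}

    def tokenize(self, value: str) -> str:
        # The token is pure randomness: it has no mathematical relationship
        # to the original value, so there is nothing to "crack".
        token = secrets.token_urlsafe(16)
        self._token_to_value[token] = value
        return token

    def detokenize(self, token: str) -> str:
        # Only a lookup against the vault can recover the original value.
        return self._token_to_value[token]


vault = TokenVault()
pan = "4111 1111 1111 1111"            # well-known example test PAN
token = vault.tokenize(pan)            # the surrogate is what leaves the network
assert vault.detokenize(token) == pan  # redemption happens back inside the network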
For more information on encryption, tokenization and retaining control over sensitive data in the cloud, please visit our resource center.
PerspecSys Inc. is a leading provider of cloud protection and cloud encryption solutions that enable mission-critical cloud applications to be adopted throughout the enterprise. Cloud security companies like PerspecSys remove the technical, legal and financial risks of placing sensitive company data in the cloud. PerspecSys accomplishes this for many large, heavily regulated companies across the world by never allowing sensitive data to leave a customer's network, while maintaining the functionality of cloud applications. For more information please visit / or follow on Twitter @perspecsys.