XML & Security

As organizations increase their use and adoption of XML for their documents and data, the issue of security grows in both visibility and potential for confusion. For application developers who want to leverage the power of XML to describe documents and share organizational data, these security issues can be difficult to understand and even more difficult to address.

An organization's concerns about the security of its documents and data are often made more difficult by a lack of understanding of what "secure documents" or "digital signatures" actually mean. By understanding the fundamentals of security, especially as it relates to XML-based documents and data, an application developer can make the implementation of security appropriate to the organization's requirements and greatly simplify the actual development process.

Security Fundamentals
When we think of security, many images come to mind - a bank, a fortress, armed guards. While accurate, these types of images mislead organizations about the core elements required for effective and efficient security. All security-related needs can be summarized in two statements:

  • Someone is trying to take things away from us.
  • We place value on those things.

These two statements are critical to understanding what an organization means when it requests "secure documents" or any other type of security. To put effective security measures in place, we must assume that someone with malicious intent is actively trying to take away the things we value, and that if they succeed we'll be adversely affected. If we don't make this assumption, we don't need security for anything: if there's no threat, security isn't required, and if we place no value on the things we're trying to protect, security isn't required. These statements are also valuable in two other ways. First, they form the foundation of the initial questions an organization must ask about the documents and data it wishes to protect:

  • Who are we to protect against?
  • Are they internal or external or both?
  • What are the characteristics of these people?
  • What means might they use to take our information away from us?
  • What will they do with the information?
  • How important is the information?
  • What damage will be caused by the loss of this information?

The second way these two statements provide value is that they allow us to fit the level and type of protection we use to the nature of the threat. This concept is critical, and it's the most overlooked aspect of the applications we develop.

Matching the Threat
Consider how you protect your house. Most people have locks on their doors. Some have bars on their windows; some have security systems. Some communities have walls, gates, and security patrols. Each measure is appropriate to the value and the threat. Homeowners decide the level of security they wish to implement.

For documents and data the threats can come in four areas:

  • Privacy: The need to ensure that the document can be read or used only by the appropriate individuals or applications
  • Integrity: The need to ensure that the data hasn't changed between the sender and the recipient
  • Authentication: The need to ensure that persons or applications sending a document are actually who they claim to be and that they're authorized to send it
  • Nonrepudiation: The need to ensure that senders of a document can't claim they didn't send it, that is, they can't repudiate the document or data

Consider sending a letter using the postal service as an example of these four concepts.

You put your letter in an envelope and seal it to ensure that your message is private. No one can read it without tampering with the envelope, which would alert you that your message is no longer private. You trust the postal service as a channel that will respect your privacy while handling your letter.

The envelope also provides a level of integrity. Its condition signals whether your letter has been tampered with. A handwritten letter provides an additional layer of integrity in that the recipient can distinguish your writing from that of someone trying to modify the contents of your message.

Authentication and nonrepudiation are provided in a number of ways. First and foremost is your signature at the bottom of the letter. The return address also confirms that the letter came from you and makes it difficult to deny that you sent it. The postal service helps with a date and time stamp - the postmark - that proves when it was sent and from what location. And a final measure of nonrepudiation can be obtained from an analysis of your handwriting.

Security Technology
Two major mechanisms provide for document- and data-level security: access control and digital signatures. The former provides security at the application level and ensures privacy, integrity, authentication, and nonrepudiation by ensuring that only those persons or applications allowed to access the documents and data can do so. The latter provides direct document-level security by applying a signature to the data and possibly the context of the data. These two mechanisms should be considered complementary, not mutually exclusive. The most efficient and effective applications use both techniques, determined by the specific requirements of each type of document and data generated.

Access Control
The simplest and fastest mechanism for securing any application is access control. It can be as simple as a login ID and password, or it can be based on sophisticated technologies such as swipe cards, hardware tokens, or biometric analysis such as fingerprints or retinal scans. Once a user is allowed access, the application becomes responsible for ensuring the privacy of the documents and data generated or accessed by that user. Integrity is also the application's responsibility. Authentication and nonrepudiation are automatic, since any documents or data created must have been created by the authenticated user.
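
As a rough sketch of the simplest case described above, a login ID and password check might look like the following Python fragment. The in-memory user store and function names are hypothetical; a real application would use a persistent credential store and add measures such as lockout and audit logging.

```python
# Hypothetical sketch of login ID / password access control.
# Passwords are never stored directly; only salted hashes are kept.
import hashlib
import hmac
import os

_users = {}  # login ID -> (salt, password hash); illustrative in-memory store

def register(login_id: str, password: str) -> None:
    """Store a salted PBKDF2 hash of the password."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    _users[login_id] = (salt, digest)

def authenticate(login_id: str, password: str) -> bool:
    """Grant access only if the supplied password matches the stored hash."""
    if login_id not in _users:
        return False
    salt, stored = _users[login_id]
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, stored)  # constant-time comparison

register("ann", "correct horse battery staple")
print(authenticate("ann", "correct horse battery staple"))  # True
print(authenticate("ann", "guess"))                         # False
```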

Digital Signatures
Signatures in general, and digital signatures specifically, are placed on documents for one of two reasons: in-process validation or postprocess validation. In-process validation is the use of a signature for authentication purposes (see sidebar - When Is a Signature Not a Signature?). Postprocess validation - use of a signature for nonrepudiation purposes - is the most common use of a physical signature and the most effective use of a digital signature. If authentication is the objective, then an access control-based mechanism is most appropriate. Examples of postprocess validation are the signatures placed on mortgage or loan documents, which are checked only if a dispute arises at some point.

In these cases the connection between the signature and the XML tags is critical. The application is responsible for linking the signature to the specific tags that need to be protected. In addition, it's the application that must respect the existence of a signature and perform the correct verification based on the specific business rules to be applied. While some applications require that the entire XML document be signed - data and context - not all documents require this. Some may need only certain elements signed; others require multiple signatures. These options can be summarized as follows:

1. Sign just the data. The advantages are:

  • The ability to reuse data with other applications
  • The ability to verify data independent of the generating application
  • The ability to use industry-standard schemas

The disadvantages are:

  • Increased application complexity
  • An increased risk of data misinterpretation

Signing just the data is most appropriate when the application controls how the data appears to users. In addition, it's the only method possible when industry-standard schemas are used to exchange data between different applications or organizations. But signing just the data requires that the application take responsibility for the presentation of the data in the correct context. This is a critical responsibility, because if the application misrepresents the data by presenting it to a user in a context different from the one intended, the user may be agreeing to something other than the originator's intention.
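
As a minimal sketch of option 1, the fragment below signs only the data element, producing a detached signature that travels alongside it. The element content, the RSA-PSS scheme, and the absence of XML canonicalization are simplifying assumptions; a production system would canonicalize the XML before signing so that insignificant formatting changes don't break verification. It uses the Python cryptography package.

```python
# Sketch of option 1: sign just the data element (detached signature).
# The XML content and key handling here are illustrative assumptions.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                  salt_length=padding.PSS.MAX_LENGTH)
key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

# The data element alone -- no presentation context is covered.
data = b"<amount currency='USD'>1500.00</amount>"
signature = key.sign(data, pss, hashes.SHA256())

# Any application holding the public key and the same bytes can verify,
# independent of the application that generated the data.
key.public_key().verify(signature, data, pss, hashes.SHA256())
print("data signature verified")
```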

2. Sign the context and the data merged together. The advantages are:

  • This is easy to understand because it's the digital, conceptual equivalent of a paper document.
  • Data misrepresentation is prevented.
  • It reduces the application complexity.

The disadvantages are:

  • Industry-standard schemas can't be used since they're generally data only.
  • Access to the data by multiple applications is impeded.
  • The ability to reuse the data in multiple applications and different contexts is complicated.

Signing a merged document is most appropriate (1) when creating an electronic record, and (2) when different applications may be unable to apply the correct context to the data. The need to sign a merged document generally arises when a document is created or submitted by an external or unknown person or application, such as an unsolicited bid. The signature is critical to ensure nonrepudiation, but the context is just as important; otherwise the potential exists to misrepresent the data in a manner that damages one or both parties.
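
A sketch of option 2 under the same simplifying assumptions: the presentation context and the data are merged into one document, and a single signature covers both, so neither can later be presented without the other.

```python
# Sketch of option 2: sign the context and data merged together.
# Element names are illustrative, not an industry-standard schema.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                  salt_length=padding.PSS.MAX_LENGTH)
key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

merged = (b"<record>"
          b"<context>Unsolicited bid; terms exactly as displayed to the signer</context>"
          b"<data><amount currency='USD'>1500.00</amount></data>"
          b"</record>")

# One signature binds data and context: the data can't be re-presented
# under a different context without invalidating the signature.
signature = key.sign(merged, pss, hashes.SHA256())
key.public_key().verify(signature, merged, pss, hashes.SHA256())
print("merged document signature verified")
```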

3. Sign the data with a reference to the context. The advantages are:

  • The data can be reused in other contexts and applications.
  • The data can't be misrepresented since the context is linked to the data.
  • Industry-standard schemas can be used.

The disadvantage is:

  • It increases the application's complexity.

Signing the data with a reference to the context is the most flexible way of signing XML data. It's the method that's recommended as an application default and should be changed only in the case of document archiving or data exchange. Using a reference to the original context provides the application with the maximum flexibility and allows use of industry schemas. In addition, this method allows applications to present the data in a different context while still retaining the original context as part of the complete document.
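
A sketch of option 3 under the same assumptions: the signed payload contains the data plus a reference to the context, expressed as a URI and a digest, so the data stays reusable while remaining bound to its original context. The URI and element names are hypothetical.

```python
# Sketch of option 3: sign the data with a reference to the context.
# The context lives elsewhere; only its URI and fingerprint are signed.
import hashlib
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                  salt_length=padding.PSS.MAX_LENGTH)
key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

data = b"<amount currency='USD'>1500.00</amount>"
context = b"<form>Loan application form, version 7</form>"  # stored separately

# Binding the data to a digest of the context prevents misrepresentation,
# while the data element itself stays schema-standard and reusable.
payload = (b"<signedData>" + data +
           b"<contextRef uri='https://example.com/forms/loan-v7.xml' sha256='" +
           hashlib.sha256(context).hexdigest().encode() + b"'/>"
           b"</signedData>")

signature = key.sign(payload, pss, hashes.SHA256())
key.public_key().verify(signature, payload, pss, hashes.SHA256())
print("data + context reference verified")
```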

Archiving XML Documents
One of the most common reasons to apply a digital signature to an XML document is the need to create an electronic "document of record" for a transaction. Like a paper record, the electronic equivalent must provide an organization with a record of what took place and must retain the context within which the transaction occurred. For that reason the most appropriate method of creating an electronic record is to merge the context of the transaction with the data and to sign the entire document.

The requirement for long-term retention of electronic documents makes XML the perfect candidate, but its power and flexibility can introduce added complexity. Application developers need to ensure that when an electronic record of a transaction is created, it has no external links or references, because the developer can't guarantee those links and references will still exist when the document needs to be retrieved. Alternatively, the developer must ensure that all linked or referenced information remains available for as long as the electronic record is archived.
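
One way to enforce the no-external-references rule is to scan a candidate record before archiving it. The fragment below is a rough sketch using the Python standard library; the set of link attributes it checks is a common convention, not an exhaustive list.

```python
# Sketch: flag a record for review if it carries external references.
import xml.etree.ElementTree as ET

# Attribute names that commonly point outside a document (illustrative).
LINK_ATTRS = {"href", "src", "{http://www.w3.org/1999/xlink}href"}

def external_references(xml_text: str) -> list:
    """Return attribute values that point outside the document itself."""
    refs = []
    for element in ET.fromstring(xml_text).iter():
        for name, value in element.attrib.items():
            if name in LINK_ATTRS and not value.startswith("#"):
                refs.append(value)
    return refs

record = ("<record><logo src='http://example.com/logo.gif'/>"
          "<note href='#section2'/></record>")
print(external_references(record))  # ['http://example.com/logo.gif']
```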

A signed document used as an electronic record poses a special problem for application developers. As a natural part of a public key infrastructure (PKI), the digital signatures used by individuals expire (see sidebar - Public/Private Key Use). This protects the organization from a stolen signature. The expiration period varies depending on the use of the signature and the organizational requirements. In addition, signatures can be revoked, most likely because the person who originally signed the document is no longer an employee or customer.

Revocation ensures that an unauthorized individual can't use the signature. But developers must expect that the signature currently on a document will expire or be revoked at some point during the archival period. This can cause a serious problem if the document needs to be retrieved in the future to resolve a dispute: once the original signature has expired or been revoked, it can no longer be verified, and an application attempting to prove the document was validly signed will instead reject it as invalid.

One way to avoid this problem is to create a special "archivist's" signature. This signature is linked not to an individual but to the archiving application. When a document is placed in the archive, it's signed a final time by the archiving application using the archivist's signature. This signature covers everything placed in the archive for this document: the data, the context if required, any linked or referenced elements, and, most important, the original signature. This final signature represents the application's confirmation of the document as an electronic record and assures future applications retrieving it that the original signature was correct and valid at the time the document was archived.
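
The following fragment sketches the archivist's-signature idea: a long-lived application key counter-signs the document together with the user's original signature at archival time. The packaging format and key handling are illustrative assumptions.

```python
# Sketch of an archivist's signature counter-signing an archive package.
import base64
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                  salt_length=padding.PSS.MAX_LENGTH)

user_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
archive_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

document = b"<record><data>...</data><context>...</context></record>"
original_sig = user_key.sign(document, pss, hashes.SHA256())

# The package covers everything placed in the archive, including the
# original signature, so its validity at archival time is attested even
# after the user's certificate expires or is revoked.
package = (b"<archive><document>" + base64.b64encode(document) +
           b"</document><originalSignature>" + base64.b64encode(original_sig) +
           b"</originalSignature></archive>")
archivist_sig = archive_key.sign(package, pss, hashes.SHA256())

# Years later, verifying the archivist's signature confirms the package
# (original signature included) is exactly what was archived.
archive_key.public_key().verify(archivist_sig, package, pss, hashes.SHA256())
print("archive package verified")
```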

Summary
The security of XML-based documents and data is a critical issue as applications begin to exchange XML data and users begin to create and exchange XML documents. To ensure that the most effective and appropriate security measures are in place, we need to apply the right level of security based on the threat to the information and the value of that information. While digital signatures are one of the most powerful technologies for securing XML data and documents, other methods such as access control can provide a completely secure environment for the creation and use of all documents and data.

Even when we apply a digital signature to an XML document, there are more choices about how to apply the signature based on what we need to do with the document and which applications need to access the information. One of the most critical areas is when a document of record, or electronic record, needs to be created. In this case special care needs to be taken to ensure that the document can be fully understood and validated when it's needed in the future.

Once understood, the connection between the power of XML and the need for security allows developers to create applications that satisfy the often contradictory requirements of organizations for maximum flexibility and maximum security.

Sidebar: Public/Private Key Use

A public key infrastructure (PKI) is the key technology infrastructure required to implement security for documents and data.

A PKI addresses the security needs of organizations using a concept called public/private keys. The PKI generates unique keys, usually linked to passwords, for each person inside and, if required, outside the organization. These keys become the critical means of protecting information. The PKI also manages the distribution of the keys, their revocation when necessary, and the actual act of protecting the information.

The important aspect of a public key infrastructure is the use of public/private key pairs. As the name suggests, there are two related keys: one available to the public, the other kept private by its holder. The security value comes in the relationship of these two keys.

The keys are generated by a mathematical algorithm that connects the two keys such that if one is used to "lock" something, only the other can be used to "unlock" it. This relationship allows the PKI to give each user a unique private key and to distribute the public keys to anyone who needs them. The two diagrams demonstrate how the key pairs can be used to ensure privacy, integrity, authentication, and nonrepudiation.

In this example, Ann wants to send a document to Bob but doesn't want Eve to be able to read it or tamper with it. She also wants to make sure Bob knows the document is from her.

First, Ann must create a signed document. This is a combination of the document itself and an encrypted fingerprint of the document, called a message hash or message digest, which is created from the information in the document and encrypted with Ann's private key. This ensures that Bob can verify that Ann sent the document and that the information hasn't changed in transit. Ann must then secure the signed document so that only Bob will be able to open it.

Once the signed and secure document is ready, Ann can send it to Bob (see Figure 1). Because the security measures are attached directly to the document, Ann can use any method of transportation.

Once Bob receives the signed and encrypted document, he must decrypt the document before he can open it. Since the encryption was applied using his public key, only his private key can decrypt the package (see Figure 2). This ensures that only Bob can open the package. Once opened, Bob can verify that Ann did indeed send the document by verifying her signature. By using Ann's public key Bob can be assured that only Ann's private key could have created the signature. And since the signature includes a fingerprint of the document, Bob can also be sure that the document was not changed in transit.
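
The whole Ann-to-Bob exchange can be sketched in a few lines of Python with the cryptography package. One liberty is taken with the description above: rather than encrypting the entire package directly with Bob's public key, the sketch seals it with a fresh symmetric key and locks only that key with RSA, which is how such systems typically handle messages larger than a single RSA block. All key and variable names are illustrative.

```python
# Sketch of the sidebar's flow: Ann signs and seals; Bob opens and verifies.
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                  salt_length=padding.PSS.MAX_LENGTH)
oaep = padding.OAEP(mgf=padding.MGF1(hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

ann_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
bob_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

document = b"<letter>Meet me at noon. - Ann</letter>"

# Ann signs: a digest of the document, encrypted with her private key.
signature = ann_key.sign(document, pss, hashes.SHA256())  # 256 bytes for RSA-2048

# Ann seals the signed document with a fresh symmetric key, then locks that
# key with Bob's public key so only Bob's private key can recover it.
session_key = Fernet.generate_key()
sealed = Fernet(session_key).encrypt(signature + document)
locked_key = bob_key.public_key().encrypt(session_key, oaep)

# Bob unlocks the session key, opens the package, and checks Ann's
# signature with her public key. Eve can do none of these steps.
opened = Fernet(bob_key.decrypt(locked_key, oaep)).decrypt(sealed)
received_sig, received_doc = opened[:256], opened[256:]
ann_key.public_key().verify(received_sig, received_doc, pss, hashes.SHA256())
print("Ann's signature verified; document intact:", received_doc == document)
```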

If a person of malicious intent, such as Eve, tries to intercept the document in order to read its contents or change it, the security measures applied will either stop them completely or allow Bob and Ann to know that the information was tampered with.

Eve cannot open the package containing the signed document without Bob's private key, since it was encrypted using his public key. If Eve were somehow able to steal Bob's private key, thus allowing her to open the package, she would not be able to change any of the information in the document without alerting both Ann and Bob because of the signature that links the information to Ann's private key.

Finally, Eve cannot create a substitute document claiming to be Ann, an activity called spoofing, because she doesn't have Ann's private key and cannot sign a document as Ann.

The use of public/private key pairs provides individuals and organizations with a simple and effective way of securing information and documents.

About the Author

Eric Stevens has over 20 years of computing industry experience. As vice president of research and technology evangelism for JetForm Corporation, he's currently responsible for strategic analysis, and assists in setting and communicating the company's technology direction. Over the past 10 years he has focused on workflow and electronic form solutions.
