After WikiLeaks, What's Next for Document Compliance Management?

A discussion with Brainloop CEO Peter Weger

The WikiLeaks security fiasco has shed a lot of light on document security and its inherent irony: the more confidential a document is, the more likely it is to be shared. Web Security Journal reached out to Brainloop CEO Peter Weger to discuss document compliance management as a risk-mitigation strategy.

Web Security Journal: What security issues do you see becoming more pervasive in the coming months?

Peter Weger: A frequently unaddressed challenge is that companies’ most confidential documents are often those that travel the most outside the enterprise. Business depends on sharing information in collaborative processes like coordination among board members; working with research, supply and distribution partners; and communications with outside experts such as external counsel, consultants, auditors and regulatory authorities. However, the more a document has to be accessed outside the corporate network, the greater the risk of leakage, so a company’s most sensitive documents are at much greater risk than other documents.

Web Security Journal: How do most businesses enable collaboration today?

Weger: Email continues to be the most common method used for information sharing and communication. Organizations tend to collaborate through this ubiquitous technology, sending emails to their employees, partners, suppliers and customers. These emails, of course, include content, attachments and links, some of which contain sensitive information.

To supplement email functionality, individual departments sometimes acquire Cloud-based collaboration applications, often without advice from corporate IT on the selection, vetting or implementation of these services. These systems provide rudimentary content storage, distribution and work flow that email lacks.


Web Security Journal: Where do the current methods fall short?

Weger: To state the obvious, email was not designed to be a real-time, multi-user, secure collaboration system. We know that email makes it extremely easy for security policies to be bypassed. A simple “reply all” can result in an employee, whether unintentionally or maliciously, sending sensitive information to one or more unintended recipients. Email and any attachments that arrive at the recipient’s mail client can be forwarded to other parties that may not have the right or need to view the information. In the latter case, the organization that owns the data may never learn that this unexpected data sharing took place.

Most commercially available Cloud-based collaboration offerings were purpose-built with a simple, primary objective of sharing information; security was an afterthought for most products. This becomes a serious risk when you consider that these products typically leave control of the policy and access to the data in the hands of the collaboration solution provider. Some of the top-performing solutions have attempted to wrap security around the content in such a way that end users can apply document protections, requiring them to define the classification and sharing policies themselves. Of course, by putting the decision into the hands of end users with no experience in defining policy, and without the perspective of the company’s central policy standard, poor decisions can be made and sensitive data still exposed to unauthorized access and misuse. Organizations with hundreds of users have no way to ensure consistent application of security measures. In addition to putting their own documents at risk, using insecure collaboration applications may result in companies violating their contractual obligations to protect their partners’ confidential information.

To address some of these risks, organizations continue to make significant investments in the various perimeter security technologies designed to prevent information from leaving the organization. Some of the technologies used include firewalls, network intrusion prevention systems (IPS), data loss/leak prevention (DLP) and more. The main problem with this ‘protect the perimeter’ approach is that these products focus on protection at the infrastructure layer, not on the information itself, which must travel outside the network in order for the company to function. This effectively still leaves the user in control of the information’s destiny, which usually leads to the dangerous choice of expedience over security.


Web Security Journal: What is DCM and how does it fit in?

Weger: Document Compliance Management is a discipline that proactively manages information risk arising from sharing documents electronically.

As organizations move more of their information management processes outside the firewall to the extended enterprise, end users’ demands for collaboration come into conflict with corporate demands to protect information through consistent policy application and control over distribution. DCM seeks to reconcile these demands by creating security provisions that move with documents throughout their lifecycles, both inside and outside the network.
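The idea of security provisions that travel with the document can be sketched in code. The following is a hypothetical illustration, not Brainloop's actual design: a document "envelope" carries its own policy metadata, with an HMAC binding policy to content so any handler can verify both before releasing the file. All names, fields, and the key-handling shortcut are assumptions for the sketch.

```python
import hmac
import hashlib
import json

# Hypothetical sketch: a document envelope whose policy travels with the
# content, rather than living only at the network perimeter. The field names
# and the hard-coded key are illustrative only; a real system would use
# centrally managed keys and proper encryption.

SIGNING_KEY = b"org-central-policy-key"  # assumption: managed centrally in practice

def seal_document(content: bytes, policy: dict) -> dict:
    """Bundle content with its policy and an HMAC over both, so tampering
    with either the document or its policy is detectable."""
    policy_json = json.dumps(policy, sort_keys=True).encode()
    tag = hmac.new(SIGNING_KEY, policy_json + content, hashlib.sha256).hexdigest()
    return {"policy": policy, "content": content, "hmac": tag}

def open_document(envelope: dict, user: str, action: str) -> bytes:
    """Verify integrity, then check the embedded policy before releasing content."""
    policy_json = json.dumps(envelope["policy"], sort_keys=True).encode()
    expected = hmac.new(SIGNING_KEY, policy_json + envelope["content"],
                        hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, envelope["hmac"]):
        raise ValueError("document or policy has been tampered with")
    allowed = envelope["policy"]["permissions"].get(user, [])
    if action not in allowed:
        raise PermissionError(f"{user} may not {action} this document")
    return envelope["content"]

env = seal_document(b"Q3 board minutes",
                    {"classification": "confidential",
                     "permissions": {"alice@corp": ["view"],
                                     "counsel@lawfirm": ["view", "print"]}})
print(open_document(env, "alice@corp", "view"))  # b'Q3 board minutes'
```

Because the policy is cryptographically bound to the content, the controls remain enforceable wherever the envelope travels, inside or outside the network.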


Web Security Journal: What are some of its applications?

Weger: Organizations that are struggling to collaborate while meeting their regulatory, compliance and governance requirements are grappling with the issues Document Compliance Management addresses. Ultimately, these organizations want to collaborate and transact securely within their communities of trust. Regulatory auditors will look for a complete audit trail that captures the entire lifecycle of the organization’s sensitive information: who had access to which documents at which point in time.

Consider the scenario in which inside counsel is required to work with outside counsel, each sharing sensitive legal documents with their counterparts on the other end. They need to maintain control over their documents after they have left the corporate network and they are required to keep a full audit trail of all document activity. They may deal with documents of varying levels of sensitivity, and need an easy way for end users to apply the appropriate controls to each document.

Another example is the Human Resources team collaborating with healthcare providers, financial services providers, state and federal tax entities, and more. Again, the documents need to be shared with trust and a full audit trail must be available to ensure that employees’ personal information has been protected as it passes to external parties. Some key components of this audit trail must document which information has been provided to which business partner, and whether or not they were able to print it, save it locally, or forward it to other people.
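The audit-trail components described above can be sketched as a simple event log: each sharing event records which document went to which partner, what action they attempted, and whether it was permitted. This is an illustrative sketch, the field names are assumptions rather than any standard schema.

```python
import csv
import io
from datetime import datetime, timezone

# Illustrative audit-trail sketch: one record per document event, exportable
# as CSV for an auditor. Field names are hypothetical, not a standard schema.

AUDIT_FIELDS = ["timestamp", "document_id", "recipient", "action", "allowed"]

def record_event(log: list, document_id: str, recipient: str,
                 action: str, allowed: bool) -> None:
    """Append one event: who received which document, what they tried to do."""
    log.append({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "document_id": document_id,
        "recipient": recipient,
        "action": action,       # e.g. "view", "print", "save_local", "forward"
        "allowed": allowed,
    })

def export_for_auditor(log: list) -> str:
    """Render the trail as CSV, the kind of artifact a regulator might request."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=AUDIT_FIELDS)
    writer.writeheader()
    writer.writerows(log)
    return buf.getvalue()

trail = []
record_event(trail, "W2-2024-emp-0042", "payroll@provider.example", "view", True)
record_event(trail, "W2-2024-emp-0042", "payroll@provider.example", "forward", False)
print(export_for_auditor(trail))
```

Note that denied attempts are logged alongside permitted ones; an auditor typically wants evidence of enforcement, not just of access.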


Web Security Journal: Why not just block all access by default?

Weger: Looking historically at security, most responses to attacks, breaches or compliance exceptions have been to shut down the operation or block the action. Years ago, when organizations experienced viruses running wild through their email systems, they simply shut down email until the problem was resolved. If they were worried about data leaving via USB sticks, they imposed a blanket block on USB ports throughout the entire organization. We see the same model being applied to data protection in the collaboration space -- classify data as sensitive and block it from being shared.

This model fails miserably. A block-by-default policy goes against the business models of today, which rely on employees, partners, suppliers, legal counsel, and other outside parties who must collaborate with each other using sensitive information.

Therefore, the main goal for DCM is to provide a secure means for end users to collaborate within corporate and regulatory policy for all approved parties, both inside and outside the organization. Corporate policy makers should risk-rank business processes, define security policies and classifications, and roll them out to end users in a secure collaboration platform. This ensures the proper use of documents in a way that is easy and transparent for the end user, without putting the end user in the unenviable position of having to make policy decisions. It must be simple enough that users will be comfortable doing their jobs within the systems they are already familiar with, as opposed to working around a protected system that is blocking them from collaborating.


Web Security Journal: What should an organization consider when implementing DCM?

Weger: Organizations should try to include these features in their own DCM programs:

  1. Centralized data classification, policy definition, and policy enforcement
  2. The ability for end users to do their jobs without having to think about security, and without working outside their existing business processes
  3. Flexibility to support a variety of business processes, preventing the proliferation of disparate point solutions, along with easy integration into the company’s existing ecosystem
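The first feature above, centralized classification and enforcement, can be sketched as a single policy table that is defined once and consulted everywhere, so end users never set policy themselves. The classification names and rules below are illustrative assumptions, not any particular product's schema.

```python
# Minimal sketch of centralized policy definition and enforcement: sharing
# rules live in one table, defined by policy makers, and every enforcement
# point consults it. Classification names and rules are illustrative only.

CENTRAL_POLICY = {
    "public":       {"external_share": True,  "print": True,  "forward": True},
    "internal":     {"external_share": False, "print": True,  "forward": True},
    # Confidential documents may leave via a controlled channel, but the
    # recipient cannot print or forward them.
    "confidential": {"external_share": True,  "print": False, "forward": False},
    "restricted":   {"external_share": False, "print": False, "forward": False},
}

def is_permitted(classification: str, action: str) -> bool:
    """Look up the centrally defined rule; unknown classifications or actions
    are denied by default so a tagging mistake fails closed."""
    rules = CENTRAL_POLICY.get(classification)
    if rules is None:
        return False
    return rules.get(action, False)

print(is_permitted("confidential", "external_share"))  # True
print(is_permitted("confidential", "forward"))         # False
print(is_permitted("untagged", "print"))               # False (fail closed)
```

The fail-closed default matters: a document that slips through without a classification is treated as the most sensitive case, not the least.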

Organizations tend to focus on the tactical problems they face with data protection and often look to solve them with technology delivered by their traditional perimeter security vendor. If an organization really wants to succeed at enabling secure business collaboration, it must approach the problem at the document and information level, and develop a plan that defines and enables its end users, partners, and others to collaborate securely within the boundaries of internal and regulatory constraints.

More Stories By Peter Weger

Peter Weger is CEO of Cambridge, Mass.-based Brainloop, a document security vendor. He has 25 years of management experience at companies such as Software AG, Portal Software, Borland and Network Associates.

