How to Assess Impact of the ‘Panama Papers’ on Your Network | @BigDataExpo #BigData

Establish a non-invasive compliance fabric that continually monitors corporate compliance against all known threats and regulations

We all awoke last week to the latest regulatory and reputational risk since names like Madoff and Snowden burst into the headlines. Weekly, smaller local skirmishes play out between the behavior of companies and public officials and the prying eyes of an ever more symbiotic partnership between the press and motivated whistleblowers. It is difficult to imagine a practical preventive solution against knowledgeable individuals actively trying to circumvent well-known regulations. Much like the industry's ongoing cybersecurity arms race, compliance organizations need to rethink and rebuild their detect-and-respond operations.

Today's compliance infrastructure and operations work in a centralized, siloed, retrospective environment built upon old data, traditional batch reporting, and hope. These investments are sufficient for well-defined regulatory reporting requirements, where the exact question is specified, the format the answer must take is regulated, and the organization has 60, 90, or even 180 days of advance warning before the button is pushed and the "report" must appear. We can schedule our vacations around the regulator's visit, but must interrupt them when the WSJ breaks the latest whistleblower spectacular.

When these calls come, tradition dictates one of two responses. The first is to phone the IT organization and ask it to spin up a data analysis team, free up the best SQL programmers, and then negotiate a re-prioritization of the existing nine-month application backlog. The second is to designate a war room from which to spend nights and weekends coordinating a massive, manual review of databases, contract files, and other supporting systems, attempting to cross-reference the names, companies, or attorneys identified in the news brief. Both of these processes continue at full speed until new workarounds are established and a new silo is created.

There is another way.

What if, instead of starting with the specific report and working backward to extract, transform, move, and store regulation-specific data just because you have a regular reporting responsibility, you established an intelligent, seamless, non-invasive compliance fabric that continually monitors corporate compliance against all known threats and regulations? When new questions or regulations arise, the compliance fabric is re-tasked to explore the newest potential threat and report back on its impact on your networks. Let's take a closer look at how this is accomplished.
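
As a rough illustration of what "re-tasking" means in practice, here is a minimal Python sketch of a fabric that holds connections to source systems and fans a new question out to all of them in parallel. Every name in it is hypothetical; this is not Pneuron's actual API, only the pattern of taking the question to the data instead of the data to the question.

# Hypothetical sketch: a compliance "fabric" that keeps long-lived
# connections to source systems and takes a new question to the data,
# rather than extracting the data to a central store first.
from concurrent.futures import ThreadPoolExecutor

# Stand-ins for two live source systems that would normally be reached
# over previously established connections.
CRM_ROWS = ["Acme Holdings, counsel: Mossack Fonseca",
            "Globex Corp, counsel: Smith Hardy LLP"]
CONTRACT_ROWS = ["2014 services agreement, counsel Mossack Fonseca",
                 "2015 lease, counsel Jones Day"]

class SourceAdapter:
    """Wraps one source system behind a uniform query interface."""
    def __init__(self, name, query_fn):
        self.name = name
        self._query_fn = query_fn  # callable: question -> matching rows

    def ask(self, question):
        return [(self.name, row) for row in self._query_fn(question)]

class ComplianceFabric:
    """Fans a new question out to every registered source in parallel and
    merges the answers; no pre-integration or data movement required."""
    def __init__(self, adapters):
        self._adapters = adapters

    def retask(self, question):
        with ThreadPoolExecutor() as pool:
            batches = pool.map(lambda a: a.ask(question), self._adapters)
        return [hit for batch in batches for hit in batch]

fabric = ComplianceFabric([
    SourceAdapter("crm", lambda q: [r for r in CRM_ROWS if q in r]),
    SourceAdapter("contracts", lambda q: [r for r in CONTRACT_ROWS if q in r]),
])
print(fabric.retask("Mossack Fonseca"))
# -> [('crm', 'Acme Holdings, counsel: Mossack Fonseca'),
#     ('contracts', '2014 services agreement, counsel Mossack Fonseca')]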

The intelligent compliance fabric is made up of a collection of special-function 'pneurons.' These pneurons, configured and connected graphically by a business analyst, work cooperatively to address all regulatory tasks without forcing the pre-integration or aggregation of data. This means the compliance organization takes its compliance questions (analytics) directly to the source systems and source data. When a new question arises, such as "Are we doing business with any entity represented by this attorney?", the fabric executes it over previously established connections, leverages existing matching, parsing, and security capabilities, and returns the answer in less time than it took to read this blog.
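
That attorney question also depends on the matching capabilities just mentioned, since the same counsel may be recorded differently in each system. Here is a similarly hedged sketch, using only the Python standard library and invented records, of the kind of fuzzy name matching such a fabric would leverage:

# Hypothetical illustration of fuzzy matching behind a question like
# "Are we doing business with any entity represented by this attorney?"
# The records and threshold are invented for the example.
from difflib import SequenceMatcher

counterparty_counsel = {
    "Acme Holdings": "Mossack Fonseca & Co.",
    "Globex Corp": "Smith Hardy LLP",
    "Initech Ltd": "Mossack-Fonseca",
}

def similar(a, b, threshold=0.8):
    """True when two names are close enough to flag for review."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold

def entities_represented_by(attorney):
    return [entity for entity, counsel in counterparty_counsel.items()
            if similar(counsel, attorney)]

print(entities_represented_by("Mossack Fonseca"))
# -> ['Acme Holdings', 'Initech Ltd']

A production fabric would use far more robust entity-resolution techniques, but the shape of the operation is the same: compare a newly named party against what each source already holds, without first moving that data anywhere.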

The number and diversity of reputational and compliance threats will not decrease over time, and the business will continue to grow and evolve in order to stay competitive. The only strategy that enables continuous monitoring and a rapid, agile response to the changing compliance landscape is to leverage newer, adaptive, and rapidly adjustable technologies.

More Stories By Ken Lawrence

Ken Lawrence is VP of Sales for Pneuron Corporation, a leader in distributed analytics software that enables organizations to rapidly solve business problems through a distributed approach cutting across data, applications, and processes. Lawrence has held a variety of sales leadership positions, most recently with Medio, a real-time analytics provider for mobile application personalization (acquired by Nokia).

Prior to that, he led sales for Memento, a fraud analytics solutions provider (acquired by FIS); transaction processing provider Wincor Nixdorf; and startup data management provider Dataupia, and he spent over seven years in various sales leadership roles at SAS Institute. He earned an M.S. in Computer Science from Boston University and a B.S. in Computer Science from Clarkson University. Contact: [email protected]
