Data Virtualization Best Practices

Five lessons from leading data virtualization adopters

Driven by business demands for greater agility and lower IT costs, enterprise adoption of data virtualization has moved into the technology mainstream.

With hundreds of organizations deploying data virtualization, a number of best practices have emerged.

In compiling the ten case studies described in the recently published Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility, co-author Judith R. Davis and I identified five best practices common to these successful implementations.

In this article, I will pass along the lessons these ten organizations learned along the way, as they are valuable in helping other organizations avoid common pitfalls and realize the benefits of data virtualization as quickly as possible.

1.  Centralize responsibility for implementing data virtualization
Several organizations stressed the need to centralize initial design, development and deployment responsibility for data virtualization in a focused data virtualization team. The key benefit is the ability to advance the effort quickly and to take on bigger concepts, such as defining common canonical models and implementing an intelligent storage component, which speed development, reduce time to solution and deliver a more powerful and complete data virtualization environment.

Relative to data virtualization development, Northern Trust added that centralization provides economies of scale and enables the company to move up the best practices learning curve more quickly.

Agreement on a common data model was also tied to the centralization best practice.  Several users commented that a shared model ensures consistent, high-quality data, makes business users more confident in the data, and makes IT staff more agile and productive.
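To make the idea of a common canonical model concrete, here is a minimal sketch (not drawn from the book; all source systems, field names and records are hypothetical) of how a centralized team might express agreed-upon field mappings so that every source is normalized to one canonical record shape:

```python
# Hypothetical source records with inconsistent schemas
crm_record = {"cust_id": 101, "cust_name": "Acme Corp", "region_cd": "NA"}
erp_record = {"customer_number": 101, "name": "Acme Corp", "territory": "NA"}

# The canonical model: one mapping per source, maintained centrally
CANONICAL_MAPPINGS = {
    "crm": {"cust_id": "customer_id", "cust_name": "customer_name",
            "region_cd": "region"},
    "erp": {"customer_number": "customer_id", "name": "customer_name",
            "territory": "region"},
}

def to_canonical(record, source):
    """Rename source-specific fields to the agreed canonical names."""
    mapping = CANONICAL_MAPPINGS[source]
    return {mapping[field]: value for field, value in record.items()}

print(to_canonical(crm_record, "crm"))
print(to_canonical(erp_record, "erp"))
# Both sources now yield the same canonical shape:
# {'customer_id': 101, 'customer_name': 'Acme Corp', 'region': 'NA'}
```

Because the mappings live in one place, the centralized team can evolve the canonical model without every consuming application re-implementing its own source-by-source translation.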

According to Qualcomm, establishing a clearly defined, centralized data virtualization development and management environment was important. Key issues are who is responsible for the shared infrastructure and for shared services.

Qualcomm also described the importance of management support for the data virtualization effort. This may be easier if responsibility for data virtualization is centralized.

2.  Educate the business on the benefits of data virtualization
Multiple organizations, including Northern Trust, Pfizer, the Fortune 50 Financial Services Firm and the Fortune 50 Computer Manufacturer emphasized the need to educate and support the business to successfully implement data virtualization.

Recommended ways to educate the business included:

  • Allocate time to consult with business users and make sure they understand the data.
  • Be prepared to provide support when users have questions, and to diagnose problems accurately.
  • Establish a culture of information sharing. Sharing and transparency are critical to increasing the value of the information.
  • Internally market the concept of data virtualization. Establish an ongoing effort to make data virtualization acceptable in other areas of the organization. This involves educating people on its significant benefits and flexibility, taking a phased approach to expanding the scope of the data virtualization environment, and building incrementally on the success of each individual project.
  • Expect resistance when bringing in a new approach/technology like data virtualization. There is a need to convince people who are more comfortable with a physical consolidation approach that data virtualization offers significant benefits and can solve a wide range of data integration problems.
  • Manage business expectations. Implementation is faster than other data integration techniques, but it still takes time.

3.  Pay attention to performance tuning and scalability
NYSE Euronext, Qualcomm, the Fortune 50 Computer Manufacturer and Comcast all contributed relevant advice including:

  • Tune performance and test solution scalability early in the development process.
  • Performance tuning expertise is critical. Ensure access to an expert in tuning the data virtualization platform/SQL to do the necessary performance tuning.
  • Consider bringing in massively parallel processing (MPP) capability to handle query performance on high volume data and align the MPP and data virtualization implementations.
  • Accommodate the fact that user demand for ad hoc analysis and reporting is unpredictable.

Performance of the data virtualization platform was a critical success factor for several of the case study organizations. Comcast, Qualcomm and the Fortune 50 Computer Manufacturer all cited the performance, reliability and scalability of their Composite Data Virtualization Platform as key to their success.

4.  Take a phased approach to implementing data virtualization
Both Northern Trust and NYSE Euronext stressed the need to take a step-by-step approach to implementing data virtualization. NYSE Euronext counsels users to evolve the implementation step by step - first abstract the data sources, then layer the BI applications on top and gradually implement the more advanced federation capabilities of data virtualization.
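The layering NYSE Euronext describes can be sketched with ordinary SQL views. The example below is a simplified illustration, using an in-memory SQLite database as a stand-in for a data virtualization platform; the table and view names are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Two physical "sources"
    CREATE TABLE trades_us (symbol TEXT, qty INTEGER);
    CREATE TABLE trades_eu (symbol TEXT, qty INTEGER);
    INSERT INTO trades_us VALUES ('IBM', 100);
    INSERT INTO trades_eu VALUES ('SAP', 200);

    -- Step 1: abstract the data sources behind a single view
    CREATE VIEW all_trades AS
        SELECT symbol, qty FROM trades_us
        UNION ALL
        SELECT symbol, qty FROM trades_eu;

    -- Step 2: layer a BI-facing view on top of the abstraction
    CREATE VIEW trade_volume AS
        SELECT symbol, SUM(qty) AS total_qty
        FROM all_trades
        GROUP BY symbol;
""")

# BI applications query only the top layer; the source layout can
# change underneath without touching them.
rows = conn.execute("SELECT * FROM trade_volume ORDER BY symbol").fetchall()
print(rows)  # [('IBM', 100), ('SAP', 200)]
```

The value of the phasing is that each layer can be delivered and validated on its own: the abstraction layer first, then the consuming applications, and only later the more advanced federation capabilities.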

The Global 100 Financial Services Firm confirms that it is appropriate to start small with point implementations.  This enables the organization to accelerate the business benefits and help fund larger deployments.

Qualcomm cautions user organizations to use data virtualization only where it is appropriate. Data virtualization is not a panacea or the solution to every problem.

5.  Use an experienced vendor partner for data virtualization technology
Based on its experience, NYSE Euronext firmly advises other organizations to partner with an experienced vendor with a mature product when implementing data virtualization.  Their vendor of choice is Composite Software.

Northern Trust suggests taking advantage of vendor professional services to help with difficult design and optimization challenges, avoid pitfalls and resolve issues quickly.

The Global 100 Financial Services Firm established close cooperation with their data virtualization vendor to design and fund necessary enhancements to their data virtualization platform.  The company also stressed the importance of the quality of the people involved in this collaboration, with both the Financial Services Firm and their data virtualization vendor providing their top architects and technologists.

Comcast and the Global 100 Financial Services Firm both cited their data virtualization vendor's responsiveness and flexibility in enhancing its product to meet user requirements as critical success factors in their implementations.

Conclusion
Data virtualization technology is powerful and mature.  But to gain the full benefit, organizations need to implement data virtualization using proven best practices.

This article highlights five data virtualization best practices derived from ten highly successful data virtualization implementations.  Use these to help you succeed as well.  You will be glad you did.

•   •   •

Editor's Note: Robert Eve is the co-author, along with Judith R. Davis, of Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility, the first book published on the topic of data virtualization. This article includes excerpts from the book.

More Stories By Robert Eve

Robert Eve is the EVP of Marketing at Composite Software, the data virtualization gold standard, and co-author of Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility. Bob's experience includes executive-level roles at leading enterprise software companies such as Mercury Interactive, PeopleSoft, and Oracle. Bob holds a Master of Science from the Massachusetts Institute of Technology and a Bachelor of Science from the University of California at Berkeley.
