5 Ways to Survive and Thrive in a Competitive Market Using Data Virtualisation

The phrase “survival of the fittest” originated from Darwin’s evolutionary theory, describing the mechanism by which natural selection occurs. In business, the same principles apply: fast-moving markets, new trends, new methods of customer engagement, and new emerging technologies all favour organisations that are agile and adaptable. Never has the role of the data analyst been more important, whether it is to profile new opportunities, optimise processes, derive propensity models for upsell, or solve any of the myriad other business problems that drive efficiency and create competitive advantage.

Data insights are at the heart of the business analyst’s capability, gained through the agile combination of data from multiple sources. However, the complex variety of new data locations, formats, and protocols means that traditional methods of data integration are no longer keeping pace with business needs. In response, organisations are turning to data virtualisation; in its Market Guide for Data Virtualization, Gartner estimates that through 2022, 60% of organisations will implement data virtualisation as one of the key delivery styles in their data integration architecture.

Why are so many companies turning to data virtualisation? Let’s look at the top 5 challenges voiced by business analysts and how data virtualisation is being used to solve them:

1 – Data Access

“As a business user, it is hard to understand the connectivity, formats, and protocols of all our data sources, and to keep up with source changes. Security is also an issue.”

Data virtualisation removes the need for business analysts to understand and deal with the complexities of data access. Instead, they simply connect to the data virtualisation layer (which looks like a data warehouse but is in fact only metadata). This layer provides easy-to-use yet secure access to all data sources, regardless of where they are located, what format(s) they are in, or what protocol(s) are used; data delivery is handled by the virtual layer, not the business analyst. In this way, data virtualisation enables a logical data warehouse architecture.
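
To illustrate what this looks like from the analyst’s side, here is a minimal Python sketch. It assumes the virtual layer is exposed over ODBC (data virtualisation platforms such as the Denodo Platform typically offer standard ODBC/JDBC endpoints); the DSN name virtual_dw and the view name customer_360 are hypothetical.

    import pyodbc  # generic ODBC client; the virtual layer behaves like any SQL database

    # Connect to the data virtualisation layer, not to the individual sources.
    # "virtual_dw" is a hypothetical ODBC data source name configured by IT.
    conn = pyodbc.connect("DSN=virtual_dw;UID=analyst;PWD=********")

    # "customer_360" is a hypothetical virtual view that combines, for example,
    # CRM, billing, and web-analytics data. The analyst writes ordinary SQL;
    # the virtual layer handles source locations, formats, and protocols.
    cursor = conn.cursor()
    cursor.execute(
        "SELECT customer_id, region, lifetime_value "
        "FROM customer_360 WHERE region = ?",
        "EMEA",
    )

    for row in cursor.fetchall():
        print(row.customer_id, row.lifetime_value)

    conn.close()

The point is not the specific query but the shape of the workflow: one connection, one SQL dialect, and no source-by-source plumbing.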

2 – Vendor Lock-in

“We’re concerned that the semantic models are being built into our BI tools. This creates vendor lock-in, which is an impediment to the adoption of new BI and analytics tools.”

With data virtualisation, consumers can use different analytics and visualisation tools on top of a shared virtual layer. The semantic model is defined in the virtual layer, so there is no need for costly rewrites of data models in each new tool. (This also makes a huge difference to the agility of the business, as any change need only be made once.)
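
A hypothetical sketch of what this means in practice: because the semantic model lives in the virtual layer, any SQL-capable client reuses it unchanged. Here the same assumed view, sales_by_region, feeds both a pandas notebook and a plain reporting script; a BI tool would issue equivalent SQL through its own ODBC/JDBC connector.

    import pandas as pd
    import pyodbc

    # One shared semantic view, "sales_by_region" (hypothetical), defined once
    # in the virtual layer rather than inside any single BI tool.
    SEMANTIC_VIEW_SQL = "SELECT region, fiscal_quarter, revenue FROM sales_by_region"

    conn = pyodbc.connect("DSN=virtual_dw;UID=analyst;PWD=********")

    # Consumer 1: a data-science notebook pulls the view into a DataFrame.
    df = pd.read_sql(SEMANTIC_VIEW_SQL, conn)
    print(df.groupby("region")["revenue"].sum())

    # Consumer 2: a lightweight reporting script runs the very same query.
    # Swapping in a new BI tool means pointing it at the same view, not
    # rebuilding the model inside the tool.
    cursor = conn.cursor()
    cursor.execute(SEMANTIC_VIEW_SQL)
    print(len(cursor.fetchall()), "rows available to any SQL-capable tool")

    conn.close()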

3 – Query Performance

“Our BI tool has limited (or no) query pushdown, so large amounts of data get dragged across the network to the BI server, where they are processed, which gives poor performance on anything at scale.”

A key element of performance in data virtualisation solutions is the extent to which they can optimise queries by pushing processing down to the sources, so that filtering, joins, and aggregation happen where the data lives. This can make a vast difference to the amount of data that needs to be moved across the network, and it is one of the defining capabilities of leading data virtualisation solutions such as the Denodo Platform.
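
To make the pushdown point concrete, here is a hedged sketch using the same hypothetical virtual_dw DSN, with an assumed sales_fact view; the exact optimisation behaviour will of course depend on the platform and the sources involved.

    import pyodbc

    conn = pyodbc.connect("DSN=virtual_dw;UID=analyst;PWD=********")
    cursor = conn.cursor()

    # Without pushdown: every detail row crosses the network and is
    # aggregated on the client or BI server. Slow on a large fact table.
    cursor.execute("SELECT region, revenue FROM sales_fact")
    totals = {}
    for region, revenue in cursor.fetchall():  # potentially millions of rows moved
        totals[region] = totals.get(region, 0) + revenue

    # With pushdown: the aggregation is expressed in the query itself, so the
    # virtual layer can delegate the GROUP BY to the underlying source(s) and
    # only a handful of summary rows travel back across the network.
    cursor.execute(
        "SELECT region, SUM(revenue) AS total_revenue "
        "FROM sales_fact GROUP BY region"
    )
    totals_pushed_down = dict(cursor.fetchall())

    conn.close()

Both approaches produce the same totals; the difference is where the work happens and how much data has to move.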

4 – Efficiency

“Our analysts and data scientists spend 70-80% of their time gathering and preparing data rather than actually performing analyses.”

This is perhaps the biggest source of ROI for business analysts who use data virtualisation: minimising the time spent gathering and preparing data. Customer feedback indicates that this can fall from 70-80% of an analyst's time to somewhere in the region of 10-20%, freeing up roughly 60% of the working week. The time available for actual analysis therefore rises from around 20-30% to 80-90%, effectively increasing the productivity of a business analyst three- or four-fold.
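
A quick, purely illustrative back-of-the-envelope check of that claim in Python, using the customer-reported ranges quoted above:

    # Share of time spent gathering and preparing data, before and after
    # adopting data virtualisation (ranges reported by customers, as above).
    prep_before = (0.70, 0.80)
    prep_after = (0.10, 0.20)

    # Whatever is not spent on preparation is available for actual analysis.
    analysis_before = [1 - p for p in prep_before]  # 20-30% of time
    analysis_after = [1 - p for p in prep_after]    # 80-90% of time

    # Uplift in analysis time: 0.80/0.30 is about 2.7x at the low end,
    # 0.90/0.20 = 4.5x at the high end, i.e. roughly the 3-4 fold gain above.
    low = min(analysis_after) / max(analysis_before)
    high = max(analysis_after) / min(analysis_before)
    print(f"Analysis time increases by {low:.1f}x to {high:.1f}x")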

5 – Sharing and Collaboration

“Having our semantic model in our BI tool also means that we cannot share the data model with users of other BI tools without re-writing everything. This creates extra work as well as inconsistencies between the models in different BI tools.”

Using data virtualisation means that common semantic models can be created in the virtual layer rather than in the analytics platform. This is key, as it means business analysts gain a single view of the truth that is easily shared between users of different analytics tools.

Survive and Thrive

To return to the Darwinian principles of natural selection, the question for business analysts is: If your business is striving to achieve or maintain competitive advantage through business analytics, why wouldn’t you include data virtualisation capabilities?

Charles Southwood