Data governance systems are continually evolving
There is little debate that in today’s digital world, the internal management, or governance, of data is the lifeblood of a successfully operating organization.
Effective data governance systems allow different audiences to access updated, useful and trusted information in an efficient and secure manner that allows an organization’s value to be fully realized. As data governance systems continue to evolve, it is important to consider some of the factors impacting the data space on a daily basis.
One example is organizations moving away from formal enterprise data warehouse (EDW) systems toward environments built on data lakes. The EDW (which likely still exists alongside the lake) provides structure and defined processes both for adding data and for extracting it.
Organizations with strong data governance practices around an EDW often have information defined and approved through a data steward process. Such processes may also include standards for reporting as well as subject matter experts actively mapping reference data in accordance with a data governance process.
Another item impacting the data space is the growth and expansion of traditional business intelligence into areas including machine learning and artificial intelligence.
In organizations that I have been part of, the data governance practice has resided on the business side, usually in a “neutral” department that is not reliant on heavy data for daily functions.
The starting point in each of these organizations was implementing a business glossary process, which involved determining how to organize names, definitions, etc., as well as determining an enterprise approval process. There may also be a need to define a process for incorporating valid values within the data governance system.
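The glossary process described above can be sketched in code. This is a minimal illustrative model, not the author's actual system: the term names, statuses and approval rule are all hypothetical, standing in for whatever an organization's enterprise approval process defines.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a business-glossary entry with an
# enterprise approval step; names and statuses are illustrative.

@dataclass
class GlossaryTerm:
    name: str
    definition: str
    steward: str                                      # owning data steward
    valid_values: list = field(default_factory=list)  # optional enumerated values
    status: str = "proposed"                          # "proposed" -> "approved"

def approve(term: GlossaryTerm) -> GlossaryTerm:
    """Approve a proposed term once it has a definition and a steward."""
    if term.definition and term.steward:
        term.status = "approved"
    return term

term = approve(GlossaryTerm(
    name="written_premium",
    definition="Premium booked for policies issued in the period",
    steward="finance",
    valid_values=[],
))
print(term.status)  # prints "approved"
```

In practice the approval step would route through designated approvers rather than a single function call, but the shape — a named, defined, stewarded term moving through a status workflow — is the core of a glossary process.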
After this key starting point, various options can be explored around data quality, data lineage, balancing and standards for business intelligence, among other elements.
At West Bend Mutual Insurance, two principles guide the data governance process: a common language and a common user experience. The team that works on this process is an enterprise group supporting both business intelligence and data governance. Resources are aligned in two ways. The first alignment is to an IT team. This enables the data governance associate to provide value through business intelligence support while at the same time supporting data governance processes, such as identifying new business terms to be added to the business glossary.
The other alignment is to a functional area; quite often this functional department alignment matches the IT team alignment. Aligned to a functional area, associates provide business intelligence support, typically to power users and data consumers within that department. Both roles work heavily with data stewards across the enterprise.
There are three factors that work in concert with this.
The first factor is providing a common user experience, supported by two processes. First, all of our enterprise reports, regardless of tool, are available in a common place called the business intelligence portal. Each functional area has its own space and determines what is presented. The second process involves business intelligence standards. As reports are developed, they must comply with a set of business intelligence standards, including carrying a report title and ID and using terminology consistent with the business glossary. The intent of these standards is to create a common look and feel for users as they navigate reports, dashboards and other information sources.
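A standards check of this kind can be sketched as a simple validation pass. The rules and term list below are assumptions for illustration — the actual standards would cover far more than a title, an ID and glossary terminology.

```python
# Hypothetical sketch of a business intelligence standards check:
# a report must carry a title and an ID, and its field labels must
# match terms in the approved business glossary.

APPROVED_TERMS = {"written_premium", "loss_ratio", "policy_count"}  # assumed glossary

def check_standards(report: dict) -> list:
    """Return a list of standards violations (empty means compliant)."""
    issues = []
    if not report.get("title"):
        issues.append("missing report title")
    if not report.get("report_id"):
        issues.append("missing report ID")
    for label in report.get("fields", []):
        if label not in APPROVED_TERMS:
            issues.append(f"term not in glossary: {label}")
    return issues

print(check_standards({
    "title": "Monthly Premium",
    "report_id": "RPT-001",
    "fields": ["written_premium", "net_premium"],
}))  # flags the unapproved term "net_premium"
```

Running such a check as part of report development is one way to make a common look and feel enforceable rather than aspirational.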
One practice currently in place is a report certification process: if the company issues information as “certified,” a business intelligence specialist works with the report developer to ensure the appropriate standards are met. “Uncertified” content is also published in the portal, which makes clearly identifying certified and uncertified reports essential.
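The clear-identification requirement amounts to labeling every portal entry against the set of certified reports. A minimal sketch, assuming a hypothetical registry of certified report IDs:

```python
# Sketch: label portal entries as certified or uncertified.
# The registry and report IDs are illustrative assumptions.

CERTIFIED_IDS = {"RPT-001"}

def portal_label(report_id: str, title: str) -> str:
    """Prefix a portal entry with its certification badge."""
    badge = "CERTIFIED" if report_id in CERTIFIED_IDS else "UNCERTIFIED"
    return f"[{badge}] {title}"

print(portal_label("RPT-002", "Agency Production"))  # prints "[UNCERTIFIED] Agency Production"
```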
A second factor is usage metrics. The two or three business intelligence tools we commonly use provide usage metrics, including who is accessing a report, how often and when. This is done for both certified and uncertified content and allows for proactive management of reports, including identifying uncertified reports that should be certified.
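That last step — surfacing heavily used uncertified reports as certification candidates — can be sketched as an aggregation over an access log. The log format and the usage threshold here are assumptions; real BI tools expose this through their own metrics APIs.

```python
from collections import Counter

# Sketch (hypothetical log format): count accesses per report and
# flag uncertified reports with heavy usage as certification candidates.

access_log = [
    ("RPT-001", "alice"), ("RPT-002", "bob"),
    ("RPT-002", "carol"), ("RPT-002", "alice"),
]
certified = {"RPT-001"}

usage = Counter(report for report, _user in access_log)
candidates = [r for r, n in usage.items() if r not in certified and n >= 2]
print(candidates)  # prints "['RPT-002']"
```

The same counts also support the reverse decision: retiring certified reports that nobody accesses.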
The third factor is data lineage. As an organization, we are currently limited to lineage from database to report layer. This helps us identify impacts when metrics change or data quality issues exist on a data element.
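Database-to-report lineage of this kind is, at its simplest, a mapping from data elements to the reports that consume them, queried when a metric changes or a data quality issue is found. The column and report names below are illustrative.

```python
# Sketch of database-to-report lineage: which reports are impacted
# when a data element changes? Names are illustrative assumptions.

LINEAGE = {  # database column -> reports that consume it
    "policy.written_premium": ["RPT-001", "RPT-003"],
    "claim.paid_loss": ["RPT-003"],
}

def impacted_reports(column: str) -> list:
    """Return the reports affected by a change to the given column."""
    return sorted(LINEAGE.get(column, []))

print(impacted_reports("policy.written_premium"))  # prints "['RPT-001', 'RPT-003']"
```

Extending lineage beyond this single database-to-report hop (for example, through intermediate transformations) is exactly the gap the article notes.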
As data lakes and third-party data sources continue to grow, it will no longer be possible for organizations to govern the full spectrum of data that will be available. As a result, more work will be uncertified, which underscores the need for a strong foundation to monitor processes and protocol in a data governance system. The payoff of a disciplined approach is greater efficiency and value within the organization as a whole.