Data Governance for Asset Managers, Part 1: Setting the Stage

Data governance is the framework through which asset managers ensure data is high-quality, reliable, trusted, and accessible.

This article is the first in a multi-part series on data governance by Arun Krishnamurthy, Managing Director at Grandview Analytics. The series will focus on how asset and investment managers can establish a robust data governance practice and a culture of data.

Asset and investment managers are inundated with data 

If you work in the asset management and financial services industry, you’ve probably heard – or said – statements such as:  

  • “Data is the business.” 
  • “Data is the firm’s greatest asset.” 
  • “Improved data will make us more agile and prepared to respond to business opportunities.”
  • “We aim to be a data-driven organization.”


Throughout the investment lifecycle, asset managers use multiple best-of-breed platforms to support the front office, middle office, and back office as well as functions such as risk, performance, and reporting. These platforms consume and produce valuable data assets. Additionally, asset managers get data from market data providers, such as security indicative data from Bloomberg.

These large volumes of structured data are often paired with semi-structured and unstructured data from a variety of other sources and are increasingly used for data science initiatives. All this data needs to be managed in a way that enables the organization to leverage it for analysis, portfolio management, AI, and other analytical capabilities.

Data governance for asset managers touches all parts of the investment lifecycle

Making the case for data governance

All this data introduces challenges. The quality and timeliness of the data is often unknown. Organizations may not know where the data came from or what it means, may lack a clear understanding of the appropriate uses of the available data, or may not know who to go to with questions about it. The data used in and for business processes needs to be accessible, managed, and governed.

An organization’s focus should be on extracting the greatest amount of informational value from its data, which can only be accomplished by enabling the consumers of that data to use it with the least amount of friction and the highest degree of trust.

All data users benefit from having answers to the following questions:

  • Where did the data come from?
  • What does the data mean and how can it be used?
  • Can I trust the data content?
  • Who validated the data content?
  • Are we using the right source of data?
  • Is my data current?
  • How do I raise and resolve issues with the data?

For asset management firms, the consequences of reporting incorrect data, or of making investment decisions based on inaccurate data, can be massive, with potential regulatory, financial, and reputational impacts. At most asset management firms, data management efforts aim to ensure data quality at the point where data is delivered to a client, to a regulator, or into portfolio management processes. Significant staff hours are often invested in manually reviewing data for accuracy, and even these “four-eyes check” processes remain subject to human error.

Because each organization’s workflows are idiosyncratic, the processes that enforce data quality are often bespoke. In many cases, these processes have grown organically through the best efforts of staff across multiple functions; they are frequently manual, reactive, and non-standard. Often there is no clear owner, no “who to go to” if questions or issues arise with a data point. As a result, data quality levels are often unpredictable and unreliable, and the related processes lack clear definition, accountability, and scale.

What does a robust data governance practice include? 

The adoption of a robust data governance framework with buy-in across the organization will address many of the data quality and data management challenges outlined above and will enhance the value derived from the data. A mature and holistic data governance framework should address the following components:   

Data governance framework and policies: The set of standards, procedures, and rules of engagement that each data steward follows, which define how data governance is applied within the organization.

How is data governance practiced in your firm? What is a data steward? Who are the data stewards in your organization? What does a data steward do and how do they do it? 

A data governance framework answers these questions and enables consistency and repeatability of approach, execution, results, measurement, and reporting of data governance across all the data domains and data stewards in an asset management firm.

Data steward: The person (and team) responsible for the ownership of data governance components and their associated processes for a specific domain. 

Which data domains are critical to your organization? Which data domains are critical to your important business processes? What are the data requirements for your business? What is the forward-looking demand for data? Who owns the data domains? Who monitors the access, quality, and timeliness of data? 

The data steward is responsible for ensuring data governance processes are executed and completed satisfactorily. Typically, this is the group or person with the most proactive view of the use cases, current and future distribution, and consumption patterns of the data domain they are responsible for. Each steward owns a specific set of data domains. 

Metadata: Data about the data.

Where is the data from? What is the lineage of the data – what has happened to it since it was created? What is the preferred source? What is the business and technical meaning of the data? 

Metadata describes the data. It has information about the content, structure, and lineage of the data. Metadata provides information about where the data comes from, how the data is processed, how the data is stored, who validates the data, what the data means, and how the data can be used. 
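To make this concrete, a metadata entry can be modeled as a simple record. The sketch below is illustrative only; the field names, domain, and team names are assumptions for the example, not an industry standard:

```python
from dataclasses import dataclass, field

@dataclass
class MetadataRecord:
    """Illustrative metadata entry for a single data attribute."""
    attribute: str                                  # the data element described
    domain: str                                     # data domain it belongs to
    source: str                                     # preferred upstream source
    lineage: list = field(default_factory=list)     # ordered processing steps
    steward: str = ""                               # who validates and owns it
    business_meaning: str = ""                      # what the data means
    permitted_uses: list = field(default_factory=list)  # how it can be used

# Hypothetical entry for a security-master attribute sourced from Bloomberg
record = MetadataRecord(
    attribute="coupon_rate",
    domain="security_master",
    source="Bloomberg",
    lineage=["vendor feed", "normalization", "warehouse load"],
    steward="Reference Data Team",
    business_meaning="Annual coupon as a percentage of par",
    permitted_uses=["pricing", "client reporting"],
)
```

Even a lightweight record like this answers the questions above: where the data came from, what happened to it, who validates it, and how it may be used.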

Data architecture, model, schema, and integration: The conceptual and business modeling of data intended to relate diverse data sources to each other and integrate disparate sources of data in a consistent and conformed manner. 

How do different data domains relate to each other and the organization’s business? How do your different sources of data for the same domain relate to or match each other? Which business processes create data and in which domains?  

Physical modeling and technical data integration patterns may also be covered here or separately by the data engineering function. 

Data access, privacy, and compliance: Policies and processes that ensure data access is controlled and appropriate, given the consumer and the confidentiality of the data used, including privacy and compliance requirements.  

How do you ensure that access to data is democratized – that everyone who needs the data is able to access and use it, regardless of their technical abilities? How do you request access to the data? How do you ensure that access is only provided where appropriate? How does an organization ensure that any privacy or compliance regulations are enforced? 
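In practice, the entitlement side of this often reduces to a policy that maps datasets to the roles allowed to read them. The sketch below is a minimal illustration; the role and dataset names are hypothetical:

```python
# Hypothetical access policy: which roles may read which datasets.
ACCESS_POLICY = {
    "client_pii":  {"compliance", "client_service"},
    "positions":   {"portfolio_management", "risk", "performance"},
    "market_data": {"portfolio_management", "risk", "performance", "research"},
}

def can_access(role: str, dataset: str) -> bool:
    """Return True if the role is entitled to the dataset under the policy.

    Unknown datasets are denied by default (fail closed).
    """
    return role in ACCESS_POLICY.get(dataset, set())
```

A real implementation would sit behind the firm’s identity provider and log every request, but the deny-by-default shape is the key design choice.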

Data quality management: Measuring, testing, and managing the quality of the data moving through the data pipeline, i.e., the process of collecting, validating, transforming, and moving data from one or many sources to a destination such as a data warehouse or another type of data platform.  

Was the data received at its destination? Is the data complete? Does it have the correct “as of” date? Are the values in critical data attributes correct? 

These data quality tests are applied before, during, or at the end of data pipeline processing, throughout the entire investment lifecycle, to validate that the data meets quality and completeness standards. 
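The completeness, timeliness, and validity questions above can be expressed as simple batch-level checks. This is a rough sketch under assumed record shapes, not a production framework:

```python
from datetime import date

def run_quality_checks(records, expected_count, as_of, critical_fields):
    """Apply basic completeness, timeliness, and validity tests to a batch."""
    issues = []
    # Completeness: did the full batch arrive at its destination?
    if len(records) < expected_count:
        issues.append(f"completeness: received {len(records)} of {expected_count} rows")
    for i, row in enumerate(records):
        # Timeliness: does each row carry the expected "as of" date?
        if row.get("as_of") != as_of:
            issues.append(f"timeliness: row {i} has as-of {row.get('as_of')}, expected {as_of}")
        # Validity: are critical attributes populated?
        for f in critical_fields:
            if row.get(f) in (None, ""):
                issues.append(f"validity: row {i} missing critical field '{f}'")
    return issues

# Hypothetical pricing batch with one missing value
batch = [
    {"security_id": "XS123", "price": 101.25, "as_of": date(2024, 3, 29)},
    {"security_id": "XS456", "price": None,   "as_of": date(2024, 3, 29)},
]
problems = run_quality_checks(batch, expected_count=2,
                              as_of=date(2024, 3, 29),
                              critical_fields=["security_id", "price"])
```

The output of checks like these feeds directly into the issue management workflow described next, rather than being reviewed by eye.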

Data issue management: Reporting, tracking, and resolving issues with the data. 

Who do you notify if data is missing? How do you notify the appropriate team? How do you know what is being done about the issue reported? How do you monitor status? 

These are the workflows and processes used to raise issues with the data, track resolution progress, and establish a feedback loop for continuous data improvements back to the team responsible for the development work on the data pipeline.  
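A data issue ticket with an auditable status workflow might look like the following sketch; the statuses, IDs, and team names are assumptions for illustration:

```python
from dataclasses import dataclass, field
from datetime import datetime

# Assumed ordered workflow; real firms will define their own statuses.
STATUSES = ["open", "triaged", "in_progress", "resolved"]

@dataclass
class DataIssue:
    """Minimal data-issue ticket with an ordered status workflow."""
    issue_id: str
    domain: str
    description: str
    reported_by: str
    assigned_team: str
    status: str = "open"
    history: list = field(default_factory=list)  # (timestamp, status, note)

    def advance(self, note: str = ""):
        """Move to the next status, recording the transition for auditability."""
        idx = STATUSES.index(self.status)
        if idx == len(STATUSES) - 1:
            raise ValueError("issue already resolved")
        self.status = STATUSES[idx + 1]
        self.history.append((datetime.now().isoformat(), self.status, note))

# Hypothetical issue raised by a downstream consumer
issue = DataIssue("DQ-101", "security_master", "Missing coupon for XS123",
                  reported_by="Performance Team",
                  assigned_team="Reference Data Team")
issue.advance("confirmed missing in vendor feed")
```

The history list is what closes the feedback loop: reporters can monitor status, and recurring entries point the pipeline team at systemic fixes.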

Data governance helps maximize the value of data

Data is at the heart of an asset management firm. A clearly defined data governance framework supports data quality, reliability, and accessibility throughout the investment lifecycle and creates a common and shared understanding of the data across the organization. 

Part 2 of this series will include a deeper discussion of data governance framework and policies, defining best practices and processes needed to support a strong culture of data in asset management organizations.

At Grandview Analytics, we are here to help our clients solve data challenges and create strong data cultures through our consulting and managed data services. Feel free to reach out to Arun to discuss this article or for more information about how we can help: arun.krishnamurthy@grandviewanalytics.com.

ABOUT GRANDVIEW ANALYTICS

Grandview Analytics is a technology consulting and investment data management software company serving financial institutions. We offer data strategy, technology implementation, systems integration, and analytics consulting services as well as an outsourced investment data management and reporting service powered by our proprietary, cloud-based platform, Rivvit.

Our services drive improved business processes, integrated technologies, accurate and timely data, and enhanced decision-making capabilities. Our seasoned team of financial industry professionals brings deep business and technical domain expertise across asset classes and trade lifecycle. With hands-on financial industry experience, we execute on complex initiatives that help clients optimize ROI on data and technology investments.

ABOUT THE AUTHOR

Arun Krishnamurthy

Arun is Managing Director at Grandview Analytics, having joined the company in 2023. With 30 years of financial industry experience, Arun has mastered industry-leading best practices for complex data and technology integration, architecture, analytics, and governance programs.
