Premier financial institution upgrades its data environment, improving latency, quality, and governance

Grandview helps a client successfully build a Security & Reference Master on Markit EDM, bringing together instrument reference and time-series data from multiple vendors to enable a “golden copy” of data, built on ISO standards, for use across the organization.


Client Profile

A premier financial institution serving the capital markets and other industries.

Business Situation

The client faced inconsistencies in how data was sourced and used across its platform, elevating operational risk and making it difficult to adapt to and deliver products that meet ever-changing client demands. Legacy systems and outdated enterprise data management infrastructure were key contributors to these inconsistencies, producing latency, poor data quality, and increased manual effort to support the existing operating model. The existing landscape also hindered efforts to execute a governance strategy. The client therefore sought a modern, flexible data landscape that would reduce reliance on single-vendor providers, streamline operations, and mitigate risk through automation and governance.

Grandview's Role

Grandview, through its industry and data expertise, assisted in both the advisory and execution phases of the program. 

Phase 1: In the design advisory phase, Grandview guided the selection of the appropriate future-state technology architecture, tools, and target operating model.

  • Gathered and documented core business requirements from business stakeholders
  • Defined future-state technology architecture and its integration paths across existing and new systems/applications
  • Designed target operating model framework
  • Set a strategy for data culture transformation

Phase 2: Grandview provided the designs, detailed roadmaps, and expertise to successfully execute the program.

  • Provided technical business analyst resources to implement the end-to-end solution
  • Integrated a set of developers into Markit EDM’s delivery team for efficient and effective development surrounding data integration, matching logic, and mastering rules across seven data providers
  • Led solution test planning and execution throughout the agile delivery process
  • Designed and implemented an end-to-end Operating Model supported by a formalized ‘knowledge center’ with documented process flows, BAU procedures, controls, and program artifacts
  • Enabled an Alation data catalog to serve as a window into the available data, its structure, and its lineage across the data lifecycle within the organization
  • Conceptualized and architected a data mart and associated synchronization processes that feed a downstream pipeline to Databricks and its Delta Lake architecture
  • Facilitated training across the organization to promote the adoption of new tools and applications and build institutional knowledge surrounding solutions
  • Created a suite of quality checks on key data elements to programmatically highlight and prioritize potential issues for data stewards
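To illustrate the kind of logic the matching and quality-check work above involves, the sketch below shows a simplified identifier “waterfall” for matching security records across vendors and a basic completeness check on key data elements. All field names (`isin`, `cusip`, `ticker`, `currency`, `issue_date`) and rules are illustrative assumptions, not the client’s actual mastering or validation logic in Markit EDM.

```python
# Illustrative sketch only: simplified cross-vendor security matching and a
# completeness check. Field names and rules are assumptions for illustration.

# Try the strongest identifier first, falling back to weaker ones.
MATCH_WATERFALL = ["isin", "cusip", "ticker"]

def match_records(vendor_a: dict, vendor_b: dict):
    """Return the identifier field the two records matched on, or None."""
    for field in MATCH_WATERFALL:
        a, b = vendor_a.get(field), vendor_b.get(field)
        if a and b and a == b:
            return field
    return None

# Assumed key data elements that every mastered record should carry.
REQUIRED_FIELDS = ["isin", "currency", "issue_date"]

def quality_issues(record: dict) -> list:
    """Flag missing key data elements so data stewards can prioritize fixes."""
    return [f for f in REQUIRED_FIELDS if not record.get(f)]
```

For example, two vendor records sharing an ISIN match on the first rung of the waterfall, while a record missing `currency` and `issue_date` is flagged with those two issues for steward review. In a production mastering platform, the waterfall would also handle fuzzy matching, survivorship rules, and vendor precedence.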

Business Results

  • A single source of truth for the organization’s security and reference data, yielding data-sourcing efficiencies and higher-quality data for the services it offers
  • Over one million securities matched across vendors, mastered, and made available to the business
  • A more efficient and scalable operating model offering transparency and heightened controls to promote trust in the data
  • Over 5,000 data objects curated within the Alation data catalog to support efficient and effective business and change efforts
  • Clear ownership of data, processes, and controls, enabling faster and more effective action and resolution
  • Reduced vendor-dependency risk through diversified vendor integrations and established fallback redundancies
  • Reduction in operational risk via automated controls and enhanced business logic to minimize manual intervention
  • Integration of the security reference data with the Databricks platform gives users access to significant compute resources, allowing complex modeling simulations to run with reduced processing times
  • Improved ability to capture revenue opportunities by scaling operations and shortening time-to-market for new client products