MetLife

Bootstrapped federated data governance by introducing a strategy and the capabilities below to reinvigorate the 50+ constituent DG programs across BUs, functions, and regions.

Global Data Governance Management:
* Program Assessment: Assessed the Data Governance program’s maturity at 1.5/5 and developed a strategy to accelerate it.
* Global Data Governance Strategy: Bootstrapped the federated data governance model by formulating the MDGO global strategy, balancing offensive data enablement for business value (data products) with defensive data risk management (policies and controls), to reinvigorate the 50+ individual DG programs across BUs, regions, and functions.
* Strategy Socialization & Adoption: Socialized the MDGO global strategy with other data VPs and peer DG Directors for adoption.
* Global Data Domain (GDD) Standardization and Data Products: Standardized 17 enterprise GDDs to align the 150+ existing GDDs across 50+ DG programs, and provisioned business-product-oriented data products in a data-mesh architecture with stewardship.
* Active Data Intelligence: Optimized metadata management with standards-based curation, passive-to-active ingestion, and quality checks to provision multi-modal metadata products, insights, and active data intelligence, increasing data catalog utilization.
* Unstructured Data Governance: Researched market-leading tools, performed POCs, and selected an unstructured data governance tool with the approval of peer Infosec and Privacy leaders. Implemented Ohalo with GenAI for content intelligence, and unified and contextualized data with knowledge graphs.
* Security & Access Modernization: Aligned data security classification, labeling, and tagging of sensitive data between structured/unstructured DG and DSPM tools for uniform data discovery and security posture. Explored policy-based access controls for non-human (AI) identities and persona/risk-based access, coupled with usage analytics for continuous optimization (see the sketch after this list).
* Data Risk for NFRC and TPRM: Established data risk as a category under the Non-Financial Risk Committee (NFRC). Performed process-risk-controls analysis of DG operations to identify DG controls tied to data and role accountabilities for data risk management. Also introduced a DG checklist as a preliminary step in Third-Party Risk Management (TPRM) to evaluate the inherent risk of third-party acquisitions or integrations.
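
A minimal sketch of the policy-based access control idea above, assuming a simple attribute-based policy table; all identities, personas, and policy values here are illustrative, not an actual MetLife or vendor API:

```python
from datetime import datetime, timezone

# Illustrative sensitivity ladder and policy table (hypothetical values).
LEVELS = ["public", "internal", "confidential", "restricted"]
POLICIES = [
    {"identity_type": "human", "persona": "claims-analyst", "max_sensitivity": "confidential"},
    {"identity_type": "ai-agent", "persona": "summarization-bot", "max_sensitivity": "internal"},
]
USAGE_LOG = []  # usage analytics feed for continuous policy optimization

def is_allowed(identity_type: str, persona: str, data_sensitivity: str) -> bool:
    """Grant access if a policy matches the identity and covers the sensitivity level."""
    for policy in POLICIES:
        if policy["identity_type"] == identity_type and policy["persona"] == persona:
            allowed = LEVELS.index(data_sensitivity) <= LEVELS.index(policy["max_sensitivity"])
            USAGE_LOG.append((datetime.now(timezone.utc), identity_type, persona,
                              data_sensitivity, allowed))
            return allowed
    return False  # default deny, including unknown non-human identities

# An AI agent asking for confidential data is denied; internal data is allowed.
print(is_allowed("ai-agent", "summarization-bot", "confidential"))  # False
print(is_allowed("ai-agent", "summarization-bot", "internal"))      # True
```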

US Business & Corporate Functions:
* ESG-CSRD Data Governance: Governed Environmental, Social, and Governance – Corporate Sustainability Reporting Directive (ESG-CSRD) data with the rigor of a regulatory data domain to ensure accuracy, integrity, and timeliness and to mitigate reputational risk.
* AI Governance: Developed the AI Governance policy with the (defensive) data risk council in alignment with the NIST AI RMF.
* Responsible AI Compliance: Implemented RAI regulatory compliance with insurance-industry laws from the Colorado DOI and NY DFS.
* GenAI-enabled Legacy Modernization: Guided reverse-engineering of legacy systems with the Intellisys tool to extract operational logic, and further supported extraction of business and data rules and decision tables to enable RAG for GenAI use cases.

The Data Governance assessment was performed by interviewing all data VPs and directors and by sampling the DG processes.

The Offensive Data Enablement strategy is focused on the following:
* Operational global data domains (e.g., the Claims data domain) are aligned to core business capabilities (e.g., Claims management) and standardized as a model for the constituent DG programs to follow.
* Product-oriented data products (e.g., dental) are provisioned from these standardized data domains in a data-mesh architecture for enterprise-wide consumption (see the sketch after this list).
* Stewardship of these business-aligned data domains and their respective product-oriented data products is paramount to optimal data asset management and utilization value (measured through usage analytics).
* This strategy is actively supported by the BU leaders represented on the Data Governance Council.
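
As a concrete illustration of the product orientation above, here is a minimal sketch of a data product descriptor in a data mesh; all names, fields, and the usage-analytics hook are hypothetical, not an actual MetLife artifact:

```python
from dataclasses import dataclass, field

@dataclass
class DataProduct:
    """Hypothetical descriptor tying a data product to its standardized GDD and steward."""
    name: str                 # e.g., "dental-claims"
    global_data_domain: str   # standardized GDD it derives from, e.g., "Claims"
    business_capability: str  # core business capability the domain aligns to
    steward: str              # accountable data steward
    output_ports: list = field(default_factory=list)  # consumption interfaces
    usage_count: int = 0      # fed by usage analytics to measure utilization value

    def record_access(self) -> None:
        """Usage analytics hook: count each consumption of the product."""
        self.usage_count += 1

# Example: a dental data product provisioned from the standardized Claims domain.
dental = DataProduct(
    name="dental-claims",
    global_data_domain="Claims",
    business_capability="Claims management",
    steward="claims-domain-steward@example.com",
    output_ports=["warehouse.claims.dental_v1"],
)
dental.record_access()
print(dental.name, dental.usage_count)  # dental-claims 1
```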

Defensive Data Risk management:
* Began with establishing data risk as a category under the Non-Financial Risk framework.
* All data governance processes were then analyzed to identify risks and establish appropriate risk controls (Process-Risk-Control framework).
* This effort was performed under the oversight of the Operational Risk and Internal Audit leaders.
* The Risk committee, comprising leaders of the Governance, Risk, Compliance, Privacy, Legal, Architecture, and Info-sec functions, has evolved into the policy-making working body that shapes Policies & Standards for presentation to the DGC for ratification.

Metadata management:
* A new 5-tier, 15-component framework was introduced to address all aspects of the metadata lifecycle, from ingestion to provisioning.
* Metadata category-level accountability is strongly advocated during the curation stage, prior to ingestion into the catalog, ensuring meaningfully accurate and complete metadata.
* Data policies, applicable regulations, and data quality rules and thresholds are connected to CDEs in the catalog for a better understanding of the business terms, related processes, and compliance needs.
* Metadata-based metrics visualizations are repurposed for specific end users with actionable intent.
* Active data intelligence is provisioned for the most relevant questions directly from the catalog (see the sketch after this list).
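
A minimal sketch of what that catalog linkage and question-answering could look like; the CDE, policy, regulation, and rule values below are illustrative placeholders, not actual catalog content:

```python
# Illustrative CDE entry connecting policies, regulations, and DQ rules (hypothetical values).
cde_catalog = {
    "customer_ssn": {
        "business_term": "Customer Social Security Number",
        "policies": ["Data Privacy Policy"],
        "regulations": ["GLBA"],
        "dq_rules": [{"rule": r"matches ^\d{3}-\d{2}-\d{4}$", "threshold": 0.99}],
    },
}

def answer(cde: str) -> str:
    """Active data intelligence: answer a common governance question directly from the catalog."""
    entry = cde_catalog[cde]
    return (f"{entry['business_term']} is governed by {', '.join(entry['policies'])}; "
            f"applicable regulations: {', '.join(entry['regulations'])}; "
            f"DQ threshold: {entry['dq_rules'][0]['threshold']:.0%}.")

print(answer("customer_ssn"))
```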

Unstructured data governance was introduced:
* To manage all content (docs, PDFs, PPTs, architecture diagrams, etc.) across all SharePoint sites and shared drives.
* To scan for sensitive data elements and label them appropriately so that compliance processes can be applied.
* ML-based regular-expression patterns are synchronized across the structured, unstructured, and info-sec tools for uniformity (see the sketch after this list).
* Sanitized data, as approved by the divisional data leaders, is ingested into LLMs/agents for chunking and summarization to support a variety of GenAI use cases.
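
A minimal sketch of the scan-and-label step, assuming plain regular expressions; the actual patterns are ML-assisted and shared across tools, and the ones below are simplified placeholders:

```python
import re

# Simplified placeholder patterns; production patterns are ML-assisted and
# synchronized across structured, unstructured, and info-sec tools.
SENSITIVE_PATTERNS = {
    "US_SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def scan_and_label(text: str) -> set:
    """Return the set of sensitivity labels triggered by a document's text."""
    return {label for label, pattern in SENSITIVE_PATTERNS.items()
            if pattern.search(text)}

doc = "Member SSN 123-45-6789; contact jane.doe@example.com for appeals."
print(scan_and_label(doc))  # e.g. {'US_SSN', 'EMAIL'} -> route to compliance processes
```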

Legacy Modernization – a new GenAI-based approach is being explored:
* All operational logic was reverse-engineered out of the legacy systems.
* AI tools are being explored to extract rules and decision tables from that logic.
* A RAG knowledge base is being built from the identified rules and decision patterns to support a variety of use cases, such as chatbots (see the sketch after this list).
* As the RAG knowledge base matures, the replacement systems for the legacy platforms can be developed using vibe coding.
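
A minimal retrieval sketch for such a RAG knowledge base; the rules are invented examples, and a real build would use embeddings and a vector store rather than the word-overlap ranking shown here:

```python
# Invented example rules standing in for the extracted business rules and decision tables.
RULE_BASE = [
    "Claims over $10,000 require supervisor approval before payment.",
    "Dental claims are auto-adjudicated when the provider is in-network.",
    "Policies lapse after 60 days of non-payment unless reinstated.",
]

def retrieve(question: str, k: int = 2) -> list:
    """Rank rules by naive word overlap with the question (embeddings in a real build)."""
    q_words = set(question.lower().split())
    return sorted(RULE_BASE,
                  key=lambda rule: len(q_words & set(rule.lower().split())),
                  reverse=True)[:k]

# The retrieved rules would be passed as grounding context to the LLM (e.g., a chatbot).
print(retrieve("When are dental claims auto-adjudicated?"))
```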