ZipRecruiter
Overview
Job Description
Responsibilities
- Own the Enterprise Data Health Metrics, including the Enterprise Data Quality Scorecard, ensuring consistent definitions, targets, and drill-downs to remediation owners.
- Chair the bi-weekly Data Quality Operating Review with the PMO: highlight top issues and risks, track MTTR, drive RCAs and prevention actions, and publish decisions and actions.
- Design and govern the data quality (DQ) rule lifecycle (critical data elements, dimensions, thresholds, exception handling) with shift-left prevention in bronze → silver → gold (medallion) flows.
- Establish data contracts and test gates in CI/CD so bad data is caught pre-production; scale rule reuse via libraries and templates (see the sketch after this list).
- AI-era data quality & observability: introduce AI-assisted DQ (anomaly/outlier detection, drift monitoring, LLM-assisted rule authoring/explanations, and synthetic data for validation).
- Partner with AI & Analytics Governance to ensure training/evaluation datasets meet quality, lineage, and consent constraints; contribute to trustworthy AI practices.
- Partner with PMO & Governance Ops, Domain Stewards, the MDM team, and AI & Analytics teams to prioritize fixes with clear owners and dates.
- Work with Data Engineering to embed medallion/lakehouse guardrails (validation at each hop, schema evolution patterns, replay/rollback).
- Quantify the business impact of fixes (defect burn-down, SLA attainment, incident cost avoidance) and report outcomes to the D&A Council.
- Maintain an executive-ready DQ narrative: where trust is improving, where risk remains, and which guardrails will accelerate value next.
- Lead a lean operating model for data quality, including a standardized action-plan process (clear owners, due dates, countermeasures), tiered operating reviews, and visual management to track and sustain improvements.

Technical skills:
- KPI/metric layer & dashboards: semantic model understanding; KPI definition and standardization; connected metrics, scorecard roll-ups, drill paths, metric certification/versioning.
- Alation (or similar): glossary curation, certification workflows, stewardship enablement, catalog adoption analytics.
- Lakehouse/medallion on Azure Databricks/Delta Lake: bronze/silver/gold validation patterns; data contracts; CI/CD for pipelines.
- Data quality & observability tooling: familiarity with Great Expectations/Deequ/Soda for rules and tests; platforms like Monte Carlo/Bigeye for anomaly/drift/lineage alerting.
- Pipelines: strong SQL; working Python for profiling/tests; comfort with Git-based workflows and CI/CD (Azure DevOps/GitHub Actions).
- Lineage & metadata: capturing technical lineage, mapping to business terms; connecting lineage to issue management and RCAs.
- Security & privacy basics: data classification, masking/PII handling, retention; partnering with Security/Privacy on guardrails.
- AI data readiness: quality criteria for model training/eval sets; data drift and bias detection signals; alignment with NIST AI RMF Map/Measure/Manage.
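To make the data-contract/test-gate idea concrete, here is a minimal sketch of what such a pre-production gate can look like, not this team's actual implementation: the contract is encoded as expected columns, dtypes, and null-rate thresholds, and the script exits non-zero so a CI/CD stage blocks promotion. The column names and thresholds are hypothetical.

```python
# Minimal data-contract test gate (illustrative sketch; contract fields,
# column names, and thresholds below are hypothetical).
import pandas as pd

# The "contract": expected columns/dtypes plus tolerated null rates.
CONTRACT = {
    "columns": {
        "customer_id": "int64",
        "email": "object",
        "created_at": "datetime64[ns]",
    },
    "max_null_rate": {"customer_id": 0.0, "email": 0.02},
}

def validate(df: pd.DataFrame) -> list[str]:
    """Return a list of contract violations; an empty list means the gate passes."""
    failures = []
    for col, dtype in CONTRACT["columns"].items():
        if col not in df.columns:
            failures.append(f"missing column: {col}")
        elif str(df[col].dtype) != dtype:
            failures.append(f"{col}: expected dtype {dtype}, got {df[col].dtype}")
    for col, limit in CONTRACT["max_null_rate"].items():
        if col in df.columns:
            rate = df[col].isna().mean()
            if rate > limit:
                failures.append(f"{col}: null rate {rate:.2%} exceeds {limit:.2%}")
    return failures

if __name__ == "__main__":
    import sys
    df = pd.read_parquet(sys.argv[1])  # e.g., a bronze-layer extract
    problems = validate(df)
    if problems:
        print("\n".join(problems))
        sys.exit(1)  # non-zero exit fails the CI/CD stage, blocking promotion
```

In practice the same checks would typically be expressed as Great Expectations/Deequ/Soda rules rather than hand-rolled asserts, which is what makes rule reuse via libraries and templates possible.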
- Executive storytelling & influence: builds the business case with crisp visuals and outcome-oriented communication.
- Program orchestration: runs a recurring decision-making forum with clear actions and follow-through.
- Product thinking: prioritizes a scorecard backlog for adoption and impact; ships iteratively with telemetry.
- Systems thinking: connects people, process, and technology to prevent repeat defects.
- Facilitation & negotiation: aligns multiple domains on definitions, thresholds, and timelines without formal authority.
Qualifications & Requirements
- Minimum education: Bachelor's degree from an accredited institution.
- Experience: minimum 10 years in data management/analytics, with 5 years in enterprise data governance and/or quality (rules, scorecards, issue management) in federated organizations.
- Position criteria: proven product ownership of dashboards/scorecards (backlog, UX, adoption, measurable outcomes); hands-on with root-cause analysis, defect prevention, and cross-domain remediation to closure.
- Must be able to work in the United States without corporate sponsorship now and in the future.
- Tools & capabilities: strong SQL; Python for profiling/testing; Git-based workflows and CI/CD; familiarity with data quality and observability tooling (Great Expectations/Deequ/Soda; Monte Carlo/Bigeye); lineage and metadata capture; data contracts and CI/CD integration (a minimal profiling sketch follows this list).
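For the profiling side of "Python for profiling/testing", here is a lightweight sketch of the kind of per-column profile that feeds a DQ scorecard; the source file and column contents are hypothetical.

```python
# Quick data-profiling pass (illustrative; file and column names are hypothetical).
import pandas as pd

def profile(df: pd.DataFrame) -> pd.DataFrame:
    """Summarize per-column completeness and cardinality for a DQ scorecard."""
    return pd.DataFrame({
        "dtype": df.dtypes.astype(str),
        "null_rate": df.isna().mean(),        # completeness signal
        "distinct": df.nunique(dropna=True),  # cardinality / uniqueness signal
        "sample": df.apply(
            lambda s: s.dropna().iloc[0] if s.dropna().size else None
        ),
    })

df = pd.read_csv("customers.csv")  # hypothetical source extract
print(profile(df).to_string())
```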
Other Information
The application window for this position is anticipated to close on 9.5.2025. We are committed to equal employment opportunities for all applicants and employees, and employment decisions are based on job-related reasons regardless of any legally protected status. Qualified applicants with arrest or conviction histories will be considered consistent with local laws. You do not need to disclose conviction history or participate in a background check until a conditional job offer is made. To request a reasonable accommodation, please call 1-800-836-6345; only accommodation requests will be accepted at this number. Please note that program options may depend on location, date of hire, and applicable collective bargaining agreements.