The Mutual Group
Lead Cloud Data Engineer - Remote Opportunity
The Mutual Group, Atlanta, Georgia, United States, 30383
Lead Cloud Data Engineer – Remote Opportunity at The Mutual Group.
We are building a next-generation Cloud Data Platform to unify data from Policy, Claims, Billing, and Administration systems into a single source of truth. This role is 75% hands‑on: you will design, build, and optimize our modern data ecosystem using the Medallion architecture, Delta Lake, and modern data warehouse technologies such as Snowflake, Synapse, or Redshift.
Responsibilities
Define the strategic roadmap for the enterprise data platform, ensuring scalability, performance, and interoperability across business domains.
Architect and implement cloud‑native, Medallion‑based data architectures (Bronze‑Silver‑Gold layers) for unified and governed data delivery.
Drive standardization of data models, pipelines, and quality frameworks across Policy, Claims, Billing, and Administrative data assets.
Evaluate and implement emerging data technologies to strengthen the platform’s performance, cost efficiency, and resilience.
Design, build, and optimize high‑performance ingestion pipelines using AWS Glue, Databricks, or custom Spark applications.
Automate ingestion of structured, semi‑structured, and unstructured data from APIs, databases, and external data feeds.
Tune and monitor ingestion pipelines for throughput, cost control, and reliability across dev/test/prod environments.
Develop ETL/ELT pipelines hands‑on using Databricks or similar frameworks to transform raw data into curated, consumption‑ready datasets.
Design and develop relational, Data Vault, and dimensional data models to support analytics, BI, and AI/ML workloads.
Define and enforce data quality standards, validation frameworks, and enrichment rules to ensure trusted business data.
Apply data quality, cleansing, and enrichment logic to ensure accuracy and completeness of business‑critical data.
Collaborate with DevOps and Cloud Engineering teams to design automated, infrastructure‑as‑code (IaC) environments using Terraform, CloudFormation, or equivalent tools.
Implement CI/CD pipelines for data pipeline deployment, versioning, and testing.
Lead performance tuning and scalability optimization to ensure a highly available, cost‑efficient data platform.
Implement and enforce data governance, cataloging, and lineage practices using tools such as Purview, Alation, or Collibra.
Partner with InfoSec to implement data privacy, access control, and compliance frameworks aligned with regulatory standards.
Drive consistency and accountability in data stewardship across business and IT teams.
Lead a team of data engineers, providing technical guidance, coaching, and performance mentorship.
Collaborate with Data Architects, Analysts, and Business Leaders to align data solutions with enterprise strategy.
Promote a culture of engineering excellence, reusability, and knowledge sharing across the data organization.
Influence enterprise‑wide standards for data engineering, automation, and governance.
Qualifications
Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
12+ years of experience in data engineering, including at least 3 years in a lead or architect‑level role and at least 8 years on cloud platforms (AWS, Azure, or GCP).
Deep hands‑on experience with Python, SQL, and data modeling (relational and dimensional), as well as Databricks, Spark, AWS Glue, Delta Lake, and Snowflake, Synapse, or Redshift.
Proven experience with Medallion architecture, modern data warehousing principles, data governance, lineage, and CI/CD for data pipelines.
Excellent leadership, communication, and cross‑functional collaboration skills.
Experience with Property & Casualty (P&C) insurance data, such as Policy, Claims, or Billing, preferred.
Familiarity with event‑driven architectures (Kafka, Kinesis) and real‑time data streaming.
Knowledge of machine learning pipeline integration and feature engineering.
Proven ability to lead large‑scale data modernization or cloud migration initiatives.
Compensation
US$140,000–$165,000, commensurate with experience, plus bonus eligibility.
Benefits
Competitive base salary plus incentive plans for eligible team members.
401(k) retirement plan with a company match of up to 6% of your eligible salary.
Free basic life and AD&D insurance, plus long‑term and short‑term disability insurance.
Medical, dental, and vision plans to meet your unique healthcare needs.
Wellness incentives.
Generous time off program that includes personal, holiday, and volunteer paid time off.
Flexible work schedules and hybrid/remote options for eligible positions.
Educational assistance.