The Mutual Group
Lead Cloud Data Engineer - Remote Opportunity
The Mutual Group, Chicago, Illinois, United States
We are building a next‑generation Cloud Data Platform to unify data from Policy, Claims, Billing, and Administration systems into a single source of truth. We are seeking a Lead Cloud Data Engineer who will be 75% hands‑on and play a critical role in designing, building, and optimizing our data ecosystem, leveraging Medallion architecture, Delta Lake, and modern data warehouse technologies such as Snowflake, Synapse, or Redshift.
Responsibilities
Define the strategic roadmap for the enterprise data platform, ensuring scalability, performance, and interoperability across business domains.
Architect and implement cloud‑native, Medallion‑based data architectures (Bronze–Silver–Gold layers) for unified and governed data delivery.
Drive standardization of data models, pipelines, and quality frameworks across Policy, Claims, Billing, and Administrative data assets.
Evaluate and implement emerging data technologies to strengthen the platform’s performance, cost efficiency, and resilience.
Design, build, and optimize high‑performance ingestion pipelines using AWS Glue, Databricks, or custom Spark applications.
Automate ingestion of structured, semi‑structured, and unstructured data from APIs, databases, and external data feeds.
Tune and monitor ingestion pipelines for throughput, cost control, and reliability across dev/test/prod environments.
Develop ETL/ELT pipelines hands‑on using Databricks or similar frameworks to transform raw data into curated, consumption‑ready datasets.
Design and develop relational, Data Vault, and dimensional data models to support analytics, BI, and AI/ML workloads.
Define and enforce data quality standards, validation frameworks, and enrichment rules to ensure trusted business data.
Apply data quality, cleansing, and enrichment logic to ensure accuracy and completeness of business‑critical data.
Collaborate with DevOps and Cloud Engineering teams to design automated, infrastructure‑as‑code environments using Terraform, CloudFormation, or equivalent tools.
Implement CI/CD pipelines for data pipeline deployment, versioning, and testing.
Lead performance tuning and scalability optimization to ensure a highly available, cost‑efficient data platform.
Implement and enforce data governance, cataloging, and lineage practices using tools such as Purview, Alation, or Collibra.
Partner with InfoSec to implement data privacy, access control, and compliance frameworks aligned with regulatory standards.
Drive consistency and accountability in data stewardship across business and IT teams.
Lead a team of data engineers, providing technical guidance, coaching, and performance mentorship.
Collaborate with Data Architects, Analysts, and Business Leaders to align data solutions with enterprise strategy.
Promote a culture of engineering excellence, reusability, and knowledge sharing across the data organization.
Influence enterprise‑wide standards for data engineering, automation, and governance.
Qualifications
Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
12+ years of experience in data engineering, including 3+ years in a lead or architect‑level role and 8+ years on cloud platforms (AWS, Azure, or GCP).
Deep hands‑on experience with Python, SQL, and data modeling (relational and dimensional), Databricks, Spark, AWS Glue, Delta Lake, Snowflake, Synapse, or Redshift.
Proven experience with Medallion architecture, modern data warehousing principles, data governance, lineage, and CI/CD for data pipelines.
Excellent leadership, communication, and cross‑functional collaboration skills.
Experience in the Property & Casualty (P&C) insurance domain, such as Policy, Claims, or Billing data, preferred.
Familiarity with event‑driven architectures (Kafka, Kinesis) and real‑time data streaming.
Knowledge of machine learning pipeline integration and feature engineering.
Proven ability to lead large‑scale data modernization or cloud migration initiatives.
Compensation
$140,000 – $165,000 commensurate with experience, plus bonus eligibility.
Benefits
Competitive base salary plus incentive plans for eligible team members.
401(k) retirement plan with a company match up to 6% of your eligible salary.
Basic life, AD&D, long‑term disability, and short‑term disability insurance.
Medical, dental, and vision plans to meet your unique healthcare needs.
Wellness incentives.
Generous time off program that includes personal, holiday, and volunteer paid time off.
Flexible work schedules and hybrid/remote options for eligible positions.
Educational assistance.