Wellabe
Data Engineer - Hybrid, Des Moines, Iowa
Join Wellabe's Chief Data & Analytics Office as we modernize our data ecosystem and migrate to a cutting-edge Databricks Lakehouse architecture on Azure. We're hiring a Data Engineer to help build a Single Source of Truth that powers analytics, reporting, and AI across our insurance business. Position level will be determined based on the individual's skill, knowledge, and experience.

At the Data Engineer level, you will collaborate with business stakeholders, data scientists, and technology teams to design, build, and optimize cloud-based data pipelines. Your work will ensure secure, scalable, and reliable data delivery across departments including sales, actuarial, finance, operations, and marketing.

At the Senior Data Engineer level, you will take on a leadership role in designing and delivering the data platform. You'll provide architectural guidance, mentor junior engineers, and lead the development of scalable data solutions. You'll work closely with architects, actuaries, and business leaders to enable enterprise-wide analytics and AI/ML capabilities.

Essential Functions

Shared Responsibilities (Both Levels):
- Design, build, and maintain ETL/ELT pipelines in Databricks using Delta Live Tables (see the first sketch after this section).
- Migrate data from legacy Azure SQL Server environments to the Azure Databricks Lakehouse (see the second sketch after this section).
- Collaborate with data architects and business stakeholders to support enterprise-wide analytics.
- Ensure data quality, pipeline integrity, and secure handling of sensitive data (PII, PHI).
- Partner with analytics teams to enable AI/ML and self-service reporting.
- Support CI/CD and DevOps practices using Azure DevOps and GitHub Actions.

Additional Responsibilities for Senior Data Engineer:
- Lead the development of end-to-end migration strategies and architectural decisions.
- Mentor junior engineers and foster a high-performance engineering culture.
- Establish and enforce best practices for data modeling, governance, and metadata management.
- Drive adoption of platform standards such as Unity Catalog and Delta Lake.
- Stay current with emerging technologies and guide platform evolution.

Success Profile by Level

Knowledge, Skills, and Abilities Applicable to Both Roles:
- Proficiency in SQL and Python for data transformation and pipeline development.
- Hands-on experience with Databricks, including PySpark, Delta Live Tables, Lakehouse architecture, and Unity Catalog.
- Familiarity with data modeling, data warehousing, and ETL/ELT design patterns.
- Strong expertise in the Azure ecosystem, including Azure Data Lake, Data Factory, SQL Server, Synapse, Key Vault, and Event Hubs.
- Understanding of insurance industry operations, KPIs, and data domains such as policy, claims, actuarial, and customer data.
- Knowledge of data governance, lineage, and compliance requirements in regulated industries.
- Ability to work effectively in Agile/Scrum delivery models.
- Exposure to real-time data ingestion tools such as Kafka, Event Hubs, or Stream Analytics (see the third sketch after this section).

Additional for Senior Data Engineer:
- Advanced skills in SQL and Python for complex data orchestration and transformation.
- Proven ability to troubleshoot and optimize data workflows, including performance tuning and error handling.
- Demonstrated experience in large-scale data migrations and building enterprise-grade data platforms.
- Expertise in metadata management, data cataloging, and implementing data quality frameworks.
- Strong leadership and mentoring capabilities to support junior engineers and promote engineering best practices.
- Familiarity with DevOps, CI/CD pipelines, and infrastructure-as-code tools such as Azure DevOps, Terraform, and GitHub Actions.
- Strong business acumen with the ability to translate insurance data challenges into strategic technical solutions.
- Databricks certification required, plus deep familiarity with emerging cloud and data technologies.
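To make the day-to-day work concrete, here is a minimal Delta Live Tables sketch of the kind of pipeline this role builds. The table names, landing path, and quality expectation are hypothetical placeholders, not Wellabe's actual schema; `spark` is provided by the Databricks DLT runtime.

```python
# Minimal Delta Live Tables sketch. All names are illustrative assumptions.
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Raw policy records landed from the legacy extract.")
def policy_raw():
    # Auto Loader incrementally discovers new files in the landing zone.
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "parquet")
        .load("/mnt/landing/policy/")  # assumed landing path
    )

@dlt.table(comment="Cleansed policy records with a basic quality gate.")
@dlt.expect_or_drop("valid_policy_id", "policy_id IS NOT NULL")
def policy_clean():
    # Rows failing the expectation are dropped and surfaced in pipeline
    # event metrics rather than silently passed downstream.
    return dlt.read_stream("policy_raw").withColumn(
        "ingested_at", F.current_timestamp()
    )
```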
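Second, a hedged sketch of one common migration pattern from legacy Azure SQL Server into the Lakehouse: a JDBC batch read with credentials pulled from an Azure Key Vault-backed secret scope, written to a Unity Catalog-governed Delta table. The hostname, database, table, scope, and target namespace are illustrative assumptions.

```python
# Batch copy from legacy Azure SQL Server to a bronze Delta table.
# Runs in a Databricks notebook, where `spark` and `dbutils` are provided.
jdbc_url = (
    "jdbc:sqlserver://legacy-sql.example.database.windows.net:1433;"
    "database=policy_db;encrypt=true"
)

df = (
    spark.read.format("jdbc")
    .option("url", jdbc_url)
    .option("dbtable", "dbo.claims")  # hypothetical source table
    # Credentials come from a Key Vault-backed secret scope, never hard-coded.
    .option("user", dbutils.secrets.get("kv-scope", "sql-user"))
    .option("password", dbutils.secrets.get("kv-scope", "sql-password"))
    .load()
)

# Land the copy under Unity Catalog's three-level catalog.schema.table namespace.
df.write.format("delta").mode("overwrite").saveAsTable("lakehouse.bronze.claims")
```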
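Third, a sketch of real-time ingestion from Azure Event Hubs, read through its Kafka-compatible endpoint with Spark Structured Streaming. The namespace, event hub name, checkpoint path, and target table are assumptions, and the connection-string placeholder must be supplied (ideally from a secret scope).

```python
# Streaming ingestion from Azure Event Hubs via its Kafka endpoint.
bootstrap = "my-namespace.servicebus.windows.net:9093"  # assumed namespace

raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", bootstrap)
    .option("subscribe", "claims-events")  # event hub name acts as the topic
    .option("kafka.security.protocol", "SASL_SSL")
    .option("kafka.sasl.mechanism", "PLAIN")
    .option(
        "kafka.sasl.jaas.config",
        'kafkashaded.org.apache.kafka.common.security.plain.PlainLoginModule '
        'required username="$ConnectionString" password="<connection-string>";',
    )
    .load()
)

# Persist raw payloads to a bronze Delta table; the checkpoint makes the
# stream restartable with exactly-once sink semantics.
query = (
    raw.selectExpr("CAST(value AS STRING) AS payload", "timestamp")
    .writeStream.option("checkpointLocation", "/mnt/checkpoints/claims-events")
    .toTable("lakehouse.bronze.claims_events")
)
```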
Qualifications

Data Engineer:
- Bachelor's degree in Business Analytics, Data Science, Computer Science, or a related field.
- 1-3 years of experience in cloud-based data engineering (insurance industry preferred).
- Proficiency in Databricks (PySpark, Delta Live Tables, Spark SQL), SQL, and Python.
- Familiarity with Azure services (Data Lake, Data Factory, Synapse, Key Vault).
- Exposure to DevOps, CI/CD, and real-time ingestion tools (Kafka/Event Hubs).

Senior Data Engineer:
- Bachelor's degree in Business Analytics, Data Science, Computer Science, or a related field.
- 3+ years of experience in cloud-based data engineering with proven leadership.
- Advanced expertise in Databricks and the Azure ecosystem.
- Experience in large-scale data migrations and building enterprise data platforms.
- Strong business acumen and the ability to translate data needs into technical solutions.
- Databricks certification required.

Work Environment & Physical Requirements:
- Primarily office-based or remote work.
- Sedentary work with extensive computer use.
- Regular communication via email, video conferencing, and phone.

Travel Requirements:
- Minimal travel for training, seminars, or conferences.

This job description does not list all activities, duties, or responsibilities that may be required. The employee in this position may be assigned other duties at any time, with or without notice. This job description does not constitute a contract of employment, and the company may exercise its employment-at-will rights at any time.

Equal Opportunity Employer

This employer is required to notify all applicants of their rights pursuant to federal employment laws. For further information, please review the Know Your Rights notice from the Department of Labor.