Senior Data Engineer
COVU
COVU is a venture-backed technology startup transforming the insurance industry. We empower independent agencies with AI‑powered insights and digitized operations, enabling them to manage risk more effectively. Our team is building an AI‑first company set to redefine the future of insurance distribution.
Role Overview
We are seeking an experienced and product‑focused Senior Data Engineer to be a core member of our Platform product team. In this high‑impact role, you will evolve our core data infrastructure, building a master data solution that serves as the single source of truth for all policy, commission, and client accounting information.
Responsibilities
Develop the Policy Journal: be a primary builder of our master data solution unifying policy, commission, and accounting data from sources like IVANS and Applied EPIC. Implement data models and pipelines that create the "gold record" powering our platform.
Ensure Data Quality and Reliability: implement robust data quality checks, monitoring, and alerting to guarantee accuracy and timeliness of all data pipelines. Champion best practices in data governance and engineering.
Build the Foundational Analytics Platform: implement and enhance our analytics framework using modern tooling (Snowflake, dbt, Airflow). Build and optimize critical data pipelines, transforming raw data into clean, reliable, dimensional models for business intelligence.
Modernize Core ETL Processes: systematically refactor our existing Java & SQL (PostgreSQL) ETL system. Identify and resolve core issues—data duplication, performance bottlenecks—rewriting critical components in Python and migrating orchestration to Airflow.
Implement Data Quality Frameworks: work within the company QA strategy to build and execute automated data validation frameworks. Write tests ensuring accuracy, completeness, and integrity of our data pipelines and the Policy Journal.
Collaborate and Contribute to Design: partner with product managers, the Lead Data Engineer, and business stakeholders to translate complex business requirements into well‑designed, maintainable solutions.
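To make the "gold record" and data‑quality responsibilities above concrete, here is a minimal sketch in Python. It assumes a hypothetical, simplified `PolicyRecord` shape and an illustrative rule that rows from the agency management system (Applied EPIC) take precedence over carrier downloads (IVANS) when the same policy appears in both feeds; the real Policy Journal would involve far richer matching and validation logic.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class PolicyRecord:
    # Hypothetical, simplified shape of a unified policy record.
    policy_number: str
    carrier: str
    premium: float
    source: str  # e.g. "IVANS" or "EPIC"

def build_gold_records(records):
    """Deduplicate records by policy number, preferring the AMS feed.

    Illustrative assumption: when the same policy arrives from both a
    carrier download (IVANS) and the agency management system (EPIC),
    the EPIC row wins.
    """
    preference = {"EPIC": 0, "IVANS": 1}
    gold = {}
    # Sort so preferred sources come first; setdefault keeps the first
    # record seen for each policy number.
    for rec in sorted(records, key=lambda r: preference.get(r.source, 99)):
        gold.setdefault(rec.policy_number, rec)
    return list(gold.values())

def validate(records):
    """Basic quality checks: unique keys and non-negative premiums."""
    errors = []
    keys = [r.policy_number for r in records]
    if len(keys) != len(set(keys)):
        errors.append("duplicate policy numbers")
    errors += [f"negative premium: {r.policy_number}"
               for r in records if r.premium < 0]
    return errors
```

In practice, checks like `validate` would run as automated tests inside the pipeline (e.g. as dbt tests or Airflow task-level assertions) rather than ad hoc functions, but the shape of the logic is the same.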
Qualifications
5+ years of data engineering experience, building and maintaining scalable production pipelines.
Expert‑level proficiency in Python and SQL.
Strong experience with a modern data stack: cloud data warehouse (Snowflake or Redshift), orchestration (Airflow preferred), and data transformation tools.
Hands‑on experience with AWS data services (S3, Glue, Lambda, RDS).
Experience in the insurtech industry and familiarity with insurance data concepts (policies, commissions, claims).
Demonstrated ability to design and implement robust data models (dimensional modeling) for analytics and reporting.
Pragmatic problem‑solver who can refactor complex legacy systems; understanding of Java/Hibernate logic is a plus.
Excellent communication skills and ability to collaborate with technical and non‑technical stakeholders.
Bonus Points For
Direct experience with Agency Management Systems such as Applied EPIC, NowCerts, EZLynx, etc.
Experience with carrier data (ACORD XML, IVANS AL3).
Experience with BI tools like Tableau, Looker, or Power BI.
Prior startup or fast‑paced agile environment experience.
Application Process
Intro call with the People team
Technical interviews
Final interview with company leadership