IsoTalent

Sr. Data Engineer

IsoTalent, Salt Lake City, Utah, United States


Overview

Our client seeks a skilled Sr. Data Engineer to join their growing technology team in the GovTech and healthcare space. Do you have deep experience building and optimizing large-scale data pipelines? Are you passionate about data quality, accuracy, and security? Do you thrive on solving complex integration challenges and collaborating across teams? If yes, this may be the perfect Sr. Data Engineer position for you.

Perks

- Competitive salary, $120,000-$190,000, based on experience
- 401k
- Generous PTO
- $100/month for a wellness benefit
- $50/month for your phone bill

A Day in the Life

In this role, you'll design, build, and maintain the data backbone that powers analytics, reporting, and operational workflows for our client's platform. You'll own end-to-end pipelines, integrations, and data modeling while collaborating with product, engineering, and compliance teams to ensure data is accurate, accessible, and secure.

Responsibilities

- Architect, develop, and operate scalable ETL/ELT pipelines in Python using Apache Airflow, AWS Glue, or similar tools
- Ingest and transform data from internal services and third-party APIs (REST, streaming, webhooks)
- Design and maintain schemas in AWS Redshift, Snowflake, or similar warehouses
- Optimize table structures, partitioning, and indexing for performance and cost efficiency
- Build and maintain integrations with external systems (payment gateways, identity providers, data vendors)
- Implement monitoring, retries, and alerting for reliable data flow
- Enforce data governance, encryption, and access controls in compliance with HIPAA, FedRAMP, and SOC 2
- Provide self-service data access and documentation for stakeholders
- Tune performance and identify opportunities to reduce AWS costs
- Evaluate and prototype emerging data technologies such as Spark, Kafka, or dbt

Requirements and Qualifications

- 7+ years in data engineering or analytics engineering roles
- Experience building and managing ETL/ELT pipelines in Python with tools like Airflow or AWS Glue
- Hands-on experience with cloud data warehouses (e.g., Redshift, Snowflake)
- Proven success integrating and transforming data from APIs, streaming platforms, and message queues
- Familiarity with HIPAA, FedRAMP, and SOC 2 compliance
- Strong communication skills and the ability to collaborate across teams
- Ability to work remotely in UT

About the Hiring Company

Our client is a fast-growing technology company at the intersection of GovTech and healthcare. Their platform helps streamline workflows, enhance data-driven decision-making, and improve outcomes for organizations that serve the public. They are committed to building robust, secure, and high-performance data systems that drive meaningful impact.

How to Apply

Start by filling out this 3-minute, mobile-friendly application here. We look forward to hearing from you!
