Vaco by Highspring

Data Engineer (ETL Developer) (Hybrid)

Vaco by Highspring, Tampa, Florida, us, 33646


Position Summary

We are seeking a skilled Data Engineer (ETL Developer) to modernize and optimize our data pipelines. The Data Engineer (ETL Developer) will be responsible for designing, building, and maintaining scalable ETL pipelines to support our analytics and reporting needs. This individual will leverage Azure Data Factory (ADF), SQL Server Integration Services (SSIS), SQL Server, and modern BI platforms (Domo and Power BI) to ensure accurate, timely, and secure data delivery. The ideal candidate will have experience with healthcare data, including payer raw data, NCQA HEDIS quality data, and Medicare Risk Adjustment datasets.

*Cannot work C2C or provide sponsorship*

Duration: 9-month contract

Compensation: $55-$65/hr

Key Responsibilities

- Design, develop, and maintain modernized ETL pipelines using Azure Data Factory, Fabric, Databricks, and/or SSIS to integrate multiple data sources into the enterprise data warehouse.
- Collaborate with analysts, data scientists, and business stakeholders to deliver clean, reliable, and well-structured data.
- Optimize and refine existing ETL processes to improve performance, scalability, and maintainability.
- Develop, maintain, and/or convert SQL stored procedures, views, functions, and scripts for analytics and reporting.
- Implement data quality checks, error handling, and monitoring to ensure accurate and consistent data flows.
- Use a code repository (Git or similar) to manage, version, and document ETL code.
- Support integration of healthcare data sources, including claims, eligibility, provider rosters, and quality/risk adjustment data.
- Partner with business teams to enable HEDIS measure calculation, quality reporting, and risk adjustment analytics.
- Collaborate with the BI team to deliver data models and datasets for dashboards in Power BI and Domo.
- Ensure adherence to data governance, compliance, and security best practices (HIPAA, PHI/PII handling).
- Troubleshoot data issues, perform root cause analysis, and implement preventive measures.

Required Qualifications

- Bachelor's degree in Computer Science, Information Systems, or a related field; or equivalent experience.
- 5+ years of professional experience in ETL development and SQL-based data warehouse engineering.
- Strong expertise with SQL Server Integration Services (SSIS) and T-SQL.
- Proven experience integrating and transforming healthcare payer data (claims, eligibility, encounters).
- Proficiency with Git or other version control systems for code management.
- Experience supporting data visualization/reporting in Power BI and Domo.
- Strong understanding of data modeling, normalization, indexing, and query optimization.
- Strong problem-solving skills and the ability to communicate technical concepts to non-technical stakeholders.
- Familiarity with HIPAA compliance, data privacy, and PHI/PII handling.

Preferred Skills

- Experience with Azure Data Factory or Fabric (or similar) and other Azure services (Azure SQL Database, Data Lake, Synapse).
- Working knowledge of NCQA HEDIS quality measures and related data sets.
- Experience with Medicare Risk Adjustment (HCC coding, RAF scores) data structures and reporting.
- Background in healthcare analytics, value-based care, or health plan operations.
- Familiarity with APIs, scripting (Python/PowerShell), and automation for data pipeline orchestration.