IMAGINEEER LLC

Sr ETL Specialist/ETL Engineer/ETL Developer

IMAGINEEER LLC, Washington, District of Columbia, US, 20022


Benefits

- 401(k) matching
- Competitive salary
- Health insurance
- Paid time off

About this Role

The Senior ETL Specialist / ETL Engineer will support enterprise data integration and modernization efforts for the U.S. Department of Health and Human Services (HHS). The role involves designing, developing, and maintaining Extract, Transform, Load (ETL) processes that enable secure, accurate, and reliable data movement across federal systems and mission-critical programs. The ETL Specialist will work as part of a data engineering and analytics team to support data warehousing, business intelligence (BI), and enterprise reporting initiatives.

The ideal candidate will possess deep ETL development expertise, strong SQL and data modeling skills, and experience with enterprise data management solutions. This role requires the ability to build scalable data pipelines, optimize data flows, and ensure data integrity across multiple platforms. Experience in a federal environment and knowledge of HHS mission priorities are highly desirable.

Key Responsibilities

ETL Development & Data Integration

- Design, build, and maintain scalable ETL/ELT pipelines to support enterprise data warehouse (EDW) and analytics environments.
- Develop data ingestion and transformation routines from multiple structured and unstructured data sources.
- Implement ETL solutions using tools such as Informatica, Talend, SSIS, Azure Data Factory, AWS Glue, or ODI.
- Create reusable components, frameworks, and data integration templates to accelerate delivery.
- Ensure ETL workflows are optimized for performance, maintainability, and reliability.

Data Engineering & Architecture Support

- Collaborate with data architects and analysts to define data models (conceptual, logical, physical).
- Support integration across enterprise applications, including financial, HR, grants, and program systems.
- Implement data quality rules, validation logic, and exception-handling procedures.
- Support Master Data Management (MDM) and metadata initiatives.
- Implement best practices for ETL/data integration architecture aligned with the enterprise data strategy.

Testing, Optimization & Production Support

- Conduct data profiling and validation to ensure ETL output accuracy.
- Perform unit testing, integration testing, and UAT validation for ETL solutions.
- Troubleshoot ETL failures, investigate root causes, and deploy corrective measures.
- Optimize ETL workflows for high performance and reduced processing time.
- Provide production support and participate in on-call rotation as necessary.

Documentation & Collaboration

- Develop and maintain ETL documentation, including design specifications, data flow diagrams, and data lineage documentation.
- Collaborate with analysts, developers, database administrators, and federal stakeholders.
- Participate in Agile ceremonies, including sprint planning, reviews, and retrospectives.
- Support data governance, data security, and compliance initiatives.

Qualifications and Skills

- U.S. Citizen with the ability to obtain and maintain a Public Trust clearance.
- Bachelor’s degree in Computer Science, Data Engineering, Information Systems, or a related field.
- 7+ years of experience in ETL development, data engineering, or enterprise data integration.
- Hands-on experience with ETL tools such as Informatica PowerCenter/IDQ, Talend, SSIS, ADF, AWS Glue, or DataStage.
- Strong proficiency in SQL and relational databases (Oracle, SQL Server, PostgreSQL, or Snowflake).
- Experience with data warehousing concepts (star schema, slowly changing dimensions, data marts).
- Proven experience creating high-performance, secure ETL processes.
- Experience working in Agile or SDLC-driven environments.
- Excellent communication and problem-solving skills.

Desired Skills and Competencies

- Prior experience supporting HHS or other federal agencies (CMS, NIH, HRSA, ACF, FDA).
- Experience building ETL solutions in cloud environments (AWS, Azure, or GCP).
- Familiarity with data governance, metadata management, and data catalog tools.
- Experience with Python, shell scripting, or PowerShell for automation.
- Knowledge of API-based integrations, Kafka, or streaming data pipelines.
- Experience with DevOps tools (Git, Jenkins, Azure DevOps).
- Certifications such as Informatica Developer, Talend Data Integration, AWS Data Analytics, Azure Data.

Flexible work from home options available.
