IMAGINEEER LLC
Senior ETL Specialist / ETL Engineer / ETL Developer
IMAGINEEER LLC, Washington, District of Columbia, United States, 20022
Overview
The Senior ETL Specialist / ETL Engineer will support enterprise data integration and modernization efforts for the U.S. Department of Health and Human Services (HHS). The role is responsible for designing, developing, and maintaining Extract, Transform, Load (ETL) processes that enable secure, accurate, and reliable data movement across federal systems and mission-critical programs. The ETL Specialist will work as part of a data engineering and analytics team to support data warehousing, business intelligence (BI), and enterprise reporting initiatives.

The ideal candidate will possess deep ETL development expertise, strong SQL and data modeling skills, and experience with enterprise data management solutions. This role requires the ability to build scalable data pipelines, optimize data flows, and ensure data integrity across multiple platforms. Experience in a federal environment and knowledge of HHS mission priorities are highly desirable.

Benefits
– 401(k) matching
– Competitive salary
– Health insurance
– Paid time off

Responsibilities
ETL Development & Data Integration
– Design, build, and maintain scalable ETL/ELT pipelines to support enterprise data warehouse (EDW) and analytics environments.

Data Ingestion & Transformation
– Develop data ingestion and transformation routines from multiple structured and unstructured data sources.

ETL Solutions
– Implement ETL solutions using tools such as Informatica, Talend, SSIS, Azure Data Factory, AWS Glue, or ODI.

Reusable Components
– Create reusable components, frameworks, and data integration templates to accelerate delivery.

Performance & Reliability
– Ensure ETL workflows are optimized for performance, maintainability, and reliability.

Data Engineering & Architecture Support
– Collaborate with data architects and analysts to define data models (conceptual, logical, physical).

Cross-Application Integration
– Support integration across enterprise applications, including financial, HR, grants, and program systems.

Data Quality & Governance
– Implement data quality rules, validation logic, and exception-handling procedures; support Master Data Management (MDM) and metadata initiatives; apply best practices for ETL/data integration architecture aligned with the enterprise data strategy.

Testing & Production Support
– Conduct data profiling and validation; perform unit, integration, and UAT testing; troubleshoot ETL failures and deploy corrective measures; optimize workflows for performance; provide production support and participate in an on-call rotation.

Documentation & Collaboration
– Develop and maintain ETL documentation (design specs, data flow diagrams, data lineage); collaborate with analysts, developers, DBAs, and federal stakeholders; participate in Agile ceremonies; support data governance, security, and compliance initiatives.

Qualifications
Citizenship & Clearance
– U.S. citizen with the ability to obtain and maintain a Public Trust clearance.

Education
– Bachelor’s degree in Computer Science, Data Engineering, Information Systems, or a related field.

Experience
– 7+ years of experience in ETL development, data engineering, or enterprise data integration.

Technical Skills
– Hands-on experience with ETL tools such as Informatica PowerCenter/IDQ, Talend, SSIS, Azure Data Factory (ADF), AWS Glue, or DataStage; strong SQL skills and experience with relational databases (Oracle, SQL Server, PostgreSQL, or Snowflake); knowledge of data warehousing concepts (star schema, slowly changing dimensions, data marts).

Methodologies
– Experience working in Agile or SDLC-driven environments; excellent communication and problem-solving skills.

Desired Skills

Prior Federal Experience
– Experience supporting HHS or other federal agencies (CMS, NIH, HRSA, ACF, FDA).

Cloud & Data
– Experience building ETL solutions in cloud environments (AWS, Azure, or GCP); familiarity with data governance, metadata management, and data catalog tools.

Automation & Tech
– Experience with Python, shell scripting, or PowerShell; knowledge of API-based integrations, Kafka, or streaming data pipelines; experience with DevOps tools (Git, Jenkins, Azure DevOps).

Certifications
– Certifications such as Informatica Developer, Talend Data Integration, AWS Data Analytics, or Azure Data.

Other
– Flexible work-from-home options available.