Eliassen Group
Senior Data Engineer – Apache Stack / ETL Platform Build
Eliassen Group, Westlake, Texas, United States
Description:
Hybrid: Westlake, TX
Our client, a leader in their industry, has an excellent opportunity for a Data Engineer to work in a contract position in Westlake, TX or Merrimack, NH. This position is hybrid onsite for two weeks a month and candidates must be local to the metro area.
This Data Engineer role is not your typical ETL developer position; it's a strategic, hands-on engineering role in which you'll play a direct part in building a brand-new internal ETL tool using open-source technology to replace costly legacy solutions such as Informatica.
Due to a client requirement, applicants must be willing and able to work on a W-2 basis. For our W-2 consultants, we offer a great benefits package that includes medical, dental, and vision benefits, a 401(k) with company matching, and life insurance.
Responsibilities
Design, develop, and maintain robust data pipelines to support ETL/ELT use cases
Implement and optimize workflows using Apache Hop, Apache Beam, NiFi, Spark, and Airflow
Deploy and manage containerized applications in a Kubernetes‑based cloud environment
Utilize Python and Java for API integration and data engineering solutions
Apply Infrastructure-as-Code practices using tools like Terraform, CloudFormation, or ARM templates
Experience Requirements
10+ years of professional IT experience in data engineering-related roles
Hands-on experience with Apache tools such as Hop, Beam, NiFi, Spark, and Airflow
Strong proficiency in Python and Java for API connection work (must‑have)
Strong background in cloud platforms such as AWS or Azure
ETL tool expertise including Informatica, Python, or dbt (with a preference for open-source tools)
Infrastructure‑as‑Code practices using tools like Terraform, CloudFormation, or ARM
Kubernetes – ability to containerize and deploy an ETL job to the cloud
Preferred
Certification in Azure or AWS cloud platforms
Skilled in CI/CD pipelines using GitHub, Jenkins, and Artifact
Experience with SnapLogic or Informatica for data integration
Monitoring and alert configuration experience for high‑availability environments
Familiarity with marketing data sets or customer data platforms
Effective communicator and problem solver in agile, fast‑paced teams
Education Requirements: Bachelor's degree required