Programmers.io
Programmers.io is currently looking for an
ETL Developer
FULL-TIME ROLE (ONLY USC OR GC) - NO C2C
Job Description:
We are seeking a highly experienced ETL Developer with professional experience in HDFS, Hive, Impala, PySpark, Python, and DevOps automation tools such as uDeploy and Jenkins. This role is responsible for managing end-to-end data operations, including HDFS table management, ETL pipeline development, multi-environment codebase governance, platform upgrades, and production support.
The ideal candidate will have strong expertise in Linux system operations, Big Data ecosystem tools, and experience with incident/change management using ServiceNow. This role plays a key part in ensuring the stability, scalability, and efficiency of enterprise data platforms while enabling seamless development-to-production workflows.
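To give a concrete sense of the day-to-day work, here is a minimal PySpark sketch of the kind of ETL pipeline this role covers: reading a Hive table, applying a transformation, and writing a partitioned table back to HDFS. The job name, database, table, and column names below are illustrative assumptions, not details from this posting.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hive support lets the job read and write managed Hive tables on HDFS.
spark = (
    SparkSession.builder
    .appName("daily_orders_etl")  # hypothetical job name
    .enableHiveSupport()
    .getOrCreate()
)

# Extract: read a raw Hive table (illustrative database/table name).
raw = spark.table("raw_db.orders")

# Transform: basic cleansing and a derived partition column.
cleaned = (
    raw.dropDuplicates(["order_id"])
       .filter(F.col("order_total") > 0)
       .withColumn("order_date", F.to_date("order_ts"))
)

# Load: write back as a Hive table partitioned on HDFS.
(
    cleaned.write
           .mode("overwrite")
           .partitionBy("order_date")
           .saveAsTable("curated_db.orders_clean")
)

spark.stop()

In practice a pipeline like this would be parameterized per environment and promoted through CI/CD tooling such as Jenkins and uDeploy, which is the workflow the posting describes.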
Required Qualifications:
Bachelor’s degree in Computer Science, Information Technology, or a related field.
8+ years of experience in Big Data engineering and DevOps practices.
Advanced proficiency in HDFS, Hive, Impala, PySpark, Python, and Linux.
Proven experience with CI/CD tools such as Jenkins and uDeploy.
Strong understanding of ETL development, orchestration, and performance optimization.
Experience with ServiceNow for incident/change/problem management.
Excellent analytical, troubleshooting, and communication skills.
If you are interested, please apply or feel free to share your updated resume at
anas.khan@programmers.io