Cynet Systems Inc.
Overview
The Senior Ab Initio Data Integration Developer will work closely with Business Analysts and the Product team to gather data requirements, design and build ETL pipelines, and transform data into consumable layers for various applications.
The role includes supporting and enhancing existing data pipelines, documenting technical designs, and ensuring high-quality, reusable code.
Essential Functions
Collaborate with Business Analysts and Product teams to gather and understand data requirements.
Design and develop Ab Initio data graphs and data pipelines to extract data from various databases, flat files, and message queues.
Transform and prepare data to create a consumable data layer for multiple application use cases.
Support data pipelines by fixing bugs and implementing enhancements.
Document technical designs, operational runbooks, and related artifacts.
Perform unit testing, troubleshoot production issues, and ensure pipeline reliability.
Mentor junior developers and provide guidance on best practices.
Qualifications and Skills
Bachelor’s degree in Computer Science, Information Technology, Engineering, or a related field.
10+ years of IT experience, predominantly in Data Integration or Data Warehousing.
5+ years of ETL design and development experience using Ab Initio.
1-2 years of Data Integration project experience on Hadoop platforms, preferably Cloudera.
Experience with Ab Initio CDC (Change Data Capture) in ETL projects is a plus.
Working knowledge of HDFS, Hive, Impala, and other Hadoop technologies.
Familiarity with AWS services is desirable.
Strong SQL skills with the ability to write optimized queries.
Solid understanding of OLTP and OLAP data models and data warehouse fundamentals.
Experience with Unix/Linux shell scripting.
Knowledge of Agile development practices.
Familiarity with Java development is a plus.
Commitment to high code quality, automated testing, and reusable code components.
Ability to work independently and collaboratively in a team environment.