Job Summary
We are seeking a highly skilled Ab Initio Senior Developer to design, develop, and implement enterprise-grade ETL solutions. The ideal candidate will have extensive experience with Ab Initio components, data integration, and data pipeline optimization in large-scale environments, with a strong understanding of data warehousing and data architecture principles.
Key Responsibilities
Design, develop, and deploy ETL workflows using Ab Initio (Graphical Development Environment, Express>It, and Control Center).
Work on end-to-end data integration projects, including data extraction, transformation, loading, and validation.
Optimize existing Ab Initio graphs and improve performance through tuning and efficient design.
Collaborate with data architects, business analysts, and QA teams to translate business requirements into technical specifications.
Develop and implement data quality and validation frameworks within ETL processes.
Manage metadata, data lineage, and data mapping documentation to ensure transparency and compliance.
Support data migration and modernization initiatives across on-premises and cloud environments.
Troubleshoot and resolve performance bottlenecks and production issues in Ab Initio jobs.
Integrate Ab Initio with external systems, databases, and cloud data platforms as required.
Follow SDLC and Agile methodologies, ensuring code quality, version control, and deployment best practices.
Required Skills & Qualifications
7–10 years of experience in Ab Initio ETL design and development.
Strong hands-on experience with Ab Initio components such as GDE, EME, Co>Operating System, and Control Center.
Expertise in data warehousing concepts, ETL architecture, and data modeling.
Proficient in SQL, UNIX/Linux scripting, and performance tuning.
Experience integrating with RDBMS (Oracle, Teradata, DB2, or SQL Server) and big data ecosystems (HDFS, Hive, etc.).
Solid understanding of data governance, quality, and validation frameworks.
Experience working with Agile/Scrum methodologies and collaboration tools (JIRA, Confluence).
Strong analytical, problem‑solving, and communication skills.
Good to Have
Experience with Ab Initio Continuous>Flows, Metadata Hub, or Express>It.
Exposure to cloud data platforms on AWS, Azure, or Google Cloud Platform.
Familiarity with DevOps and CI/CD processes for ETL deployment.
Experience in financial, banking, or insurance domains.
Education
Bachelor's or Master's degree in Computer Science, Information Technology, or a related discipline.