Damco Solutions
Key Responsibilities:
- Design, develop, and maintain ETL pipelines using Ab Initio (GDE, Co>Op, EME, Continuous Flows).
- Work with large data sets to extract, transform, and load data into target data stores such as data lakes or data warehouses.
- Perform data analysis, profiling, and cleansing as required (see the illustrative sketch after this list).
- Collaborate with data architects, analysts, and business stakeholders to gather and refine requirements.
- Optimize the performance of Ab Initio graphs and flows to ensure scalability and reliability.
- Create reusable and modular ETL components and frameworks.
- Conduct code reviews and enforce best practices in ETL development.
- Troubleshoot and resolve production issues related to Ab Initio jobs.
- Maintain and manage metadata using EME.
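The profiling and cleansing work typically begins with simple checks on staging data before it is loaded downstream. The query below is a minimal sketch of such a check; the table and column names (stg_customer, customer_id, email) are hypothetical and used only for illustration.

    -- Minimal profiling sketch: row count, distinct keys, null and duplicate rates.
    -- Table and column names are hypothetical.
    SELECT
        COUNT(*)                               AS total_rows,
        COUNT(DISTINCT customer_id)            AS distinct_keys,
        COUNT(*) - COUNT(email)                AS null_emails,
        COUNT(*) - COUNT(DISTINCT customer_id) AS duplicate_keys
    FROM stg_customer;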
Required Skills:
- 5+ years of Ab Initio development experience.
- Strong understanding of ETL concepts, data warehousing, and data modeling.
- Hands-on experience with Ab Initio GDE, Co>Operating System, EME, Conduct>It, Continuous Flows, Express>It, and Metadata Hub.
- Proficiency in SQL, Unix/Linux shell scripting, and performance tuning (see the example query after this list).
- Familiarity with job schedulers such as Control-M.
- Experience working with RDBMS (e.g., Oracle, Teradata, DB2, PostgreSQL).
- Strong problem-solving and debugging skills.
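As an illustration of the SQL proficiency expected, a common cleansing pattern is de-duplicating a staging table by keeping the most recent record per business key. The sketch below assumes the same hypothetical stg_customer table with an updated_ts timestamp column; ROW_NUMBER() is standard SQL supported by Oracle, Teradata, DB2, and PostgreSQL.

    -- De-duplication sketch: keep the latest row per customer_id.
    -- stg_customer and its columns are hypothetical, for illustration only.
    SELECT customer_id, email, updated_ts
    FROM (
        SELECT
            customer_id,
            email,
            updated_ts,
            ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY updated_ts DESC) AS rn
        FROM stg_customer
    ) t
    WHERE rn = 1;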