ComResource
Overview
ComResource is seeking a Data Engineer (Platform) to help with a multi-year project to migrate enterprise-level data from DB2 and legacy systems to a cloud environment. This individual is expected to contribute directly to building the migration pipelines into AWS and other related cloud platforms while safeguarding data integrity and database stability.
Key Responsibilities
Design and implement data migration pipelines and roadmap from DB2 to the cloud.
Build and optimize ETL pipelines and use Python to automate data processing and reconciliation.
Define and execute migration phases, checkpoints, and quality controls.
Perform data validation and reconciliation to ensure accuracy and compliance (see the illustrative sketch after this list).
Collaborate with DBAs, architects, application teams, and business stakeholders across multiple lines of business.
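For illustration only, the sketch below shows one way the reconciliation duty described above could look in Python: it compares row counts and a simple numeric checksum between a source table and its migrated copy. The helper names, table and column names, and the sqlite3 stand-in connections are hypothetical assumptions; in practice the connections would come from a DB2 driver and the target cloud platform's database client.

```python
# Illustrative sketch only: minimal source-vs-target reconciliation.
# Table names, column names, and the sqlite3 stand-in connections below are
# hypothetical; real runs would use a DB2 driver on the source side and the
# cloud platform's database client on the target side (both DB-API compatible).
import sqlite3


def fetch_metrics(conn, table, numeric_col):
    """Return (row_count, column_sum) for one table via a single aggregate query."""
    cur = conn.cursor()
    cur.execute(f"SELECT COUNT(*), COALESCE(SUM({numeric_col}), 0) FROM {table}")
    row_count, col_sum = cur.fetchone()
    cur.close()
    return int(row_count), float(col_sum)


def reconcile(source_conn, target_conn, table, numeric_col):
    """Compare row count and a numeric checksum between source and migrated copies."""
    src = fetch_metrics(source_conn, table, numeric_col)
    tgt = fetch_metrics(target_conn, table, numeric_col)
    if src != tgt:
        print(f"{table}: MISMATCH source={src} target={tgt}")
        return False
    print(f"{table}: OK rows={src[0]} checksum={src[1]}")
    return True


if __name__ == "__main__":
    # Tiny self-contained demo: two in-memory SQLite databases stand in for
    # the DB2 source and the cloud target so the sketch runs as-is.
    source, target = sqlite3.connect(":memory:"), sqlite3.connect(":memory:")
    for db in (source, target):
        db.execute("CREATE TABLE orders (amount REAL)")
        db.executemany("INSERT INTO orders VALUES (?)", [(10.0,), (25.5,)])
    reconcile(source, target, "orders", "amount")
```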
Qualifications
Proven experience with large-scale database migrations spanning numerous business divisions, customers, and/or systems.
Expertise in ETL processes, SQL, and Python for automation.
Familiarity with cloud data services (AWS, Azure, or GCP).
Knowledge of data quality, reconciliation, and compliance best practices.
Technical Skills - Must Have
Previous role as Data Engineer, Data Platform Engineer, Database Developer, or related role
SQL expertise
Experience with relational databases
Experience in an enterprise environment
Nice To Have
Experience supporting Java-based application systems
Knowledge of data modelling, data science, and advanced ETL automation
Seniority level
Associate
Employment type
Full-time
Job function
Information Technology
Industries
IT Services and IT Consulting