Data Engineer – Ab Initio to GCP Migration
Blue Fish Technologies Inc, Dallas, Texas, United States
We are seeking a highly skilled Data Engineer to support a large-scale Ab Initio to Google Cloud Platform (GCP) migration initiative. The ideal candidate will have strong hands-on experience in ETL development, data pipeline design, and cloud migration, with deep expertise in Ab Initio and modern GCP data services. This role focuses on re-engineering legacy Ab Initio processes, designing scalable cloud-native solutions, and ensuring a smooth migration with minimal disruption.

Key Responsibilities:
- Collaborate with architects and business stakeholders to gather requirements for the Ab Initio to GCP migration.
- Re-engineer and optimize ETL pipelines built in Ab Initio to leverage GCP-native tools such as BigQuery, Dataflow, Pub/Sub, Cloud Composer, and Dataproc.
- Develop, test, and deploy data ingestion, transformation, and integration workflows on GCP.
- Ensure data quality, governance, lineage, and compliance throughout the migration process.
- Optimize the performance and scalability of cloud-based data pipelines.
- Partner with DevOps teams to implement CI/CD pipelines and automation for data workflows.
- Provide technical expertise in Ab Initio to GCP mapping, workload analysis, and re-platforming strategies.
- Support troubleshooting, validation, and performance tuning of migrated workloads.
- Document migration steps, architecture, and best practices for ongoing knowledge sharing.

Required Skills & Experience:
- 5–8+ years of experience in data engineering, with expertise in ETL and data pipeline development.
- Strong hands-on expertise with Ab Initio (plans, graphs, PDL, EME, etc.).
- Proven track record of migrating data pipelines from Ab Initio to GCP, or of similar modernization initiatives.
- Proficiency with GCP services: BigQuery, Dataflow, Pub/Sub, Dataproc, and Cloud Composer (Airflow).
- Strong programming skills in SQL, Python, and shell scripting.
- Experience in data modeling, performance tuning, and large-scale data processing.
- Good knowledge of data governance, security, and compliance in cloud environments.
- Familiarity with CI/CD, Git, and DevOps practices for data engineering.

Nice to Have:
- Experience with other ETL migration tools/accelerators.
- Knowledge of healthcare or retail domain data systems.
- Exposure to Agile methodologies and collaborative delivery models.