Aifa Solutions

Data Engineer - GCP & Python/PySpark

Aifa Solutions, Los Gatos, California, United States, 95032


Job Title: Data Engineer - GCP & Python/PySpark
Location: Phoenix, AZ / New York (hybrid)
Job Type: 6-12 month contract role

Position: W2

About the Role

We are seeking an experienced Data Engineer with strong hands-on expertise in Google Cloud Platform (GCP) services and Python programming. The ideal candidate will build and optimize scalable data pipelines, manage workflows, and ensure seamless data ingestion, transformation, and processing.

Key Responsibilities

• Design, develop, and maintain data pipelines and workflows using GCP services (Airflow, Pub/Sub, Dataproc, BigQuery).
• Develop Airflow DAGs for orchestrating complex ETL/ELT processes.
• Implement scalable data ingestion, transformation, and storage solutions on GCP.
• Write and optimize SQL queries to extract, transform, and load data efficiently.
• Leverage Python programming to build reusable data processing components and scripts.
• Collaborate with cross-functional teams (data scientists, analysts, engineers) to support data-driven initiatives.
• Ensure data quality, reliability, and security across all pipelines.
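The Airflow-orchestrated ETL/ELT work described above usually decomposes into small, reusable Python callables that a DAG task can wrap. A minimal sketch, assuming newline-delimited JSON input (e.g., messages drained from a Pub/Sub subscription); all function and field names here are illustrative, not from this posting:

```python
import json

def extract(raw_lines):
    """Parse newline-delimited JSON records, skipping blank lines."""
    return [json.loads(line) for line in raw_lines if line.strip()]

def transform(records):
    """Normalize field types and drop rows missing a required key."""
    out = []
    for rec in records:
        if "user_id" not in rec:
            continue  # data-quality rule: required key must be present
        out.append({
            "user_id": str(rec["user_id"]),
            "amount_usd": round(float(rec.get("amount", 0.0)), 2),
        })
    return out

raw = ['{"user_id": 1, "amount": "19.99"}', '{"amount": 5}', ""]
rows = transform(extract(raw))  # only the complete record survives
```

Keeping extract and transform as plain functions (rather than logic embedded in operators) is what makes them unit-testable outside the scheduler.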

Required Skills & Experience

• Strong hands-on experience with GCP services including Airflow, Pub/Sub, Dataproc, and BigQuery.
• Proficiency in Python programming for data engineering tasks.
• Expertise in SQL and relational database concepts.
• Solid understanding of ETL/ELT design patterns and data modeling.
• Experience with workflow orchestration tools and scalable data pipelines.
• Strong problem-solving skills and ability to work in an agile environment.
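The SQL and ELT skills listed above often come down to pushing aggregation into the warehouse instead of pulling raw rows into Python. A small sketch of that push-down pattern using the stdlib sqlite3 module as a stand-in for BigQuery; the table and column names are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [("a", 10.0), ("a", 5.0), ("b", 7.5)],
)

# Aggregate in SQL rather than in application code -- the same
# idea applies to a BigQuery GROUP BY over a large table.
rows = conn.execute(
    "SELECT user_id, SUM(amount) AS total FROM events "
    "GROUP BY user_id ORDER BY total DESC"
).fetchall()
```

Letting the engine do the GROUP BY keeps data movement (and cost, in BigQuery's case) proportional to the result size, not the table size.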

Nice-to-Have

• Experience with Cloud Storage, Dataflow, or other GCP services.
• Familiarity with CI/CD pipelines and containerization (Docker/Kubernetes).
• Knowledge of data governance and security best practices.