eTeam
Must Have Skills:
• GCP BigQuery
• Python
• SQL
Nice to Have Skills:
Detailed Job Description:
Job Summary: We are seeking an experienced Senior Developer with strong expertise in Google Cloud Platform (GCP), BigQuery, and Apache Airflow to lead the development of scalable data pipelines and analytics solutions. The ideal candidate will have a deep understanding of cloud-native data engineering practices and a proven track record of building robust, automated workflows.
Key Responsibilities:
• Design, develop, and maintain scalable data pipelines using Apache Airflow on GCP (see the sketch after this list).
• Build and optimize BigQuery data models and SQL queries for analytics and reporting.
• Integrate data from various sources using GCP services such as Cloud Storage, Pub/Sub, and Dataflow.
• Implement data quality checks, monitoring, and alerting mechanisms.
• Collaborate with data scientists, analysts, and business teams to understand requirements and deliver solutions.
• Ensure performance, reliability, and cost-efficiency of data workflows.
• Document technical designs, workflows, and operational procedures.
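For illustration, a minimal sketch of the kind of pipeline this role covers: a daily Airflow DAG that loads newline-delimited JSON files from Cloud Storage into BigQuery. It assumes Airflow 2.4+ with the apache-airflow-providers-google package installed; the DAG id, bucket, object path, and table name are placeholders, not details from this posting.

    # Hedged sketch only: one daily load task from GCS into BigQuery.
    import pendulum
    from airflow import DAG
    from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
        GCSToBigQueryOperator,
    )

    with DAG(
        dag_id="daily_gcs_to_bigquery",  # hypothetical pipeline name
        schedule="@daily",
        start_date=pendulum.datetime(2024, 1, 1, tz="UTC"),
        catchup=False,
    ) as dag:
        # Append the run date's extract ({{ ds }} is the logical date)
        # into an analytics table.
        load_events = GCSToBigQueryOperator(
            task_id="load_events",
            bucket="example-landing-bucket",            # placeholder bucket
            source_objects=["events/{{ ds }}/*.json"],  # placeholder path
            destination_project_dataset_table="example_project.analytics.events",
            source_format="NEWLINE_DELIMITED_JSON",
            write_disposition="WRITE_APPEND",
        )

A production DAG for this role would extend the same pattern with the data-quality checks, monitoring, and alerting tasks named in the responsibilities above.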
Required Skills:
• 8+ years of experience in data engineering or backend development.
• Strong proficiency in Python, especially for scripting and automation.
• Hands-on experience with Apache Airflow for orchestrating data workflows.
• Expertise in BigQuery: schema design, query optimization, partitioning, and clustering (see the example after this list).
• Familiarity with GCP services: Cloud Functions, Cloud Storage, Pub/Sub, Dataflow, etc.
• Experience with CI/CD, Git, and containerization tools like Docker.
• Strong understanding of data modeling, ETL/ELT processes, and data governance.
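As a concrete example of the BigQuery partitioning and clustering expertise listed above, here is a hedged sketch using the google-cloud-bigquery Python client; the project, dataset, table, and columns are illustrative assumptions, not from the posting.

    # Hedged sketch only: create a date-partitioned, clustered table.
    from google.cloud import bigquery

    client = bigquery.Client()  # assumes default GCP credentials are configured

    # Partitioning by event date prunes scans to the days a query touches;
    # clustering by user_id and event_name further narrows the blocks read.
    ddl = """
    CREATE TABLE IF NOT EXISTS `example_project.analytics.events` (
      event_ts   TIMESTAMP,
      user_id    STRING,
      event_name STRING,
      payload    JSON
    )
    PARTITION BY DATE(event_ts)
    CLUSTER BY user_id, event_name
    """
    client.query(ddl).result()  # block until the DDL job completes

Queries that filter on the partition date and clustered columns then scan only the matching data, which is the cost-efficiency concern named in the responsibilities above.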
Minimum Years of Experience: 8+ years
Top 3 responsibilities you would expect the Subcon to shoulder and execute:
• Accountability
• Responsibility
• Time management