Exadel open positions
We’re an AI-first global tech company with 25+ years of engineering leadership, 2,000+ team members, and 500+ active projects powering Fortune 500 clients, including HBO, Microsoft, Google, and Starbucks.
From AI platforms to digital transformation, we partner with enterprise leaders to build what’s next.
What powers it all? Our people: ambitious, collaborative, and constantly evolving.
What You’ll Do
Build and maintain data pipelines using Apache Beam and Dataflow
Develop batch and streaming ingestion patterns that support analytics and ML workloads
Design effective BigQuery tables with thoughtful partitioning, clustering, and lifecycle strategies
Write structured SQL and Python for transformations, validations, and automation tasks
Implement data quality checks, schema evolution processes, and observability for data assets
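For candidates curious what a data-quality check of this kind might look like in practice, here is a minimal sketch of a record validator in Python. The field names and types are illustrative assumptions only, not taken from any actual Exadel project:

```python
# Minimal record-level data-quality check (illustrative sketch).
# EXPECTED_FIELDS is a hypothetical schema for the sake of the example.
EXPECTED_FIELDS = {"event_id": str, "user_id": str, "amount": float}

def validate_record(record: dict) -> list[str]:
    """Return a list of data-quality violations found in one record."""
    errors = []
    for field, expected_type in EXPECTED_FIELDS.items():
        if field not in record:
            errors.append(f"missing field: {field}")
        elif not isinstance(record[field], expected_type):
            errors.append(
                f"bad type for {field}: {type(record[field]).__name__}"
            )
    return errors
```

In a real pipeline, checks like this would typically run inside a Beam transform or a dbt test rather than as a standalone function.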
Transformation and Modeling
Use dbt or Dataform to manage transformations, testing, and model lineage
Design dimensional and event-based models that serve analytics and operational needs
Guide consistent modeling patterns across teams
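As a toy illustration of the dimensional modeling mentioned above, a star schema pairs a fact table with the dimensions it references. The sketch below expresses that shape with Python dataclasses; all table and column names are hypothetical:

```python
from dataclasses import dataclass
from datetime import date

# Toy star-schema sketch: one fact table keyed to two dimension tables.
# Names below are illustrative assumptions, not a prescribed model.

@dataclass(frozen=True)
class DimCustomer:
    customer_key: int
    name: str
    segment: str

@dataclass(frozen=True)
class DimDate:
    date_key: int          # e.g. 20240101
    calendar_date: date

@dataclass(frozen=True)
class FactOrder:
    order_id: str
    customer_key: int      # foreign key into DimCustomer
    date_key: int          # foreign key into DimDate
    amount: float
```

In practice these would be warehouse tables managed through dbt or Dataform models rather than application classes; the point is the fact-to-dimension key structure.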
Architecture Collaboration
Contribute to design discussions and ensure solutions align with established standards
Work with data architects to plan patterns that scale with long-term growth
Bring practical insight into cost-conscious design within BigQuery and Dataflow
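To make the cost angle concrete: BigQuery's on-demand pricing bills by bytes scanned, so pruning a date-partitioned table to only the partitions a query needs directly reduces cost. A back-of-the-envelope sketch, with all figures invented for illustration:

```python
# Illustrative estimate of how date partitioning limits bytes scanned.
# All numbers are assumptions for the sake of the example.
TABLE_SIZE_GB = 3650.0   # hypothetical table: ~10 GB/day over a year
DAYS_RETAINED = 365

def scanned_gb(days_queried: int) -> float:
    """GB scanned when a query prunes to `days_queried` daily partitions."""
    per_day_gb = TABLE_SIZE_GB / DAYS_RETAINED
    return per_day_gb * days_queried

full_scan = scanned_gb(DAYS_RETAINED)  # unpruned scan of the whole table
last_week = scanned_gb(7)              # pruned to the last 7 partitions
```

Under these assumed numbers, a query pruned to one week scans roughly 2% of what a full-table scan would, which is the kind of reduction thoughtful partitioning is meant to deliver.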
Production Ownership
Operate production pipelines through monitoring, incident analysis, and continuous improvement
Tune performance across BigQuery and Dataflow using GCP-native monitoring and logging tools
Maintain CI/CD processes for data workflows
Cross-Functional Work
Partner with analysts, data scientists, and product teams to translate requirements into technical solutions
Mentor junior engineers and support effective code reviews
Communicate design choices with clarity across teams
What You Bring
Technical Expertise
Strong experience with Dataflow, Apache Beam, BigQuery, Cloud Storage, and Cloud Composer
Proficiency in Python for data processing and automation
Strong SQL skills with an understanding of tuning and cost management in BigQuery
Hands‑on experience with dbt or Dataform for transformations, testing, and documentation
Understanding of data modeling principles used in modern analytics environments
Experience with CI/CD for data pipelines, versioning, and automated testing
Experience applying structured data quality frameworks
Strong approach to observability and proactive monitoring
Professional Skills
Clear communication with technical and non-technical partners
Curiosity and initiative that drive continuous improvement
Capacity to work independently while contributing to a collaborative team
Nice to Have
Practical experience with Dataproc or Spark, e.g. from legacy environments
Familiarity with Vertex AI pipelines or ML‑oriented workflows
Experience with event streaming patterns such as Pub/Sub or Kafka
Knowledge of modern orchestration trends beyond Airflow
Legal & Hiring Information
Exadel is proud to be an Equal Opportunity Employer committed to inclusion across minority status, gender identity, sexual orientation, disability, age, and more
Reasonable accommodations are available to enable individuals with disabilities to perform essential functions
Please note: this job description is not exhaustive. Duties and responsibilities may evolve based on business needs
Your Benefits at Exadel
Exadel benefits vary by location and contract type. Your recruiter will fill you in on the details.
International projects
In‑office, hybrid, or remote flexibility
Medical coverage
Recognition program
Ongoing learning & reimbursement
Team events & local benefits
Sports compensation
We lead with trust, respect, and purpose. We believe in open dialogue, creative freedom, and mentorship that helps you grow, lead, and make a real difference. Ours is a culture where ideas are challenged, voices are heard, and your impact matters.