Saxon Global
Tech TAG - Corporate Manager @ Saxon | Recruitment & Client Services | Diversity Recruiting | Modern AI Tech Stack
The Senior Data Engineer & Test, based in Phoenix, AZ 85029, will play a pivotal role in delivering major data engineering initiatives within the Data & Advanced Analytics space. This position requires hands-on expertise in building, deploying, and maintaining robust data pipelines using Python, PySpark, and Airflow, as well as designing and implementing CI/CD processes for data engineering projects.
Key Responsibilities
Design, develop, and optimize scalable data pipelines using Python and PySpark for batch and streaming workloads.
Build, schedule, and monitor complex workflows using Airflow, ensuring reliability and maintainability (an illustrative sketch follows this list).
Architect and implement CI/CD pipelines for data engineering projects using GitHub, Docker, and cloud‑native solutions.
Apply test‑driven development (TDD) practices and automate unit/integration tests for data pipelines.
Implement secure coding best practices and design patterns throughout the development lifecycle.
Work closely with Data Architects, QA teams, and business stakeholders to translate requirements into technical solutions.
Create and maintain technical documentation, including process/data flow diagrams and system design artifacts.
Lead and mentor junior engineers, providing guidance on coding, testing, and deployment best practices.
Analyze and resolve technical issues across the data stack, including pipeline failures and performance bottlenecks.
Facilitate cross-team knowledge sharing.
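To make the first two responsibilities concrete, the following is a minimal sketch of an Airflow DAG that schedules a PySpark batch transformation. It is illustrative only: the DAG id, schedule, file paths, and column names are assumptions for this example (Airflow 2.4+ style), not details of the role.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from pyspark.sql import SparkSession, functions as F


def run_batch_transform(**_context):
    # Hypothetical batch step: read raw events, aggregate, and write a daily summary.
    spark = SparkSession.builder.appName("daily_events_summary").getOrCreate()
    events = spark.read.parquet("/data/raw/events")  # assumed input path
    summary = events.groupBy("event_date", "event_type").agg(F.count("*").alias("event_count"))
    summary.write.mode("overwrite").parquet("/data/curated/events_summary")  # assumed output path
    spark.stop()


with DAG(
    dag_id="example_daily_events_summary",  # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(
        task_id="run_batch_transform",
        python_callable=run_batch_transform,
    )

In practice a dedicated Spark submit operator or a cluster job would likely replace the PythonOperator, but the structure above captures the pipeline-plus-orchestration pattern this role calls for.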
General Requirements
10+ years of overall IT experience.
Experience with waterfall, iterative, and agile methodologies.
Technical Requirements
Hands-on data engineering: 5+ years of practical experience building production-grade data pipelines using Python and PySpark.
Airflow expertise: Proven track record of designing, deploying, and managing Airflow DAGs in enterprise environments.
CI/CD for data projects: Ability to build and maintain CI/CD pipelines for data engineering workflows, including automated testing and deployment (an illustrative test sketch follows this list).
Cloud & containers: Experience with containerization (Docker) and cloud platforms (GCP) for data engineering workloads. Appreciation for twelve‑factor design principles.
Python fluency: Ability to write object‑oriented Python code, manage dependencies, and follow industry best practices.
Version control: Proficiency with Git for source code management and collaboration (commits, branching, merging, GitHub/GitLab workflows).
Unix/Linux: Strong command‑line skills in Unix‑like environments.
SQL: Solid understanding of SQL for data ingestion and analysis.
Collaborative development: Comfortable with code reviews, pair programming, and using remote collaboration tools effectively.
Engineering mindset: Writes code with an eye for maintainability and testability; excited to build production‑grade software.
Education: Bachelor's or graduate degree in Computer Science, Data Analytics, or a related field, or equivalent work experience.
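As a hedged illustration of the TDD and automated-testing expectations above (see the CI/CD item), here is a minimal pytest sketch for a PySpark transformation that could run in a CI pipeline. The function under test, its columns, and the sample values are hypothetical examples, not code from this role.

import pytest
from pyspark.sql import SparkSession, functions as F


def add_event_count(events_df):
    # Hypothetical transformation under test: count events per event type.
    return events_df.groupBy("event_type").agg(F.count("*").alias("event_count"))


@pytest.fixture(scope="session")
def spark():
    # Local SparkSession so the unit test can run in CI without a cluster.
    session = SparkSession.builder.master("local[1]").appName("unit-tests").getOrCreate()
    yield session
    session.stop()


def test_add_event_count(spark):
    events = spark.createDataFrame([("click",), ("click",), ("view",)], ["event_type"])
    result = {row["event_type"]: row["event_count"] for row in add_event_count(events).collect()}
    assert result == {"click": 2, "view": 1}

Wiring a test like this into a GitHub-based CI workflow (for example, running pytest in a Docker image on every pull request) is the kind of CI/CD-for-data-projects practice described in these requirements.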
Seniority level: Mid-Senior level
Employment type: Contract
Job function: Information Technology
Industries: Staffing and Recruiting