Mastek
Job Title: Architect / Senior Data Engineer
We are seeking a highly skilled Architect / Senior Data Engineer to design, build, and optimize our modern data ecosystem. The ideal candidate will have deep experience with AWS cloud services, Snowflake, and dbt, along with a strong understanding of scalable data architecture, ETL/ELT development, and data modeling best practices.
Key Responsibilities
Architect, design, and implement scalable, reliable, and secure data solutions using AWS, Snowflake, and dbt.
Develop end-to-end data pipelines (batch and streaming) to support analytics, machine learning, and business intelligence needs.
Lead the modernization and migration of legacy data systems to cloud-native architectures.
Define and enforce data engineering best practices, including coding standards, CI/CD, testing, and monitoring.
Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements and translate them into technical solutions.
Optimize Snowflake performance through query tuning, warehouse sizing, and cost management.
Establish and maintain data governance, security, and compliance standards across the data platform.
Mentor and guide junior data engineers, providing technical leadership and direction.
Required Skills & Qualifications
8+ years of experience in Data Engineering, with at least 3 years in a cloud-native data environment.
Hands‑on expertise in AWS services such as S3, Glue, Lambda, Step Functions, Redshift, and IAM.
Strong experience with Snowflake, including data modeling, warehouse design, performance optimization, and cost governance.
Proven experience with dbt (data build tool), including model development, documentation, and deployment automation.
Proficiency in SQL, Python, and ETL/ELT pipeline development.
Experience with CI/CD pipelines, version control (Git), and workflow orchestration tools (Airflow, Dagster, Prefect, etc.).
Familiarity with data governance and security best practices, including role‑based access control and data masking.
Strong understanding of data modeling techniques (Kimball, Data Vault, etc.) and data architecture principles.
Preferred Qualifications
AWS Certification (e.g., AWS Certified Data Analytics – Specialty, Solutions Architect).
Strong communication and collaboration skills, with a track record of working in agile environments.
Seniority level: Mid‑Senior level
Employment type: Full‑time
Job function: Information Technology and Design
Industries: IT Services and IT Consulting, Software Development