Atlantic Partners
Job Description
Our client is seeking Senior Data Engineers with deep expertise in building scalable, cloud-native data platforms on AWS. This is a hands-on engineering role focused on designing and implementing modern lakehouse architectures using AWS managed services, open table formats (Iceberg), and compute running in the client's EKS/Argo Workflows environments.
Team Culture / Work Environment
4-5 data teams
All teams run through SAFe sprints/deliverables
Highly collaborative
Fast-paced
Most of the team is hybrid
Culture of ownership and initiative
Key Projects
Migration of data systems to the cloud
Daily Responsibilities
Advanced Python Engineering Skills
Strong proficiency in Python for data engineering tasks
Experience with modular, testable code and production-grade pipelines (see the sketch below)
Not a SQL-heavy DBA or analyst role; this is a software engineering role
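For illustration of the modular, testable style this calls for, a minimal sketch in plain Python (the Order model and validation rules are hypothetical, not the client's actual domain):

from dataclasses import dataclass
from typing import Iterable

@dataclass(frozen=True)
class Order:
    order_id: str
    amount_cents: int
    currency: str

def filter_valid_orders(orders: Iterable[Order]) -> list[Order]:
    # Pure transform: keep orders with a positive amount and a known currency.
    allowed = {"USD", "CAD", "EUR"}
    return [o for o in orders if o.amount_cents > 0 and o.currency in allowed]

def test_filter_valid_orders():
    # The transform can be unit-tested with in-memory data, no cluster required.
    orders = [
        Order("a1", 1200, "USD"),
        Order("a2", -50, "USD"),   # negative amount is dropped
        Order("a3", 900, "XYZ"),   # unknown currency is dropped
    ]
    assert filter_valid_orders(orders) == [Order("a1", 1200, "USD")]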
AWS Lakehouse Architecture Expertise
Proven experience designing and implementing lakehouse architectures on AWS
Familiarity with key AWS services: S3, Glue, Athena, Glue Data Catalog, Lake Formation, QuickSight, CloudWatch, etc. (a brief Athena sketch follows this list)
Experience with QuickSight (preferred), Tableau, or Cognos
ETL pipeline development
Bonus: experience with EKS-based orchestration using EMR on EKS or Argo Workflows
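As a small illustration of how Athena and the Glue Data Catalog typically come together in this kind of stack, a hedged boto3 sketch (the region, database, table, and results bucket are placeholders):

import time
import boto3

athena = boto3.client("athena", region_name="us-east-1")

# Run a SQL query against a Glue-cataloged table; all names here are illustrative.
response = athena.start_query_execution(
    QueryString="SELECT order_date, SUM(amount) AS revenue FROM orders GROUP BY order_date",
    QueryExecutionContext={"Database": "analytics"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
query_id = response["QueryExecutionId"]

# Poll until the query finishes, then read the first page of results.
while True:
    state = athena.get_query_execution(QueryExecutionId=query_id)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(2)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]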
Open Table Formats
Deep understanding of Apache Iceberg (preferred), Delta Lake, or Apache Hudi
Experience implementing time travel, schema evolution, and partitioning strategies (a short time-travel sketch follows)
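For context, a minimal PySpark sketch of Iceberg time travel, assuming a Spark runtime with the Iceberg runtime jar and a Glue-backed catalog already configured (catalog, table, timestamp, and snapshot values are placeholders):

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("iceberg-time-travel").getOrCreate()

# Current state of the table
current = spark.table("glue_catalog.analytics.orders")

# Time travel to an earlier snapshot by timestamp (milliseconds since epoch)
as_of = (
    spark.read.option("as-of-timestamp", "1704067200000")
    .format("iceberg")
    .load("glue_catalog.analytics.orders")
)

# ...or pin a specific snapshot id taken from the table's history metadata
pinned = (
    spark.read.option("snapshot-id", "4937907104247089891")
    .format("iceberg")
    .load("glue_catalog.analytics.orders")
)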
Medallion Architecture Implementation
Experience designing and implementing Bronze → Silver → Gold data layers (sketched below)
Understanding of ingestion, transformation, and curation best practices
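A compressed sketch of how a Bronze → Silver → Gold flow might look in PySpark over Iceberg tables; the bucket, schemas, and table names are assumptions, not the client's actual layout:

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion-sketch").getOrCreate()

# Bronze: land raw source data with minimal change, stamping ingestion time.
bronze = (
    spark.read.json("s3://example-raw-bucket/orders/")
    .withColumn("ingested_at", F.current_timestamp())
)
bronze.writeTo("glue_catalog.bronze.orders").createOrReplace()

# Silver: deduplicate, conform types, and filter out invalid records.
silver = (
    spark.table("glue_catalog.bronze.orders")
    .dropDuplicates(["order_id"])
    .withColumn("order_date", F.to_date("order_ts"))
    .filter(F.col("amount") > 0)
)
silver.writeTo("glue_catalog.silver.orders").createOrReplace()

# Gold: business-level aggregates ready for consumption (e.g. QuickSight).
gold = (
    spark.table("glue_catalog.silver.orders")
    .groupBy("order_date")
    .agg(F.sum("amount").alias("daily_revenue"))
)
gold.writeTo("glue_catalog.gold.daily_revenue").createOrReplace()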
Slowly Changing Dimensions (SCD Type 2)
Solid grasp of SCD2 semantics and implementation strategies in distributed data systems
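One common SCD2 pattern on an Iceberg table, sketched as two Spark SQL statements (assumes the Iceberg Spark SQL extensions are enabled; table, column, and staging names are illustrative): first close out the current row for any changed key, then insert the new versions.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("scd2-sketch").getOrCreate()

# Step 1: expire the currently-active row for any customer whose attributes changed.
spark.sql("""
    MERGE INTO glue_catalog.silver.dim_customer AS target
    USING staging.customer_updates AS source
    ON target.customer_id = source.customer_id AND target.is_current = true
    WHEN MATCHED AND target.attr_hash <> source.attr_hash THEN
      UPDATE SET is_current = false, valid_to = source.effective_date
""")

# Step 2: insert a new current row for each changed or brand-new customer.
spark.sql("""
    INSERT INTO glue_catalog.silver.dim_customer
    SELECT s.customer_id, s.name, s.segment, s.attr_hash,
           s.effective_date AS valid_from,
           CAST(NULL AS DATE) AS valid_to,
           true AS is_current
    FROM staging.customer_updates s
    LEFT JOIN glue_catalog.silver.dim_customer t
      ON t.customer_id = s.customer_id AND t.is_current = true
    WHERE t.customer_id IS NULL OR t.attr_hash <> s.attr_hash
""")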
Strong communication and documentation skills
Ability to work independently and collaborate with cross-functional teams, including tech leads, architects, and product managers
Degree or Certification Required
None
Years of Experience
4-5 years in data engineering / AWS
Nice to Haves
Experience with DataOps practices and CI/CD for data pipelines
Familiarity with Terraform or CloudFormation for infrastructure-as-code
Exposure to data quality frameworks like Deequ or Great Expectations (a minimal hand-rolled example follows)
Undergraduate degree
Iceberg on AWS
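The kinds of checks frameworks like Deequ or Great Expectations formalize can be illustrated with a small hand-rolled PySpark stand-in (table and column names are assumptions):

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-sketch").getOrCreate()
orders = spark.table("glue_catalog.silver.orders")

# Two simple expectations: the primary key is never null, amounts are non-negative.
null_keys = orders.filter(F.col("order_id").isNull()).count()
negative_amounts = orders.filter(F.col("amount") < 0).count()

if null_keys or negative_amounts:
    raise ValueError(
        f"Data quality check failed: {null_keys} null keys, "
        f"{negative_amounts} negative amounts"
    )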