Atlantic Partners
Overview
We are seeking Senior Data Engineers with deep expertise in building scalable, cloud-native data platforms on AWS. This is a hands-on engineering role focused on designing and implementing modern lakehouse architectures using AWS managed services, open table formats (Iceberg), and compute running in our EKS/Argo Workflows environments.
Responsibilities
Advanced Python Engineering Skills
Strong proficiency in Python for data engineering tasks.
Experience with modular, testable code and production-grade pipelines.
Not looking for SQL-heavy DBAs or analysts; this is a software engineering role.
AWS Lakehouse Architecture Expertise
Proven experience designing and implementing lakehouse architectures on AWS.
Familiarity with key AWS services: S3, Glue, Athena, Glue Data Catalog, Lake Formation, QuickSight, CloudWatch, etc.
Experience with AWS QuickSight (preferred), Tableau, or Cognos.
ETL Pipeline Development
Bonus: Experience with EKS-based orchestration using EMR on EKS or Argo Workflows.
Open Table Formats
Deep understanding of Apache Iceberg (preferred), Delta Lake, or Apache Hudi.
Experience implementing time-travel, schema evolution, and partitioning strategies.
Medallion Architecture Implementation
Experience designing and implementing Bronze → Silver → Gold data layers.
Understanding of ingestion, transformation, and curation best practices.
Slowly Changing Dimensions (SCD Type 2)
Solid grasp of SCD2 semantics and implementation strategies in distributed data systems.
Soft Skills
Ability to work independently and collaborate with cross-functional teams including tech leads, architects, and product managers.
Years of Experience: 4-5 years in data engineering / AWS.
Nice to Haves
Experience with DataOps practices and CI/CD for data pipelines.
Familiarity with Terraform or CloudFormation for infrastructure-as-code.
Exposure to data quality frameworks like Deequ or Great Expectations.
Undergraduate degree.
Must be able to take initiative rather than wait for permission.
Candidates with no Python experience will not be considered.
Candidates with no cloud/AWS experience will not be considered.
Candidates with no experience building pipelines will not be considered.
Not looking for administration-focused experience; candidates should be modern, software-oriented engineers.