Xebia

AWS Data Engineer Xebia

Xebia, Poland


Xebia is a trusted advisor in the modern era of digital transformation, serving hundreds of leading brands worldwide with end-to-end IT solutions. The company has experts specializing in technology consulting, software engineering, AI, digital products and platforms, data, cloud, intelligent automation, agile transformation, and industry digitization. In addition to providing high-quality digital consulting and state-of-the-art software development, Xebia offers a host of standardized solutions that substantially reduce time-to-market for businesses, as well as a diverse portfolio of training courses to help forward-thinking organizations upskill their workforce and capitalize on the latest digital capabilities. The company has a strong presence across 16 countries, with development centres in the US, Latin America, Western Europe, Poland, the Nordics, the Middle East, and Asia Pacific.

We are seeking an experienced AWS Data Engineer to design and implement scalable data pipelines and solutions using AWS cloud technologies. You will be responsible for building robust data infrastructure to support analytics, reporting, and machine learning workloads.

Key Responsibilities

- Design and develop ETL/ELT pipelines using AWS-native tools
- Build scalable data processing systems using AWS Glue, Lambda, Step Functions, and Athena
- Work with S3, Redshift, and RDS for data storage, warehousing, and transformation
- Develop and maintain Python and SQL scripts for data manipulation and automation
- Handle real-time and batch data ingestion using services like Kinesis or Kafka
- Collaborate with data scientists, analysts, and engineering teams to support business use cases
- Monitor data workflows, ensure data quality, and troubleshoot pipeline issues

Required Skills

- Strong hands-on experience with AWS data services: Glue, S3, Redshift, Lambda, Athena
- Proficiency in Python and advanced SQL
- Experience with workflow orchestration tools like Airflow or Step Functions
- Familiarity with data lake and data warehouse architectures on AWS
- Good understanding of data security, IAM roles, and cloud best practices
- Strong problem-solving and collaboration skills

Nice to Have

- Experience with Kinesis, Kafka, or other streaming platforms
- Exposure to Terraform/CloudFormation for infrastructure as code
- Familiarity with Spark or EMR for big data processing
- Experience with data cataloging, governance, or ML workflows
