Energy Jobline ZR

Senior AWS Data Engineer in Dallas

Energy Jobline ZR, Dallas, Texas, United States, 75215


Energy Jobline is the largest and fastest growing global Energy Job Board and Energy Hub. We have an audience reach of over 7 million energy professionals, 400,000+ monthly advertised global energy and engineering jobs, and work with the leading energy companies worldwide.

We focus on the Oil & Gas, Renewables, Engineering, Power, and Nuclear markets as well as emerging technologies in EV, Battery, and Fusion. We are committed to ensuring that we offer the most exciting career opportunities from around the world for our jobseekers.

Benefits

Hybrid

Competitive salary

Opportunity for advancement

Job Title:

AWS Data Engineer

Location:

Dallas, TX (Hybrid, 3 days onsite)

Experience:

8–12 years

Interview Process:

In-Person

Profiles:

Non- Can Apply

Overview

We are looking for an experienced AWS Data Engineer with strong expertise in ETL, cloud migration, and large-scale data engineering. The ideal candidate is hands‑on with AWS, Python/PySpark, and SQL, and can design, optimize, and manage complex data pipelines. This role requires collaboration across teams to deliver secure, scalable, and high‑quality data solutions that drive business intelligence and operational efficiency.

Key Responsibilities

Design, build, and maintain scalable ETL pipelines across AWS and SQL‑based technologies.

Assemble large, complex datasets that meet business and technical requirements.

Implement process improvements by re‑architecting infrastructure, optimizing data delivery, and automating workflows.

Ensure data quality and integrity across multiple sources and targets.

Orchestrate workflows with Apache Airflow (MWAA) and support large‑scale cloud migration projects.

Conduct ETL testing, apply test‑driven development (TDD), and participate in code reviews.

Monitor, troubleshoot, and optimize pipelines for performance, reliability, and security.

Collaborate with cross‑functional teams and participate in Agile ceremonies (sprints, reviews, stand‑ups).

Requirements

8–12 years of experience in Data Engineering, with deep focus on ETL, cloud pipelines, and Python development.

5+ years of hands‑on coding with Python (primary), PySpark, and SQL.

Proven experience with AWS services: Glue, EMR (Spark), S3, Lambda, ECS/EKS, MWAA (Airflow), IAM.

Experience with AuroraDB, DynamoDB, Redshift, and AWS Data Lakes.

Strong knowledge of data modeling, database design, and advanced ETL processes (including Alteryx).

Proficiency with structured and semi‑structured file types (Delimited Text, Fixed Width, XML, JSON, Parquet).

Experience with Azure Service Bus or equivalent AWS streaming/messaging tools (SNS, SQS, Kinesis, Kafka).

CI/CD expertise with GitLab or similar, plus hands‑on Infrastructure‑as‑Code (Terraform, Python, Jinja, YAML).

Familiarity with unit testing, code quality tools, containerization, and security best practices.

Solid Agile development background, with experience in Agile ceremonies and practices.

Flexible work from home options available.

If you are interested in applying for this job please press the Apply Button and follow the application process. Energy Jobline wishes you the very best of luck in your next career move.
