R2 Technologies

Python Data Engineer (Airflow, Kafka, Spark) - Q125

R2 Technologies, Alpharetta, Georgia, United States, 30239


R2 Technologies Corporation (R2), headquartered in Alpharetta, GA, is a leading IT services provider specializing in Java, .NET, Big Data, Cloud Computing (AWS, GCP, Azure), Artificial Intelligence (AI), Machine Learning (ML), software development, project management, SAP, and enterprise resource planning (ERP). We empower clients, from startups to Fortune 1000 companies, with scalable, platform-based solutions and data-driven insights using modern cloud technologies. Our commitment to blending highly skilled talent with innovative productivity platforms ensures rapid delivery of business value, making us one of the most respected and trusted technology companies in the United States. At R2, we're passionate about driving operational excellence and competitive advantage for our clients through cutting-edge AI, ML, and cloud solutions. Join our team and help shape the future of technology innovation!

Python Data Engineer (Airflow, Kafka, Spark)

Location:

Alpharetta, GA (must be willing to travel to client locations)

Employment Type:

Full-Time (W2)

Role Overview

We are seeking a proficient Python Data Engineer to build and manage data pipelines using Airflow, Kafka, and Spark. This role focuses on developing scalable ETL/ELT workflows for streaming and batch data processing.
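As a rough illustration of the ETL/ELT work this role involves, here is a minimal, dependency-free Python sketch of the extract/transform/load steps a pipeline task might perform. All names and the record schema are illustrative; in production these steps would typically run as Airflow tasks consuming from Kafka topics and processing data with Spark, not as plain functions over in-memory lists.

```python
# Illustrative ETL sketch: extract raw records, transform/aggregate, load to a sink.
# Function names and the record schema are hypothetical, for demonstration only.

def extract():
    """Simulate pulling raw event records from a source (e.g. a Kafka topic)."""
    return [
        {"user_id": 1, "amount": "19.99", "currency": "USD"},
        {"user_id": 2, "amount": "5.00", "currency": "USD"},
        {"user_id": 1, "amount": "3.50", "currency": "USD"},
    ]

def transform(records):
    """Cast string amounts to floats and aggregate total spend per user."""
    totals = {}
    for rec in records:
        totals[rec["user_id"]] = totals.get(rec["user_id"], 0.0) + float(rec["amount"])
    return totals

def load(totals, sink):
    """Write aggregated results to a destination (here, an in-memory dict)."""
    sink.update(totals)
    return sink

warehouse = {}
load(transform(extract()), warehouse)
```

In a real pipeline each function would be a separately scheduled, retryable unit of work (an Airflow task or a Spark job), so failures can be isolated and re-run without repeating the whole workflow.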

Key Responsibilities

Design and implement data pipelines using Python with Airflow for orchestration and scheduling.
Integrate Kafka for streaming data and Spark for processing large-scale datasets.
Develop ETL/ELT workflows to transform and load data into storage or analytics systems.
Optimize pipelines for performance, reliability, and scalability in distributed environments.
Collaborate with data teams to ensure data quality and availability for downstream applications.
Monitor and troubleshoot data workflows to maintain seamless operations.

Required Qualifications

Bachelor's degree in Computer Science, Software Engineering, or a related field (or equivalent experience).
3 years of experience as a Data Engineer with Python, focusing on Airflow, Kafka, and Spark.
Proficiency in building ETL/ELT pipelines for streaming and batch data processing.
Experience with Kafka for real-time data streaming and Spark for distributed data processing.
Strong understanding of data engineering principles and pipeline automation.

Preferred Qualifications

Familiarity with alternative orchestration tools like Luigi for data workflow management.
Exposure to cloud platforms like AWS or GCP for hosting data pipelines.
Knowledge of monitoring tools like Grafana for pipeline observability.

Compensation & Benefits

Competitive salary and comprehensive benefits package (healthcare, PTO, 401k).
Opportunities for professional growth and upskilling in AI and cloud technologies.

R2 Technologies Corporation is an equal opportunity employer and values diversity in the workplace.

Skills:

Python, Airflow, Kafka, Spark, Data Engineer, ETL, ELT, Streaming