Compunnel

Python + PySpark - Junior Developer

Compunnel, New York, New York, US 10261


The Python + PySpark + AWS Consultant will be responsible for designing, developing, and optimizing data engineering solutions using Python and PySpark. The role requires hands-on experience in building scalable ETL pipelines, working with cloud platforms such as AWS or Azure, and ensuring data integrity and performance optimization. The candidate should have strong SQL skills and a deep understanding of data engineering principles.

Key Responsibilities

- Develop and maintain data processing workflows using Python and PySpark
- Design and implement ETL pipelines for structured and unstructured data
- Optimize Spark-based data processing for efficiency and scalability
- Deploy and manage data solutions on AWS or Azure
- Write and optimize SQL queries for data transformation and analysis
- Troubleshoot and resolve performance issues in data pipelines
- Work closely with cross-functional teams to ensure data reliability and integrity

Required Qualifications

- 5+ years of experience in data engineering
- Strong proficiency in Python and object-oriented programming
- Hands-on experience with PySpark for large-scale data processing
- Proficiency in SQL for data manipulation and query performance tuning
- Experience with AWS or Azure for cloud-based data solutions
- Knowledge of ETL processes and data pipeline automation
- Experience with Hadoop is also acceptable

Preferred Qualifications

- Experience in optimizing Spark jobs for performance and cost efficiency
- Familiarity with DevOps practices for data engineering
- Understanding of data governance, security, and compliance best practices
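For context, the following is a minimal sketch of the kind of PySpark ETL workflow the Key Responsibilities above describe (extract, transform, load). All paths, column names, and table names here are illustrative assumptions, not part of the role description.

# Minimal, hypothetical PySpark ETL sketch: read raw data, apply a
# transformation, and write a curated output. Paths and columns are assumed.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl-sketch").getOrCreate()

# Extract: read raw order data (hypothetical S3 location and schema)
orders = spark.read.parquet("s3://example-bucket/raw/orders/")

# Transform: basic cleansing plus a daily revenue aggregate
daily_revenue = (
    orders
    .filter(F.col("status") == "completed")
    .withColumn("order_date", F.to_date("order_timestamp"))
    .groupBy("order_date")
    .agg(
        F.sum("amount").alias("total_revenue"),
        F.countDistinct("customer_id").alias("unique_customers"),
    )
)

# Load: write the curated result, partitioned by date
(daily_revenue
    .write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://example-bucket/curated/daily_revenue/"))

spark.stop()

Writing the output as date-partitioned Parquet is one common way to keep downstream SQL queries efficient, which ties into the performance-tuning expectations listed above.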