Compunnel, Inc.

Data Engineer - Python, SQL, AWS

Compunnel, Inc., Durham, North Carolina, United States, 27703


We are seeking an experienced Data Engineer with a strong background in Python, SQL, and AWS to join our team. This role involves designing and developing scalable data pipelines, performing data analysis and modeling, and contributing to the creation of an enterprise-wide Data Lake on AWS. The ideal candidate will be passionate about data, enjoy working in collaborative environments, and have a strong desire to innovate and learn.

Key Responsibilities

- Design and implement scalable ETL/ELT pipelines using AWS Glue, Lambda, Step Functions, and other AWS services.
- Integrate structured and unstructured data from diverse sources into data lakes and warehouses (e.g., S3, Redshift, RDS, Athena).
- Build and maintain cloud infrastructure for data analytics platforms using Terraform, CloudFormation, or similar IaC tools.
- Collaborate with data engineers, data scientists, and analysts to deliver high-quality platforms for data loading, reporting, and machine learning.
- Optimize data models and queries for performance and scalability.
- Monitor data pipelines and troubleshoot issues to ensure reliability and data integrity.
- Implement CI/CD pipelines for data engineering workflows using GitLab, Bitbucket, Jenkins, or GitHub Actions.
- Ensure compliance with data governance and security best practices.
- Implement access controls and encryption for sensitive data.

Required Qualifications

- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
- Extensive experience with relational databases such as Oracle or Snowflake.
- Experience in data warehousing, data modeling, and creation of data marts.
- Hands-on experience with AWS services including S3, Glue, Lambda, Redshift, RDS, Athena, and Step Functions.
- Experience with ETL technologies such as Informatica or SnapLogic.
- Proficiency in SQL and PySpark.
- Familiarity with orchestration tools like Apache Airflow or MWAA.
- Understanding of DevOps tools and practices (CDK, CI/CD, Git, Terraform).
- Experience with Agile methodologies (Kanban and Scrum).

Preferred Qualifications

- Experience with big data tools (Spark, Hive, Kafka).
- Knowledge of containerization (Docker, Kubernetes).
- Familiarity with data visualization tools (e.g., Power BI).
- AWS certifications (e.g., AWS Certified Data Analytics – Specialty).
- Experience with Business Intelligence and dashboard development.
- Exposure to DevOps, Continuous Integration, and Continuous Delivery tools (Maven, Jenkins, Stash, Ansible, Docker).
