Data Engineer | Agency Contract | SARS
Position Overview
We are seeking a skilled Data Engineer to design, develop, and maintain scalable data pipelines, ETL processes, and data infrastructure. The ideal candidate will have strong expertise in data modeling, SQL, big data technologies, and cloud platforms. You will collaborate with data scientists, analysts, and business teams to ensure efficient data flow and accessibility for decision-making.
Key Responsibilities
Design & Build Data Pipelines: Develop and optimize ETL/ELT workflows to ingest, process, and transform large datasets.
Data Warehousing & Modeling: Design and maintain data warehouses (e.g., Snowflake, Redshift, BigQuery) and implement efficient data models.
Big Data Processing: Work with distributed systems (e.g., Spark, Hadoop) to handle large-scale data processing.
Cloud & Infrastructure: Deploy and manage data solutions on cloud platforms (AWS, GCP, Azure) using services such as S3, Databricks, and Airflow.
Data Quality & Governance: Ensure data accuracy, consistency, and security through monitoring, validation, and governance practices.
Collaboration: Work closely with data scientists, analysts, and business teams to understand data requirements and deliver scalable solutions.
Performance Optimization: Improve query performance, reduce latency, and enhance system reliability.
Automation & CI/CD: Implement automation for data workflows and integrate with DevOps practices.
Required Skills & Qualifications
Education: Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
Experience: At least 5 years in data engineering or related roles.
Programming: Strong proficiency in Python and SQL; Java or Scala is a plus.
Databases & Big Data: Experience with SQL/NoSQL databases (PostgreSQL, MySQL, MongoDB), Spark, Hadoop, Kafka.
ETL & Data Pipelines: Hands-on experience with Airflow, dbt, Talend, Informatica, or similar tools.
Cloud Platforms: Expertise in AWS (Redshift, Glue, Lambda), GCP (BigQuery, Dataflow), or Azure (Data Factory, Synapse).
Data Warehousing: Knowledge of Snowflake, Redshift, or BigQuery.
DevOps & Version Control: Familiarity with Git, Docker, Kubernetes, CI/CD pipelines.
Analytical Skills: Strong problem-solving and data optimization abilities.
Preferred Skills
Experience with real-time data processing (Kafka, Flink).
Knowledge of machine learning pipelines and MLOps.
Certifications in AWS/GCP/Azure Data Engineering.
Interested candidates, please click "APPLY" to begin your application and submit your CV directly through the official PERSOLKELLY job application platform, GO Mobile.
We regret that only shortlisted candidates will be notified.
Saravanan | REG No : R1871815
PERSOLKELLY SINGAPORE PTE LTD | EA License No : 01C4394
This is in partnership with Employment and Employability Institute Pte Ltd (“e2i”). e2i is the empowering network for workers and employers seeking employment and employability solutions. e2i serves as a bridge between workers and employers, connecting with workers to offer job security through job-matching, career guidance and skills upgrading services, and partnering employers to address their manpower needs through recruitment, training and job redesign solutions. e2i is a tripartite initiative of the National Trades Union Congress set up to support nation-wide manpower and skills upgrading initiatives. By applying for this role, you consent to e2i’s PDPA.
By sending us your personal data and curriculum vitae (CV), you are deemed to consent to PERSOLKELLY Singapore Pte Ltd and its affiliates collecting, using and disclosing your personal data for the purposes set out in the Privacy Policy available at https://www.persolkelly.com.sg/policies. You acknowledge that you have read, understood, and agree with the Privacy Policy.