Compunnel
Job Summary
We are seeking a skilled GCP Data Engineer to design, build, and maintain scalable data infrastructure that supports the organization’s data-driven initiatives.
This role involves developing robust data pipelines, optimizing workflows, and ensuring data quality and integrity.
The ideal candidate will collaborate with cross-functional teams to enable efficient data processing, storage, and retrieval across cloud environments.
Key Responsibilities
- Design and implement scalable data pipelines for extracting, transforming, and loading data from various sources.
- Develop solutions that support efficient data storage, retrieval, and analysis in data warehouses and data lakes.
- Implement data validation and quality checks to ensure accuracy and consistency.
- Document data engineering processes, workflows, and systems for knowledge sharing.
- Identify opportunities to streamline data engineering processes and improve efficiency.
- Provide guidance and mentorship to junior data engineers.
- Collaborate with data scientists, analysts, and software engineers to support data initiatives.

Required Qualifications
- 5+ years of experience with SQL and NoSQL databases.
- 5+ years of experience with Python or a comparable scripting language.
- 5+ years of hands-on experience building modern data pipelines within GCP.
- 5+ years of experience with data warehouses and infrastructure components.
- 5+ years of experience with ETL/ELT processes and high-volume data pipelines.
- 5+ years of experience with reporting and analytics tools.
- 5+ years of experience in query optimization, data structures, transformation, metadata, and workload management.
- 5+ years of experience with big data and cloud architecture.
- 3–5 years of experience deploying and scaling applications in containerized environments.
- 5+ years of experience with real-time and streaming technologies.
- 3+ years of experience gathering complex requirements and managing stakeholder relationships.
- 3+ years of experience independently managing deliverables.

Preferred Qualifications
- Experience designing and building data engineering solutions in cloud environments (preferably GCP).
- Familiarity with Git, CI/CD pipelines, and DevOps best practices.
- Experience with Bash scripting, UNIX utilities, and commands.
- Exposure to ML/AI workflows.
- Understanding of software development methodologies (Agile, Waterfall).
- Ability to work with multiple tools and languages to analyze and manipulate data from diverse sources.
- Knowledge of API development and microservices architecture.
- Experience with schema design and dimensional data modeling.
- Strong collaboration and communication skills across teams.
- Experience with data visualization and reporting tools.
- Experience designing, building, and maintaining data processing systems.
Education:
Bachelor's Degree