Compunnel, Inc.
Overview
We are seeking a skilled Data Engineer to design, build, and maintain robust data pipelines and data models that support enterprise analytics and reporting. The ideal candidate will have strong SQL skills, experience with modern ETL/ELT tools, and a solid understanding of data warehousing concepts. This role requires working with large datasets, optimizing performance, and collaborating with cross-functional teams to ensure data quality and accessibility.

Responsibilities
- Develop and maintain efficient ETL/ELT pipelines using tools such as Apache Airflow, dbt, or similar.
- Design and implement scalable data models and data warehousing solutions (e.g., star and snowflake schemas).
- Write complex SQL queries and optimize database performance across platforms like PostgreSQL, MySQL, SQL Server, and Snowflake.
- Work with large datasets to ensure high performance and reliability of data processing workflows.
- Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements.
- Ensure data quality, integrity, and consistency across systems.
- Use version control systems (e.g., Git) for code management and collaboration.
- Write scripts and automation tools in Python or Scala to support data engineering tasks.

Required Qualifications
- Strong experience in SQL and relational database development (e.g., PostgreSQL, MySQL, SQL Server, Snowflake).
- Proficiency in building ETL/ELT pipelines using tools like Apache Airflow, dbt, or similar.
- Solid understanding of data warehousing concepts and data modeling techniques.
- Experience working with large datasets and optimizing data processing performance.
- Familiarity with scripting or programming languages such as Python or Scala.
- Experience with version control systems such as Git.
- Strong problem-solving skills and attention to detail.
- Bachelor’s degree in Computer Science, Information Systems, or a related field.

Preferred Qualifications
- Experience with cloud data platforms (e.g., AWS, Azure, GCP).
- Familiarity with data governance and data quality frameworks.
- Exposure to real-time data streaming tools and architectures.