Compunnel, Inc.
We are seeking an experienced Snowflake DBT Engineer to design, develop, and maintain scalable ELT pipelines and data models.
The ideal candidate will have strong expertise in Snowflake, DBT, and modern data engineering practices, with a focus on performance optimization, automation, and data governance.
Key Responsibilities
- Design, develop, and maintain ELT pipelines using Snowflake and DBT.
- Build and optimize data models in Snowflake to support analytics and reporting.
- Implement modular, testable SQL transformations using DBT (see the sketch after this list).
- Integrate DBT workflows into CI/CD pipelines and manage infrastructure as code using Terraform.
- Collaborate with data scientists, analysts, and business stakeholders to translate requirements into technical solutions.
- Optimize Snowflake performance through clustering keys, micro-partition pruning, search optimization, and materialized views.
- Automate data ingestion and transformation workflows using Airflow or similar orchestration tools.
- Ensure data quality, governance, and security across pipelines.
- Troubleshoot and resolve performance bottlenecks and data issues.
- Maintain documentation for data architecture, pipelines, and operational procedures.
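To make the DBT expectation concrete, below is a minimal sketch of the kind of modular, testable transformation this role involves. The source, model, and column names (raw.orders, stg_orders, order_ts, amount_usd) are hypothetical placeholders for illustration, not part of any actual Compunnel project.

```sql
-- models/staging/stg_orders.sql  (hypothetical model name)
-- A modular DBT staging model: each CTE does one job, and the model's
-- columns can be covered by not_null / unique tests declared in schema.yml.

with source as (

    -- {{ source() }} resolves to the raw table registered in sources.yml
    select * from {{ source('raw', 'orders') }}

),

renamed as (

    select
        order_id,
        customer_id,
        cast(order_ts as timestamp_ntz) as ordered_at,  -- Snowflake timestamp type
        amount_usd
    from source

)

select * from renamed
```

Tests such as unique and not_null on order_id would be declared in the model's schema.yml, and `dbt build` runs models and their tests together, which is what makes this style of transformation straightforward to wire into a CI/CD pipeline.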
Required Qualifications
- Bachelor’s or Master’s degree in Computer Science, Data Engineering, or related field.
- 7+ years of experience in data engineering, with at least 2 years focused on Snowflake and DBT.
- Strong proficiency in SQL and Python.
- Experience with cloud platforms (AWS, GCP, or Azure).
- Familiarity with Git, CI/CD, and Infrastructure as Code tools (Terraform, CloudFormation).
- Knowledge of data modeling techniques (star schema, normalization) and ELT best practices.