CriticalRiver Inc.
Title:
Senior Data Engineer (dbt experience required)
Location:
Pleasanton, California (hybrid work)
Responsibilities:
We are seeking a Senior Data Engineer to own and architect core data infrastructure. In this strategic role, you will design and implement scalable ELT pipelines using Postgres, dbt, and Snowflake, enabling data products that power product strategy and business operations.
You’ll collaborate across Finance, Product, and Marketing teams to ensure high-quality, trusted data flows through robust and secure systems.
You’ll optimize data models across transactional and cloud environments, implement advanced Snowflake features, and build hybrid pipelines from Postgres to Snowflake.
You'll also lead the development of CI/CD workflows, data quality frameworks, and observability systems.
Requirements:
10+ years in Data Engineering, including 3+ years in Snowflake & dbt.
Advanced knowledge of Snowflake (data modeling, performance tuning, governance).
Proficient in SQL and Python, including API integrations and automation.
Strong understanding of data warehousing, dimensional modeling, and system design principles.
Experience with AWS (mandatory); GCP or Azure is a plus.
Seniority level Mid-Senior level
Employment type Contract
Job function Information Technology
Industries IT Services and IT Consulting