USM

Sr. Data Engineer

USM, Pleasanton, California, United States, 94566

Skills: Data Engineering, EL..

Visa Types: Green Card, US Citiz..

Title: Senior Data Engineer

Location: Pleasanton, California (hybrid work)

Job type: Contract

Rate: $80/hr on C2C

Role Overview:

As a Senior/Lead Data Engineer, you will lead the design, development, and ownership of core data infrastructure, from pipelines to storage to data products. You'll be a strategic partner across teams, ensuring that our data systems are robust, scalable, and optimized for performance. With executive visibility and deep cross-functional collaboration, the solutions you build will directly influence product strategy and operational excellence.

This is a unique opportunity to build from the ground up while working with cutting-edge technologies such as PostgreSQL, dbt, Snowflake, and modern orchestration frameworks.

Key Responsibilities:

• Architect, design, and implement scalable ELT pipelines using Snowflake, dbt, and Postgres.

• Optimize data models in both Snowflake (cloud DW) and Postgres (transactional/operational data).

• Implement advanced Snowflake features (Snowpipe, Streams, Tasks, Dynamic Tables, RBAC, Security).

• Design and maintain hybrid pipelines (Postgres ↔ Snowflake) for seamless data integration.

• Establish data quality and testing frameworks using dbt tests and metadata-driven validation.

• Implement CI/CD workflows (Git, GitHub Actions, or similar) for dbt/Snowflake/Postgres projects.

• Drive observability, monitoring, and performance tuning of pipelines (logs, lineage, metrics).

• Provide technical leadership and mentorship to engineers and analysts.

• Collaborate with Finance, Product, Marketing, and GTM teams to deliver trusted, business-critical data models.

• Support financial data processes (consolidation, reconciliation, close automation).

• Evaluate and experiment with emerging AI and data technologies, providing feedback to influence product direction.

Requirements:

Experience:

8+ years in Data Engineering, including 3+ years with Snowflake and dbt.

Database Expertise:

o Deep hands-on experience with dbt (Core/Cloud): macros, testing, documentation, and packages.

o Strong expertise in Postgres (schema design, optimization, stored procedures, large-scale workloads).

o Advanced knowledge of Snowflake (data modeling, performance tuning, governance).

Programming:

Proficient in SQL and Python, including API integrations and automation.

Orchestration & ETL:

Hands-on with orchestrators such as Airflow, Dagster, or Prefect, and with ETL/ELT tools such as Fivetran and NiFi.

Data Architecture:

Strong understanding of data warehousing, dimensional modeling, medallion architecture, and system design principles.

Cloud:

Experience with AWS (mandatory); GCP or Azure is a plus.

DevOps:

Experience building Git-based (GitOps) CI/CD pipelines for data workflows.

Leadership:

Proven ability to mentor teams, collaborate cross-functionally, and deliver impact in fast-paced environments.

Communication:

Excellent written and verbal communication skills.