BayRockLabs

Snowflake Data Engineer

BayRockLabs, San Jose, California, United States, 95199


About BayRock Labs

At BayRock Labs, we pioneer innovative tech solutions that drive business transformation. As a leading product engineering firm based in Silicon Valley, we provide full-cycle product development, leveraging cutting‑edge technologies in AI, ML, and data analytics. Our collaborative, inclusive culture fosters professional growth and work‑life balance. Join us to work on ground‑breaking projects and be part of a team that values excellence, integrity, and innovation. Together, let's redefine what's possible in technology.

We are seeking a highly skilled Snowflake Data Engineer to design, develop, and optimize our enterprise data foundation, specifically for our production‑level AI applications built on Snowflake Cortex. This role is crucial for ensuring the AI agents receive clean, aggregated, and optimized data efficiently.

Key Responsibilities

Snowflake Architecture & Design: Design and implement scalable, high‑performance data models (e.g., Data Vault, dimensional modeling) within Snowflake, specifically structuring data for AI/ML consumption.

Data Aggregation & Optimization: Lead the effort to reduce our existing columns down to the necessary, non‑duplicated, and optimized feature set required by the AI agents.

ETL/ELT Development: Develop robust, performant ELT pipelines using Snowpipe, Tasks, Streams, and Dynamic Tables to aggregate data from diverse sources into Snowflake (see the pipeline sketch after this list).

Performance Tuning: Optimize Snowflake queries, clustering keys, and warehouse sizing to ensure low‑latency data retrieval for real‑time agent workflows and baseline report generation (see the tuning sketch below).

Collaboration: Work closely with the AI/ML agent developers to expose data via optimized views, UDFs, and stored procedures that can be easily called by Snowpark or Cortex Analyst tools (see the exposure sketch below).

Data Governance: Ensure data quality, lineage, and adherence to security policies (e.g., Row Access Policies, Data Masking) within the Snowflake environment (see the governance sketch below).
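As a minimal sketch of the Snowpipe, Stream, Task, and Dynamic Table pattern named in the ETL/ELT item, the following could serve as a starting point; every object name (raw_events, my_stage, transform_wh, and so on) is hypothetical, and this is an illustration of the pattern rather than the team's actual pipeline.

-- Landing table: one VARIANT column per raw JSON record.
CREATE OR REPLACE TABLE raw_events (payload VARIANT);

-- Snowpipe continuously copies files from an external stage as they arrive.
CREATE OR REPLACE PIPE raw_events_pipe AUTO_INGEST = TRUE AS
  COPY INTO raw_events
  FROM @my_stage/events/
  FILE_FORMAT = (TYPE = 'JSON');

-- Stream tracks which rows in raw_events have not yet been consumed downstream.
CREATE OR REPLACE STREAM raw_events_stream ON TABLE raw_events;

-- Curated, typed table for downstream consumers.
CREATE OR REPLACE TABLE curated_events (
  event_id STRING,
  user_id  STRING,
  event_ts TIMESTAMP_NTZ
);

-- Scheduled task drains the stream, and only runs when the stream has data.
CREATE OR REPLACE TASK load_curated_events
  WAREHOUSE = transform_wh
  SCHEDULE  = '5 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('RAW_EVENTS_STREAM')
AS
  INSERT INTO curated_events (event_id, user_id, event_ts)
  SELECT payload:event_id::STRING,
         payload:user_id::STRING,
         payload:ts::TIMESTAMP_NTZ
  FROM raw_events_stream
  WHERE METADATA$ACTION = 'INSERT';

ALTER TASK load_curated_events RESUME;  -- tasks are created suspended

-- Dynamic table keeps an aggregated feature set fresh for the AI agents.
CREATE OR REPLACE DYNAMIC TABLE agent_event_features
  TARGET_LAG = '10 minutes'
  WAREHOUSE  = transform_wh
AS
  SELECT user_id,
         COUNT(*)      AS event_count,
         MAX(event_ts) AS last_event_ts
  FROM curated_events
  GROUP BY user_id;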
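For the performance-tuning item, a brief sketch of the two levers it names, clustering keys and warehouse sizing, again using the hypothetical object names from the pipeline sketch above.

-- Cluster the curated table on its most common filter/join key.
ALTER TABLE curated_events CLUSTER BY (event_ts);

-- Check how well micro-partitions align with the clustering key.
SELECT SYSTEM$CLUSTERING_INFORMATION('CURATED_EVENTS', '(event_ts)');

-- Right-size the transform warehouse; start small and scale on evidence.
ALTER WAREHOUSE transform_wh SET WAREHOUSE_SIZE = 'MEDIUM';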
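One way the Collaboration item's view/UDF exposure could look; the view and function names here are invented, and the sketch builds on the hypothetical agent_event_features table above.

-- Narrow, stable view over the aggregated feature set.
CREATE OR REPLACE VIEW agent_features_v AS
  SELECT user_id, event_count, last_event_ts
  FROM agent_event_features;

-- Scalar SQL UDF that agent code can call in a query.
CREATE OR REPLACE FUNCTION user_event_count(p_user_id STRING)
RETURNS NUMBER
AS
$$
  SELECT event_count FROM agent_event_features WHERE user_id = p_user_id
$$;

-- From Snowpark Python this surfaces as, e.g., session.table("AGENT_FEATURES_V"),
-- or SELECT user_event_count('some_user') issued via session.sql(...).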
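Finally, a minimal governance sketch for the Data Masking and Row Access Policy features named in the last item; the roles, tables, and columns are invented for illustration.

-- Hypothetical table holding account data with a sensitive column.
CREATE OR REPLACE TABLE curated_accounts (account_id STRING, email STRING, region STRING);

-- Mapping of which role may see which region's rows.
CREATE OR REPLACE TABLE region_grants (role_name STRING, region STRING);

-- Masking policy: only DATA_ADMIN sees raw email addresses.
CREATE OR REPLACE MASKING POLICY email_mask AS (val STRING) RETURNS STRING ->
  CASE WHEN CURRENT_ROLE() = 'DATA_ADMIN' THEN val ELSE '***MASKED***' END;

ALTER TABLE curated_accounts MODIFY COLUMN email SET MASKING POLICY email_mask;

-- Row access policy: admins see everything; other roles see only granted regions.
CREATE OR REPLACE ROW ACCESS POLICY region_policy AS (row_region STRING) RETURNS BOOLEAN ->
  CURRENT_ROLE() = 'DATA_ADMIN'
  OR EXISTS (
    SELECT 1 FROM region_grants g
    WHERE g.role_name = CURRENT_ROLE() AND g.region = row_region
  );

ALTER TABLE curated_accounts ADD ROW ACCESS POLICY region_policy ON (region);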

Required Skills & Qualifications

Expert-level proficiency in Snowflake architecture, optimization, and advanced features (e.g., Streams, Dynamic Tables, Time Travel).

Deep expertise in SQL and data modeling for high‑volume, complex datasets.

Strong hands‑on experience with Python and Snowpark for custom data transformation logic.

Proven ability to perform data cleansing, feature engineering, and dimensionality reduction (reducing columns).
