DPR Construction
Snowflake Architect/Admin
DPR is looking for an experienced Snowflake Data Architect/Admin to join our Data and AI team and work closely with the Data Platform, BI, and Enterprise Architecture teams to influence the technical direction of DPR’s data engineering and analytics initiatives.
You will partner with cross-functional teams, including business stakeholders, data engineers, the AI team, and technical leads, to ensure alignment between business needs and data architecture and to define data models for specific focus areas.
Responsibilities
Design, build, and own the overall data architecture across the Snowflake data platform — including the data lake, data warehouse, and data consumption layers.
Monitor and optimize Snowflake performance, including query performance tuning, resource allocation, and cost management.
Develop, optimize, and manage conceptual and logical data architectures and integrations across both internal and external systems.
Experiment with prototype solutions using the latest Snowflake features, demonstrating practical use cases and driving early adoption across the business.
Implement and manage Snowflake's security features, such as access controls, encryption, and data masking.
Collaborate closely with engineering, data, and analytics teams to deliver business-critical data solutions.
Drive high-priority data initiatives using Azure/AWS alongside Snowflake and DBT; the role may also extend into advanced analytics and AI concepts.
Leverage Snowflake Cortex to enable natural language query experiences, document understanding, and AI-driven insights directly within the Snowflake environment.
Support the AI team with generative AI use cases powered by Snowflake’s LLM functions, such as text summarization, classification, and Q&A on enterprise data.
Implement and manage vectorized data pipelines for semantic search and retrieval-augmented generation (RAG) within Snowflake (an illustrative sketch of this pattern appears after this list).
Stay current with evolving Snowflake AI capabilities (Cortex, Snowpark Container Services, Document AI, and Feature Store) and apply them to improve data accessibility and intelligence.
Design scalable, secure, and high-performance data pipelines to support evolving business needs.
Partner with strategic customers to understand their vision and ensure future requirements are incorporated into the platform roadmap.
Participate in all phases of the project lifecycle and lead data architecture initiatives.
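To give a concrete sense of the Cortex and RAG responsibilities above, here is a minimal Snowpark Python sketch: it embeds document text with a Cortex embedding function, retrieves the chunks most similar to a question, and grounds a Cortex LLM answer in that context. This is an illustrative assumption of how such a pipeline might look, not DPR's implementation; the connection parameters and the DOCS and DOC_EMBEDDINGS tables are hypothetical placeholders.

from snowflake.snowpark import Session

# Hypothetical connection details; a real deployment would use key-pair auth
# or an existing session rather than inline credentials.
connection_parameters = {
    "account": "<account>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "<warehouse>",
    "database": "<database>",
    "schema": "<schema>",
}
session = Session.builder.configs(connection_parameters).create()

# 1. Materialize an embedding per document chunk with a Cortex embedding function.
session.sql("""
    CREATE OR REPLACE TABLE DOC_EMBEDDINGS AS
    SELECT
        DOC_ID,
        DOC_TEXT,
        SNOWFLAKE.CORTEX.EMBED_TEXT_768('snowflake-arctic-embed-m', DOC_TEXT) AS EMBEDDING
    FROM DOCS
""").collect()

# 2. Retrieve the chunks most similar to the question by cosine similarity.
question = "What are the submittal requirements for structural steel?"
context_rows = session.sql(
    """
    SELECT DOC_TEXT
    FROM DOC_EMBEDDINGS
    ORDER BY VECTOR_COSINE_SIMILARITY(
        EMBEDDING,
        SNOWFLAKE.CORTEX.EMBED_TEXT_768('snowflake-arctic-embed-m', ?)
    ) DESC
    LIMIT 3
    """,
    params=[question],
).collect()
context = "\n".join(row["DOC_TEXT"] for row in context_rows)

# 3. Answer with a Cortex LLM function, grounded in the retrieved context (the RAG step).
prompt = f"Answer using only the context below.\n\nContext:\n{context}\n\nQuestion: {question}"
answer = session.sql(
    "SELECT SNOWFLAKE.CORTEX.COMPLETE('mistral-large', ?) AS ANSWER",
    params=[prompt],
).collect()[0]["ANSWER"]
print(answer)

A production pipeline might instead use Snowflake's Cortex Search service for the retrieval step, but the embed, retrieve, and complete flow is the same.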
Qualifications
10+ years of experience in data architecture and engineering, including at least 5 years with Snowflake designing and delivering solutions at scale in cloud environments.
Hands-on experience with secure and scalable enterprise data architectures using Microsoft Azure or AWS.
Deep knowledge of Snowflake and DBT, with experience building robust data ingestion and ETL/ELT pipelines.
Experience in designing data structures for data lakes and cloud data warehouses to support analytics and reporting.
Hands-on experience with Snowflake Cortex, Snowpark ML, or Snowflake’s AI/ML features for model training, deployment, or inference.
Understanding of vector embeddings, model governance, and prompt-driven analytics within Snowflake.
Strong proficiency in SQL, Python, and Git, and hands-on experience with Snowpark DataFrames and UDFs for AI model integration (see the brief sketch after this list).
Familiarity with agile methodologies and experience working closely with cross-functional teams to manage technical backlogs.
Skilled in orchestrating and automating data pipelines within a DevOps framework.
Strong communicator with the ability to present ideas clearly and influence stakeholders, and a passion for enabling data-driven transformation.
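As a brief illustration of the Snowpark DataFrame and UDF proficiency listed above, the sketch below registers a simple Python UDF and applies it to a DataFrame column. It assumes an already-open Snowpark session (such as the one in the earlier sketch); the REVIEWS table, its REVIEW_TEXT column, and the clean_text helper are hypothetical examples rather than part of DPR's platform.

from snowflake.snowpark.context import get_active_session
from snowflake.snowpark.functions import col, udf
from snowflake.snowpark.types import StringType

# Reuse a session that has already been established (e.g. in a Snowflake notebook
# or via Session.builder as in the earlier sketch).
session = get_active_session()

# Register a scalar Python UDF in the current session.
@udf(name="clean_text", replace=True, input_types=[StringType()], return_type=StringType())
def clean_text(raw: str) -> str:
    # Trivial normalization, used only to show the registration and call pattern;
    # in practice this step might prepare text for a Cortex LLM or embedding call.
    return " ".join(raw.split()).lower() if raw else ""

# Apply the UDF to a Snowpark DataFrame column and preview the result.
reviews = session.table("REVIEWS")
cleaned = reviews.with_column("CLEAN_TEXT", clean_text(col("REVIEW_TEXT")))
cleaned.show()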