Jobs via Dice
Overview
Data Architect - Snowflake

Location: Tampa, FL; Jersey City, NJ; Dallas, TX (Hybrid onsite)

Key responsibilities
Architecture Design
- Design and maintain scalable, high-performance Snowflake data warehouse architectures to support both batch and real-time data ingestion.
- Define and enforce data modeling standards (e.g., dimensional, normalized, or hybrid) tailored to Snowflake's architecture.
- Architect data integration frameworks that support complex transformations, multi-source ingestion, and cross-domain data harmonization.
- Internal data ingestion and integration: design and implement ingestion pipelines from diverse sources.
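For illustration, the batch-plus-real-time ingestion pattern above is commonly built in Snowflake with COPY INTO for scheduled batch loads and Snowpipe for continuous loads. The sketch below is a minimal, assumption-laden example: the stage, table, and file-format names are hypothetical, credentials are placeholders, and it assumes the snowflake-connector-python package plus an external stage already pointing at cloud storage.

```python
# Illustrative sketch only: hypothetical stage/table names, assumes
# snowflake-connector-python and an existing external stage.
import snowflake.connector

DDL_STATEMENTS = [
    # File format shared by batch and continuous loads.
    """CREATE FILE FORMAT IF NOT EXISTS raw.orders_csv
         TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY = '"' SKIP_HEADER = 1""",
    # Batch path: explicit COPY INTO, typically run on a schedule.
    """COPY INTO raw.orders
         FROM @raw.orders_stage
         FILE_FORMAT = (FORMAT_NAME = 'raw.orders_csv')""",
    # Continuous path: Snowpipe auto-ingests new files as they land.
    """CREATE PIPE IF NOT EXISTS raw.orders_pipe AUTO_INGEST = TRUE AS
         COPY INTO raw.orders
         FROM @raw.orders_stage
         FILE_FORMAT = (FORMAT_NAME = 'raw.orders_csv')""",
]

def run_ddl() -> None:
    # Placeholder credentials; a real deployment would use secrets management.
    conn = snowflake.connector.connect(
        account="my_account", user="my_user", password="my_password",
        warehouse="LOAD_WH", database="ANALYTICS", schema="RAW",
    )
    try:
        cur = conn.cursor()
        for stmt in DDL_STATEMENTS:
            cur.execute(stmt)
    finally:
        conn.close()

if __name__ == "__main__":
    run_ddl()
```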
Integration
- Integrate data from diverse sources (e.g., APIs, on-prem databases, cloud storage, third-party feeds) into Snowflake.
- Ensure data pipelines are resilient, fault-tolerant, and optimized for performance and cost.
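One common Snowflake-native way to keep integrated tables current and fault-tolerant is a stream on the landing table plus a scheduled task that merges only new changes. The DDL below is a hedged sketch with hypothetical table, column, and warehouse names; it assumes the landing and curated tables already exist, and the statements are merely printed rather than executed.

```python
# Hedged sketch: hypothetical object names; statements are only printed
# here and would be executed against Snowflake in practice.
INCREMENTAL_MERGE_DDL = [
    # Capture inserts/updates on the landing table.
    """CREATE STREAM IF NOT EXISTS raw.orders_stream ON TABLE raw.orders""",
    # Scheduled task: runs only when the stream has data, merges changes
    # into the curated table keyed on order_id.
    """CREATE TASK IF NOT EXISTS curated.merge_orders
         WAREHOUSE = TRANSFORM_WH
         SCHEDULE = '5 MINUTE'
         WHEN SYSTEM$STREAM_HAS_DATA('RAW.ORDERS_STREAM')
       AS
       MERGE INTO curated.orders AS t
       USING raw.orders_stream AS s
         ON t.order_id = s.order_id
       WHEN MATCHED THEN UPDATE SET t.amount = s.amount, t.status = s.status
       WHEN NOT MATCHED THEN INSERT (order_id, amount, status)
         VALUES (s.order_id, s.amount, s.status)""",
    # Tasks are created suspended; resume to start the schedule.
    """ALTER TASK curated.merge_orders RESUME""",
]

if __name__ == "__main__":
    for stmt in INCREMENTAL_MERGE_DDL:
        print(stmt.strip(), end=";\n\n")
```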
Data Governance / Quality
- Define and implement data quality frameworks, validation rules, and monitoring processes.
- Collaborate with data governance teams to ensure compliance with data privacy, security, and regulatory requirements.
- Support metadata management, lineage tracking, and data catalog integration.
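Validation rules of the kind described above are often expressed as small SQL checks whose results are compared against thresholds. The following is an illustrative sketch only: the table, columns, and thresholds are hypothetical, `run_query` stands in for whatever execution layer the platform uses, and a real framework would also log results and feed monitoring and alerting.

```python
# Hedged illustration of simple data-quality checks; object names and
# thresholds are hypothetical.
from typing import Callable

CHECKS = [
    # (description, SQL returning a single numeric value, max allowed value)
    ("null order_id rows",
     "SELECT COUNT(*) FROM curated.orders WHERE order_id IS NULL", 0),
    ("duplicate order_id rows",
     """SELECT COUNT(*) FROM (
          SELECT order_id FROM curated.orders
          GROUP BY order_id HAVING COUNT(*) > 1)""", 0),
    ("hours since most recent load",
     "SELECT DATEDIFF('hour', MAX(load_ts), CURRENT_TIMESTAMP()) FROM curated.orders",
     24),
]

def run_checks(run_query: Callable[[str], float]) -> list[str]:
    """Run each check and return the descriptions of any failures."""
    failures = []
    for name, sql, max_allowed in CHECKS:
        value = run_query(sql)
        if value > max_allowed:
            failures.append(f"{name}: observed {value}, allowed <= {max_allowed}")
    return failures
```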
Performance / Optimization
- Monitor and tune Snowflake workloads, including query performance, warehouse sizing, and storage optimization.
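Workload tuning typically starts from Snowflake's query history. The query below is a sketch against the ACCOUNT_USAGE.QUERY_HISTORY view (which requires appropriate privileges and lags real time); the lookback window and row limit are arbitrary choices for illustration.

```python
# Sketch of a tuning starting point: the most time-consuming successful
# queries over the last 7 days. Assumes access to SNOWFLAKE.ACCOUNT_USAGE;
# the window and limit are illustrative.
TOP_QUERIES_SQL = """
SELECT
    warehouse_name,
    query_id,
    total_elapsed_time / 1000 AS elapsed_s,
    bytes_scanned / POWER(1024, 3) AS gb_scanned,
    LEFT(query_text, 120) AS query_snippet
FROM snowflake.account_usage.query_history
WHERE start_time >= DATEADD('day', -7, CURRENT_TIMESTAMP())
  AND execution_status = 'SUCCESS'
ORDER BY total_elapsed_time DESC
LIMIT 25
"""

if __name__ == "__main__":
    # Execute with your preferred client, e.g. snowflake-connector-python.
    print(TOP_QUERIES_SQL.strip())
```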
Collaboration / Leadership
- Provide architectural guidance and mentorship to engineering teams on best practices for Snowflake and data integration.
- Lead design reviews and contribute to strategic planning for enterprise data initiatives.
Qualifications / Education / Experience
- Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field.
- 7 years of experience in data architecture, data engineering, or enterprise data warehousing.
- Proven experience designing and implementing large-scale Snowflake data platforms in production environments.
- Snowflake expertise: deep understanding of Snowflake architecture, including virtual warehouses, Snowpipe, Streams, Tasks, and Time Travel; experience with Snowflake performance tuning, cost optimization, and security best practices; familiarity with Snowflake data sharing, the data marketplace, and multi-cluster warehouse configurations.
- Data integration and pipeline skills: strong experience with both batch and streaming data ingestion frameworks (e.g., Apache Spark, Kafka, Python, dbt, Airflow); ability to design and manage complex ETL/ELT pipelines integrating data from diverse sources (e.g., APIs, cloud storage, on-prem systems); proficiency in SQL and scripting languages (e.g., Python, Shell) for data transformation and orchestration.
- Data modeling and governance: expertise in data modeling techniques (dimensional, normalized, data vault) and metadata management; understanding of data governance, data quality frameworks, and regulatory compliance (e.g., GDPR, SOX); experience with data cataloging and lineage-tracking tools (e.g., Alation, Collibra, Microsoft Purview).
- Leadership / communication: strong collaboration skills to work with cross-functional teams; ability to translate business requirements into scalable data architecture; excellent communication and documentation skills.
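The orchestration stack named in the qualifications (dbt plus Airflow) is commonly wired together as a small DAG that runs dbt against Snowflake after ingestion completes. The sketch below assumes a recent Airflow 2.x release and an existing dbt project whose profile targets Snowflake; the DAG id, paths, and schedule are hypothetical.

```python
# Hedged sketch of an ELT orchestration DAG: an ingestion step followed by
# dbt transformations and tests. Assumes recent Airflow 2.x and an existing
# dbt project targeting Snowflake; ids, paths, and schedule are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="snowflake_elt_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Placeholder ingestion step; in practice this might trigger Snowpipe
    # loads or a scheduled COPY INTO job.
    ingest = BashOperator(
        task_id="ingest_raw_data",
        bash_command="python /opt/pipelines/load_raw.py",
    )

    # Run dbt models, then dbt tests, against the Snowflake target.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt/analytics && dbt run --target prod",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/dbt/analytics && dbt test --target prod",
    )

    ingest >> dbt_run >> dbt_test
```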