Virtusa
Description
Job Summary:
We are seeking a highly experienced and technically proficient Senior Snowflake Engineer / Architect to join our data platform team. The ideal candidate will possess 7-10 years of hands-on experience designing, developing, and optimizing data solutions, with deep expertise in the Snowflake ecosystem. A critical aspect of this role involves leading and executing complex data migrations from legacy platforms such as Cloudera, Netezza, Teradata, and Oracle Exadata to Snowflake. This role requires a deep understanding of data warehousing concepts, ETL/ELT processes, performance tuning, and best practices for cloud data platforms. The successful candidate will play a pivotal role in shaping our data strategy, ensuring the scalability, reliability, and efficiency of our data infrastructure.
Responsibilities:
Snowflake Architecture & Design:
Design, develop, and implement scalable, robust, and high-performance data warehousing solutions on Snowflake. Define and enforce best practices for Snowflake architecture, data modeling, security, and governance. Lead the selection and implementation of appropriate Snowflake features and tools to meet business requirements.
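By way of illustration only, the following is a minimal sketch of the kind of dimensional (star-schema) modeling this role covers, created through the Snowflake Python connector. The database, schema, warehouse, table, and column names and the connection parameters are hypothetical placeholders, not an actual design.

```python
# Minimal star-schema sketch (hypothetical names throughout).
# Requires: pip install snowflake-connector-python
import snowflake.connector

DDL = [
    # Dimension table: one row per customer.
    """
    CREATE TABLE IF NOT EXISTS DIM_CUSTOMER (
        CUSTOMER_KEY  INTEGER AUTOINCREMENT,
        CUSTOMER_ID   STRING,
        CUSTOMER_NAME STRING,
        REGION        STRING
    )
    """,
    # Dimension table: one row per calendar date.
    """
    CREATE TABLE IF NOT EXISTS DIM_DATE (
        DATE_KEY  INTEGER,
        FULL_DATE DATE,
        YEAR      INTEGER,
        MONTH     INTEGER
    )
    """,
    # Fact table keyed to the dimensions. Snowflake records but does not
    # enforce primary/foreign key constraints; they document the model.
    """
    CREATE TABLE IF NOT EXISTS FACT_SALES (
        CUSTOMER_KEY INTEGER,
        DATE_KEY     INTEGER,
        ORDER_ID     STRING,
        QUANTITY     INTEGER,
        AMOUNT       NUMBER(12, 2)
    )
    """,
]

conn = snowflake.connector.connect(
    account="<account>", user="<user>", password="<password>",
    role="SYSADMIN", warehouse="DEV_WH", database="ANALYTICS", schema="MART",
)
try:
    cur = conn.cursor()
    for stmt in DDL:
        cur.execute(stmt)
finally:
    conn.close()
```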
Data Migration & Integration:
Lead end-to-end data migration projects from legacy on-premises data platforms (Cloudera, Netezza, Teradata, Oracle Exadata, etc.) to Snowflake. Develop and execute comprehensive migration strategies, including data profiling, schema conversion, data loading, and validation. Design and implement efficient ETL/ELT pipelines for data ingestion, transformation, and loading into Snowflake using tools like Matillion, DBT, Fivetran, Azure Data Factory, AWS Glue, or similar.
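As one small illustration of the validation step mentioned above, the sketch below reconciles row counts between a legacy source and Snowflake after a load. It assumes the source-side counts were captured separately (for example, exported from Teradata or Netezza before cutover); table names, counts, and connection details are placeholders.

```python
# Post-load validation sketch: compare row counts captured on the legacy
# platform against the same tables loaded into Snowflake.
import snowflake.connector

# Counts captured on the legacy platform before cutover (illustrative values).
source_counts = {
    "CUSTOMERS": 1_204_331,
    "ORDERS": 58_112_940,
}

conn = snowflake.connector.connect(
    account="<account>", user="<user>", password="<password>",
    warehouse="MIGRATION_WH", database="STAGING", schema="LEGACY_COPY",
)
try:
    cur = conn.cursor()
    mismatches = []
    for table, expected in source_counts.items():
        # Table names come from a trusted, controlled config, not user input.
        cur.execute(f"SELECT COUNT(*) FROM {table}")
        actual = cur.fetchone()[0]
        if actual != expected:
            mismatches.append((table, expected, actual))

    if mismatches:
        for table, expected, actual in mismatches:
            print(f"MISMATCH {table}: source={expected} snowflake={actual}")
    else:
        print("All row counts reconcile.")
finally:
    conn.close()
```

In practice this would be extended with column-level checksums or aggregate comparisons; row counts alone only catch gross load failures.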
Performance Optimization & Tuning:
Identify and resolve performance bottlenecks within Snowflake, optimizing queries, data loads, and overall system efficiency. Implement and manage Snowflake cost optimization strategies (e.g., warehouse sizing, auto-suspend, query optimization). Monitor Snowflake usage and performance, providing recommendations for continuous improvement.
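Two of the levers mentioned above are sketched below: tightening a warehouse's auto-suspend window and reviewing the slowest recent queries as tuning candidates. The warehouse name and thresholds are illustrative, and reading SNOWFLAKE.ACCOUNT_USAGE assumes a role with access to that shared database.

```python
# Cost and performance monitoring sketch (hypothetical warehouse name).
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account>", user="<user>", password="<password>",
    role="ACCOUNTADMIN", warehouse="ADMIN_WH",
)
try:
    cur = conn.cursor()

    # Suspend the warehouse after 60 idle seconds to limit credit burn.
    cur.execute("ALTER WAREHOUSE ANALYTICS_WH SET AUTO_SUSPEND = 60 AUTO_RESUME = TRUE")

    # Ten slowest queries over the past 7 days, as candidates for tuning.
    cur.execute("""
        SELECT query_id,
               warehouse_name,
               total_elapsed_time / 1000 AS elapsed_seconds,
               query_text
        FROM snowflake.account_usage.query_history
        WHERE start_time >= DATEADD('day', -7, CURRENT_TIMESTAMP())
        ORDER BY total_elapsed_time DESC
        LIMIT 10
    """)
    for query_id, wh, elapsed_s, text in cur.fetchall():
        print(f"{elapsed_s:>10.1f}s  {wh}  {query_id}  {(text or '')[:80]}")
finally:
    conn.close()
```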
Data Governance & Security:
Implement and enforce data governance policies within Snowflake, including role-based access control (RBAC), data masking, and encryption. Ensure compliance with data privacy regulations (e.g., GDPR, CCPA) within the Snowflake environment.
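To make the governance items above concrete, here is a minimal sketch of a read-only role plus a masking policy applied to an email column. Role, database, schema, table, and column names are placeholders; dynamic data masking also assumes a Snowflake edition that supports it, and a single ACCOUNTADMIN session is used only for brevity (real deployments would split these duties across roles).

```python
# RBAC and data-masking sketch (hypothetical object names throughout).
import snowflake.connector

STATEMENTS = [
    # Read-only analyst role with access to one reporting schema.
    "CREATE ROLE IF NOT EXISTS ANALYST_ROLE",
    "GRANT USAGE ON DATABASE ANALYTICS TO ROLE ANALYST_ROLE",
    "GRANT USAGE ON SCHEMA ANALYTICS.MART TO ROLE ANALYST_ROLE",
    "GRANT SELECT ON ALL TABLES IN SCHEMA ANALYTICS.MART TO ROLE ANALYST_ROLE",
    # Column-level masking: only PII_ADMIN sees raw email addresses.
    """
    CREATE MASKING POLICY IF NOT EXISTS EMAIL_MASK AS (val STRING) RETURNS STRING ->
        CASE WHEN CURRENT_ROLE() IN ('PII_ADMIN') THEN val ELSE '***MASKED***' END
    """,
    """
    ALTER TABLE ANALYTICS.MART.DIM_CUSTOMER
        MODIFY COLUMN EMAIL SET MASKING POLICY EMAIL_MASK
    """,
]

conn = snowflake.connector.connect(
    account="<account>", user="<user>", password="<password>",
    role="ACCOUNTADMIN", warehouse="ADMIN_WH", database="ANALYTICS", schema="MART",
)
try:
    cur = conn.cursor()
    for stmt in STATEMENTS:
        cur.execute(stmt)
finally:
    conn.close()
```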
Collaboration & Leadership:
Collaborate closely with data analysts, data scientists, business stakeholders, and other engineering teams to understand data requirements and deliver effective solutions. Mentor junior engineers, sharing expertise and promoting a culture of continuous learning. Contribute to the overall data strategy and roadmap, staying abreast of industry trends and emerging technologies.
Documentation:
Create and maintain comprehensive documentation for data models, architecture, ETL/ELT processes, and operational procedures.
Required Qualifications:
Experience: 7-10 years of hands-on experience with data warehousing, ETL/ELT development, and data platform management.
Snowflake Expertise: Minimum of 4-5 years of dedicated, hands-on experience with Snowflake data warehousing, including advanced SQL, stored procedures, UDFs, Snowpipe, Streams, Tasks, and External Functions (a brief Streams and Tasks sketch follows this list). Proven experience with Snowflake performance tuning and cost optimization techniques. Strong understanding of Snowflake's architecture, micro-partitions, clustering, and caching mechanisms.
Migration Experience: Extensive experience leading and executing data migration projects from at least two legacy platforms (e.g., Cloudera, Netezza, Teradata, Oracle Exadata, SQL Server) to Snowflake.
ETL/ELT Tools: Proficient with at least one major ETL/ELT tool (e.g., Matillion, DBT, Fivetran, Talend, Informatica, SSIS, Azure Data Factory, AWS Glue).
Programming: Strong proficiency in at least one scripting language (Python preferred) for data manipulation, automation, and API integration.
Cloud Platforms: Experience with at least one major public cloud provider (AWS, Azure, or GCP) and their relevant data services.
Data Modeling: Solid understanding of various data modeling techniques (e.g., Star Schema, Snowflake Schema, Data Vault).
SQL Mastery: Expert-level SQL skills for complex query writing, optimization, and data analysis.
Problem-Solving: Excellent analytical and problem-solving skills with a keen eye for detail.
Communication: Strong verbal and written communication skills with the ability to articulate complex technical concepts to non-technical stakeholders.
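As referenced in the Snowflake Expertise item above, the sketch below wires a stream on a raw landing table to a scheduled task that merges new rows into a reporting table, a common incremental-load pattern. All object names, the schedule, and the warehouse are hypothetical, and resuming the task assumes the role holds the EXECUTE TASK privilege.

```python
# Streams + Tasks sketch: capture changes on RAW_ORDERS and merge them into
# ORDERS every five minutes, but only when the stream actually has data.
import snowflake.connector

STATEMENTS = [
    # Change-tracking stream over the landing table.
    "CREATE STREAM IF NOT EXISTS RAW_ORDERS_STREAM ON TABLE RAW_ORDERS",
    # Scheduled task that drains the stream via MERGE.
    """
    CREATE TASK IF NOT EXISTS MERGE_ORDERS_TASK
        WAREHOUSE = ELT_WH
        SCHEDULE = '5 MINUTE'
        WHEN SYSTEM$STREAM_HAS_DATA('RAW_ORDERS_STREAM')
    AS
        MERGE INTO ORDERS t
        USING RAW_ORDERS_STREAM s
            ON t.ORDER_ID = s.ORDER_ID
        WHEN MATCHED THEN UPDATE SET t.STATUS = s.STATUS, t.AMOUNT = s.AMOUNT
        WHEN NOT MATCHED THEN INSERT (ORDER_ID, STATUS, AMOUNT)
            VALUES (s.ORDER_ID, s.STATUS, s.AMOUNT)
    """,
    # Tasks are created suspended; resume to start the schedule.
    "ALTER TASK MERGE_ORDERS_TASK RESUME",
]

conn = snowflake.connector.connect(
    account="<account>", user="<user>", password="<password>",
    role="SYSADMIN", warehouse="ELT_WH", database="ANALYTICS", schema="STAGING",
)
try:
    cur = conn.cursor()
    for stmt in STATEMENTS:
        cur.execute(stmt)
finally:
    conn.close()
```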
Preferred Qualifications:
Snowflake Certifications (e.g., SnowPro Core, SnowPro Advanced Architect).
Experience with data visualization tools (e.g., Tableau, Power BI, Looker).
Familiarity with CI/CD pipelines and DevOps practices for data solutions.
Experience with version control systems (e.g., Git).
Understanding of data lake concepts and integration with Snowflake.