NTT DATA, Inc.
Job Overview
NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now.
We are currently seeking a Snowflake Architect to join our team in Chicago, Illinois (US-IL), United States (US).
Responsibilities
Lead the end-to-end architecture and design of Snowflake-based data platforms on Azure, including integration with Azure services (ADF, Synapse pipelines, Azure Functions, Key Vault, ADLS, etc.).
Define and implement data modeling standards (star/snowflake schema, data vault, dimensional modeling) tailored for analytics, BI, and downstream data products.
Design secure, scalable, and cost-efficient Snowflake environments, including warehouses, databases, schemas, roles, resource monitors, and virtual warehouses.
Lead migration strategy and roadmap for moving data from legacy/on-prem systems to Snowflake on Azure.
Partner with stakeholders to assess the current state (source systems, ETL, reporting, data quality) and design the target-state architecture on Snowflake.
Define migration waves/phases, including data profiling, schema conversion, historical load, incremental load, and cutover strategy.
Oversee and implement data ingestion pipelines from various sources (databases, flat files, APIs, streaming) into ADLS landing zones and then into Snowflake, using tools such as Azure Data Factory, Synapse pipelines, or Databricks, with CDC where applicable (see the illustrative stream-and-task sketch after this list).
Manage data reconciliation and validation to ensure completeness, accuracy, and performance parity (or improvement) compared to legacy platforms.
Lead a team of data engineers / ETL developers delivering Snowflake-based solutions and migration workstreams.
Define and enforce coding standards, code review practices, and CI/CD pipelines for Snowflake objects (SQL, stored procedures, views, tasks, streams).
Design and build ELT/ETL patterns (staging → raw → curated → semantic layers) using tools such as dbt, ADF, Synapse, Databricks, or other orchestration tools.
Implement automated testing frameworks (unit tests, regression tests, data quality checks) and monitoring (SLAs).
Monitor query performance and optimize Snowflake workloads using query profiling, clustering, partitioning, and warehouse sizing strategies.
Implement resource monitors, auto-scaling, and auto-suspend policies to optimize compute usage and manage Snowflake consumption costs (see the illustrative warehouse and resource monitor sketch after this list).
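For illustration only, a minimal sketch of the stream-and-task incremental load pattern referenced above. The table names raw.orders and curated.orders, the etl_wh warehouse, and the column list are hypothetical placeholders, not part of this posting.

-- Capture row-level changes landed in the raw layer
CREATE OR REPLACE STREAM raw.orders_stream ON TABLE raw.orders;

-- Scheduled task that merges pending changes into the curated layer
CREATE OR REPLACE TASK curated.load_orders
  WAREHOUSE = etl_wh
  SCHEDULE = '15 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('raw.orders_stream')
AS
  MERGE INTO curated.orders t
  USING raw.orders_stream s
    ON t.order_id = s.order_id
  WHEN MATCHED THEN UPDATE SET t.status = s.status, t.updated_at = s.updated_at
  WHEN NOT MATCHED THEN INSERT (order_id, status, updated_at)
    VALUES (s.order_id, s.status, s.updated_at);

-- Tasks are created suspended; resume to start the schedule
ALTER TASK curated.load_orders RESUME;

In practice the MERGE body would be generated or templated (for example via dbt models) and promoted through the CI/CD pipeline described above.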
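Likewise, a minimal sketch of the cost controls referenced above, assuming a hypothetical analytics_wh warehouse and analytics_rm resource monitor (placeholder names and quota values, not from this posting):

-- Monthly credit budget with notify and hard-stop thresholds
CREATE RESOURCE MONITOR analytics_rm
  WITH CREDIT_QUOTA = 100
  FREQUENCY = MONTHLY
  START_TIMESTAMP = IMMEDIATELY
  TRIGGERS ON 80 PERCENT DO NOTIFY
           ON 100 PERCENT DO SUSPEND;

-- Right-sized warehouse with auto-suspend, auto-resume, and multi-cluster scaling
CREATE WAREHOUSE analytics_wh
  WITH WAREHOUSE_SIZE = 'MEDIUM'
  AUTO_SUSPEND = 60             -- suspend after 60 seconds of inactivity
  AUTO_RESUME = TRUE
  MIN_CLUSTER_COUNT = 1
  MAX_CLUSTER_COUNT = 3         -- scale out under concurrency (Enterprise edition and above)
  SCALING_POLICY = 'STANDARD'
  RESOURCE_MONITOR = analytics_rm;

The actual quotas, sizes, and scaling policies would be set per workload based on query profiling and consumption history.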
Qualifications
8+ years overall experience in Data Engineering / Data Warehousing / Analytics.
5+ years hands-on experience with Snowflake in production environments.
Proven experience leading at least one large end-to-end migration from on-prem / legacy DWH to Snowflake on Azure (Netezza, Yellowbrick, Oracle, SQL Server, etc.).
Strong experience with Azure cloud services: Azure Data Factory, Data Lake Storage (ADLS), Azure Databricks and/or Synapse, Key Vault, Azure DevOps or GitHub.
About NTT DATA
NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or protected veteran status.