Cygnus Professionals Inc.
Snowflake Data Engineer - W2 Only, No C2C - 8+YEARS
Cygnus Professionals Inc., Indianapolis, Indiana, us, 46262
We are seeking a Snowflake Data Engineer to design, build, and optimize scalable data solutions on the Snowflake Data Cloud. This role will support analytics, reporting, and AI/ML initiatives across commercial, manufacturing, R&D, and quality systems. The ideal candidate has strong expertise in cloud data engineering, ELT pipelines, and enterprise-grade data platforms within regulated environments.
Required Qualifications
Bachelor’s degree in Computer Science, Engineering, Data Science, or a related field
8+ years of hands‑on experience in data engineering with strong focus on Snowflake Data Warehouse design and management
Extensive experience designing, building, and managing enterprise‑scale Snowflake data warehouses
Strong hands‑on experience with Snowflake (SQL, Virtual Warehouses, Snowpipe, Streams, Tasks, Time Travel, Zero-Copy Cloning)
Proven expertise in Snowflake warehouse management, including sizing, multi‑cluster warehouses, workload isolation, concurrency scaling, and cost optimization
Proficiency in SQL and Python for data transformations and orchestration
Experience with cloud platforms: AWS, Azure, or GCP
Experience building robust ELT pipelines and working with structured and semi‑structured data (JSON, Parquet, Avro)
Strong knowledge of data modeling for data warehouses (star/snowflake schemas, dimensional modeling)
Experience implementing data governance, security, and access controls in Snowflake (RBAC, masking policies, row access policies)
Experience with Git-based version control and CI/CD pipelines
Key Responsibilities
Design, develop, and maintain scalable data pipelines using Snowflake as the core data platform
Build and optimize ELT workflows using tools such as dbt, Airflow, Matillion, or Fivetran
Implement data models (star/snowflake schemas) to support analytics, BI, and advanced analytics use cases
Optimize Snowflake performance and cost (warehouses, clustering, caching, resource monitors)
Integrate data from diverse sources: ERP (SAP), CRM (Salesforce), manufacturing systems, LIMS, IoT, and external data feeds
Ensure data quality, governance, lineage, and metadata management in compliance with regulatory standards (GxP, FDA, ISO)
Collaborate with data analysts, data scientists, product teams, and business stakeholders
Implement CI/CD, version control, and automated testing for data pipelines
Support data security, access controls, and compliance requirements
Participate in architecture reviews and contribute to enterprise data strategy