Verito Solutions
Data Warehouse Engineer | Remote | Minnetonka Mills, MN
Verito Solutions, Minnetonka Mills, Minnesota, United States
Phone / Skype: (detail omitted)
Top Skills / Must Have
ADLS Gen2 with Apache Iceberg
dbt on Snowflake
Python / PySpark (Snowflake / Microsoft Fabric-aligned)
Snowflake-based data warehousing and analytics engineering
Additional Required Experience
Data Warehouse / ETL development using Informatica
Data analysis experience (not limited to ETL development)
Strong data quality focus and metrics-driven mindset
Strong SQL skills and cloud data engineering experience (Azure preferred)
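The "metrics-driven mindset" called out above typically means expressing data-quality checks as measurable thresholds. A minimal sketch, assuming an in-memory SQLite database stands in for the warehouse; the `members` table and `plan_code` column are hypothetical examples, not part of the posting:

```python
# Minimal sketch of a metrics-driven data-quality check.
# SQLite stands in for Snowflake; table/column names are hypothetical.
import sqlite3

def null_rate(conn, table, column):
    """Fraction of rows where `column` is NULL (0.0 for an empty table)."""
    total = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    if total == 0:
        return 0.0
    nulls = conn.execute(
        f"SELECT COUNT(*) FROM {table} WHERE {column} IS NULL"
    ).fetchone()[0]
    return nulls / total

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE members (member_id INTEGER, plan_code TEXT)")
conn.executemany(
    "INSERT INTO members VALUES (?, ?)",
    [(1, "A"), (2, None), (3, "B"), (4, "A")],
)

rate = null_rate(conn, "members", "plan_code")
# Fail the pipeline when the metric breaches an agreed threshold.
assert rate <= 0.30, f"plan_code null rate {rate:.0%} exceeds threshold"
print(f"plan_code null rate: {rate:.0%}")  # prints "plan_code null rate: 25%"
```

In practice the same pattern runs as dbt tests or scheduled Snowflake queries, with thresholds tracked over time rather than hard-coded.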
Supporting / Preferred Tools
Kafka
GoldenGate
IDMC (Informatica Data Management Cloud)
Azure Data Factory
Airflow
Metadata-driven pipeline management
Environment / Tech Stack
Informatica
Snowflake
Azure (ADLS Gen2, Azure Data Factory, Microsoft Fabric)
Kafka, GoldenGate
dbt (Snowflake), IDMC
PySpark, Airflow
Metadata-driven pipeline management tools
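"Metadata-driven pipeline management" means pipeline steps are declared as data and dispatched by a generic runner, rather than hard-coded. A minimal sketch of the idea in plain Python; the step names, handlers, and table names are hypothetical stand-ins for what would be Kafka/GoldenGate sources and Snowflake targets:

```python
# Minimal sketch of a metadata-driven pipeline: the pipeline definition
# is data, and a small runner dispatches each step to a handler.
# Step names, handlers, and table names are hypothetical.
PIPELINE_METADATA = [
    {"step": "extract", "source": "claims_raw"},
    {"step": "transform", "rule": "dedupe"},
    {"step": "load", "target": "claims_curated"},
]

def extract(meta, rows):
    # A real extractor would read from Kafka, GoldenGate, or ADLS Gen2.
    return rows + [f"row from {meta['source']}"]

def transform(meta, rows):
    # Apply the rule named in metadata; only "dedupe" is sketched here.
    return sorted(set(rows)) if meta["rule"] == "dedupe" else rows

def load(meta, rows):
    # A real loader would write to Snowflake; here we just tag the rows.
    return [f"{r} -> {meta['target']}" for r in rows]

HANDLERS = {"extract": extract, "transform": transform, "load": load}

def run_pipeline(metadata):
    rows = []
    for meta in metadata:
        rows = HANDLERS[meta["step"]](meta, rows)
    return rows

print(run_pipeline(PIPELINE_METADATA))
```

Adding a new feed then means adding a metadata entry, not writing new orchestration code; tools like Azure Data Factory and Airflow support the same pattern with parameterized pipelines and DAG factories.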
Preferred Background
5 years of software development and database applications experience, focused on data strategy, modeling, integration, and architecture
Deep experience designing and building data warehouses, data marts, and ODS solutions
Strong SQL, query optimization, and RDBMS design (3NF and dimensional models)
End-to-end Snowflake implementation experience (RBAC, performance tuning, cloning, optimization)
Healthcare payer data experience (member, enrollment, claims, provider) preferred
Cloud platform experience with Azure and/or AWS
Nice to Have: CDMP or CBIP certification
Experience with ERwin, ER/Studio, or PowerDesigner
Informatica preferred (DataStage or SSIS acceptable)
Agile and/or ITIL certification
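The dimensional-modeling requirement above centers on the star schema: a fact table of measures joined to dimension tables of descriptive attributes. A minimal sketch, assuming SQLite as a stand-in RDBMS; the member/claims tables are hypothetical healthcare payer examples echoing the domains listed above:

```python
# Minimal star-schema sketch: one fact table joined to one dimension.
# SQLite stands in for the warehouse; tables are hypothetical examples.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_member (member_key INTEGER PRIMARY KEY, member_name TEXT);
CREATE TABLE fact_claims (
    claim_id    INTEGER PRIMARY KEY,
    member_key  INTEGER REFERENCES dim_member(member_key),
    paid_amount REAL
);
INSERT INTO dim_member VALUES (1, 'Alice'), (2, 'Bob');
INSERT INTO fact_claims VALUES (10, 1, 120.0), (11, 1, 80.0), (12, 2, 50.0);
""")

# Typical star-schema query: aggregate facts, described by the dimension.
rows = conn.execute("""
    SELECT d.member_name, SUM(f.paid_amount) AS total_paid
    FROM fact_claims f
    JOIN dim_member d USING (member_key)
    GROUP BY d.member_name
    ORDER BY d.member_name
""").fetchall()
print(rows)  # [('Alice', 200.0), ('Bob', 50.0)]
```

A 3NF model would normalize further (e.g. splitting member attributes across tables); the dimensional form trades that normalization for simpler, faster analytic joins.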