Eliassen Group
Principal Data Engineer, Snowflake, Java, Python, SQL
Eliassen Group, Smithfield, Rhode Island, us, 02917
Overview
2 weeks a month onsite in Rhode Island. No relocation or travel expenses will be provided.
Our direct client is seeking a Principal Data Engineer to join their Business Enablement Technology data squad in Smithfield, RI. The data infrastructure is going through significant growth and modernization, and this is an opportunity to take a leading role in shaping FI capability in this area. The work involves solution design, data analysis, innovation, and collaboration to maintain the operational and analytical capability of FI’s data platforms, along with testing, production rollout, and quality execution on project activities in data lakes using Snowflake, AWS, and Python.
Due to client requirements, applicants must be willing and able to work on a W2 basis. For our W2 consultants, we offer a great benefits package that includes medical, dental, and vision benefits, a 401(k) with company matching, and life insurance.
Rate: $65–$75/hr (W2)
Responsibilities
Data resides in Salesforce and Snowflake.
Data movement paths: Salesforce to Salesforce, Salesforce to Snowflake, and Salesforce to Oracle.
Experience Requirements
10+ years of experience with data warehousing and data mart concepts and implementations
4+ years of experience developing ELT/ETL pipelines to move data to and from the Snowflake data store
4+ years of experience using AWS services such as EC2, IAM, S3, EKS, KMS, SMS, CloudWatch, CloudFormation, etc.
4+ years of experience with object-oriented programming languages (strong programming skills in Python and Java required)
Passion for data analysis, with the ability to navigate and master complex transactional and warehouse databases
Hands‑on experience with SQL query optimization and performance tuning is required
Experience with job scheduling tools (Control‑M preferred)
Advanced SQL/SnowSQL knowledge is required
Strong data modeling skills with either Dimensional or Data Vault models
Experience with container technologies such as Docker and Kubernetes
Experience with DevOps, Continuous Integration and Continuous Delivery (Maven, Jenkins, Stash, Ansible, Docker)
Experience with Agile methodologies (Kanban and Scrum) is a plus
Proven track record of handling ambiguity and working in a fast‑paced environment
Good interpersonal skills for working with multiple teams across the organization
Education Requirements
Bachelor’s or Master’s degree in a technology-related field (e.g., Engineering, Computer Science) required, with 10+ years of experience