Vitaver & Associates

Snowflake Developer (onsite)

Vitaver & Associates, Tallahassee, Florida, US, 32318


14080 - Snowflake Developer (Onsite) - Tallahassee, FL

Start Date: ASAP

Type: Temporary Project

Estimated Duration: 12 months with possible extension

Work Setting: 100% of the time at the Client's site. No telecommuting or remote work. This is a non-negotiable requirement from the client.

Required:
• Experience with data engineering, analytics, or cloud data warehousing (3+ years);
• Hands-on experience designing and implementing solutions using the Snowflake Data Cloud platform (2+ years);
• Experience with SQL programming;
• Proven experience with Snowflake platform architecture and data warehousing concepts;
• Experience building efficient, secure, and scalable data models in Snowflake using views, materialized views, and secure shares;
• Experience with ELT/ETL patterns and tools (e.g., dbt, Airflow, Talend, Informatica, MS SSIS, Fivetran);
• Experience with data governance, security roles, masking policies, and RBAC within Snowflake (see the sketch after this list);
• Experience working with cloud storage integrations (e.g., AWS S3, Azure Blob) and external tables in Snowflake;
• Experience with dimensional modeling (Star/Snowflake schema), OLAP concepts, and reporting layers for BI tools;
• Experience with current data governance concepts and best practices;
• Experience with data migration best practices from external data sources and legacy systems (e.g., mainframe, DB2, MS SQL Server, Oracle) into Snowflake.
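As a purely illustrative sketch of the governance items above (masking policies, RBAC, and secure views), the Python snippet below issues Snowflake SQL through the snowflake-connector-python package. All object names (customers, pii_email_mask, customer_report_v, BI_READER) and connection parameters are hypothetical placeholders, not details from this posting.

import os
import snowflake.connector  # pip install snowflake-connector-python

# Connection parameters are placeholders; in practice they would come from a
# secrets manager rather than environment variables or literals.
conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    role="GOVERNANCE_ADMIN",   # hypothetical role with the needed privileges
    warehouse="DEV_WH",
    database="ANALYTICS",
    schema="REPORTING",
)

statements = [
    # Column-level masking: only a privileged role sees raw email addresses.
    """
    CREATE OR REPLACE MASKING POLICY pii_email_mask AS (val STRING)
      RETURNS STRING ->
      CASE WHEN CURRENT_ROLE() IN ('PII_FULL_ACCESS') THEN val
           ELSE '***MASKED***' END
    """,
    # Attach the policy to a (hypothetical) customer table.
    """
    ALTER TABLE customers MODIFY COLUMN email
      SET MASKING POLICY pii_email_mask
    """,
    # Secure view as the reporting-layer contract consumed by BI tools.
    """
    CREATE OR REPLACE SECURE VIEW customer_report_v AS
      SELECT customer_id, state, email
      FROM customers
    """,
    # RBAC: the BI role reads only through the secure view.
    "GRANT SELECT ON VIEW customer_report_v TO ROLE BI_READER",
]

try:
    cur = conn.cursor()
    for stmt in statements:
        cur.execute(stmt)
finally:
    conn.close()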

Preferred:
• Experience with data visualization tools (Power BI, Tableau, Looker) and with building BI semantic models using Snowflake as a backend;
• Experience working with financial, ERP, or general ledger data in a reporting or analytics capacity;
• Experience with mainframe systems, legacy flat files, and their integration with cloud-based platforms;
• Experience with Agile/SCRUM frameworks and working in iterative development cycles;
• Experience with Oracle Data Warehouse;
• Experience with DevOps and CI/CD practices in data engineering (e.g., Git, dbt Cloud, or GitHub Actions).

Responsibilities:
• Analyze the current data environment, including data sources, pipelines, and legacy structures, to determine required transformations and optimal migration strategies into Snowflake;
• Collaborate with stakeholders and data architects to design and implement scalable, secure, and cost-effective data architecture using Snowflake;
• Re-engineer legacy reporting logic (e.g., WebFOCUS, Mainframe FOCUS, and T-SQL) by translating it into Snowflake SQL and optimizing performance;
• Develop and automate ELT/ETL data pipelines using Snowflake's native features and tools such as Snowpipe, Streams, Tasks, and Informatica, along with integration with external orchestration tools (e.g., dbt, Airflow) (a sketch of the Streams/Tasks pattern follows this list);
• Partner with analysts and business users to build efficient, reusable data models and secure views within Snowflake that support downstream reporting (e.g., Power BI, Tableau, or Looker);
• Optimize query performance and data governance by implementing best practices in Snowflake for security, access control, caching, clustering, and cost monitoring;
• Support training, documentation, and knowledge transfer to internal teams, ensuring smooth adoption and use of Snowflake-based solutions.
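As a rough illustration of the Streams/Tasks pipeline pattern referenced above (not the client's actual design), the Python sketch below uses snowflake-connector-python to capture changes on a landing table with a stream and merge them into a reporting table on a schedule via a task. The table, warehouse, and task names (raw_orders, ETL_WH, load_orders_task, reporting.orders) are invented for the example.

import os
import snowflake.connector  # pip install snowflake-connector-python

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    role="SYSADMIN",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)

pipeline_sql = [
    # Change-data capture on the landing table (loaded by Snowpipe or another loader).
    "CREATE OR REPLACE STREAM orders_stream ON TABLE raw_orders",
    # Scheduled task that runs only when the stream has new rows and merges
    # them into the curated reporting table.
    """
    CREATE OR REPLACE TASK load_orders_task
      WAREHOUSE = ETL_WH
      SCHEDULE = '15 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
    AS
      MERGE INTO reporting.orders AS tgt
      USING orders_stream AS src
        ON tgt.order_id = src.order_id
      WHEN MATCHED THEN UPDATE SET tgt.amount = src.amount,
                                   tgt.updated_at = src.updated_at
      WHEN NOT MATCHED THEN INSERT (order_id, amount, updated_at)
        VALUES (src.order_id, src.amount, src.updated_at)
    """,
    # Tasks are created suspended; resume to start the schedule.
    "ALTER TASK load_orders_task RESUME",
]

try:
    cur = conn.cursor()
    for stmt in pipeline_sql:
        cur.execute(stmt)
finally:
    conn.close()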