Expedite Technology Solutions LLC

Data Engineer - Snowflake and Databricks

Expedite Technology Solutions LLC, Seattle, Washington, US 98127


Responsibilities

- Develop, optimize, and maintain data pipelines using Azure Data Factory (ADF), DBT Labs, Snowflake, and Databricks.
- Develop reusable jobs and a configuration-based integration framework to streamline development and scalability (see the sketch after this list).
- Manage data ingestion for structured and unstructured data (landing/lakehouse: ADLS; sources: ADLS, Salesforce, SharePoint document libraries; partner data: Client, IHME, WASDE, etc.).
- Implement and optimize ELT processes, source-to-target mapping, and transformation logic in DBT Labs, Azure Data Factory, Databricks notebooks, SnowSQL, etc.
- Collaborate with data scientists, analysts, data engineers, report developers, and infrastructure engineers for end-to-end support.
- Co-develop CI/CD best practices, automation, and deployment pipelines with infrastructure engineers using GitHub Actions.
- Bring automation to everything from source-to-target mappings to data pipelines and data lineage in Collibra.
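For illustration, a minimal sketch of the kind of configuration-driven ingestion job described above, assuming PySpark on Databricks with Delta tables and an ADLS Gen2 landing zone; every name, path, and table below is hypothetical:

    # Minimal sketch of a config-driven ingestion job (all names are hypothetical).
    from dataclasses import dataclass

    from pyspark.sql import SparkSession


    @dataclass
    class SourceConfig:
        """One entry in the integration framework's source catalog."""
        name: str          # logical source name, e.g. "salesforce_accounts"
        path: str          # ADLS Gen2 landing URI
        fmt: str           # file format: "parquet", "json", "csv", ...
        target_table: str  # lakehouse (Delta) table to load into


    def ingest(spark: SparkSession, cfg: SourceConfig) -> None:
        # Read raw landing data; a real framework would also apply per-source options.
        df = spark.read.format(cfg.fmt).load(cfg.path)
        # Append into the Delta table so the same job serves every configured source.
        df.write.format("delta").mode("append").saveAsTable(cfg.target_table)


    if __name__ == "__main__":
        spark = SparkSession.builder.getOrCreate()
        # Hypothetical config entry; production jobs would load these from a metadata store.
        cfg = SourceConfig(
            name="salesforce_accounts",
            path="abfss://landing@exampleaccount.dfs.core.windows.net/salesforce/accounts/",
            fmt="parquet",
            target_table="lakehouse.bronze.salesforce_accounts",
        )
        ingest(spark, cfg)

Keeping source definitions in configuration rather than code is what makes such a framework reusable: onboarding a new feed becomes a config change instead of a new pipeline.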

Required Experience

- Hands-on experience building pipelines with ADF, Snowflake, Databricks, and DBT Labs.
- Expertise in Azure Cloud with Databricks, Snowflake, and ADLS Gen2 integration.
- Data warehousing and lakehouse knowledge: proficiency with ELT processes (see the sketch after this list).
- Experience with Databricks Unity Catalog and data sharing technologies.
- Strong skills in CI/CD (Azure DevOps, GitHub Actions) and version control (GitHub).
- Strong cross-functional collaboration and technical support experience for data scientists, report developers, and analysts.
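As a hedged sketch of what ELT means in this stack: the transformation below is expressed with Snowflake's Snowpark Python API, so the work runs inside the Snowflake warehouse rather than on the client; the connection parameters and table names are placeholders, not part of the posting:

    # Hedged sketch of an ELT transformation via Snowflake's Snowpark Python API;
    # connection parameters and table names are placeholders.
    from snowflake.snowpark import Session
    from snowflake.snowpark.functions import col, sum as sum_

    # Placeholder credentials; real jobs would read these from a secrets manager.
    session = Session.builder.configs({
        "account": "example_account",
        "user": "example_user",
        "password": "example_password",
        "warehouse": "TRANSFORM_WH",
        "database": "ANALYTICS",
        "schema": "BRONZE",
    }).create()

    # Source-to-target transformation: roll raw orders up into a daily totals table.
    orders = session.table("BRONZE.RAW_ORDERS")
    daily_totals = (
        orders.filter(col("STATUS") == "COMPLETE")
              .group_by(col("ORDER_DATE"))
              .agg(sum_(col("AMOUNT")).alias("TOTAL_AMOUNT"))
    )
    # ELT: the aggregation executes inside Snowflake, not on the client machine.
    daily_totals.write.save_as_table("SILVER.DAILY_ORDER_TOTALS", mode="overwrite")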

Seniority level

Mid-Senior level

Employment type

Full-time

Job function

Other - IT Services and IT Consulting
