
ETL Developer

Gurgaon Portal, Snowflake, Arizona, United States, 85937


Role Overview: We are seeking a highly skilled and experienced ETL Developer with strong expertise in Snowflake and dbt (Jinja + SQL). The ideal candidate will be responsible for designing, building, and maintaining robust ETL pipelines and data workflows to support enterprise-level analytics and business intelligence initiatives. Secondary expertise in Python for automation and scripting will be an added advantage.

This role requires hands-on technical expertise, problem-solving skills, and the ability to collaborate with cross-functional teams to deliver scalable and efficient data solutions.

Key Responsibilities:

ETL Development & Data Integration:

Design, develop, and optimize ETL pipelines using Snowflake and dbt (Data Build Tool).

Implement transformations using Jinja templates and SQL models in dbt (a brief sketch follows this section).

Ensure data is cleansed, validated, and transformed accurately to meet business requirements.
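For illustration only, a dbt model combining Jinja templating with SQL might look like the sketch below. The model, source, and column names (stg_orders, raw.orders, order_status) are hypothetical assumptions, not part of this role description.

```sql
-- models/staging/stg_orders.sql (hypothetical model)
-- Jinja sets the materialization and expands one column per status value;
-- plain SQL performs the actual transformation.
{{ config(materialized='incremental', unique_key='order_id') }}

{% set statuses = ['placed', 'shipped', 'returned'] %}

select
    order_id,
    customer_id,
    {% for status in statuses %}
    sum(case when order_status = '{{ status }}' then 1 else 0 end) as {{ status }}_count,
    {% endfor %}
    max(updated_at) as updated_at
from {{ source('raw', 'orders') }}
{% if is_incremental() %}
  -- on incremental runs, only pick up rows newer than the last load
  where updated_at > (select max(updated_at) from {{ this }})
{% endif %}
group by order_id, customer_id
```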

Data Modeling & Architecture:

Work with stakeholders to design and implement star schema / snowflake schema and other optimized data models (a query sketch follows this section).

Support data warehousing initiatives and ensure alignment with data architecture best practices.
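As a sketch only: in a star schema, a single fact table joins to conformed dimensions on surrogate keys. All table and column names below are hypothetical.

```sql
-- Hypothetical star schema query: one fact table, three dimensions.
select
    d.calendar_date,
    c.customer_segment,
    p.product_category,
    sum(f.sales_amount) as total_sales
from fact_sales   f
join dim_date     d on f.date_key     = d.date_key
join dim_customer c on f.customer_key = c.customer_key
join dim_product  p on f.product_key  = p.product_key
group by d.calendar_date, c.customer_segment, p.product_category;
```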

Performance Optimization:

Monitor, troubleshoot, and optimize SQL queries, dbt models, and Snowflake pipelines for better efficiency.

Implement best practices for performance tuning, query optimization, and cost management in Snowflake.
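By way of example, three common Snowflake tuning and cost levers are shown below; the warehouse and table names are hypothetical.

```sql
-- Cost control: suspend an idle warehouse after 60 seconds.
alter warehouse transform_wh set auto_suspend = 60;

-- Performance: define a clustering key so the optimizer can prune
-- micro-partitions on a large, frequently filtered table.
alter table analytics.fact_sales cluster by (order_date);

-- Diagnosis: inspect the execution plan before running a query.
explain
select sum(sales_amount) from analytics.fact_sales
where order_date >= '2024-01-01';
```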

Automation & Scripting:

Leverage Python for automating ETL workflows, data quality checks, and operational tasks (a minimal automation sketch follows this section).

Integrate dbt with orchestration frameworks (e.g., Airflow, dbt Cloud, or equivalent).
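A minimal sketch of such automation, assuming the dbt CLI is installed and configured; the profile name and step order are illustrative assumptions, not requirements of the role.

```python
"""Minimal sketch: a Python wrapper that automates a dbt run plus tests."""
import logging
import subprocess
import sys

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(message)s")
log = logging.getLogger("etl")


def run(cmd: list[str]) -> None:
    """Run a shell command and fail the whole job on a non-zero exit."""
    log.info("running: %s", " ".join(cmd))
    result = subprocess.run(cmd)
    if result.returncode != 0:
        log.error("step failed: %s", " ".join(cmd))
        sys.exit(result.returncode)


if __name__ == "__main__":
    # Build the models, then run dbt's data-quality tests against them.
    run(["dbt", "run", "--profile", "analytics"])
    run(["dbt", "test", "--profile", "analytics"])
```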

Data Governance & Quality:

Implement data validation frameworks, audit checks, and reconciliation processes (an example dbt test follows this section).

Maintain documentation of ETL workflows, data models, and transformations for transparency and governance.
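One widely used validation pattern in dbt is a "singular" test: a SQL file under tests/ that selects rows violating a rule, and the test fails if any rows are returned. The reconciliation rule and object names below are hypothetical.

```sql
-- tests/assert_order_totals_reconcile.sql (hypothetical singular dbt test)
-- Returns rows where the transformed total disagrees with the source;
-- dbt fails the test if this query returns any rows.
select
    f.order_id,
    f.order_total,
    r.order_total as source_total
from {{ ref('fct_orders') }} f
join {{ source('raw', 'orders') }} r using (order_id)
where f.order_total <> r.order_total
```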

Collaboration & Support:

Partner with business analysts, data scientists, and BI developers to provide high-quality, reliable datasets.

Provide production support for ETL jobs, ensuring timely resolution of issues.

Required Skills (Mandatory):

Snowflake:

Advanced knowledge of Snowflake features (warehouses, schemas, cloning, micro-partitioning, Streams, Tasks); a Streams/Tasks sketch follows this section.

Experience in query optimization, performance tuning, and cost-effective scaling in Snowflake.
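For context, Streams and Tasks are often paired for incremental loads: the stream tracks changes on a table, and the task consumes them on a schedule. Object names below are hypothetical.

```sql
-- A stream records inserts, updates, and deletes on the source table.
create or replace stream orders_stream on table raw.orders;

-- A task runs on a schedule but only does work when the stream has data.
create or replace task load_orders_task
  warehouse = transform_wh
  schedule  = '5 minute'
when
  system$stream_has_data('ORDERS_STREAM')
as
  insert into analytics.orders_delta
  select order_id, order_status, updated_at
  from orders_stream;

-- Tasks are created suspended; resume to start the schedule.
alter task load_orders_task resume;
```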

dbt (Data Build Tool):

Strong experience in developing and maintaining dbt models.

Proficiency in Jinja templating and SQL transformations in dbt.

Knowledge of dbt testing frameworks and deployment practices.

SQL Expertise:

Advanced SQL programming skills for data transformation, analysis, and performance optimization.

Secondary Skills (Preferred):

Python:

Strong scripting and automation capabilities.

Experience in integrating Python scripts into ETL workflows.

Familiarity with data manipulation libraries (e.g., Pandas, PySpark).

Familiarity with cloud platforms (AWS/Azure/GCP) and orchestration tools like Airflow.

Knowledge of version control (Git/GitHub/GitLab) and CI/CD practices.

Exposure to data visualization tools (Tableau, Power BI) is a plus.
