Dice

Job Title: Data Engineering Support Lead
Location: Charlotte, NC (remote, EST hours preferred)
Type: Contract (W2 only; no benefits)
Experience Required: 12+ years

Job Overview:
We are seeking a Data Engineering Support Lead with deep experience in modern data stack tools and ETL/ELT pipelines. The ideal candidate will have a strong background in Snowflake, SQL, Airflow, dbt, Fivetran, and AWS, with hands-on troubleshooting and production support expertise.

Core Technical Skills (Required):
Snowflake: querying, monitoring query history, task scheduling, data validation
SQL: strong ability to write, debug, and optimize complex queries
Airflow: DAG dependencies, task retries, scheduling, manual triggers
Python: ability to read and modify scripts used in ETL jobs or Lambda functions
AWS Lambda: review logs and execution results for event-based jobs
ETL/ELT: strong understanding of data ingestion and transformation flows

Key Responsibilities:
Investigate Snowflake table/view refresh failures and review logs
Check and re-run Airflow DAGs or dbt models
Validate ingestion via Fivetran and confirm source loads using SQL
Review ETL/ELT job dependencies and manually trigger failed runs
Identify and resolve timing or sequencing issues in scheduled jobs

Preferred Background:
Strong analytical and troubleshooting mindset
Experience supporting data pipelines in production environments
Excellent communication and coordination skills across teams
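In practice, the "confirm source loads using SQL" responsibility often starts with a row-count reconciliation between a source extract and its ingested target. A minimal sketch, using an in-memory SQLite database as a stand-in for Snowflake; the table names are hypothetical placeholders, not from this posting:

```python
import sqlite3

def validate_ingestion(conn, source_table, target_table):
    """Compare row counts between a source extract and its ingested target.

    A mismatch is a typical first signal that a sync or downstream
    model needs to be re-run.
    """
    cur = conn.cursor()
    src = cur.execute(f"SELECT COUNT(*) FROM {source_table}").fetchone()[0]
    tgt = cur.execute(f"SELECT COUNT(*) FROM {target_table}").fetchone()[0]
    return {"source_rows": src, "target_rows": tgt, "match": src == tgt}

# Stand-in tables simulating a source load and its ingested copy.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_orders (id INTEGER);
    CREATE TABLE analytics_orders (id INTEGER);
    INSERT INTO raw_orders VALUES (1), (2), (3);
    INSERT INTO analytics_orders VALUES (1), (2);
""")
result = validate_ingestion(conn, "raw_orders", "analytics_orders")
print(result)  # one row missing from the target, so match is False
```

A failed check like this would typically lead to the other responsibilities listed above: reviewing Fivetran sync logs and re-running the affected Airflow DAG or dbt model.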