Insight Global

Overview

Insight Global is seeking a highly skilled Data Engineer to support one of our insurance clients in a dynamic hybrid environment. This is a 6-month contract opportunity with the potential for extension. The team has a gap in Python and Databricks expertise and is looking for candidates who can fill it. The ideal candidate will bring 5+ years of data engineering experience, with a strong focus on Python-based ETL development and Azure Databricks.

Base pay range: $50.00/hr - $60.00/hr

Responsibilities
- Design, build, and maintain scalable ETL pipelines using Python and Azure Data Factory
- Develop and optimize Databricks notebooks for data ingestion, transformation, and analytics
- Collaborate with data scientists, analysts, and business stakeholders to deliver clean, reliable datasets
- Implement data quality checks, monitoring, and governance practices
- Work with cloud-native tools to manage data lakes, warehouses, and real-time data streams
- Participate in Agile ceremonies and contribute to sprint planning and retrospectives

Required Skills
- 5+ years of hands-on experience in data engineering
- Strong proficiency in Python, especially for ETL workflows
- Deep experience with Azure Databricks (including PySpark, Delta Lake, and notebook orchestration)
- Solid understanding of Azure cloud services (Data Factory, Blob Storage, Synapse, etc.)
- Experience with SQL and data modeling
- Familiarity with CI/CD pipelines and version control (e.g., Git, Azure DevOps)

Preferred Qualifications
- Experience in the insurance or financial services industry
- Knowledge of data governance tools like Unity Catalog or Microsoft Purview
- Exposure to Apache Spark, Kafka, or Airflow

Seniority level: Mid-Senior level
Employment type: Contract
Job function: Information Technology
Industries: Staffing and Recruiting