Highbridge Consulting LLC
Our client is a leading global financial advisory and investment banking firm undertaking a strategic data transformation initiative. The project centers on building a secure, governed, and scalable data platform leveraging Azure Databricks, Unity Catalog, and Microsoft Fabric, with a strong emphasis on data governance, compliance, and analytics readiness.
Role Overview
We are seeking an experienced Azure Databricks Data Engineer (Contractor/Consultant) to design and implement modern ETL pipelines, data transformations, and Medallion architecture-based solutions. The consultant will work closely with the Head of Data Strategy, DevOps, and Information Security teams to operationalize a data strategy built on Azure Databricks and Unity Catalog, ensuring scalability, performance, and governance.
This is a hands-on delivery role, suited for a consultant who can contribute immediately and help shape the technical foundation of the program.
Key Responsibilities
- Architect, build, and optimize ETL/ELT data pipelines within Azure Databricks.
- Implement Unity Catalog for governance, data lineage, and secure access management.
- Develop advanced PySpark, Python, and SQL solutions for data processing and transformation.
- Apply Medallion Architecture (Bronze/Silver/Gold) to organize and curate data layers.
- Integrate Azure Data Lake and Microsoft Fabric for enterprise-wide data access and analytics.
- Collaborate with DevOps and InfoSec to ensure pipelines meet compliance and security requirements.
- Deliver documentation, operational guidelines, and knowledge transfer to internal teams.
- Support ongoing monitoring, performance tuning, and troubleshooting of pipelines.

Required Skills & Experience
- 5+ years of professional data engineering experience, with proven hands-on expertise in Azure Databricks (mandatory; AWS Databricks experience is not a substitute).
- Demonstrated experience with Unity Catalog setup and governance.
- Strong programming skills in Python, PySpark, and SQL for data transformation at scale.
- Solid experience in ETL/ELT workflows, orchestration, and pipeline optimization.
- In-depth knowledge of Medallion Architecture (Bronze/Silver/Gold).
- Experience with Azure Data Lake and Microsoft Fabric.
- Familiarity with DevOps practices, CI/CD pipelines, and cloud security principles.
- Strong communication skills and the ability to operate in cross-functional project teams.
- Prior experience in financial services or other regulated industries is a plus.