Macpower Digital Assets Edge
BFSI Azure Technical Lead (Lead Azure Data Engineer)
Macpower Digital Assets Edge, Oakland, California, United States, 94616
Job Overview:
We are looking for a skilled Azure Technical Lead (Lead Azure Data Engineer) with strong experience in building scalable data pipelines using Azure Data Factory and Azure Databricks. The ideal candidate will have a solid background in data engineering, particularly in migrating traditional ETL workloads to modern ELT architectures on the cloud. You will work directly with client stakeholders and must be comfortable operating in overlapping U.S. business hours.
Key Responsibilities:
- Design, develop, and maintain Azure Data Factory (ADF) pipelines and Azure Databricks (ADB) notebooks using PySpark.
- Migrate legacy RDBMS/ETL workloads into modern Data Lake/ELT architectures on Azure.
- Analyze, design, and implement scalable solutions involving Azure Data Lake and data marts.
- Collaborate directly with business and technical stakeholders to gather requirements and deliver solutions.
- Ensure best practices in performance, security, and DevOps for data pipeline development.
Must-Have Skills:
- Proven expertise in Azure Data Engineering tools: Azure Data Factory (ADF), Azure Databricks (ADB), Azure Data Lake Storage Gen2, Azure SQL DB, Azure DevOps, Key Vault, and Storage Accounts.
- Strong hands-on experience with PySpark and PL/SQL.
- Experience migrating traditional ETL workloads to cloud-based ELT/data lake architectures.
Desirable Skills:
- Experience with DataStage and SQL Server.
- Strong communication and collaboration skills.
- Ability to work in a client-facing role with overlapping U.S. time zone availability.