Hitachi Vantara Corporation
Senior Software Engineer (Python) - Azure & Databricks IRC278449
Hitachi Vantara Corporation, Romania, Pennsylvania, United States
Description
We are looking for an experienced, hands-on Senior Software Engineer with expertise in Azure-based big data platforms: a proven track record of designing, deploying, and managing large-scale data lake infrastructure with Databricks and IaC tools such as Bicep and Pulumi, and a strong background in networking, identity management, and cost optimization within enterprise-grade environments.
Platform overview
The data platform is a strategic cloud-native platform developed for a German OEM to standardize and accelerate the deployment of AI, analytics, and digital services across the enterprise. Built on Microsoft Azure, it provides a modular, scalable, and secure foundation for delivering DataOps, ML workflows, and self-service environments to internal teams and partners.
Key capabilities
Unified infrastructure provisioning across business units
Automation of cloud resource deployment using IaC (Pulumi/Bicep); see the sketch after this list
Integration of services like Azure ML, Databricks, OpenAI, AI Search, and Power BI/Fabric
Fine‑grained access control using Entra ID, Managed Identities, and Service Principals
Full support for multi‑environment compliance, secure networking, and cost transparency
Core delivery backbone for AI‑driven innovation, enabling rapid onboarding, operational consistency, and secure collaboration across teams
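As a concrete illustration of the IaC capability above, here is a minimal Pulumi (Python) sketch that provisions an ADLS Gen2 storage account; the resource names and SKU are illustrative assumptions, not the platform's actual module layout.

```python
import pulumi
from pulumi_azure_native import resources, storage

# Resource group for one business unit's environment (name is hypothetical).
rg = resources.ResourceGroup("dataplatform-dev-rg")

# ADLS Gen2 account: hierarchical namespace enabled, HTTPS-only traffic.
lake = storage.StorageAccount(
    "dataplatformdevsa",
    resource_group_name=rg.name,
    sku=storage.SkuArgs(name=storage.SkuName.STANDARD_LRS),
    kind=storage.Kind.STORAGE_V2,
    is_hns_enabled=True,                 # hierarchical namespace = ADLS Gen2
    enable_https_traffic_only=True,
)

pulumi.export("data_lake_account", lake.name)
```

In practice a module like this would be parameterized per environment and business unit and wired into the CI/CD pipeline described below.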
Requirements
5+ years in development/DevOps roles, including 3+ years on data platforms
Strong Azure knowledge: ADLS Gen2, Event Hub, Key Vault, AAD, Private Link
Databricks expertise: Unity Catalog, cluster policies, Repos, DBFS
IaC: Pulumi (Python); a brief example follows this list
Strong understanding of secure multi‑tenant architecture
CI/CD with GitHub Actions
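For the Pulumi (Python) requirement above, here is a minimal sketch, assuming the pulumi-azure-native provider, of how a VNet-injected Databricks workspace can be deployed; the config key, subnet names, and managed resource group naming are placeholders rather than the platform's real values.

```python
import pulumi
from pulumi_azure_native import databricks, resources

cfg = pulumi.Config()
vnet_id = cfg.require("vnetId")  # existing spoke VNet, assumed to be pre-provisioned

rg = resources.ResourceGroup("dbw-rg")

workspace = databricks.Workspace(
    "dataplatform-dbw",
    resource_group_name=rg.name,
    # Databricks requires a separate managed resource group; this name is illustrative.
    managed_resource_group_id=rg.id.apply(
        lambda rg_id: rg_id.rsplit("/", 1)[0] + "/dataplatform-dbw-managed"
    ),
    sku=databricks.SkuArgs(name="premium"),
    parameters=databricks.WorkspaceCustomParametersArgs(
        # VNet injection: cluster nodes land in our own subnets, with no public IPs.
        custom_virtual_network_id=databricks.WorkspaceCustomStringParameterArgs(value=vnet_id),
        custom_public_subnet_name=databricks.WorkspaceCustomStringParameterArgs(value="dbw-public"),
        custom_private_subnet_name=databricks.WorkspaceCustomStringParameterArgs(value="dbw-private"),
        enable_no_public_ip=databricks.WorkspaceCustomBooleanParameterArgs(value=True),
    ),
)

pulumi.export("workspace_url", workspace.workspace_url)
```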
Tech Stack
Azure, Databricks, Unity Catalog, Event Hub, Pulumi, REST APIs, ADX, Synapse, Power BI, Azure Monitor, Python/PowerShell/Bash
Job responsibilities
Own the design, deployment, and lifecycle management of Azure Databricks modules, Unity Catalog, networking, identity, governance, and cost‑optimization components, focusing on automation, security, and platform scalability
Design and deploy Databricks infrastructure (networking, clusters, VNet integration, security) using Python (FastAPI) and Pulumi
Implement Unity Catalog setup and workspace governance
Build Customer Managed Ingest (CMI) pipelines and automation modules
Implement billing/cost‑tracking telemetry for platform usage
Create whitelisting, role-assignment, and access-management APIs (a minimal example follows this list)
Work with multiple agile teams (SAFe) and participate in architectural workshops
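To give a flavour of the access-management APIs mentioned above, below is a hedged FastAPI sketch of a role-assignment endpoint; the request model and in-memory store are assumptions for illustration, and a real implementation would delegate to Azure RBAC / Entra ID rather than store assignments locally.

```python
from uuid import uuid4

from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="platform-access-api")


class RoleAssignmentRequest(BaseModel):
    principal_id: str  # Entra ID object ID of the user or service principal
    role_name: str     # e.g. "Storage Blob Data Reader"
    scope: str         # ARM resource ID the role applies to


# In-memory store standing in for real role assignments (example only).
_assignments: dict[str, RoleAssignmentRequest] = {}


@app.post("/role-assignments", status_code=201)
def create_role_assignment(req: RoleAssignmentRequest) -> dict:
    # Record the requested assignment and return its generated ID.
    assignment_id = str(uuid4())
    _assignments[assignment_id] = req
    return {"id": assignment_id, **req.model_dump()}


@app.get("/role-assignments/{assignment_id}")
def get_role_assignment(assignment_id: str) -> dict:
    if assignment_id not in _assignments:
        raise HTTPException(status_code=404, detail="assignment not found")
    return {"id": assignment_id, **_assignments[assignment_id].model_dump()}
```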
What we offer
Culture of caring.
At GlobalLogic, we prioritize a culture of caring…
Learning and development.
We are committed to your continuous learning and development…
Interesting & meaningful work.
GlobalLogic is known for engineering impact for and with clients around the world…
Balance and flexibility.
We believe in the importance of balance and flexibility…
High‑trust organization.
We are a high‑trust organization where integrity is key…
About GlobalLogic
GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner to the world's largest and most forward-thinking companies…