HonorVet Technologies
Backend Data Engineer
The mission of the Data & Analytics (D&A) team is to enable data users to easily discover, understand, and access trusted data products. A critical enabler of this mission is robust governance and automation within Databricks and Unity Catalog. The Senior Backend Engineer will design, build, and scale automation capabilities that enforce governance standards, improve data quality, and provide transparency into metadata, lineage, and usage. This role will ensure that the Metadata Catalog UI and supporting services are powered by trusted, well-governed, and observable data infrastructure.

Key Responsibilities

Databricks & Unity Catalog Engineering
- Build and maintain backend services leveraging Databricks (SQL, PySpark, Delta Lake, Jobs/Workflows).
- Administer Unity Catalog, including metadata, permissions, lineage, and tags.
- Integrate Unity Catalog APIs to surface data into the Metadata Catalog UI.

Governance Automation
- Develop automation scripts and pipelines to enforce access controls, tagging, and role-based policies.
- Implement governance workflows that integrate with tools such as ServiceNow for request and approval processes.
- Automate compliance checks for regulatory and security requirements (IAM, PII handling, encryption).

Data Quality & Observability
- Implement data quality frameworks (Great Expectations, Deequ, or equivalent) to validate datasets.
- Build monitoring and observability pipelines for logging, usage metrics, audit trails, and alerts.
- Ensure high system reliability and proactive issue detection.

API Development & Integration
- Design and implement APIs to integrate Databricks services with external platforms (ServiceNow, monitoring tools).
- Build reusable automation utilities and integration frameworks for governance at scale.

DevOps & CI/CD
- Manage source control and CI/CD pipelines (GitHub, Azure DevOps, Jenkins) for backend workflows.
- Deploy scalable and secure backend services in cloud environments (Azure preferred).
- Document, test, and industrialize automation solutions for production environments.

Profile

Core Skills
- Strong proficiency in Databricks (SQL, PySpark, Delta Lake, Jobs/Workflows).
- Deep knowledge of Unity Catalog administration and APIs.
- Expertise in Python for automation scripts, API integrations, and data quality checks.
- Experience with governance frameworks (access control, tagging enforcement, lineage, compliance).
- Solid foundation in security and compliance best practices (IAM, encryption, PII).

Automation & DevOps
- Experience with CI/CD and deployment pipelines (GitHub Actions, Azure DevOps, Jenkins).
- Familiarity with monitoring/observability tools and with building custom logging and alerting pipelines.
- Experience integrating with external systems (ServiceNow, monitoring platforms).

Additional Skills
- Experience with modern data quality frameworks (Great Expectations, Deequ, or equivalent).
- Strong problem-solving and debugging skills in distributed systems.
- Clear communication and documentation skills to collaborate across GT and D&A teams.

Education & Experience
- Bachelor's degree in Computer Science, Engineering, or a related field, OR equivalent professional experience.
- 5+ years of backend engineering experience in data platforms.
- 3+ years working with Databricks and/or Unity Catalog in enterprise environments.
- Demonstrated ability to design and deliver automation solutions for governance, quality, and compliance at scale.