TRIMAH TECHNOLOGIES LLC
W2 Contract Req: Databricks Platform Engineer (Cloud Data) (Local/Onsite/Columbus, OH)
TRIMAH TECHNOLOGIES LLC, Columbus, Ohio, United States, 43224
Job Title: Databricks Platform Engineer (Cloud Data)
Location: On-site in Columbus, OH - 3 days per week
Position Type: 6-12 Month Contract with possible extension
Job Type: W2 Contract
Visa: USC/GC
Job Description:
We are seeking a Databricks Platform Engineer to install, configure, and operationalize an enterprise Databricks Lakehouse platform. This role focuses on secure setup, governance, automation, and performance optimization to enable data engineering and analytics at scale.

Key Responsibilities:
- Install and configure Databricks workspaces, clusters, and storage integrations.
- Implement SSO/SCIM, RBAC, and cluster policies for secure operations (an illustrative policy sketch appears at the end of this posting).
- Set up Unity Catalog for fine-grained data governance and access control.
- Configure network security, including private endpoints, VNet/VPC peering, and firewall rules.
- Integrate logging and monitoring with enterprise SIEM platforms.
- Establish cost management controls and usage governance.
- Develop Infrastructure-as-Code with Terraform and CI/CD pipelines.
- Apply upgrades, troubleshoot platform issues, and document standards.

Required Skills & Qualifications:
- 5+ years of experience administering cloud data platforms.
- 2+ years of hands-on Databricks administration experience.
- Strong expertise with Unity Catalog, cluster policies, and governance.
- Deep understanding of Azure or AWS networking and security.
- Hands-on experience with Terraform and CI/CD pipelines.

Nice to Have:
- Databricks certifications.
- FinOps or cost optimization experience.
- Experience supporting migrations or proofs of concept.
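For context on the cluster policy work referenced above, the sketch below shows one minimal way such a policy might be created programmatically. It is illustrative only: the workspace URL, token handling, policy name, node types, and limits are assumptions, not part of this requisition, and in practice this role would typically manage policies through Terraform and CI/CD rather than ad hoc scripts.

```python
# Illustrative sketch only: create a Databricks cluster policy via the
# workspace REST API. Policy name, node types, and limits are example values.
import json
import os

import requests

host = os.environ["DATABRICKS_HOST"]    # assumed: workspace URL
token = os.environ["DATABRICKS_TOKEN"]  # assumed: personal access token

# Policy definition: pin the Spark runtime, cap auto-termination,
# and restrict node types to an approved allowlist.
policy_definition = {
    "spark_version": {"type": "fixed", "value": "14.3.x-scala2.12"},
    "autotermination_minutes": {"type": "range", "maxValue": 60, "defaultValue": 30},
    "node_type_id": {"type": "allowlist", "values": ["Standard_DS3_v2", "Standard_DS4_v2"]},
}

resp = requests.post(
    f"{host}/api/2.0/policies/clusters/create",
    headers={"Authorization": f"Bearer {token}"},
    json={"name": "standard-job-clusters", "definition": json.dumps(policy_definition)},
    timeout=30,
)
resp.raise_for_status()
print("Created policy:", resp.json()["policy_id"])
```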