Job Description

Technical Expertise:

Databricks Mastery: Deep, expert-level knowledge of the Databricks Platform, including:
- Unity Catalog: Designing and implementing data governance and security.
- Delta Lake & Delta Live Tables: Architecting and building reliable, scalable data pipelines.
- Performance & Cost Optimization: Expertise in tuning Spark jobs, optimizing cluster usage, and managing platform costs.
- MLOps: Strong, practical understanding of the machine learning lifecycle on Databricks using tools such as MLflow.
- Databricks SQL: Knowledge of designing and optimizing analytical workloads.
- Mosaic AI: Knowledge of designing and optimizing AI agents.

Cloud & Infrastructure: Deep knowledge of cloud architecture and services on AWS, with a strong command of Infrastructure as Code (Terraform, YAML).

Data Engineering & Programming: Strong background in data modeling, ETL/ELT development, and advanced, hands-on programming skills in Python and SQL.

CI/CD & Automation: Experience designing and implementing CI/CD pipelines (preferably with GitHub Actions) for data and ML workloads.

Observability: Familiarity with implementing monitoring, logging, and alerting for data platforms.

Automation: The platform is ephemeral, and all changes are implemented using Terraform and Python; expertise in both is a must.
ZipRecruiter