Utah Staffing

Principal Data Architect - AI

Utah Staffing, South Jordan, Utah, United States, 84095



We are seeking a highly experienced Principal Data Architect to lead the design and implementation of our next-generation cloud data platform, enabling advanced analytics and AI/ML solutions in the healthcare domain. This role will be responsible for end-to-end data architecture, including infrastructure, pipelines, and consumption layers, and will collaborate closely with product management and data science teams. The ideal candidate will have deep experience in modern data platforms, cloud services, and healthcare data, with a strong track record of delivering scalable, reusable data solutions. You will play a pivotal role in building a modern platform within Databricks, optimizing data workflows for reuse, and enabling Generative AI applications.

Responsibilities:

* Lead the architecture, design, and implementation of a cloud-native data platform supporting enterprise analytics and AI/ML.
* Own the overall data architecture, ensuring scalability, security, data quality, and interoperability.
* Design reusable data models (e.g., claim denorm layers) to minimize duplication and maximize efficiency across use cases.
* Collaborate with product managers to define the architecture for reusable, scalable data products.
* Develop robust data pipelines and infrastructure leveraging Databricks, AWS, and Azure.
* Architect feature stores and governance frameworks for ML solution development.
* Partner with Data Science and AI teams to ensure platform readiness for Generative AI and Agentic AI use cases.
* Establish and enforce data governance, security, and privacy best practices in a healthcare context.
* Lead the migration strategy from Cloudera to Databricks, ensuring minimal disruption and technical alignment.
* Mentor and guide engineering teams, promoting cloud best practices and reusable design patterns.
* Complete all responsibilities as outlined in the annual performance review and/or goal setting.
* Complete all special projects and other duties as assigned.
* Must be able to perform duties with or without reasonable accommodation.

This job description is intended to describe the general nature and level of work being performed and is not to be construed as an exhaustive list of the responsibilities, duties, and skills required. This job description does not constitute an employment agreement and is subject to change as the needs of Cotiviti and the requirements of the job change.

Qualifications:

* Bachelor's degree in Computer Science, Information Technology, or a related field, or equivalent experience.
* 10+ years in data engineering, cloud engineering, or data architecture, with increasing responsibility.
* Deep expertise with major cloud platforms (AWS, Azure, OCI) and modern data platforms (Databricks, Snowflake, Oracle).
* Experience architecting or developing data pipelines using Python, Spark, PySpark, or Scala.
* Solid understanding of and experience architecting systems with distributed and streaming capabilities, leveraging technologies such as Kubernetes, Spark Streaming, and Kafka.
* Experience architecting or leading teams building data platforms on Databricks is preferred.
* Proven experience architecting data and AI/ML solutions within the healthcare industry.
* Hands-on experience building scalable, reusable data pipelines, data products, and feature stores.
* Strong knowledge of data governance, security, and regulatory compliance (e.g., HIPAA) in the healthcare context.
* Exceptional interpersonal skills, including teamwork, facilitation, and negotiation.
* Excellent written and verbal communication skills.