Sabre
GCP Data Architect
Sabre is a technology company that powers the global travel industry. By leveraging next-generation technology, we create global technology solutions that take on the biggest opportunities and solve the most complex challenges in travel. Positioned at the center of the travel industry, we shape the future by offering innovative advancements that pave the way for a more connected and seamless ecosystem as we power mobile apps, online travel sites, airline and hotel reservation networks, travel agent terminals, and scores of other solutions. Simply put, we connect people with moments that matter.

Hospitality Solutions, formerly part of Sabre Holdings, is a global leader at the forefront of hospitality technology, powering over 40,000 properties across 174 countries. Celebrated for our innovative and customer-centric approach, we deliver integrated platforms for distribution, reservations, retailing, and guest experience to both renowned hotel brands and independent properties worldwide. With the strategic support of TPG, a leading private equity firm, we are entering an era of accelerated growth, digital transformation, and operational excellence as a focused, independent company. Building on our legacy of driving technological evolution in hospitality, we are committed to setting new standards for guest satisfaction and operational efficiency. Central to this transformation is the establishment of a world-class procurement function that supports global scale, fosters operational rigor, and enables value creation and innovation across the enterprise.

Hospitality Solutions is looking for an experienced GCP Data Architect. Working in the Hospitality Solutions Enterprise Architecture team gives you opportunities to develop your technical, business, and intercultural skills while cooperating with people from all over the world.

Role and Responsibilities

The GCP Data Architect is a senior technical leader responsible for designing, implementing, and overseeing scalable, secure, and cost-effective data architectures on Google Cloud Platform (GCP). This role blends strategic vision with hands-on engineering to modernize data platforms, prepare our organization for AI/ML at scale, and drive digital transformation.

Key Responsibilities

- Architect and implement end-to-end data solutions using GCP services such as BigQuery, Dataflow, Pub/Sub, and Cloud Storage (a minimal sketch follows this list).
- Design and build data lakes, lakehouses, and data mesh platforms that serve as a foundation for advanced analytics, generative AI, and machine learning initiatives.
- Lead data modernization efforts, migrating legacy systems to cloud-native solutions to unlock new analytical and AI capabilities.
- Develop robust data pipelines and MLOps platforms that enable the development, deployment, and management of machine learning models.
- Collaborate with business and technical stakeholders to define enterprise data strategies and reference architectures that support our AI and business intelligence roadmaps.
- Establish and enforce data governance, security, and compliance to ensure the integrity and responsible use of data for all business purposes, including AI.
- Mentor engineering teams on GCP best practices, data architecture patterns, and the principles of MLOps.
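Purely as an illustration of the first responsibility above, not as part of the role description: a minimal sketch of a streaming pipeline that reads JSON events from Pub/Sub and appends them to BigQuery using the Apache Beam Python SDK, which underpins Dataflow. Every project, topic, table, and schema name here is a hypothetical placeholder.

    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions


    def run():
        # streaming=True is required because Pub/Sub is an unbounded source.
        options = PipelineOptions(streaming=True)
        with beam.Pipeline(options=options) as pipeline:
            (
                pipeline
                # Hypothetical topic name for illustration only.
                | "ReadEvents" >> beam.io.ReadFromPubSub(
                    topic="projects/example-project/topics/booking-events")
                # Pub/Sub delivers raw bytes; decode and parse each message.
                | "ParseJson" >> beam.Map(lambda raw: json.loads(raw.decode("utf-8")))
                # Hypothetical table and schema; appends rows, creating the table if absent.
                | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                    table="example-project:hospitality.booking_events",
                    schema="booking_id:STRING,property_id:STRING,amount:FLOAT,event_ts:TIMESTAMP",
                    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                    create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                )
            )


    if __name__ == "__main__":
        run()

Run locally on Beam's DirectRunner for testing; submitting the same code to Dataflow is a matter of passing --runner=DataflowRunner along with the usual project, region, and temp_location options.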
Required Skills & Experience

- 6-8+ years of experience in data engineering, architecture, or analytics roles.
- Deep expertise in GCP services, particularly those essential for data and AI workloads (e.g., BigQuery, Dataflow, Vertex AI, Cloud Composer).
- Strong proficiency in Python, SQL, and data orchestration tools such as Airflow.
- Extensive experience with data modeling, distributed systems, and real-time processing.
- Hands-on experience with MLOps principles and tools for automating machine learning workflows.
- Familiarity with Infrastructure as Code (e.g., Terraform) to manage and scale data environments.
- Excellent communication and stakeholder engagement skills, with the ability to translate complex technical concepts into business value.
- QA awareness, including unit tests, TDD, and performance testing.

Preferred Qualifications

- Google Cloud Professional Data Engineer or Professional Cloud Architect certification.
- Proven experience with AI/ML platforms, Kubernetes, and data cataloging tools.
- Experience in a consulting or hybrid delivery environment (onshore/offshore).

Skills List

- GCP Core Services: BigQuery, Oracle, Dataflow, Cloud Storage, Pub/Sub, Cloud Composer, Vertex AI, Cloud Spanner, Cloud SQL, Cloud IAM.
- Programming Languages: Python, SQL, and optionally Java or Scala.
- Data Technologies: Data lakes, data warehouses, real-time data streaming, distributed systems, ETL/ELT.
- Architecture & Design: Data modeling, data mesh, data fabric, Lambda and Kappa architectures.
- MLOps & AI: Vertex AI, Kubeflow, ML model serving, and CI/CD for machine learning pipelines.
- Infrastructure as Code (IaC): Terraform, Deployment Manager.
- Data Governance: Data cataloging, data lineage, metadata management.
- Soft Skills: Strategic planning, technical leadership, mentoring, communication, and problem-solving.

Outstanding Benefits

- Very competitive compensation
- Generous Paid Time Off (25 PTO days)
- 4 days (one per quarter) of Volunteer Time Off (VTO)
- 5 days off annually for Year-End Break
- Comprehensive medical, dental, and Wellness Program
- 12 weeks of paid parental leave
- An infrastructure that allows flexible working arrangements
- Formal and informal reward, recognition, and acknowledgement programs
- Lots of fun and engaging employee development events

Reasonable Accommodation

Sabre is committed to working with and providing reasonable accommodation to applicants with disabilities. Applicants with a disability who require a reasonable accommodation for any part of the application or hiring process may contact Sabre at recruiting@careers.sabre.com. Determinations on requests for reasonable accommodation will be made on a case-by-case basis.

Affirmative Action

Sabre is an equal employment opportunity/affirmative action employer and is committed to providing employment opportunities to minorities, females, veterans, and disabled individuals.