Sabre Corporation
GCP Data Architect (Hospitality Solutions)
Sabre Corporation, Southlake, Texas, United States, 76092
Sabre Corporation is hiring for the GCP Data Architect (Hospitality Solutions) role. Hospitality Solutions, formerly part of Sabre Holdings, is a global leader in hospitality technology, powering over 40,000 properties across 174 countries. With the backing of TPG Capital, Hospitality Solutions operates as a focused, independent entity, and Sabre will employ individuals to support the Hospitality Solutions business.
Role and Responsibilities
The GCP Data Architect is a senior technical leader responsible for designing, implementing, and overseeing scalable, secure, and cost-effective data architectures on Google Cloud Platform (GCP). This role blends strategic vision with hands-on engineering to modernize data platforms, prepare the organization for AI/ML at scale, and drive digital transformation.
Architect and implement end-to-end data solutions using GCP services such as BigQuery, Dataflow, Pub/Sub, and Cloud Storage.
Design and build data lakes, lakehouses, and data mesh platforms that serve as a foundation for advanced analytics, generative AI, and machine learning initiatives.
Lead data modernization efforts, migrating legacy systems to cloud-native solutions to unlock new analytical and AI capabilities.
Develop robust data pipelines and MLOps platforms that enable the development, deployment, and management of machine learning models.
Collaborate with business and technical stakeholders to define enterprise data strategies and reference architectures that support AI and business intelligence roadmaps.
Establish and enforce data governance, security, and compliance to ensure data integrity and responsible use for all business purposes, including AI.
Mentor engineering teams on GCP best practices, data architecture patterns, and the principles of MLOps.
Required Skills & Experience
6–8+ years of experience in data engineering, architecture, or analytics roles.
Deep expertise in GCP services, particularly those essential for data and AI workloads (e.g., BigQuery, Dataflow, Vertex AI, Cloud Composer).
Strong proficiency in Python, SQL, and data orchestration tools like Airflow.
Extensive experience with data modeling, distributed systems, and real-time processing.
Hands-on experience with MLOps principles and tools for automating machine learning workflows.
Familiarity with Infrastructure as Code (e.g., Terraform) to manage and scale data environments.
Excellent communication and stakeholder engagement skills, with the ability to translate complex technical concepts into business value.
Quality assurance awareness, including unit testing, test-driven development (TDD), and performance testing.
Preferred Qualifications
Google Cloud Professional Data Engineer or Cloud Architect certification.
Proven experience with AI/ML platforms, Kubernetes, and data cataloging tools.
Experience in a consulting or hybrid delivery environment (onshore/offshore).
Skills
GCP Core Services: BigQuery, Dataflow, Pub/Sub, Cloud Storage, Vertex AI, Cloud Composer, Cloud Spanner, Cloud SQL, Cloud IAM.
Programming Languages: Python, SQL, and optionally Java or Scala.
Data Technologies: Data lakes, data warehouses, real-time data streaming, distributed systems, ETL/ELT.
Architecture & Design: Data modeling, data mesh, data fabric, Lambda and Kappa architectures.
MLOps & AI: Vertex AI, Kubeflow, ML model serving, and CI/CD for machine learning pipelines.
Infrastructure as Code (IaC): Terraform, Deployment Manager.
Data Governance: Data cataloging, data lineage, metadata management.
Soft Skills: Strategic planning, technical leadership, mentoring, communication, and problem-solving.
Benefits
Very competitive compensation
Generous Paid Time Off (25 PTO days)
4 days Volunteer Time Off (VTO)
5 days off annually for Year-End Break
Comprehensive medical, dental, and wellness programs
12 weeks paid parental leave
Flexible working arrangements
Reward, recognition and development programs
Seniority and Employment
Seniority level: Mid-Senior level
Employment type: Full-time
Job function: Engineering and Information Technology
Industries: Technology, Information and Internet
Dallas, TX: $160,000.00–$180,000.00
Note: Referrals increase your chances of interviewing at Sabre Corporation.