Gravity IT Resources

Data Architect

Gravity IT Resources, Southlake, Texas, United States, 76092


Job Title: GCP Data Architect | Location: Remote / CST hours | Type: Direct Hire

Overview

Our client is growing rapidly, expanding both its product and geographic footprint. They offer a comprehensive suite of best-in-class products that empower hotels and hotel chains to better market and sell their products, as well as manage the guest experience. Working on the Data Architecture team gives you opportunities to develop your technical, business, and intercultural skills while cooperating with people from all over the world.

Key Responsibilities

Architect and implement end-to-end data solutions using GCP services like BigQuery, Dataflow, Pub/Sub, and Cloud Storage.

Design and build data lakes, lakehouses, and data mesh platforms that serve as a foundation for advanced analytics, generative AI, and machine learning initiatives.

Lead data modernization efforts, migrating legacy systems to cloud-native solutions to unlock new analytical and AI capabilities.

Develop robust data pipelines and MLOps platforms that enable the development, deployment, and management of machine learning models.

Collaborate with business and technical stakeholders to define enterprise data strategies and reference architectures that support our AI and business intelligence roadmaps.

Establish and enforce data governance, security, and compliance to ensure the integrity and responsible use of data for all business purposes, including AI.

Mentor engineering teams on GCP best practices, data architecture patterns, and the principles of MLOps.

Required Skills & Experience

6+ years of experience in data engineering, architecture, or analytics roles.

Deep expertise in GCP services, particularly those essential for data and AI workloads (e.g., BigQuery, Dataflow, Vertex AI, Cloud Composer).

Strong proficiency in Python, SQL, and data orchestration tools like Airflow.

Extensive experience with data modeling, distributed systems, and real-time processing.

Hands-on experience with MLOps principles and tools for automating machine learning workflows.

Familiarity with Infrastructure as Code (e.g., Terraform) to manage and scale data environments.

Excellent communication and stakeholder engagement skills, with the ability to translate complex technical concepts into business value.

QA awareness, including unit tests, TDD, and performance testing.

Preferred Qualifications

Google Cloud Professional Data Engineer or Cloud Architect certification.

Proven experience with AI/ML platforms, Kubernetes, and data cataloging tools.

Experience in a consulting or hybrid delivery environment (onshore/offshore).

Skills List

GCP Core Services: BigQuery, Dataflow, Cloud Storage, Pub/Sub, Cloud Composer, Vertex AI, Cloud Spanner, Cloud SQL, Cloud IAM.

Programming Languages: Python, SQL, and optionally Java or Scala.

Data Technologies: Data lakes, data warehouses, real-time data streaming, distributed systems, ETL/ELT.

Architecture & Design: Data modeling, data mesh, data fabric, Lambda and Kappa architectures.

MLOps & AI: Vertex AI, Kubeflow, ML model serving, and CI/CD for machine learning pipelines.

Infrastructure as Code (IaC): Terraform, Deployment Manager.

Data Governance: Data cataloging, data lineage, metadata management.

Soft Skills: Strategic planning, technical leadership, mentoring, communication, and problem-solving.

Equal Employment Opportunity Statement

Gravity IT Resources is an Equal Opportunity Employer. We are committed to creating an inclusive environment for all employees and applicants. We do not discriminate on the basis of race, color, religion, sex (including pregnancy, sexual orientation, or gender identity), national origin, age, disability, genetic information, veteran status, or any other legally protected characteristic. All employment decisions are based on qualifications, merit, and business needs.
