Galent
US & Canada IT Recruitment Specialist
Location: Phoenix, AZ (Onsite/Hybrid preferred)
Type: Contract

Role Overview
We are seeking a highly hands-on GCP Big Data Architect / Senior Engineer to help design and implement the foundational data architecture for our enterprise. The ideal candidate will be a GCP-certified Data Engineer with deep expertise in data ingestion, modeling, and migration, capable of turning complex business problems into scalable cloud data solutions.

Key Responsibilities
- Lead the design and development of data domains and data models within the GCP ecosystem.
- Build and optimize data ingestion pipelines from diverse data sources.
- Drive the GCP data migration strategy, ensuring scalability, performance, and cost optimization.
- Collaborate closely with directors and cross-functional teams to translate problem statements into executable technical plans.
- Serve as a hands-on technical lead, mentoring junior engineers and ensuring best practices in data architecture and engineering.

Required Skills & Experience
- GCP certification (Data Engineer or Architect) is mandatory.
- Proven experience building large-scale data platforms and ETL/ELT pipelines in GCP.
- Strong hands-on experience with GCP services such as BigQuery, Dataflow, Pub/Sub, Cloud Storage, Dataproc, and Composer.
- Deep understanding of data modeling, data governance, and data quality frameworks.
- Experience leading or architecting GCP migration projects end to end.
- Excellent problem-solving and communication skills with a proactive, execution-oriented mindset.

Seniority level: Executive
Employment type: Contract
Job function: Information Technology
Industries: Software Development