New York Technology Partners
Role Overview
Solution Architect, Utilities, Google Cloud Platform – a 6‑month+ contract based in New York City (Tri‑State area). The role focuses on designing and implementing cloud‑based data and analytics platforms for Energy & Utilities clients.
What They Are Looking For
Highly experienced GCP Solution Architect with deep expertise in cloud architecture, data platforms and utilities domain compliance.
Strong leadership skills and the ability to guide migrations.
Hands‑on experience with GCP services and modern data solutions.
Key Skills
Google Cloud Platform – architecture, migration strategies, BigQuery, Vertex AI, Dataflow, Pub/Sub
Cloud Data & Analytics – batch/streaming pipelines, data warehouses, AI/ML integration
Azure & Databricks Knowledge – evaluate trade‑offs and complement GCP solutions
Regulatory Compliance for Utilities – NERC CIP, security standards, governance
Responsibilities
Client Delivery
Serve as SME and solution architect on GCP, guiding stakeholders through design decisions and migration strategies.
Compare Azure, AWS, and GCP capabilities (cost, performance, compliance, analytics/AI) to support decision‑making.
Lead architecture, design and migration approach for workloads into GCP, including hybrid and multi‑cloud scenarios.
Lead greenfield builds of modern data platforms that enable analytics, AI and agentic solutions on GCP.
Understand the overlap between Databricks and Google Cloud solutions.
Lead platform cost modeling and identify cost-saving opportunities.
Collaborate with data engineers, developers and business SMEs to align architecture with business goals and regulatory requirements.
Provide technical guidance on BigQuery, Bigtable, Dataflow, Pub/Sub, Looker, Vertex AI, security, networking and governance.
Act as trusted advisor to client leadership, driving alignment across business and IT stakeholders.
Practice Development
Contribute to the Energy & Utilities practice by developing methodologies, accelerators and best practices for GCP adoption.
Promote thought leadership around cloud modernization, data platforms and AI/ML use cases enabled by GCP.
Mentor junior team members and foster a collaborative, learning‑oriented culture.
Business Development
Partner with account teams to shape business needs into technical roadmaps, proposals and solutions.
Contribute to client pursuits as a GCP SME, including architecture options, estimates and risk assessments.
Build and maintain a professional network in the utilities and cloud ecosystems.
Requirements
Bachelor’s degree in Computer Science, Information Systems, Engineering or related field (advanced degrees a plus).
8+ years of professional experience, including at least 3 years designing and implementing cloud solutions on GCP.
Strong understanding of cloud data and analytics architectures (batch/streaming pipelines, BigQuery, Vertex AI).
Experience with Azure cloud services and the Databricks platform.
Proven success leading large‑scale cloud migrations or platform builds.
Familiarity with regulatory/compliance frameworks relevant to utilities (NERC CIP, data residency, security standards).
At least one Google Cloud Professional certification (Data Engineer, Cloud Database Engineer, Cloud Architect, Machine Learning Engineer).
Strong communication skills and the ability to influence technical and business stakeholders.
Ability to document technical architectures and explain approaches and rationales.
Demonstrated leadership and mentoring capabilities.
Willingness to travel to client sites 50–75% of the time.
Nice to Have
Experience working with large regulated utilities (NERC CIP, PSC, compliance requirements).
Databricks certification (Data Engineer Associate/Professional, Machine Learning Associate/Professional).
Employment Details
Seniority level: Mid‑Senior
Employment type: Contract
Job function: Information Technology