Nexwave

GCP Cloud Data Architect (BigQuery)

Nexwave, Trenton, New Jersey, United States

Duration: 12 months

The GCP Cloud Data Architect will serve as the key technical leader responsible for the end-to-end data architecture, design, and governance of the Google Cloud BigQuery platform, ensuring scalable, secure, and cost-effective analytics solutions.

Skills & Certifications:

GCP Professional Data Engineer

Experience: 12-16 years

Job Description:

Architectural Strategy: Design and implement a scalable, cost-optimized BigQuery architecture, leveraging advanced features such as partitioning, clustering, and materialized views to maximize performance and minimize costs
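
For illustration only, a minimal sketch of the kind of table definition this responsibility implies, assuming the google-cloud-bigquery Python client; the project, dataset, table, and column names are placeholders, not part of this posting:

```python
# Illustrative sketch: a date-partitioned, clustered BigQuery table.
# "my-project.analytics.events" and its columns are hypothetical names.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

schema = [
    bigquery.SchemaField("event_date", "DATE"),
    bigquery.SchemaField("customer_id", "STRING"),
    bigquery.SchemaField("revenue", "NUMERIC"),
]

table = bigquery.Table("my-project.analytics.events", schema=schema)

# Daily partitioning on event_date bounds each query's scan to the dates it touches.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="event_date",
)

# Clustering on customer_id co-locates related rows within each partition.
table.clustering_fields = ["customer_id"]

table = client.create_table(table)
print(f"Created {table.full_table_id}")
```

Partitioning on the common filter column and clustering on a high-cardinality key is a widely used pattern for keeping on-demand scan costs proportional to the data a query actually needs.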

Data Modeling: Develop robust enterprise data models and semantic layers to support comprehensive analytics, reporting, business intelligence, marketing campaign optimization, and advanced data science use cases

Business Engagement: Act as a trusted advisor to executives and business teams, translating business objectives into technical solutions that drive measurable outcomes

Data Management: Define and implement master and reference data strategies to ensure data consistency, accuracy, and governance across multiple domains

Data Quality Automation: Lead initiatives to automate data quality assurance by creating business transformation rules, automated checks, and reconciliation frameworks to uphold high data integrity standards
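
As one hedged example of such an automated check, the sketch below reconciles row counts between a staging table and its curated counterpart, again assuming the google-cloud-bigquery Python client and placeholder table names:

```python
# Illustrative reconciliation check: compare row counts between two tables.
# The table IDs are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client()

def row_count(table_id: str) -> int:
    """Return the row count of a table via a simple aggregate query."""
    query = f"SELECT COUNT(*) AS n FROM `{table_id}`"
    result = client.query(query).result()
    return next(iter(result)).n

source = row_count("my-project.staging.orders")
target = row_count("my-project.curated.orders")

if source != target:
    # In a real framework this would raise an alert or fail the pipeline run.
    raise ValueError(f"Reconciliation failed: staging={source}, curated={target}")
print(f"Reconciliation passed: {source} rows in both tables")
```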

Testing & Validation: Collaborate with QA teams to review and validate test cases, ensuring full coverage of business use cases and requirements, and rigorously assess data quality results

Integration & Deployment: Coordinate across multiple teams to manage system dependencies, validate integration points, and support seamless system deployment activities

User Acceptance Testing (UAT) & Production Rollouts: Work closely with business and IT stakeholders to coordinate UAT signoffs and oversee production cutovers, ensuring smooth deployment with well‑defined fallback and contingency plans

Governance & Security: Establish and enforce data governance standards including IAM role configurations, encryption policies, metadata management, and data lineage tracking
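
A minimal sketch of dataset-level access control with the google-cloud-bigquery Python client follows; the dataset ID, group address, and granted role are illustrative assumptions:

```python
# Illustrative sketch: grant read-only dataset access to an analyst group.
# "my-project.analytics" and the group email are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()

dataset = client.get_dataset("my-project.analytics")

# Preserve existing access entries and append a new read-only grant.
entries = list(dataset.access_entries)
entries.append(
    bigquery.AccessEntry(
        role="READER",
        entity_type="groupByEmail",
        entity_id="analysts@example.com",
    )
)
dataset.access_entries = entries

# Update only the access list, leaving other dataset metadata untouched.
dataset = client.update_dataset(dataset, ["access_entries"])
print(f"{len(dataset.access_entries)} access entries on {dataset.dataset_id}")
```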

Cost & Performance Optimization: Continuously monitor BigQuery usage and performance metrics, and recommend and implement query optimization strategies and storage lifecycle policies that balance performance with cost efficiency
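
For example, a dry run can estimate bytes scanned before a query is executed, which supports this kind of cost monitoring; the sketch below assumes the google-cloud-bigquery Python client, and the SQL and table name are placeholders:

```python
# Illustrative sketch: estimate the bytes a query would scan via a dry run.
# No bytes are billed and no results are returned on a dry run.
from google.cloud import bigquery

client = bigquery.Client()

sql = """
    SELECT customer_id, SUM(revenue) AS total_revenue
    FROM `my-project.analytics.events`
    WHERE event_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
    GROUP BY customer_id
"""

job_config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
job = client.query(sql, job_config=job_config)

gib = job.total_bytes_processed / 1024 ** 3
print(f"Query would scan {gib:.2f} GiB")
```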

Deliverables:

Enterprise Data Architecture Blueprint covering the full data lifecycle, from ingestion through to consumption

Comprehensive Data Models reflecting enterprise needs and governance standards

Integration & Deployment Plans with dependency tracking

UAT and Production Rollout Strategy aligned with business and IT needs

Regards,

Lead Talent Acquisition Specialist

Seniority level: Mid-Senior level

Employment type: Contract

Job function: Consulting
