Pyromis

GCP Technical Architect

Pyromis, San Juan Capistrano


Position Details: GCP Technical Architect

Location: San Juan Capistrano, California

Openings: 2

Salary Range:

Description:

Job Title: GCP Technical Architect

Location: Charlotte, NC (Onsite)

Position Type: Contract

Responsibilities:

  • 3+ years of overall experience architecting, developing, testing, and implementing big data projects using GCP components (e.g., BigQuery, Composer, Dataflow, Dataproc, DLP, Bigtable, Pub/Sub, Cloud Functions, etc.).
  • 4+ years of experience with data management strategy formulation, architectural blueprinting, and effort estimation.
  • Good understanding of Teradata/Hadoop data warehouses.
  • Advocate engineering and design best practices including design patterns, code reviews, and automation (e.g., CI/CD, test automation).
  • Cloud capacity planning and cost-based analysis.
  • Experience working with large datasets and solving difficult analytical problems.
  • Experience with regulatory and compliance work in data management.
  • Tackle design and architectural challenges such as performance, scalability, and reusability.
  • End-to-end data engineering and lifecycle management (including non-functional requirements and operations).
  • Work with client teams to design and implement modern, scalable data solutions using a range of new and emerging technologies from the Google Cloud Platform.
  • Good understanding of data pipeline design and data governance concepts.
  • Experience deploying code from lower environments to production.
  • Good communication skills for understanding business requirements.

Required Skills and Abilities:

  • Mandatory Skills: BigQuery, Composer, Python/Java, GCP Fundamentals, Teradata/Hadoop.
  • Secondary Skills: Ab Initio, Dataproc, Kubernetes, DLP, Pub/Sub, Dataflow, Shell Scripting, SQL, Security (Platform & Data) concepts.
  • Expertise in Data Modeling.
  • Detailed knowledge of Data Lake and Enterprise Data Warehouse principles.
  • Expertise in ETL migration from on-premises systems to GCP.
  • Familiarity with Hadoop ecosystems, HBase, Hive, Spark, or emerging data mesh patterns.
  • Ability to communicate with customers, developers, and other stakeholders.
  • Good To Have: Certifications in any of the following: GCP Professional Cloud Architect, GCP Professional Data Engineer.