TestingXperts
GCP Data Engineer (Remote) - Fulltime
TestingXperts, Washington, District of Columbia, us, 20022
Key Responsibilities
Design, develop, and optimize end-to-end data pipelines using GCP services such as BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Composer, and Cloud Storage (an illustrative pipeline sketch follows this list).
Build and maintain ETL/ELT processes for structured and unstructured data.
Develop data ingestion frameworks, real-time streaming pipelines, and batch workflows.
Create data models, partition strategies, and performance-optimized schemas in BigQuery.
Implement CI/CD pipelines for data workflows using Git, Cloud Build, Jenkins, or similar tools.
Collaborate with data analysts, data scientists, and business teams to understand requirements and deliver high-quality datasets.
Implement best practices in data quality, data governance, monitoring, and security.
Troubleshoot production issues and optimize performance, cost, and reliability.
Document technical designs and data flows, and build reusable components.
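The pipeline work described above is the core of the role. Purely as an illustration of that kind of workflow, here is a minimal sketch of a streaming Dataflow (Apache Beam, Python) pipeline that reads JSON events from Pub/Sub and appends them to BigQuery; the project ID, bucket, subscription, table, and schema are hypothetical placeholders, not details from this posting.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions


def parse_event(message: bytes) -> dict:
    # Decode a Pub/Sub payload into a dict whose keys match the BigQuery schema.
    return json.loads(message.decode("utf-8"))


def run() -> None:
    # All identifiers below (project, region, bucket, subscription, table) are hypothetical.
    options = PipelineOptions(
        project="my-gcp-project",
        region="us-central1",
        temp_location="gs://my-staging-bucket/tmp",
        runner="DataflowRunner",
    )
    options.view_as(StandardOptions).streaming = True

    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                subscription="projects/my-gcp-project/subscriptions/events-sub")
            | "ParseJson" >> beam.Map(parse_event)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                table="my-gcp-project:analytics.events",
                schema="event_id:STRING,event_ts:TIMESTAMP,payload:STRING",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()
```

Swapping the runner for "DirectRunner" is a common way to test such a pipeline locally before deploying it to Dataflow.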
Required Skills & Qualifications
Strong hands‑on experience with the GCP ecosystem, including:
BigQuery
Dataflow / Apache Beam
Dataproc / Hadoop / Spark
Pub/Sub
Cloud Composer / Airflow
Cloud Storage
Proficiency in SQL and Python for data engineering workflows.
Experience with ETL/ELT development, data modeling, and data warehousing concepts (a BigQuery partitioning sketch follows this list).
Strong understanding of distributed systems, performance tuning, and large‑scale data processing.
Experience with CI/CD, version control (Git), and automated workflow orchestration.
Familiarity with Terraform, Helm, or other IaC tools (preferred).
Knowledge of API integrations, REST, and JSON handling.
Strong problem‑solving and communication skills.
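As a concrete illustration of the data modeling and BigQuery partitioning items above, the sketch below creates a day-partitioned, clustered table with the google-cloud-bigquery Python client; the project, dataset, table, and column names are hypothetical.

```python
from google.cloud import bigquery

# Hypothetical project/dataset/table names, used only for illustration.
client = bigquery.Client(project="my-gcp-project")

schema = [
    bigquery.SchemaField("event_id", "STRING", mode="REQUIRED"),
    bigquery.SchemaField("user_id", "STRING"),
    bigquery.SchemaField("event_ts", "TIMESTAMP", mode="REQUIRED"),
    bigquery.SchemaField("payload", "STRING"),
]

table = bigquery.Table("my-gcp-project.analytics.events", schema=schema)

# Partition by event day and cluster by user_id so typical date/user filters
# prune partitions and keep scan costs predictable.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="event_ts",
)
table.clustering_fields = ["user_id"]

table = client.create_table(table, exists_ok=True)
print(f"Created or found {table.full_table_id}")
```

Partitioning on the event timestamp plus clustering on a high-cardinality filter column is one common starting point; the right strategy depends on query patterns and data volume.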
Seniority level
Entry level
Employment type
Full‑time
Job function
Information Technology
Industries
IT Services and IT Consulting