Infinite Computer Solutions

Data Platform Engineer (GCP Focus)

Infinite Computer Solutions, Campus, Illinois, US 60920


Key Responsibilities

Design, build, and automate scalable, production-grade data pipelines on GCP using core services such as Cloud Composer (Airflow), Dataflow (Apache Beam), and BigQuery.

Develop and implement continuous integration and continuous deployment (CI/CD) workflows for data processing and analytics pipelines using tools like Cloud Build, GitHub Actions, or Jenkins.

Implement reusable data frameworks, templates, and libraries to standardize pipeline deployments, configuration management, and promote "Infrastructure as Code" principles.

Orchestrate complex, multi-stage ETL/ELT pipelines across diverse data sources and environments, ensuring efficient resource utilization and low latency.

Implement and manage automated data validation, schema checks, and anomaly detection using tools like Great Expectations, dbt tests, or custom Python frameworks.

Integrate quality gates directly into CI/CD workflows to ensure early issue detection and continuously improve overall data reliability.

Schedule, monitor, and optimize data workflows, ensuring strict adherence to data delivery SLAs.

Set up and maintain proactive monitoring, logging, and automated alerting for all data pipelines and platform components.

Develop and maintain comprehensive dashboards to track critical metrics, including data health, SLA adherence, and pipeline operational performance.

Integrate and manage data assets, schemas, and metadata within Google Data Catalog or equivalent metadata management platforms.

Enforce robust governance policies, including data lineage tracking, strict access control (IAM), and compliance standards for sensitive data.
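As an illustration of the "custom Python frameworks" option mentioned among the validation tools above, the automated schema checks and anomaly detection could be sketched roughly as follows. This is a minimal, hypothetical example, not part of the posting: the schema, column names, and the 50% row-count tolerance are invented for illustration, and a real deployment would more likely use Great Expectations or dbt tests wired into the CI/CD quality gates described above.

```python
# Hypothetical data-quality gate: schema checks plus a simple row-count
# anomaly check that a CI/CD step could run before promoting a pipeline.
# All names and thresholds here are illustrative assumptions.

EXPECTED_SCHEMA = {"order_id": int, "amount": float, "region": str}

def check_schema(rows):
    """Return a list of (row_index, column, error) for schema violations."""
    errors = []
    for i, row in enumerate(rows):
        for col, typ in EXPECTED_SCHEMA.items():
            if col not in row:
                errors.append((i, col, "missing"))
            elif not isinstance(row[col], typ):
                errors.append((i, col, f"expected {typ.__name__}"))
    return errors

def check_row_count(n_rows, baseline, tolerance=0.5):
    """Flag an anomaly if the row count deviates from the baseline by
    more than the tolerance (here 50%, an arbitrary example value)."""
    return abs(n_rows - baseline) / baseline <= tolerance

def quality_gate(rows, baseline_count):
    """Aggregate the checks; a CI job would fail the build on False,
    giving the early issue detection the workflow integration calls for."""
    return not check_schema(rows) and check_row_count(len(rows), baseline_count)
```

In practice a gate like this would run as a step in Cloud Build or GitHub Actions, with a non-zero exit code blocking deployment when `quality_gate` returns `False`.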

Required Skills & Experience

5+ years of professional experience in data engineering, data platform operations, or a similar cloud-native technical role.

Strong expertise in the Google Cloud Platform (GCP) data stack: BigQuery, Dataflow, Cloud Composer, Pub/Sub, Cloud Functions, and Cloud Build.

High proficiency in Python, SQL, and general automation scripting.

Hands‑on experience with CI/CD principles and tools, including GitOps and Infrastructure as Code (IaC) using Terraform or Cloud Deployment Manager.

Proven experience with data quality and testing frameworks such as Great Expectations, dbt, or PyTest.

Working knowledge of observability, logging, and monitoring frameworks for high‑volume data systems.

Familiarity with metadata management, data lineage tools, and establishing data governance policies.

Qualifications

Graduate; 5-8 years of experience; Mid-Senior level; Full-time; Engineering and Information Technology; IT Services and IT Consulting.
