USM
Start Date:
Interview Types:
Skills: GCP Data Engineer
Visa Types: H1B, Green Card, US ..
Title: GCP Data Engineer
Location: Cincinnati, OH (Day 1 onsite)
Rate: $60/Hr on C2C
Responsibilities:
• Design and build scalable data pipelines to ingest, process, and transform structured, semi-structured, and unstructured data from heterogeneous sources.
• Develop batch and streaming pipelines using Pub/Sub, Dataflow, Cloud Run, and Cloud Functions for real-time and near-real-time processing (see the pipeline sketch after this list).
• Implement data models and optimize performance for OLTP and OLAP sinks, including Cloud Spanner, BigQuery, and relational stores.
• Participate in mainframe modernization initiatives, extracting, transforming, and migrating legacy datasets into GCP-native platforms.
• Ensure data quality, lineage, and governance practices using GCP-native tools.
• Collaborate with business and application teams to deliver end-to-end data solutions.
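As a rough illustration of the streaming work described above, here is a minimal Apache Beam (Python) sketch of a Pub/Sub-to-BigQuery pipeline of the kind Dataflow would run; the project, subscription, dataset, and table names are placeholders, not part of the role description.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions


def parse_event(message: bytes) -> dict:
    """Decode a Pub/Sub payload and parse it as a JSON event."""
    return json.loads(message.decode("utf-8"))


def run() -> None:
    options = PipelineOptions()
    options.view_as(StandardOptions).streaming = True  # unbounded (streaming) mode

    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            # Subscription path is a placeholder used for illustration only.
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                subscription="projects/my-project/subscriptions/events-sub")
            | "ParseJson" >> beam.Map(parse_event)
            # Dataset/table are placeholders; the table and schema are assumed to exist.
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                table="my-project:analytics.events",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER)
        )


if __name__ == "__main__":
    run()
```

Dataflow is Google Cloud's managed runner for Beam, so a pipeline like this can be tested locally and then submitted unchanged with the DataflowRunner option.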
Requirements:
• Strong experience with GCP services: Pub/Sub, Dataflow, Cloud Run, Cloud Spanner, BigQuery.
• Hands-on with Python/Java/Scala for data engineering pipelines.
• Good knowledge of mainframe data formats (COBOL copybooks, VSAM, DB2) and integration with modern platforms.
• Solid understanding of OLTP vs OLAP workloads and best practices for optimization (see the query sketch after this list).
• Familiarity with DevOps practices for CI/CD on GCP.
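To illustrate the analytical (OLAP) side of the OLTP-vs-OLAP distinction above, here is a minimal sketch using the google-cloud-bigquery Python client to run a typical aggregation query; the project, dataset, table, and column names are assumed for illustration only.

```python
from google.cloud import bigquery


def daily_order_totals() -> None:
    """Run a typical analytical (OLAP) aggregation against BigQuery."""
    client = bigquery.Client()  # uses application-default credentials

    # Project, dataset, table, and columns below are hypothetical examples.
    sql = """
        SELECT
          DATE(order_ts) AS order_date,
          COUNT(*)       AS orders,
          SUM(amount)    AS revenue
        FROM `my-project.analytics.orders`
        GROUP BY order_date
        ORDER BY order_date DESC
        LIMIT 30
    """

    for row in client.query(sql).result():
        print(row.order_date, row.orders, row.revenue)


if __name__ == "__main__":
    daily_order_totals()
```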