Zeus Solutions Inc
We are seeking a skilled GCP Data Engineer based in Houston (onsite), with 8+ years of overall data engineering experience and at least 4 years of hands-on expertise in Google Cloud Platform (GCP).
Job Description
Develop, construct, test, and maintain data acquisition pipelines for large volumes of structured and unstructured data, covering both batch and real-time processing in Google Cloud. Responsibilities include:
- Build large, complex datasets based on business requirements and construct big data pipeline architecture.
- Identify opportunities for data acquisition by working with stakeholders and business clients; translate business needs into technical requirements.
- Leverage tools across the Google Cloud ecosystem, including Python, Dataflow, Datastream, CDC (Change Data Capture), Cloud Functions, Cloud Run, Pub/Sub, BigQuery, and Cloud Storage, to integrate systems and data pipelines (see the Pub/Sub-to-BigQuery sketch after this list).
- Use logs and alerts to monitor pipelines effectively.
- Use SAP SLT to replicate SAP tables to Google Cloud.
- Develop JSON messaging structures for integrating with various applications.
- Apply DevOps and CI/CD practices (GitHub, Terraform) to ensure the reliability and scalability of data pipelines.
- Partition and cluster tables in BigQuery, and use IAM roles and Policy Tags to secure data; use roles to secure access to datasets and authorized views to share data between projects (see the BigQuery sketch after this list).
- Design and build ingestion pipelines using REST APIs.
- Recommend ways to improve data quality, reliability, and efficiency.
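As one illustration of the integration work above, the minimal sketch below pulls JSON messages from a Pub/Sub subscription and streams each row into BigQuery using the google-cloud-pubsub and google-cloud-bigquery client libraries. The project, subscription, and table names are hypothetical placeholders, and a production pipeline would typically run this logic in Dataflow or Cloud Run rather than as a bare script.

```python
# Minimal sketch (not a production pipeline): pull JSON messages from a
# Pub/Sub subscription and stream each row into BigQuery. All resource
# names here are hypothetical placeholders.
import json
from concurrent import futures

from google.cloud import bigquery, pubsub_v1

PROJECT_ID = "my-project"                 # placeholder project
SUBSCRIPTION_ID = "orders-sub"            # placeholder subscription
TABLE_ID = "my-project.sales.orders_raw"  # placeholder table

bq_client = bigquery.Client(project=PROJECT_ID)
subscriber = pubsub_v1.SubscriberClient()
sub_path = subscriber.subscription_path(PROJECT_ID, SUBSCRIPTION_ID)

def handle_message(message: pubsub_v1.subscriber.message.Message) -> None:
    """Decode one JSON message and append it to the BigQuery table."""
    row = json.loads(message.data.decode("utf-8"))
    errors = bq_client.insert_rows_json(TABLE_ID, [row])
    if errors:
        message.nack()  # redeliver; a real pipeline would use a dead-letter topic
    else:
        message.ack()

streaming_pull = subscriber.subscribe(sub_path, callback=handle_message)
try:
    streaming_pull.result(timeout=60)  # listen for one minute, then shut down
except futures.TimeoutError:
    streaming_pull.cancel()
    streaming_pull.result()
```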
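And as a sketch of the BigQuery partitioning and data-sharing tasks: the snippet below creates a day-partitioned, clustered table, defines a view over it, and authorizes that view against the source dataset so it can be shared across projects. All dataset, table, and view names are assumptions for illustration; Policy Tag assignment is omitted for brevity.

```python
# Minimal sketch: create a day-partitioned, clustered BigQuery table and an
# authorized view that shares a filtered slice of it. All names are
# hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project

# Partition raw events by day and cluster by customer for cheaper scans.
table = bigquery.Table(
    "my-project.analytics.events",
    schema=[
        bigquery.SchemaField("event_ts", "TIMESTAMP"),
        bigquery.SchemaField("customer_id", "STRING"),
        bigquery.SchemaField("payload", "STRING"),
    ],
)
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY, field="event_ts"
)
table.clustering_fields = ["customer_id"]
client.create_table(table, exists_ok=True)

# A view exposing only selected columns to consumers.
view = bigquery.Table("my-project.shared.events_view")
view.view_query = """
    SELECT event_ts, customer_id
    FROM `my-project.analytics.events`
"""
client.create_table(view, exists_ok=True)

# Authorize the view against the source dataset so readers of the view
# do not need direct access to the underlying table.
dataset = client.get_dataset("my-project.analytics")
entries = list(dataset.access_entries)
entries.append(
    bigquery.AccessEntry(None, "view", view.reference.to_api_repr())
)
dataset.access_entries = entries
client.update_dataset(dataset, ["access_entries"])
```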
Required
- 7+ years of experience in a GCP data engineering role (if onsite/nearshore).
- At least 2 years of hands-on experience with the GCP tools highlighted above (Python, SQL, Dataflow, Datastream, CDC (Change Data Capture), Cloud Functions, Cloud Run, Pub/Sub, BigQuery, Cloud Storage).
- Strong communication skills, with the ability to work independently and manage small projects.