Apex Systems
Job#: 2061647
Job Description:
Location:
Hybrid preferred in SE MI - will consider remote for the right candidate.
Duration:
12+ months
Position Description:
We are seeking an experienced GCP Data Engineer to build a cloud analytics platform that meets expanding business requirements with speed and quality using lean Agile practices. You will analyze and manipulate large datasets, activating data assets to support Enabling Platforms and Analytics in Google Cloud Platform (GCP). You will design the transformation and modernization of data on GCP and land data from source applications into GCP. Experience building and operationalizing large-scale data warehouses, data lakes, and analytics platforms on Google Cloud or other cloud environments is required. We seek candidates with broad technology skills across these areas who can design suitable solutions combining GCP and third-party technologies for deployment on Google Cloud.
Responsibilities:
- Collaborate with cross-functional teams, including pairing and mobbing with other engineers.
- Deliver working, tested software in an agile team environment.
- Work effectively with data engineers, product owners, data champions, and technical experts.
- Demonstrate technical knowledge and leadership, advocating for technical excellence.
- Develop analytics data products using streaming and batch ingestion in GCP, grounded in solid data warehouse principles (a minimal pipeline sketch follows this list).
- Serve as Subject Matter Expert in Data Engineering and GCP tools.
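For illustration only, a minimal sketch of the kind of streaming ingestion pipeline this responsibility describes, using the Apache Beam Python SDK targeting Dataflow. The project ID, Pub/Sub subscription, bucket, and BigQuery table are hypothetical placeholders, not details of this engagement.

import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(message: bytes) -> dict:
    # Pub/Sub delivers raw bytes; JSON-encoded events are an assumption here.
    return json.loads(message.decode("utf-8"))


def run() -> None:
    options = PipelineOptions(
        streaming=True,
        runner="DataflowRunner",
        project="example-project",                # hypothetical project ID
        region="us-central1",
        temp_location="gs://example-bucket/tmp",  # hypothetical bucket
    )
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                subscription="projects/example-project/subscriptions/events-sub")
            | "ParseJson" >> beam.Map(parse_event)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:analytics.events",  # hypothetical table
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()

The same pipeline shape serves batch ingestion by swapping the Pub/Sub source for a bounded source such as files in Cloud Storage.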
Skills Required:
- Experience implementing data solutions from concept to operations, providing deep technical expertise.
- Automate data pipelines to reduce manual effort in development and production.
- Analyze complex data, organize raw data, and integrate large datasets from multiple sources.
- Work with architects to evaluate and operationalize GCP tools for data ingestion, integration, presentation, and reporting.
- Translate business problems into technical data requirements and collaborate with product management.
- Proficiency in Machine Learning architecture, data pipeline interaction, and metrics interpretation, including automated data lineage.
- Develop Proofs of Concept and evaluate solutions to recommend the best options.
- Integrate GCP Data Catalog with Informatica EDC.
- Design and build production data engineering solutions using GCP services such as BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine (a brief BigQuery example follows this list).
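As a small illustration of the BigQuery work named above, a minimal sketch using the google-cloud-bigquery Python client with a parameterized query; the project, dataset, and table names are hypothetical.

from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project

job = client.query(
    "SELECT status, COUNT(*) AS n "
    "FROM `example-project.analytics.events` "  # hypothetical table
    "WHERE event_date = @d GROUP BY status",
    job_config=bigquery.QueryJobConfig(
        query_parameters=[
            # Parameterized queries avoid string-building SQL by hand.
            bigquery.ScalarQueryParameter("d", "DATE", "2024-01-01"),
        ]
    ),
)
for row in job.result():
    print(row.status, row.n)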
Skills Preferred:
- Strong results orientation; able to multitask and work independently.
- Innovative, self-starting attitude.
- Excellent communication skills across teams and management levels.
- Commitment to quality and timely project delivery.
- Ability to document complex systems and create detailed test plans.
Experience Required:
- Deep understanding of Google Cloud or other cloud platform architectures.
- 5+ years of analytics application development and SQL experience.
- 3+ years of cloud experience (preferably GCP) delivering solutions at production scale.
- Experience with GCP big data deployments: Terraform, BigQuery, Bigtable, Cloud Storage, Pub/Sub, Data Fusion, Dataflow, Dataproc, Cloud Build, Airflow, and Cloud Composer (a minimal Composer DAG sketch follows this list).
- 2+ years of Java or Python development, including Apache Beam.
- Experience with microservice architecture and container orchestration.
- Data extraction, transformation, and validation skills.
- Experience designing data processing pipelines and architectures.
- 1+ year designing Tekton pipelines.
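For illustration, a minimal Cloud Composer (Airflow) DAG of the shape this experience implies: land files from Cloud Storage into a BigQuery staging table, then run a transform into a curated table. Every name here (DAG ID, bucket, dataset, tables, and the placeholder SQL) is a hypothetical assumption.

from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

with DAG(
    dag_id="daily_sales_ingest",  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Land the day's raw files from Cloud Storage into a staging table.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw",
        bucket="example-landing-bucket",  # hypothetical bucket
        source_objects=["sales/{{ ds }}/*.json"],
        destination_project_dataset_table="analytics.stg_sales",
        source_format="NEWLINE_DELIMITED_JSON",
        write_disposition="WRITE_TRUNCATE",
    )

    # Transform the staging table into the curated reporting table.
    transform = BigQueryInsertJobOperator(
        task_id="transform",
        configuration={
            "query": {
                "query": "SELECT * FROM analytics.stg_sales",  # placeholder SQL
                "destinationTable": {
                    "projectId": "example-project",
                    "datasetId": "analytics",
                    "tableId": "fct_sales",
                },
                "writeDisposition": "WRITE_APPEND",
                "useLegacySql": False,
            }
        },
    )

    load_raw >> transform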
Experience Preferred:
- Building Machine Learning solutions with TensorFlow, BigQuery ML, AutoML, and Vertex AI.
- Solution architecture, infrastructure provisioning, security, and data services in GCP.
- Experience with Dataplex or Informatica EDC.
- Development ecosystems such as Git, Jenkins, and CI/CD.
- Strong problem-solving and communication skills.
- Experience with dbt/Dataform and with Agile and Lean methodologies.
- Attention to detail and performance tuning.
Education Required:
- Bachelor's degree in computer science, IT, or a related field.
Education Preferred:
- GCP Professional Data Engineer Certification.
- Master's degree in computer science or a related field.
- Mentoring experience and in-depth software engineering knowledge.
EEO Employer