MM INTERNATIONAL, LLC
Data Engineer with DevOps Skills
MM INTERNATIONAL, LLC, Dearborn, Michigan, United States, 48120
Role : DataOps Engineer
Location : Hybrid work in Dearborn, MI (starting September 1st, the role will move to 4 days a week onsite).
Duration : 12-month contract.
Additional Information :
Hybrid position: currently 2-3 days a week onsite, but starting September 1, resources will be in the office 4 days a week.
Teams video interview: 1 round, 1 hour.
Job Description :
We are seeking a highly skilled and experienced Senior DataOps Engineer to join our EPEO DataOps team.
This role will be pivotal in designing, building, and maintaining robust, scalable, and secure telemetry data pipelines on Google Cloud Platform (GCP).
The ideal candidate will have a strong background in DataOps principles, deep expertise in GCP data services, and a solid understanding of IT operations, especially within the security and network domains.
You will enable real-time visibility and actionable insights for our security and network operations centers, contributing directly to our operational excellence and threat detection capabilities.
Skills Required : Code Assessment
Data Architecture
Endpoint Security
Google Cloud Platform
Data Governance
Cloud Infrastructure
Extract, Transform, Load (ETL)
BigQuery
Network Security
Python
Skills Preferred : Problem Solving
Critical Thinking
Communications
Cross-functional
Technologies
Cloud Computing
Experience Required :
Core DataOps & Engineering Skills :
Proven experience as a DataOps Engineer, Data Engineer, or similar role, with a strong focus on operationalizing data pipelines.
Expertise in designing, building, and optimizing large-scale data pipelines for both batch and real-time processing.
Strong understanding of DataOps principles, including CI / CD, automation, data quality, data governance, and monitoring.
Proficiency in programming languages commonly used in data engineering, such as Python.
Experience with Infrastructure as Code (IaC) tools (e.g., Terraform) for managing cloud resources.
Solid understanding of data modeling, schema design, and data warehousing concepts (e.g., star schema).
Experience Preferred :
Key Responsibilities :
Design & Development : Lead the design, development, and implementation of high-performance, fault-tolerant telemetry data pipelines for ingesting, processing, and transforming large volumes of IT operational data (logs, metrics, traces) from diverse sources, with a focus on security and network telemetry.
GCP Ecosystem Management : Architect and manage data solutions using a comprehensive suite of GCP services, ensuring optimal performance, cost-efficiency, and scalability. This includes leveraging services like Cloud Pub / Sub for messaging, Dataflow for real-time and batch processing, BigQuery for analytics, Cloud Logging for log management, and Cloud Monitoring for observability.
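To give a concrete picture of the pipeline pattern this role centers on, here is a minimal sketch of a streaming Dataflow (Apache Beam) job that reads JSON telemetry from Pub/Sub and appends it to BigQuery. The project, subscription, table, and field names are illustrative assumptions, not details from this posting.

    # A minimal sketch, not this team's actual pipeline: a streaming Apache Beam /
    # Dataflow job reading JSON telemetry from Pub/Sub and appending to BigQuery.
    # Project, subscription, table, and field names are illustrative assumptions.
    import json
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    def parse_event(message: bytes) -> dict:
        # Decode one Pub/Sub message carrying a JSON-encoded telemetry record.
        record = json.loads(message.decode("utf-8"))
        return {
            "event_time": record.get("timestamp"),
            "source": record.get("source"),
            "payload": json.dumps(record),
        }

    def run() -> None:
        options = PipelineOptions(streaming=True)  # plus Dataflow runner/project flags
        with beam.Pipeline(options=options) as pipeline:
            (
                pipeline
                | "ReadTelemetry" >> beam.io.ReadFromPubSub(
                    subscription="projects/example-project/subscriptions/telemetry-sub")
                | "ParseJSON" >> beam.Map(parse_event)
                | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                    "example-project:ops_telemetry.raw_events",
                    schema="event_time:TIMESTAMP,source:STRING,payload:STRING",
                    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)
            )

    if __name__ == "__main__":
        run()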
DataOps Implementation : Drive the adoption and implementation of DataOps best practices, including automation, CI / CD for data pipelines, version control (e.g., Git), automated testing, data quality checks, and robust monitoring and alerting.
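As one example of the kind of automated data quality check that can gate a CI / CD run, the sketch below queries BigQuery for recent rows missing a required field and fails a test if any are found. The table and column names are assumptions carried over from the sketch above, not from this posting.

    # A minimal data quality check, assuming the google-cloud-bigquery client
    # library and a hypothetical table/column; intended to run inside a CI/CD job
    # (e.g., via pytest) and fail the run when recent telemetry is incomplete.
    from google.cloud import bigquery

    TABLE = "example-project.ops_telemetry.raw_events"  # illustrative name

    def count_incomplete_rows(table: str = TABLE) -> int:
        # Count rows from the last hour that are missing the required source field.
        client = bigquery.Client()
        query = f"""
            SELECT COUNT(*) AS bad_rows
            FROM `{table}`
            WHERE event_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 HOUR)
              AND source IS NULL
        """
        rows = client.query(query).result()
        return next(iter(rows)).bad_rows

    def test_recent_telemetry_is_complete():
        assert count_incomplete_rows() == 0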
Security & Network Focus : Develop specialized pipelines for critical security and network data sources, such as VPC Flow Logs, firewall logs, intrusion detection system (IDS) logs, endpoint detection and response (EDR) data, and Security Information and Event Management (SIEM) data (e.g., Google Security Operations / Chronicle).
Data Governance & Security : Implement and enforce data governance, compliance, and security measures, including data encryption (at rest and in transit), access controls (RBAC), data masking, and audit logging to protect sensitive operational data.
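Data masking in particular can be applied in-pipeline before records reach analysts. The sketch below pseudonymizes sensitive fields with a salted hash; the field names and salt handling are simplified assumptions (a real deployment would pull the salt from a secret manager, or use Cloud DLP for managed de-identification).

    # A minimal sketch of field-level masking on telemetry records represented as
    # dicts; field names are illustrative, and the salt is hard-coded only for the
    # example (a real pipeline would fetch it from a secret manager).
    import hashlib

    SENSITIVE_FIELDS = ("src_ip", "dest_ip", "user_email")

    def mask_record(record: dict, salt: str = "rotate-me") -> dict:
        # Replace each sensitive field with a short, salted SHA-256 pseudonym so
        # records stay joinable without exposing the raw value.
        masked = dict(record)
        for field in SENSITIVE_FIELDS:
            value = masked.get(field)
            if value is not None:
                masked[field] = hashlib.sha256(f"{salt}{value}".encode()).hexdigest()[:16]
        return masked

    print(mask_record({"src_ip": "10.0.0.5", "bytes_sent": 1234}))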
Performance Optimization : Continuously monitor, optimize, and troubleshoot data pipelines for performance, reliability, and cost-effectiveness, identifying and resolving bottlenecks.
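One common starting point for cost and performance troubleshooting is BigQuery's INFORMATION_SCHEMA job metadata. The helper below lists the most byte-hungry queries of the past day; the region qualifier and result limit are illustrative assumptions.

    # A minimal sketch using BigQuery INFORMATION_SCHEMA job metadata to surface
    # the most expensive queries of the last 24 hours; the region and limit are
    # illustrative, and the google-cloud-bigquery library is required.
    from google.cloud import bigquery

    def top_expensive_queries(limit: int = 10):
        client = bigquery.Client()
        query = f"""
            SELECT user_email, query,
                   total_bytes_processed, total_slot_ms
            FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
            WHERE creation_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 DAY)
              AND job_type = 'QUERY'
            ORDER BY total_bytes_processed DESC
            LIMIT {limit}
        """
        return list(client.query(query).result())

    for row in top_expensive_queries():
        print(row.user_email, row.total_bytes_processed)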
Collaboration & Mentorship : Collaborate closely with IT operations, security analysts, network engineers, and other data stakeholders to understand data requirements and deliver solutions that meet business needs. Mentor junior engineers and contribute to the team's technical growth.
Documentation : Create and maintain comprehensive documentation for data pipelines, data models, and operational procedures.
Education Required : Bachelor's Degree
Education Preferred :
Education & Experience :
Bachelor's or Master's degree in Computer Science, Data Engineering, Information Technology, or a related quantitative field.
Typically 8 years of experience in data engineering, with at least 4 years in a Senior or Lead role focused on DataOps or cloud-native data platforms.
Key Skills
Apache Hive, S3, Hadoop, Redshift, Spark, AWS, Apache Pig, NoSQL, Big Data, Data Warehouse, Kafka, Scala
Employment Type : Full-time
Vacancy : 1