Georgia Staffing

Sr. Kafka Engineer (DevOps, AWS, OpenShift)

Georgia Staffing, Atlanta, Georgia, United States, 30383


Senior Kafka Engineer

This position is on-site.

No Sponsorship: For this opportunity, Truist will not sponsor an applicant for work visa status or employment authorization, nor will it offer any immigration-related support for this position.

We are seeking an experienced Senior Kafka Engineer with strong DevOps expertise to design, build, and support high-performance data streaming and deployment pipelines within our ecosystem. This role is critical to ensuring the reliability, scalability, and security of the real-time data platforms that power our trading, payments, and risk management systems. The ideal candidate will have deep experience with Confluent Kafka architecture and administration, AWS cloud services, and OpenShift/Kubernetes-based CI/CD automation. You will collaborate with infrastructure, security, and application engineering teams to deliver resilient, compliant, high-throughput data solutions that meet stringent financial industry standards for availability and data integrity.

Key Responsibilities

- Design, deploy, and manage Kafka clusters supporting mission-critical financial applications, ensuring low latency, high availability, and regulatory compliance (see the provisioning sketch after this list).
- Build and maintain CI/CD pipelines across AWS and OpenShift to enable secure, automated deployments for microservices and data streaming applications.
- Develop infrastructure-as-code (IaC) solutions using Terraform, Ansible, and CloudFormation to maintain consistency and auditability in financial environments.
- Monitor, troubleshoot, and tune Kafka performance to ensure optimal throughput and reliability for real-time data flows (e.g., trade events, payment transactions, risk metrics).
- Implement robust security and compliance controls across Kafka, cloud, and containerized systems in alignment with financial regulations (e.g., SOC 2, PCI DSS).
- Automate build, deployment, and monitoring processes to minimize operational risk and improve system resilience.
- Collaborate with software engineers, data architects, and DevOps teams to design event-driven architectures supporting core banking, fraud detection, and analytics platforms.
- Lead incident response and root cause analysis for production issues affecting financial data pipelines.
- Stay informed on emerging technologies in data streaming, observability, and secure DevOps practices relevant to the financial sector.
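To make the first responsibility concrete, here is a minimal sketch of durable topic provisioning with the Confluent Kafka Python client (confluent-kafka). The broker address, topic name, partition count, and retention period are illustrative assumptions, not a prescribed configuration.

```python
# Minimal sketch: provisioning a durable topic with the confluent-kafka
# AdminClient. Broker address, topic name, and settings are placeholders.
from confluent_kafka.admin import AdminClient, NewTopic

admin = AdminClient({"bootstrap.servers": "broker1:9092"})  # placeholder broker

# Durability-oriented settings: replication_factor=3 with min.insync.replicas=2
# means a write acknowledged with acks=all survives the loss of one broker.
topic = NewTopic(
    "payments.transactions",  # hypothetical topic name
    num_partitions=12,
    replication_factor=3,
    config={
        "min.insync.replicas": "2",
        "retention.ms": str(7 * 24 * 60 * 60 * 1000),  # 7-day retention
    },
)

# create_topics() is asynchronous and returns a dict of topic -> future.
for name, future in admin.create_topics([topic]).items():
    try:
        future.result()  # raises on failure (e.g., topic already exists)
        print(f"Created topic {name}")
    except Exception as exc:
        print(f"Failed to create {name}: {exc}")
```

A replication factor of 3 with min.insync.replicas of 2 is a common availability/durability trade-off for regulated workloads; the right values depend on cluster size and recovery objectives.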

Qualifications

Required Qualifications:
- Bachelor's degree in Computer Science, CIS, or a related field
- 5 to 7 years of experience in software development or a related field
- 5 to 7 years of experience with database technologies
- 5 to 7 years of experience on projects implementing solutions through the software development life cycle (SDLC)

Preferred Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field (Master's preferred)
- 7+ years of experience in DevOps, cloud infrastructure, or data engineering, including 4+ years specializing in Confluent Kafka in production
- Proven experience deploying and managing Kafka in financial or other regulated environments
- Strong knowledge of AWS services (EC2, S3, IAM, Lambda, CodePipeline) and OpenShift/Kubernetes orchestration
- Expertise in Terraform, Ansible, Jenkins, Git, and scripting (Python, Bash, Groovy)
- Deep understanding of security, encryption, and compliance best practices for financial systems (see the sketch after this list)
- Familiarity with monitoring tools such as Dynatrace or Splunk
- Excellent communication and leadership skills, with the ability to work across infrastructure, application, and compliance teams
- Financial services experience
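For the security and encryption qualification above, a minimal sketch of an encrypted, authenticated producer is shown below, again using the confluent-kafka Python client. The TLS listener, SASL mechanism, credential environment variables, and CA path are assumptions for illustration only.

```python
# Minimal sketch: a TLS-encrypted, SASL-authenticated Kafka producer.
# Endpoint, credential source, CA path, and topic are placeholders.
import os
from confluent_kafka import Producer

producer = Producer({
    "bootstrap.servers": "broker1:9093",           # placeholder TLS listener
    "security.protocol": "SASL_SSL",               # encrypt and authenticate
    "sasl.mechanisms": "SCRAM-SHA-512",            # assumed mechanism
    "sasl.username": os.environ["KAFKA_USER"],     # keep secrets out of code
    "sasl.password": os.environ["KAFKA_PASSWORD"],
    "ssl.ca.location": "/etc/pki/ca/ca-cert.pem",  # placeholder CA bundle
    "acks": "all",                                 # wait for in-sync replicas
    "enable.idempotence": True,                    # avoid duplicates on retry
})

def on_delivery(err, msg):
    # Delivery reports surface failures instead of silently dropping records.
    if err is not None:
        print(f"Delivery failed: {err}")

producer.produce("payments.transactions", value=b'{"amount": 100}',
                 callback=on_delivery)
producer.flush()  # block until queued messages are delivered or fail
```

Sourcing credentials from the environment (or a secrets manager) rather than hard-coding them is the kind of control auditors look for under SOC 2 and PCI DSS.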