Centurion Consulting Group, LLC
Senior Kafka Administrator with Ansible
Centurion Consulting Group, LLC, Riverdale, Maryland 20738, US
Job Description
Centurion is looking for a Senior Kafka Administrator with Ansible for our federal government client. This is a long-term position that is 100% onsite in Woodlawn, MD.
Key Required Skills:
Kafka Architecture, Ansible Automation, RHEL/Linux Administration, Scripting (Bash, Shell, Python), Availability Monitoring / Triage (Splunk, Dynatrace, Prometheus).
Position Description:
- Architect, design, develop, and implement next-generation data streaming and event-based architecture/platforms using software engineering best practices in the latest technologies:
  - Data Streaming, Event-Driven Architecture, Event Processing Frameworks
  - DevOps (Jenkins, Red Hat OpenShift, Docker, SonarQube)
  - Infrastructure-as-Code and Configuration-as-Code (Ansible, Terraform/CloudFormation, Scripting)
- Administer Kafka on Linux, including automating, installing, migrating, upgrading, deploying, troubleshooting, and configuring the platform.
- Provide expertise in one or more of these areas: Kafka administration, event-driven architecture, automation, application integration, monitoring and alerting, security, business process management/business rules processing, CI/CD pipelines and containerization, or data ingestion/data modeling.
- Investigate, repair, and actively ensure business continuity regardless of the impacted component: Kafka platform, business logic, middleware, networking, CI/CD pipeline, or database (PL/SQL and data modeling) (see the sketch after this list).
- Brief management, customers, the team, or vendors, in writing or verbally, at the appropriate technical level for the audience.
- All other duties as assigned or directed.
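As a hedged illustration of the troubleshooting and continuity duties above, the following is a minimal Ansible sketch that surfaces under-replicated partitions across a cluster; the inventory group name, Kafka installation path, and bootstrap address are assumptions for illustration only, not details from this posting.

# Hypothetical triage playbook; inventory group, tool path, and bootstrap
# address are illustrative assumptions.
- name: Report under-replicated partitions across the cluster
  hosts: kafka_brokers              # assumed inventory group
  gather_facts: false
  tasks:
    - name: List under-replicated partitions from one broker
      ansible.builtin.command: >
        /opt/kafka/bin/kafka-topics.sh
        --bootstrap-server localhost:9092
        --describe --under-replicated-partitions
      run_once: true
      changed_when: false
      register: urp

    - name: Show any under-replicated partitions found
      ansible.builtin.debug:
        var: urp.stdout_lines
      run_once: true

In day-to-day triage, output like this would typically feed an alerting or monitoring workflow (for example Splunk or Prometheus) rather than being read by hand.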
Skills Requirements:
- Bachelor's degree in Computer Science, Mathematics, Engineering, or a related field; a master's or doctorate degree may substitute for required experience.
- 8+ years of combined experience with Site Reliability Engineering, DevOps support, and/or RHEL administration for mission-critical platforms, ideally Kafka.
- 4+ years of combined experience with Kafka (Confluent Kafka, Apache Kafka, Amazon MSK).
- 4+ years of experience with Ansible automation.
- Must be able to obtain and maintain a Public Trust clearance (contract requirement).
- Selected candidate must be willing to work on-site in Woodlawn, MD five days a week.
- Strong experience with Ansible automation, including authoring playbooks and roles for installing, maintaining, or upgrading platforms (see the sketch after this list).
- Solid experience using version control software such as Git/Bitbucket, including peer reviewing Ansible playbooks.
- Hands-on experience administering a Kafka platform (Confluent Kafka, Apache Kafka, Amazon MSK) via Ansible playbooks or other automation.
- Understanding of Kafka architecture, including partition strategy, replication, transactions, tiered storage, and disaster recovery strategies.
- Strong experience automating tasks with scripting languages such as Bash, Shell, or Python.
- Solid foundation in Red Hat Enterprise Linux (RHEL) administration.
- Basic networking skills.
- Solid experience triaging and monitoring complex issues, outages, and incidents.
- Experience integrating and maintaining third-party tools such as ZooKeeper, Flink, Pinot, Prometheus, and Grafana.
- Experience with Platform-as-a-Service (PaaS) using Red Hat OpenShift/Kubernetes and Docker containers.
- Experience working on Agile projects and an understanding of Agile terminology.
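To ground the playbook-and-role requirement above, here is a minimal sketch of the kind of rolling configuration update such playbooks often perform; the inventory group, template, file paths, and systemd unit name are assumptions for illustration, not details from this posting.

# Hypothetical rolling-update playbook; inventory group, paths, template,
# and service name are illustrative assumptions.
- name: Apply broker configuration and restart Kafka one node at a time
  hosts: kafka_brokers            # assumed inventory group
  become: true
  serial: 1                       # rolling update: one broker at a time
  tasks:
    - name: Deploy broker configuration from a template
      ansible.builtin.template:
        src: server.properties.j2
        dest: /etc/kafka/server.properties
        owner: kafka
        group: kafka
        mode: "0640"
      notify: Restart kafka

  handlers:
    - name: Restart kafka
      ansible.builtin.systemd:
        name: kafka               # assumed systemd unit name
        state: restarted

The serial: 1 setting is what keeps a change like this safe on a replicated cluster: brokers are reconfigured and restarted one at a time, so (assuming a replication factor greater than one) partitions keep in-sync replicas available throughout.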
Desired:
- Confluent Certified Administrator for Apache Kafka (CCAAK) or Confluent Certified Developer for Apache Kafka (CCDAK) preferred.
- Practical experience with event-driven applications and at least one event processing framework, such as Kafka Streams, Apache Flink, or ksqlDB.
- Understanding of Domain-Driven Design (DDD) and experience applying DDD patterns in software development.
- Experience working with Kafka connectors and/or supporting operation of the Kafka Connect API.
- Experience with Avro/JSON data serialization and schema governance with Confluent Schema Registry (see the sketch after this list).
- Experience with AWS or other cloud providers preferred; AWS cloud certifications a plus.
- Experience with Infrastructure-as-Code (CloudFormation/Terraform, scripting).
- Solid knowledge of relational databases (PostgreSQL, DB2, or Oracle), NoSQL databases (MongoDB, Cassandra, DynamoDB), SQL, and/or ORM technologies (JPA2, Hibernate, or Spring JPA).
- Knowledge of the Social Security Administration (SSA).
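Since schema governance with Confluent Schema Registry is called out above, here is a hedged sketch of automating a schema registration through the Schema Registry REST API from Ansible; the registry URL, subject name, and Avro schema below are hypothetical placeholders, not values from this posting.

# Hypothetical schema-registration playbook; the registry URL, subject,
# and schema are placeholders.
- name: Register an Avro schema version with Confluent Schema Registry
  hosts: localhost
  gather_facts: false
  tasks:
    - name: Post a new schema version for a subject
      ansible.builtin.uri:
        url: "http://schema-registry.example.com:8081/subjects/orders-value/versions"
        method: POST
        headers:
          Content-Type: "application/vnd.schemaregistry.v1+json"
        body_format: json
        body:
          schema: '{"type":"record","name":"Order","fields":[{"name":"id","type":"string"}]}'
        status_code: 200

In practice a step like this would usually run from a CI/CD pipeline, with compatibility checking enabled on the subject so breaking schema changes are rejected.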
Education:
- Bachelor's degree with 7+ years of experience.
- Must be able to obtain and maintain a Public Trust clearance (contract requirement).