Only on W2. Need submissions for Kafka Cloud Architect, Woodlawn, MD
Nukasani Group Inc - Gwynn Oak, Maryland, United States, 21207
Overview
Good morning, and greetings from Nukasani Group Inc! We have an urgent, long-term contract project immediately available for a Kafka Cloud Architect in Woodlawn, MD (onsite). Please review the role below. If you are available, please send me your updated Word resume along with the candidate submission details below as soon as possible. If you are not available, any referrals would be greatly appreciated.
Interviews are in progress, so a prompt response is appreciated. Looking forward to your immediate response and to working with you.
Candidate Submission Format - needed from you
Full Legal Name
Personal Cell No. (not a Google Voice number)
Email ID
Skype ID
Interview Availability
Availability to start, if selected
Current Location
Open to Relocate
Work Authorization
Total Relevant Experience
Education / Year of Graduation / University Name, Location
Last 4 digits of SSN
Country of Birth
Contractor Type
: mm/dd
Home Zip Code
Assigned Job Details
Job Title : Kafka Cloud Architect
Location: Woodlawn, MD (Onsite)
Rate: Best competitive rate
Position Overview
We are seeking an experienced Kafka Cloud Architect to lead the design, implementation, and management of next-generation data streaming and event-driven architecture platforms. The ideal candidate will have extensive experience with Confluent Kafka, AWS cloud deployments, and event-driven microservice architectures. This role requires strong technical leadership, hands-on expertise, and the ability to mentor teams while driving strategic initiatives.
Key Responsibilities
Team Leadership & Collaboration
Lead and mentor a team of Kafka administrators and developers.
Assign tasks, oversee deliverables, and conduct weekly Kafka Technical Review meetings.
Collaborate with customers to expand Kafka use within the agency.
Partner with leadership to explore and implement emerging Kafka-related technologies.
Architecture & Development
Architect, design, and implement event-driven solutions using Confluent Kafka.
Define strategies for streaming data integration into data warehouses and microservice-based applications.
Establish Kafka best practices, governance standards, and reusable integration patterns.
Ensure data integrity, event modeling, and appropriate use of design patterns across projects.
Technical Expertise & Problem Resolution
Provide deep expertise in Kafka architecture, including capacity planning, installation, and administration.
Troubleshoot and resolve platform issues across components.
Design and operationalize new Kafka connectors and optimize data pipelines.
Apply advanced knowledge in application integration, SOA, enterprise services, and security.
Stakeholder Communication
Deliver clear technical and business presentations to management, customers, and senior leaders.
Translate complex Kafka-based solutions into business-friendly language.
Facilitate workshops and provide insights into innovative event-driven architecture solutions.
Basic Qualifications
Bachelor's degree in Computer Science, Engineering, Mathematics, or a related field with 12+ years of relevant experience (Master's degree with 10 years, or equivalent experience, also considered)
12+ years in modern software development, system analysis, and design
7+ years working with Apache/Confluent Kafka
2+ years architecting and deploying solutions in AWS
1+ year leading technical teams
Must be able to obtain and maintain a Public Trust security clearance
Local to Maryland and willing to work on-site in Woodlawn, MD
Required Technical Skills
Kafka Expertise:
Production experience with Confluent Kafka
Kafka cluster administration, partitioning, and security (including HA & SLA architecture)
Proficiency with Kafka Connect, Kafka Streams, and ksqlDB (KSQL)
Design and deployment of Kafka on OpenShift/Kubernetes
Event-Driven Architecture & Data Streaming:
Strong understanding of EDA principles and best practices
Data replication, streaming, and performance optimization
Schema design and serialization (Avro, JSON)
Experience scaling Kafka infrastructure (Broker, Connect, ZooKeeper, Schema Registry, Control Center)
Cloud & Application Integration:
AWS services (ECS, EKS, Managed Service for Apache Flink, RDS for PostgreSQL, S3)
Microservices architecture, distributed systems, resiliency, and load balancing
Knowledge of relational databases (PostgreSQL, DB2, Oracle), SQL, and ORM frameworks (Hibernate, Spring JPA)
Preferred Skills (Nice to Have)
AWS Cloud Certifications
Experience with Disaster Recovery Strategies and Domain-Driven Design (DDD)
CI/CD pipeline design and DevOps automation
Red Hat OpenShift / Kubernetes / Docker expertise
Configuration management tools (Ansible, Terraform, CloudFormation)
Strong background in Spring Framework (Boot, Batch, Cloud, Security, Data)
Proficiency in Java EE, concurrency, and generics
Experience with testing frameworks (JUnit, Mockito, Cucumber, Selenium, Jasmine/Karma)
Monitoring & observability tools: Grafana and Prometheus