Purple Drive
Role: Kafka Developer with AWS
Location: Tampa, FL - hybrid onsite (LOCAL CANDIDATES ONLY!)
A Kafka Developer with AWS expertise is responsible for designing, developing, and maintaining real-time data streaming solutions leveraging Apache Kafka within the Amazon Web Services (AWS) cloud environment.
Key Responsibilities:
- Design, implement, and maintain Kafka producers, consumers, and stream processing applications using languages like Java, Scala, or Python.
- Deploy, manage, and optimize Kafka clusters and related applications on AWS services such as Amazon Managed Streaming for Apache Kafka (MSK), EC2, S3, Lambda, and CloudWatch.
- Develop and manage end-to-end data pipelines involving Kafka Connect, Kafka Streams, and other data integration tools.
- Ensure the performance, scalability, and reliability of Kafka-based systems, including cluster tuning, monitoring, and troubleshooting.
- Implement security best practices for Kafka on AWS, including authentication, authorization (ACLs), and data encryption.
- Utilize infrastructure-as-code tools (e.g., Terraform, CloudFormation) and CI/CD pipelines (e.g., Jenkins, GitLab CI) for efficient deployment and management.
- Work closely with data engineers, architects, and other development teams to understand requirements and deliver robust streaming solutions.
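To illustrate the producer-side work described above, here is a minimal, hedged sketch of two building blocks of a Kafka producer: value serialization and key-based partition selection. The function names and the sample event are hypothetical, and the hash used here is only a stand-in for Kafka's actual default partitioner (which uses murmur2); the point is the pattern, not a production implementation.

```python
import hashlib
import json


def serialize_event(event: dict) -> bytes:
    """Serialize a Python dict to UTF-8 JSON bytes, a common Kafka message value format."""
    return json.dumps(event, sort_keys=True).encode("utf-8")


def pick_partition(key: bytes, num_partitions: int) -> int:
    """Illustrative key-based partition choice.

    Kafka's default partitioner hashes the key (murmur2) modulo the partition
    count; this MD5-based version just demonstrates the same idea: a given key
    is always routed to the same partition, preserving per-key ordering.
    """
    digest = hashlib.md5(key).digest()
    return int.from_bytes(digest[:4], "big") % num_partitions


# Hypothetical event: the same key always lands on the same partition.
event = {"order_id": "A-1001", "status": "shipped"}
payload = serialize_event(event)
partition = pick_partition(b"A-1001", num_partitions=6)
```

In a real deployment, a producer client (e.g., via the `kafka-python` or Confluent client libraries) would handle partitioning and delivery; the sketch only shows the serialization and keying concerns a Kafka developer is expected to reason about.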
Required Skills and Qualifications:
- Deep understanding of Kafka architecture, concepts (topics, partitions, brokers), and related tools (Kafka Connect, Kafka Streams, Schema Registry).
- Strong experience with relevant AWS services for data streaming and infrastructure management (e.g., MSK, EC2, S3, CloudWatch, IAM, VPC).
- Expertise in one or more programming languages commonly used with Kafka, such as Java, Scala, or Python.
- Knowledge of distributed systems principles and experience building scalable, fault-tolerant applications.
- Familiarity with relational or NoSQL databases for data persistence and integration.
- Strong analytical and problem-solving skills to diagnose and resolve issues in complex distributed environments.
- Excellent communication and collaboration skills to work effectively within a team and with stakeholders.