Setting Up Your Local Event-Driven Environment Using Kafka Docker
Spin up a Kafka cluster with Docker and learn how to create topics, produce messages, and consume them
Introduction
Event-driven architecture is one of the modern architectural styles implemented in many applications today. Many tools have been developed over the past few years to support it, for example AWS SNS, SQS, RabbitMQ, and Apache Kafka.
The main purpose of an event-driven architecture is to decouple your services: producers publish to a message queue (or several queues), and consumers poll or consume from them. Because the two sides communicate only through the queue, a producer or consumer can be swapped out fairly easily.
In this piece, we’re going to set up Kafka locally with Docker and use its CLI tools for the basic operations: creating a topic, publishing messages, and consuming them.
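For orientation, here is roughly what those operations look like with the Kafka CLI tools. This is a sketch only; the container name (`broker`), topic name, and port are assumptions, and each command is covered in detail later:

```bash
# Create a topic with one partition and one replica
# (container name, port, and topic name are assumptions)
docker exec -it broker kafka-topics --create \
  --bootstrap-server localhost:9092 \
  --partitions 1 --replication-factor 1 \
  --topic example-topic

# Produce messages interactively; each line you type becomes a message
docker exec -it broker kafka-console-producer \
  --broker-list localhost:9092 --topic example-topic

# Consume the topic from the beginning
docker exec -it broker kafka-console-consumer \
  --bootstrap-server localhost:9092 \
  --topic example-topic --from-beginning
```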
The Docker images we will be using are as follows:
confluentinc/cp-zookeeper:5.4.0
confluentinc/cp-server:5.4.0
confluentinc/cp-kafka:5.4.0
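As a starting point, a minimal docker-compose.yml wiring the ZooKeeper and broker images together might look like the sketch below. The service names, listener setup, and single-broker replication settings are assumptions, loosely following Confluent's reference examples; the cp-kafka image also ships the CLI tools used in the preview above.

```yaml
version: '3'
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:5.4.0
    hostname: zookeeper
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181   # port the broker uses to reach ZooKeeper
      ZOOKEEPER_TICK_TIME: 2000

  broker:
    image: confluentinc/cp-server:5.4.0
    hostname: broker
    container_name: broker          # assumed name, matches the CLI examples
    depends_on:
      - zookeeper
    ports:
      - "9092:9092"                 # expose the broker to the host
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      # one listener for containers on the Docker network, one for the host
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://broker:29092,PLAINTEXT_HOST://localhost:9092
      # single-broker settings so Kafka can create its internal topics
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
      KAFKA_TRANSACTION_STATE_LOG_REPLICATION_FACTOR: 1
      KAFKA_TRANSACTION_STATE_LOG_MIN_ISR: 1
```

With this file in place, `docker-compose up -d` brings the cluster up, and `docker-compose ps` should show both containers running.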