Add Schema Registry to Kafka in Your Local Docker Environment
Schema Registry helps maintain the contract between producer and consumer message data structures in Kafka topics
Introduction
In the world of event-driven applications, a key requirement is ensuring that the messages published to a topic or a queue can be understood by all the players, i.e. the producers and the consumers.
This makes defining a data structure, or schema, for your messages highly recommended.
Confluent, the company founded by the original creators of Kafka, recommends and supports Avro serialization on its platform. Other serialization libraries include Thrift and Protocol Buffers.
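To give a feel for what such a schema looks like, an Avro schema is itself declared in JSON. The record below is a hypothetical example for illustration only (the `User` record and `com.example` namespace are not from the original tutorial):

```json
{
  "type": "record",
  "name": "User",
  "namespace": "com.example",
  "fields": [
    { "name": "id",    "type": "long"   },
    { "name": "email", "type": "string" }
  ]
}
```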
A defined data schema brings clear benefits to your event-driven ecosystem: explicit data structure, types, and meaning, as well as more efficient data encoding.
In this tutorial, we are going to add Schema Registry to our Kafka environment, which was built as part of this tutorial.
It is highly recommended that you complete that tutorial before starting this one.
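As a sketch of where we are headed, a Schema Registry service can be appended to the docker-compose.yml from that tutorial. The image tag, the broker service name (`broker`), and its internal listener port (`29092`) are assumptions here; adjust them to match your existing Compose file:

```yaml
# Schema Registry service, added alongside the existing Kafka services.
schema-registry:
  image: confluentinc/cp-schema-registry:7.5.0
  container_name: schema-registry
  depends_on:
    - broker                      # assumed name of the Kafka broker service
  ports:
    - "8081:8081"                 # Schema Registry's default REST port
  environment:
    SCHEMA_REGISTRY_HOST_NAME: schema-registry
    # Assumed internal listener of the broker; match it to your setup.
    SCHEMA_REGISTRY_KAFKASTORE_BOOTSTRAP_SERVERS: broker:29092
    SCHEMA_REGISTRY_LISTENERS: http://0.0.0.0:8081
```

Once the stack is up (`docker-compose up -d`), `curl http://localhost:8081/subjects` should return an empty JSON array (`[]`), confirming the registry is reachable.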
Why Schema Registry?
We all understand that our Kafka producers publish messages to Kafka topics and our Kafka…