Add Schema Registry to Kafka in Your Local Docker Environment

Schema Registry helps maintain the contract between producer and consumer message data structures in Kafka topics

billydharmawan
Better Programming
6 min readFeb 18, 2020

Introduction

In the world of event-driven applications, a key requirement is that the messages published to a topic or a queue can be understood by all the players, i.e. the producers and consumers.

This means that having a data structure or schema is highly recommended.

Confluent, the company founded by the creators of Kafka, recommends and supports Avro serialization on its platform. Other serialization libraries include Thrift and Protocol Buffers.

The benefits of having a defined data schema in your event-driven ecosystem are a clear data structure, with explicit types and meaning, and more efficient data encoding.
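As an illustration of such a schema, here is a minimal Avro schema sketch for a hypothetical user event (the record and field names are invented for this example, not taken from the tutorial):

```json
{
  "type": "record",
  "name": "UserEvent",
  "namespace": "com.example.events",
  "fields": [
    { "name": "userId", "type": "string" },
    { "name": "eventType", "type": "string" },
    { "name": "timestamp", "type": "long" }
  ]
}
```

Every field has an explicit name and type, which is exactly the "clear data structure" benefit: a consumer reading this topic knows up front what shape to expect, and Avro's binary encoding omits the field names from each message, which is where the encoding efficiency comes from.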

In this tutorial, we are going to add Schema Registry to our Kafka environment, which was built as part of this tutorial.

It is highly recommended that you follow that tutorial before commencing with this one.
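To preview where we are headed, a Schema Registry container can be added to a Docker Compose file as a sketch like the one below. This assumes the Kafka broker service from the earlier tutorial is named `kafka` and listens on port 9092 inside the Compose network; adjust those values to match your setup:

```yaml
# Sketch of a schema-registry service for docker-compose.yml,
# using Confluent's official image. Service/host names are assumptions.
schema-registry:
  image: confluentinc/cp-schema-registry:latest
  depends_on:
    - kafka
  ports:
    - "8081:8081"           # Schema Registry REST API
  environment:
    SCHEMA_REGISTRY_HOST_NAME: schema-registry
    # Broker the registry uses to store schemas (in the _schemas topic)
    SCHEMA_REGISTRY_KAFKASTORE_BOOTSTRAP_SERVERS: PLAINTEXT://kafka:9092
    SCHEMA_REGISTRY_LISTENERS: http://0.0.0.0:8081
```

Once the container is up, the registry's REST API is reachable at `http://localhost:8081`, e.g. `curl http://localhost:8081/subjects` lists the registered schema subjects.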

Why Schema Registry?

We all understand that our Kafka producers publish messages to Kafka topics and our Kafka…

