Building an Order Delivery Analytics Application With FastAPI, Kafka, Apache Pinot, and Dash, Part 1

Valerio Uberti
Published in Better Programming · 4 min read · Sep 10, 2023


Image by Brett Jordan from Unsplash

Managing orders efficiently is vital for businesses in today’s world of online shopping and fast deliveries. Data-driven insights have become the key to achieving this. Imagine having a tool that lets you track your orders in real time, predict delivery times accurately, and discover trends to improve your operations. That’s exactly what we’ll explore in this article — creating a user-friendly Order Delivery Analytics Application.

We’ll use accessible technologies like FastAPI, a Python web framework known for its speed and versatility, to build the core of our application. Apache Kafka, a system that handles data streams, will help us process incoming orders. Apache Pinot, our data storage solution, acts like a supercharged database perfect for real-time analytics. And lastly, Dash, a Python framework, will enable us to create interactive data visualizations for our users.

This article is your guide to building this system from scratch. We’ll walk through the details of each technology: setting up FastAPI for order handling, connecting it with Kafka for real-time data processing, using Apache Pinot for storing and querying order information, and crafting appealing data visuals with Dash.

Set Up

Setting up the project is a step-by-step journey that begins with building the Order Service application. Using FastAPI, we’ll create a web-based system that receives and processes incoming orders, ensuring data integrity and validation. With the Order Service in place, it’s time to dive into configuring Kafka.

We’ll explore the intricacies of setting up Kafka as our data pipeline, efficiently transmitting order information to downstream components. Next on the list is Apache Pinot, our data warehousing solution. We’ll delve into configuring Apache Pinot, defining schemas to structure our order data, and ensuring seamless storage and real-time querying capabilities.

Lastly, we’ll focus on the dashboard configuration using Dash, where the magic happens. We’ll guide you through creating an interactive dashboard that presents insightful visualizations, empowering you to explore order delivery analytics effortlessly. Each step in this process is pivotal, and we’ll navigate through them together to bring your Order Delivery Analytics Application to life.

Let’s see a simple schema of our architecture:

FastAPI → Kafka → Apache Pinot → Dash

Step 1: Creating the Order Service Application With FastAPI

The heart of our Order Delivery Analytics Application lies in the Order Service, a FastAPI-based Python application responsible for handling incoming orders. FastAPI is known for its speed and ease of use, making it an excellent choice for building robust web applications.

To get started, let’s define the structure of an order using Pydantic models. Here’s a sample order payload in JSON:

{
  "id": "12345",
  "total_price": 99.99,
  "user_id": 9876,
  "items": [
    {
      "product_id": 1,
      "quantity": 3,
      "price": 29.99
    }
  ],
  "created_at": null,
  "delivery_lat": 40.7128,
  "delivery_lon": -74.0060
}

We’ll define two Pydantic models: Items to represent individual items within an order and Order for the entire order. The code snippet below demonstrates this:

from typing import List, Optional

from pydantic import BaseModel


class Items(BaseModel):
    product_id: int
    quantity: int
    price: float


class Order(BaseModel):
    id: str
    total_price: float
    user_id: int
    items: List[Items]
    created_at: Optional[int] = None  # set by the service when the order is received
    delivery_lat: float
    delivery_lon: float
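Before wiring these models into an endpoint, it’s worth a quick sanity check that they parse the sample payload from above. This sketch repeats the model definitions so it runs standalone (in the project they live in `model/order.py`), and uses Pydantic v2’s `model_validate_json`:

```python
from typing import List, Optional

from pydantic import BaseModel


class Items(BaseModel):
    product_id: int
    quantity: int
    price: float


class Order(BaseModel):
    id: str
    total_price: float
    user_id: int
    items: List[Items]
    created_at: Optional[int] = None  # null in the payload; filled in by the service
    delivery_lat: float
    delivery_lon: float


payload = """
{
  "id": "12345",
  "total_price": 99.99,
  "user_id": 9876,
  "items": [{"product_id": 1, "quantity": 3, "price": 29.99}],
  "created_at": null,
  "delivery_lat": 40.7128,
  "delivery_lon": -74.0060
}
"""

# Pydantic v2 API; on v1 you'd call Order.parse_raw(payload) instead.
order = Order.model_validate_json(payload)
print(order.items[0].quantity)  # 3
print(order.created_at)         # None
```

If a field is missing or has the wrong type, Pydantic raises a `ValidationError` that pinpoints the offending field, which is exactly the validation behavior FastAPI gives us for free at the endpoint boundary.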

With our models in place, we’re ready to create the /orders endpoint. Decorated with @app.post("/orders"), this endpoint handles incoming POST requests. It expects JSON data conforming to our Order model and is designed to accommodate order creation, validation, and processing.

We’ll keep it simple and return the received order data as a response for now. However, in practice, this is where you’d integrate with your order processing pipeline, database, or other systems to manage and analyze orders effectively.

Now, let’s create our method that sends the order to a Kafka topic. Here’s what that looks like:

import datetime
from uuid import uuid4

import uvicorn
from confluent_kafka import Producer
from confluent_kafka.serialization import StringSerializer
from fastapi import FastAPI

from model.order import Order
from settings.settings import settings

app = FastAPI()

topic = settings.KAFKA_TOPIC
producer_conf = {'bootstrap.servers': settings.BOOTSTRAP_SERVERS}
string_serializer = StringSerializer('utf_8')
# Create the producer once at module level: it is thread-safe, and reusing it
# avoids opening a new broker connection on every request.
producer = Producer(producer_conf)


@app.post("/orders")
def create_order(order: Order):
    # Stamp the order with the current time in epoch milliseconds.
    order.created_at = round(datetime.datetime.now().timestamp() * 1000)
    producer.produce(topic=topic,
                     key=string_serializer(str(uuid4())),
                     value=order.model_dump_json().encode('utf-8'))
    # flush() blocks until delivery is confirmed -- fine for a demo, though a
    # high-throughput service would rely on poll() and delivery callbacks.
    producer.flush()
    print(f"Order {order} created successfully!")
    return {"message": f"Order {order} created successfully!"}

FastAPI’s simplicity and flexibility make it an excellent choice for building the Order Service component of our application.

With this foundation, we’re ready to move on to the next step: Configuring Kafka for efficient data pipelining.

In the next article, we’ll explore how to set up Kafka to handle incoming orders, ensuring seamless data flow within our application.

Check out the entire code on my GitHub.


I’m an enthusiastic backend software engineer, in love with all things new in tech. Java/Kotlin/Python