AWS SQS Batch Consumer Using AWS Lambda and Go

How to consume several messages at once

Dmytro Misik · Published in Better Programming · Sep 19, 2022

Sometimes, when integrating with an HTTP API, it is not crucial to send requests at the same moment you execute your business flow (synchronous flow). Some requests can even be grouped into batches and sent eventually (asynchronous flow). This is where event-driven design can help. But how do you do it? How do you group several requests into one batch? In this publication, I will describe my experience and how I dealt with it, with a real-life example.

Overview

Some time ago, I integrated with an API (I will call it Entity API) that owns a domain model. It is a broad API that lets you perform almost all CRUD operations and group them into batches: within a single HTTP request, you can, for example, update one entity, add another, and delete a third. But of course, that's not how I used it.

I worked on a service that listened to an HTTP webhook, and when it was triggered, the service had to do a lot of work. One thing it had to do was update the model using the Entity API. The requirement for the update was as follows: it should happen within one hour after the webhook was triggered. But to keep things simple, I updated the entity directly while handling the webhook.

And here came the issue I had to deal with. Under heavy load, I started to receive 429 HTTP responses: I had exceeded the Entity API rate limit. At that point, there was no other choice; the integration had to be reimplemented.

Switch to event-driven architecture

My first decision was to switch to an asynchronous flow for entity updates. The update SLA made it possible (it was one hour). So I started to search for a message delivery service I could use.

But before that, I investigated the external API as well. As I said before, it allows grouping several updates into a single batch. If I could group at least two updates into a single batch, it would cut the load in half, and the bigger the batch size, the lower the load on the external API. For example, with a batch size of 20, 1,000 updates become just 50 HTTP calls. It can dramatically reduce the load.

So I needed not just a message broker; I needed the ability to consume several messages at once. First of all, I thought about RabbitMQ. It allows you to consume several messages without acknowledging them, so you can collect them into an in-memory collection, process them, and then acknowledge them. In other words, the batching would have to be implemented on the application side, as shown in the sketch below. It's an option, but not what I wanted.
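For illustration only, here is a rough sketch of that application-side batching with RabbitMQ, using the github.com/rabbitmq/amqp091-go client. The connection URL, queue name, and batch size are arbitrary assumptions:

package main

import (
	"log"

	amqp "github.com/rabbitmq/amqp091-go"
)

const batchSize = 10

func main() {
	conn, err := amqp.Dial("amqp://guest:guest@localhost:5672/")
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()

	ch, err := conn.Channel()
	if err != nil {
		log.Fatal(err)
	}
	defer ch.Close()

	// Limit unacknowledged messages to the batch size.
	if err := ch.Qos(batchSize, 0, false); err != nil {
		log.Fatal(err)
	}

	// autoAck is false, so messages stay unacknowledged until the batch is processed.
	deliveries, err := ch.Consume("entity-updates", "", false, false, false, false, nil)
	if err != nil {
		log.Fatal(err)
	}

	batch := make([]amqp.Delivery, 0, batchSize)
	for d := range deliveries {
		batch = append(batch, d)
		if len(batch) < batchSize {
			continue
		}

		// Process the batch here (for example, one Entity API call), then
		// acknowledge everything up to and including the last delivery.
		// A real implementation would also flush a partial batch on a timer.
		if err := batch[len(batch)-1].Ack(true); err != nil {
			log.Println("ack failed:", err)
		}
		batch = batch[:0]
	}
}

As you can see, all of the batching, flushing, and acknowledgment logic ends up in your own code, which is exactly the overhead I wanted to avoid.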

Then I started to investigate what I could do with Amazon SQS, and it was exactly what I needed.

Amazon SQS

If you open the Amazon documentation, it says the following:

Amazon Simple Queue Service (SQS) is a fully managed message queuing service that enables you to decouple and scale microservices, distributed systems, and serverless applications. SQS eliminates the complexity and overhead associated with managing and operating message-oriented middleware, and empowers developers to focus on differentiating work. Using SQS, you can send, store, and receive messages between software components at any volume, without losing messages or requiring other services to be available.

Amazon SQS can be a trigger for AWS Lambda. Simply put, Lambda is another AWS service that allows you to run application code in the cloud without infrastructure overhead. You can find more details in the Amazon documentation and in another publication I've written before.

When I read how to trigger Lambda with events from SQS (link), I found the following statement:

Lambda reads messages in batches and invokes your function once for each batch

It was exactly what I needed! So I had to create an SQS queue, create an AWS Lambda function, and configure it correctly. After several messages are published to the queue, the Lambda is executed with these messages as its input.

The architecture diagram is shown below:

Integration

Instead of the service calling the Entity API directly, it publishes a message to the SQS queue, which eventually triggers the Lambda. The Lambda combines several events into a single batch and performs a single HTTP call to update all the entries. Let's try to implement it!

Implementation

I can't show the real code, but I will implement the Lambda in Go, along with a small application that publishes messages to SQS. It will be enough to understand the flow and how to extend it to your needs.

Let’s start with the dependencies you will need:

  • github.com/aws/aws-lambda-go/events: provides the input event types for AWS Lambda;
  • github.com/aws/aws-lambda-go/lambda: provides the methods to implement an AWS Lambda handler.
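Assuming you use Go modules, one way to pull these in (the module path below is just a placeholder):

go mod init example.com/sqs-batch-consumer
go get github.com/aws/aws-lambda-go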

Let's implement the code. It will be elementary; you can see it below:

package main

import (
	"context"
	"fmt"

	"github.com/aws/aws-lambda-go/events"
	"github.com/aws/aws-lambda-go/lambda"
)

func HandleRequest(ctx context.Context, sqsEvent events.SQSEvent) error {
	fmt.Printf("Messages count: %d\n", len(sqsEvent.Records))
	// here comes the code that builds the HTTP request from the records
	// and sends it to the Entity API
	return nil
}

func main() {
	lambda.Start(HandleRequest)
}

A single record of sqsEvent.Records is an SQS message that has a Body field. This is the content you sent when you published the message. It can be JSON or any other format you want, so you can unmarshal it and handle it the way you need.
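For example, if the publisher sends JSON bodies, the handler above could be expanded like this. This is only a sketch: the EntityUpdate shape and the sendBatchToEntityAPI stub are hypothetical placeholders for whatever your Entity API actually expects:

package main

import (
	"context"
	"encoding/json"
	"fmt"

	"github.com/aws/aws-lambda-go/events"
	"github.com/aws/aws-lambda-go/lambda"
)

// EntityUpdate is a hypothetical shape of the message body;
// adjust it to whatever your publisher actually sends.
type EntityUpdate struct {
	ID   string `json:"id"`
	Name string `json:"name"`
}

// sendBatchToEntityAPI is a stub standing in for the single HTTP call
// that submits all updates to the Entity API in one batch.
func sendBatchToEntityAPI(ctx context.Context, updates []EntityUpdate) error {
	fmt.Printf("would send %d updates in one batch\n", len(updates))
	return nil
}

func HandleRequest(ctx context.Context, sqsEvent events.SQSEvent) error {
	updates := make([]EntityUpdate, 0, len(sqsEvent.Records))

	for _, record := range sqsEvent.Records {
		var update EntityUpdate
		if err := json.Unmarshal([]byte(record.Body), &update); err != nil {
			// Returning an error makes Lambda retry the whole batch;
			// you may prefer to log and skip malformed messages instead.
			return fmt.Errorf("unmarshal message %s: %w", record.MessageId, err)
		}
		updates = append(updates, update)
	}

	return sendBatchToEntityAPI(ctx, updates)
}

func main() {
	lambda.Start(HandleRequest)
}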

Infrastructure

First of all, you need to create an SQS queue. Open the AWS Management Console, go to Simple Queue Service, and click Create queue. You will need to configure the queue; I won't describe the details here because the configuration depends on your needs.
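For reference, the same queue can also be created from the AWS CLI (the queue name here just matches the one used in the sample publisher later in this article):

aws sqs create-queue --queue-name go-batch-queue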

After that, you can create the Lambda. Go to Lambda, click Create function, and create an empty function with the Go runtime and the permissions needed to consume from SQS (Amazon SQS poller permissions).
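If you prefer the CLI, one way to grant those poller permissions is to attach the AWS-managed policy for SQS event sources to the function's execution role (the role name below is a placeholder):

aws iam attach-role-policy \
  --role-name go-batch-consumer-role \
  --policy-arn arn:aws:iam::aws:policy/service-role/AWSLambdaSQSQueueExecutionRole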

Now you can deploy the code. Execute the following commands to create the artifact needed for deployment:

GOOS=linux go build main.go
zip function.zip main

Then open the created Lambda and choose Upload from .zip file. And last but not least, you need to configure the trigger. Click Add trigger, choose SQS as the source, select your SQS queue, and provide the additional configuration: batch size and batch window. Batch size is the largest number of records that will be read from the queue at once. Batch window is the maximum amount of time, in seconds, to gather records before invoking the function.
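The same trigger can be created from the CLI as an event source mapping. The function name, queue ARN, and values below are placeholders; note that for SQS, a batch size above ten requires a batching window of at least one second:

aws lambda create-event-source-mapping \
  --function-name go-batch-consumer \
  --event-source-arn arn:aws:sqs:eu-central-1:123456789012:go-batch-queue \
  --batch-size 20 \
  --maximum-batching-window-in-seconds 30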

That's it. Now you need to make your application publish messages to the SQS queue. Here is the sample console application I used to test the Lambda:

package main

import (
	"fmt"
	"os"
	"strconv"

	"github.com/aws/aws-sdk-go/aws"
	"github.com/aws/aws-sdk-go/aws/session"
	"github.com/aws/aws-sdk-go/service/sqs"
	"github.com/google/uuid"
)

var queueUrl string = "https://sqs.eu-central-1.amazonaws.com/327251382021/go-batch-queue"

func must[T any](res T, err error) T {
	if err != nil {
		panic(err)
	}

	return res
}

func min(a int, b int) int {
	if a < b {
		return a
	}
	return b
}

func main() {
	// The number of messages to publish is taken from the first
	// command-line argument and defaults to 1.
	messageCount := 1
	if len(os.Args) > 1 {
		messageCountStr := os.Args[1]
		messageCount = must(strconv.Atoi(messageCountStr))
	}

	sess := must(session.NewSessionWithOptions(session.Options{
		Profile: "default",
		Config: aws.Config{
			Region: aws.String("eu-central-1"),
		},
	}))

	sqsService := sqs.New(sess)

	// SendMessageBatch accepts at most ten entries per request,
	// so the messages are published in chunks of up to ten.
	for m := messageCount; m > 0; m -= 10 {
		count := min(m, 10)
		entries := make([]*sqs.SendMessageBatchRequestEntry, count)

		for i := 0; i < count; i++ {
			id := strconv.Itoa(i)
			entries[i] = &sqs.SendMessageBatchRequestEntry{
				Id:          &id,
				MessageBody: aws.String(uuid.New().String()),
			}
		}

		result := must(sqsService.SendMessageBatch(&sqs.SendMessageBatchInput{
			QueueUrl: &queueUrl,
			Entries:  entries,
		}))

		fmt.Printf("Messages sent! Result: %v\n", result)
	}
}
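Assuming the file above is saved as main.go, publishing, say, 100 test messages looks like this:

go run main.go 100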

The AWS SDK (at least in Go) allows you to publish at most ten messages per batch request, so be careful when implementing your code.

Testing

I configured the Lambda to process up to 20 messages in a single batch. After you publish a lot of messages and check the logs, you will see something like this:

Messages count: 16
Messages count: 14
Messages count: 2
Messages count: 20
...

AWS does not guarantee that the Lambda will always be invoked with a number of messages equal to the batch size. In my scenario, it doesn't matter. You should also be aware that a standard SQS queue does not guarantee ordering, and one message can be delivered more than once. This can be crucial for your system design.
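Because a standard queue is at-least-once, it helps to make the handler idempotent. Below is a minimal sketch that drops duplicates within a single batch by comparing message bodies; it assumes it lives next to HandleRequest above. Durable deduplication across invocations would need an external store, which is outside the scope of this article.

// dedupe drops records whose Body was already seen in this batch.
// It only protects a single invocation; duplicates delivered in
// different batches would need an external store to detect.
func dedupe(records []events.SQSMessage) []events.SQSMessage {
	seen := make(map[string]struct{}, len(records))
	unique := make([]events.SQSMessage, 0, len(records))

	for _, r := range records {
		if _, ok := seen[r.Body]; ok {
			continue
		}
		seen[r.Body] = struct{}{}
		unique = append(unique, r)
	}

	return unique
}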

Conclusion

Amazon Web Services provides many services you can use to develop your applications. You can combine the benefits of AWS Lambda and AWS SQS to build robust, resilient, scalable systems that can handle a heavy load.

