Building an ApiGateway-SQS-Lambda integration using Terraform

Daniel Da Costa
Towards Data Science
4 min read · Jun 22, 2020



Terraform is an amazing tool for building infrastructure. It is used for building, changing, and versioning infrastructure safely and efficiently, and it is the infrastructure-as-code offering from HashiCorp.

While using Terraform to build a project that I’m designing on Amazon Web Services (AWS), I came across the need to set up an API Gateway endpoint that takes records and puts them into an SQS queue, which in turn triggers a Lambda function through an event source mapping.

In this post, I would like to share with you each step required to build this infrastructure. This post assumes that you are familiar with Terraform code and AWS services.

Architecture diagram made using LucidChart (https://www.lucidchart.com).

Variables

First, let’s go through the input variables used in the project. The variables are stored inside the file variables.tf.

Using these variables makes it very easy to deploy the services to different environments: changing the environment variable to prd (i.e., production) will create all services with the corresponding environment name.
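A minimal variables.tf might look like the following. The variable names and defaults here are illustrative; the project's actual file may declare more variables:

```hcl
variable "environment" {
  description = "Deployment environment suffix (e.g. dev, prd)"
  type        = string
  default     = "dev"
}

variable "region" {
  description = "AWS region to deploy the services into"
  type        = string
  default     = "us-east-1"
}
```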

Simple Queue Service (SQS)

Start by creating the SQS resource.
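A basic queue definition could look like this (the resource name and tuning values are illustrative, not the post's exact ones):

```hcl
resource "aws_sqs_queue" "queue" {
  name                       = "apigateway-queue-${var.environment}"
  delay_seconds              = 0
  visibility_timeout_seconds = 30    # should exceed the Lambda timeout
  message_retention_seconds  = 86400 # keep messages for one day
  receive_wait_time_seconds  = 10    # enable long polling
}
```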

ApiGateway

Before creating the API Gateway resource, let’s first define the permissions so that API Gateway can send messages to the SQS queue. I usually create a folder called policies that contains all the policies I'll be using in the project. I recommend that you do the same; it will keep your code clean and organized.

├── iam.tf
├── policies: // all policies created
│ ├── api-gateway-permission.json
│ └── lambda-permission.json

These permissions give API Gateway the ability to create and write to CloudWatch logs, as well as to send, read, and list messages on the SQS queue.
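The iam.tf wiring might look like the sketch below. In the post the policy JSON lives in policies/api-gateway-permission.json and is loaded from file; it is inlined here with jsonencode for brevity, and all names are illustrative:

```hcl
# Role that API Gateway assumes when calling SQS
resource "aws_iam_role" "apigateway_sqs" {
  name = "apigateway-sqs-role-${var.environment}"
  assume_role_policy = jsonencode({
    Version = "2012-10-17"
    Statement = [{
      Effect    = "Allow"
      Principal = { Service = "apigateway.amazonaws.com" }
      Action    = "sts:AssumeRole"
    }]
  })
}

resource "aws_iam_role_policy" "apigateway_sqs" {
  name = "apigateway-sqs-policy-${var.environment}"
  role = aws_iam_role.apigateway_sqs.id
  policy = jsonencode({
    Version = "2012-10-17"
    Statement = [
      {
        Effect   = "Allow"
        Action   = ["logs:CreateLogGroup", "logs:CreateLogStream", "logs:PutLogEvents"]
        Resource = "*"
      },
      {
        Effect   = "Allow"
        Action   = ["sqs:SendMessage", "sqs:GetQueueAttributes", "sqs:ListQueues"]
        Resource = aws_sqs_queue.queue.arn
      }
    ]
  })
}
```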

With all the permissions needed for the ApiGateway-SQS interaction created, we can start creating our endpoint.

Our endpoint will have a path attached to the root, /form_score, with an API Gateway POST method. We’ll also require that a query string parameter called unity is passed in the HTTP request. A request validator is created in order to validate the existence of this query string parameter in the request parameters.
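Assuming resource names of my own choosing, the REST API, the /form_score resource, the validator, and the POST method could be declared like this:

```hcl
resource "aws_api_gateway_rest_api" "api" {
  name = "sqs-integration-${var.environment}"
}

resource "aws_api_gateway_resource" "form_score" {
  rest_api_id = aws_api_gateway_rest_api.api.id
  parent_id   = aws_api_gateway_rest_api.api.root_resource_id
  path_part   = "form_score"
}

# Rejects requests that are missing required query string parameters
resource "aws_api_gateway_request_validator" "query_validator" {
  rest_api_id                 = aws_api_gateway_rest_api.api.id
  name                        = "query-validator"
  validate_request_parameters = true
}

resource "aws_api_gateway_method" "post" {
  rest_api_id   = aws_api_gateway_rest_api.api.id
  resource_id   = aws_api_gateway_resource.form_score.id
  http_method   = "POST"
  authorization = "NONE"
  request_parameters = {
    "method.request.querystring.unity" = true # require ?unity=...
  }
  request_validator_id = aws_api_gateway_request_validator.query_validator.id
}
```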

Now we can define the API Gateway integration that will forward records into the SQS queue. In order to forward the method, body, query parameters, and path parameters from the endpoint into the SQS message, a request_template was added.

For this integration with SQS, the HTTP request must be sent with the header Content-Type: application/x-www-form-urlencoded.
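A sketch of that integration is shown below. The VTL mapping template is a simplified illustration of forwarding the method, body, and parameters; the post's actual template may differ:

```hcl
data "aws_caller_identity" "current" {}

resource "aws_api_gateway_integration" "sqs" {
  rest_api_id             = aws_api_gateway_rest_api.api.id
  resource_id             = aws_api_gateway_resource.form_score.id
  http_method             = aws_api_gateway_method.post.http_method
  type                    = "AWS"
  integration_http_method = "POST"
  credentials             = aws_iam_role.apigateway_sqs.arn
  uri                     = "arn:aws:apigateway:${var.region}:sqs:path/${data.aws_caller_identity.current.account_id}/${aws_sqs_queue.queue.name}"

  # SQS requires this Content-Type for the SendMessage action
  request_parameters = {
    "integration.request.header.Content-Type" = "'application/x-www-form-urlencoded'"
  }

  # Wrap method, body, query and path parameters into the SQS message body
  request_templates = {
    "application/json" = <<-EOT
      Action=SendMessage&MessageBody={
        "method": "$context.httpMethod",
        "body": $input.json('$'),
        "queryParams": {
          "unity": "$input.params('unity')"
        },
        "pathParams": {
          #foreach($param in $input.params().path.keySet())
          "$param": "$util.escapeJavaScript($input.params().path.get($param))"#if($foreach.hasNext),#end
          #end
        }
      }
    EOT
  }
}
```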

The SQS response should also be mapped for successful requests. In this project, the SQS response is returned through API Gateway as-is, but you could also change it to return a custom message, for example: 'Message added to SQS successfully'. Here, we define a 200 handler for successful requests.
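That 200 handler could be expressed with a method response and a matching integration response, along these (illustrative) lines:

```hcl
resource "aws_api_gateway_method_response" "ok" {
  rest_api_id = aws_api_gateway_rest_api.api.id
  resource_id = aws_api_gateway_resource.form_score.id
  http_method = aws_api_gateway_method.post.http_method
  status_code = "200"
  response_models = {
    "application/json" = "Empty"
  }
}

resource "aws_api_gateway_integration_response" "ok" {
  rest_api_id       = aws_api_gateway_rest_api.api.id
  resource_id       = aws_api_gateway_resource.form_score.id
  http_method       = aws_api_gateway_method.post.http_method
  status_code       = aws_api_gateway_method_response.ok.status_code
  selection_pattern = "^2[0-9][0-9]" # match any 2xx from SQS

  depends_on = [aws_api_gateway_integration.sqs]
}
```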

Finally, we can add the API Gateway REST deployment in order to deploy our endpoint. A redeployment trigger was added: this configuration calculates a hash of the API’s Terraform resources to determine changes that should trigger a new deployment.
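A common way to express that trigger is to hash the IDs of the API's resources, as sketched here (the stage resource and names are my own additions):

```hcl
resource "aws_api_gateway_deployment" "api" {
  rest_api_id = aws_api_gateway_rest_api.api.id

  # Any change to these resources changes the hash and forces a redeploy
  triggers = {
    redeployment = sha1(jsonencode([
      aws_api_gateway_resource.form_score.id,
      aws_api_gateway_method.post.id,
      aws_api_gateway_integration.sqs.id,
    ]))
  }

  lifecycle {
    create_before_destroy = true
  }
}

resource "aws_api_gateway_stage" "stage" {
  deployment_id = aws_api_gateway_deployment.api.id
  rest_api_id   = aws_api_gateway_rest_api.api.id
  stage_name    = var.environment
}
```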

Lambda

When building a Lambda with Terraform, I like to split the Lambda code from the infrastructure code; it makes the project cleaner and more organized.

├── lambda: folder for lambda code
│ ├── handler.py
│ └── sqs-integration-dev-lambda.zip
├── lambda.tf
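The contents of handler.py aren't shown here, but a minimal consumer that logs each SQS record might look like this (the function name and return shape are my own assumptions):

```python
import json
import logging

logger = logging.getLogger()
logger.setLevel(logging.INFO)


def handler(event, context):
    """Process each SQS record delivered by the event source mapping."""
    records = event.get("Records", [])
    for record in records:
        # The request template wraps the original HTTP request in the message body
        body = json.loads(record["body"])
        logger.info("Received message: %s", body)
    return {"statusCode": 200, "processed": len(records)}
```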

Before building our Lambda, we have to create the necessary permissions so that it can read messages from SQS and write to CloudWatch logs.
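As with the API Gateway role, the post loads this policy from policies/lambda-permission.json; the sketch below inlines an equivalent policy with jsonencode, with illustrative names:

```hcl
resource "aws_iam_role" "lambda" {
  name = "sqs-lambda-role-${var.environment}"
  assume_role_policy = jsonencode({
    Version = "2012-10-17"
    Statement = [{
      Effect    = "Allow"
      Principal = { Service = "lambda.amazonaws.com" }
      Action    = "sts:AssumeRole"
    }]
  })
}

resource "aws_iam_role_policy" "lambda" {
  name = "sqs-lambda-policy-${var.environment}"
  role = aws_iam_role.lambda.id
  policy = jsonencode({
    Version = "2012-10-17"
    Statement = [
      {
        # Needed by the SQS event source mapping to poll the queue
        Effect   = "Allow"
        Action   = ["sqs:ReceiveMessage", "sqs:DeleteMessage", "sqs:GetQueueAttributes"]
        Resource = aws_sqs_queue.queue.arn
      },
      {
        Effect   = "Allow"
        Action   = ["logs:CreateLogGroup", "logs:CreateLogStream", "logs:PutLogEvents"]
        Resource = "*"
      }
    ]
  })
}
```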

Then, we can build our Lambda.
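Given the zip file from the lambda/ folder above, the function itself could be declared like this (runtime and handler name are assumptions matching the Python sketch):

```hcl
resource "aws_lambda_function" "consumer" {
  function_name    = "sqs-integration-${var.environment}-lambda"
  filename         = "${path.module}/lambda/sqs-integration-${var.environment}-lambda.zip"
  source_code_hash = filebase64sha256("${path.module}/lambda/sqs-integration-${var.environment}-lambda.zip")
  handler          = "handler.handler" # file handler.py, function handler
  runtime          = "python3.8"
  role             = aws_iam_role.lambda.arn
  timeout          = 10 # keep below the queue's visibility timeout
}
```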

Lastly, we need to add a permission so that SQS can invoke the Lambda, along with an event source mapping so that SQS triggers the Lambda when new messages arrive in the queue.
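Those two final pieces might be wired up as follows (batch size and statement ID are illustrative):

```hcl
resource "aws_lambda_permission" "sqs_invoke" {
  statement_id  = "AllowSQSInvoke"
  action        = "lambda:InvokeFunction"
  function_name = aws_lambda_function.consumer.function_name
  principal     = "sqs.amazonaws.com"
  source_arn    = aws_sqs_queue.queue.arn
}

# Lambda polls the queue and invokes the function with batches of messages
resource "aws_lambda_event_source_mapping" "sqs" {
  event_source_arn = aws_sqs_queue.queue.arn
  function_name    = aws_lambda_function.consumer.arn
  batch_size       = 10
  enabled          = true
}
```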

Conclusion

With everything set up, we are ready to apply all the changes to our AWS account. When deployed, we’ll have a public endpoint that will write to SQS with a Lambda function that will consume from it.

This is a powerful pipeline that can be used in many different scenarios. You can check out the full code, with extra analysis, here!



Data Science Graduate at University of Southern California | MS École CentraleSupélec | BSc PUC-Rio | Data Scientist with 3 years of industrial experience