
Run AWS Lambda and Node.js locally with docker-compose and LocalStack

Run AWS in a Docker container. AWS services can be a real headache for developers who just want to write some code. LocalStack to the rescue!

Ben Shelomovitz


AWS is an essential tool in many developers' toolboxes, but using AWS in a development environment can be tricky. Developers who just want to write some code are often frustrated by navigating the AWS console and setting up AWS services.

LocalStack to the rescue!

LocalStack is a service that mocks AWS locally in a Docker container on your computer. With LocalStack, you don't need to worry about connecting to an actual AWS server. This post will walk you through setting up a Docker container running LocalStack so that you can run AWS services locally.

If you'd like to jump straight into the final code, check out the GitHub repo. Or, keep reading for a step-by-step guide to running AWS services locally.

Make sure you have Docker and AWS CLI installed on your computer. You will be using the following AWS services: Lambda, SQS, SNS, S3, and IAM.

Setting Up LocalStack


Create a docker-compose.yml file in your root directory.

Here you will add LocalStack to services and set the environment variables.

docker-compose.yml

version: '3.8'

services:
  localstack:
    image: localstack/localstack
    container_name: localstack-example
    hostname: localstack
    ports:
      - "4566:4566"
    environment:
      # Declare which AWS services will be used in LocalStack
      - SERVICES=sqs,sns,iam,s3,lambda
      - DEBUG=1
      # These variables are needed for LocalStack
      - AWS_DEFAULT_REGION=us-east-1
      - AWS_ACCESS_KEY_ID=testUser
      - AWS_SECRET_ACCESS_KEY=testAccessKey
      - DOCKER_HOST=unix:///var/run/docker.sock
      - DATA_DIR=/tmp/localstack/data
    volumes:
      - "${TMPDIR:-/tmp}/localstack:/tmp/localstack"
      - /var/run/docker.sock:/var/run/docker.sock
      - ./create-resources.sh:/docker-entrypoint-initaws.d/create-resources.sh

In SERVICES, you declare which AWS services you will be using: in this example, sqs, sns, iam, s3, and lambda.

The AWS_DEFAULT_REGION, AWS_ACCESS_KEY_ID, and AWS_SECRET_ACCESS_KEY variables are required by LocalStack and are filled with dummy data.
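Since the credentials are dummy values, it can be convenient to keep them in a dedicated CLI profile so later commands don't clash with your real AWS configuration. A minimal sketch, where the profile name localstack is an arbitrary choice of ours, not something the tutorial requires:

```shell
# Store the same dummy values used above in a "localstack" CLI profile
aws configure set aws_access_key_id testUser --profile localstack
aws configure set aws_secret_access_key testAccessKey --profile localstack
aws configure set region us-east-1 --profile localstack
```

You can then append `--profile localstack` to any of the aws commands in this post.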

Next, create a create-resources.sh file.

create-resources.sh

#!/bin/bash
echo "All resources initialized! 🚀"

This shell script will run on container startup. Right now, it only prints a message, but you're going to fill it with commands.
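Depending on your operating system, the container may not be able to run the script unless it is executable, so it can be worth setting the executable bit as a precaution:

```shell
# Make sure LocalStack's entrypoint can execute the init script
chmod +x create-resources.sh
```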

Run docker-compose up and confirm that "All resources initialized! 🚀" appears in the output.

You can also see the status of all AWS services hosted by LocalStack by going to:

http://localhost:4566/health
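If you prefer the terminal, the same status check can be done with curl (note that the local endpoint is plain HTTP):

```shell
# Query LocalStack's health endpoint; returns JSON describing each service
curl http://localhost:4566/health
```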

That's it. You have now set up LocalStack on your computer. Now it's time to add some resources to create-resources.sh and test them out.

First create a SQS queue named testQueue:

echo "Create SQS queue testQueue"
aws \
  sqs create-queue \
  --queue-name testQueue \
  --endpoint-url http://localhost:4566

If you stop and start docker-compose, you can test that this is working by running:

aws sqs send-message --endpoint-url=http://localhost:4566 --queue-url http://localhost:4566/000000000000/testQueue --region us-east-1 --message-body 'Test Message!'

LocalStack will return:

Response from LocalStack when sending an SQS message.
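To confirm the message actually landed in the queue, you can poll it back out. This is just a sanity check, not a required step in the tutorial:

```shell
# Read the test message back from testQueue
aws sqs receive-message \
  --endpoint-url http://localhost:4566 \
  --queue-url http://localhost:4566/000000000000/testQueue \
  --region us-east-1
```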

Next create an SNS topic named testTopic and subscribe testQueue to it:

echo "Create SNS Topic testTopic"
aws \
  sns create-topic \
  --name testTopic \
  --endpoint-url http://localhost:4566
echo "Subscribe testQueue to testTopic"
aws \
  sns subscribe \
  --endpoint-url http://localhost:4566 \
  --topic-arn arn:aws:sns:us-east-1:000000000000:testTopic \
  --protocol sqs \
  --notification-endpoint arn:aws:sqs:us-east-1:000000000000:testQueue

The sns subscribe command will subscribe testQueue to testTopic. When you publish a message to testTopic, it will be passed on to testQueue.

This can be modified to accommodate different services, such as SMS or email, by changing --protocol sqs to your preferred protocol.
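To verify that the subscription was created, you can list all subscriptions LocalStack knows about:

```shell
# List SNS subscriptions; the testQueue-to-testTopic entry should appear
aws sns list-subscriptions \
  --endpoint-url http://localhost:4566 \
  --region us-east-1
```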

Let's quickly test these commands by running docker-compose down and then docker-compose up to rerun the newly updated script. You should see this in your terminal:

Output from docker-compose down and up

You can see that testQueue, testTopic and a subscription were all created and are ready to be used.

Publishing a test SNS message by running:

aws sns publish --endpoint-url=http://localhost:4566 --topic-arn arn:aws:sns:us-east-1:000000000000:testTopic --region us-east-1 --message 'Test Topic!'

should return:

Response from LocalStack when running aws sns publish

Creating Your Lambda


The first thing you will do is set up the lambda function handler. Create a src directory and then create index.js inside of it.

index.js

const dayjs = require('dayjs');

exports.handler = async function (event, context) {
  const now = dayjs();
  console.log('+*+++*+*+*+*+START*+*+*+*+*+**+*++*+*');
  console.log('EVENT OCCURRED!');
  console.log(`Message created on ${now}`);
  // Print the event that triggered the lambda
  console.log('EVENT: \n' + JSON.stringify(event, null, 2));
  console.log('+*+++*+*+*+*+*END+*+*+*+*+**+*++*+*');
  return context.logStreamName;
};

The function above receives an event and prints out the event's details, along with a timestamp generated using an external package: dayjs.

Create package.json in the src directory.

package.json

{
  "name": "localstack-tutorial",
  "dependencies": {
    "dayjs": "^1.11.0"
  }
}

Add any external dependencies your lambda function uses to the dependencies list.

AWS Lambda expects a zip file with the function handler inside. You can create this in a Dockerfile.

Create a Dockerfile and add this:

Dockerfile

FROM node:15 as lambda

ARG PORT=8000
ENV PORT=$PORT
WORKDIR /usr/src
COPY . .
# Install zip in the container
RUN apt-get update
RUN apt-get install -y zip
# Enter the src directory, install dependencies, and zip the src directory in the container
RUN cd src && npm install && zip -r lambdas.zip .

FROM localstack/localstack
# Copy lambdas.zip into the localstack directory
COPY --from=lambda /usr/src/src/lambdas.zip ./lambdas.zip

This will install any external dependencies and zip the src directory. It will then move the zip file into the LocalStack directory.
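Because the compose file is about to switch from a prebuilt image to this Dockerfile, you may need to force a rebuild whenever the Dockerfile or the lambda source changes; one way to do that:

```shell
# Rebuild the custom image before starting the container
docker-compose up --build
```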

Change your docker-compose.yml to include the Dockerfile by removing:

image: localstack/localstack 

adding:

network_mode: bridge
build:
  context: .
  dockerfile: Dockerfile

and adding - LAMBDA_EXECUTOR=local to environment.

Your docker-compose.yml should now look like this:

version: '3.8'

services:
  localstack:
    network_mode: bridge
    build:
      context: .
      dockerfile: Dockerfile
    container_name: localstack-example
    hostname: localstack
    ports:
      - "4566:4566"
    environment:
      # Declare which AWS services will be used in LocalStack
      - SERVICES=sqs,sns,iam,s3,lambda
      # These variables are needed for LocalStack
      - AWS_DEFAULT_REGION=us-east-1
      - AWS_ACCESS_KEY_ID=testUser
      - AWS_SECRET_ACCESS_KEY=testAccessKey
      - LAMBDA_EXECUTOR=local
      - DOCKER_HOST=unix:///var/run/docker.sock
      - DATA_DIR=/tmp/localstack/data
    volumes:
      - "${TMPDIR:-/tmp}/localstack:/tmp/localstack"
      - /var/run/docker.sock:/var/run/docker.sock
      - ./create-resources.sh:/docker-entrypoint-initaws.d/create-resources.sh

LAMBDA_EXECUTOR supports 3 different options:

local: runs the lambda in the current LocalStack container.

docker: creates a new lambda container every time the lambda is invoked. This is the default option.

docker-reuse: creates a new lambda container that stays open for future lambda invocations.

Back in create-resources.sh, add these commands:

echo "Create admin role"
aws \
  --endpoint-url=http://localhost:4566 \
  iam create-role \
  --role-name admin-role \
  --path / \
  --assume-role-policy-document file://admin-policy.json
echo "Make S3 bucket"
aws \
  s3 mb s3://lambda-functions \
  --endpoint-url http://localhost:4566
echo "Copy the lambda function to the S3 bucket"
aws \
  s3 cp lambdas.zip s3://lambda-functions \
  --endpoint-url http://localhost:4566

These commands will create an admin role using IAM, make an S3 bucket, and upload the lambda handler function to the bucket.
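The create-role command reads admin-policy.json, which isn't shown in this post (it lives in the repo). As a sketch, a typical trust policy allowing the Lambda service to assume the role looks like the following; LocalStack doesn't strictly enforce IAM, so the exact contents mostly just need to be valid JSON for the CLI call to succeed:

```shell
# Write a minimal IAM trust policy for the Lambda service
cat > admin-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Service": "lambda.amazonaws.com" },
      "Action": "sts:AssumeRole"
    }
  ]
}
EOF
```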

Finally, you will create the lambda function and attach an event source mapping to it.

echo "Create the lambda exampleLambda"
aws \
  lambda create-function \
  --endpoint-url=http://localhost:4566 \
  --function-name exampleLambda \
  --role arn:aws:iam::000000000000:role/admin-role \
  --code S3Bucket=lambda-functions,S3Key=lambdas.zip \
  --handler index.handler \
  --runtime nodejs10.x \
  --description "SQS Lambda handler for test sqs." \
  --timeout 60 \
  --memory-size 128
echo "Map the testQueue to the lambda function"
aws \
  lambda create-event-source-mapping \
  --function-name exampleLambda \
  --batch-size 1 \
  --event-source-arn "arn:aws:sqs:us-east-1:000000000000:testQueue" \
  --endpoint-url=http://localhost:4566
echo "All resources initialized! 🚀"

In the create-function command, the S3 bucket, the zip file, and the handler function are all declared.

In the event source mapping command, the testQueue ARN is used to trigger the lambda.

With this file, your lambda function will be ready to receive messages from testTopic and testQueue.
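You can also invoke the lambda directly, bypassing SNS and SQS, to check that the function itself was created correctly. Note that with AWS CLI v2 you may additionally need the --cli-binary-format raw-in-base64-out flag for a raw JSON payload:

```shell
# Invoke exampleLambda directly and print its response
aws lambda invoke \
  --endpoint-url=http://localhost:4566 \
  --function-name exampleLambda \
  --payload '{"source": "direct-invoke"}' \
  output.json
cat output.json
```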


Testing Your Lambda

Run docker-compose down to remove the container and docker-compose up to build the container again.

You will see the container startup, LocalStack initializing, and your create-resources.sh running in your terminal.

Once all your resources are created, run the following command in your CLI to publish an SNS message to testTopic:

aws sns publish --endpoint-url=http://localhost:4566 --topic-arn arn:aws:sns:us-east-1:000000000000:testTopic --region us-east-1 --message 'Test Topic!'

Alternatively, you can send an SQS message directly to testQueue:

aws sqs send-message --endpoint-url=http://localhost:4566 --queue-url http://localhost:4566/000000000000/testQueue --region us-east-1 --message-body 'Test Message!'

You should see that the lambda was triggered, the dependency was used, and the message was logged.

Logged message from our LocalStack AWS Lambda

Congratulations, you have successfully triggered your lambda!
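Since LAMBDA_EXECUTOR=local runs the function inside the LocalStack container, its console.log output ends up in the container logs, which you can follow with:

```shell
# Follow the LocalStack container logs to watch lambda output
docker logs -f localstack-example
```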

You now know how to create a lambda function with an external dependency, initialize LocalStack on your computer, and execute AWS commands from your CLI.

LocalStack supports many other AWS services, such as DynamoDB and API Gateway. We're AWS Lambda and Node.js experts at Bitovi, so if you have any other posts you'd like to see on these topics, let us know.

Do you have thoughts?

We'd love to hear them! Join our Community Discord to continue the conversation.