This content originally appeared on DEV Community and was authored by Antai Okon-Otoyo
Overview
This project provides a guide on how to deploy a simple item manager using a serverless architecture.
At the heart of serverless computing on AWS sits AWS Lambda, which lets you forget about provisioning and managing servers and infrastructure.
AWS Lambda is a service that lets you run code without having to manage servers. So in a nutshell, AWS handles your compute processes while you provide the code.
In our project, we are going to make use of the following AWS services to deploy our simple item manager application:
- AWS Lambda
- Amazon API Gateway
- Amazon DynamoDB
- AWS Amplify
Architecture Diagram
Project Steps
1. Setting up the Lambda Function
In the AWS Management Console, search for Lambda and then choose “Create Function”.
Deploy your code in the editor provided.
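The post does not include the function code itself, but a minimal handler for this kind of item manager might look like the sketch below. It assumes a Python runtime, a Lambda proxy integration from API Gateway, an id partition key, and a TABLE_NAME environment variable (configured in a later step); adjust it to match your own code and routes.

import json
import os
import boto3

# The table name comes from an environment variable, set in a later step.
dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table(os.environ["TABLE_NAME"])

def lambda_handler(event, context):
    # With a Lambda proxy integration, API Gateway passes the HTTP method and body in the event.
    method = event.get("httpMethod", "")
    body = json.loads(event["body"]) if event.get("body") else {}

    if method == "GET":
        result = table.scan()["Items"]  # list all items (string attributes assumed; Decimals would need converting)
    elif method in ("POST", "PUT"):
        table.put_item(Item=body)       # create or replace an item
        result = body
    elif method == "DELETE":
        table.delete_item(Key={"id": body["id"]})  # delete by the assumed "id" partition key
        result = {"deleted": body["id"]}
    else:
        return {"statusCode": 405, "body": json.dumps({"error": f"{method} not supported"})}

    return {
        "statusCode": 200,
        "headers": {"Access-Control-Allow-Origin": "*"},  # let the Amplify-hosted frontend call the API
        "body": json.dumps(result),
    }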
2. Setting up DynamoDB Table for CRUD Operations
In the AWS Management Console, search for DynamoDB and then choose “Create Table”.
The partition key in Amazon DynamoDB is similar to a primary key in relational databases, but not exactly the same.
Every DynamoDB table must have a partition key, and it is used to determine the physical storage partition where the item will live (hence the name).
Also note that if no sort key is defined, the partition key alone acts as the primary key, so it must be unique for every item; if a sort key is defined, it is the combination of partition key and sort key that must be unique.
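For illustration, here is roughly the same table definition expressed in code with boto3, assuming the ItemsTable name used in the Terraform import commands later in the post and a simple string partition key called id (the key name is an assumption); the post itself creates the table through the console.

import boto3

dynamodb = boto3.client("dynamodb")

# On-demand table with only a partition key: "id" alone is the primary key,
# so every item must have a unique "id" value.
dynamodb.create_table(
    TableName="ItemsTable",
    AttributeDefinitions=[{"AttributeName": "id", "AttributeType": "S"}],
    KeySchema=[{"AttributeName": "id", "KeyType": "HASH"}],
    BillingMode="PAY_PER_REQUEST",
)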
Now that the DynamoDB table has been created successfully, we need to go back to the Lambda function and grant the Lambda execution role access to DynamoDB to allow the function to perform CRUD operations.
Under the Configuration tab of the Lambda function, navigate to Permissions and click on the execution role, which will redirect you to the IAM role to grant the necessary permissions.
On the IAM role, search for DynamoDB and select the AmazonDynamoDBFullAccess managed policy, then click “Add permissions”.
In a production environment, it is important to apply the principle of least privilege when granting permissions. That would mean attaching an inline policy that grants only the specific DynamoDB actions your Lambda function needs for your application, scoped to your table.
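As a rough sketch of what that tighter inline policy could look like when attached with boto3 (the role name, policy name, region, and account ID are placeholders, and the action list should match whatever operations your function actually performs):

import json
import boto3

iam = boto3.client("iam")

# Grant only the DynamoDB actions the function needs, on this one table.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["dynamodb:GetItem", "dynamodb:PutItem",
                   "dynamodb:UpdateItem", "dynamodb:DeleteItem", "dynamodb:Scan"],
        "Resource": "arn:aws:dynamodb:<region>:<account-id>:table/ItemsTable",
    }],
}

iam.put_role_policy(
    RoleName="<your-lambda-execution-role>",  # placeholder: the role shown under the function's Permissions tab
    PolicyName="ItemsTableLeastPrivilege",    # hypothetical policy name
    PolicyDocument=json.dumps(policy),
)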
We will also need to set an environment variable so that our Lambda function knows which DynamoDB table to use.
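The console works fine for this, but if you prefer to script it, something along these lines does the same job (TABLE_NAME is an assumed variable name; the function and table names match the ones used in the Terraform import commands later in the post):

import boto3

lambda_client = boto3.client("lambda")

# Point the function at the table via an environment variable.
lambda_client.update_function_configuration(
    FunctionName="ItemFunction",
    Environment={"Variables": {"TABLE_NAME": "ItemsTable"}},
)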
3. Setting up API Gateway for Lambda Function
In the AWS Management Console, search for and select API Gateway, then choose “Create an API”.
We will need to choose an API type; for this project, we will use the REST API type.
Create the resource paths and the HTTP methods that will be processed through them.
After setting up the API resources, we now have to deploy the API and get our Invoke URL, which we can use to test the API and invoke our Lambda function.
Test the API to make sure it is working as expected.
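A quick way to run that test outside the console is a small script like the one below (the invoke URL is a placeholder for your own stage URL, and the /items path and item fields are assumptions carried over from the handler sketch above; requests is installed with pip install requests):

import requests

# Placeholder: replace with the Invoke URL shown after deploying the API stage.
API_URL = "https://<api-id>.execute-api.<region>.amazonaws.com/<stage>/items"

# Create an item, then list everything back.
created = requests.post(API_URL, json={"id": "1", "name": "First item"})
print(created.status_code, created.json())

items = requests.get(API_URL)
print(items.status_code, items.json())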
4. Setting up AWS Amplify to Host our Frontend
AWS Amplify is a set of tools and services provided by AWS that is designed to help developers build, deploy, and manage full-stack web and mobile applications quickly and easily.
We are going to use it in this project to host and deploy our item manager application.
In the AWS Management Console, search for and select Amplify, then choose “Deploy an App”.
If you have not used Amplify before, it will require you to connect your Git repository for deployment.
Please note that the frontend code has already been pushed to the GitHub repository, and the API Invoke URL has been included in the necessary file as a variable that is called when performing CRUD operations.
Review the deployment before saving and deploying the code.
After saving and deploying, we can confirm that the application is working as expected, as seen in the screenshot below.
Challenges
Lambda Function Configuration Issues
My initial API tests returned a 502 Bad Gateway error, pointing to an issue within the Lambda function’s execution. A quick review of the function’s configuration revealed the problem:
I had omitted a critical environment variable required for the function to resolve the correct DynamoDB table. This experience highlighted the importance of thorough environment variable management as a key step in the deployment and testing lifecycle.
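One small guard that would have surfaced the problem sooner is failing fast when the variable is missing, for example (a sketch, using the same assumed TABLE_NAME name as in the handler sketch above):

import os

# Fail at cold start with a clear log message if the variable was not configured.
TABLE_NAME = os.environ.get("TABLE_NAME")
if not TABLE_NAME:
    raise RuntimeError("TABLE_NAME environment variable is not set on the Lambda function")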
Bonus Step – Terraform Migration
After a successful deployment, I decided to experiment with Infrastructure as Code (IaC). I used Terraform to import the existing cloud resources into my Terraform state file, allowing me to manage future updates and changes through IaC.
If you would like to do the same, you can follow these steps:
- Create empty resource blocks for the resources you are trying to import (a sketch of these blocks follows the import commands below).
- Run an import command for each resource you want to bring under Terraform management. I ran the commands below (among others) to import my resources from AWS; the number of import statements you run depends on how many resources you are importing, and the Terraform documentation is great and will guide you accordingly.
$ terraform import aws_dynamodb_table.items-table ItemsTable
$ terraform import aws_lambda_function.items-function ItemFunction
$ terraform import aws_api_gateway_rest_api.ItemsAPI 9lwgudjgw4
$ terraform import aws_amplify_app.items-app d313k4wcghvl6q
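For reference, the empty resource blocks mentioned above can start out as bare shells like these, matching the resource addresses used in the import commands; the arguments get filled in afterwards from the output of terraform show.

# Bare shells that only give Terraform an address to import into;
# the real arguments are filled in later from "terraform show".
resource "aws_dynamodb_table" "items-table" {}

resource "aws_lambda_function" "items-function" {}

resource "aws_api_gateway_rest_api" "ItemsAPI" {}

resource "aws_amplify_app" "items-app" {}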
Run terraform show to see the details of the imported resources so you can fill in your empty resource blocks.
Run terraform plan to confirm that your configuration is correct and that no extra resources will be added or changed.
Run terraform apply to finish the process and confirm that your configuration and infrastructure are fully in sync with your state file.
You can now check and confirm that there are no issues with your application and resources.
The project source code can be found on my GitHub account here.