Running a container image using AWS Lambda

Background: AWS Lambda

AWS has released a long-requested feature: the ability to run packaged code as a container image on its compute service, Lambda. This capability had been available in Azure Functions for some time but was missing on AWS. It enables the development community to code, package, and deploy a container image to AWS ECR and run it using AWS Lambda. It also eliminates some of the earlier Lambda limitations on the size of packaged code.

AWS Fargate is a similar compute service that also supports running container images. However, the two differ considerably in functionality, suitable use cases, and cost.

Application Interactions: ECR, Lambda, S3

Lambda — container support

Use case: Find the record count of data files on S3

A user uploads a data file to S3, and the Lambda application has to count the number of records in the file and write the result to the same S3 location.

Install + Prerequisites: Set-up instructions

Before starting the instructions, please download and install the prerequisite software: at a minimum, Docker (to build the image) and PowerShell (for the AWS tooling used below).

Next: Install the AWS-specific tools needed to perform the Docker container tasks (i.e., build, tag, and push the image to ECR). The PowerShell commands below install the AWS Tools installer along with the required AWS modules.

Install AWS Tools using PowerShell
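
The original installation commands appear only as an image in the source post; the lines below are a minimal sketch using the AWS Tools for PowerShell installer (the specific AWS.Tools service modules listed are an assumption based on the S3 and ECR tasks in this walkthrough):

# Install the AWS.Tools installer, then the service modules used in this walkthrough
Install-Module -Name AWS.Tools.Installer -Force
Install-AWSToolsModule AWS.Tools.ECR, AWS.Tools.S3 -CleanUp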

Next: Set the access and secret keys to be used for ECR authentication in the PowerShell session.

Set the default AWS credentials in the PowerShell session
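
A minimal sketch of this step, assuming the access and secret keys have already been generated for the IAM user (the key values, profile name, and region below are placeholders):

# Store the IAM user's keys as the default credential profile and set a default region
Set-AWSCredential -AccessKey AKIAXXXXXXXXXXXXXXXX -SecretKey <your-secret-key> -StoreAs default
Set-DefaultAWSRegion -Region us-east-1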

The keys are generated for the IAM user, and the user's permissions are managed through an IAM policy. Below is a sample policy that lets the user perform actions on S3 (read/write/list) and ECR (all actions, to deploy, delete, etc.).

AWS policy for user access to S3 and ECR deployment
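
The policy document itself is shown as an image in the original post; the JSON below is an illustrative sketch only (the bucket name is a placeholder, and the broad ecr:* action and wildcard resource should be tightened for production use):

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "S3ReadWriteList",
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject", "s3:ListBucket"],
      "Resource": ["arn:aws:s3:::my-data-bucket", "arn:aws:s3:::my-data-bucket/*"]
    },
    {
      "Sid": "EcrDeployDelete",
      "Effect": "Allow",
      "Action": "ecr:*",
      "Resource": "*"
    }
  ]
}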

Build + Deploy

Next: Build the image that performs the above use case and deploy it to AWS ECR.

Python code for the record count of a file on S3
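
The handler code appears only as an image in the original post; the sketch below shows one way it could look (the file name app.py, the handler name, the one-record-per-line assumption, and the output key suffix are all assumptions, based on an S3 put event triggering the function):

# app.py: Lambda handler that counts the records in an uploaded S3 file and writes the result back
import boto3

s3 = boto3.client("s3")

def handler(event, context):
    # The S3 put event carries the bucket and object key of the uploaded file
    s3_record = event["Records"][0]["s3"]
    bucket = s3_record["bucket"]["name"]
    key = s3_record["object"]["key"]

    # Read the file and count its records (assumed to be one record per line)
    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
    count = len(body.splitlines())

    # Write the result next to the input file (the output key name is an assumption)
    s3.put_object(Bucket=bucket, Key=key + ".count.txt", Body=str(count).encode("utf-8"))
    return {"bucket": bucket, "key": key, "record_count": count}

Note that writing the output to the same bucket can retrigger the function unless the S3 event notification is filtered, for example by key prefix or suffix.
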
Dockerfile to build the image
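
A minimal sketch of the Dockerfile, assuming the AWS-provided Python base image for Lambda and the file names used above (app.py and requirements.txt):

FROM public.ecr.aws/lambda/python:3.8

# Install the Python dependencies into the Lambda task root
COPY requirements.txt .
RUN pip install -r requirements.txt --target "${LAMBDA_TASK_ROOT}"

# Copy the handler code and point the container at app.handler
COPY app.py "${LAMBDA_TASK_ROOT}"
CMD ["app.handler"]
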
Packages to be installed on the base AWS Linux/Python image
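
The actual package list is not visible in the extracted text; for the sketch above, requirements.txt only needs the AWS SDK (which the base image already bundles, so listing it mainly pins the version):

boto3
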
Build the image using PowerShell and authenticate to push it to ECR
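
A sketch of the build-and-push step, assuming an ECR repository named s3-record-count already exists and using Get-ECRLoginCommand from the AWS.Tools.ECR module (the account ID, region, and repository name are placeholders):

# Build and tag the image locally
docker build -t s3-record-count .
docker tag s3-record-count:latest <account-id>.dkr.ecr.us-east-1.amazonaws.com/s3-record-count:latest

# Authenticate Docker to ECR with the credentials set earlier, then push the image
(Get-ECRLoginCommand).Password | docker login --username AWS --password-stdin <account-id>.dkr.ecr.us-east-1.amazonaws.com
docker push <account-id>.dkr.ecr.us-east-1.amazonaws.com/s3-record-count:latest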

Configure + Execute
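
The original post walks through this step with console screenshots; as a hedged sketch, the function can also be created from the ECR image with the AWS CLI along the following lines (the function name, role ARN, timeout, and memory size are assumptions):

# Create the Lambda function from the container image in ECR
aws lambda create-function --function-name s3-record-count `
  --package-type Image `
  --code ImageUri=<account-id>.dkr.ecr.us-east-1.amazonaws.com/s3-record-count:latest `
  --role arn:aws:iam::<account-id>:role/<lambda-execution-role> `
  --timeout 300 --memory-size 1024

Once the function exists, add an S3 event notification on the bucket as the trigger; uploading a data file then invokes the handler, and the record count appears as a new object in the same location.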

Conclusion

AWS Lambda container images can now be up to 10 GB in size. This opens up many possibilities for packaging complex machine learning, analytics, and data engineering components that would have been difficult earlier under the native event-based handler and its smaller package limits.

It is also true that, since the container runs as a packaged compute image, we cannot rely on fully dynamic environment variables; every task has to be driven by existing resources such as files on S3, SQS events, etc.

AWS Lambda on containers is better suited to smaller tasks and applications that run for less than 900 seconds (the 15-minute Lambda timeout) and need less than 10 GB of RAM. AWS Fargate can run long-running jobs; however, it is billed by the hour for as long as the tasks run, even when there is no workload on the servers.

Hence, choosing between Fargate and Lambda on containers involves multiple dimensions that need to be considered.

