Running a container image using AWS Lambda
Background : AWS Lambda
AWS released a long-requested feature: the ability to run packaged code (a container image) using its compute service, Lambda. This capability has been available in Azure Functions for some time but was missing on AWS. It enables the development community to code, package, and deploy a container image to AWS ECR and run it using AWS Lambda. It also eliminates some of the earlier Lambda limitations on the size of the packaged code.
AWS Fargate is a similar compute service that also supports running container images. However, the two have important similarities and differences in functionality, use cases, and cost.
Application Interactions : ECR, Lambda, S3

Use case : Find the record count of data files on S3
A user uploads a data file to S3, and the Lambda application has to count the number of records in the file and write the result back to the same S3 location.
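The article does not show the function code itself, but a minimal sketch of such a handler is shown below. It assumes a Python runtime and that the bucket and key arrive in the invocation payload; the event field names and the output key suffix are illustrative assumptions, not part of the original write-up.

```python
# app.py - minimal sketch of the record-count handler (Python runtime assumed)
import boto3

s3 = boto3.client("s3")

def handler(event, context):
    # The bucket and key are assumed to arrive in the invocation payload;
    # they could equally be pulled from an S3 event notification record.
    bucket = event["bucket"]
    key = event["key"]

    # Stream the object and count its lines/records.
    body = s3.get_object(Bucket=bucket, Key=key)["Body"]
    record_count = sum(1 for _ in body.iter_lines())

    # Write the result back to the same S3 location, next to the input file.
    s3.put_object(
        Bucket=bucket,
        Key=key + ".count.txt",
        Body=str(record_count).encode("utf-8"),
    )
    return {"bucket": bucket, "file": key, "records": record_count}
```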
Install + Prerequisites : Set-up instructions
Before starting the instructions, please download and install the software below.
- Visual Studio Code (https://code.visualstudio.com/Download)
- Docker Desktop for Windows (https://desktop.docker.com/win/stable/Docker%20Desktop%20Installer.exe)
- PowerShell (installed by default on Windows machines)
Next : Install the AWS-specific tools needed for the Docker container tasks (i.e. build, tag, and push the image to ECR). The PowerShell commands below install the AWS Tools for PowerShell and the required AWS modules.
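A sketch of those commands, assuming the modular AWS.Tools distribution of the AWS Tools for PowerShell (adjust the module list to the services you use):

```powershell
# Install the AWS.Tools installer module, then the service modules used in this walkthrough
Install-Module -Name AWS.Tools.Installer -Scope CurrentUser -Force

# ECR for the image registry, S3 for the data files, Lambda to create and invoke the function
Install-AWSToolsModule AWS.Tools.ECR, AWS.Tools.S3, AWS.Tools.Lambda -CleanUp
```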
Next : Set the access and secret keys to be used for ECR authentication in the PowerShell session.
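For example (the key values and region below are placeholders):

```powershell
# Store the access and secret keys for this PowerShell session (placeholder values)
Set-AWSCredential -AccessKey 'AKIAXXXXXXXXXXXXXXXX' -SecretKey 'xxxxxxxxxxxxxxxxxxxx'

# Default region used by the ECR, S3 and Lambda cmdlets that follow
Set-DefaultAWSRegion -Region us-east-1
```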
The keys are generated and managed using an AWS IAM policy for the user. Below is a sample policy that allows the user to perform actions on S3 (read/write/list) and ECR (all actions needed to deploy, delete, etc.).
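A sample policy along those lines could look like the following (the bucket name is a placeholder, and the ECR statement is intentionally broad; scope it down for production use):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "S3ReadWriteList",
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::record-count-bucket",
        "arn:aws:s3:::record-count-bucket/*"
      ]
    },
    {
      "Sid": "ECRAllActions",
      "Effect": "Allow",
      "Action": "ecr:*",
      "Resource": "*"
    }
  ]
}
```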
Build + Deploy
Next : Build the image that performs the above use case and deploy it to AWS ECR.
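A sketch of the image definition, assuming the Python handler from the use-case section above (the file names app.py and requirements.txt are illustrative):

```dockerfile
# Build on the AWS-provided Lambda base image for Python
FROM public.ecr.aws/lambda/python:3.12

# Install any extra dependencies (boto3 already ships in the base image)
COPY requirements.txt ${LAMBDA_TASK_ROOT}
RUN pip install -r requirements.txt

# Copy the handler and tell the Lambda runtime which function to call
COPY app.py ${LAMBDA_TASK_ROOT}
CMD [ "app.handler" ]
```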

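The repository creation, ECR login, and build/tag/push steps can then be run from the same PowerShell session (the account ID, region, and repository name below are placeholders):

```powershell
# Create the ECR repository (one-time step)
New-ECRRepository -RepositoryName record-count-lambda

# Authenticate the Docker client against ECR using the credentials set earlier
(Get-ECRLoginCommand).Password | docker login --username AWS --password-stdin 111122223333.dkr.ecr.us-east-1.amazonaws.com

# Build, tag and push the image
docker build -t record-count-lambda .
docker tag record-count-lambda:latest 111122223333.dkr.ecr.us-east-1.amazonaws.com/record-count-lambda:latest
docker push 111122223333.dkr.ecr.us-east-1.amazonaws.com/record-count-lambda:latest
```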
Configure + Execute
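The function can be created and tested from the Lambda console; a sketch of the equivalent steps using the AWS.Tools.Lambda cmdlets is shown below (the function name, role ARN, image URI, bucket, and payload are placeholders assumed for illustration):

```powershell
# Create the Lambda function from the container image pushed to ECR
New-LMFunction -FunctionName record-count `
    -PackageType Image `
    -Code_ImageUri 111122223333.dkr.ecr.us-east-1.amazonaws.com/record-count-lambda:latest `
    -Role arn:aws:iam::111122223333:role/record-count-lambda-role `
    -Timeout 300 -MemorySize 1024

# Invoke the function against a file already uploaded to S3 and decode the response payload
$response = Invoke-LMFunction -FunctionName record-count -Payload '{"bucket": "record-count-bucket", "key": "input/data.csv"}'
[System.Text.Encoding]::UTF8.GetString($response.Payload.ToArray())

# The result file should now appear next to the input file
Get-S3Object -BucketName record-count-bucket -KeyPrefix input/
```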

Conclusion
AWS Lambda container images can now be up to 10 GB in size. This opens up many possibilities for packaging complex machine learning, analytics, and data engineering components that would have been difficult earlier under the native event-based handlers and their package-size limits.
It is also true that, since the container image runs as packaged compute, we cannot pass dynamic, per-invocation environment variables; every task has to be driven by existing resources such as a file on S3, SQS events, etc.
AWS Lambda on containers is better suited to smaller tasks and applications that run for less than 900 seconds (the 15-minute Lambda timeout) and need less than 10 GB of RAM. AWS Fargate can run long-running jobs; however, cost is incurred for as long as the task runs, even when there is no workload on the servers.
Hence, choosing between Fargate and Lambda on containers involves multiple dimensions that need to be considered.