AWS Certified Developer – Associate / Question #976

Question #976

A developer is deploying a serverless data processing application on AWS Lambda. The application requires a 600 MB machine learning model file to execute its core functionality. Which solution will meet these requirements?

A

Compress the application code and model file into a .zip file. Directly upload the .zip file as a deployment package for the Lambda function without external storage.

B

Compress the application code and model file into a .zip file. Upload the .zip file to an Amazon S3 bucket. Configure the Lambda function to reference the .zip file in S3 during execution.

C

Package the application code and model file into a container image. Upload the image to an Amazon S3 bucket. Configure the Lambda function to pull the image from S3 for deployment.

D

Package the application code and model file into a container image. Push the image to an Amazon Elastic Container Registry (Amazon ECR) repository. Deploy the image to the Lambda function.

Explanation

The correct answer is D because:

- Lambda deployment limits: A .zip deployment package is limited to 50 MB for direct upload and 250 MB unzipped. A 600 MB model file exceeds both limits, ruling out options A and B.
- Container image support: Lambda supports container images of up to 10 GB, which accommodates large dependencies such as model files. Option D packages the code and model into a container image stored in Amazon ECR, the registry Lambda requires for container deployments (see the sketch after this list).
- Why the other options fail:
  - A: Direct .zip uploads are capped at 50 MB.
  - B: Even when the .zip file is uploaded to S3, the unzipped size (600 MB) still exceeds the 250 MB limit.
  - C: Lambda cannot pull container images from S3; they must be stored in Amazon ECR.
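
To illustrate option D, here is a minimal deployment sketch using boto3. It assumes the container image (built from a Lambda base image with the application code and model file copied in) has already been pushed to ECR; the function name, image URI, and role ARN are placeholders.

```python
# Minimal sketch (assumed names): create a Lambda function from a container
# image already pushed to Amazon ECR. The image bundles the application code
# and the 600 MB model file, staying well under Lambda's 10 GB image limit.
import boto3

lambda_client = boto3.client("lambda")

response = lambda_client.create_function(
    FunctionName="ml-data-processor",   # placeholder function name
    PackageType="Image",                # deploy from a container image, not a .zip
    Code={
        # placeholder ECR image URI; the image must live in ECR, not S3
        "ImageUri": "123456789012.dkr.ecr.us-east-1.amazonaws.com/ml-processor:latest"
    },
    Role="arn:aws:iam::123456789012:role/lambda-exec-role",  # placeholder role ARN
    MemorySize=3008,   # enough memory to load a large model
    Timeout=300,       # allow time for model loading and processing
)
print(response["FunctionArn"])
```

Note that when `PackageType` is `Image`, the `Runtime` and `Handler` parameters are omitted; the entry point comes from the image itself.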

Key points: Use Lambda container images (stored in Amazon ECR) when code and dependencies exceed the 250 MB unzipped limit. Uploading a .zip file to S3 only raises the direct-upload limit; it does not lift the 250 MB unzipped limit.
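
Because the model file is baked into the container image, the handler reads it from the local filesystem rather than downloading it at invocation time. The following sketch assumes a Python runtime, a pickled model, and a placeholder path inside the image; the question does not specify the framework.

```python
# Handler sketch (assumed path and serialization format): the model ships
# inside the container image and is loaded from the local filesystem.
import json
import pickle  # assumed serialization format

MODEL_PATH = "/opt/ml/model.bin"  # placeholder path inside the image

# Load once per execution environment (cold start) and reuse across invocations.
with open(MODEL_PATH, "rb") as f:
    model = pickle.load(f)

def handler(event, context):
    records = event.get("records", [])
    predictions = [model.predict(r) for r in records]  # assumed model API
    return {"statusCode": 200, "body": json.dumps({"count": len(predictions)})}
```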

Answer

The correct answer is: D