
Exam AWS Certified Solutions Architect - Associate SAA-C03 topic 1 question 422 discussion

A company is developing a new machine learning (ML) model solution on AWS. The models are developed as independent microservices that fetch approximately 1 GB of model data from Amazon S3 at startup and load the data into memory. Users access the models through an asynchronous API. Users can send a request or a batch of requests and specify where the results should be sent.

The company provides models to hundreds of users. The usage patterns for the models are irregular. Some models could be unused for days or weeks. Other models could receive batches of thousands of requests at a time.

Which design should a solutions architect recommend to meet these requirements?

  • A. Direct the requests from the API to a Network Load Balancer (NLB). Deploy the models as AWS Lambda functions that are invoked by the NLB.
  • B. Direct the requests from the API to an Application Load Balancer (ALB). Deploy the models as Amazon Elastic Container Service (Amazon ECS) services that read from an Amazon Simple Queue Service (Amazon SQS) queue. Use AWS App Mesh to scale the instances of the ECS cluster based on the SQS queue size.
  • C. Direct the requests from the API into an Amazon Simple Queue Service (Amazon SQS) queue. Deploy the models as AWS Lambda functions that are invoked by SQS events. Use AWS Auto Scaling to increase the number of vCPUs for the Lambda functions based on the SQS queue size.
  • D. Direct the requests from the API into an Amazon Simple Queue Service (Amazon SQS) queue. Deploy the models as Amazon Elastic Container Service (Amazon ECS) services that read from the queue. Enable AWS Auto Scaling on Amazon ECS for both the cluster and copies of the service based on the queue size.
Suggested Answer: D
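
For readers who want to see what option D could look like in practice, below is a minimal sketch, assuming boto3 and placeholder cluster, service, and metric names (none of which come from the exam question): the ECS service's desired task count is registered with Application Auto Scaling, and a target tracking policy scales it against a queue-backlog metric.

```python
# Hypothetical sketch of option D's scaling setup: an ECS service scaled by
# Application Auto Scaling against a CloudWatch metric derived from the SQS
# queue. Cluster/service/metric names are placeholders, not from the exam.
import boto3

autoscaling = boto3.client("application-autoscaling")

# Register the ECS service's desired task count as a scalable target.
autoscaling.register_scalable_target(
    ServiceNamespace="ecs",
    ResourceId="service/ml-models-cluster/model-a-service",
    ScalableDimension="ecs:service:DesiredCount",
    MinCapacity=0,   # models can sit idle for days or weeks
    MaxCapacity=50,  # absorb bursts of thousands of requests
)

# Target tracking on a custom "backlog per task" metric keeps the number of
# queued messages per running task near the target value.
autoscaling.put_scaling_policy(
    PolicyName="sqs-backlog-target-tracking",
    ServiceNamespace="ecs",
    ResourceId="service/ml-models-cluster/model-a-service",
    ScalableDimension="ecs:service:DesiredCount",
    PolicyType="TargetTrackingScaling",
    TargetTrackingScalingPolicyConfiguration={
        "TargetValue": 100.0,  # acceptable messages per task
        "CustomizedMetricSpecification": {
            "MetricName": "BacklogPerTask",
            "Namespace": "MLModels",
            "Dimensions": [{"Name": "Service", "Value": "model-a-service"}],
            "Statistic": "Average",
        },
        "ScaleInCooldown": 300,
        "ScaleOutCooldown": 60,
    },
)
```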

Comments

examtopictempacc
Highly Voted 1 year, 6 months ago
asynchronous=SQS, microservices=ECS. Use AWS Auto Scaling to adjust the number of ECS services.
upvoted 15 times
TariqKipkemei
1 year, 5 months ago
good breakdown :)
upvoted 3 times
TariqKipkemei
Highly Voted 1 year, 5 months ago
Selected Answer: D
For once the ExamTopics answer is correct :) haha... Batch requests/async = Amazon SQS; microservices = Amazon ECS; workload variations = AWS Auto Scaling on Amazon ECS.
upvoted 9 times
wizcloudifa
Most Recent 7 months, 1 week ago
Selected Answer: D
The ALB is mentioned in other options to distract you; you don't need an ALB for scaling here, you need ECS auto scaling. Option B plays with that idea a bit, but D handles it in a fully optimized way. A and C both use Lambda, which will not fly for ML models with heavy workloads.
upvoted 1 times
Guru4Cloud
1 year, 2 months ago
Selected Answer: D
I go with everyone on D.
upvoted 2 times
alexandercamachop
1 year, 5 months ago
Selected Answer: D
D. There is no need for an Application Load Balancer as in option B; it appears nowhere in the requirements. SQS is needed so that every request gets routed properly in a microservices architecture and waits until it is picked up. ECS with auto scaling will scale to match the irregular usage pattern described.
upvoted 1 times
anibinaadi
1 year, 5 months ago
It is D. Refer to https://aws.amazon.com/blogs/containers/amazon-elastic-container-service-ecs-auto-scaling-using-custom-metrics/ for additional information.
upvoted 1 times
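
The blog post linked above scales ECS on a custom "backlog per task" metric. Here is a rough sketch of how such a metric could be published to CloudWatch, assuming hypothetical queue and service names; it is meant to pair with a target tracking policy like the one sketched under the suggested answer.

```python
# Minimal sketch of the custom-metric approach: periodically publish
# "queue backlog per running task" so a target tracking policy can act on it.
# Queue URL, cluster, and service names are illustrative assumptions.
import boto3

sqs = boto3.client("sqs")
ecs = boto3.client("ecs")
cloudwatch = boto3.client("cloudwatch")

QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/model-a-requests"

def publish_backlog_per_task() -> None:
    # Approximate number of messages waiting in the queue.
    attrs = sqs.get_queue_attributes(
        QueueUrl=QUEUE_URL,
        AttributeNames=["ApproximateNumberOfMessages"],
    )
    backlog = int(attrs["Attributes"]["ApproximateNumberOfMessages"])

    # Number of tasks currently running for the model's ECS service.
    service = ecs.describe_services(
        cluster="ml-models-cluster", services=["model-a-service"]
    )["services"][0]
    running = max(service["runningCount"], 1)  # avoid division by zero

    cloudwatch.put_metric_data(
        Namespace="MLModels",
        MetricData=[{
            "MetricName": "BacklogPerTask",
            "Dimensions": [{"Name": "Service", "Value": "model-a-service"}],
            "Value": backlog / running,
            "Unit": "Count",
        }],
    )
```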
nosense
1 year, 6 months ago
Selected Answer: D
Because it is scalable, reliable, and efficient. C does not scale the models automatically.
upvoted 3 times
deechean
1 year, 2 months ago
Why doesn't C scale the model? Application Auto Scaling can apply to Lambda.
upvoted 1 times
NSA_Poker
5 months, 3 weeks ago
Auto Scaling doesn't apply to Lambda. As your functions receive more requests, Lambda automatically handles scaling the number of execution environments until you reach your account's concurrency limit.
upvoted 1 times
pentium75
10 months, 3 weeks ago
How would you "use Auto Scaling (!) to increase the number of vCPUs (!) for the Lambda functions"?
upvoted 2 times
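
To illustrate the point: with Lambda, CPU is allocated in proportion to the configured memory size, and SQS-driven scaling is expressed as invocation concurrency rather than vCPU counts, so option C's wording doesn't map onto how Lambda is actually configured. The sketch below assumes boto3 and uses placeholder function and mapping identifiers.

```python
# Hypothetical sketch: the only "CPU" knob on Lambda is memory size, and
# SQS-driven scale-out is bounded by concurrency on the event source mapping.
# Function name and mapping UUID are placeholders.
import boto3

lam = boto3.client("lambda")

# More memory => proportionally more vCPU for each invocation.
lam.update_function_configuration(
    FunctionName="model-a-inference",
    MemorySize=3008,  # MB; CPU share scales with this value
)

# Scale-out is capped as concurrency on the SQS event source mapping,
# not adjusted by AWS Auto Scaling.
lam.update_event_source_mapping(
    UUID="00000000-0000-0000-0000-000000000000",  # placeholder mapping ID
    ScalingConfig={"MaximumConcurrency": 100},
)
```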
Community vote distribution: A (35%), C (25%), B (20%), Other