Exam AWS Certified Solutions Architect - Associate SAA-C03 topic 1 question 94 discussion

A company is designing an application where users upload small files into Amazon S3. After a user uploads a file, the file requires one-time simple processing to transform the data and save the data in JSON format for later analysis.
Each file must be processed as quickly as possible after it is uploaded. Demand will vary. On some days, users will upload a high number of files. On other days, users will upload a few files or no files.
Which solution meets these requirements with the LEAST operational overhead?

  • A. Configure Amazon EMR to read text files from Amazon S3. Run processing scripts to transform the data. Store the resulting JSON file in an Amazon Aurora DB cluster.
  • B. Configure Amazon S3 to send an event notification to an Amazon Simple Queue Service (Amazon SQS) queue. Use Amazon EC2 instances to read from the queue and process the data. Store the resulting JSON file in Amazon DynamoDB.
  • C. Configure Amazon S3 to send an event notification to an Amazon Simple Queue Service (Amazon SQS) queue. Use an AWS Lambda function to read from the queue and process the data. Store the resulting JSON file in Amazon DynamoDB.
  • D. Configure Amazon EventBridge (Amazon CloudWatch Events) to send an event to Amazon Kinesis Data Streams when a new file is uploaded. Use an AWS Lambda function to consume the event from the stream and process the data. Store the resulting JSON file in an Amazon Aurora DB cluster.
Suggested Answer: C
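The suggested answer's first step is wiring S3 event notifications to an SQS queue. Below is a minimal boto3 sketch of that wiring, using hypothetical resource names (uploads-bucket, file-processing-queue); note that the queue policy must allow S3 to send messages before the notification configuration takes effect.

```python
import json

import boto3

s3 = boto3.client("s3")
sqs = boto3.client("sqs")

BUCKET = "uploads-bucket"  # hypothetical bucket name
queue_url = sqs.get_queue_url(QueueName="file-processing-queue")["QueueUrl"]
queue_arn = sqs.get_queue_attributes(
    QueueUrl=queue_url, AttributeNames=["QueueArn"]
)["Attributes"]["QueueArn"]

# The queue policy must allow S3 to send messages, or the notification setup fails.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {"Service": "s3.amazonaws.com"},
            "Action": "sqs:SendMessage",
            "Resource": queue_arn,
            "Condition": {"ArnLike": {"aws:SourceArn": f"arn:aws:s3:::{BUCKET}"}},
        }
    ],
}
sqs.set_queue_attributes(QueueUrl=queue_url, Attributes={"Policy": json.dumps(policy)})

# Send an event to the queue for every object created in the bucket.
s3.put_bucket_notification_configuration(
    Bucket=BUCKET,
    NotificationConfiguration={
        "QueueConfigurations": [
            {"QueueArn": queue_arn, "Events": ["s3:ObjectCreated:*"]}
        ]
    },
)
```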

Comments

rjam
Highly Voted 2 years ago
Option C. DynamoDB is a NoSQL database with native JSON support.
upvoted 17 times
rjam
2 years ago
Also, using AWS Lambda (serverless) means less operational overhead.
upvoted 12 times
...
...
cookieMr
Highly Voted 1 year, 5 months ago
Selected Answer: C
A. Configuring EMR and an Aurora DB cluster for this use case would introduce unnecessary complexity and operational overhead. EMR is typically used for processing large datasets and running big data frameworks like Apache Spark or Hadoop.
B. While using S3 event notifications and SQS for decoupling is a good approach, using EC2 to process the data would introduce operational overhead in terms of managing and scaling the EC2 instances.
D. Using EventBridge and Kinesis Data Streams for this use case would introduce additional complexity and operational overhead compared to the other options. EventBridge and Kinesis are typically used for real-time streaming and processing of large volumes of data.
In summary, option C is the recommended solution, as it provides a serverless and scalable approach for processing uploaded files using S3 event notifications, SQS, and Lambda. It offers low operational overhead, automatic scaling, and efficient handling of varying demand. Storing the resulting JSON file in DynamoDB aligns with the requirement of saving the data for later analysis.
upvoted 10 times
...
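To make the Lambda step in the summary above concrete, here is a minimal sketch of an SQS-triggered handler, assuming a hypothetical DynamoDB table named processed-files (partition key file_key) and a placeholder transform; the real transformation logic would depend on the file format.

```python
import json
import urllib.parse

import boto3

s3 = boto3.client("s3")
table = boto3.resource("dynamodb").Table("processed-files")  # hypothetical table name


def handler(event, context):
    # Each SQS record's body is an S3 event notification (a JSON string).
    for record in event["Records"]:
        s3_event = json.loads(record["body"])
        # S3 "TestEvent" messages have no Records key, so default to an empty list.
        for s3_record in s3_event.get("Records", []):
            bucket = s3_record["s3"]["bucket"]["name"]
            key = urllib.parse.unquote_plus(s3_record["s3"]["object"]["key"])

            # One-time simple processing: read the small file and transform it.
            body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
            transformed = {"source_key": key, "lines": body.splitlines()}  # placeholder transform

            # Save the resulting JSON document for later analysis.
            table.put_item(Item={"file_key": key, "data": transformed})
```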
PaulGa
Most Recent 2 months ago
Selected Answer: C
Ans C - as per cookieMr (1 yr, 2 mth ago) "...In summary, option C is the recommended solution as it provides a serverless and scalable approach for processing uploaded files using S3 event notifications, SQS, and Lambda. It offers low operational overhead, automatic scaling, and efficient handling of varying demand. Storing the resulting JSON file in DynamoDB aligns with the requirement of saving the data for later analysis."
upvoted 2 times
...
jaradat02
4 months ago
Selected Answer: C
Option C fulfills the least-operational-overhead condition.
upvoted 2 times
...
TilTil
8 months, 1 week ago
Selected Answer: C
Option B, where we use EC2 instances for processing, would be ideal in situations where the runtime is greater than Lambda's 15-minute limit. However, the question mentions 'simple processing', hence we go with Lambda.
upvoted 3 times
...
awsgeek75
10 months, 1 week ago
Selected Answer: C
LEAST operational overhead:
A: EMR is a massive programming effort for this.
B: EC2 is considerable overhead.
D: A nice solution, but why would you use Kinesis when there is no streaming scenario here?
C: Simplest, and all managed services, so the least operational overhead compared to the other options.
upvoted 2 times
...
Modulopi
1 year, 1 month ago
Selected Answer: C
C: Lambdas are made for that
upvoted 1 times
...
TariqKipkemei
1 year, 3 months ago
Selected Answer: C
C is best
upvoted 1 times
...
Guru4Cloud
1 year, 3 months ago
Selected Answer: C
Option C is the best solution that meets the requirements with the least operational overhead:
- Configure Amazon S3 to send an event notification to an SQS queue
- Use a Lambda function triggered by SQS to process each file
- Store the output JSON in DynamoDB
This leverages serverless components (S3, SQS, Lambda, and DynamoDB) to provide automated file processing without needing to provision and manage servers. SQS queues the notifications, and Lambda scales automatically to handle spikes and drops in file uploads. No EMR cluster or EC2 fleet needs to be managed.
upvoted 2 times
...
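As a rough sketch of the "Lambda scales automatically" point above: the SQS queue is attached to the function with an event source mapping, and the batch size (optionally with a maximum concurrency cap) controls how Lambda polls the queue. A hedged boto3 example, assuming hypothetical names file-processing-queue and process-upload:

```python
import boto3

lambda_client = boto3.client("lambda")
sqs = boto3.client("sqs")

queue_url = sqs.get_queue_url(QueueName="file-processing-queue")["QueueUrl"]
queue_arn = sqs.get_queue_attributes(
    QueueUrl=queue_url, AttributeNames=["QueueArn"]
)["Attributes"]["QueueArn"]

# Lambda polls the queue and invokes the function with batches of messages;
# concurrency grows and shrinks with queue depth, with no servers to manage.
lambda_client.create_event_source_mapping(
    EventSourceArn=queue_arn,
    FunctionName="process-upload",            # hypothetical function name
    BatchSize=10,                              # up to 10 S3 notifications per invocation
    ScalingConfig={"MaximumConcurrency": 50},  # optional cap on concurrent batches
)
```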
beginnercloud
1 year, 6 months ago
Selected Answer: C
Option C is correct. DynamoDB is a NoSQL database with native JSON support.
upvoted 1 times
...
Abrar2022
1 year, 6 months ago
Selected Answer: C
SQS + Lambda + JSON >>>>>> DynamoDB
upvoted 2 times
...
Bmarodi
1 year, 6 months ago
Selected Answer: C
Option C is the right answer.
upvoted 1 times
...
jy190
1 year, 6 months ago
Can someone explain why SQS? It's poll-based messaging; does it guarantee reacting to the event ASAP?
upvoted 1 times
...
Zerotn3
1 year, 10 months ago
Selected Answer: C
DynamoDB is a NoSQL database with native JSON support.
upvoted 1 times
...
Buruguduystunstugudunstuy
1 year, 11 months ago
Selected Answer: C
Option C, Configuring Amazon S3 to send an event notification to an Amazon Simple Queue Service (SQS) queue and using an AWS Lambda function to read from the queue and process the data, would likely be the solution with the least operational overhead. AWS Lambda is a serverless computing service that allows you to run code without the need to provision or manage infrastructure. When a new file is uploaded to Amazon S3, it can trigger an event notification which sends a message to an SQS queue. The Lambda function can then be set up to be triggered by messages in the queue, and it can process the data and store the resulting JSON file in Amazon DynamoDB.
upvoted 4 times
Buruguduystunstugudunstuy
1 year, 11 months ago
Using a serverless solution like AWS Lambda can help to reduce operational overhead because it automatically scales to meet demand and does not require you to provision and manage infrastructure. Additionally, using an SQS queue as a buffer between the S3 event notification and the Lambda function can help to decouple the processing of the data from the uploading of the data, allowing the processing to happen asynchronously and improving the overall efficiency of the system.
upvoted 2 times
...
...
career360guru
1 year, 11 months ago
Selected Answer: C
Option C, as JSON is supported by DynamoDB; RDS and Aurora are not as well suited for JSON data. Not A, because this is not a big data analytics use case.
upvoted 2 times
...
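Several comments point to DynamoDB's JSON support. As a small illustration (using the same hypothetical processed-files table with partition key file_key), the boto3 resource API maps nested Python dicts and lists directly to DynamoDB map and list types, so a JSON document can be stored as a single item:

```python
import boto3

table = boto3.resource("dynamodb").Table("processed-files")  # hypothetical table

# A nested, JSON-shaped document is stored natively as DynamoDB map/list types.
table.put_item(
    Item={
        "file_key": "uploads/2024/report-001.txt",  # hypothetical partition key value
        "data": {
            "record_count": 42,
            "fields": ["id", "name", "amount"],
            "valid": True,
        },
    }
)

# Read it back as plain Python types (numbers are returned as Decimal).
item = table.get_item(Key={"file_key": "uploads/2024/report-001.txt"})["Item"]
print(item["data"]["fields"])
```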
Community vote distribution: A (35%), C (25%), B (20%), Other