
Exam AWS Certified Solutions Architect - Associate SAA-C03 topic 1 question 819 discussion

A company’s application is receiving data from multiple data sources. The size of the data varies and is expected to increase over time. The current maximum size is 700 KB. The data volume and data size continue to grow as more data sources are added.

The company decides to use Amazon DynamoDB as the primary database for the application. A solutions architect needs to identify a solution that handles the large data sizes.

Which solution will meet these requirements in the MOST operationally efficient way?

  • A. Create an AWS Lambda function to filter the data that exceeds DynamoDB item size limits. Store the larger data in an Amazon DocumentDB (with MongoDB compatibility) database.
  • B. Store the large data as objects in an Amazon S3 bucket. In a DynamoDB table, create an item that has an attribute that points to the S3 URL of the data.
  • C. Split all incoming large data into a collection of items that have the same partition key. Write the data to a DynamoDB table in a single operation by using the BatchWriteItem API operation.
  • D. Create an AWS Lambda function that uses gzip compression to compress the large objects as they are written to a DynamoDB table.
Suggested Answer: B
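For context, a single DynamoDB item is capped at 400 KB, so the 700 KB (and growing) payloads cannot be stored directly; option B keeps the payload in S3 and stores only a pointer in the item. Below is a minimal sketch of that pattern using boto3; the bucket name, table name, and attribute names are hypothetical.

```python
import uuid

import boto3

s3 = boto3.client("s3")
dynamodb = boto3.resource("dynamodb")

# Hypothetical bucket and table names for this sketch.
BUCKET = "example-large-payloads"
table = dynamodb.Table("ApplicationData")


def store_record(source_id: str, payload: bytes) -> str:
    """Store the payload in S3 and keep only a pointer in DynamoDB."""
    object_key = f"payloads/{source_id}/{uuid.uuid4()}.bin"

    # The large payload (up to 700 KB and growing) goes to S3,
    # which has no practical size constraint at this scale.
    s3.put_object(Bucket=BUCKET, Key=object_key, Body=payload)

    # The DynamoDB item stays small: metadata plus a pointer to the object.
    table.put_item(
        Item={
            "pk": source_id,
            "sk": object_key,
            "s3_url": f"s3://{BUCKET}/{object_key}",
            "size_bytes": len(payload),
        }
    )
    return object_key
```

Readers query the small DynamoDB item first, then fetch the object from S3 via the stored key, so only the pointer item counts against DynamoDB's item size limit.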

Comments

Neung983
Highly Voted 7 months, 1 week ago
Selected Answer: B
Option B is the most operationally efficient solution for handling large data sizes in Amazon DynamoDB.
upvoted 9 times
...
seetpt
Highly Voted 7 months, 1 week ago
Selected Answer: B
B is correct
upvoted 5 times
...
Scheldon
Most Recent 4 months ago
Selected Answer: B
Answer B. Compressing data in DynamoDB is a good idea, especially for text data, but if I'm not wrong we do not need AWS Lambda to do that. On the other hand, storing a big object in S3 and saving its URL in DynamoDB is one of the best practices mentioned by Amazon. Since we do not know what kind of data we are storing in the database or how big the objects will become in the future, option B looks like the best solution.
https://aws.amazon.com/blogs/database/large-object-storage-strategies-for-amazon-dynamodb/ (see Option 2)
https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/bp-use-s3-too.html
upvoted 1 times
...
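On the compression point in the comment above: client-side gzip (no Lambda required) can shrink text attributes, but the compressed value must still fit the 400 KB item limit, which is why the S3 pointer in option B scales better as data sizes grow. A minimal sketch, assuming a hypothetical table and attribute names:

```python
import gzip

import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("ApplicationData")  # hypothetical table name


def put_compressed(source_id: str, record_id: str, text: str) -> None:
    # Compress client-side; no Lambda function is needed for this step.
    compressed = gzip.compress(text.encode("utf-8"))

    # boto3 stores raw bytes as a DynamoDB Binary attribute. This only
    # works while the compressed payload stays under the 400 KB item limit.
    table.put_item(
        Item={
            "pk": source_id,
            "sk": record_id,
            "payload_gzip": compressed,
        }
    )
```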
Sergiuss95
5 months, 2 weeks ago
Selected Answer: B
https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/bp-use-s3-too.html
upvoted 2 times
...
Community vote distribution: A (35%), C (25%), B (20%), Other