Welcome to ExamTopics
Exam AWS Certified Solutions Architect - Associate SAA-C03 topic 1 question 33 discussion

A company runs an online marketplace web application on AWS. The application serves hundreds of thousands of users during peak hours. The company needs a scalable, near-real-time solution to share the details of millions of financial transactions with several other internal applications. Transactions also need to be processed to remove sensitive data before being stored in a document database for low-latency retrieval.
What should a solutions architect recommend to meet these requirements?

  • A. Store the transactions data into Amazon DynamoDB. Set up a rule in DynamoDB to remove sensitive data from every transaction upon write. Use DynamoDB Streams to share the transactions data with other applications.
  • B. Stream the transactions data into Amazon Kinesis Data Firehose to store data in Amazon DynamoDB and Amazon S3. Use AWS Lambda integration with Kinesis Data Firehose to remove sensitive data. Other applications can consume the data stored in Amazon S3.
  • C. Stream the transactions data into Amazon Kinesis Data Streams. Use AWS Lambda integration to remove sensitive data from every transaction and then store the transactions data in Amazon DynamoDB. Other applications can consume the transactions data off the Kinesis data stream.
  • D. Store the batched transactions data in Amazon S3 as files. Use AWS Lambda to process every file and remove sensitive data before updating the files in Amazon S3. The Lambda function then stores the data in Amazon DynamoDB. Other applications can consume transaction files stored in Amazon S3.
Suggested Answer: C 🗳️
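As a sketch of option C's processing step: a Lambda function subscribed to the Kinesis stream can redact sensitive fields before writing each transaction to DynamoDB. This is only an illustration, not part of the question; the field names, table name, and `SENSITIVE_FIELDS` set are hypothetical, assuming boto3:

```python
import base64
import json

# Hypothetical sensitive fields to strip; the question doesn't name them.
SENSITIVE_FIELDS = {"card_number", "cvv", "ssn"}

def redact(transaction: dict) -> dict:
    """Return a copy of the transaction without sensitive fields."""
    return {k: v for k, v in transaction.items() if k not in SENSITIVE_FIELDS}

def handler(event, context):
    """Lambda entry point for a Kinesis Data Streams event source mapping."""
    import boto3  # imported lazily so the module also loads outside AWS
    table = boto3.resource("dynamodb").Table("transactions")  # hypothetical name
    for record in event["Records"]:
        # Kinesis delivers each payload base64-encoded.
        payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
        table.put_item(Item=redact(payload))
```

Because the redacted items land in DynamoDB while the raw records stay on the stream, other internal applications can attach their own consumers to the same stream independently.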

Comments

ArielSchivo
Highly Voted 1 year, 8 months ago
Selected Answer: C
I would go for C. The tricky phrase is "near-real-time solution", which points to Firehose, but Firehose can't send data to DynamoDB, so that leaves C as the best option. Kinesis Data Firehose currently supports Amazon S3, Amazon Redshift, Amazon OpenSearch Service, Splunk, Datadog, New Relic, Dynatrace, Sumo Logic, LogicMonitor, MongoDB, and HTTP endpoints as destinations. https://aws.amazon.com/kinesis/data-firehose/faqs/#:~:text=Kinesis%20Data%20Firehose%20currently%20supports,HTTP%20End%20Point%20as%20destinations.
upvoted 75 times
Lonojack
1 year, 5 months ago
This was a really tough one. But you have the best explanation on here with reference point. Thanks. I’m going with answer C!
upvoted 4 times
...
SaraSundaram
1 year, 3 months ago
There are many questions involving Firehose and Data Streams. You need to know them in detail to answer. Thanks for the explanation.
upvoted 4 times
diabloexodia
11 months, 4 weeks ago
Data Streams is used if you want real-time results, but with Firehose you generally use the data at a later point in time by storing it somewhere first. Hence, if you see "REAL TIME", the answer is most probably Kinesis Data Streams.
upvoted 16 times
...
...
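For context on how producers feed such a stream: records are written with a partition key that determines shard placement, and consumers read them in near real time. A minimal boto3 sketch (the stream name and `account_id` field are hypothetical, not from the question):

```python
import json

def put_transaction(kinesis_client, stream_name: str, txn: dict) -> None:
    """Write one transaction to the stream. Records sharing a
    PartitionKey land on the same shard, preserving per-key order."""
    kinesis_client.put_record(
        StreamName=stream_name,
        Data=json.dumps(txn).encode("utf-8"),
        PartitionKey=str(txn["account_id"]),  # hypothetical key field
    )
```

Choosing a high-cardinality partition key (such as an account ID) spreads millions of transactions evenly across shards, which is what makes the stream scale.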
lizzard812
1 year, 5 months ago
Sorry, but I still can't see how Kinesis Data Streams is "scalable", since you have to provision the number of shards in advance?
upvoted 1 times
habibi03336
1 year, 4 months ago
"Easily stream data at any scale" is the description of Kinesis Data Streams. You can configure the shard count, but you still don't have to provision and manage scaling entirely by yourself.
upvoted 1 times
...
...
...
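On the shard-provisioning concern above: Kinesis Data Streams also offers an on-demand capacity mode, where AWS scales shard capacity automatically and no shards are provisioned up front. A minimal boto3 sketch (the stream name is hypothetical):

```python
def create_on_demand_stream(kinesis_client, stream_name: str) -> None:
    """Create a stream in on-demand capacity mode: no up-front shard
    sizing; AWS adjusts throughput automatically as traffic changes."""
    kinesis_client.create_stream(
        StreamName=stream_name,
        StreamModeDetails={"StreamMode": "ON_DEMAND"},
    )
```

With `PROVISIONED` mode you would instead pass a `ShardCount`; on-demand trades per-shard cost control for hands-off scaling.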
JesseeS
Highly Voted 1 year, 8 months ago
The answer is C, because Firehose does not support DynamoDB as a destination. Another key word is "data": Kinesis Data Streams is the correct choice. Pay attention to key words; AWS likes to trip you up to make sure you know the services.
upvoted 30 times
...
the_mellie
Most Recent 1 month, 1 week ago
Selected Answer: C
With multiple consumers and on-the-fly modification, it seems like the most logical choice.
upvoted 1 times
...
vi24
3 months, 4 weeks ago
I chose B. "Near real time" is very specific to Kinesis Data Firehose, which is a better option anyway. The rest of the answer makes sense too. C is wrong: it says "sensitive data removed by Lambda and then store transaction data in DynamoDB", yet goes on to say other applications access the transaction data from the Kinesis data stream!
upvoted 2 times
...
Pics00094
4 months, 2 weeks ago
Selected Answer: C
Need to know: 1) Lambda integration, 2) the difference between real time (Kinesis Data Streams) and near real time (Kinesis Data Firehose), 3) Firehose can't target DynamoDB.
upvoted 4 times
...
JulianWaksmann
5 months ago
I think C is flawed too, because it isn't near real time.
upvoted 2 times
...
awsgeek75
5 months, 3 weeks ago
Selected Answer: C
A: DynamoDB Streams is a change log, not a fit for real-time sharing. B: S3 is not a document database; it's blob storage. D: S3 files are not a database. C: Kinesis + Lambda + DynamoDB is a high-performance, low-latency, scalable solution.
upvoted 2 times
...
A_jaa
5 months, 3 weeks ago
Selected Answer: C
Answer-C
upvoted 1 times
...
bujuman
6 months, 2 weeks ago
Selected Answer: C
Kinesis Data Streams can handle near-real-time delivery, and the processed data can then be stored in DynamoDB.
upvoted 1 times
...
djgodzilla
6 months, 3 weeks ago
Selected Answer: C
Kinesis Data Streams stores data for later processing by applications, a key difference from Firehose, which delivers data directly to AWS services.
upvoted 1 times
...
wabosi
7 months, 3 weeks ago
Selected Answer: C
The correct answer is C. As some have commented already, "near-real-time" could make you think about Firehose, but its destinations are third-party partner services, Amazon S3, Amazon Redshift, Amazon OpenSearch, and HTTP endpoints, so DynamoDB can't be used in this scenario.
upvoted 1 times
...
AWSStudyBuddy
8 months, 2 weeks ago
C is the best solution for the following reasons:
1. Real-time data streaming: To share millions of financial transactions with other apps, you need to be able to ingest data in real time, which Amazon Kinesis Data Streams makes possible.
2. Data transformation: You can cleanse and eliminate sensitive data from transactions before storing them in Amazon DynamoDB by utilizing AWS Lambda with Kinesis Data Streams. This takes care of the requirement to handle sensitive data with care.
3. Scalability: DynamoDB and Amazon Kinesis are both extremely scalable technologies that can manage enormous data volumes and adjust to the workload.
4. Low-latency retrieval: Applications requiring real-time data benefit from low-latency retrieval, which is ensured by storing the processed data in DynamoDB.
upvoted 2 times
AWSStudyBuddy
8 months, 2 weeks ago
Choices A, B, and D are limited in certain ways:
• Option A (DynamoDB with Streams) does not provide real-time data streaming; additional components would need to be implemented to handle data in real time.
• Option B (Kinesis Data Firehose) lacks the real-time processing capabilities of Kinesis Data Streams and is primarily used for delivering data to destinations such as S3.
• Option D (batch processing with S3) is not the best choice for near-real-time use cases. It adds latency and batch-processing overhead, which is incompatible with the need for real-time data sharing.
Using the strengths of Lambda, DynamoDB, and Kinesis Data Streams, Option C offers a scalable, real-time, and effective solution for the given use case.
upvoted 1 times
...
...
Ak9kumar
9 months, 2 weeks ago
I picked B. We need to understand how Kinesis Data Firehose works to answer this question right.
upvoted 1 times
spw7
8 months, 1 week ago
Firehose cannot send data to DynamoDB.
upvoted 1 times
...
...
sohailn
11 months ago
Kinesis Data Firehose optionally supports Lambda for transformation.
upvoted 1 times
...
TariqKipkemei
11 months, 1 week ago
Selected Answer: C
Scalable, near-real-time solution to share the details of millions of financial transactions with several other internal applications = Amazon Kinesis Data Streams. Remove sensitive data from transactions = AWS Lambda. Store transaction data in a document database for low-latency retrieval = Amazon DynamoDB.
upvoted 9 times
...
cookieMr
1 year ago
Selected Answer: C
To meet the requirements of sharing financial transaction details with several other internal applications, and processing and storing the transaction data in a scalable and near-real-time manner, a solutions architect should recommend option C: stream the transactions into Amazon Kinesis Data Streams, use AWS Lambda integration to remove sensitive data, and then store the transaction data in Amazon DynamoDB. Other applications can consume the transaction data off the Kinesis data stream.

Option A (storing transactions in DynamoDB and using DynamoDB Streams) may not provide the same level of scalability and real-time data sharing as Kinesis Data Streams. Option B (using Kinesis Data Firehose to store data in DynamoDB and S3) adds unnecessary complexity and additional storage costs. Option D (storing batched transaction data in S3 and processing with Lambda) may not provide the required near-real-time data sharing and low-latency retrieval compared with the streaming-based solution.
upvoted 3 times
...
Community vote distribution: A (35%), C (25%), B (20%), Other