Exam AWS Certified Solutions Architect - Associate SAA-C03 topic 1 question 139 discussion

A reporting team receives files each day in an Amazon S3 bucket. The reporting team manually reviews and copies the files from this initial S3 bucket to an analysis S3 bucket each day at the same time to use with Amazon QuickSight. Additional teams are starting to send more files in larger sizes to the initial S3 bucket.
The reporting team wants to move the files automatically to the analysis S3 bucket as the files enter the initial S3 bucket. The reporting team also wants to use AWS Lambda functions to run pattern-matching code on the copied data. In addition, the reporting team wants to send the data files to a pipeline in Amazon SageMaker Pipelines.
What should a solutions architect do to meet these requirements with the LEAST operational overhead?

  • A. Create a Lambda function to copy the files to the analysis S3 bucket. Create an S3 event notification for the analysis S3 bucket. Configure Lambda and SageMaker Pipelines as destinations of the event notification. Configure s3:ObjectCreated:Put as the event type.
  • B. Create a Lambda function to copy the files to the analysis S3 bucket. Configure the analysis S3 bucket to send event notifications to Amazon EventBridge (Amazon CloudWatch Events). Configure an ObjectCreated rule in EventBridge (CloudWatch Events). Configure Lambda and SageMaker Pipelines as targets for the rule.
  • C. Configure S3 replication between the S3 buckets. Create an S3 event notification for the analysis S3 bucket. Configure Lambda and SageMaker Pipelines as destinations of the event notification. Configure s3:ObjectCreated:Put as the event type.
  • D. Configure S3 replication between the S3 buckets. Configure the analysis S3 bucket to send event notifications to Amazon EventBridge (Amazon CloudWatch Events). Configure an ObjectCreated rule in EventBridge (CloudWatch Events). Configure Lambda and SageMaker Pipelines as targets for the rule.
Suggested Answer: D
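For reference, the first step of option D (replicating every new object from the initial bucket to the analysis bucket) is configured with a replication rule on the initial bucket. Below is a minimal sketch of the configuration shape accepted by the S3 `PutBucketReplication` API (boto3's `put_bucket_replication`); the role ARN and bucket names are hypothetical placeholders, and versioning must be enabled on both buckets:

```python
# Sketch of an S3 replication configuration (shape used by the S3
# PutBucketReplication API). Role ARN and bucket names are placeholders.
replication_configuration = {
    "Role": "arn:aws:iam::123456789012:role/s3-replication-role",  # hypothetical
    "Rules": [
        {
            "ID": "copy-to-analysis",
            "Status": "Enabled",
            "Priority": 1,
            "Filter": {},  # empty filter: replicate every new object
            "Destination": {"Bucket": "arn:aws:s3:::analysis-bucket"},
            "DeleteMarkerReplication": {"Status": "Disabled"},
        }
    ],
}
print(replication_configuration["Rules"][0]["Destination"]["Bucket"])
```

With this in place, no Lambda copy function is needed; S3 moves objects to the analysis bucket automatically as they arrive.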

Comments

Six_Fingered_Jose
Highly Voted 2 years, 1 month ago
Selected Answer: D
I go for D here. A and B copy the files to another bucket using Lambda; C and D just use S3 replication to copy them. Both approaches do exactly the same thing, but C and D do not require setting up Lambda, which should be more efficient. The question says the team is manually copying the files; automatically replicating the files should be the most efficient method versus manual copying or copying with Lambda.
upvoted 31 times
Tsige
2 months ago
S3 Replication: Configuring S3 replication between the initial and analysis S3 buckets automates the process of moving files between the buckets without the need to manually copy files or run a Lambda function for this purpose. This reduces operational overhead. S3 Event Notifications: Once files are replicated to the analysis bucket, you can configure S3 event notifications for the s3:ObjectCreated event. This event triggers actions (such as invoking Lambda functions and sending data to SageMaker Pipelines) when new files are placed in the analysis bucket. The answer is C
upvoted 1 times
...
vipyodha
1 year, 6 months ago
Yes, D, because of least operational overhead, and also because S3 event notifications can only send to SNS, SQS, and Lambda, not to SageMaker. EventBridge can send to SageMaker.
upvoted 19 times
...
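To make the EventBridge side of option D concrete: once the analysis bucket delivers events to EventBridge, an "Object Created" rule pattern fans out to Lambda and SageMaker Pipelines. Below is a hedged sketch of such a pattern plus a simplified matcher; the bucket name is a placeholder, and the matching logic is a deliberately reduced illustration of EventBridge semantics, not the AWS implementation:

```python
# Illustrative EventBridge-style event pattern for S3 "Object Created" events.
# The bucket name is hypothetical; real EventBridge matching has more features
# (prefix/suffix/numeric matching) than this simplified sketch.
OBJECT_CREATED_PATTERN = {
    "source": ["aws.s3"],
    "detail-type": ["Object Created"],
    "detail": {"bucket": {"name": ["analysis-bucket"]}},
}

def matches(pattern, event):
    """Simplified matching: every pattern key must exist in the event, nested
    dicts recurse, and leaf values must be in the pattern's allowed list."""
    for key, allowed in pattern.items():
        if key not in event:
            return False
        if isinstance(allowed, dict):
            if not isinstance(event[key], dict) or not matches(allowed, event[key]):
                return False
        elif event[key] not in allowed:
            return False
    return True

event = {
    "source": "aws.s3",
    "detail-type": "Object Created",
    "detail": {"bucket": {"name": "analysis-bucket"},
               "object": {"key": "report.csv"}},
}
print(matches(OBJECT_CREATED_PATTERN, event))  # True
```

A single rule like this can have multiple targets (the Lambda function and the SageMaker pipeline), which is the part S3 event notifications alone cannot do for SageMaker.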
Abdou1604
1 year, 2 months ago
But the reporting team also wants to use AWS Lambda functions to run pattern-matching code on the copied data; a downside of S3 replication is that it copies everything.
upvoted 2 times
pentium75
12 months ago
The Lambda functions should run "on the copied data", so first copy, THEN run Lambda function, which is achieved by D.
upvoted 5 times
...
...
...
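Since the thread above turns on the Lambda function running pattern-matching code *after* the copy, here is a hedged sketch of what such a handler might look like when triggered by the EventBridge rule on the analysis bucket. The event field layout follows the EventBridge S3 event shape; the regex itself and the key layout are placeholder assumptions:

```python
import re

# Hypothetical pattern-matching Lambda handler for newly copied objects.
# The regex and key layout ("reports/YYYY-MM-DD/name.csv") are assumptions.
REPORT_PATTERN = re.compile(r"^reports/\d{4}-\d{2}-\d{2}/.+\.csv$")

def handler(event, context=None):
    """Report whether the newly created object's key matches the pattern."""
    key = event["detail"]["object"]["key"]
    return {"key": key, "matched": bool(REPORT_PATTERN.match(key))}

sample = {"detail": {"object": {"key": "reports/2024-01-15/sales.csv"}}}
print(handler(sample))  # {'key': 'reports/2024-01-15/sales.csv', 'matched': True}
```

The point pentium75 makes holds either way: the copy happens first (via replication), and the Lambda then runs on objects as they appear in the analysis bucket.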
123jhl0
Highly Voted 2 years, 2 months ago
Selected Answer: B
C and D aren't the answers, as replicating the S3 bucket isn't efficient: other teams are starting to use it to store larger docs not related to the reporting, making replication not useful. Since Amazon SageMaker Pipelines is now supported as a target for routing events in Amazon EventBridge, the answer is B. https://aws.amazon.com/about-aws/whats-new/2021/04/new-options-trigger-amazon-sagemaker-pipeline-executions/
upvoted 18 times
JayBee65
2 years ago
I think you are misinterpreting the question. I think you need to use all files, including the ones provided by other teams; otherwise, how can you tell which files to copy? I think the point of this statement is to show that more files are in use, and being copied at different times, rather than suggesting you need to differentiate between the two sources of files.
upvoted 9 times
...
KADSM
2 years, 1 month ago
Not sure how well Lambda will cope with larger files, given the time limit in place.
upvoted 4 times
byteb
1 year ago
"The reporting team wants to move the files automatically to analysis S3 bucket as the files enter the initial S3 bucket." Replication is asynchronous, with lambda the data will be available faster. So I think A is the answer.
upvoted 1 times
...
...
vipyodha
1 year, 6 months ago
But B is not the least operational overhead; D is the least operational overhead.
upvoted 2 times
...
jdr75
1 year, 8 months ago
You misinterpret it... the reporting team is overloaded because more teams request their services, uploading more data to the bucket. That's the reason the reporting team needs to automate the process. So ALL the bucket objects need to be copied to the other bucket, and replication is better and cheaper than using Lambda. So the answer is D.
upvoted 3 times
...
...
salman7540
Most Recent 3 days, 6 hours ago
Selected Answer: D
S3 replication can be used to copy files between buckets, so we don't need Lambda. S3 events can't be sent directly to SageMaker, so we have to use EventBridge, which supports many targets including SageMaker.
upvoted 1 times
...
rmanuraj
6 days, 3 hours ago
Selected Answer: D
In the case of S3 event notification only one destination type can be specified for each event notification. https://docs.aws.amazon.com/AmazonS3/latest/userguide/notification-how-to-event-types-and-destinations.html#supported-notification-destinations
upvoted 1 times
...
PSH123
3 weeks, 3 days ago
Selected Answer: C
gpt said 'C' is solution
upvoted 1 times
...
PaulGa
3 months ago
Selected Answer: D
Ans D - least operational overhead using replication; I was initially going for Ans C until I spotted S3 event notification can only send to SQS, SNS, Lambda - not directly to Sagemaker; but Eventbridge can send to Sagemaker. Not sure why author prefers A...?
upvoted 3 times
...
jatric
5 months, 2 weeks ago
Selected Answer: D
S3 events can't be used to notify SageMaker, so C can't be the right option. A and B require Lambda, which is unnecessary.
upvoted 3 times
...
lofzee
6 months, 4 weeks ago
Answer is D because it requires least operational overhead and S3 replication does the copying for you. Also read this https://docs.aws.amazon.com/AmazonS3/latest/userguide/EventNotifications.html Lambda and Sagemaker are not supported destinations for S3 Event Notifications
upvoted 4 times
lofzee
6 months, 4 weeks ago
Sorry, I meant SageMaker is not supported as an S3 Event Notification destination. Lambda is, though. Still doesn't change what the answer is... D
upvoted 4 times
...
...
andyngkh86
11 months, 1 week ago
I go for C, because option C doesn't need extra configuration, but D needs extra work to configure the event notification; for the least operational overhead, option C is the best choice.
upvoted 1 times
...
Marco_St
1 year, 1 month ago
Selected Answer: D
B is the first option I ruled out, since it has the event happen inside the analysis bucket to trigger the Lambda function. If the Lambda function runs code to copy files from the initial bucket to the analysis bucket, then it should be triggered by an event in the initial bucket: once the data reaches the initial bucket, the Lambda is triggered. D is the answer.
upvoted 2 times
...
AntonioMinolfi
1 year, 2 months ago
Selected Answer: D
Utilizing a lambda function would introduce additional operational overhead, eliminating options A and B. S3 replication offers a simpler setup and efficiently accomplishes the task. S3 notifications cannot use SageMaker as a destination; the permissible destinations include SQS, SNS, Lambda, and Eventbridge, so C is out.
upvoted 10 times
...
vijaykamal
1 year, 2 months ago
Selected Answer: D
Creating a Lambda for replication is overhead; this dismisses A and B. S3 event notifications cannot be directed to SageMaker directly; this dismisses C. Correct Answer: D
upvoted 3 times
fantastique007
8 months, 1 week ago
You are right. This is the key point - Sagemaker cannot be the destination of S3 event notification.
upvoted 1 times
...
...
TariqKipkemei
1 year, 3 months ago
Selected Answer: D
D provide the least operational overhead
upvoted 1 times
...
Guru4Cloud
1 year, 4 months ago
Selected Answer: D
Option D is the solution with the least operational overhead: use S3 replication between buckets, send S3 events to EventBridge, and add Lambda and SageMaker as EventBridge rule targets. The reasons this has the least overhead: S3 replication automatically copies new objects to the analysis bucket; EventBridge allows easily adding multiple targets for events; no custom Lambda function is needed for copying objects; and it leverages managed services for event processing.
upvoted 4 times
...
MutiverseAgent
1 year, 5 months ago
Selected Answer: D
Correct: D. B and D are the only possibilities, as SageMaker is not supported as a target for S3 events. Using bucket replication, as D mentions, is more efficient than using a Lambda, as B mentions.
upvoted 3 times
...
cookieMr
1 year, 6 months ago
Selected Answer: D
Option D is correct because it combines S3 replication, event notifications, and Amazon EventBridge to automate the copying of files from the initial S3 bucket to the analysis S3 bucket. It also allows for the execution of Lambda functions and integration with SageMaker Pipelines. Option A is incorrect because it suggests manually copying the files using a Lambda function and event notifications, but it does not utilize S3 replication or EventBridge for automation. Option B is incorrect because it suggests using S3 event notifications directly with EventBridge, but it does not involve S3 replication or utilize Lambda for copying the files. Option C is incorrect because it only involves S3 replication and event notifications without utilizing EventBridge or Lambda functions for further processing.
upvoted 3 times
...
studynoplay
1 year, 7 months ago
Selected Answer: D
https://docs.aws.amazon.com/AmazonS3/latest/userguide/notification-how-to-event-types-and-destinations.html#supported-notification-destinations S3 can NOT send event notifications to SageMaker. This rules out C. You have to send to Amazon EventBridge first, then to SageMaker.
upvoted 7 times
...
Community vote distribution
A (35%)
C (25%)
B (20%)
Other