Exam AWS Certified Solutions Architect - Professional SAP-C02 topic 1 question 244 discussion

A company has developed a hybrid solution between its data center and AWS. The company uses Amazon VPC and Amazon EC2 instances that send application logs to Amazon CloudWatch. The EC2 instances read data from multiple relational databases that are hosted on premises.

The company wants to monitor which EC2 instances are connected to the databases in near-real time. The company already has a monitoring solution that uses Splunk on premises. A solutions architect needs to determine how to send networking traffic to Splunk.

How should the solutions architect meet these requirements?

  • A. Enable VPC flow logs, and send them to CloudWatch. Create an AWS Lambda function to periodically export the CloudWatch logs to an Amazon S3 bucket by using the predefined export function. Generate ACCESS_KEY and SECRET_KEY AWS credentials. Configure Splunk to pull the logs from the S3 bucket by using those credentials.
  • B. Create an Amazon Kinesis Data Firehose delivery stream with Splunk as the destination. Configure a pre-processing AWS Lambda function with a Kinesis Data Firehose stream processor that extracts individual log events from records sent by CloudWatch Logs subscription filters. Enable VPC flow logs, and send them to CloudWatch. Create a CloudWatch Logs subscription that sends log events to the Kinesis Data Firehose delivery stream.
  • C. Ask the company to log every request that is made to the databases along with the EC2 instance IP address. Export the CloudWatch logs to an Amazon S3 bucket. Use Amazon Athena to query the logs grouped by database name. Export Athena results to another S3 bucket. Invoke an AWS Lambda function to automatically send any new file that is put in the S3 bucket to Splunk.
  • D. Send the CloudWatch logs to an Amazon Kinesis data stream with Amazon Kinesis Data Analytics for SQL Applications. Configure a 1-minute sliding window to collect the events. Create a SQL query that uses the anomaly detection template to monitor any networking traffic anomalies in near-real time. Send the result to an Amazon Kinesis Data Firehose delivery stream with Splunk as the destination.
Suggested Answer: B
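The pre-processing Lambda function in option B follows the standard pattern for Firehose data transformation of CloudWatch Logs subscription data: each Firehose record carries a gzip-compressed, base64-encoded batch of log events, and the function must unpack the batch and return records in the `recordId`/`result`/`data` shape that Firehose expects. A minimal sketch (the handler name and event shape follow the usual Lambda/Firehose conventions; error handling is omitted):

```python
import base64
import gzip
import json

def lambda_handler(event, context):
    """Unpack CloudWatch Logs subscription batches and emit one
    newline-separated block of log events per record for Splunk."""
    output = []
    for record in event["records"]:
        # Subscription filters deliver the payload gzip-compressed and
        # base64-encoded inside each record's "data" field.
        payload = json.loads(gzip.decompress(base64.b64decode(record["data"])))
        if payload.get("messageType") != "DATA_MESSAGE":
            # CONTROL_MESSAGE records (subscription health checks) are dropped.
            output.append({"recordId": record["recordId"], "result": "Dropped"})
            continue
        # Extract the individual log events from the batch.
        events = "\n".join(e["message"] for e in payload["logEvents"]) + "\n"
        output.append({
            "recordId": record["recordId"],
            "result": "Ok",
            "data": base64.b64encode(events.encode()).decode(),
        })
    return {"records": output}
```

In practice AWS ships this logic as the kinesis-firehose-cloudwatch-logs-processor Lambda blueprint, so the function rarely needs to be written from scratch.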

Comments

bhanus
Highly Voted 1 year, 4 months ago
Selected Answer: B
Answer is B. The question asks for "near real time" analysis. For near real time, use Kinesis Data Firehose; for real time, use Kinesis Data Streams. Real time is instant, whereas near real time is slightly delayed.
upvoted 16 times
adelynllllllllll
Most Recent 9 months, 4 weeks ago
B: Why do they describe the solution backwards? It doesn't follow the workflow, which makes it hard to put the picture together. But anyway.
upvoted 4 times
career360guru
11 months, 1 week ago
Selected Answer: B
B is the right answer, as Kinesis Data Firehose supports Splunk as a delivery destination.
upvoted 1 times
career360guru
11 months, 1 week ago
And the requirement is near real time.
upvoted 1 times
joleneinthebackyard
12 months ago
Selected Answer: B
Monitoring solution -> VPC flow logs. Near-real-time analysis -> Firehose. Firehose can also have Splunk as a destination -> eye on B. A: handing out access keys is normally only a secondary option. C: too complex a way to get logs when VPC flow logs are available. D: same.
upvoted 3 times
ggrodskiy
1 year, 3 months ago
Correct: B.
upvoted 1 times
NikkyDicky
1 year, 3 months ago
Selected Answer: B
It's a B.
upvoted 1 times
Christina666
1 year, 3 months ago
Selected Answer: B
B. Per this link, https://docs.aws.amazon.com/firehose/latest/dev/creating-the-stream-to-splunk.html, the traffic flow is: CloudWatch Logs -> Kinesis Data Firehose delivery stream -> Splunk. In our case we need network traffic logs, so we enable VPC flow logs and create a subscription to send them toward Splunk for this specific monitoring.
upvoted 1 times
SkyZeroZx
1 year, 3 months ago
Selected Answer: B
Answer is B. The question asks for "near real time" analysis. For near real time, use Kinesis Data Firehose; for real time, use Kinesis Data Streams. Real time is instant, whereas near real time is slightly delayed.
upvoted 2 times
SmileyCloud
1 year, 4 months ago
Selected Answer: B
It's B; the rest are too complex. https://docs.aws.amazon.com/firehose/latest/dev/creating-the-stream-to-splunk.html
upvoted 3 times
PhuocT
1 year, 4 months ago
Selected Answer: B
B is the answer, I think.
upvoted 1 times
ozelllll
1 year, 4 months ago
Selected Answer: B
B. https://docs.aws.amazon.com/firehose/latest/dev/vpc-splunk-tutorial.html
upvoted 2 times
gd1
1 year, 4 months ago
Selected Answer: B
GPT - Amazon VPC Flow Logs can be enabled to capture information about the IP traffic going to and from network interfaces in the VPC. Flow log data can be published to Amazon CloudWatch Logs and Amazon S3. Once the logs are in CloudWatch, you can create a subscription filter that forwards events to a Kinesis Data Firehose stream. AWS Lambda can preprocess records in the Kinesis Data Firehose stream before they are delivered to Splunk. This solution provides near-real-time delivery of VPC Flow Logs to Splunk. Other options are less optimal because they involve unnecessary complexity or do not provide near-real-time monitoring.
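Once the flow logs reach Splunk, answering the original question (which EC2 instances talk to the on-premises databases) is a filtering exercise over the records. The default flow log format is space-separated: version, account-id, interface-id, srcaddr, dstaddr, srcport, dstport, protocol, packets, bytes, start, end, action, log-status. A sketch of the filtering logic in Python, where the on-premises CIDR and the database port (3306) are illustrative assumptions, not values from the question:

```python
import ipaddress

# Default VPC Flow Log fields, space-separated, in order.
FIELDS = ("version account_id interface_id srcaddr dstaddr srcport "
          "dstport protocol packets bytes start end action log_status").split()

def db_connections(flow_log_lines, db_cidr="10.10.0.0/16", db_port=3306):
    """Return (interface_id, srcaddr) pairs for ACCEPTed traffic to the
    on-premises database range. CIDR and port are illustrative assumptions."""
    db_net = ipaddress.ip_network(db_cidr)
    hits = set()
    for line in flow_log_lines:
        rec = dict(zip(FIELDS, line.split()))
        if rec.get("action") != "ACCEPT":
            continue  # skip rejected or malformed records
        if rec.get("dstport") != str(db_port):
            continue  # only traffic aimed at the database port
        if ipaddress.ip_address(rec["dstaddr"]) in db_net:
            hits.add((rec["interface_id"], rec["srcaddr"]))
    return hits
```

In the actual solution this correlation would be expressed as a Splunk search over the ingested flow-log events rather than ad hoc Python, but the record fields being matched are the same.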
upvoted 4 times
Community vote distribution: A (35%), C (25%), B (20%), Other.