Exam AWS Certified Solutions Architect - Professional SAP-C02 topic 1 question 443 discussion

A flood monitoring agency has deployed more than 10,000 water-level monitoring sensors. Sensors send continuous data updates, and each update is less than 1 MB in size. The agency has a fleet of on-premises application servers. These servers receive updates from the sensors, convert the raw data into a human-readable format, and write the results to an on-premises relational database server. Data analysts then use simple SQL queries to monitor the data.

The agency wants to increase overall application availability and reduce the effort that is required to perform maintenance tasks. These maintenance tasks, which include updates and patches to the application servers, cause downtime. While an application server is down, data is lost from sensors because the remaining servers cannot handle the entire workload.

The agency wants a solution that optimizes operational overhead and costs. A solutions architect recommends the use of AWS IoT Core to collect the sensor data.

What else should the solutions architect recommend to meet these requirements?

  • A. Send the sensor data to Amazon Kinesis Data Firehose. Use an AWS Lambda function to read the Kinesis Data Firehose data, convert it to .csv format, and insert it into an Amazon Aurora MySQL DB instance. Instruct the data analysts to query the data directly from the DB instance.
  • B. Send the sensor data to Amazon Kinesis Data Firehose. Use an AWS Lambda function to read the Kinesis Data Firehose data, convert it to Apache Parquet format, and save it to an Amazon S3 bucket. Instruct the data analysts to query the data by using Amazon Athena.
  • C. Send the sensor data to an Amazon Managed Service for Apache Flink (previously known as Amazon Kinesis Data Analytics) application to convert the data to .csv format and store it in an Amazon S3 bucket. Import the data into an Amazon Aurora MySQL DB instance. Instruct the data analysts to query the data directly from the DB instance.
  • D. Send the sensor data to an Amazon Managed Service for Apache Flink (previously known as Amazon Kinesis Data Analytics) application to convert the data to Apache Parquet format and store it in an Amazon S3 bucket. Instruct the data analysts to query the data by using Amazon Athena.
Suggested Answer: B

Comments

CMMC
Highly Voted 1 year, 1 month ago
Selected Answer: B
Kinesis Data Firehose is well-suited for ingesting and processing streaming data at scale, such as the continuous updates from the water-level monitoring sensors. It can reliably capture and deliver data to various destinations, including S3, without requiring additional application code. Storing the data in Apache Parquet format in S3 offers several benefits. Parquet is a columnar storage format optimized for analytics workloads, providing efficient compression and query performance. This format is suitable for data analysis and querying using tools like Athena. Using AWS Lambda to transform the data from Kinesis Data Firehose into Parquet format reduces the maintenance effort associated with managing traditional servers. Lambda automatically scales with the incoming workload, ensuring continuous data processing without downtime.
upvoted 7 times
...
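To make the transform step in option B concrete, here is a minimal sketch of a Kinesis Data Firehose transformation Lambda in Python. The sensor payload fields (sensor_id, level, timestamp) are assumptions for illustration, not from the question; in practice the Parquet conversion itself is usually delegated to Firehose's built-in record format conversion backed by an AWS Glue schema, with the Lambda only normalizing the raw records.

import base64
import json

def lambda_handler(event, context):
    # Firehose invokes this with a batch of records; each record carries
    # base64-encoded data and must be returned with the same recordId.
    output = []
    for record in event["records"]:
        try:
            payload = json.loads(base64.b64decode(record["data"]))
            # Hypothetical sensor fields -- adjust to the real payload schema.
            normalized = {
                "sensor_id": payload["sensor_id"],
                "level_m": float(payload["level"]),
                "ts": payload["timestamp"],
            }
            data = (json.dumps(normalized) + "\n").encode("utf-8")
            output.append({
                "recordId": record["recordId"],
                "result": "Ok",
                "data": base64.b64encode(data).decode("utf-8"),
            })
        except (KeyError, ValueError):
            # Flag malformed updates so Firehose routes them to its error
            # output prefix instead of dropping them silently.
            output.append({
                "recordId": record["recordId"],
                "result": "ProcessingFailed",
                "data": record["data"],
            })
    return {"records": output}

Firehose then buffers the normalized records and writes Parquet objects to S3 with no servers to patch, which is exactly what addresses the agency's downtime problem.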
mifune
Highly Voted 11 months, 3 weeks ago
Selected Answer: B
Lambda integrates with Data Firehose better than sending the data to Apache Flink and then implementing a separate transformation into Parquet format before writing to S3. From the AWS documentation: "With Amazon Managed Service for Apache Flink, you can use Java, Scala, Python, or SQL to process and analyze streaming data". So Flink does not perform any automatic data transformation. The correct option is B.
upvoted 5 times
...
dv1
Most Recent 4 months, 3 weeks ago
Selected Answer: B
Amazon Managed Service for Apache Flink cannot ingest the sensor data directly; it needs a streaming source in front of it. That rules out both Flink options. The best remaining answer is B.
upvoted 1 times
...
JoeTromundo
6 months, 2 weeks ago
Selected Answer: B
Option B is the most suitable solution as it leverages serverless and scalable services (Kinesis Data Firehose, Lambda, S3, and Athena) to handle data ingestion, transformation, and analysis with minimal operational overhead and optimized costs.
upvoted 1 times
...
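Once the Parquet objects land in S3, the analysts' "simple SQL queries" translate directly to Athena. A minimal sketch using boto3, where the database, table, and output bucket names are placeholders rather than anything from the question:

import boto3

athena = boto3.client("athena")

# Kick off a simple aggregation over the Parquet-backed table.
response = athena.start_query_execution(
    QueryString=(
        "SELECT sensor_id, max(level_m) AS peak_level "
        "FROM sensor_readings "
        "WHERE ts > current_timestamp - interval '1' hour "
        "GROUP BY sensor_id"
    ),
    QueryExecutionContext={"Database": "flood_monitoring"},  # placeholder database
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},  # placeholder bucket
)
print(response["QueryExecutionId"])

Because Athena bills per data scanned, the columnar Parquet layout also keeps query costs down compared with scanning CSV, which supports the cost-optimization requirement.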
sammyhaj
8 months, 2 weeks ago
Selected Answer: A
The question says the data is converted to a human-readable format. Parquet isn't human readable; CSV is.
upvoted 3 times
altonh
2 months, 1 week ago
KDF cannot directly write to an Aurora DB, and the Lambda function that KDF invokes is for transformation only.
upvoted 1 times
...
...
JoeTromundo
6 months, 2 weeks ago
The question does not say that the data must continue to be stored in human-readable form. It says the agency wants to increase overall application availability and reduce the effort required to perform maintenance tasks. Those are the requirements.
upvoted 1 times
...
...
nileshlg
11 months, 3 weeks ago
D seems to be the correct option, as it's a managed service.
upvoted 2 times
...
seetpt
11 months, 4 weeks ago
Selected Answer: B
B for me
upvoted 2 times
...
titi_r
1 year ago
Selected Answer: B
“B” seems to be the correct answer. Amazon Data Firehose can ingest data streams from IoT and convert them into Parquet format using a Lambda function. The destination of the stream can be S3. https://aws.amazon.com/firehose/ https://d1.awsstatic.com/pdp-how-it-works-assets/Product-Pate-Diagram-Amazon-Kinesis-Data-Firehose%402x.39ea068e48494676c0f4386535f85a966e9ac252.png
upvoted 3 times
...
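For completeness, the IoT Core side of the pipeline titi_r describes is a topic rule that routes sensor messages into the Firehose delivery stream. A sketch with boto3; the rule name, topic filter, role ARN, and stream name are all placeholders:

import boto3

iot = boto3.client("iot")

# Route every message published under the sensors/.../waterlevel topics
# into the Firehose delivery stream, newline-delimited for downstream parsing.
iot.create_topic_rule(
    ruleName="WaterLevelToFirehose",
    topicRulePayload={
        "sql": "SELECT * FROM 'sensors/+/waterlevel'",
        "awsIotSqlVersion": "2016-03-23",
        "ruleDisabled": False,
        "actions": [
            {
                "firehose": {
                    "roleArn": "arn:aws:iam::123456789012:role/iot-firehose-role",
                    "deliveryStreamName": "sensor-delivery-stream",
                    "separator": "\n",
                }
            }
        ],
    },
)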
tushar321
1 year ago
D seems a better fit, as Apache Flink is a managed service for streaming as well as transformation. That makes things simpler.
upvoted 1 times
...
VerRi
1 year ago
Selected Answer: B
Both B and D would work. B uses KDF and Lambda for data transformation; D uses KDA for real-time analysis.
upvoted 4 times
...
Wilson_S
1 year ago
Selected Answer: D
Using a managed service for data transformation optimizes operational overhead.
upvoted 1 times
...
oayoade
1 year, 1 month ago
Selected Answer: A
"human readable format", I go with CSV
upvoted 3 times
...
Russs99
1 year, 1 month ago
Selected Answer: B
Although option D could work, it introduces unnecessary complexity for the given scenario.
upvoted 3 times
...
Dgix
1 year, 1 month ago
Selected Answer: D
Answer is D.
upvoted 2 times
...
Sathya
1 year, 1 month ago
Answer is D
upvoted 1 times
...
Community vote distribution: A (35%), C (25%), B (20%), Other