Exam AZ-305 topic 2 question 28 discussion

Actual exam question from Microsoft's AZ-305
Question #: 28
Topic #: 2

HOTSPOT

You have an app that generates 50,000 events daily.

You plan to stream the events to an Azure event hub and use Event Hubs Capture to implement cold path processing of the events. The output of Event Hubs Capture will be consumed by a reporting system.

You need to identify which type of Azure storage must be provisioned to support Event Hubs Capture, and which inbound data format the reporting system must support.

What should you identify? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.
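For background on what is being configured here: Capture is a per-event-hub setting that points at a storage destination. A minimal Azure CLI sketch is below; all resource names are placeholders, and the flag names are from the `az eventhubs` command group as commonly documented, so verify them against your CLI version before relying on this.

```shell
# Sketch: create an event hub with Capture enabled, writing to block blobs
# in a (non-premium) storage account. All names below are placeholders.
az eventhubs eventhub create \
  --resource-group myRg \
  --namespace-name myEhNamespace \
  --name myEventHub \
  --partition-count 2 \
  --enable-capture true \
  --capture-interval 300 \
  --capture-size-limit 314572800 \
  --destination-name EventHubArchive.AzureBlockBlob \
  --storage-account myStorageAccount \
  --blob-container capture-container \
  --archive-name-format '{Namespace}/{EventHub}/{PartitionId}/{Year}/{Month}/{Day}/{Hour}/{Minute}/{Second}'
```

The `--capture-interval` (seconds) and `--capture-size-limit` (bytes) windows control how often Capture rolls a new file, whichever threshold is hit first.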

Suggested Answer:

Comments

jspisak
Highly Voted 1 year, 6 months ago
Man sometimes I think I know what I'm talking about with Azure, and then I see a question like this and I question my sanity.
upvoted 154 times
josola
3 months ago
You're not the only one.
upvoted 2 times
maltlk
10 months, 3 weeks ago
I guess we are in the same boat
upvoted 9 times
U4ea
1 year, 1 month ago
Seriously, who knows this by heart?
upvoted 8 times
Lazylinux
7 months, 1 week ago
Bill Gates
upvoted 5 times
Pringlesucka
1 week, 5 days ago
I'd be willing to bet a lot of money that bill gates doesn't know shit about Avro LMFAO
upvoted 1 times
Ivantor
1 year, 2 months ago
You are not alone
upvoted 18 times
NotMeAnyWay
Highly Voted 1 year, 4 months ago
1. Storage type: Azure Data Lake Storage Gen2. Event Hubs Capture allows captured data to be written either to Azure Blob Storage or Azure Data Lake Storage Gen2. Given the nature of the data and its use in reporting and analysis, Azure Data Lake Storage Gen2 is the more appropriate choice because it is designed for big data analytics.
2. Data format: Avro. Event Hubs Capture uses the Avro format for the data it captures. Avro is a row-oriented, compact, fast, binary format that suits a variety of data types and enables efficient serialization. This makes it a good fit for Event Hubs Capture.
upvoted 34 times
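To make the Avro point above concrete: Capture writes each event using a fixed Avro record schema (the field list below follows the public Event Hubs Capture documentation, though it may evolve, so treat this as a sketch of what a reporting system must be prepared to parse):

```python
import json

# Sketch of the Avro record schema Event Hubs Capture uses for each
# captured event, per the public documentation. The event payload
# itself lands in the "Body" field as raw bytes.
CAPTURE_SCHEMA = {
    "type": "record",
    "name": "EventData",
    "namespace": "Microsoft.ServiceBus.Messaging",
    "fields": [
        {"name": "SequenceNumber", "type": "long"},
        {"name": "Offset", "type": "string"},
        {"name": "EnqueuedTimeUtc", "type": "string"},
        {"name": "SystemProperties",
         "type": {"type": "map", "values": ["long", "double", "string", "bytes"]}},
        {"name": "Properties",
         "type": {"type": "map", "values": ["long", "double", "string", "bytes", "null"]}},
        {"name": "Body", "type": ["null", "bytes"]},
    ],
}

# A reporting system consuming Capture output needs to handle these fields.
field_names = [f["name"] for f in CAPTURE_SCHEMA["fields"]]
print(json.dumps(field_names))
```

In practice you would read the captured `.avro` container files with an Avro library (e.g. `fastavro` in Python) and deserialize the `Body` bytes with whatever encoding the producer used.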
Thanveer
Most Recent 1 week, 3 days ago
* Event Hubs doesn't support capturing events in a premium storage account.
* Event Hubs Capture supports any non-premium Azure storage account with support for block blobs.

Answer:
1. Storage type: Azure Data Lake Storage Gen2
2. Data format: Avro
upvoted 1 times
SeMo0o0o0o
3 weeks, 1 day ago
CORRECT
upvoted 1 times
Len83
3 months, 3 weeks ago
This question appeared in the exam in August 2024. I gave the same answer provided here and scored 870.
upvoted 4 times
23169fd
5 months, 1 week ago
The given answer is correct. Azure Data Lake Storage Gen2 is designed for big data analytics and is highly scalable, making it suitable for storing large volumes of event data. Avro is a compact, fast binary format supported by Event Hubs Capture, optimized for efficiency and performance in data streaming scenarios.
upvoted 1 times
23169fd
5 months, 1 week ago
Why not the other options?
Storage type:
- Premium block blobs: designed for high-performance workloads but not optimized for big data analytics and hierarchical storage.
- Premium file shares: suitable for high-performance file sharing but lacks the scalability and analytics features of ADLS Gen2.
Data format:
- Apache Parquet: columnar storage format optimized for read-heavy operations, but not natively supported by Event Hubs Capture.
- JSON: readable and widely used, but less efficient in terms of storage and performance compared to Avro for streaming data.
upvoted 1 times
AAsif098
6 months, 2 weeks ago
Response from Gemini AI:

Storage type: Event Hubs Capture allows captured data to be written to either Azure Blob Storage or Azure Data Lake Storage Gen2. However, for cold path processing scenarios, which involve analyzing historical data, Azure Data Lake Storage Gen2 is the more suitable choice. It's designed for big data analytics workloads and offers better performance and scalability for working with large datasets captured from event hubs.

Inbound data format: Event Hubs Capture uses Avro format for the captured data. Avro is a widely used open-source data format specifically designed for data exchange. It's a row-oriented, binary format that provides rich data structures with inline schema definition. This makes it efficient for storage and easy for various analytics tools and reporting systems to understand and process the captured event data.
upvoted 1 times
chair123
9 months ago
Based on Gemini AI: Box 2-
upvoted 1 times
chair123
9 months ago
Box 2 - Avro

Explanation: While Event Hubs itself can handle various data formats including JSON, Avro, and Apache Parquet, Event Hubs Capture specifically writes data in Apache Avro format. This format is well suited for cold path processing due to its:
- compact nature
- speed
- ability to represent complex data structures
- inline schema definition for easier data understanding

Why not JSON or Parquet?
- JSON: while JSON is a common data interchange format, it can be less efficient for cold path processing due to its larger size compared to Avro.
- Parquet: although Azure Stream Analytics can be used to capture Event Hubs data in Parquet format, Event Hubs Capture itself doesn't directly support Parquet.
upvoted 1 times
4fd861f
9 months ago
Avro is made for streaming, compared to Parquet, since it is a row-oriented format. So if the question mentions batch => Parquet; streaming => Avro.
upvoted 1 times
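The row-vs-columnar distinction several comments lean on can be shown with a toy example in plain Python (this is an illustration of the layout idea only, not the actual Avro or Parquet encodings):

```python
# Toy illustration of row-oriented vs column-oriented storage layouts.
events = [
    {"id": 1, "temp": 21.5},
    {"id": 2, "temp": 19.0},
    {"id": 3, "temp": 22.3},
]

# Row-oriented (Avro-like): each record is stored whole, which suits
# streaming, since new events are appended one record at a time.
row_layout = [(e["id"], e["temp"]) for e in events]

# Column-oriented (Parquet-like): values are grouped per column, which
# suits analytics that scan one column across many records.
col_layout = {
    "id": [e["id"] for e in events],
    "temp": [e["temp"] for e in events],
}

print(row_layout[0])       # one whole event
print(col_layout["temp"])  # one column across all events
```

Appending a new event touches one entry in the row layout but every list in the columnar layout, which is why row-oriented formats are the natural fit for a capture stream.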
fodocel235
1 year ago
The given answers are correct. Either "Azure Data Lake Storage Gen2" or "Azure Storage Account" can be used as the capture destination via the Portal. Just to be sure, I created a premium storage account (blob), and this is NOT a valid option to store the captured files. By default, Avro is selected via the Portal; Parquet and Delta Lake (preview) are also supported via the Portal.
upvoted 1 times
J404
1 year ago
Correct answer: Azure Data Lake Storage Gen2 and Apache Parquet. I am thinking best-practice driven rather than looking into the docs. If I set up an analytics service in Azure, I'd prefer Databricks, and in Databricks I am always working with Parquet files rather than Avro. Avro is often used for streaming: single messages can be compressed and a schema is still enforced. But the question is only about analytics. While I prefer Avro in the context of streaming, I prefer Parquet for data analysis.
upvoted 2 times
randy0077
1 year, 1 month ago
Avro or Apache Parquet are both correct answers; however, Apache Parquet is a columnar storage format that provides efficient compression and query performance.
upvoted 2 times
ntma3b
1 year, 2 months ago
The answer is correct. https://learn.microsoft.com/en-us/azure/event-hubs/event-hubs-capture-overview#how-event-hubs-capture-works Capture can also write Parquet format, but only if you use the no-code editor, which is outside the scope of the question.
upvoted 3 times
Mladen_66
1 year, 2 months ago
Capture data to ADLS Gen2 in Parquet format: https://learn.microsoft.com/en-us/azure/stream-analytics/event-hubs-parquet-capture-tutorial
upvoted 2 times
husam421
1 year, 2 months ago
That is the "Capture data to ADLS Gen2 in Parquet format" tile.
upvoted 1 times
Ashfarqk
1 year, 5 months ago
Azure Data Lake Storage Gen2 is not a premium storage account. It is a storage account type that provides a unified storage solution for both structured and unstructured data. Premium storage options are not supported by Event Hubs Capture.
upvoted 1 times
Tr619899
1 year, 6 months ago
To support Event Hubs Capture, the appropriate Azure storage type is Azure Data Lake Storage Gen2. Event Hubs Capture can write captured events directly to Azure Data Lake Storage Gen2, providing a durable and scalable storage solution.

Regarding the inbound data format, Event Hubs Capture writes the captured events in Apache Avro format by default, so the reporting system must be able to consume and process Avro.

So the correct selections are:
Storage type: Azure Data Lake Storage Gen2
Data format: Apache Avro
upvoted 4 times
sw1000
1 year, 6 months ago
The given answer is not fully correct. I side with the explanation by Sanaie: Azure Data Lake Storage Gen2, since premium storage options are not supported by Event Hubs Capture, but Apache Parquet for the format, as it is better suited for data analytics than Avro or JSON. Avro and Parquet are the only supported formats I have seen in the documentation. As we have an analytics case here, I would suggest Parquet. Avro, however, is the default option and doesn't need any specific configuration.
upvoted 1 times