Exam DP-203 topic 3 question 27 discussion

Actual exam question from Microsoft's DP-203
Question #: 27
Topic #: 3

DRAG DROP -
You have an Azure data factory.
You need to ensure that pipeline-run data is retained for 120 days. The solution must ensure that you can query the data by using the Kusto query language.
Which four actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
NOTE: More than one order of answer choices is correct. You will receive credit for any of the correct orders you select.
Select and Place:

Suggested Answer:
Step 1: Create an Azure Storage account that has a lifecycle policy
To automate common data management tasks, Microsoft created a solution based on Azure Data Factory. The service, Data Lifecycle Management, makes frequently accessed data available and archives or purges other data according to retention policies. Teams across the company use the service to reduce storage costs, improve app performance, and comply with data retention policies.
Step 2: Create a Log Analytics workspace that has Data Retention set to 120 days.
Data Factory stores pipeline-run data for only 45 days. Use Azure Monitor if you want to keep that data for a longer time. With Monitor, you can route diagnostic logs for analysis to multiple different targets:
Storage Account: Save your diagnostic logs to a storage account for auditing or manual inspection. You can use the diagnostic settings to specify the retention time in days.
Step 3: From Azure Portal, add a diagnostic setting.
Step 4: Send the data to a Log Analytics workspace.
Event Hub: A pipeline that transfers events from services to Azure Data Explorer.
Log Analytics: Analyze your logs with Log Analytics. A Log Analytics workspace is queried with the Kusto query language, which is what the question requires.
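Once the logs are routed there, pipeline-run records can be queried from the workspace with KQL. A minimal sketch, assuming the diagnostic setting uses resource-specific destination tables (in that mode Data Factory writes pipeline runs to a table named ADFPipelineRun, whose schema includes the columns used below):

// Failed pipeline runs within the 120-day retention window.
ADFPipelineRun
| where TimeGenerated > ago(120d)
| where Status == "Failed"
| project TimeGenerated, PipelineName, RunId, Status
| order by TimeGenerated desc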
Keeping Azure Data Factory metrics and pipeline-run data.
Configure diagnostic settings and workspace.
Create or add diagnostic settings for your data factory.
1. In the portal, go to Monitor. Select Settings > Diagnostic settings.
2. Select the data factory for which you want to set a diagnostic setting.
3. If no settings exist on the selected data factory, you're prompted to create a setting. Select Turn on diagnostics.
4. Give your setting a name, select Send to Log Analytics, and then select a workspace from Log Analytics Workspace.
5. Select Save.
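If the diagnostic setting is created in the older Azure diagnostics mode instead of the resource-specific mode, the records land in the shared AzureDiagnostics table and are filtered by the PipelineRuns category selected in the diagnostic setting. A sketch under that assumption (the suffixed columns such as pipelineName_s and status_s are how AzureDiagnostics renders Data Factory's string fields):

// Pipeline-run counts per pipeline and status from the legacy AzureDiagnostics table.
AzureDiagnostics
| where ResourceProvider == "MICROSOFT.DATAFACTORY"
| where Category == "PipelineRuns"
| where TimeGenerated > ago(120d)
| summarize Runs = count() by pipelineName_s, status_s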
Reference:
https://docs.microsoft.com/en-us/azure/data-factory/monitor-using-azure-monitor

Comments

Sunnyb
Highly Voted 3 years, 10 months ago
Step 1: Create a Log Analytics workspace that has Data Retention set to 120 days.
Step 2: From Azure Portal, add a diagnostic setting.
Step 3: Select the PipelineRuns Category.
Step 4: Send the data to a Log Analytics workspace.
upvoted 193 times
rainbowyu
3 years, 2 months ago
Shouldn't steps 3 and 4 be swapped?
upvoted 5 times
Deeksha1234
2 years, 8 months ago
seems correct to me
upvoted 1 times
kkk5566
1 year, 7 months ago
correct
upvoted 2 times
Rajashekharc
2 years, 7 months ago
This is the correct order; I have tried it in the Azure portal.
upvoted 5 times
herculian_effort
Highly Voted 3 years, 9 months ago
Step 1: From Azure Portal, add a diagnostic setting.
Step 2: Send the data to a Log Analytics workspace.
Step 3: Create a Log Analytics workspace that has Data Retention set to 120 days.
Step 4: Select the PipelineRuns Category.
The video in the link below walks you through the process step by step; start watching at the 2:30 mark. https://docs.microsoft.com/en-us/azure/data-factory/monitor-using-azure-monitor#keeping-azure-data-factory-metrics-and-pipeline-run-data
upvoted 39 times
Armandoo
3 years, 8 months ago
This is the correct answer
upvoted 1 times
LiLy91
3 years, 3 months ago
Don't you have to select the PipelineRuns category while adding a diagnostic setting?
upvoted 1 times
BK10
3 years, 2 months ago
Video matches the steps above. Thanks for sharing.
upvoted 2 times
Towin
2 years, 10 months ago
I don't agree. Why is "Send data to a Log Analytics workspace" step 2 while "Create the Log Analytics workspace" is step 3? How can you send data to a Log Analytics workspace that hasn't been created yet?
upvoted 4 times
be8a152
Most Recent 1 year, 2 months ago
Sunnyb is correct
upvoted 1 times
Sriramiyer92
2 years, 8 months ago
I can see multiple correct orders in the discussion! Also note that the question states: "More than one order of answer choices is correct."
upvoted 2 times
NamitSehgal
2 years, 10 months ago
The output is either a Storage Account, Log Analytics, or an Event Hub. Retention is configured while setting up diagnostics on any Azure resource, so take out option 1, which says to configure Storage Account retention. Just stick to the Log Analytics solution and include all the points related to it.
upvoted 1 times
[Removed]
3 years, 7 months ago
I am not very familiar with this topic, but following the link below, we know that with Monitor you can route diagnostic logs for analysis to multiple different targets: a storage account, an Event Hub, and Log Analytics. The question also requires querying the data by using the Kusto query language, so we should use Log Analytics for this scenario. With this in mind, we can exclude anything related to the storage account and Event Hub. The question is about pipeline-run logs, so we can also exclude the trigger-run option. That leaves the four options listed in the solution posted by @Sunnyb. https://docs.microsoft.com/en-us/azure/data-factory/monitor-using-azure-monitor#keeping-azure-data-factory-metrics-and-pipeline-run-data
upvoted 12 times
Amalbenrebai
3 years, 8 months ago
In this case we will not save the diagnostic logs to a storage account; we will send them to Log Analytics:
1: Create a Log Analytics workspace that has Data Retention set to 120 days.
2: From Azure Portal, add a diagnostic setting.
3: Select the PipelineRuns Category.
4: Send the data to a Log Analytics workspace.
upvoted 9 times
mss1
3 years, 8 months ago
If you create diagnostics from the Data Factory, you will notice that you can only set the retention days when you select a storage account for the PipelineRuns. So you need a storage account first. You do not have an option in the selection to create a diagnostic from the Data Factory, and thus "select the PipelineRuns" is not an option. I agree with the current selection.
upvoted 2 times
mss1
3 years, 8 months ago
To complete my answer: I also agree with Sunnyb. There is more than one valid solution to this question.
upvoted 2 times
Marcus1612
3 years, 7 months ago
When you create the diagnostic setting, you have to select "Log Analytics" as the destination target. A Log Analytics workspace has its own Data Retention property under General > Usage and estimated costs > Data Retention. So the correct answer is:
Step 1: Create a Log Analytics workspace that has Data Retention set to 120 days.
Step 2: From Azure Portal, add a diagnostic setting.
Step 3: Select the PipelineRuns Category.
Step 4: Send the data to a Log Analytics workspace.
upvoted 1 times
mric
3 years, 9 months ago
According to the linked article, the order is: first Storage Account, then Event Hub, and finally Log Analytics. So I would say:
1- Create an Azure Storage Account with a lifecycle policy
2- Stream to an Azure Event Hub
3- Create a Log Analytics workspace that has Data Retention set to 120 days
4- Send the data to a Log Analytics workspace
Source: https://docs.microsoft.com/en-us/azure/data-factory/monitor-using-azure-monitor#keeping-azure-data-factory-metrics-and-pipeline-run-data
upvoted 4 times
det_wizard
3 years, 10 months ago
Take out the storage account. After adding the diagnostic setting, it would be: select PipelineRuns, then send to Log Analytics.
upvoted 2 times
teofz
3 years, 11 months ago
Regarding the storage account: what is it for?
upvoted 1 times
sagga
3 years, 11 months ago
I don't know if you need it; see this discussion: https://www.examtopics.com/discussions/microsoft/view/49811-exam-dp-200-topic-3-question-19-discussion/
upvoted 2 times
Amsterliese
3 years ago
In this case it's not needed (imo). Microsoft advises storing log data in a storage account (if needed) since Data Factory only retains it for 45 days. However, here you don't have to keep the data for longer than the two years a Log Analytics workspace can retain it, and you want to use Kusto, so Log Analytics makes more sense.
upvoted 1 times