Exam DP-203 topic 2 question 22 discussion

Actual exam question from Microsoft's DP-203
Question #: 22
Topic #: 2

HOTSPOT -
You build an Azure Data Factory pipeline to move data from an Azure Data Lake Storage Gen2 container to a database in an Azure Synapse Analytics dedicated
SQL pool.
Data in the container is stored in the following folder structure.
/in/{YYYY}/{MM}/{DD}/{HH}/{mm}
The earliest folder is /in/2021/01/01/00/00. The latest folder is /in/2021/01/15/01/45.
You need to configure a pipeline trigger to meet the following requirements:
✑ Existing data must be loaded.
✑ Data must be loaded every 30 minutes.
✑ Late-arriving data of up to two minutes must be included in the load for the time at which the data should have arrived.
How should you configure the pipeline trigger? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:

Suggested Answer:
Box 1: Tumbling window -
To be able to use the Delay setting, select a tumbling window trigger.
Box 2:
Recurrence: 30 minutes, not 32 minutes (the two-minute allowance for late data is handled by the Delay setting, not by lengthening the recurrence).
Delay: 2 minutes.
The delay is the amount of time to wait before starting data processing for the window. The pipeline run starts after the expected execution time plus the delay. The delay defines how long the trigger waits past the due time before triggering a new run, and it does not alter the window startTime.
Reference:
https://docs.microsoft.com/en-us/azure/data-factory/how-to-create-tumbling-window-trigger
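
For illustration, a tumbling window trigger matching the suggested answer could be defined in Data Factory JSON roughly as follows. This is a sketch, not the exam's answer area: the trigger, pipeline, and parameter names are invented, startTime is set to the earliest folder so that existing 30-minute windows are backfilled, and the exact schema is documented in the reference above.

{
    "name": "TumblingWindow30MinTrigger",
    "properties": {
        "type": "TumblingWindowTrigger",
        "typeProperties": {
            "frequency": "Minute",
            "interval": 30,
            "startTime": "2021-01-01T00:00:00Z",
            "delay": "00:02:00",
            "maxConcurrency": 10,
            "retryPolicy": { "intervalInSeconds": 30 }
        },
        "pipeline": {
            "pipelineReference": {
                "referenceName": "LoadAdlsToSynapsePipeline",
                "type": "PipelineReference"
            },
            "parameters": {
                "windowStart": "@trigger().outputs.windowStartTime",
                "windowEnd": "@trigger().outputs.windowEndTime"
            }
        }
    }
}

Because startTime lies in the past, the trigger backfills one run per 30-minute slice from 2021-01-01 onward, and the pipeline can use windowStart and windowEnd to build the /in/{YYYY}/{MM}/{DD}/{HH}/{mm} folder path for each window.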

Comments

Puneetgupta003
Highly Voted 3 years, 4 months ago
Answers are correct
upvoted 52 times
...
positivitypeople
Highly Voted 10 months ago
Got this question today on the exam
upvoted 7 times
...
Alongi
Most Recent 6 months, 3 weeks ago
Both correct
upvoted 1 times
...
kkk5566
1 year, 1 month ago
Answers are correct
upvoted 1 times
...
Deeksha1234
2 years, 2 months ago
correct
upvoted 4 times
...
StudentFromAus
2 years, 4 months ago
Answers are correct
upvoted 2 times
...
parx
2 years, 6 months ago
Correct. https://docs.microsoft.com/en-us/azure/data-factory/concepts-pipeline-execution-triggers#trigger-type-comparison
upvoted 4 times
...
azurearmy
2 years, 11 months ago
Why can't we use an event-based trigger here?
upvoted 2 times
aaaaaaaan
2 years, 11 months ago
Because we also want to backfill past data. Technically, an event-based trigger would also let ADF find all the old files in the source that it hasn't processed yet (and we could add a datetime filter when loading the data), but in my experience ADF chokes on that many past events. With a tumbling window trigger, a run kicks off for each 30-minute slice of the time span, emulating batch loads. Be careful when backfilling with a tumbling window, though: by default ADF starts 50 concurrent pipeline runs, which can get pricey, so change that setting in the advanced panel of the trigger creation form.
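For illustration, the concurrency cap mentioned above corresponds to the trigger's maxConcurrency property in the tumbling window trigger definition; the value below is just an example of throttling a backfill, not a recommended setting.

"typeProperties": {
    "frequency": "Minute",
    "interval": 30,
    "maxConcurrency": 5
}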
upvoted 10 times
...
...
belha
3 years, 3 months ago
Not a schedule trigger?
upvoted 2 times
captainbee
3 years, 3 months ago
As the solution says, you cannot use the Delay setting with a schedule trigger.
upvoted 7 times
...
...
escoins
3 years, 3 months ago
Why not a schedule trigger?
upvoted 1 times
Podavenna
3 years, 1 month ago
A schedule trigger would not work because backfill is only possible with a tumbling window trigger, and in this case we need the trigger to load the old data as well.
upvoted 8 times
...
...