
Exam DP-203 topic 2 question 106 discussion

Actual exam question from Microsoft's DP-203
Question #: 106
Topic #: 2

You have two Azure Blob Storage accounts named account1 and account2.

You plan to create an Azure Data Factory pipeline that will use scheduled intervals to replicate newly created or modified blobs from account1 to account2.

You need to recommend a solution to implement the pipeline. The solution must meet the following requirements:
• Ensure that the pipeline only copies blobs that were created or modified since the most recent replication event.
• Minimize the effort to create the pipeline.

What should you recommend?

  • A. Run the Copy Data tool and select Metadata-driven copy task.
  • B. Create a pipeline that contains a Data Flow activity.
  • C. Create a pipeline that contains a flowlet.
  • D. Run the Copy Data tool and select Built-in copy task.
Suggested Answer: D

Comments

Sabbath
Highly Voted 1 year, 3 months ago
Selected Answer: D
Just use the Built-in copy task, per: https://learn.microsoft.com/en-us/azure/data-factory/tutorial-incremental-copy-lastmodified-copy-data-tool
upvoted 13 times
vctrhugo
Highly Voted 1 year, 2 months ago
Selected Answer: D
"[...] use the Copy Data tool to create a pipeline that incrementally copies new and changed files only, from Azure Blob storage to Azure Blob storage. It uses LastModifiedDate to determine which files to copy." https://learn.microsoft.com/en-us/azure/data-factory/tutorial-incremental-copy-lastmodified-copy-data-tool
upvoted 6 times
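For intuition, here is a minimal Python sketch of the LastModifiedDate filter that the generated pipeline applies, written against the azure-storage-blob SDK rather than Data Factory itself. The connection-string environment variables and container name are assumptions for illustration; the question only names the two accounts.

```python
import os
from datetime import datetime

from azure.storage.blob import BlobServiceClient

# Hypothetical settings; not part of the original question.
SRC_CONN = os.environ["ACCOUNT1_CONNECTION_STRING"]
DST_CONN = os.environ["ACCOUNT2_CONNECTION_STRING"]
CONTAINER = "data"

def replicate_since(watermark: datetime) -> datetime:
    """Copy blobs whose LastModified is newer than the (tz-aware) watermark.

    Returns the new watermark to persist for the next scheduled run.
    """
    src = BlobServiceClient.from_connection_string(SRC_CONN)
    dst = BlobServiceClient.from_connection_string(DST_CONN)
    src_container = src.get_container_client(CONTAINER)

    new_watermark = watermark
    for props in src_container.list_blobs():
        if props.last_modified > watermark:
            source_blob = src_container.get_blob_client(props.name)
            # Server-side copy; for a cross-account copy the source URL
            # must be readable (e.g., carry a SAS token).
            dst.get_blob_client(CONTAINER, props.name).start_copy_from_url(
                source_blob.url
            )
            new_watermark = max(new_watermark, props.last_modified)
    return new_watermark
```

The built-in copy task generates the equivalent filter for you (modifiedDatetimeStart/modifiedDatetimeEnd on the blob source), which is why it satisfies the "minimize the effort" requirement.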
9370d83
Most Recent 18 hours, 10 minutes ago
Selected Answer: A
While the Built-in copy task is useful for simpler scenarios, it does not inherently support metadata-driven incremental copy, which is key to meeting the requirements.
upvoted 1 times
Pey1nkh
2 weeks, 4 days ago
Selected Answer: A
This option provides a minimal-effort solution and can automatically track changes. Why not D? The Built-in copy task is a straightforward way in Azure Data Factory to copy data from a source to a destination, but it does not inherently include metadata tracking the way the metadata-driven copy task does.
upvoted 1 times
swathi_rs
4 months, 2 weeks ago
Selected Answer: D
Correct according to the docs.
upvoted 1 times
Dusica
4 months, 2 weeks ago
Vote for D.
upvoted 1 times
Elanche
5 months, 1 week ago
A. Run the Copy Data tool and select Metadata-driven copy task.
Explanation: The Metadata-driven copy task in Azure Data Factory lets you define and manage data movement and transformation dynamically, based on metadata. This includes copying only blobs that were created or modified since the most recent replication event, which aligns with the requirements. It minimizes the effort to create the pipeline, since you configure replication from metadata properties such as the creation or modification timestamp of the blobs. The Built-in copy task also provides data movement, but may require more manual effort to configure incremental replication based on timestamps.
upvoted 3 times
Azure_2023
7 months ago
Selected Answer: D
The built-in copy task lets you create a pipeline within five minutes to replicate data, without needing to learn about Data Factory entities. The metadata-driven copy task eases the creation of parameterized pipelines and an external control table in order to manage copying large numbers of objects (for example, thousands of tables) at scale.
upvoted 1 times
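To make that contrast concrete, here is a rough, hypothetical sketch of the kind of row the metadata-driven task keeps in its external control table; the actual table the tool generates has a richer schema.

```python
# Hypothetical, simplified control-table row for the metadata-driven task.
# Each row parameterizes one source-to-sink copy; a lookup feeds these
# rows to a parameterized pipeline at run time.
control_table_row = {
    "SourceObject": "container1/folderA",   # illustrative path
    "SinkObject": "container1/folderA",
    "CopyType": "incremental",              # vs. "full"
    "Watermark": "2024-01-01T00:00:00Z",    # last successful copy time
}
# The built-in copy task needs none of this machinery for a single
# blob-to-blob replication, which is the scenario in the question.
```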
dakku987
8 months, 1 week ago
Selected Answer: A
Metadata-driven copy task (A): The Metadata-driven copy task in the Copy Data tool of Azure Data Factory allows you to replicate data based on changes in metadata. This includes the ability to only copy blobs that were created or modified since the last replication event. Given the requirements to replicate blobs based on changes in metadata and to minimize effort, the Metadata-driven copy task is a suitable choice.
upvoted 3 times
ExamKiller42
6 months, 3 weeks ago
From the documentation (https://learn.microsoft.com/en-us/azure/data-factory/copy-data-tool-metadata-driven) it looks like the Metadata-driven copy task retrieves metadata from a control table in order to perform copies from multiple sources to multiple destinations. That is not the requirement here. The built-in copy task checks one source folder, looks at the metadata of the files present there, and copies only the files that were newly added or modified since the last run: https://learn.microsoft.com/en-us/azure/data-factory/tutorial-incremental-copy-lastmodified-copy-data-tool So the answer has to be D. Run the Copy Data tool and select Built-in copy task.
upvoted 3 times
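A small sketch of the per-run time window this behavior implies, assuming a fixed schedule interval. In the generated ADF pipeline the trigger supplies the window boundaries to the dataset's modifiedDatetimeStart/modifiedDatetimeEnd properties; the Python names below are illustrative only.

```python
from datetime import datetime, timedelta

SCHEDULE_INTERVAL = timedelta(minutes=15)  # assumed trigger interval

def run_window(run_start: datetime) -> tuple[datetime, datetime]:
    """Window of LastModifiedDate values this scheduled run should copy."""
    return run_start - SCHEDULE_INTERVAL, run_start

def should_copy(last_modified: datetime, window: tuple[datetime, datetime]) -> bool:
    # Half-open interval, so consecutive runs never copy the same blob twice.
    start, end = window
    return start <= last_modified < end
```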
j888
7 months, 1 week ago
I agree A is more appropriate.
upvoted 1 times
MarkJoh
9 months, 1 week ago
A is correct because of the requirement "Minimize the effort to create the pipeline."
upvoted 3 times
Lewiasskick
8 months, 1 week ago
The metadata-driven task is for copying huge numbers of objects (for example, thousands of tables) or loading data from a large variety of sources.
upvoted 1 times
kkk5566
1 year ago
Selected Answer: D
D is correct
upvoted 1 times
auwia
1 year, 2 months ago
Selected Answer: D
Create a data factory. Use the Built-in Copy Data tool to create a pipeline. Monitor the pipeline and activity runs.
upvoted 2 times
Community vote distribution: A (35%), C (25%), B (20%), Other