
Exam DP-700 topic 2 question 14 discussion

Actual exam question from Microsoft's DP-700

You have a Fabric workspace that contains a lakehouse named Lakehouse1.
In an external data source, you have data files that are 500 GB each. A new file is added every day.
You need to ingest the data into Lakehouse1 without applying any transformations. The solution must meet the following requirements:

  • Trigger the process when a new file is added.
  • Provide the highest throughput.
Which type of item should you use to ingest the data?

  • A. Data pipeline
  • B. Environment
  • C. KQL queryset
  • D. Dataflow Gen2
Suggested Answer: A
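The suggested answer relies on two Data pipeline capabilities: a storage event trigger to start the run when a file lands, and a Copy activity for high-throughput, transformation-free movement into the lakehouse. As a rough sketch only, a Fabric pipeline definition follows Azure Data Factory JSON conventions and might look like the following (the pipeline name, connection names, and sink type shown here are hypothetical, not taken from the question):

```json
{
  "name": "IngestDailyFile",
  "properties": {
    "activities": [
      {
        "name": "CopyFileToLakehouse1",
        "type": "Copy",
        "typeProperties": {
          "source": {
            "type": "BinarySource"
          },
          "sink": {
            "type": "BinarySink"
          }
        }
      }
    ]
  }
}
```

A binary source and sink copy the 500 GB files byte-for-byte, which matches the "no transformations" requirement; the event trigger itself is configured on the pipeline rather than inside the activity JSON.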

Comments

henryphchan
2 months, 1 week ago
Selected Answer: A
same as the previous question
upvoted 1 times
...
2e6975f
2 months, 1 week ago
Selected Answer: A
For high-throughput, event-triggered ingestion of large files into a lakehouse without transformations, Data pipeline is the most appropriate and efficient item in Fabric.
upvoted 3 times
...
8d6881f
2 months, 4 weeks ago
Selected Answer: A
As of January 2025, the storage event trigger for Data pipelines is still in preview, so strictly speaking it doesn't satisfy the trigger requirement. But it's still the best of the listed options.
upvoted 1 times
...
Community vote distribution
A (35%)
C (25%)
B (20%)
Other
