Exam DP-203 topic 2 question 63 discussion

Actual exam question from Microsoft's DP-203
Question #: 63
Topic #: 2

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have an Azure Data Lake Storage account that contains a staging zone.
You need to design a daily process to ingest incremental data from the staging zone, transform the data by executing an R script, and then insert the transformed data into a data warehouse in Azure Synapse Analytics.
Solution: You use an Azure Data Factory schedule trigger to execute a pipeline that executes an Azure Databricks notebook, and then inserts the data into the data warehouse.
Does this meet the goal?

  • A. Yes
  • B. No
Suggested Answer: A
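
For context, the flow the correct solution describes (a scheduled ADF pipeline that runs a Databricks notebook, which transforms the data with R and loads it into Synapse) starts with reading the day's incremental slice from the staging zone. Below is a minimal, hypothetical PySpark sketch of that first step inside the notebook; the storage account, container, folder layout, and widget name are made-up placeholders, and the R transform itself would sit in a separate %r cell.

```python
# Sketch only: the daily incremental read inside a Databricks notebook.
# All names (storage account, container, folder layout, run_date widget) are
# hypothetical; they are not taken from the question.
# `spark` and `dbutils` are provided by the Databricks notebook runtime.

# The trigger (ADF or a Databricks job) would pass the processing date in,
# e.g. as a notebook widget.
dbutils.widgets.text("run_date", "2024-01-01")
run_date = dbutils.widgets.get("run_date")

staging_path = (
    "abfss://staging@examplelake.dfs.core.windows.net/"
    f"sales/ingest_date={run_date}/"
)

# Read only the partition for the current day (the "incremental" slice).
incremental_df = spark.read.format("parquet").load(staging_path)

# The R transformation would typically run in a separate %r cell of the same
# notebook (SparkR / sparklyr); the transformed result is then written to the
# Synapse dedicated SQL pool (see the write sketch further down the thread).
print(incremental_df.count())
```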

Comments

juanlu46
Highly Voted 2 years, 5 months ago
Selected Answer: A
I think it is A: Yes. You can execute R code in a notebook and then call it from Data Factory. You can check it under the "Databricks Notebook activity" header: https://docs.microsoft.com/en-US/azure/data-factory/transform-data and also: https://docs.microsoft.com/en-us/azure/databricks/spark/latest/sparkr/overview (see the provisioning sketch after this thread).
upvoted 22 times
Gikan
8 months, 3 weeks ago
The answer is No. Everything you wrote is true, but Synapse Analytics has got its own "data factory", called pipelines. I do not see a way in Data Factory to set up the sink as a DWH in Synapse.
upvoted 1 times
Gikan
8 months, 3 weeks ago
I tried it. I was wrong.
upvoted 2 times
biafko
8 months, 2 weeks ago
Next time only comment when you are sure. People like you are making it more confusing for people who are trying to learn and prepare for the exam.
upvoted 10 times
...
...
...
...
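
To make juanlu46's point concrete, here is a rough sketch of how the pipeline with a Databricks Notebook activity and the daily schedule trigger could be provisioned with the azure-mgmt-datafactory Python SDK. It is an illustration under assumptions, not the official quickstart: the subscription, resource group, factory, linked service, and notebook path names are hypothetical, and model constructor signatures can vary a little between SDK versions.

```python
# Sketch only: provisions an ADF pipeline with a Databricks Notebook activity
# and a daily schedule trigger. Resource names are hypothetical and the model
# constructors may differ slightly between azure-mgmt-datafactory versions.
from datetime import datetime, timezone

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    DatabricksNotebookActivity,
    LinkedServiceReference,
    PipelineReference,
    PipelineResource,
    ScheduleTrigger,
    ScheduleTriggerRecurrence,
    TriggerPipelineReference,
    TriggerResource,
)

SUBSCRIPTION_ID = "<subscription-id>"   # hypothetical
RESOURCE_GROUP = "rg-dataplatform"      # hypothetical
FACTORY_NAME = "adf-ingest"             # hypothetical

adf = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# One activity: run the Databricks notebook that does the R transform and the
# write into the Synapse dedicated SQL pool.
notebook_activity = DatabricksNotebookActivity(
    name="TransformAndLoad",
    notebook_path="/Shared/transform_and_load",  # hypothetical notebook
    linked_service_name=LinkedServiceReference(
        type="LinkedServiceReference", reference_name="AzureDatabricksLS"
    ),
)

adf.pipelines.create_or_update(
    RESOURCE_GROUP,
    FACTORY_NAME,
    "DailyStagingToSynapse",
    PipelineResource(activities=[notebook_activity]),
)

# Daily schedule trigger that fires the pipeline once a day.
trigger = ScheduleTrigger(
    recurrence=ScheduleTriggerRecurrence(
        frequency="Day",
        interval=1,
        start_time=datetime(2024, 1, 1, 2, 0, tzinfo=timezone.utc),
        time_zone="UTC",
    ),
    pipelines=[
        TriggerPipelineReference(
            pipeline_reference=PipelineReference(
                type="PipelineReference", reference_name="DailyStagingToSynapse"
            )
        )
    ],
)
adf.triggers.create_or_update(
    RESOURCE_GROUP, FACTORY_NAME, "DailyTrigger", TriggerResource(properties=trigger)
)
adf.triggers.begin_start(RESOURCE_GROUP, FACTORY_NAME, "DailyTrigger").result()
```

In practice the same pipeline and trigger are usually defined in the ADF Studio UI or as ARM/JSON; the SDK version is shown here only to keep all the examples in one language.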
ExamDestroyer69
Highly Voted 9 months, 2 weeks ago
Selected Answer: A
**VARIATIONS OF THIS QUESTION**
  • Solution: You use an Azure Data Factory schedule trigger to execute a pipeline that copies the data to a staging table in the data warehouse, and then uses a stored procedure to execute the R script. **NO**
  • Solution: You use an Azure Data Factory schedule trigger to execute a pipeline that executes an Azure Databricks notebook, and then inserts the data into the data warehouse. **YES**
  • Solution: You use an Azure Data Factory schedule trigger to execute a pipeline that executes mapping data flow, and then inserts the data into the data warehouse. **NO**
  • Solution: You schedule an Azure Databricks job that executes an R notebook, and then inserts the data into the data warehouse. **YES**
upvoted 6 times
...
kkk5566
Most Recent 1 year, 1 month ago
Selected Answer: A
Yes, you can do it.
upvoted 1 times
...
vctrhugo
1 year, 4 months ago
Selected Answer: A
A. Yes The proposed solution meets the goal of designing a daily process to ingest incremental data from the staging zone, transform the data using an R script, and insert the transformed data into a data warehouse in Azure Synapse Analytics. The solution involves using an Azure Data Factory (ADF) schedule trigger to execute a pipeline that executes an Azure Databricks notebook and then inserts the data into the data warehouse.
upvoted 3 times
...
esaade
1 year, 7 months ago
Yes, this solution meets the goal of ingesting incremental data from the staging zone, transforming the data by executing an R script, and inserting the transformed data into a data warehouse in Azure Synapse Analytics. By using an Azure Data Factory schedule trigger, you can schedule the pipeline to run on a daily basis. The pipeline can execute an Azure Databricks notebook, which can perform the transformation using R scripts, and then insert the transformed data into the data warehouse.
upvoted 4 times
...
vrodriguesp
1 year, 9 months ago
Selected Answer: A
Yes, you can execute an R script in a notebook and call it via ADF.
upvoted 4 times
...
urielramoss
1 year, 10 months ago
Selected Answer: A
The answer is YES. I already used this solution in a previous project.
upvoted 4 times
...
rzeng
1 year, 11 months ago
should be YES
upvoted 2 times
...
dom271219
2 years, 1 month ago
Selected Answer: A
We do something like this in my company.
upvoted 4 times
...
Deeksha1234
2 years, 2 months ago
Selected Answer: A
answer should be A
upvoted 5 times
...
Sriramiyer92
2 years, 2 months ago
Selected Answer: A
A. The R language is supported in ADB. ADB notebooks can be called from an ADF pipeline (use the Notebook activity) to link to the ADB notebook.
upvoted 3 times
...
Davico93
2 years, 3 months ago
I don't know, guys, it's kind of tricky. In the next two questions it says "insert the TRANSFORMED data", and here it just says "data"... What do you think?
upvoted 2 times
...
evega
2 years, 4 months ago
For me the answer is A. An ADF pipeline can include a Databricks notebook activity, which lets you run the notebook once a day through a trigger.
upvoted 3 times
...
OCHT
2 years, 5 months ago
Selected Answer: A
R in notebook and call via Data Factory
upvoted 4 times
...
MS_Nikhil
2 years, 5 months ago
Selected Answer: A
You can execute R code in a notebook.
upvoted 4 times
hbad
2 years, 5 months ago
The correct answer should be No, based on how it is worded and the following logic: in Azure Data Factory, a Databricks Notebook activity can be used to execute a Databricks notebook. However, it cannot pass the data along to the next activity (dbutils.notebook.exit("returnValue") only passes a string). Given the way this is worded ("execute a pipeline that executes an Azure Databricks notebook, and then inserts the data"), the "then" implies a next step, which won't work because the data can't be passed along. It would only work if the transformation and the insert both happened in the notebook. https://docs.microsoft.com/en-US/azure/data-factory/transform-data-databricks-notebook
upvoted 2 times
nefarious_smalls
2 years, 5 months ago
Yeah, but you do not have to pass the data along in ADF. You can insert it into Synapse from the notebook.
upvoted 2 times
hbad
2 years, 5 months ago
Precisely my point: either both things (R and the insert) should be in the one notebook, OR you need two notebooks. The wording indicates two steps rather than everything in one notebook: "notebook" being the first step and "then" indicating another step.
upvoted 2 times
Igor85
1 year, 10 months ago
I don't see any problem with running R and writing to a Synapse dedicated SQL pool in the same notebook: https://learn.microsoft.com/en-us/azure/databricks/external-data/synapse-analytics (see the sketch after this thread).
upvoted 2 times
...
...
...
...
...
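
Tying hbad's and Igor85's exchange together: if the R transform and the insert both happen inside the notebook, nothing but a small status string ever has to cross the activity boundary in ADF. Below is a minimal sketch of the write step, assuming the Azure Synapse connector from Igor85's link; the JDBC URL, tempDir, and table name are hypothetical, and transformed_df is assumed to come from earlier cells (for example a temp view registered by an %r cell).

```python
# Sketch only: write the transformed DataFrame to a Synapse dedicated SQL pool
# from inside the Databricks notebook, using the built-in Azure Synapse
# connector. URL, tempDir, and table names are hypothetical placeholders, and
# SQL credentials would normally come from a secret scope rather than the URL.
# `spark`, `dbutils`, and `transformed_df` come from earlier notebook cells.

(transformed_df.write
    .format("com.databricks.spark.sqldw")
    .option(
        "url",
        "jdbc:sqlserver://examplews.sql.azuresynapse.net:1433;"
        "database=dwh;user=<user>;password=<password>;encrypt=true",
    )
    # Reuse the cluster's storage credentials for the staging copy in ADLS.
    .option("forwardSparkAzureStorageCredentials", "true")
    .option("dbTable", "dbo.FactSalesDaily")
    .option(
        "tempDir",
        "abfss://tempdata@examplelake.dfs.core.windows.net/synapse-staging",
    )
    .mode("append")
    .save())

# Hand a small status string back to the calling ADF pipeline; only strings
# can cross this boundary, which is why the insert itself stays in the notebook.
dbutils.notebook.exit("rows_written_ok")
```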
romega2
2 years, 5 months ago
Selected Answer: A
I agree that it is Yes.
upvoted 2 times
...
gauravgogs
2 years, 5 months ago
I think it should be Yes, i.e. A. R scripts are well supported by Databricks notebooks.
upvoted 3 times
...