Exam DP-203 topic 2 question 49 discussion

Actual exam question from Microsoft's DP-203
Question #: 49
Topic #: 2

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have an Azure Data Lake Storage account that contains a staging zone.
You need to design a daily process to ingest incremental data from the staging zone, transform the data by executing an R script, and then insert the transformed data into a data warehouse in Azure Synapse Analytics.
Solution: You use an Azure Data Factory schedule trigger to execute a pipeline that copies the data to a staging table in the data warehouse, and then uses a stored procedure to execute the R script.
Does this meet the goal?

  • A. Yes
  • B. No
Suggested Answer: B 🗳️

Comments

NaiCob
Highly Voted 2 years, 10 months ago
Correct answer: No - you cannot execute the R script using a stored procedure activity
upvoted 55 times
auwia
1 year, 3 months ago
I agree. I've found an article saying that you can run an R script from a custom .NET activity, or better, on a BYOC (bring-your-own-cluster) HDInsight cluster that already has R installed on it.
upvoted 1 times
...
...
Daemon69
Highly Voted 2 years, 9 months ago
I select A because you can run an R script via sp_execute_external_script
upvoted 15 times
Rossana
1 year, 5 months ago
The answer is NO for other reasons than the SP. Concerning the SP: To execute an R script within a stored procedure in Synapse Analytics, you can use the sp_execute_external_script system stored procedure. This procedure can be used to execute R scripts, as well as scripts written in other languages such as Python.
upvoted 2 times
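For context on the sp_execute_external_script debate: that procedure exists in SQL Server with Machine Learning Services, but it is not available in Azure Synapse Analytics dedicated SQL pools, which is why the stored-procedure route fails here. A minimal sketch of what the SQL Server (not Synapse) pattern looks like, with a toy script for illustration:

```sql
-- SQL Server with Machine Learning Services only;
-- NOT supported in Azure Synapse Analytics dedicated SQL pools.
EXEC sp_execute_external_script
    @language = N'R',
    @script = N'OutputDataSet <- InputDataSet * 2',  -- toy R transform
    @input_data_1 = N'SELECT 1 AS val'
WITH RESULT SETS ((val INT));
```

So even though the T-SQL syntax exists elsewhere, it does not make the proposed Synapse solution valid.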
...
sparkchu
2 years, 6 months ago
I admire your reasoning, but the question seems intended to test that R is unavailable in a stored procedure here, unlike in Databricks.
upvoted 6 times
...
...
MBRSDG
Most Recent 6 months, 2 weeks ago
Selected Answer: B
"and then uses a stored procedure to execute the R script" you cannot use stored procedures to execute R scripts: it's just absurd. Stored procedures are SQL statements.
upvoted 1 times
...
ExamDestroyer69
9 months, 2 weeks ago
Selected Answer: B
**VARIATIONS OF THIS QUESTION**
  • Solution: You use an Azure Data Factory schedule trigger to execute a pipeline that copies the data to a staging table in the data warehouse, and then uses a stored procedure to execute the R script. **NO**
  • Solution: You use an Azure Data Factory schedule trigger to execute a pipeline that executes an Azure Databricks notebook, and then inserts the data into the data warehouse. **YES**
  • Solution: You use an Azure Data Factory schedule trigger to execute a pipeline that executes mapping data flow, and then inserts the data into the data warehouse. **NO**
  • Solution: You schedule an Azure Databricks job that executes an R notebook, and then inserts the data into the data warehouse. **YES**
upvoted 6 times
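As a rough illustration of the working variant above (a schedule-triggered ADF pipeline that runs a Databricks notebook for the R transformation), a pipeline definition might look like the following sketch. The pipeline name, notebook path, and linked service name are hypothetical:

```json
{
  "name": "DailyIncrementalLoad",
  "properties": {
    "activities": [
      {
        "name": "TransformWithRNotebook",
        "type": "DatabricksNotebook",
        "typeProperties": {
          "notebookPath": "/Shared/transform_staging"
        },
        "linkedServiceName": {
          "referenceName": "AzureDatabricksLinkedService",
          "type": "LinkedServiceReference"
        }
      }
    ]
  }
}
```

The notebook itself would read the incremental files from the Data Lake staging zone, run the R logic, and write the result into the Synapse data warehouse.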
...
TheReg81
1 year ago
While the most voted answer is 'No', the green bar indicates 'Yes'. My first inclination was 'No' as well, because it's not the easiest or most logical way to do it. Then again, there are many roads to Rome. Does it meet the goal? Yes.
upvoted 2 times
...
kkk5566
1 year, 1 month ago
Selected Answer: B
B should be correct
upvoted 1 times
...
FRANCIS_A_M
1 year, 6 months ago
Correct Answer B: The solution proposed does not meet the goal because it suggests executing the R script using a stored procedure in the data warehouse. Azure Synapse Analytics does not support executing R scripts directly within stored procedures. Instead, you should use Azure Data Factory to orchestrate the process, using an Azure Machine Learning activity to execute the R script for data transformation before loading the transformed data into Azure Synapse Analytics.
upvoted 3 times
...
Kamekung
1 year, 7 months ago
By the way, is it worth paying for access to the rest of the pages? The actual value is the community discussion, and beyond this point there are presumably fewer people.
upvoted 3 times
FRANCIS_A_M
1 year, 6 months ago
I have paid for further access and would say it is worth it. The community discussion continues
upvoted 2 times
...
...
bubby248
1 year, 8 months ago
Can't we fix the answers in the portal, instead of relying on votes?
upvoted 3 times
...
mckovin
1 year, 8 months ago
Correct
upvoted 1 times
...
millusmiley
1 year, 9 months ago
The next page is asking for contributor access. Does anyone have credentials, or is there a way to skip the payment?
upvoted 2 times
CNBOOST2
1 year, 8 months ago
I think this is not possible; we have to pay :(
upvoted 3 times
...
Dusica
5 months, 3 weeks ago
You can't skip the payment. It is not too much money for what it's worth.
upvoted 1 times
...
...
Dusica
1 year, 9 months ago
Selected Answer: B
There is a staging zone in Azure Data Lake Storage. The very fact that the solution suggests copying into a DWH staging table makes it invalid, so any further discussion is unnecessary. It is B.
upvoted 3 times
...
akk_1289
1 year, 9 months ago
This solution does not meet the goal of the daily process you have described. While using an Azure Data Factory schedule trigger to execute a pipeline is a good approach for scheduling the process to run on a daily basis, the pipeline you have described does not include any steps to transform the data using an R script. To meet the goal of the daily process, you need to include a step in the pipeline that executes the R script to transform the data. One way to do this would be to use an Azure Data Factory activity, such as an Execute R Script activity, to run the R script on the data as it is being copied from the staging zone to the staging table in the data warehouse. You could then use a stored procedure or another Data Factory activity, such as a SQL activity, to insert the transformed data into the final destination table in the data warehouse.
upvoted 1 times
...
Tj87
2 years, 2 months ago
Synapse doesn't support R at the moment https://docs.microsoft.com/en-us/answers/questions/222624/is-azure-synapse-analytics-supporting-r-language.html
upvoted 2 times
...
Deeksha1234
2 years, 2 months ago
should be B
upvoted 3 times
...
nilubabu
2 years, 4 months ago
As per the problem, the Azure Data Lake Storage account contains the staging zone. The data is transformed from the staging zone and inserted into Azure Synapse Analytics. But the proposed solution copies the data to a staging table in the data warehouse. Per the problem, staging is in the Azure Data Lake Storage account, not in the data warehouse. The answer is 'B'.
upvoted 5 times
...
rafaelptu
2 years, 7 months ago
Yes, the script will be executed and loaded afterwards; the execution can be invoked via sp_execute_external_script
upvoted 1 times
...
Community vote distribution: A (35%), C (25%), B (20%), Other