Exam DP-203 topic 2 question 65 discussion

Actual exam question from Microsoft's DP-203
Question #: 65
Topic #: 2

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have an Azure Data Lake Storage account that contains a staging zone.
You need to design a daily process to ingest incremental data from the staging zone, transform the data by executing an R script, and then insert the transformed data into a data warehouse in Azure Synapse Analytics.
Solution: You schedule an Azure Databricks job that executes an R notebook, and then inserts the data into the data warehouse.
Does this meet the goal?

  • A. Yes
  • B. No
Suggested Answer: A
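
For illustration, here is a minimal SparkR sketch of the notebook such a job could run, covering the three required steps: ingest the daily increment from the staging zone, transform it in R, and load it into the Synapse data warehouse. Every concrete name in it is a hypothetical placeholder: the storage account "mydatalake", the "staging" and "tmp" containers, the target table "dbo.DailySales", and the SYNAPSE_JDBC_URL environment variable.

    library(SparkR)
    sparkR.session()

    # Ingest: read only yesterday's partition of the staging zone (incremental load).
    day_path <- format(Sys.Date() - 1, "%Y/%m/%d")
    staged <- read.df(
      paste0("abfss://staging@mydatalake.dfs.core.windows.net/", day_path),
      source = "parquet"
    )

    # Transform: any R/SparkR logic can run here; a trivial filter stands in for it.
    transformed <- filter(staged, staged$amount > 0)

    # Load: write to the dedicated SQL pool through the Azure Synapse connector,
    # which stages the data in ADLS (tempDir) before loading it into the table.
    write.df(
      transformed,
      source  = "com.databricks.spark.sqldw",
      mode    = "append",
      url     = Sys.getenv("SYNAPSE_JDBC_URL"),
      tempDir = "abfss://tmp@mydatalake.dfs.core.windows.net/synapse-staging",
      forwardSparkAzureStorageCredentials = "true",
      dbTable = "dbo.DailySales"
    )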

Comments

juanlu46
Highly Voted 2 years, 5 months ago
Selected Answer: A
The correct answer is "A. Yes". You can execute R code in a notebook and then call it from Data Factory; see the "Databricks Notebook activity" header at https://docs.microsoft.com/en-US/azure/data-factory/transform-data and also https://docs.microsoft.com/en-us/azure/databricks/spark/latest/sparkr/overview
upvoted 13 times
juanlu46
2 years, 5 months ago
Sorry, the statement doesn't mention Data Factory, but you can also use a Databricks job, so the solution still meets the goal. https://docs.microsoft.com/en-us/azure/databricks/jobs#--run-a-job
upvoted 12 times
...
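
As juanlu46's second link shows, the scheduling itself needs nothing beyond the Databricks Jobs API. The following R sketch creates a daily job through the Jobs 2.1 REST endpoint using the httr package; the workspace host, token, cluster ID, and notebook path are all hypothetical values read from the environment or hard-coded for illustration.

    library(httr)
    library(jsonlite)

    # Hypothetical job definition: run the R notebook every day at 02:00 UTC.
    body <- list(
      name = "daily-r-ingest",
      schedule = list(
        quartz_cron_expression = "0 0 2 * * ?",
        timezone_id = "UTC"
      ),
      tasks = list(list(
        task_key = "transform",
        existing_cluster_id = Sys.getenv("DATABRICKS_CLUSTER_ID"),
        notebook_task = list(notebook_path = "/Repos/etl/daily_r_ingest")
      ))
    )

    # Create the job via the Jobs 2.1 API.
    resp <- POST(
      paste0(Sys.getenv("DATABRICKS_HOST"), "/api/2.1/jobs/create"),
      add_headers(Authorization = paste("Bearer", Sys.getenv("DATABRICKS_TOKEN"))),
      body = toJSON(body, auto_unbox = TRUE),
      content_type_json()
    )
    stop_for_status(resp)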
bp_a_user
1 year, 5 months ago
...but where is the ingest done?
upvoted 2 times
...
...
Dusica
Most Recent 5 months, 2 weeks ago
Well, it is a trick question and I would hate to get it. It says to execute a Databricks job, which can only be executed from ADF.
upvoted 2 times
...
dakku987
9 months, 1 week ago
Selected Answer: A
A. Yes. Explanation: Scheduling an Azure Databricks job that executes an R notebook and then inserts the data into the data warehouse is a valid solution that meets the goal. Azure Databricks is a cloud-based platform built on Apache Spark that provides a collaborative environment for big data analytics and machine learning. It supports multiple programming languages, including R.
upvoted 1 times
...
ExamDestroyer69
9 months, 2 weeks ago
Selected Answer: A
**VARIATIONS OF THIS QUESTION**
  • Solution: You use an Azure Data Factory schedule trigger to execute a pipeline that copies the data to a staging table in the data warehouse, and then uses a stored procedure to execute the R script. **NO**
  • Solution: You use an Azure Data Factory schedule trigger to execute a pipeline that executes an Azure Databricks notebook, and then inserts the data into the data warehouse. **YES**
  • Solution: You use an Azure Data Factory schedule trigger to execute a pipeline that executes mapping data flow, and then inserts the data into the data warehouse. **NO**
  • Solution: You schedule an Azure Databricks job that executes an R notebook, and then inserts the data into the data warehouse. **YES**
upvoted 3 times
...
kkk5566
1 year, 1 month ago
Selected Answer: A
Yes, this solution would meet the goal.
upvoted 1 times
...
esaade
1 year, 7 months ago
Yes, this solution would meet the goal. An Azure Databricks job can be scheduled to run on a regular basis, such as daily, and can execute an R notebook that reads data from Azure Data Lake Storage, transforms the data using R code, and then writes the transformed data to the data warehouse in Azure Synapse Analytics.
upvoted 3 times
...
vrodriguesp
1 year, 9 months ago
Selected Answer: A
Should be yes; you can schedule a notebook directly from Databricks.
upvoted 3 times
...
lemonpotato
1 year, 9 months ago
Selected Answer: A
Has to be Yes
upvoted 1 times
...
XiltroX
1 year, 10 months ago
The answer is A. You can only execute an R notebook in Databricks, not in Data Factory. The key word here is Databricks.
upvoted 1 times
...
greenlever
2 years ago
Selected Answer: A
1. Extract data from Azure Data Lake Storage Gen2 into Azure Databricks.
2. Run transformations on the data in Azure Databricks.
3. Load the transformed data into Azure Synapse Analytics.
upvoted 2 times
...
Deeksha1234
2 years, 2 months ago
Selected Answer: A
Yes, it's possible.
upvoted 1 times
...
demirsamuel
2 years, 4 months ago
Selected Answer: A
I go for A as well
upvoted 2 times
...
observador081
2 years, 4 months ago
You have an Azure subscription that includes the following resources: VNet1, a virtual network; Subnet1, a subnet in VNet1; WebApp1, a web app application service; NSG1, a network security group. You create an application security group named ASG1. Which resource can use ASG1? Select only one answer: VNet1, Subnet1, WebApp1, NSG1.
upvoted 2 times
allagowf
1 year, 11 months ago
The answer is: VNet1.
upvoted 1 times
...
...
cuongthh
2 years, 4 months ago
Selected Answer: A
I go for A.
upvoted 2 times
...
HoangTr
2 years, 4 months ago
I go for A. Databricks should have an option to trigger the job on a selected schedule; it doesn't need Data Factory to trigger it.
upvoted 2 times
...
KHawk
2 years, 5 months ago
I would go for No. You can create a Spark Submit job to run R code, but as shown in the second link, Databricks Utilities is not supported, which in my opinion would be necessary to connect to the Data Lake. https://docs.microsoft.com/en-us/azure/databricks/jobs What do you think? https://docs.microsoft.com/en-us/azure/databricks/dev-tools/api/latest/examples#spark-submit-api-example-r
upvoted 2 times
Davico93
2 years, 3 months ago
You made me doubt it.
upvoted 1 times
...
...
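
On KHawk's concern about Databricks Utilities: dbutils is not actually required to reach the lake. The storage credentials can be passed as ordinary Spark configuration, which works from any job type, including Spark Submit. A minimal sketch, assuming a hypothetical storage account "mydatalake" whose access key is held in an ADLS_KEY environment variable:

    library(SparkR)

    # Supply the ADLS Gen2 account key as Spark config instead of using dbutils.
    sparkR.session(sparkConfig = list(
      "fs.azure.account.key.mydatalake.dfs.core.windows.net" = Sys.getenv("ADLS_KEY")
    ))

    staged <- read.df(
      "abfss://staging@mydatalake.dfs.core.windows.net/incremental/",
      source = "parquet"
    )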
Andushi
2 years, 5 months ago
Selected Answer: A
The solution meets the goal.
upvoted 2 times
...