Exam AZ-304 topic 3 question 13 discussion

Actual exam question from Microsoft's AZ-304
Question #: 13
Topic #: 3

HOTSPOT -
Your on-premises network contains a file server named Server1 that stores 500 GB of data.
You need to use Azure Data Factory to copy the data from Server1 to Azure Storage.
You add a new data factory.
What should you do next? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:

Suggested Answer:
Box 1: Install a self-hosted integration runtime
The Integration Runtime is a customer-managed data integration infrastructure used by Azure Data Factory to provide data integration capabilities across different network environments.
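(Editor's illustration, not part of the original answer: a minimal sketch of what registering a self-hosted integration runtime could look like with the azure-mgmt-datafactory Python SDK. The subscription ID, resource group, factory name, and runtime name below are placeholders, not values from the question, and the runtime node software still has to be installed on Server1, or a machine that can reach it, and registered with an authentication key.)

```python
# Hedged sketch: create the self-hosted IR entry in the data factory and
# fetch the key used to register the on-premises node. All names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    IntegrationRuntimeResource,
    SelfHostedIntegrationRuntime,
)

client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

ir = client.integration_runtimes.create_or_update(
    "rg-adf-demo",             # placeholder resource group
    "adf-demo-factory",        # the data factory you just added
    "Server1SelfHostedIR",     # placeholder runtime name
    IntegrationRuntimeResource(
        properties=SelfHostedIntegrationRuntime(
            description="Runtime used to reach the on-premises file server Server1"
        )
    ),
)

# The node installed on-premises is registered with one of these keys.
keys = client.integration_runtimes.list_auth_keys(
    "rg-adf-demo", "adf-demo-factory", "Server1SelfHostedIR"
)
print(ir.name, keys.auth_key1)
```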

Box 2: Create a pipeline -
With ADF, existing data processing services can be composed into data pipelines that are highly available and managed in the cloud. These data pipelines can be scheduled to ingest, prepare, transform, analyze, and publish data, and ADF manages and orchestrates the complex data and processing dependencies.
Reference:
https://docs.microsoft.com/en-us/azure/machine-learning/team-data-science-process/move-sql-azure-adf
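(Also an editor's illustration, not from the answer key: a hedged sketch of the "create a pipeline" step, assuming linked services and datasets for the Server1 file share and the target blob storage have already been defined against the self-hosted runtime. The dataset, pipeline, and resource names below are illustrative only.)

```python
# Hedged sketch: a pipeline with a single copy activity that reads an
# on-premises file share dataset and writes to a blob storage dataset.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    BlobSink,
    CopyActivity,
    DatasetReference,
    FileSystemSource,
    PipelineResource,
)

client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

copy_activity = CopyActivity(
    name="CopyFromServer1ToBlob",
    inputs=[DatasetReference(reference_name="Server1FileShareDataset")],  # assumed dataset
    outputs=[DatasetReference(reference_name="BlobStorageDataset")],      # assumed dataset
    source=FileSystemSource(recursive=True),  # walk the 500 GB share recursively
    sink=BlobSink(),
)

pipeline = client.pipelines.create_or_update(
    "rg-adf-demo",                   # placeholder resource group
    "adf-demo-factory",              # placeholder factory name
    "CopyServer1ToStoragePipeline",  # placeholder pipeline name
    PipelineResource(activities=[copy_activity]),
)
print(pipeline.name)
# The pipeline can then be run on demand or attached to a schedule trigger.
```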

Comments

verbatim86
Highly Voted 4 years, 1 month ago
correct / https://docs.microsoft.com/pl-pl/azure/data-factory/tutorial-hybrid-copy-data-tool
upvoted 30 times
...
BenBen
Highly Voted 4 years, 1 month ago
I still don't get why we should use ADF to move files to Azure haha
upvoted 14 times
sunmonkey
3 years, 12 months ago
Most likely to transform the data somehow during the process.
upvoted 4 times
demonite
3 years, 11 months ago
Yep ETL
upvoted 1 times
...
...
pentium75
3 years, 7 months ago
So that you can later save 4993.14 USD per month by replacing ADF with AZCOPY, see topic 3 question 3 ;) https://www.examtopics.com/discussions/microsoft/view/38657-exam-az-304-topic-3-question-3-discussion/
upvoted 80 times
anthonyphuc
3 years, 4 months ago
the question is just there for the knowledge :)))
upvoted 1 times
...
tteesstt
3 years, 6 months ago
Lmao, good one!
upvoted 3 times
...
wwwmmm
2 years, 3 months ago
Right on, and it's a huge strategic cost-saving item for your department in the future!
upvoted 1 times
...
...
...
Harald105
Most Recent 3 years, 4 months ago
Nicely played, pentium75 :)
upvoted 5 times
...
syu31svc
3 years, 7 months ago
https://docs.microsoft.com/en-us/azure/data-factory/create-self-hosted-integration-runtime?tabs=data-factory "A self-hosted integration runtime can run copy activities between a cloud data store and a data store in a private network"
https://docs.microsoft.com/en-us/azure/data-factory/introduction "With Data Factory, you can use the Copy Activity in a data pipeline to move data from both on-premises and cloud source data stores to a centralized data store in the cloud for further analysis"
Answer is correct
upvoted 4 times
...
pentium75
3 years, 7 months ago
In the second box, there is an option "Provision Azure-SQL Server SSIS runtime," which is obviously wrong as we need a self-hosted (not an Azure-SQL) runtime. But still, don't we have to provision the self-hosted runtime in Azure Data Factory before we deploy it to the on-premises server?
upvoted 2 times
...
mahwish
3 years, 10 months ago
2nd is create an import/export job
upvoted 2 times
...
arytech
3 years, 10 months ago
It seems correct to me: since there is no "copy data tool" option in the data factory combo box, the closest one is "create a pipeline", as described in the following references (the last one hits the nail on the head for me). References: https://docs.microsoft.com/en-us/azure/data-factory/copy-activity-overview https://docs.microsoft.com/en-us/azure/data-factory/quickstart-create-data-factory-copy-data-tool
upvoted 3 times
...
SnakePlissken
3 years, 10 months ago
1. StorageV2: the only storage type with storage tiers. "Central Europe" is not an Azure region; in that geographic region, only Germany West Central has storage accounts with storage tiers. By the way, France Central is not situated in Central Europe geographically. 2. ZRS: protection against a single datacenter failure. https://en.wikipedia.org/wiki/Central_Europe https://docs.microsoft.com/en-us/azure/storage/common/storage-redundancy#redundancy-in-the-primary-region
upvoted 1 times
awalao
3 years, 4 months ago
Hey, watch where you post your comment. It belongs under the next question. lmao
upvoted 3 times
...
...
nabylion
4 years, 1 month ago
It should mention: the data to be migrated is from a SQL Server database...
upvoted 2 times
lawry
3 years, 7 months ago
Nope; because it is not a SQL Server DB, box 2 should be the first option. If it were a SQL Server DB, the better way would be the 3rd option for box 2.
upvoted 1 times
...
...
prashantjoge
4 years, 1 month ago
the link provided has no relation to the question asked... Not sure if this question makes sense?
upvoted 3 times
...
Mikie889
4 years, 1 month ago
Correct...
upvoted 2 times
...
Community vote distribution: A (35%), C (25%), B (20%), Other