Exam AZ-305 topic 2 question 2 discussion

Actual exam question from Microsoft's AZ-305
Question #: 2
Topic #: 2

You have an Azure subscription that contains an Azure Blob Storage account named store1.
You have an on-premises file server named Server1 that runs Windows Server 2016. Server1 stores 500 GB of company files.
You need to store a copy of the company files from Server1 in store1.
Which two possible Azure services achieve this goal? Each correct answer presents a complete solution.
NOTE: Each correct selection is worth one point.

  • A. an Azure Logic Apps integration account
  • B. an Azure Import/Export job
  • C. Azure Data Factory
  • D. an Azure Analysis services On-premises data gateway
  • E. an Azure Batch account
Suggested Answer: BC

Comments

Eltooth
Highly Voted 2 years, 12 months ago
Selected Answer: BC
B & C are correct
upvoted 36 times
Eltooth
2 years, 11 months ago
https://docs.microsoft.com/en-gb/azure/storage/blobs/storage-blobs-introduction#move-data-to-blob-storage
upvoted 11 times
...
...
sw1000
Highly Voted 1 year, 6 months ago
Selected Answer: BC
A. an Azure Logic Apps integration account — no; this is an integration service with visual flows and If-Then style logic. It does not provide a way to import data from on-premises to Blob Storage.
B. an Azure Import/Export job — agree with the other people here.
C. Azure Data Factory — agree, it is a way of importing data, though for 500 GB it is a bit of overkill.
D. an Azure Analysis Services On-premises data gateway — not a data import option.
E. an Azure Batch account — part of the Azure Batch service and involves HPC job scheduling etc., but it is not a way of importing or exporting data between on-premises and Azure.
Note: for 500 GB we would probably use AzCopy instead. If it was a typo and actually 500 TB, we would use Azure Data Box Heavy, or maybe the Azure Import/Export service if you provide your own drives.
upvoted 19 times
...
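For reference, sw1000's point that AzCopy is the practical tool at the 500 GB scale would look roughly like the command below. This is an illustrative sketch, not part of the question: the local path `D:\CompanyFiles`, the container name `backup`, and the SAS token are placeholder assumptions; only the account name `store1` comes from the scenario.

```shell
# Illustrative only: run AzCopy on Server1 to copy the file share into store1.
# "D:\CompanyFiles", the "backup" container, and <SAS-token> are placeholders.
azcopy copy "D:\CompanyFiles" "https://store1.blob.core.windows.net/backup?<SAS-token>" --recursive
```

`--recursive` walks the whole directory tree, so the folder structure on Server1 is mirrored as blob name prefixes in the container.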
SeMo0o0o0o
Most Recent 3 weeks, 2 days ago
Selected Answer: BC
B & C are correct
upvoted 1 times
...
Thanveer
4 weeks, 1 day ago
B. an Azure Import/Export job C. Azure Data Factory
upvoted 1 times
...
Thaitanium
5 months, 1 week ago
B and C
upvoted 1 times
...
23169fd
5 months, 2 weeks ago
Selected Answer: BC
B. an Azure Import/Export job — you can use the Azure Import/Export service to securely transfer large amounts of data to Azure Blob Storage by shipping hard drives to an Azure data center.
C. Azure Data Factory — can be used to create a data pipeline that moves files from on-premises to Azure Blob Storage, enabling automated and scheduled transfers.
upvoted 1 times
23169fd
5 months, 2 weeks ago
Why not the other options:
A. Azure Logic Apps integration account — designed for integrating workflows, not typically used for bulk data transfer.
D. Azure Analysis Services On-premises data gateway — used for accessing on-premises data sources from Azure Analysis Services, not for transferring files to Blob Storage.
E. Azure Batch account — intended for running large-scale parallel and batch compute jobs, not for transferring files to Blob Storage.
upvoted 1 times
...
...
Lazylinux
7 months, 1 week ago
Selected Answer: BC
Given answer is correct
upvoted 1 times
...
JimmyYop
10 months, 2 weeks ago
appeared in Exam 01/2024
upvoted 4 times
...
BShelat
12 months ago
Well, B & C seem to be the answers. For B, though, Windows Server 2016 is NOT listed as a supported version based on the following link: https://learn.microsoft.com/en-us/azure/import-export/storage-import-export-requirements
upvoted 1 times
TomdeBom
9 months, 3 weeks ago
I think those OS requirements were only meant to describe older versions of Windows that are still supported (I know, this is bad documentation on MS's part, but MS Learn is far from perfect, documentation-wise). The support note is about the waimportexport.exe tool that is used. https://learn.microsoft.com/en-us/previous-versions/azure/storage/common/storage-import-export-tool-preparing-hard-drives-import#requirements-for-waimportexportexe states that Windows 7, Windows Server 2008 R2, or a newer Windows operating system is supported!
upvoted 1 times
...
...
nav109
1 year ago
Got this on Nov. 17, 2023
upvoted 4 times
...
stonwall12
1 year, 2 months ago
Correct Answer - B & C: Azure Import/Export & Azure Data Factory
Azure Import/Export — used for transferring large amounts of data to and from Azure Blob, File, and Disk storage using physical hard drives. It would be suitable for transferring 500 GB of data. https://docs.microsoft.com/en-us/azure/storage/common/storage-import-export-service
Azure Data Factory — a cloud-based data integration service that can move and integrate data from various sources to various destinations. It would be suitable for copying files from Server1 to Blob Storage. https://learn.microsoft.com/en-us/azure/data-factory/introduction
upvoted 2 times
...
memo454
1 year, 2 months ago
This question is on today's exam. The exam is easier than AZ-104.
upvoted 5 times
...
iamhyumi
1 year, 3 months ago
Got this on Sept. 5, 2023
upvoted 4 times
...
lvz
1 year, 6 months ago
OK, I will go with ADF. However, I don't see the question mentioning connectivity between on-premises and Azure. I think ADF can only reach on-premises data when something connects it to Azure (a self-hosted integration runtime).
upvoted 2 times
...
NotMeAnyWay
1 year, 8 months ago
Selected Answer: BC
B. an Azure Import/Export job
C. Azure Data Factory

B. Azure Import/Export job: this service allows you to securely import or export large amounts of data to or from Azure Blob Storage by shipping hard disk drives to an Azure data center. You can use it to transfer the company files from your on-premises server to the Azure Blob Storage account.

C. Azure Data Factory: a cloud-based data integration service that enables you to create, schedule, and manage data pipelines. You can create a pipeline in Azure Data Factory to copy data from your on-premises file server to Azure Blob Storage. You will need a self-hosted integration runtime installed on your on-premises server to facilitate the data movement between your on-premises server and Azure Blob Storage.
upvoted 5 times
...
memyself2
1 year, 9 months ago
This question was on my exam today (2/26/23) - scored 844. I agree with this answer.
upvoted 6 times
...
ukivanlamlpi
1 year, 9 months ago
Selected Answer: BE
Files are not a fit for Data Factory.
upvoted 2 times
AdventureChick
1 year, 2 months ago
Data Factory can move files. It isn't just for DBs. I accidentally upvoted this when I went to click reply.
upvoted 2 times
...
...
Community vote distribution: A (35%), C (25%), B (20%), Other