Exam AZ-305 topic 4 question 63 discussion

Actual exam question from Microsoft's AZ-305
Question #: 63
Topic #: 4

You have an on-premises application named App1 that uses an Oracle database.
You plan to use Azure Databricks to transform and load data from App1 to an Azure Synapse Analytics instance.
You need to ensure that the App1 data is available to Databricks.
Which two Azure services should you include in the solution? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.

  • A. Azure Data Box Gateway
  • B. Azure Import/Export service
  • C. Azure Data Lake Storage
  • D. Azure Data Box Edge
  • E. Azure Data Factory
Suggested Answer: CE
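To make the suggested answer concrete, here is a minimal sketch of how the two services fit together: ADF lands the Oracle data in ADLS Gen2, then a Databricks notebook reads it and writes to Synapse. All names (`app1lake`, `raw`, `mysynapse`, `App1DW`) are hypothetical, and the PySpark calls are shown as comments because they need a Databricks cluster; the helpers just build the connection strings involved.

```python
# Sketch of the C+E flow: ADF copies on-prem Oracle data into ADLS Gen2,
# then Databricks transforms it and loads it into Azure Synapse Analytics.
# Storage account, container, server, and database names are placeholders.

def abfss_path(container: str, account: str, path: str) -> str:
    """Build an ADLS Gen2 URI of the form spark.read expects in Databricks."""
    return f"abfss://{container}@{account}.dfs.core.windows.net/{path}"

def synapse_jdbc_url(server: str, database: str) -> str:
    """Build a JDBC URL for a dedicated SQL pool in Azure Synapse."""
    return (f"jdbc:sqlserver://{server}.sql.azuresynapse.net:1433;"
            f"database={database};encrypt=true")

source = abfss_path("raw", "app1lake", "app1/orders.parquet")
target = synapse_jdbc_url("mysynapse", "App1DW")

# On a Databricks cluster, the transform-and-load step would then look
# roughly like this (hypothetical table and staging names):
#   df = spark.read.parquet(source)
#   (df.write.format("com.databricks.spark.sqldw")
#      .option("url", target)
#      .option("dbTable", "dbo.Orders")
#      .option("tempDir", abfss_path("staging", "app1lake", "tmp"))
#      .option("forwardSparkAzureStorageCredentials", "true")
#      .save())

print(source)
print(target)
```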

Comments

Snownoodles
Highly Voted 2 years, 2 months ago
Selected Answer: CE
The correct answer should be C and E. ADF moves data from the on-prem Oracle database into Data Lake Storage, which makes the data available to Databricks: https://docs.microsoft.com/en-us/azure/data-factory/load-azure-data-lake-storage-gen2 Databricks then "ETLs" the data into Synapse: https://docs.microsoft.com/en-us/azure/databricks/scenarios/databricks-extract-load-sql-data-warehouse
upvoted 37 times
mufflon
2 years, 2 months ago
Yes, this is the only answer, since they don't ask how to get the data to Azure in the first place.
upvoted 3 times
np2021
1 year, 9 months ago
I thought this at first too, but the first line of the question indicates the Oracle data is on-premises. So I think the question is really asking how to get the data into Azure/into a lake so Databricks can process it, in which case it's an Import/Data Factory requirement. This is a very difficult call to make; when sitting the test, just assume you'll get 1/2 points on this one.
upvoted 4 times
Fidel_104
9 months ago
Indeed, but Data Factory has a lot of connectors, including one that makes it possible to extract data from an Oracle DB :) So you can simply set up a data pipeline in Data Factory that extracts the data from Oracle and saves it to ADLS - therefore I believe CE is right. More info here: https://learn.microsoft.com/en-us/azure/data-factory/connector-oracle?tabs=data-factory
upvoted 1 times
Arthur_zw
2 months, 3 weeks ago
You install the Self-Hosted Integration Runtime to connect ADF to the on-premises Oracle DB, then set up a linked service, then data pipelines. So I believe C & E.
upvoted 1 times
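For illustration, a linked service along the lines Arthur_zw describes might look roughly like this. This is only a sketch following the shape of ADF's Oracle connector definition; the host, integration runtime name `SelfHostedIR1`, Key Vault reference, and credential names are all hypothetical.

```json
{
  "name": "OracleLinkedService",
  "properties": {
    "type": "Oracle",
    "typeProperties": {
      "connectionString": "Host=onprem-db01;Port=1521;Sid=APP1;User Id=etl_user;",
      "password": {
        "type": "AzureKeyVaultSecret",
        "store": { "referenceName": "KeyVaultLS", "type": "LinkedServiceReference" },
        "secretName": "oracle-etl-password"
      }
    },
    "connectVia": {
      "referenceName": "SelfHostedIR1",
      "type": "IntegrationRuntimeReference"
    }
  }
}
```

A copy activity in the pipeline would then use this linked service as its source and an ADLS Gen2 linked service as its sink.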
d365ppp
Highly Voted 1 year, 11 months ago
Selected Answer: BE
The question asks for two services, not storage.
upvoted 8 times
pkkalra
1 year, 9 months ago
Azure Data Lake Storage is a cloud "service" offered by Microsoft.
upvoted 8 times
SeMo0o0o0o
Most Recent 3 weeks ago
Selected Answer: CE
C & E are correct
upvoted 1 times
Len83
3 months, 3 weeks ago
This question appeared in the exam, August 2024. I gave the same answers listed here and scored 870.
upvoted 3 times
Paul_white
1 year ago
Selected Answer: CE
To ensure that the data from App1 is available to Azure Databricks, you should include the following Azure services in your solution:
1. Azure Data Factory (E): can be used to create a data pipeline for ETL (Extract, Transform, Load) processes, which moves your data from the on-premises Oracle database to Azure.
2. Azure Data Lake Storage (C): can act as the intermediary storage area where the extracted data is placed. Azure Databricks is tightly integrated with Azure Data Lake Storage, making it an ideal choice for storing your data.
Please note that while Azure Data Box Gateway and Azure Data Box Edge are used for offline transfer of large amounts of data, and the Azure Import/Export service is used for importing large amounts of data into Azure, they aren't necessary if your data can be transferred online or isn't extremely large.
upvoted 3 times
FurnishedFlapjack
1 year, 3 months ago
Selected Answer: CE
From this link it looks like you can connect directly to an Oracle DB from ADF; it doesn't look like Import/Export would be required. I'm going with CE. https://learn.microsoft.com/en-us/azure/data-factory/connector-oracle?tabs=data-factory
upvoted 2 times
AdventureChick
1 year, 2 months ago
Yes - you can connect ADF to pretty much anything for ETL/ELT
upvoted 2 times
sawanti
1 year, 3 months ago
Selected Answer: CE
ADF - to extract data and load it into Data Lake.
Data Lake - as it's the only storage generally supported by Databricks.
upvoted 1 times
NotMeAnyWay
1 year, 4 months ago
Selected Answer: CE
C. Azure Data Lake Storage
E. Azure Data Factory
Azure Data Lake Storage is a secure, scalable, and reliable data lake that allows you to perform analytics on large amounts of data. It's a great choice for storing large volumes of data, like what App1 might produce.
Azure Data Factory is a cloud-based data integration service that allows you to create data-driven workflows for moving and transforming data at scale. In this case, it can be used to create a pipeline to move the data from your on-premises Oracle database to Azure Data Lake Storage, making it available for further processing with Azure Databricks. Azure Data Factory has built-in support for a wide range of data sources, including Oracle.
After the data is stored in Azure Data Lake Storage, you can use Azure Databricks to transform the data and load it into the Azure Synapse Analytics instance.
upvoted 5 times
wpestan
1 year, 6 months ago
Selected Answer: BE
B and E - our teacher marked this answer as correct in an Azure course.
upvoted 1 times
sawanti
1 year, 3 months ago
Databricks can only read data from Data Lake (and some external sources, but that's not the case here). Where is the Data Lake in your solution? CE is correct: ADF to extract data from the on-premises system and load it into Data Lake, and then Data Lake is mounted to Databricks.
upvoted 1 times
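As a hedged illustration of the "mounted to Databricks" step: on current Azure Databricks, ADLS Gen2 is commonly accessed directly by setting service-principal OAuth options on the Spark session rather than using legacy mounts. The storage account name `app1lake` and the `<app-id>`/`<tenant-id>`/`<secret>` placeholders below are hypothetical; the helper only assembles the configuration keys.

```python
# Spark configuration keys Azure Databricks uses for service-principal
# (OAuth) access to ADLS Gen2. Account name and IDs are placeholders.

def adls_oauth_conf(account: str, client_id: str, tenant_id: str,
                    client_secret: str) -> dict:
    """Return the spark.conf settings for OAuth access to an ADLS Gen2 account."""
    suffix = f"{account}.dfs.core.windows.net"
    return {
        f"fs.azure.account.auth.type.{suffix}": "OAuth",
        f"fs.azure.account.oauth.provider.type.{suffix}":
            "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        f"fs.azure.account.oauth2.client.id.{suffix}": client_id,
        f"fs.azure.account.oauth2.client.secret.{suffix}": client_secret,
        f"fs.azure.account.oauth2.client.endpoint.{suffix}":
            f"https://login.microsoftonline.com/{tenant_id}/oauth2/token",
    }

conf = adls_oauth_conf("app1lake", "<app-id>", "<tenant-id>", "<secret>")
# On a cluster you would then apply the settings:
#   for k, v in conf.items():
#       spark.conf.set(k, v)
```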
Tr619899
1 year, 6 months ago
Azure Data Lake Storage (Option C): provides a scalable and secure repository for storing large amounts of data. You can ingest data from App1 into Azure Data Lake Storage and then make it available for processing in Azure Databricks.
Azure Data Factory (Option E): a fully managed data integration service that allows you to orchestrate and automate data movement and data transformation workflows. You can use Azure Data Factory to extract data from App1, transform it using Azure Databricks, and then load it into Azure Synapse Analytics.
upvoted 1 times
yonie
1 year, 7 months ago
Selected Answer: CE
In AZ-304 the answer was ADLS and ADF, meaning CE: https://www.examtopics.com/discussions/microsoft/view/51579-exam-az-304-topic-3-question-20-discussion/
upvoted 7 times
rex303
1 year, 8 months ago
Selected Answer: CE
This scenario is consistent with using C and E. Azure Data Factory is a recommended solution for migrating Oracle data, and Azure Data Lake Storage can then hold the data in a usable format for the chosen compute: Azure Databricks. Azure Data Box Gateway does not natively support Oracle, Azure Data Box Edge is an appliance rather than a service, and the Azure Import/Export service is for one-shot migrations, which isn't really suitable for this scenario.
upvoted 2 times
cp2323
1 year, 9 months ago
Selected Answer: CE
CE should be the answer - why would someone want to use the Azure Import/Export service here?
upvoted 2 times
zellck
1 year, 9 months ago
Selected Answer: CE
CE is the answer.
https://learn.microsoft.com/en-us/azure/storage/blobs/data-lake-storage-introduction#designed-for-enterprise-big-data-analytics
Data Lake Storage Gen2 makes Azure Storage the foundation for building enterprise data lakes on Azure. Designed from the start to service multiple petabytes of information while sustaining hundreds of gigabits of throughput, Data Lake Storage Gen2 allows you to easily manage massive amounts of data.
https://learn.microsoft.com/en-us/azure/data-factory/introduction
Big data requires a service that can orchestrate and operationalize processes to refine these enormous stores of raw data into actionable business insights. Azure Data Factory is a managed cloud service that's built for these complex hybrid extract-transform-load (ETL), extract-load-transform (ELT), and data integration projects.
upvoted 5 times
zellck
1 year, 9 months ago
https://learn.microsoft.com/en-us/azure/synapse-analytics/migration-guides/oracle/7-beyond-data-warehouse-migration
A key reason to migrate your existing data warehouse to Azure Synapse Analytics is to utilize a globally secure, scalable, low-cost, cloud-native, pay-as-you-use analytical database. With Azure Synapse, you can integrate your migrated data warehouse with the complete Microsoft Azure analytical ecosystem to take advantage of other Microsoft technologies and modernize your migrated data warehouse. Those technologies include:
- Azure Data Lake Storage for cost-effective data ingestion, staging, cleansing, and transformation. Data Lake Storage can free up the data warehouse capacity occupied by fast-growing staging tables.
- Azure Data Factory for collaborative IT and self-service data integration with connectors to cloud and on-premises data sources and streaming data.
upvoted 2 times
Rams_84zO6n
1 year, 9 months ago
Selected Answer: BE
I remember answering another question, Topic 1 Q26 (https://www.examtopics.com/exams/microsoft/az-305/view/6/), where the solution suggests Import/Export might be a good option for continuously ingesting on-premises data upstream of processing it with an ADF pipeline.
upvoted 1 times
Rams_84zO6n
1 year, 9 months ago
Selected Answer: BE
on-premises data => Azure Synapse Link for Dataverse (import/export) => Data Factory (data pipeline) => Databricks
upvoted 1 times
Eusouzati
1 year, 9 months ago
Selected Answer: CE
Correct: C. Azure Data Lake Storage and E. Azure Data Factory.
upvoted 1 times