Exam AZ-305 topic 1 question 5 discussion

Actual exam question from Microsoft's AZ-305
Question #: 5
Topic #: 1

HOTSPOT -
You plan to deploy Azure Databricks to support a machine learning application. Data engineers will mount an Azure Data Lake Storage account to the Databricks file system. Permissions to folders are granted directly to the data engineers.
You need to recommend a design for the planned Databricks deployment. The solution must meet the following requirements:
✑ Ensure that the data engineers can only access folders to which they have permissions.
✑ Minimize development effort.
✑ Minimize costs.
What should you include in the recommendation? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:

Suggested Answer:
Box 1: Premium -
The Premium Databricks SKU is required for credential passthrough.

Box 2: Credential passthrough -
Credential passthrough authenticates automatically to Azure Data Lake Storage Gen1 (ADLS Gen1) and Azure Data Lake Storage Gen2 (ADLS Gen2) from Azure Databricks clusters using the same Azure Active Directory (Azure AD) identity that you use to log in to Azure Databricks. When you enable Azure Data Lake Storage credential passthrough for your cluster, commands that you run on that cluster can read and write data in Azure Data Lake Storage without requiring you to configure service principal credentials for access to storage.
Reference:
https://docs.microsoft.com/en-us/azure/databricks/security/credential-passthrough/adls-passthrough
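The mount pattern from the referenced doc can be sketched as follows. The container, storage account, and mount names are placeholders, and the snippet only runs inside a Databricks notebook on a cluster with credential passthrough enabled:

```python
# Runs in a Databricks notebook on a cluster with
# "Enable credential passthrough for user-level data access" turned on.
# <container> and <storage-account> are placeholders.
configs = {
    "fs.azure.account.auth.type": "CustomAccessToken",
    "fs.azure.account.custom.token.provider.class":
        spark.conf.get("spark.databricks.passthrough.adls.gen2.tokenProviderClassName")
}

dbutils.fs.mount(
    source = "abfss://<container>@<storage-account>.dfs.core.windows.net/",
    mount_point = "/mnt/datalake",
    extra_configs = configs)

# Reads and writes under /mnt/datalake are authorized with each user's own
# Azure AD identity, so folder-level ACLs in the storage account are enforced
# per data engineer.
```

Because authorization happens with the caller's own identity, no service principal secrets need to be created or managed, which is what keeps development effort low.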

Comments

Tyler2021
Highly Voted 2 months, 2 weeks ago
Databricks SKU should be a Premium plan. As the doc states both cloud storage access and credential passthrough features will need a Premium plan. https://docs.microsoft.com/en-us/azure/databricks/sql/user/security/cloud-storage-access https://docs.microsoft.com/en-us/azure/databricks/security/credential-passthrough/adls-passthrough#adls-aad-credentials
upvoted 81 times
sadako
2 years, 9 months ago
Premium Credential Passthrough
upvoted 22 times
Shadow983
2 years, 11 months ago
Agree. The SKU should be Premium.
upvoted 15 times
Shadoken
2 years, 4 months ago
«Standard clusters with credential passthrough are limited to a single user. Standard clusters support Python, SQL, Scala, and R. On Databricks Runtime 6.0 and above, SparkR is supported; on Databricks Runtime 10.1 and above, sparklyr is supported.» - https://docs.microsoft.com/en-us/azure/databricks/security/credential-passthrough/adls-passthrough#--enable-azure-data-lake-storage-credential-passthrough-for-a-standard-cluster Yes, we need premium SKU
upvoted 10 times
daws08322
2 months, 2 weeks ago
https://learn.microsoft.com/en-us/azure/databricks/data-governance/credential-passthrough/adls-passthrough
Requirements:
- Premium plan. See Upgrade or Downgrade an Azure Databricks Workspace for details on upgrading a standard plan to a premium plan.
- An Azure Data Lake Storage Gen1 or Gen2 storage account. Azure Data Lake Storage Gen2 storage accounts must use the hierarchical namespace to work with Azure Data Lake Storage credential passthrough. See Create a storage account for instructions on creating a new ADLS Gen2 account, including how to enable the hierarchical namespace.
- Properly configured user permissions to Azure Data Lake Storage. An Azure Databricks administrator needs to ensure that users have the correct roles, for example, Storage Blob Data Contributor, to read and write data stored in Azure Data Lake Storage. See Use the Azure portal to assign an Azure role for access to blob and queue data.
- You cannot use a cluster configured with ADLS credentials, for example, service principal credentials, with credential passthrough.
upvoted 2 times
ageorgieva
5 months, 1 week ago
Hi daws08322, "This documentation has been retired and might not be updated. Credential passthrough is deprecated starting with Databricks Runtime 15.0 and will be removed in future Databricks Runtime versions. Databricks recommends that you upgrade to Unity Catalog. Unity Catalog simplifies security and governance of your data by providing a central place to administer and audit data access across multiple workspaces in your account. See What is Unity Catalog?. For heightened security and governance posture, contact your Azure Databricks account team to disable credential passthrough in your Azure Databricks account."
upvoted 1 times
NotMeAnyWay
Highly Voted 2 months, 2 weeks ago
Recommended design for the planned Databricks deployment that meets the given requirements:
- Databricks SKU: Premium. The Premium SKU provides access control for the DBFS root and FUSE mount points, which ensures that the data engineers can only access folders to which they have permissions.
- Cluster configuration: Credential passthrough. Credential passthrough allows users to authenticate with Azure Data Lake Storage using their own Azure AD credentials. This minimizes development effort and costs, as it does not require additional Azure AD application registration and service principal management.
Therefore, the recommended design is to use the Premium SKU for access control of the DBFS root and FUSE mount points, and to configure credential passthrough for authentication with Azure Data Lake Storage. This design ensures data engineers can only access folders to which they have permissions while minimizing development effort and costs.
upvoted 14 times
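For reference, enabling passthrough when creating a cluster through the Clusters API comes down to one Spark conf entry. In this sketch the cluster name, runtime version, node type, and worker count are placeholder assumptions; only the `spark.databricks.passthrough.enabled` flag is the documented switch:

```json
{
  "cluster_name": "passthrough-demo",
  "spark_version": "10.4.x-scala2.12",
  "node_type_id": "Standard_DS3_v2",
  "num_workers": 1,
  "spark_conf": {
    "spark.databricks.passthrough.enabled": "true"
  }
}
```

On a Standard cluster this configuration is limited to a single user; a High Concurrency (Premium) cluster is what allows multiple data engineers to share the cluster while each keeps their own identity.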
SeMo0o0o0o
Most Recent 3 weeks, 3 days ago
CORRECT
upvoted 1 times
BShelat
2 months, 2 weeks ago
https://learn.microsoft.com/en-us/azure/databricks/archive/credential-passthrough/adls-passthrough This documentation has been retired and might not be updated. Credential passthrough is a legacy data governance model. Databricks recommends that you upgrade to Unity Catalog. Unity Catalog simplifies security and governance of your data by providing a central place to administer and audit data access across multiple workspaces in your account. See What is Unity Catalog?. Note: I think in future Data lake storage account access will be through Unity Catalog to govern the data for Databricks. So new question in future tests may be related to Unity Catalog and managed identities.
upvoted 6 times
stjokerli
11 months, 2 weeks ago
Verified
upvoted 2 times
...
...
stonwall12
2 months, 2 weeks ago
Correct Answer - Databricks SKU: Premium
The Premium SKU for Azure Databricks provides enhanced security features, including integration with Azure Active Directory (Azure AD). By using Azure AD, you can enforce role-based access control (RBAC) and allow for directory-based authentication.
https://learn.microsoft.com/en-us/azure/databricks/introduction/
https://azure.microsoft.com/en-au/pricing/details/databricks/
Correct Answer - Cluster configuration: Credential passthrough
Credential passthrough allows users to authenticate to Azure Data Lake Storage using their personal Azure Active Directory (Azure AD) credentials. As a result, they will only be able to access the folders and data to which they have been granted permission.
NOTE: Credential passthrough is a legacy data governance model. Databricks recommends that you upgrade to Unity Catalog.
https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/
upvoted 2 times
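As a hedged sketch of the Unity Catalog model these comments point to (the catalog, schema, table, and group names below are made up for illustration), folder-style permissions become centrally administered SQL grants instead of per-folder storage ACLs:

```python
# Hypothetical securable and group names; runs in a Databricks notebook
# attached to a Unity Catalog-enabled workspace.
spark.sql("GRANT USE CATALOG ON CATALOG lakehouse TO `data-engineers`")
spark.sql("GRANT USE SCHEMA ON SCHEMA lakehouse.raw TO `data-engineers`")
spark.sql("GRANT SELECT ON TABLE lakehouse.raw.events TO `data-engineers`")
```

The design intent is the same as credential passthrough, access scoped to the individual or group, but the grants live in one governed catalog rather than in storage-account ACLs.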
Ghoshy
2 months, 2 weeks ago
It is Standard and Credentials Passthrough considering the fact that we need to minimize costs. You do not need to use the Premium SKU of Azure Data Lake Storage to enable credential passthrough or to support multiple users. Both of these features are available in both the Standard and Premium SKUs of Azure Data Lake Storage. The Premium SKU of Azure Data Lake Storage offers additional features and performance improvements, such as higher throughput and lower latencies, but it is not required to enable credential passthrough or to support multiple users.
upvoted 3 times
SpurdoSparde
1 year, 11 months ago
Could you provide any reference though? A link to documentation would be valuable
upvoted 2 times
...
...
Munishrrm
6 months ago
How can we save cost if we select Premium? The question asks to minimize cost.
upvoted 1 times
Munishrrm
6 months, 1 week ago
To minimize cost, how can we choose the Premium option?
upvoted 1 times
mtc9
10 months, 3 weeks ago
Credentials can give you access based on RBAC to the whole container, but this question requires folder-level access, which would need SAS. Isn't the answer then Secret Scope?
upvoted 1 times
flash007
1 year, 3 months ago
A Premium plan is required for Databricks.
upvoted 2 times
KrisDeb
1 year, 6 months ago
Just a heads up, will be probably removed after the exam update: 'Credential passthrough is a legacy data governance model. Databricks recommends that you upgrade to Unity Catalog.' https://learn.microsoft.com/en-us/azure/databricks/data-governance/credential-passthrough/adls-passthrough
upvoted 6 times
jj22222
1 year, 9 months ago
Premium Credential Passthrough
upvoted 2 times
jameslee
1 year, 9 months ago
https://learn.microsoft.com/en-us/azure/databricks/data-governance/credential-passthrough/adls-passthrough#adls-aad-credentials "Azure Data Lake Storage credential passthrough is supported with Azure Data Lake Storage Gen1 and Gen2 only. Azure Blob storage does not support credential passthrough."
upvoted 1 times
OPT_001122
1 year, 10 months ago
1 Premium 2 Credential Passthrough
upvoted 2 times
Bummer_boy
1 year, 10 months ago
Didn't know it should have been the Premium SKU for the cluster. I never used this feature in practice.
upvoted 1 times
ejml
2 years, 5 months ago
The documentation is clear: "Standard clusters with credential passthrough are limited to a single user. Standard clusters support Python, SQL, Scala, and R. On Databricks Runtime 6.0 and above, SparkR is supported; on Databricks Runtime 10.1 and above, sparklyr is supported." So, we need Premium.
upvoted 4 times
Shadoken
2 years, 4 months ago
Yes, you are right: https://docs.microsoft.com/en-us/azure/databricks/security/credential-passthrough/adls-passthrough#--enable-azure-data-lake-storage-credential-passthrough-for-a-standard-cluster
upvoted 2 times
OCHT
2 years, 5 months ago
Then, what kind of plan is required for the Databricks SKU: Premium or Standard?
upvoted 1 times