Correct answer is E. Storage location is optional:
"(Optional) Enter a Storage location for output data from the pipeline. The system uses a default location if you leave Storage location empty."
D. A location of a target database for the written data
Why this is correct: When creating a Delta Live Tables (DLT) pipeline, you must specify the target database where the resulting data will be written. This ensures that the output of the pipeline is stored properly.
Why the other options are incorrect:
A. A key-value pair configuration: While configurations are useful, they are not mandatory when setting up a DLT pipeline.
B. The preferred DBU/hour cost: You don't specify a cost directly; the DBU is associated with the cluster used.
C. A path to cloud storage location for the written data: While storage paths may be specified, the target database location is required.
E. At least one notebook library: You specify the transformation logic (which could be in notebooks), but this is not a strict requirement for setting up the pipeline itself.
This is a key requirement for creating a Delta Live Tables pipeline. You need to specify notebooks that contain the ETL logic to be executed by the pipeline.
Per Databricks documentation (see below), you need to select a destination for datasets published by the pipeline, either the Hive metastore or Unity Catalog. I think A is incorrect because it uses the term "Notebook Library" and not just "Notebook".
Databricks doc: https://docs.databricks.com/en/delta-live-tables/tutorial-pipelines.html
"you need to select a destination for datasets published by the pipeline". This is true if you have a notebook that is writing out a result dataset. However, nothing in this question or documentation states that a Delta Live Tables Pipeline --MUST-- contain a notebook that write dataset results.
E. At least one notebook library to be executed.
Explanation:
https://docs.databricks.com/en/delta-live-tables/tutorial-pipelines.html
Delta Live Tables pipelines execute notebook libraries as part of their operations. These notebooks contain the logic, code, or instructions defining the data processing steps, transformations, or actions to be performed within the pipeline.
Specifying at least one notebook library to be executed is crucial when creating a new Delta Live Tables pipeline, as it defines the sequence of operations and the logic to be executed on the data within the pipeline, aligning with the documentation provided.
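For context, here is a minimal sketch of what such a notebook library might contain. This is illustrative only, assuming a Python notebook; the table names, column name, and source path are hypothetical, and `spark` is the session that Databricks provides automatically inside a notebook:

```python
# Minimal sketch of a Delta Live Tables notebook library (illustrative only).
import dlt
from pyspark.sql.functions import col

@dlt.table(comment="Example bronze table defined in the notebook library.")
def example_bronze():
    # Read a hypothetical source dataset; any Spark-readable source would work here.
    return spark.read.format("json").load("/databricks-datasets/example/source/")

@dlt.table(comment="Example silver table derived from the bronze table.")
def example_silver():
    # dlt.read resolves the dependency on the bronze table defined above.
    return dlt.read("example_bronze").where(col("value").isNotNull())
```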
This should be E. As per the link https://docs.databricks.com/en/delta-live-tables/tutorial-pipelines.html
Create a pipeline:
1. Click Workflows in the sidebar, click the Delta Live Tables tab, and click Create Pipeline.
2. Give the pipeline a name and click the file picker icon to select a notebook.
3. Select Triggered for Pipeline Mode.
4. (Optional) Enter a Storage location for output data from the pipeline. The system uses a default location if you leave Storage location empty.
5. (Optional) Specify a Target schema to publish your dataset to the Hive metastore, or a Catalog and a Target schema to publish your dataset to Unity Catalog. See Publish datasets.
6. (Optional) Click Add notification to configure one or more email addresses to receive notifications for pipeline events. See Add email notifications for pipeline events.
7. Click Create.
Answer E: I think it might be E. https://docs.databricks.com/en/delta-live-tables/settings.html says that the target schema and storage location are optional, which leaves us with E.
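To illustrate that point, a minimal pipeline configuration roughly boils down to the sketch below, written here as a Python dict mirroring the pipeline's JSON settings. The notebook path is hypothetical; storage and target are simply omitted because they are optional:

```python
# Rough sketch of minimal DLT pipeline settings, expressed as a Python dict.
# Field names follow the pipeline JSON settings; the notebook path is hypothetical.
minimal_pipeline_settings = {
    "name": "example-dlt-pipeline",
    "continuous": False,  # "Triggered" mode in the UI
    "libraries": [
        # At least one notebook library with the pipeline's ETL logic.
        {"notebook": {"path": "/Repos/example/dlt_notebook"}}
    ],
    # "storage" and "target" are left out: both are optional per the settings docs,
    # and the system falls back to a default storage location.
}
```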
C. A path to a cloud storage location for the written data: I read this option as referring to source data stored in cloud storage and ingested into DLT using Auto Loader, not a required pipeline setting.
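For what it's worth, ingesting from a cloud storage path with Auto Loader inside a DLT notebook usually looks roughly like this; the bucket path and file format are made up for the example:

```python
# Sketch of Auto Loader ingestion inside a DLT notebook (illustrative paths).
import dlt

@dlt.table(comment="Streaming ingestion of raw files with Auto Loader.")
def raw_events():
    return (
        spark.readStream.format("cloudFiles")          # Auto Loader source
        .option("cloudFiles.format", "json")           # format of the incoming files
        .load("s3://example-bucket/landing/events/")   # hypothetical cloud storage path
    )
```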
Community vote distribution: A (35%), C (25%), B (20%), Other.