
Exam Certified Data Engineer Associate topic 1 question 77 discussion

Actual exam question from Databricks' Certified Data Engineer Associate
Question #: 77
Topic #: 1

A data engineer and data analyst are working together on a data pipeline. The data engineer is working on the raw, bronze, and silver layers of the pipeline using Python, and the data analyst is working on the gold layer of the pipeline using SQL. The raw source of the pipeline is a streaming input. They now want to migrate their pipeline to use Delta Live Tables.

Which of the following changes will need to be made to the pipeline when migrating to Delta Live Tables?

  • A. None of these changes will need to be made
  • B. The pipeline will need to stop using the medallion-based multi-hop architecture
  • C. The pipeline will need to be written entirely in SQL
  • D. The pipeline will need to use a batch source in place of a streaming source
  • E. The pipeline will need to be written entirely in Python
Suggested Answer: D 🗳️

Comments

hussamAlHunaiti
Highly Voted 5 months, 2 weeks ago
Selected Answer: D
I had the exam today and options A & B didn't exist; the correct answer is D.
upvoted 9 times
...
vigaro
Highly Voted 5 months ago
Selected Answer: D
"None of these" is never the answer.
upvoted 7 times
...
gul1016
Most Recent 2 days, 22 hours ago
The correct answer is: A. None of these changes will need to be made.
Explanation:
  • Delta Live Tables (DLT) supports both Python and SQL: the data engineer can continue writing transformations for the raw, bronze, and silver layers in Python, and the data analyst can work on the gold layer in SQL.
  • Medallion-based architecture: Delta Live Tables is well-suited for the medallion architecture (raw -> bronze -> silver -> gold) and is commonly used to build reliable and maintainable data pipelines.
  • Streaming sources: Delta Live Tables fully supports streaming inputs and can handle both batch and streaming sources natively.
  • Flexibility in implementation: Delta Live Tables does not impose restrictions that require pipelines to be written entirely in either SQL or Python. Both languages can coexist in the same pipeline as needed.
Thus, no major changes are required for the migration to Delta Live Tables.
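For illustration, a minimal sketch of how the engineer's Python side could be declared in DLT, assuming hypothetical table/column names and a placeholder source path (`spark` is provided by the DLT runtime; the analyst's gold layer can stay in SQL in another notebook of the same pipeline):

```python
import dlt
from pyspark.sql import functions as F

# Bronze: streaming ingest with Auto Loader (format and path are placeholders)
@dlt.table(comment="Raw events ingested as a stream")
def bronze_events():
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load("/mnt/raw/events")  # hypothetical source location
    )

# Silver: incremental cleanup of the bronze stream
@dlt.table(comment="Cleaned events")
def silver_events():
    return (
        dlt.read_stream("bronze_events")
        .where(F.col("event_id").isNotNull())  # hypothetical column
    )
```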
upvoted 1 times
...
lj114
1 week ago
Selected Answer: A
A is correct
upvoted 1 times
...
ajay1709
2 weeks, 2 days ago
The right answer is not listed here. The right answer is "Different notebooks may be used for SQL and Python."
upvoted 1 times
...
CommanderBigMac
2 months ago
Selected Answer: D
D is the answer
upvoted 1 times
...
9d4d68a
2 months, 3 weeks ago
A. None of these changes will need to be made. You can continue using the medallion-based architecture, and you do not need to switch entirely to SQL or Python. Delta Live Tables will work with your existing streaming sources and supports both SQL and Python.
upvoted 1 times
...
80370eb
3 months ago
Selected Answer: A
When migrating to Delta Live Tables, you can continue using the medallion-based architecture, work with streaming sources, and write the pipeline in either SQL or Python. Therefore, no major changes are required for the pipeline in this scenario.
upvoted 2 times
...
jaromarg
5 months, 2 weeks ago
D: Delta Live Tables is primarily designed to work with batch processing rather than streaming. This means that when migrating a pipeline to Delta Live Tables, any streaming sources used in the original pipeline will need to be replaced with batch sources.

In the scenario described, where the raw source of the pipeline is a streaming input, the data engineer and data analyst will need to modify their pipeline to read data from a batch source instead. This could involve changing the way data is ingested and processed to align with batch processing paradigms rather than streaming.

Additionally, Delta Live Tables enables the integration of both SQL and Python code within a pipeline, so there's no strict requirement to write the pipeline entirely in SQL or Python. Both the data engineer's Python code for the raw, bronze, and silver layers and the data analyst's SQL code for the gold layer can still be used within the Delta Live Tables environment.

Overall, the key change needed when migrating to Delta Live Tables in this scenario is transitioning from a streaming input source to a batch source to align with the batch processing nature of Delta Live Tables.
upvoted 4 times
jaromarg
5 months, 2 weeks ago
Yes, it must be A. Language support: DLT allows the use of both SQL and Python, so you can integrate the existing Python and SQL code within the DLT framework.
upvoted 1 times
...
...
benni_ale
6 months, 3 weeks ago
Selected Answer: A
A is correct
upvoted 2 times
...
Arunava05
7 months ago
Cleared the exam today. Options A and B were not available in the exam; there was a different option which was correct.
upvoted 3 times
...
AndreFR
11 months, 1 week ago
Selected Answer: A
B - DLT supports the medallion architecture (see the example in https://docs.databricks.com/en/delta-live-tables/transform.html#combine-streaming-tables-and-materialized-views-in-a-single-pipeline)
C - DLT can mix Python and SQL using multiple notebooks (per https://docs.databricks.com/en/delta-live-tables/tutorial-python.html: you cannot mix languages within a Delta Live Tables source code file, but you can use multiple notebooks or files with different languages in a pipeline)
D - DLT manages streaming sources using streaming tables (e.g. https://docs.databricks.com/en/delta-live-tables/load.html#load-data-from-a-message-bus)
E - DLT supports Python and SQL (https://docs.databricks.com/en/delta-live-tables/tutorial-python.html and https://docs.databricks.com/en/delta-live-tables/tutorial-sql.html)
The correct answer is A, by elimination.
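As a rough sketch of the message-bus case linked under D (broker address, topic, and column handling are placeholder assumptions, not from the question):

```python
import dlt
from pyspark.sql import functions as F

# Bronze streaming table fed from a Kafka topic; connection details are hypothetical
@dlt.table(comment="Raw messages from a Kafka topic")
def kafka_bronze():
    return (
        spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers", "broker-1:9092")  # placeholder broker
        .option("subscribe", "events")                        # placeholder topic
        .load()
        .select(F.col("value").cast("string").alias("payload"), "timestamp")
    )
```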
upvoted 4 times
...
kz_data
11 months, 2 weeks ago
Selected Answer: A
I think the answer is A
upvoted 1 times
...
nedlo
11 months, 3 weeks ago
Selected Answer: A
It should be A. The medallion architecture can be used in a DLT pipeline (https://www.databricks.com/glossary/medallion-architecture): "Databricks provides tools like Delta Live Tables (DLT) that allow users to instantly build data pipelines with Bronze, Silver and Gold tables from just a few lines of code."
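To round out the Bronze/Silver/Gold picture, a gold-layer aggregation could look like the sketch below. It is shown in Python only for consistency with the earlier snippets; in the question's scenario the analyst would write the equivalent in SQL, and all names are hypothetical:

```python
import dlt
from pyspark.sql import functions as F

# Gold: daily counts derived from the silver table defined elsewhere in the same pipeline
@dlt.table(comment="Daily event counts for reporting")
def gold_daily_event_counts():
    return (
        dlt.read("silver_events")                               # DLT resolves the dependency
        .groupBy(F.to_date("event_time").alias("event_date"))   # hypothetical column
        .agg(F.count("*").alias("events"))
    )
```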
upvoted 2 times
...
Huroye
1 year ago
The correct answer is A. DLT needs a notebook where you specify the processing.
upvoted 3 times
...
mokrani
1 year ago
Selected Answer: A
Answer A: they will have to adapt their notebooks' code to be able to declare the DLT pipeline. However, that change is not proposed in the answers, so I think it must be A.
upvoted 1 times
...
hsks
1 year ago
Answer should be A.
upvoted 2 times
...
Community vote distribution: A (35%), C (25%), B (20%), Other