
Exam Certified Data Engineer Professional topic 1 question 13 discussion

Actual exam question from Databricks' Certified Data Engineer Professional
Question #: 13
Topic #: 1

An upstream system is emitting change data capture (CDC) logs that are being written to a cloud object storage directory. Each record in the log indicates the change type (insert, update, or delete) and the values for each field after the change. The source table has a primary key identified by the field pk_id.
For auditing purposes, the data governance team wishes to maintain a full record of all values that have ever been valid in the source system. For analytical purposes, only the most recent value for each record needs to be recorded. The Databricks job to ingest these records occurs once per hour, but each individual record may have changed multiple times over the course of an hour.
Which solution meets these requirements?

  • A. Create a separate history table for each pk_id; resolve the current state of the table by running a union all, filtering the history tables for the most recent state.
  • B. Use MERGE INTO to insert, update, or delete the most recent entry for each pk_id into a bronze table, then propagate all changes throughout the system.
  • C. Iterate through an ordered set of changes to the table, applying each in turn; rely on Delta Lake's versioning ability to create an audit log.
  • D. Use Delta Lake's change data feed to automatically process CDC data from an external system, propagating all changes to all dependent tables in the Lakehouse.
  • E. Ingest all log information into a bronze table; use MERGE INTO to insert, update, or delete the most recent entry for each pk_id into a silver table to recreate the current table state.
Suggested Answer: E
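The pattern behind answer E can be sketched in plain Python. This is an illustration only: the record shape, the field names (pk_id, change_type, fields, ts), and the in-memory "tables" are hypothetical stand-ins for the Delta bronze/silver tables and the MERGE INTO statement an actual Databricks job would use.

```python
from typing import Dict, List

bronze: List[dict] = []       # append-only audit history (stand-in for the bronze table)
silver: Dict[int, dict] = {}  # current state keyed by pk_id (stand-in for the silver table)

def ingest_hourly_batch(batch: List[dict]) -> None:
    """Append every raw CDC log record to bronze, then merge only the
    newest change per pk_id into silver (the step a real Databricks
    job would express with MERGE INTO)."""
    bronze.extend(batch)  # keep the full audit record for governance
    latest: Dict[int, dict] = {}
    for rec in sorted(batch, key=lambda r: r["ts"]):
        latest[rec["pk_id"]] = rec  # the later timestamp wins per key
    for rec in latest.values():
        if rec["change_type"] == "delete":
            silver.pop(rec["pk_id"], None)  # matched delete -> remove row
        else:
            silver[rec["pk_id"]] = rec["fields"]  # insert/update -> upsert
```

After one batch containing an insert and a later update for the same pk_id, bronze holds both records (the audit requirement) while silver holds only the updated values (the analytical requirement).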

Comments

Eertyy
Highly Voted 1 year, 2 months ago
The answer given is correct
upvoted 9 times
Eertyy
1 year, 2 months ago
I want to correct my response. It seems the right answer is Option D: it leverages Delta Lake's built-in capabilities for handling CDC data. It is designed to efficiently capture, process, and propagate changes, making it a more robust and scalable solution, particularly for large-scale data scenarios with frequent updates and auditing requirements.
upvoted 2 times
Starvosxant
1 year, 1 month ago
Databricks is NOT able to process CDC alone. It needs an intermediary tool to land the changes on object storage and then ingest them. So how can it be D?
upvoted 5 times
sturcu
1 year, 1 month ago
Option D states "process CDC data from an external system", so this is about Delta CDF.
upvoted 1 times
benni_ale
Most Recent 1 week, 3 days ago
Selected Answer: E
You can only read the change data feed for enabled tables. You must explicitly enable the change data feed option using one of the following methods: TBLPROPERTIES (delta.enableChangeDataFeed = true). This means it is a Delta feature, or in other words a feature supported by Delta tables. The data to process in the question is external, so it is not a Delta table => can't be B... Hopefully I am correct.
upvoted 1 times
benni_ale
1 week, 3 days ago
I meant: can't be D.
upvoted 1 times
benni_ale
1 month, 1 week ago
Selected Answer: E
E. Databricks CDF is not designed to process external CDC. If you have external CDC you can send it to bronze for auditing purposes, then use bronze to build silver, where you keep only the most recent valid records.
upvoted 1 times
databrick_work
2 months, 1 week ago
E is correct
upvoted 1 times
imatheushenrique
5 months, 3 weeks ago
E. This is the correct answer because it meets the requirements of maintaining a full record of all values that have ever been valid in the source system and recreating the current table state with only the most recent value for each record. The job ingests all log information into a bronze table, which preserves the raw CDC data as is. Then, it uses MERGE INTO to perform an upsert operation on a silver table, meaning it will insert new records, or update or delete existing records, based on the change type and the pk_id column. This way, the silver table always reflects the current state of the source table, while the bronze table keeps the history of all changes.
upvoted 4 times
spaceexplorer
10 months ago
Selected Answer: E
The answer is E
upvoted 2 times
RafaelCFC
10 months, 2 weeks ago
Selected Answer: E
Complementing kz_data's response: be aware that the data being consumed is not a Databricks CDC data feed object, but rather CDC coming from somewhere else, that is, just regular data. So, indeed, it can't be processed without another tool.
upvoted 1 times
kz_data
10 months, 2 weeks ago
Selected Answer: E
Answer E is correct. As the CDC captured from the external database may contain duplicates for the same pk_id (key) due to multiple updates within the processed hour, we need to take the most recent update for each pk_id, and then MERGE into a silver table.
upvoted 2 times
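kz_data's point, that multiple changes to the same pk_id within the processed hour must first be collapsed to the newest one before the MERGE, can be sketched in plain Python (the field names pk_id and ts are hypothetical; in Spark SQL this step is typically expressed as ROW_NUMBER() OVER (PARTITION BY pk_id ORDER BY ts DESC) = 1):

```python
def latest_per_key(batch):
    """Collapse an hourly CDC batch to the newest change per pk_id:
    the single-pass equivalent of ranking records by timestamp
    within each key and keeping rank 1."""
    newest = {}
    for rec in batch:
        cur = newest.get(rec["pk_id"])
        if cur is None or rec["ts"] > cur["ts"]:
            newest[rec["pk_id"]] = rec  # keep the later change for this key
    return list(newest.values())
```

The deduplicated batch is then safe to use as the MERGE source, since a MERGE source must not match the same target row more than once.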
a560fe1
10 months, 2 weeks ago
CDF captures changes only from a Delta table and is only forward-looking once enabled. The CDC logs are written to object storage. So you would need to ingest those logs and merge them into downstream tables; hence the answer is E.
upvoted 2 times
hamzaKhribi
11 months, 4 weeks ago
Selected Answer: D
For me the answer is D. The question states that CDC logs are emitted to external storage, meaning they can be ingested into the bronze layer into a table with CDF enabled. In this case we let Databricks handle the complexity of following changes and only worry about data quality. With CDF enabled, Databricks will already prepare the audit data for us via table_changes (pre-image and post-image) and also give us the last updated value for our use case. Here is a similar example: https://www.databricks.com/blog/2021/06/09/how-to-simplify-cdc-with-delta-lakes-change-data-feed.html
upvoted 3 times
spaceexplorer
10 months ago
This article shows exactly why D is not right. Since "CDF captures changes only from a Delta table and is only forward-looking once enabled."
upvoted 4 times
sturcu
1 year, 1 month ago
Selected Answer: E
E is correct
upvoted 3 times