Exam Professional Data Engineer topic 1 question 25 discussion

Actual exam question from Google's Professional Data Engineer
Question #: 25
Topic #: 1
[All Professional Data Engineer Questions]

You want to use Google Stackdriver Logging to monitor Google BigQuery usage. You need an instant notification to be sent to your monitoring tool when new data is appended to a certain table using an insert job, but you do not want to receive notifications for other tables. What should you do?

  • A. Make a call to the Stackdriver API to list all logs, and apply an advanced filter.
  • B. In the Stackdriver logging admin interface, enable a log sink export to BigQuery.
  • C. In the Stackdriver logging admin interface, enable a log sink export to Google Cloud Pub/Sub, and subscribe to the topic from your monitoring tool.
  • D. Using the Stackdriver API, create a project sink with advanced log filter to export to Pub/Sub, and subscribe to the topic from your monitoring tool.
Suggested Answer: D
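For reference, the kind of advanced logs filter such a sink would need can be sketched as follows. This is a minimal sketch assuming the legacy BigQuery audit-log schema; the project, dataset, and table names are hypothetical, and the exact field paths should be verified against real log entries.

```python
# Sketch: build an advanced logs filter matching insert (load) jobs that
# append to one specific BigQuery table. Field paths follow the legacy
# BigQuery audit-log schema (assumption); verify against real entries.
def build_bq_insert_filter(project: str, dataset: str, table: str) -> str:
    prefix = ("protoPayload.serviceData.jobCompletedEvent.job"
              ".jobConfiguration.load.destinationTable")
    lines = [
        'resource.type="bigquery_resource"',
        'protoPayload.methodName="jobservice.jobcompleted"',
        f'{prefix}.projectId="{project}"',
        f'{prefix}.datasetId="{dataset}"',
        f'{prefix}.tableId="{table}"',
    ]
    return "\n".join(lines)

print(build_bq_insert_filter("my-project", "my_dataset", "my_table"))
```

The resulting filter string would then be attached to a project sink whose destination is a Pub/Sub topic, which the monitoring tool subscribes to.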

Comments

jvg637
Highly Voted 4 years, 7 months ago
I would choose D. A and B are wrong since they don't notify the monitoring tool of anything. C has no filter on what will be notified, and we want only certain tables.
upvoted 49 times
...
MaxNRG
Highly Voted 2 years, 11 months ago
D, as the key requirement is a notification for a particular table. This can be achieved with an advanced log filter that selects only that table's logs, plus a project sink to Cloud Pub/Sub for notification. Refer to the GCP documentation on Advanced Logs Filters: https://cloud.google.com/logging/docs/view/advanced-queries. A is wrong: an advanced filter helps with filtering, but no notification is sent. B is wrong: it would send all the logs, and BigQuery does not provide notifications. C is wrong: it would send all the logs.
upvoted 15 times
...
suwalsageen12
Most Recent 5 months, 2 weeks ago
D is the correct answer because we need advanced filtering to select the logs for the specific table, and the monitoring tool receives the notification by subscribing to the Pub/Sub topic.
upvoted 1 times
...
axantroff
11 months, 1 week ago
Selected Answer: D
Good point by MaxNRG about reducing the number of logs sent to Pub/Sub.
upvoted 2 times
...
ruben82
12 months ago
Theoretically, Pub/Sub could filter logs to forward the right ones to the correct topic: https://cloud.google.com/pubsub/docs/subscription-message-filter. So C could be accepted, but it is better to filter earlier in the pipeline, which makes D the more efficient choice.
upvoted 1 times
...
rtcpost
1 year ago
Selected Answer: D
D. Using the Stackdriver API, create a project sink with an advanced log filter to export to Pub/Sub, and subscribe to the topic from your monitoring tool.

This approach lets you set up a custom log sink with an advanced filter that targets the specific table, then export the matching log entries to Google Cloud Pub/Sub. Your monitoring tool can subscribe to the Pub/Sub topic, giving you instant notifications when relevant events occur without being inundated with notifications from other tables.

Options A and B do not offer the same customization and specificity in targeting notifications for a particular table. Option C is almost correct, but it doesn't mention an advanced log filter in the sink configuration, which is needed to limit the logs to a specific table. Using the Stackdriver API is often necessary for this kind of fine-grained control over log filtering.
upvoted 2 times
...
suku2
1 year, 1 month ago
Selected Answer: D
D makes sense.
upvoted 1 times
...
GCP_PDE_AG
1 year, 2 months ago
D should be the answer
upvoted 1 times
...
Mathew106
1 year, 3 months ago
Selected Answer: D
A and B mention nothing about notifications and C would push all data. It's D.
upvoted 1 times
...
bha11111
1 year, 7 months ago
Selected Answer: D
D makes sense
upvoted 1 times
...
Jackalski
1 year, 10 months ago
Selected Answer: D
"advanced log filter" is the key word here, all other options push all data ...
upvoted 2 times
...
Jasar
1 year, 11 months ago
Selected Answer: D
D is the best choice
upvoted 1 times
...
alecuba16
2 years, 6 months ago
Selected Answer: D
Using the Stackdriver API, create a project sink with advanced log filter to export to Pub/Sub, and subscribe to the topic from your monitoring tool.
upvoted 4 times
...
devric
2 years, 6 months ago
Selected Answer: D
D. Option B doesn't make sense
upvoted 2 times
...
samdhimal
2 years, 9 months ago
Correct answer: using the Stackdriver API, create a project sink with an advanced log filter to export to Pub/Sub, and subscribe to the topic from your monitoring tool. Option C is close, but it has no filter; we don't want all the tables, only one. So the correct answer is D.

Logging sink: using a Logging sink, you can direct specific log entries to your business logic. In the referenced example, Cloud Audit Logs for Compute Engine (resource type gce_firewall_rule) are used to filter for the logs of interest, and an event type of GCE_OPERATION_DONE is added to the filter to capture only completed log events. You can try out the query in the Logs Viewer.

Pub/Sub topic: in Pub/Sub, you can create a topic to which the log sink directs entries, and use the Pub/Sub message to trigger a Cloud Function.

Reference: https://cloud.google.com/blog/products/management-tools/automate-your-response-to-a-cloud-logging-event
upvoted 3 times
...
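To illustrate the flow above end to end: the sink publishes each matching LogEntry to the Pub/Sub topic as JSON, and the subscriber (the monitoring tool or a Cloud Function) parses it. A minimal sketch, assuming a heavily trimmed legacy audit-log payload with hypothetical project, dataset, and table names:

```python
import json

# Hypothetical sample of the LogEntry JSON a log sink publishes to Pub/Sub
# (heavily trimmed; real entries carry many more fields).
SAMPLE_MESSAGE = json.dumps({
    "protoPayload": {
        "methodName": "jobservice.jobcompleted",
        "serviceData": {
            "jobCompletedEvent": {
                "job": {
                    "jobConfiguration": {
                        "load": {
                            "destinationTable": {
                                "projectId": "my-project",
                                "datasetId": "my_dataset",
                                "tableId": "my_table",
                            }
                        }
                    }
                }
            }
        }
    }
})

def table_from_log_entry(payload: str) -> str:
    """Extract the destination table of a completed load job from a LogEntry."""
    entry = json.loads(payload)
    dest = (entry["protoPayload"]["serviceData"]["jobCompletedEvent"]
            ["job"]["jobConfiguration"]["load"]["destinationTable"])
    return "{projectId}.{datasetId}.{tableId}".format(**dest)

print(table_from_log_entry(SAMPLE_MESSAGE))  # my-project.my_dataset.my_table
```

Because the sink's advanced filter already restricts delivery to the one table, the subscriber mostly just needs to decode the entry and raise its alert; the parsing step here is only a sanity check.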
santoshindia
2 years, 9 months ago
Selected Answer: D
explained by MaxNRG
upvoted 3 times
...
medeis_jar
2 years, 9 months ago
Selected Answer: D
as explained by MaxNRG
upvoted 3 times
...
Community vote distribution: A (35%), C (25%), B (20%), Other
