Exam Professional Cloud DevOps Engineer topic 1 question 87 discussion

Actual exam question from Google's Professional Cloud DevOps Engineer
Question #: 87
Topic #: 1

Your company has a Google Cloud resource hierarchy with folders for production, test, and development. Your cyber security team needs to review your company's Google Cloud security posture to accelerate security issue identification and resolution. You need to centralize the logs generated by Google Cloud services from all projects only inside your production folder to allow for alerting and near-real time analysis. What should you do?

  • A. Enable the Workflows API and route all the logs to Cloud Logging.
  • B. Create a central Cloud Monitoring workspace and attach all related projects.
  • C. Create an aggregated log sink associated with the production folder that uses a Pub/Sub topic as the destination.
  • D. Create an aggregated log sink associated with the production folder that uses a Cloud Logging bucket as the destination.
Suggested Answer: C
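
For reference, both C and D rely on the same mechanism: an aggregated log sink created on the production folder with include_children enabled, so every project under the folder is covered. Below is a rough sketch using the google-cloud-logging Python client, matching the suggested answer's Pub/Sub destination; the folder ID, project, and topic names are placeholders, not values from the question.

```python
# Hypothetical sketch: an aggregated sink on the production folder that routes
# logs from all child projects to a Pub/Sub topic (answer C).
# All IDs below are placeholders.
from google.cloud.logging_v2.services.config_service_v2 import ConfigServiceV2Client
from google.cloud.logging_v2.types import LogSink

client = ConfigServiceV2Client()

sink = LogSink(
    name="prod-central-sink",
    # Pub/Sub destinations use this URI format.
    destination="pubsub.googleapis.com/projects/central-logging-project/topics/prod-logs",
    # No filter, so logs from all Google Cloud services are captured;
    # include_children makes this an aggregated sink that covers every
    # project inside the production folder.
    include_children=True,
)

created = client.create_sink(
    parent="folders/123456789012",  # the production folder
    sink=sink,
)
print("Created sink:", created.name, "writer identity:", created.writer_identity)
```

The sink's writer identity still needs Pub/Sub Publisher permission on the topic before logs start flowing.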

Comments

ReachTango73
Highly Voted 1 year, 1 month ago
D is correct: when you use Cloud Logging buckets you can do log analysis.
upvoted 11 times
mshafa
1 year ago
Does it address near-real-time analysis?
upvoted 1 times
YushiSato
1 year ago
There is a delay with a Cloud Storage bucket, but no delay appears to occur with a Cloud Logging bucket. https://cloud.google.com/logging/docs/export/configure_export_v2: "New log sinks to Cloud Storage buckets might take several hours to start routing logs. Sinks to Cloud Storage are processed hourly while other destination types are processed in real time."
upvoted 1 times
mohan999
Most Recent 1 week, 6 days ago
Many seem to be getting confused between a Cloud Logging bucket and a Cloud Storage bucket. They are different things. Cloud Storage is never an option for log analysis, but with a custom Cloud Logging bucket you can analyze the logs; in fact, all logs you see in the Logs Explorer are stored in the default and required logging buckets. As for Pub/Sub, it is a messaging service and can be a good option for near-real-time analysis when you are sending the logs to a third-party tool such as a SIEM or Splunk. You can't analyze logs just by retrieving them from a Pub/Sub topic; unless there is another destination, Pub/Sub is useless. The question does not mention that any third-party tool is being used, so D is the correct option.
upvoted 1 times
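
To make the distinction concrete, here is a rough sketch of what answer D would look like under the same placeholder assumptions as the earlier sketch: a user-defined Cloud Logging bucket (not a Cloud Storage bucket) in a central project, used as the destination of the folder-level aggregated sink.

```python
# Hypothetical sketch of answer D: the same aggregated folder-level sink, but
# pointed at a user-defined Cloud Logging bucket in a central logging project.
# All IDs below are placeholders.
from google.cloud.logging_v2.services.config_service_v2 import ConfigServiceV2Client
from google.cloud.logging_v2.types import LogBucket, LogSink

client = ConfigServiceV2Client()

# A centralized log bucket in a dedicated logging project.
client.create_bucket(
    request={
        "parent": "projects/central-logging-project/locations/global",
        "bucket_id": "prod-central-bucket",
        "bucket": LogBucket(retention_days=90),
    }
)

client.create_sink(
    parent="folders/123456789012",  # the production folder
    sink=LogSink(
        name="prod-central-sink",
        # Cloud Logging bucket destinations use this URI format.
        destination=(
            "logging.googleapis.com/projects/central-logging-project"
            "/locations/global/buckets/prod-central-bucket"
        ),
        include_children=True,  # aggregate logs from every project in the folder
    ),
)
```

As with the Pub/Sub variant, the sink's writer identity needs write access on the destination bucket (typically roles/logging.bucketWriter) before logs flow.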
ccpmad
2 months, 1 week ago
Selected Answer: D
It is D. Why should we use Pub/Sub for logs? Are we crazy?
upvoted 3 times
surfer111
3 months, 1 week ago
Customers would be very annoyed if they had to use an additional technology for something as simple as logs and analysis. Configuring Pub/Sub is foreign to a lot of orgs that are used to tech like Kafka. All you need is a sink to a Cloud Logging bucket. D
upvoted 1 times
alpha_canary
9 months ago
Selected Answer: D
https://cloud.google.com/logging/docs/export/using_exported_logs#:~:text=Logs%20that%20you%20route%20to%20Cloud%20Logging%20buckets%20are%20available%20immediately.
upvoted 1 times
fixeres
9 months, 1 week ago
Selected Answer: D
So, after some research, the correct answer is D. There has been a lot of discussion about whether near-real-time analysis is possible. In fact, both C and D support near-real-time analysis. https://cloud.google.com/logging/docs/export/using_exported_logs: "Logs that you route to Cloud Logging buckets are available immediately." https://cloud.google.com/logging/docs/export/pubsub: "Routed logs are generally available within seconds of their arrival to Logging, with 99% of logs available in less than 60 seconds." The main difference lies somewhere else. The question states that the logs come from different projects, and for this use case Cloud Logging is the preferred option: https://cloud.google.com/logging/docs/export/configure_export_v2 "A log bucket can store logs that are received by multiple Google Cloud projects."
upvoted 1 times
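
To illustrate the "available immediately" point: entries that land in a centralized Cloud Logging bucket can be queried right away, for example through a bucket view. A rough sketch, reusing the placeholder names from the earlier sketches ("_AllLogs" is the bucket's default view):

```python
# Hypothetical sketch: query recent high-severity entries from the central
# Cloud Logging bucket for quick security triage. Names are placeholders.
from datetime import datetime, timedelta, timezone
import google.cloud.logging

client = google.cloud.logging.Client(project="central-logging-project")

cutoff = (datetime.now(timezone.utc) - timedelta(minutes=5)).isoformat()
entries = client.list_entries(
    resource_names=[
        "projects/central-logging-project/locations/global"
        "/buckets/prod-central-bucket/views/_AllLogs"
    ],
    filter_=f'severity>=ERROR AND timestamp>="{cutoff}"',
)
for entry in entries:
    print(entry.timestamp, entry.log_name, entry.payload)
```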
alpha_canary
9 months, 2 weeks ago
Selected Answer: C
It clearly mentions here that "Sinks to Cloud Storage are processed hourly while other destination types are processed in real time." https://cloud.google.com/logging/docs/export/configure_export_v2#:~:text=New%20log%20sinks%20to%20Cloud%20Storage%20buckets%20might%20take%20several%20hours%20to%20start%20routing%20logs.%20Sinks%20to%20Cloud%20Storage%20are%20processed%20hourly%20while%20other%20destination%20types%20are%20processed%20in%20real%20time. D is eliminated
upvoted 3 times
fixeres
9 months, 1 week ago
This is correct, but you are mixing two things up. A Cloud Logging bucket is not the same as a Cloud Storage bucket. https://cloud.google.com/logging/docs/export/configure_export_v2 A Cloud Logging bucket is processed in real time and is the preferred option here.
upvoted 1 times
Feliphus
11 months ago
Selected Answer: D
I would choose C to export the logs to a SIEM product, but since we can use a Cloud Logging bucket as a central repository, I prefer that, and the statement doesn't say anything about exporting the logs elsewhere.
upvoted 2 times
Nkay17
11 months, 2 weeks ago
Answer is C. Cloud Logging includes the capability for log archival in Google Cloud Storage and the ability to send logs to Google BigQuery. In addition, Cloud Logging also allows you to forward these logs to any custom endpoint including third party log management services for advanced and tailored log analytics via the near real-time streaming Google Cloud Pub/Sub API.
upvoted 1 times
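
If the Pub/Sub route is taken, something still has to consume the topic: each routed message carries one JSON-serialized LogEntry. A rough sketch of a small subscriber, assuming a hypothetical subscription on the topic used in the earlier sketches (a SIEM connector or Dataflow job would play this role in practice):

```python
# Hypothetical sketch: consume log entries routed to a Pub/Sub topic for
# near-real-time inspection or forwarding. Names are placeholders.
import json
from concurrent import futures

from google.cloud import pubsub_v1

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path(
    "central-logging-project", "prod-logs-sub"
)

def callback(message):
    entry = json.loads(message.data.decode("utf-8"))  # one LogEntry per message
    if entry.get("severity") in ("ERROR", "CRITICAL", "ALERT", "EMERGENCY"):
        print("security-relevant entry:", entry.get("logName"), entry.get("timestamp"))
    message.ack()

streaming_pull_future = subscriber.subscribe(subscription_path, callback=callback)
try:
    streaming_pull_future.result(timeout=60)  # run for a minute in this sketch
except futures.TimeoutError:
    streaming_pull_future.cancel()
```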
filipemotta
11 months, 3 weeks ago
Selected Answer: D
By creating an aggregated log sink at the folder level for production, you can collect logs from all projects within that folder. Using a Cloud Logging bucket as the destination simplifies management and enables straightforward integration with Cloud Monitoring and alerting tools for security analysis.
upvoted 1 times
Andrei_Z
1 year ago
Selected Answer: C
I would vote C because, from a security perspective, it would be better to stream the logs to a SIEM or SOAR for near-real-time analysis and alerting. A SIEM is not really mentioned here, but streaming the logs to a bucket and analysing them from Stackdriver would be nuts.
upvoted 4 times
YushiSato
1 year ago
Selected Answer: D
I think D is correct
upvoted 1 times
mshafa
1 year ago
Selected Answer: C
C seems to be correct.
upvoted 4 times
activist
1 year, 1 month ago
Answer C seems to be correct. https://cloudplatform.googleblog.com/2015/06/Real-Time-Log-Streaming-and-Analysis-with-Google-Cloud-Platform-Logentries.html
upvoted 4 times
lelele2023
1 year, 1 month ago
C is the answer: a sink is the native GCP feature to route logs, which excludes A and B. We are also asked to achieve near-real-time analysis, and Pub/Sub works better than a bucket for that.
upvoted 4 times
Community vote distribution: A (35%), C (25%), B (20%), Other