Exam Associate Cloud Engineer topic 1 question 114 discussion

Actual exam question from Google's Associate Cloud Engineer
Question #: 114
Topic #: 1

You are managing several Google Cloud Platform (GCP) projects and need access to all logs for the past 60 days. You want to be able to explore and quickly analyze the log contents. You want to follow Google-recommended practices to obtain the combined logs for all projects. What should you do?

  • A. Navigate to Stackdriver Logging and select resource.labels.project_id="*"
  • B. Create a Stackdriver Logging Export with a Sink destination to a BigQuery dataset. Configure the table expiration to 60 days.
  • C. Create a Stackdriver Logging Export with a Sink destination to Cloud Storage. Create a lifecycle rule to delete objects after 60 days.
  • D. Configure a Cloud Scheduler job to read from Stackdriver and store the logs in BigQuery. Configure the table expiration to 60 days.
Suggested Answer: B 🗳️
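
For reference, a minimal sketch of what answer B describes, using the gcloud and bq CLIs (project, dataset, sink, and organization IDs here are hypothetical). An aggregated sink at the organization level is the Google-recommended way to combine logs from all projects:

    # Create a BigQuery dataset whose tables expire 60 days after creation
    # (5184000 seconds = 60 days * 24 hours * 3600 seconds).
    bq mk --dataset --default_table_expiration 5184000 central-project:all_logs

    # Create an aggregated sink that routes logs from all projects under
    # the organization into that dataset.
    gcloud logging sinks create all-projects-sink \
        bigquery.googleapis.com/projects/central-project/datasets/all_logs \
        --organization=123456789 --include-children

The sink's writer identity, printed when the sink is created, must then be granted the BigQuery Data Editor role on the dataset before log entries start flowing.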

Comments

Verve
Highly Voted 4 years, 1 month ago
It's B.
upvoted 26 times
[Removed]
Highly Voted 3 years, 11 months ago
The question is about viewing logs from the past 60 days. B, C, and D all talk about deleting objects or truncating table data.
upvoted 11 times
[Removed]
3 years, 11 months ago
Answer should be A
upvoted 3 times
[Removed]
3 years, 11 months ago
Also A specifically talks about aggregation
upvoted 4 times
[Removed]
3 years, 11 months ago
Also, by default you have a lot of flexibility when viewing logs in Stackdriver, to filter and query.
upvoted 2 times
xtian2900
3 years, 11 months ago
What about the minimum retention of 30 days? Is that true?
upvoted 3 times
[Removed]
3 years, 11 months ago
You're correct: the minimum is 30 days for Data Access logs (https://cloud.google.com/logging/quotas), so B is the way to go.
upvoted 3 times
ccpmad
Most Recent 3 months, 1 week ago
In 2024 there is no "Stackdriver Logging Export" anymore, but for 2020 it is B.
upvoted 3 times
IshwarChandra
5 months, 1 week ago
resource.labels.project_id="*" is not a correct query because "*" returns 0 records, so option A is not a correct answer.
upvoted 1 times
Cynthia2023
8 months ago
Selected Answer: B
When it comes to log data, you're typically dealing with high-volume time-series data that is partitioned by time (e.g., by day). In such cases, setting a partition expiration is often more appropriate because it ensures that you're continuously retaining a rolling window of log data (for example, the last 60 days' worth) and automatically purging older data, rather than deleting the entire table at once after a certain period.
upvoted 3 times
Cynthia2023
8 months ago
In BigQuery, an expiration time can be applied in two contexts:
Table expiration: set at the table level, it applies to the entire table, which is deleted once the specified expiration time has elapsed since the table's creation.
Partition expiration: for partitioned tables, it applies to individual partitions within the table; each partition's data is deleted once the specified expiration time has elapsed since that partition's creation. This is particularly useful for time-series data, like logs, where you want to keep only recent data and let older data be purged automatically.
upvoted 2 times
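As a sketch of that approach (dataset and table names hypothetical): a sink created with gcloud's --use-partitioned-tables flag writes into a date-partitioned table, and the partition expiration can then be set so a rolling 60-day window is kept:

    # Expire each partition 60 days (5184000 seconds) after it is created,
    # keeping a rolling window instead of deleting the whole table at once.
    bq update --time_partitioning_expiration 5184000 central-project:all_logs.syslog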
Romio2023
9 months ago
I don't get the options.
upvoted 2 times
kelliot
9 months, 1 week ago
Selected Answer: B
I guess it's B
upvoted 2 times
BAofBK
10 months ago
The correct answer is B
upvoted 1 times
scanner2
12 months ago
Selected Answer: B
Provides storage of log entries in BigQuery datasets. You can use big data analysis capabilities on the stored logs. Logging sinks stream logging data into BigQuery in small batches, which lets you query data without running a load job. You can set a default table expiration time at the dataset level, or you can set a table's expiration time when the table is created. A table's expiration time is often referred to as "time to live" or TTL. When a table expires, it is deleted along with all of the data it contains.
https://cloud.google.com/logging/docs/export/configure_export_v2#overview
https://cloud.google.com/bigquery/docs/managing-tables#updating_a_tables_expiration_time
upvoted 2 times
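To illustrate the dataset-level TTL described above (dataset name hypothetical), the default can also be changed on an existing dataset; note it only applies to tables created after the change:

    # Set the default table expiration for new tables in the dataset
    # to 60 days (5184000 seconds).
    bq update --default_table_expiration 5184000 central-project:all_logs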
Captain1212
1 year ago
Selected Answer: B
B is the correct answer; we can use BigQuery to get 60 days of logs and analyze them.
upvoted 1 times
Neha_Pallavi
1 year ago
B. Create a Stackdriver Logging Export with a Sink destination to a BigQuery dataset. Configure the table expiration to 60 days.
upvoted 1 times
Prat25200607
1 year, 5 months ago
Selected Answer: B
https://cloud.google.com/architecture/security-log-analytics
upvoted 1 times
sai_learner
2 years, 1 month ago
All options are wrong; they talk about deletion after 60 days, but the question asks us to analyze logs of the past 60 days.
upvoted 5 times
FeaRoX
1 year, 7 months ago
You are absolutely wrong: the meaning of "past 60 days" is the same as "last 60 days" in that sentence.
upvoted 1 times
AzureDP900
2 years, 2 months ago
B is right for sure
upvoted 1 times
Tirthankar17
2 years, 2 months ago
Selected Answer: B
B is the correct answer.
upvoted 2 times
dttncl
2 years, 10 months ago
I believe B is the answer. All that matters in this scenario is the logs for the past 60 days. We can use BigQuery to analyze the contents, so C is incorrect. We need to configure BigQuery as the sink for the logs export so we can query and analyze log data in the future, therefore D is incorrect.
https://cloud.google.com/logging/docs/audit/best-practices#export-best-practices
Since we only care about the logs within 60 days, we can set the expiration time to 60 days to retain only the logs within that time frame. Once data is more than 60 days old, it won't be included in future analyses.
https://cloud.google.com/bigquery/docs/managing-tables#updating_a_tables_expiration_time
upvoted 6 times
ryzior
2 years, 5 months ago
I think here we have the case described in details: https://cloud.google.com/architecture/exporting-stackdriver-logging-for-security-and-access-analytics
upvoted 1 times
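The linked guide includes sample queries; as a hypothetical minimal example of the "quickly analyze" part (dataset, table, and field names assume a typical log export and are not from the question):

    # Count retained log entries by severity across the exported tables.
    bq query --use_legacy_sql=false \
      'SELECT severity, COUNT(*) AS entries
       FROM `central-project.all_logs.syslog_*`
       GROUP BY severity
       ORDER BY entries DESC'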
ankatsu2010
2 years, 10 months ago
D should be the correct answer. To "quickly analyze", you need to use BigQuery; next, you always need access to the logs "for the past 60 days". This means you have to export logs on a daily basis. You don't want to do this job manually, right?
upvoted 1 times
ankatsu2010
2 years, 10 months ago
My apologies, B is correct... 'Sink' can route logging data to BQ automatically.
upvoted 3 times
Community vote distribution: A (35%), C (25%), B (20%), Other