Your company has a Google Cloud project that uses BigQuery for data warehousing on a pay-per-use basis. You want to monitor queries in real time to discover the most costly queries and which users spend the most. What should you do?
A.
1. In the BigQuery dataset that contains all the tables to be queried, add a label for each user that can launch a query. 2. Open the Billing page of the project. 3. Select Reports. 4. Select BigQuery as the product and filter by the user you want to check.
B.
1. Create a Cloud Logging sink to export BigQuery data access logs to BigQuery. 2. Perform a BigQuery query on the generated table to extract the information you need.
C.
1. Create a Cloud Logging sink to export BigQuery data access logs to Cloud Storage. 2. Develop a Dataflow pipeline to compute the cost of queries split by users.
D.
1. Activate billing export into BigQuery. 2. Perform a BigQuery query on the billing table to extract the information you need.
B is the correct answer: https://cloud.google.com/blog/products/data-analytics/taking-a-practical-approach-to-bigquery-cost-monitoring
A is incorrect, as there is no Billing page for a project; it's the billing account that handles all billing for the organization.
"details about the query that was executed, like the SQL code, the job ID and, most important, the user who executed the query and the amount of data that was processed. With that information, you can compute the total cost of the query using a simple multiplication equation: cost per TB processed * number of TB processed" — meaning it is an estimate, not the actual billed amount.
The correct answer is:
D.
1. Activate billing export into BigQuery.
2. Perform a BigQuery query on the billing table to extract the information you need.
✅ Why this is correct:
Billing export to BigQuery allows you to analyze cloud usage and costs in detail.
You can break down costs by service (like BigQuery), project, labels, or even user (if billing data is detailed enough).
This method is real-time (or near real-time) and doesn’t require complex pipelines.
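For illustration, a rough sketch of what that analysis could look like once billing export is enabled. The project, dataset, and table names are placeholders (real export tables follow the gcp_billing_export_v1_<BILLING_ACCOUNT_ID> pattern), and the columns used (service.description, sku.description, cost, usage_start_time) are part of the standard billing export schema:

```python
from google.cloud import bigquery

client = bigquery.Client()

# Break down last-30-day BigQuery spend by SKU from the standard billing export.
sql = """
SELECT
  sku.description AS sku,
  SUM(cost) AS total_cost
FROM `my-project.billing.gcp_billing_export_v1_XXXXXX_XXXXXX_XXXXXX`
WHERE service.description = 'BigQuery'
  AND usage_start_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
GROUP BY sku
ORDER BY total_cost DESC
"""

for row in client.query(sql).result():
    print(row.sku, row.total_cost)
```

This gives total BigQuery spend by SKU; how far it can be attributed to individual users depends on the labels and fields present in the export.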
The correct answer is B. 1. Create a Cloud Logging sink to export BigQuery data access logs to BigQuery. 2. Perform a BigQuery query on the generated table to extract the information you need.
Explanation
BigQuery automatically logs data access events (which include query execution details) to Cloud Logging. By exporting these logs via a logging sink to BigQuery, you can write SQL queries to analyze:
- The cost of each query.
- Which users are executing the most costly queries.
This approach provides near–real-time insights into query performance and cost, aligning with the pay-per-use nature of BigQuery and allowing you to monitor and optimize expenses effectively.
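For a concrete sense of what that analysis looks like, here is a hedged sketch against a log sink export. The project, dataset, and table names are placeholders, the $5-per-TiB rate is an illustrative assumption, and the nested field paths follow the exported BigQuery AuditData schema described in the linked blog post:

```python
from google.cloud import bigquery

client = bigquery.Client()

# Estimate per-user query cost from data access audit logs exported to BigQuery.
sql = """
SELECT
  protopayload_auditlog.authenticationInfo.principalEmail AS user_email,
  COUNT(*) AS query_count,
  ROUND(SUM(protopayload_auditlog.servicedata_v1_bigquery.jobCompletedEvent.job.jobStatistics.totalBilledBytes) / POW(2, 40) * 5.0, 2) AS estimated_cost_usd
FROM `my-project.bq_audit_logs.cloudaudit_googleapis_com_data_access`
WHERE protopayload_auditlog.servicedata_v1_bigquery.jobCompletedEvent.eventName = 'query_job_completed'
GROUP BY user_email
ORDER BY estimated_cost_usd DESC
"""

for row in client.query(sql).result():
    print(row.user_email, row.query_count, row.estimated_cost_usd)
```

Ordering by estimated cost surfaces the most expensive users directly; dropping the GROUP BY and selecting the job ID instead would surface the most expensive individual queries.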
Google recommends exporting Cloud Billing data to BigQuery to control costs in real time: https://cloud.google.com/billing/docs/how-to/export-data-bigquery
Read again... the question relates to BigQuery query costs. The link states:
Use the Detailed usage export to analyze costs at the resource level, and identify specific resources that might be driving up costs. The detailed export includes resource-level information for the following products:
Compute Engine
Google Kubernetes Engine (GKE)
Cloud Run functions
Cloud Run....
BigQuery is not listed.
B is the correct option. Why not A: While the Billing page offers reports with user-level cost breakdowns, it doesn't provide real-time information or detailed query data. Why not D: Billing export can provide cost data in BigQuery, but it doesn't capture details about individual queries or users, making it insufficient for the specific needs of identifying costly queries and high-spending users.
I tend to agree with the GPT4 summary:
In summary,
Option B is more focused on analyzing specific BigQuery usage patterns and costs down to the level of individual queries and users. It's better for real-time analysis of query activities.
Option D, on the other hand, provides a broader overview of all costs associated with the Google Cloud project, which is beneficial for general cost management but less so for in-depth analysis of specific BigQuery queries and user activities.
For the specific need to discover the most costly queries and which users are responsible, Option B is more targeted and appropriate.
B: because of" https://cloud.google.com/blog/products/data-analytics/taking-a-practical-approach-to-bigquery-cost-monitoring
And because https://cloud.google.com/billing/docs/how-to/export-data-bigquery#example-queries. is not mentioning anything about query per user.
B
The answer is D. 1. Activate billing export into BigQuery. 2. Perform a BigQuery query on the billing table to extract the information you need.
Explanation:
A. This option is not correct because adding a label for each user in the BigQuery dataset will not allow you to monitor the cost of queries or find out which users spend the most.
B. This option is not correct because BigQuery data access logs do not include billing information or query costs.
C. This option is not correct because BigQuery data access logs stored in Cloud Storage do not include billing information or query costs, and developing a Dataflow pipeline to compute the cost of queries would be unnecessarily complex.
D. This is the correct option because activating billing export into BigQuery will allow you to query the billing data in real time to discover the most costly queries and which users spend the most.