You write a Python script to connect to Google BigQuery from a Google Compute Engine virtual machine. The script is printing errors that it cannot connect to BigQuery. What should you do to fix the script?
A. Install the latest BigQuery API client library for Python
B. Run your script on a new virtual machine with the BigQuery access scope enabled
C. Create a new service account with BigQuery access and execute your script with that user
D. Install the bq component for gcloud with the command gcloud components install bq.
A - If the client library were not installed, the Python script would not run at all. Since the question says the script reports "cannot connect", the client library must already be installed, so it comes down to B or C.
B - https://cloud.google.com/bigquery/docs/authorization: an access scope is how your client application obtains an OAuth access token with the right permissions when it calls services via the REST API. It is possible that the Python script calls the API directly instead of using the client library; if so, an access scope is required. The client library does not require you to configure access scopes (it handles the OAuth flow for you).
C - Using a service account is Google Cloud's best practice.
So prefer C.
Access scopes are the legacy method of specifying permissions for your instance (see https://cloud.google.com/compute/docs/access/service-accounts), so I would go with C.
agree
The access scope is enabled by default.
https://cloud.google.com/bigquery/docs/authorization#authenticate_with_oauth_20
If you use the BigQuery client libraries, you do not need this information, as this is done for you automatically.
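The docs quote above matches how the client library behaves in practice. A minimal sketch, assuming the google-cloud-bigquery package is installed on the VM: no explicit OAuth code is needed, because the client resolves Application Default Credentials, which on Compute Engine is the attached service account.

```python
# Minimal sketch (assumes: pip install google-cloud-bigquery).
# No explicit OAuth flow here: the client resolves Application Default
# Credentials, which on a Compute Engine VM is the attached service account.
from google.cloud import bigquery

client = bigquery.Client()
for row in client.query("SELECT 1 AS ok").result():
    # Errors at this point indicate credentials/permissions issues,
    # not a missing library.
    print(row.ok)
```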
Sorry, B is OK. You can create a service account, add a user to it, and grant that user the Service Account User role, but you still need to enable the BigQuery scope for the Python script running on the instance to access BigQuery.
Stop confusing people, B doesn't make any sense. Why would you create a whole new VM just because of a permission issue? If anything, just stop the instance, edit the scope of the default Compute Engine service account, and grant it the role through IAM. C is the most appropriate answer, since you can only set scopes on the default Compute Engine service account; if you use any other service account, there is no scope option and its access is dictated strictly by IAM. So C is the answer: stop the VM, change to a service account with the appropriate permissions, and done. With B you would still need to grant the permission through IAM & Admin; the scope alone isn't enough with the default Compute Engine service account.
cloudguy1, relax. tartar is the hero of this forum for Google Cloud, and if you read his answer, he explains granting the Service Account User role on this one, as that is the best practice.
"Configure the Python client library to use a service account with the relevant BigQuery access enabled" is the right answer.
It is likely that the service account this script runs under does not have the permissions to connect to BigQuery, which would cause these errors. You can prevent this by using a service account that has the necessary roles to access BigQuery.
Ref: https://cloud.google.com/bigquery/docs/reference/libraries#cloud-console
A service account is a special kind of account used by an application or a virtual machine (VM) instance, not a person.
Ref: https://cloud.google.com/iam/docs/service-accounts
Create a new service account with BigQuery access and execute your script with that user: If you want to run the script on an existing virtual machine, you can create a new service account with the necessary permissions to access BigQuery and then execute the script using that service account. This will allow the script to connect to BigQuery and access the data it needs.
A and C are both plausible, but we can eliminate A because the question says "cannot connect", which means the script runs, which means the client library is already installed. So the final answer is C.
"C" was chosen because in order to access BigQuery, the script needs to authenticate and be authorized. The recommended way to do this for applications running on Compute Engine is to use a service account. Create a service account with the appropriate permissions (e.g., "BigQuery Data Editor") to access your BigQuery data. When running the script, make sure it uses the service account credentials to authenticate. This can be done by setting the GOOGLE_APPLICATION_CREDENTIALS environment variable to the path of the service account key file.
Tricky question.
However, as you can read in the gcloud compute instances create documentation:
--scopes=[SCOPE,…]
If not provided, the instance will be assigned the default scopes, described below. However, if neither --scopes nor --no-scopes are specified and the project has no default service account, then the instance will be created with no scopes. Note that the level of access that a service account has is determined by a combination of access scopes and IAM roles so you must configure both access scopes and IAM roles for the service account to work properly.
So B is probably the right one. As for the "new VM", I guess this is because you don't want to stop the current one before having the working one ready...
You don't need to create a new VM to have different access scopes:
https://cloud.google.com/compute/docs/access/service-accounts#accesscopesiam
This weakens answer B.
When a user-managed service account is attached to the instance, the access scope defaults to cloud-platform:
https://cloud.google.com/compute/docs/access/service-accounts#scopes_best_practice
See Step 6 in: https://cloud.google.com/compute/docs/instances/change-service-account#changeserviceaccountandscopes
These facts leave C as the valid answer.
C. Create a new service account with BigQuery access and execute your script with that user.
Service accounts are used for server-to-server interactions, such as those between a virtual machine and BigQuery. You would need to create a service account that has the necessary permissions to access BigQuery, then download the service account key in JSON format. Once you have the key, you can set an environment variable (GOOGLE_APPLICATION_CREDENTIALS) to the path of the JSON key file before running your script, which will authenticate your requests to BigQuery.
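An alternative sketch of the same idea, loading the key explicitly instead of relying on the environment variable; the key path is a placeholder, and the project ID is taken from the key file itself.

```python
# Sketch: pass explicit service-account credentials to the BigQuery client.
from google.cloud import bigquery
from google.oauth2 import service_account

credentials = service_account.Credentials.from_service_account_file(
    "/path/to/bq-service-account.json"  # placeholder path
)
client = bigquery.Client(credentials=credentials, project=credentials.project_id)
print(list(client.query("SELECT 1 AS ok").result()))
```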
The answer is C
https://cloud.google.com/bigquery/docs/authentication
For most services, you must attach the service account when you create the resource that will run your code; you cannot add or replace the service account later. Compute Engine is an exception—it lets you attach a service account to a VM instance at any time.
Connecting to BigQuery from a script requires proper authorization. Service accounts provide a secure way to grant access without sharing user credentials.
B is silly because there's no need to create a new VM just to change the access scope. You can edit the existing VM's access scope, although you do have to stop it first.
Closest is C. https://cloud.google.com/compute/docs/access/create-enable-service-accounts-for-instances#gcloud
The confusing part is that the option should never use the word "user" to refer to a service account.