Exam Professional Cloud Architect topic 3 question 3 discussion

Actual exam question from Google's Professional Cloud Architect
Question #: 3
Topic #: 3

For this question, refer to the Helicopter Racing League (HRL) case study. The HRL development team releases a new version of their predictive capability application every Tuesday evening at 3 a.m. UTC to a repository. The security team at HRL has developed an in-house penetration test Cloud Function called Airwolf. The security team wants to run Airwolf against the predictive capability application as soon as it is released every Tuesday. You need to set up Airwolf to run at the recurring weekly cadence. What should you do?

  • A. Set up Cloud Tasks and a Cloud Storage bucket that triggers a Cloud Function.
  • B. Set up a Cloud Logging sink and a Cloud Storage bucket that triggers a Cloud Function.
  • C. Configure the deployment job to notify a Pub/Sub queue that triggers a Cloud Function.
  • D. Set up Identity and Access Management (IAM) and Confidential Computing to trigger a Cloud Function.
Suggested Answer: C

Comments

umashankar_a
Highly Voted 3 years, 2 months ago
Answer C seems to be OK. Triggering Pub/Sub to invoke a Cloud Function is the relevant pattern here; Cloud Storage doesn't make any sense. It would have been more straightforward if Cloud Scheduler were mentioned in option C instead of a deployment job, but if you do a bit of research on deployment jobs, it points to cron jobs, which makes perfect sense (see the sketch below). https://cloud.google.com/appengine/docs/flexible/nodejs/scheduling-jobs-with-cron-yaml https://cloud.google.com/scheduler/docs/tut-pub-sub
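For illustration only: if Cloud Scheduler were used for the weekly cadence, a minimal sketch with the Python client could look like the following. The project ID (my-project), region (us-central1), topic (predictive-release), and job name are made up for the example.

    from google.cloud import scheduler_v1

    # Create a Cloud Scheduler job that publishes to a Pub/Sub topic every
    # Tuesday at 03:00 UTC; the topic in turn triggers the Airwolf Cloud Function.
    client = scheduler_v1.CloudSchedulerClient()
    parent = client.common_location_path("my-project", "us-central1")

    job = scheduler_v1.Job(
        name=f"{parent}/jobs/airwolf-weekly",
        schedule="0 3 * * 2",  # every Tuesday at 03:00
        time_zone="Etc/UTC",
        pubsub_target=scheduler_v1.PubsubTarget(
            topic_name="projects/my-project/topics/predictive-release",
            data=b"run-airwolf",
        ),
    )
    client.create_job(parent=parent, job=job)

The question's option C does not mention Cloud Scheduler, though; it relies on the deployment job itself publishing the message.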
upvoted 58 times
elainexs
2 years, 3 months ago
I can't understand why you would push a CI/CD event to Pub/Sub... it's only one event, so why is Pub/Sub needed?
upvoted 6 times
...
stefanop
2 years, 11 months ago
But the question requires a scheduled execution, not one triggered by the deployment job. Shouldn’t A be the correct answer?
upvoted 5 times
Nimbus2021
2 years, 9 months ago
I think not, because the question says "as soon as it is released every Tuesday."
upvoted 3 times
...
Gino17m
4 months, 3 weeks ago
The question requires a recurring execution, not a scheduled one.
upvoted 2 times
...
...
...
MamthaSJ
Highly Voted 3 years, 2 months ago
Answer is A
upvoted 18 times
nandoD
1 year, 5 months ago
Please elaborate.
upvoted 3 times
...
...
dija123
Most Recent 5 months, 2 weeks ago
Selected Answer: C
Totally agree with C
upvoted 1 times
...
mouthwash
9 months, 3 weeks ago
Passed the GCP test today; the answer is C. The key is to use Google's native tools.
upvoted 7 times
...
Jconnor
9 months, 4 weeks ago
How is A even an option? What would you use Cloud Storage for? A good architecture is event-driven, as it is more resilient to failures, timing changes, and errors, and it is easier to debug, log, and scale. That is what Pub/Sub is for.
upvoted 1 times
...
thewalker
10 months ago
Selected Answer: A
A is simple and clean compared to the other options provided.
upvoted 1 times
MikeH20
9 months, 2 weeks ago
Where does a Cloud Storage bucket come into play here? Nothing in the question implies anything about storage. If they needed a place to store the results of the Airwolf job, then sure. But that isn't mentioned anywhere.
upvoted 2 times
...
...
Sarin
11 months, 1 week ago
Answer C seems to be right. There are two requirements here: 1. Run Airwolf every time the application is released on Tuesday. 2. Set Airwolf to run weekly. Since a new version of the predictive capability application is released every Tuesday evening at 3 a.m., the deployment job runs every time it is released, which is a weekly recurrence. So both requirements above are satisfied.
upvoted 2 times
...
sampon279
1 year, 2 months ago
Selected Answer: C
Should be C. It cannot be A: to schedule a Cloud Task you would need to know when the deployment is complete, and deployments are usually unpredictable and don't finish at a scheduled time. With option C, the CI/CD pipeline that deploys the code publishes a message to Pub/Sub to trigger the Cloud Function. Triggering via an HTTP endpoint would be an even better solution if it were an option, but Pub/Sub is still okay (a rough sketch of the publish step is below).
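As a rough illustration of that publish step (the project ID my-project and topic predictive-release are made up), the last stage of the deployment pipeline could run something like:

    from google.cloud import pubsub_v1

    # Publish a "release finished" message; the Pub/Sub trigger on the
    # Airwolf Cloud Function then fires for each release.
    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path("my-project", "predictive-release")
    future = publisher.publish(topic_path, b"predictive-app released", version="weekly")
    print("Published message ID:", future.result())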
upvoted 3 times
...
WinSxS
1 year, 6 months ago
Selected Answer: C
To run Airwolf against the predictive capability application as soon as it is released every Tuesday, you should configure the deployment job to notify a Pub/Sub queue that triggers a Cloud Function.
upvoted 2 times
...
zerg0
1 year, 7 months ago
Selected Answer: A
Cloud Tasks supports scheduling.
upvoted 1 times
...
tdotcat
1 year, 8 months ago
Selected Answer: C
C fits the scenario.
upvoted 1 times
...
main_street
1 year, 9 months ago
Answer A seems correct since Cloud Tasks supports scheduled delivery but Pub/Sub doesn't; see https://cloud.google.com/pubsub/docs/choosing-pubsub-or-cloud-tasks
upvoted 2 times
jlambdan
1 year, 3 months ago
https://cloud.google.com/tasks/docs/comp-tasks-sched - that seems to be scheduling a single task ahead of time, not scheduling at a fixed interval (see the sketch below).
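For comparison, a minimal sketch of what Cloud Tasks scheduling actually offers: a one-off schedule_time per task, not a cron. The project, queue, and function URL below are hypothetical.

    import datetime
    from google.cloud import tasks_v2
    from google.protobuf import timestamp_pb2

    client = tasks_v2.CloudTasksClient()
    parent = client.queue_path("my-project", "us-central1", "airwolf-queue")

    # A task can be deferred to a single future point in time (up to ~30 days),
    # but there is no recurrence; a new task must be created for every run.
    run_at = timestamp_pb2.Timestamp()
    run_at.FromDatetime(datetime.datetime.utcnow() + datetime.timedelta(days=7))

    task = tasks_v2.Task(
        http_request=tasks_v2.HttpRequest(
            http_method=tasks_v2.HttpMethod.POST,
            url="https://us-central1-my-project.cloudfunctions.net/airwolf",
        ),
        schedule_time=run_at,
    )
    client.create_task(parent=parent, task=task)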
upvoted 1 times
...
...
omermahgoub
1 year, 9 months ago
To set up Airwolf to run at a recurring weekly cadence, the correct option would be C: configure the deployment job to notify a Pub/Sub queue that triggers a Cloud Function. You can configure the deployment job to send a notification to a Pub/Sub topic when a new version of the predictive capability application is released, and then set up a Cloud Function that is triggered by messages on that topic and runs the Airwolf penetration test. This way, the Cloud Function is triggered every time a new message is published, which happens every Tuesday evening at 3 a.m. UTC when a new version of the application is released (a minimal sketch of such a function is below).
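A minimal sketch of such a Pub/Sub-triggered function, using the first-generation background function signature; run_airwolf is a hypothetical stand-in for however the security team's Airwolf logic is invoked.

    import base64

    def trigger_airwolf(event, context):
        """Background Cloud Function fired by each Pub/Sub message on the release topic."""
        payload = base64.b64decode(event.get("data", "")).decode("utf-8")
        print(f"Release notification received: {payload}")
        run_airwolf(payload)  # hypothetical helper that launches the Airwolf penetration test

    def run_airwolf(release_info: str) -> None:
        # Placeholder: in practice this would call the in-house Airwolf test logic.
        pass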
upvoted 4 times
omermahgoub
1 year, 9 months ago
Option A, set up Cloud Tasks and a Cloud Storage bucket that triggers a Cloud Function, would not be the correct solution because Cloud Tasks is a service for creating and managing asynchronous tasks that are executed later, but it does not support recurring schedules.
Option B, set up a Cloud Logging sink and a Cloud Storage bucket that triggers a Cloud Function, would not be the correct solution because Cloud Logging is a service for collecting, viewing, and analyzing logs, but it does not support triggering Cloud Functions on a recurring basis.
Option D, set up Identity and Access Management (IAM) and Confidential Computing to trigger a Cloud Function, would not be the correct solution because IAM is a service for managing access to Google Cloud resources and Confidential Computing is a service for running sensitive workloads in hardware-isolated environments, but neither of these can be used to trigger Cloud Functions on a recurring basis.
upvoted 3 times
kat1969
1 year, 8 months ago
This conflicts with your earlier statements? Is this statement intended as a correction?
upvoted 1 times
nandoD
1 year, 5 months ago
The way I see it, the first post explains the correct answer and the second post explains why the other three answers are wrong.
upvoted 1 times
...
...
...
...
thamaster
1 year, 9 months ago
Answer A does not make sense: why set up a Cloud Task and have a Cloud Function check a storage bucket (which is never updated)? If the release is late, the task runs for nothing. Pub/Sub + Cloud Function is best practice.
upvoted 2 times
amelm
1 year, 6 months ago
That's what I thought too. "Why do I need Cloud Storage?"
upvoted 2 times
...
...
omermahgoub
1 year, 9 months ago
The correct answer is A: set up Cloud Tasks and a Cloud Storage bucket that triggers a Cloud Function. Cloud Tasks is a fully managed service that allows you to schedule and execute background jobs in a scalable and reliable way. You can use Cloud Tasks to create a recurring task that runs at a specified interval (e.g., every week). When the task is triggered, it can send a message to a Cloud Storage bucket, which can then trigger a Cloud Function to run the Airwolf penetration test.
Option B: setting up a Cloud Logging sink and a Cloud Storage bucket would not allow you to schedule the task to run at a recurring weekly cadence.
Option C: configuring the deployment job to notify a Pub/Sub queue would not allow you to schedule the task to run at a recurring weekly cadence.
Option D: setting up Identity and Access Management (IAM) and Confidential Computing would not allow you to schedule the task to run at a recurring weekly cadence.
upvoted 1 times
...
surajkrishnamurthy
1 year, 9 months ago
Selected Answer: C
C Is the Correct Answer
upvoted 1 times
...
Jackalski
1 year, 10 months ago
Selected Answer: A
I vote for A. A Cloud Task can trigger a Cloud Function ... with a limit of 30 days - here it is weekly, so far so good. However, I'm not sure why it would need any Cloud Storage .. potentially to store results of work done by the Cloud Function. Answer C has no schedule option. Example: https://cloud.google.com/tasks/docs/tutorial-gcf
upvoted 2 times
...
Community vote distribution: A (35%), C (25%), B (20%), Other