
Professional Cloud DevOps Engineer exam, Topic 1, Question 125: discussion

Actual exam question from Google's Professional Cloud DevOps Engineer exam.
Question #: 125
Topic #: 1

Your team is building a service that performs compute-heavy processing on batches of data. The data is processed faster based on the speed and number of CPUs on the machine. These batches of data vary in size and may arrive at any time from multiple third-party sources. You need to ensure that third parties are able to upload their data securely. You want to minimize costs, while ensuring that the data is processed as quickly as possible. What should you do?

  • A. Provide a secure file transfer protocol (SFTP) server on a Compute Engine instance so that third parties can upload batches of data, and provide appropriate credentials to the server.
    Create a Cloud Function with a google.storage.object.finalize Cloud Storage trigger. Write code so that the function can scale up a Compute Engine autoscaling managed instance group.
    Use an image pre-loaded with the data processing software that terminates the instances when processing completes.
  • B. Provide a Cloud Storage bucket so that third parties can upload batches of data, and provide appropriate Identity and Access Management (IAM) access to the bucket.
    Use a standard Google Kubernetes Engine (GKE) cluster and maintain two services: one that processes the batches of data, and one that monitors Cloud Storage for new batches of data.
    Stop the processing service when there are no batches of data to process.
  • C. Provide a Cloud Storage bucket so that third parties can upload batches of data, and provide appropriate Identity and Access Management (IAM) access to the bucket.
    Create a Cloud Function with a google.storage.object.finalize Cloud Storage trigger. Write code so that the function can scale up a Compute Engine autoscaling managed instance group.
    Use an image pre-loaded with the data processing software that terminates the instances when processing completes.
  • D. Provide a Cloud Storage bucket so that third parties can upload batches of data, and provide appropriate Identity and Access Management (IAM) access to the bucket.
    Use Cloud Monitoring to detect new batches of data in the bucket and trigger a Cloud Function that processes the data.
    Set a Cloud Function to use the largest CPU possible to minimize the runtime of the processing.
Suggested Answer: C
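
For context, a minimal sketch of the glue code option C describes, assuming a Python Cloud Function (2nd gen, functions-framework) with a google.storage.object.finalize trigger and the google-cloud-compute client; the project, zone, MIG name, and size cap are placeholder assumptions, not part of the question:

```python
# Hypothetical sketch of option C's trigger (all names are placeholders).
import functions_framework
from google.cloud import compute_v1

PROJECT = "my-project"      # placeholder: your project ID
ZONE = "us-central1-a"      # placeholder: the MIG's zone
MIG_NAME = "batch-workers"  # placeholder: the autoscaling MIG's name
MAX_SIZE = 10               # placeholder: cap on concurrent workers

@functions_framework.cloud_event
def on_batch_uploaded(cloud_event):
    """Fires once per finalized object; grows the MIG by one, up to the cap."""
    data = cloud_event.data
    print(f"New batch: gs://{data['bucket']}/{data['name']}")

    migs = compute_v1.InstanceGroupManagersClient()
    current = migs.get(
        project=PROJECT, zone=ZONE, instance_group_manager=MIG_NAME
    ).target_size
    if current < MAX_SIZE:
        migs.resize(
            project=PROJECT, zone=ZONE,
            instance_group_manager=MIG_NAME, size=current + 1,
        )
```

Paired with worker images that terminate themselves when their batch finishes (a sketch of that appears in the comments below), the group, and the bill, drops back toward zero between batches.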

Comments

mouthwash
2 months ago
D is best. C involves writing code, and all of that could take time. Speed is key, so lean on ready-made managed cloud services as much as possible.
upvoted 1 time
...
6a8c7ad
3 months, 1 week ago
Selected Answer: D
D over C
upvoted 1 time
...
xhilmi
11 months, 3 weeks ago
Selected Answer: C
The recommended solution is option C: provide a Cloud Storage bucket for third parties to upload batches of data, and use a Cloud Function with a google.storage.object.finalize trigger to scale up a Compute Engine autoscaling managed instance group. This approach ensures secure uploads to the bucket with proper IAM access controls. The function, triggered each time a new object is finalized in the bucket, scales up a managed instance group whose image is pre-loaded with the data processing software, which suits compute-heavy tasks. The instances terminate upon completion, minimizing costs. This design leverages serverless and autoscaling capabilities to process batches that arrive at varying times from multiple sources quickly and cost-effectively.
upvoted 2 times
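
To make the "instances terminate upon completion" step concrete: a rough sketch of worker-side cleanup that could be baked into the image, assuming the GCE metadata server and the google-cloud-compute client; the project and MIG names are placeholder assumptions:

```python
# Hypothetical worker-side cleanup, run on the VM itself after processing.
import requests
from google.cloud import compute_v1

META = "http://metadata.google.internal/computeMetadata/v1/instance"
HDRS = {"Metadata-Flavor": "Google"}

def self_destruct(project: str, mig_name: str) -> None:
    """Delete this VM through its MIG so the group shrinks when work is done."""
    name = requests.get(f"{META}/name", headers=HDRS).text
    # Zone metadata looks like "projects/123/zones/us-central1-a"; keep the tail.
    zone = requests.get(f"{META}/zone", headers=HDRS).text.rsplit("/", 1)[-1]

    req = compute_v1.InstanceGroupManagersDeleteInstancesRequest(
        instances=[f"zones/{zone}/instances/{name}"]
    )
    compute_v1.InstanceGroupManagersClient().delete_instances(
        project=project,
        zone=zone,
        instance_group_manager=mig_name,
        instance_group_managers_delete_instances_request_resource=req,
    )
```

Deleting through the MIG (rather than deleting the VM directly) keeps the group's target size consistent with the instance count.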
...
Andrei_Z
1 year ago
Selected Answer: C
I would say C. GCS is not that expensive, and you can set rules to archive old data (sketch below). Compute Engine is better suited to compute-heavy batch jobs than Cloud Functions.
upvoted 2 times
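
A minimal sketch of the archive rules mentioned above, assuming the google-cloud-storage Python client; the bucket name and the 30-day and 365-day thresholds are illustrative assumptions:

```python
# Hypothetical lifecycle setup for the upload bucket (name is a placeholder).
from google.cloud import storage

bucket = storage.Client().get_bucket("third-party-uploads")
# Move processed batches to Archive storage after 30 days...
bucket.add_lifecycle_set_storage_class_rule("ARCHIVE", age=30)
# ...and delete them outright after a year.
bucket.add_lifecycle_delete_rule(age=365)
bucket.patch()  # persist the updated lifecycle configuration
```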
...
Jason_Cloud_at
1 year, 1 month ago
Selected Answer: C
I would go with C. Using GCS is cost-effective and secure compared to the other options. With D, running a Cloud Function with the largest CPU results in high cost.
upvoted 3 times
...
Community vote distribution: A (35%, most voted), C (25%), B (20%), Other.