Exam Professional Cloud Architect topic 11 question 2 discussion

Actual exam question from Google's Professional Cloud Architect
Question #: 2
Topic #: 11

At Dress4Win, an operations engineer wants to create a low-cost solution to remotely archive copies of database backup files.
The database files are compressed tar files stored in their current data center.
How should he proceed?

  • A. Create a cron script using gsutil to copy the files to a Coldline Storage bucket.
  • B. Create a cron script using gsutil to copy the files to a Regional Storage bucket.
  • C. Create a Cloud Storage Transfer Service job to copy the files to a Coldline Storage bucket.
  • D. Create a Cloud Storage Transfer Service job to copy the files to a Regional Storage bucket.
Suggested Answer: C
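
For reference, a minimal sketch of what the cron-plus-gsutil approach in options A/B could look like, assuming a hypothetical backup path /var/backups/db and an illustrative bucket name (neither is from the case study):

    # One-time setup: create a Coldline bucket (name and location are placeholders)
    gsutil mb -c coldline -l us-central1 gs://dress4win-db-archive

    # Cron entry: nightly copy of the compressed tar backups; -m parallelizes uploads
    0 2 * * * gsutil -m cp /var/backups/db/*.tar.gz gs://dress4win-db-archive/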

Comments

Ayzen
Highly Voted 4 years ago
Should be C: https://cloud.google.com/storage-transfer/docs/on-prem-overview Especially since Google's docs explicitly state that custom scripts are unreliable, slow, insecure, and difficult to maintain and troubleshoot.
upvoted 37 times
cetanx
3 years, 9 months ago
I would go with A. Storage Transfer Service has many valuable features, but it comes with dependencies such as a minimum 300-Mbps internet connection and a Docker engine on-prem (the service runs inside a container): https://cloud.google.com/storage-transfer/docs/on-prem-overview#what_requirements_does_have These may not be available at Dress4Win (we have no data on whether D4W satisfies these requirements). Based on the recommendations at https://cloud.google.com/storage-transfer/docs/overview#gsutil a gsutil rsync command in a cron job at regular intervals seems the better option, as it is much easier to implement than setting up Storage Transfer Service (see the sketch after this reply thread).
upvoted 8 times
Jphix
3 years, 3 months ago
We're talking about potentially hundreds of TBs of data based on the case study (at least 65 TB, as that's how much they are using in their NAS storage for backups/logs). I certainly hope they have the minimum 300-Mbps connection and a computer in their data center that they can install Docker on.
upvoted 5 times
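
As a sketch of the rsync-in-cron approach cetanx describes (paths and bucket name are again hypothetical):

    # Nightly incremental sync: only new or changed backup files are uploaded
    0 3 * * * gsutil -m rsync -r /var/backups/db gs://dress4win-db-archive/db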
[Removed]
7 months, 2 weeks ago
https://cloud.google.com/storage-transfer/docs/transfer-options
upvoted 2 times
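
For comparison, a rough sketch of the Storage Transfer Service route argued for above; this assumes the gcloud transfer command group and a hypothetical agent pool and bucket name, with the agent running as a Docker container on an on-prem host, matching the requirements cetanx lists:

    # Install a transfer agent on an on-prem machine (runs in Docker)
    gcloud transfer agents install --pool=my-agent-pool

    # Create a job copying the backup directory to the Coldline bucket
    gcloud transfer jobs create posix:///var/backups/db gs://dress4win-db-archive \
        --source-agent-pool=my-agent-pool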
SamirJ
Highly Voted 3 years, 6 months ago
Answer should be C. As per the latest case study on the Google Cloud website, they have DB storage of 1 PB, of which 600 TB is used, so you get the size of the data. These are the rules of thumb per the GCP documentation:
  • Transferring from another cloud storage provider: use Storage Transfer Service
  • Transferring less than 1 TB from on-premises: use gsutil
  • Transferring more than 1 TB from on-premises: use Transfer Service for on-premises data
https://cloud.google.com/storage-transfer/docs/overview
upvoted 21 times
AdityaGupta
3 years, 6 months ago
I agree with Samir. When there is nothing mentioned about data size, refer to the case study again. The storage appliance section mentions total size and available size, which means we should be using Storage Transfer Service. I will go with option C.
upvoted 2 times
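
A back-of-the-envelope check on SamirJ's figures, assuming all 600 TB had to move over the 300-Mbps minimum link:

    600 TB ≈ 600 × 10^12 B × 8 = 4.8 × 10^15 bits
    4.8 × 10^15 bits ÷ (3 × 10^8 bit/s) ≈ 1.6 × 10^7 s ≈ 185 days of continuous transfer

At that scale, a managed and parallelized transfer matters far more than how easy the script is to write.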
massacare
Most Recent 8 months, 3 weeks ago
Selected Answer: C
Although Dress4Win has already been removed from the PCA case studies list, the answer should be C.
upvoted 2 times
jabrrJ68w02ond1
1 year, 5 months ago
IMPORTANT: Dress4Win is no longer part of the officially listed case studies: https://cloud.google.com/certification/guides/professional-cloud-architect
upvoted 9 times
alexandercamachop
1 year, 7 months ago
Selected Answer: C
Answer is C.
upvoted 1 times
ramzez4815
1 year, 8 months ago
Selected Answer: C
C is the correct answer
upvoted 2 times
Aiffone
2 years, 3 months ago
I'd go with C, Transfer Service. gsutil is best used for transfers within GCS.
upvoted 1 times
burner_1984
2 years, 3 months ago
Storage Transfer Service is to be used when the data is available online, not in a physical data center.
upvoted 1 times
GCPCloudArchitectUser
2 years, 3 months ago
The Dress4Win case is no longer listed as an exam case study: https://cloud.google.com/certification/guides/professional-cloud-architect
upvoted 4 times
ravisar
2 years, 4 months ago
Here are the guidelines from Google:
  • Transfer from Azure/AWS: Storage Transfer Service
  • Between two different buckets: Storage Transfer Service
  • Less than 1 TB from a private data center to Google: gsutil
  • More than 1 TB from a private data center to Google, with enough bandwidth: Transfer Service for on-premises data
  • More than 1 TB from a private data center to Google, without enough bandwidth to meet the project deadline: Transfer Appliance (recommended for data that exceeds 20 TB or would take more than a week to upload)
I assume the DB size will be more than 1 TB (2 million TerramEarth vehicles each generate 200 to 500 megabytes of data per day). Since it is more than 1 TB, based on Google's guidelines I will go with Storage Transfer Service. Answer C.
https://cloud.google.com/storage-transfer/docs/overview
https://cloud.google.com/architecture/migration-to-google-cloud-transferring-your-large-datasets
Case: https://services.google.com/fh/files/blogs/master_case_study_terramearth.pdf
upvoted 1 times
GCPCloudArchitectUser
2 years, 3 months ago
This question is for the Dress4Win case study and you are referring to a different one.
upvoted 1 times
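
As a sanity check on the Transfer Appliance guideline ravisar quotes, the 20 TB and one-week thresholds roughly coincide at a 300-Mbps link:

    20 TB ≈ 1.6 × 10^14 bits ÷ (3 × 10^8 bit/s) ≈ 5.3 × 10^5 s ≈ 6.2 days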
joe2211
2 years, 5 months ago
Selected Answer: C
vote C
upvoted 1 times
Amirso
2 years, 7 months ago
IMO option A is correct. According to the technical requirements: "Support multiple VPN connections between the production data center and cloud environment." A Cloud VPN tunnel can support up to 3 gigabits per second (Gbps). There is no deadline for this use case, and considering the industry I'd say the database size wouldn't be bigger than 1 TB; hence gsutil is suitable here.
upvoted 1 times
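
Amirso's arithmetic checks out under his assumptions (a database of at most 1 TB and a fully utilized 3-Gbps tunnel):

    1 TB ≈ 8 × 10^12 bits ÷ (3 × 10^9 bit/s) ≈ 2,700 s ≈ 45 minutes per backup run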
victory108
2 years, 9 months ago
C. Create a Cloud Storage Transfer Service job to copy the files to a Coldline Storage bucket.
upvoted 2 times
MamthaSJ
2 years, 9 months ago
Answer is C
upvoted 2 times
Pb55
3 years ago
Follow these rules of thumb when deciding whether to use gsutil or Storage Transfer Service:
  • Transferring from another cloud storage provider: use Storage Transfer Service.
  • Transferring less than 1 TB from on-premises: use gsutil.
  • Transferring more than 1 TB from on-premises: use Transfer Service for on-premises data.
  • Transferring less than 1 TB from another Cloud Storage region: use gsutil.
  • Transferring more than 1 TB from another Cloud Storage region: use Storage Transfer Service.
https://cloud.google.com/storage-transfer/docs/overview
upvoted 1 times
jasim21
3 years ago
Answer is C. The current DB disk size is 5 TB and the backup size is 600 TB. If the size is more than 1 TB, Google recommends Transfer Service, regardless of whether the source is another cloud or on-premises. https://cloud.google.com/storage-transfer/docs/overview#gsutil
upvoted 2 times
mrhege
3 years ago
"Fibre channel SAN - MySQL databases - 1 PB total storage; 400 TB available" Definitely a use-case for Storage Transfer Service. (C)
upvoted 1 times
...
Community vote distribution: A (35%), C (25%), B (20%), other