At Dress4Win, an operations engineer wants to create a low-cost solution to remotely archive copies of database backup files. The database files are compressed tar files stored in their current data center. How should he proceed?
A.
Create a cron script using gsutil to copy the files to a Coldline Storage bucket.
B.
Create a cron script using gsutil to copy the files to a Regional Storage bucket.
C.
Create a Cloud Storage Transfer Service Job to copy the files to a Coldline Storage bucket.
D.
Create a Cloud Storage Transfer Service job to copy the files to a Regional Storage bucket.
Should be C: https://cloud.google.com/storage-transfer/docs/on-prem-overview
Especially since the Google docs explicitly state that custom scripts are unreliable, slow, insecure, and difficult to maintain and troubleshoot.
I would go with A
Storage Transfer Service has many valuable features, but it comes with some dependencies, such as:
- a minimum 300-Mbps internet connection
- a Docker engine on-premises (the agent runs inside a container)
https://cloud.google.com/storage-transfer/docs/on-prem-overview#what_requirements_does_have
... and these may not be available at Dress4Win (we have no data on whether D4W satisfies these requirements).
Based on the recommendations here: https://cloud.google.com/storage-transfer/docs/overview#gsutil
The gsutil rsync command seems to be the better option in a cron job run at regular intervals, as it is much easier to implement than setting up Storage Transfer Service (a minimal sketch follows below).
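For illustration, here is a minimal sketch of the option-A approach. The bucket name, local path, and schedule are hypothetical placeholders, not values from the case study:

# One-time setup: create the archive bucket with the Coldline storage class (placeholder name and location).
gsutil mb -c coldline -l us-central1 gs://d4w-db-backup-archive

# Nightly crontab entry (02:00) that mirrors the local backup directory to the bucket;
# -m parallelizes the upload of the many compressed tar files.
0 2 * * * gsutil -m rsync -r /mnt/nas/db-backups gs://d4w-db-backup-archive >> /var/log/backup-sync.log 2>&1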
We're talking about potentially hundreds of TBs of data based on the case study (at least 65 TB, as that's how much they are using in their NAS storage for backups/logs). I certainly hope they have the minimum 300-Mbps connection and a computer in their data center that they can install Docker on...
Answer should be C. As per the latest case study on the Google Cloud website, they have DB storage of 1 PB, of which 600 TB is used. So you get the size of the data.
These are the rules of thumb per the GCP documentation:
- Transferring from another cloud storage provider: use Storage Transfer Service
- Transferring less than 1 TB from on-premises: use gsutil
- Transferring more than 1 TB from on-premises: use Transfer Service for on-premises data
https://cloud.google.com/storage-transfer/docs/overview
I agree with Samir. When nothing is mentioned about data size, refer back to the case study: the storage appliance section lists total and available sizes, which means we should be using Storage Transfer Service. I will go with option C.
IMPORTANT: Dress4Win is no longer part of the officially listed case studies: https://cloud.google.com/certification/guides/professional-cloud-architect
Here are the guidelines from Google:
- Transferring from Azure/AWS: Storage Transfer Service
- Transferring between two different buckets: Storage Transfer Service
- Less than 1 TB from a private data center to Google: gsutil
- More than 1 TB from a private data center to Google, with enough bandwidth: Transfer Service for on-premises data
- More than 1 TB from a private data center to Google, without enough bandwidth to meet the project deadline: Transfer Appliance (recommended for data that exceeds 20 TB or would take more than a week to upload)
I assume the DB size will be more than 1 TB (2 million TerramEarth vehicles each generate 200 to 500 megabytes of data per day).
Since it is more than 1 TB, based on Google's guidelines, I will go with Storage Transfer Service. Answer C.
https://cloud.google.com/storage-transfer/docs/overview
https://cloud.google.com/architecture/migration-to-google-cloud-transferring-your-large-datasets
Case: https://services.google.com/fh/files/blogs/master_case_study_terramearth.pdf
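For comparison, here is a rough sketch of the Storage Transfer Service for on-premises route. The agent pool name, paths, and bucket are placeholders, and the exact gcloud flag syntax may differ between releases:

# Install and start a transfer agent on an on-premises machine that can read the backup share
# (this is the piece that requires Docker on-premises).
gcloud transfer agents install --pool=d4w-agent-pool

# Create a recurring transfer job from the local POSIX path to the archive bucket.
gcloud transfer jobs create \
    posix:///mnt/nas/db-backups gs://d4w-db-backup-archive \
    --source-agent-pool=d4w-agent-pool \
    --schedule-repeats-every=1d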
IMO option A is correct.
According to the technical requirements:
- Support multiple VPN connections between the production data center and cloud environment.
A Cloud VPN tunnel can support up to 3 gigabits per second (Gbps).
There is no deadline for this use case, and considering the industry, I'd say the database backups are unlikely to exceed 1 TB; hence gsutil is suitable here (see the quick transfer-time estimate below).
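As a quick back-of-the-envelope check (assuming roughly 1 TB of compressed backups and a single 3-Gbps tunnel, both assumptions rather than facts stated in the case study):

# Approximate transfer time for 1 TB over a 3-Gbps link, ignoring protocol overhead (so a lower bound).
BYTES=$((1000 * 1000 * 1000 * 1000))   # 1 TB
RATE_BPS=$((3 * 1000 * 1000 * 1000))   # 3 Gbps
echo "~$(( BYTES * 8 / RATE_BPS / 60 )) minutes"   # prints roughly 44 minutes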
Follow these rules of thumb when deciding whether to use gsutil or Storage Transfer Service:
- Transferring from another cloud storage provider: use Storage Transfer Service.
- Transferring less than 1 TB from on-premises: use gsutil.
- Transferring more than 1 TB from on-premises: use Transfer Service for on-premises data.
- Transferring less than 1 TB from another Cloud Storage region: use gsutil.
- Transferring more than 1 TB from another Cloud Storage region: use Storage Transfer Service.
https://cloud.google.com/storage-transfer/docs/overview
Answer is C.
Current DB disk size is 5 TB, and backup storage size is 600 TB.
If the size is more than 1 TB, Google recommends the transfer service, regardless of whether the source is another cloud or on-premises.
https://cloud.google.com/storage-transfer/docs/overview#gsutil