Exam Professional Data Engineer topic 1 question 318 discussion

Actual exam question from Google's Professional Data Engineer
Question #: 318
Topic #: 1

You are using BigQuery with a regional dataset that includes a table with daily sales volumes. This table is updated multiple times per day. You need to protect your sales table in case of regional failures with a recovery point objective (RPO) of less than 24 hours, while keeping costs to a minimum. What should you do?

  • A. Schedule a daily export of the table to a Cloud Storage dual or multi-region bucket.
  • B. Schedule a daily copy of the dataset to a backup region.
  • C. Schedule a daily BigQuery snapshot of the table.
  • D. Modify the ETL job to load the data into both the current and a backup region.
Suggested Answer: A
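
For reference, a minimal sketch of option A using the google-cloud-bigquery Python client; the project, table, and bucket names are assumptions, and the export would be triggered once a day by a scheduler such as Cloud Scheduler or cron:

    from google.cloud import bigquery

    # Assumed names: replace with your project, dataset, table, and dual-region bucket.
    client = bigquery.Client(project="my-project")
    extract_job = client.extract_table(
        "my-project.sales_dataset.daily_sales",
        "gs://my-dual-region-bucket/exports/daily_sales-*.avro",
        job_config=bigquery.ExtractJobConfig(destination_format="AVRO"),
    )
    extract_job.result()  # wait for the export to finish

Because the export lands in a dual- or multi-region bucket, the data survives a single-region outage, and a daily schedule keeps the RPO under 24 hours.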

Comments

joelcaro
3 days, 16 hours ago
Selected Answer: D
Option D: Modify the ETL job to load the data into both the current and a backup region. Evaluation: adjusting the ETL to write to two tables (one in the primary region and one in a backup region) ensures the data is available in both locations in near real time. This guarantees an RPO of less than 24 hours, since intraday updates are reflected in both regions. Although it may increase storage costs by duplicating the data, it is the most effective and direct solution for protecting against regional failures.
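For reference, a minimal sketch of the dual-write approach described above, assuming a Python ETL step and hypothetical project, dataset, and table names:

    from google.cloud import bigquery

    client = bigquery.Client(project="my-project")
    # Rows produced by the ETL step; the names below are hypothetical.
    rows = [{"sale_date": "2024-05-01", "volume": 1250}]

    for table_id in (
        "my-project.sales_us.daily_sales",         # primary regional dataset
        "my-project.sales_eu_backup.daily_sales",  # backup dataset in another region
    ):
        errors = client.insert_rows_json(table_id, rows)
        if errors:
            raise RuntimeError(f"Load into {table_id} failed: {errors}")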
upvoted 1 times
mdell
4 days, 16 hours ago
Selected Answer: B
In most cases it is cheaper to copy a BigQuery dataset to a new region directly than to export it to a Cloud Storage bucket and then load it into a new BigQuery dataset in the desired region: you only pay data transfer costs when copying within BigQuery, while exporting to a bucket incurs additional storage charges for the exported data in Cloud Storage, even if only temporarily. Key points to consider:
  • No extra storage cost for copying: when copying a BigQuery dataset to a new region, you pay only the data transfer cost, not for storing the data in a separate location.
  • Storage cost for exporting: exporting data to a Cloud Storage bucket means you are charged for storing that data in the bucket until you delete it, even if it is only held temporarily for transfer.
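For reference, a minimal sketch of such a scheduled cross-region dataset copy using the BigQuery Data Transfer Service Python client (the "cross_region_copy" data source); the project and dataset names are assumptions, and the destination dataset must already exist in the backup region:

    from google.cloud import bigquery_datatransfer

    transfer_client = bigquery_datatransfer.DataTransferServiceClient()
    # Assumed project and dataset names.
    transfer_config = bigquery_datatransfer.TransferConfig(
        destination_dataset_id="sales_backup",
        display_name="Daily sales dataset copy",
        data_source_id="cross_region_copy",
        params={
            "source_project_id": "my-project",
            "source_dataset_id": "sales_dataset",
        },
        schedule="every 24 hours",
    )
    transfer_config = transfer_client.create_transfer_config(
        parent=transfer_client.common_project_path("my-project"),
        transfer_config=transfer_config,
    )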
upvoted 1 times
HectorLeon2099
2 weeks, 3 days ago
Selected Answer: A
Option A is the most cost-efficient option: https://cloud.google.com/blog/topics/developers-practitioners/backup-disaster-recovery-strategies-bigquery
upvoted 4 times
mdell
4 days, 17 hours ago
Additionally, the question only mentions backing up the sales table, not the entire dataset.
upvoted 1 times
Community vote distribution: A (35%), C (25%), B (20%), Other