You are designing a Dataflow pipeline for a batch processing job. You want to mitigate multiple zonal failures at job submission time. What should you do?
A.
Submit duplicate pipelines in two different zones by using the --zone flag.
B.
Set the pipeline staging location as a regional Cloud Storage bucket.
C.
Specify a worker region by using the --region flag.
D.
Create an Eventarc trigger to resubmit the job in case of zonal failure when submitting the job.
C. Specify a worker region by using the --region flag.
- Specifying a worker region (instead of a specific zone) lets the Dataflow service distribute worker resources across multiple zones within that region.
- This ensures the job is submitted to a region rather than a single zone, providing higher availability and resilience against zonal failures.
https://cloud.google.com/dataflow/docs/guides/pipeline-workflows#zonal-failures
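As a rough sketch (the project ID, bucket, and region values are placeholders, not from the question), a Beam Python pipeline can be pinned to a worker region rather than a zone by setting the region pipeline option and leaving any zone option unset:

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Placeholder project and bucket names; the key point is setting a REGION,
# not a zone, so Dataflow can place workers in any healthy zone in the region.
options = PipelineOptions(
    runner="DataflowRunner",
    project="my-project",                  # placeholder project ID
    region="us-central1",                  # worker region (no --zone specified)
    temp_location="gs://my-bucket/temp",   # placeholder staging/temp bucket
)

with beam.Pipeline(options=options) as p:
    (
        p
        | "Create" >> beam.Create(["hello", "dataflow"])
        | "Print" >> beam.Map(print)
    )

With only a region specified, the service chooses the zone(s) for workers, which is what gives resilience to a zonal outage at submission time.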