Exam Professional Cloud Security Engineer topic 1 question 83 discussion

Actual exam question from Google's Professional Cloud Security Engineer
Question #: 83
Topic #: 1

As adoption of the Cloud Data Loss Prevention (Cloud DLP) API grows within your company, you need to optimize usage to reduce cost. Cloud DLP target data is stored in Cloud Storage and BigQuery. The location and region are identified as a suffix in the resource name.
Which cost reduction options should you recommend?

  • A. Set appropriate rowsLimit value on BigQuery data hosted outside the US and set appropriate bytesLimitPerFile value on multiregional Cloud Storage buckets.
  • B. Set appropriate rowsLimit value on BigQuery data hosted outside the US, and minimize transformation units on multiregional Cloud Storage buckets.
  • C. Use rowsLimit and bytesLimitPerFile to sample data and use CloudStorageRegexFileSet to limit scans.
  • D. Use FindingLimits and TimespanConfig to sample data and minimize transformation units.
Suggested Answer: C
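Option C corresponds to limits that Cloud DLP inspection jobs expose for Cloud Storage. As a rough illustration only, here is a minimal sketch assuming the google-cloud-dlp Python client; the project, bucket, regexes, and limit values are hypothetical. It samples bytes per file and uses CloudStorageRegexFileSet so that only relevant objects are scanned:

```python
# Sketch only: Cloud Storage inspection job that samples each object
# (bytes_limit_per_file) and restricts the scan to matching files
# (CloudStorageRegexFileSet). Names and limits are hypothetical.
from google.cloud import dlp_v2

client = dlp_v2.DlpServiceClient()
parent = "projects/my-project/locations/global"  # hypothetical project

inspect_job = dlp_v2.InspectJobConfig(
    storage_config=dlp_v2.StorageConfig(
        cloud_storage_options=dlp_v2.CloudStorageOptions(
            file_set=dlp_v2.CloudStorageOptions.FileSet(
                regex_file_set=dlp_v2.CloudStorageRegexFileSet(
                    bucket_name="my-multiregional-bucket",
                    include_regex=[r"exports/.*\.csv"],
                    exclude_regex=[r"exports/archive/.*"],
                )
            ),
            bytes_limit_per_file=1_048_576,  # read at most 1 MiB per file
            files_limit_percent=20,          # and only a fraction of the files
        )
    ),
    inspect_config=dlp_v2.InspectConfig(
        info_types=[dlp_v2.InfoType(name="EMAIL_ADDRESS")],
    ),
)

job = client.create_dlp_job(request={"parent": parent, "inspect_job": inspect_job})
print("Started inspection job:", job.name)
```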

Comments

[Removed]
Highly Voted 3 years, 6 months ago
Ans - C https://cloud.google.com/dlp/docs/inspecting-storage#sampling https://cloud.google.com/dlp/docs/best-practices-costs#limit_scans_of_files_in_to_only_relevant_files
upvoted 14 times
[Removed]
3 years, 6 months ago
https://cloud.google.com/dlp/docs/inspecting-storage#limiting-gcs
upvoted 1 times
passtest100
Highly Voted 3 years, 7 months ago
C is the right one.
upvoted 5 times
Xoxoo
Most Recent 7 months, 1 week ago
Selected Answer: C
To optimize usage of the Cloud Data Loss Prevention (Cloud DLP) API and reduce cost, use sampling and CloudStorageRegexFileSet to limit scans. Sampling caps how much data the DLP API inspects: rowsLimit limits the number of BigQuery rows read, and bytesLimitPerFile limits the bytes scanned per Cloud Storage object. CloudStorageRegexFileSet further restricts a scan to only the relevant files in a bucket. Together these keep inspection costs down for both BigQuery data hosted outside the US and multiregional Cloud Storage buckets.
upvoted 2 times
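For the BigQuery side mentioned in the comment above, a similarly hedged sketch (same assumption of the google-cloud-dlp Python client; project, dataset, table, and region are hypothetical) shows a job that caps the scan with rowsLimit instead of reading the whole table:

```python
# Sketch only: BigQuery inspection job that samples rows_limit rows from a
# table in a non-US region instead of scanning the whole table.
# Project, dataset, table, and region are hypothetical.
from google.cloud import dlp_v2

client = dlp_v2.DlpServiceClient()
parent = "projects/my-project/locations/europe-west1"  # region of the data

inspect_job = dlp_v2.InspectJobConfig(
    storage_config=dlp_v2.StorageConfig(
        big_query_options=dlp_v2.BigQueryOptions(
            table_reference=dlp_v2.BigQueryTable(
                project_id="my-project",
                dataset_id="sales_eu",
                table_id="transactions",
            ),
            rows_limit=10_000,  # inspect at most 10,000 rows
            sample_method=dlp_v2.BigQueryOptions.SampleMethod.RANDOM_START,
        )
    ),
    inspect_config=dlp_v2.InspectConfig(
        info_types=[dlp_v2.InfoType(name="CREDIT_CARD_NUMBER")],
    ),
)

job = client.create_dlp_job(request={"parent": parent, "inspect_job": inspect_job})
print("Started inspection job:", job.name)
```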
AzureDP900
1 year, 5 months ago
C is right
upvoted 4 times
AwesomeGCP
1 year, 6 months ago
Selected Answer: C
C. Use rowsLimit and bytesLimitPerFile to sample data and use CloudStorageRegexFileSet to limit scans.
upvoted 4 times
cloudprincipal
1 year, 10 months ago
Selected Answer: C
https://cloud.google.com/dlp/docs/inspecting-storage#sampling
upvoted 3 times
Community vote distribution: A (35%), C (25%), B (20%), Other