
Exam Professional Cloud Architect topic 1 question 15 discussion

Actual exam question from Google's Professional Cloud Architect
Question #: 15
Topic #: 1

Your application needs to process credit card transactions. You want the smallest scope of Payment Card Industry (PCI) compliance without compromising the ability to analyze transactional data and trends relating to which payment methods are used.
How should you design your architecture?

  • A. Create a tokenizer service and store only tokenized data
  • B. Create separate projects that only process credit card data
  • C. Create separate subnetworks and isolate the components that process credit card data
  • D. Streamline the audit discovery phase by labeling all of the virtual machines (VMs) that process PCI data
  • E. Enable Logging export to Google BigQuery and use ACLs and views to scope the data shared with the auditor
Suggested Answer: A 🗳️

Comments

AD2AD4
Highly Voted 4 years, 5 months ago
Final decision: go with Option A. I have done a PCI DSS audit for my project, and that's the best-suited case. 100% sure: use tokenised data instead of actual card numbers.
upvoted 46 times
Musk
4 years, 5 months ago
But with A you cannot extract statistics. That is the second requirement.
upvoted 4 times
Arimaverick
3 years, 10 months ago
Analyzing transactions does not require the credit card number, I guess; only the transaction amount or balance is needed. We also do something similar with transactional data using tokenized PII. So the card number can be tokenized, and the answer should be A.
upvoted 5 times
...
Musk
4 years, 3 months ago
Thinking about it more, I think you can, because you are only tokenizing the sensitive data, not the transaction type.
upvoted 2 times
...
RitwickKumar
2 years, 3 months ago
You can, as the generated token for a given credit card would generally be the same (though some approaches produce a different token for the same sensitive input). The only thing you won't know is the actual card number, which is not required for trend analysis. When the trend analysis involves referential integrity, tokenization becomes more challenging, but once data is tokenized correctly you should be able to perform any kind of analysis.
upvoted 2 times
...
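The point about deterministic tokens preserving referential integrity can be illustrated with a small sketch (this is a hypothetical, keyed-HMAC approach for demonstration only; a real deployment would keep the key in a KMS/HSM and follow the tokenization product's own API):

```python
import hmac
import hashlib

# Hypothetical key for illustration only; in practice this would
# live in a KMS/HSM, never in application code.
SECRET_KEY = b"demo-only-key"

def tokenize(pan: str) -> str:
    """Deterministic tokenization: the same PAN always maps to the
    same token, so joins and trend analysis still work, while the
    token cannot be reversed to recover the card number without the key."""
    return hmac.new(SECRET_KEY, pan.encode(), hashlib.sha256).hexdigest()[:16]

# Same card number -> same token (referential integrity preserved)
assert tokenize("4111111111111111") == tokenize("4111111111111111")

# Different card numbers -> different tokens
assert tokenize("4111111111111111") != tokenize("5500000000000004")
```

With deterministic tokens, repeat transactions on the same card still group together in analytics even though the analytics side never sees the PAN.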
...
AzureDP900
2 years, 1 month ago
I agree. A is the best option
upvoted 3 times
...
...
omermahgoub
Highly Voted 1 year, 11 months ago
To minimize the scope of Payment Card Industry (PCI) compliance while still allowing for the analysis of transactional data and trends related to payment methods, you should consider using a tokenizer service and storing only tokenized data, as described in option A. Tokenization is a process of replacing sensitive data, such as credit card numbers, with unique, randomly-generated tokens that cannot be used for fraudulent purposes. By using a tokenizer service and storing only tokenized data, you can reduce the scope of PCI compliance to only the tokenization service, rather than the entire application. This can help minimize the amount of sensitive data that needs to be protected and reduce the overall compliance burden.
upvoted 29 times
oxfordcommaa
1 year, 10 months ago
man, this is an amazing answer. props
upvoted 5 times
ccpmad
5 months, 2 weeks ago
Thanks to ChatGPT. Are you seriously saying that?
upvoted 1 times
...
...
Saxena_Vibhor
10 months, 4 weeks ago
Nicely explained, thanks.
upvoted 1 times
...
...
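The scope-reduction argument above can be sketched concretely (hypothetical record layout, not a Google Cloud API): the analytics store only ever sees opaque tokens, yet trends by payment method remain fully computable.

```python
from collections import defaultdict

# Hypothetical tokenized transaction records, as the analytics
# project would see them: no PAN anywhere, only opaque tokens.
transactions = [
    {"token": "tok_a1", "method": "visa",       "amount": 40.0},
    {"token": "tok_b2", "method": "mastercard", "amount": 15.5},
    {"token": "tok_a1", "method": "visa",       "amount": 9.9},
]

# Trend analysis by payment method works without any cardholder
# data, which is what keeps the analytics side out of PCI scope.
totals = defaultdict(float)
for tx in transactions:
    totals[tx["method"]] += tx["amount"]

print(dict(totals))  # e.g. totals per payment method
```

Only the tokenization service ever touches real card numbers, so it alone carries the PCI DSS burden.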
Ekramy_Elnaggar
Most Recent 1 week, 5 days ago
Selected Answer: A
1. Reduced PCI scope: Tokenization replaces sensitive credit card data with non-sensitive tokens. This significantly reduces the scope of PCI DSS compliance, as you no longer store actual cardholder data; only the tokenization service needs to be PCI compliant.
2. Data analysis: You can still analyze transactional data and trends using the tokenized data. The tokens can be linked back to the original cardholder data if needed for specific analysis or reporting, but this would be done in a controlled and secure environment.
upvoted 1 times
...
hzaoui
10 months, 2 weeks ago
Selected Answer: A
A is correct
upvoted 1 times
...
devinss
1 year, 2 months ago
Not sure why everyone is 100% agreed on A. To limit PCI DSS scope, the data handling should be done in a separate project with very limited access. Only in this project should tokenization be done and made available for analytics. The first requirement, however, is isolating the payment and tokenization code in a separate project. The answer should be B.
upvoted 2 times
...
heretolearnazure
1 year, 3 months ago
Tokenizing is the best way to protect PCI information.
upvoted 1 times
...
nocrush
1 year, 4 months ago
Selected Answer: A
A is my best option
upvoted 1 times
...
KjChen
2 years ago
Selected Answer: A
https://cloud.google.com/architecture/tokenizing-sensitive-cardholder-data-for-pci-dss
upvoted 4 times
...
andreavale
2 years ago
Selected Answer: A
ok for A
upvoted 1 times
...
minmin2020
2 years, 1 month ago
Selected Answer: A
A. Create a tokenizer service and store only tokenized data
upvoted 1 times
...
BiddlyBdoyng
2 years, 1 month ago
B appears the most thorough, but the question asks for the smallest compliance scope, and network segmentation is not a must. Tokenization is simpler. C is similar to B: more than what's required. D and E do not address the problem.
upvoted 1 times
...
holerina
2 years, 2 months ago
Correct answer is A: use tokenization.
upvoted 2 times
...
abirroy
2 years, 2 months ago
Selected Answer: A
Correct answer A
upvoted 1 times
...
Nirca
2 years, 7 months ago
Selected Answer: A
The mandatory stage in PCI is having an encryption/decryption system. Data must not be stored as-is with the PAN. So A is a must. The rest are nice to have.
upvoted 1 times
...
ryzior
2 years, 7 months ago
Selected Answer: A
I think it should be A and C. The paper states clearly that proper network segmentation is still required to separate the vault and token servers from the rest of the flat network.
upvoted 1 times
...
sjmsummer
2 years, 10 months ago
I chose A. But why is C not good?
upvoted 1 times
...
vincy2202
2 years, 11 months ago
A is the correct answer https://cloud.google.com/architecture/tokenizing-sensitive-cardholder-data-for-pci-dss#a_service_for_handling_sensitive_information
upvoted 3 times
...
Community vote distribution
A (35%)
C (25%)
B (20%)
Other
Most Voted