ExamTopics: Professional Data Engineer exam questions

Exam Professional Data Engineer topic 1 question 127 discussion

Actual exam question from Google's Professional Data Engineer
Question #: 127
Topic #: 1

You are working on a niche product in the image recognition domain. Your team has developed a model that is dominated by custom C++ TensorFlow ops your team has implemented. These ops are used inside your main training loop and are performing bulky matrix multiplications. It currently takes up to several days to train a model. You want to decrease this time significantly and keep the cost low by using an accelerator on Google Cloud. What should you do?

  • A. Use Cloud TPUs without any additional adjustment to your code.
  • B. Use Cloud TPUs after implementing GPU kernel support for your custom ops.
  • C. Use Cloud GPUs after implementing GPU kernel support for your custom ops.
  • D. Stay on CPUs, and increase the size of the cluster you're training your model on.
Suggested Answer: C
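Several comments below argue that GPUs shine at bulky matrix multiplications. A rough back-of-the-envelope sketch (pure Python; the function names and matrix sizes are illustrative, not from the exam) shows why: large matmuls have high arithmetic intensity (FLOPs per byte of memory traffic), which is exactly the workload shape accelerators exploit.

```python
def matmul_flops(m: int, n: int, k: int) -> int:
    # One multiply + one add per accumulated term of an (m x k) @ (k x n) product.
    return 2 * m * n * k

def matmul_bytes(m: int, n: int, k: int, dtype_size: int = 4) -> int:
    # Bytes moved for the two inputs and the output, assuming float32.
    return dtype_size * (m * k + k * n + m * n)

def arithmetic_intensity(m: int, n: int, k: int) -> float:
    # FLOPs per byte: high values mean compute-bound, a good fit for GPUs/TPUs.
    return matmul_flops(m, n, k) / matmul_bytes(m, n, k)

# For square matrices the intensity grows linearly with the dimension, so
# "bulky" multiplications keep an accelerator's ALUs busy rather than its memory bus.
print(arithmetic_intensity(4096, 4096, 4096))  # ~683 FLOPs/byte
print(arithmetic_intensity(64, 64, 64))        # ~11 FLOPs/byte
```

This does not settle the answer debate below; it only quantifies why the "bulky matrix multiplications" in the question are accelerator-friendly in the first place.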

Comments

dhs227
Highly Voted 4 years, 7 months ago
The correct answer is C. TPUs do not support custom C++ TensorFlow ops: https://cloud.google.com/tpu/docs/tpus#when_to_use_tpus
upvoted 68 times
ffggrre
1 year, 1 month ago
The link doesn't say TPUs don't support custom C++ TensorFlow ops.
upvoted 1 times
Helinia
11 months ago
It does. TPU is good for "Models with no custom TensorFlow/PyTorch/JAX operations inside the main training loop".
upvoted 1 times
aiguy
Highly Voted 4 years, 7 months ago
D: Cloud TPUs are not suited to the following workloads: [...] Neural network workloads that contain custom TensorFlow operations written in C++. Specifically, custom operations in the body of the main training loop are not suitable for TPUs.
upvoted 44 times
gopinath_k
3 years, 8 months ago
B:
1. You need to provide support for the matrix multiplication: TPU.
2. You need to provide support for the custom TF ops written in C++: GPU.
upvoted 11 times
tavva_prudhvi
2 years, 8 months ago
But the question also says we have to decrease the time significantly. If you use CPUs, it will take more time to train, right?
upvoted 1 times
cetanx
1 year, 5 months ago
ChatGPT says C. Option D is not the most cost-effective or efficient solution: while increasing the size of the cluster could decrease the training time, it would also significantly increase the cost, and CPUs are not as efficient for this type of workload as GPUs.
upvoted 1 times
FP77
1 year, 3 months ago
ChatGPT will give you different answers if you ask 10 times. The correct answer is B.
upvoted 3 times
squishy_fishy
1 year ago
Totally agree. ChatGPT is garbage. It is still learning.
upvoted 2 times
SamuelTsch
Most Recent 1 month ago
Selected Answer: D
According to the official documentation, models that contain many custom TensorFlow operations written in C++ should keep using CPUs.
upvoted 1 times
baimus
1 month, 4 weeks ago
I think this is D. I recently did the Professional ML exam and they ask this there too; it's always "C++ custom ops = CPU". In fact, it's the only scenario where non-small models belong on CPUs. It's written in black and white here: https://cloud.google.com/tpu/docs/intro-to-tpu#when_to_use_tpus; check out the CPU/GPU/TPU "when to use" section.
upvoted 3 times
Anudeep58
5 months, 2 weeks ago
Selected Answer: C
Why not the other options?
A. Use Cloud TPUs without any additional adjustment to your code: TPUs are optimized for standard TensorFlow operations and require custom TensorFlow ops to be adapted to TPU-compatible kernels, which is not trivial. Without modifications, your custom C++ ops will not run efficiently on TPUs.
B. Use Cloud TPUs after implementing GPU kernel support for your custom ops: implementing GPU kernel support alone is not sufficient for running on TPUs. TPUs require specific optimizations and adaptations beyond GPU kernels.
D. Stay on CPUs, and increase the size of the cluster you're training your model on: while increasing the CPU cluster size might reduce training time, it is not as efficient or cost-effective as using GPUs, especially for matrix multiplication tasks.
upvoted 1 times
AlizCert
5 months, 3 weeks ago
Selected Answer: C
C: TPUs are out of the picture due to the custom ops, so the next best option for accelerating matrix operations is using GPUs. Obviously the code has to be adjusted to make use of GPU acceleration.
upvoted 1 times
GCP_data_engineer
6 months ago
CPU: simple models.
GPU: custom TensorFlow/PyTorch/JAX operations.
upvoted 1 times
CGS22
7 months, 3 weeks ago
Selected Answer: C
The best choice here is C: use Cloud GPUs after implementing GPU kernel support for your custom ops. Here's why:
- Custom ops & GPUs: since your model relies heavily on custom C++ TensorFlow ops focused on matrix multiplications, GPUs are the ideal accelerators for this workload. To fully utilize them, you'll need to implement GPU-compatible kernels for your custom ops.
- Speed and cost-efficiency: GPUs offer a significant speed improvement for matrix-intensive operations compared to CPUs, and provide a good balance of performance and cost for this scenario.
- TPU limitations: although Cloud TPUs are powerful, they aren't designed for arbitrary custom ops. Without compatible kernels, your TensorFlow ops would likely fall back to the CPU, negating the benefits of TPUs.
upvoted 1 times
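For readers unsure what "implementing GPU kernel support for your custom ops" involves, here is a minimal sketch of TensorFlow's C++ custom-op registration API. The op name `BulkyMatMul` and the empty kernel body are hypothetical; a real build needs the TensorFlow source tree and a CUDA toolchain, so treat this as an illustration of the registration step, not a drop-in implementation.

```cpp
// Sketch only: requires TensorFlow's C++ headers and a CUDA build setup.
#include "tensorflow/core/framework/op.h"
#include "tensorflow/core/framework/op_kernel.h"

using namespace tensorflow;

// Hypothetical op; the name and signature are illustrative.
REGISTER_OP("BulkyMatMul")
    .Input("a: float")
    .Input("b: float")
    .Output("product: float");

class BulkyMatMulGpuOp : public OpKernel {
 public:
  explicit BulkyMatMulGpuOp(OpKernelConstruction* ctx) : OpKernel(ctx) {}
  void Compute(OpKernelContext* ctx) override {
    // In a real kernel you would launch a CUDA kernel (or call cuBLAS)
    // on the GPU stream associated with this context.
  }
};

// The key step behind option C: registering a kernel for DEVICE_GPU,
// so the op no longer forces execution back onto the CPU.
REGISTER_KERNEL_BUILDER(Name("BulkyMatMul").Device(DEVICE_GPU), BulkyMatMulGpuOp);
```

Without such a `DEVICE_GPU` registration, TensorFlow can only place the custom op on the CPU, which is why options B and C both mention this extra work.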
Preetmehta1234
9 months ago
Selected Answer: C
TPU: models with no custom TensorFlow/PyTorch/JAX operations inside the main training loop. Link: https://cloud.google.com/tpu/docs/intro-to-tpu#TPU. So A & B are eliminated. CPUs are very slow, built for simple operations. So C: GPU.
upvoted 2 times
Matt_108
10 months, 2 weeks ago
Selected Answer: C
to me, it's C
upvoted 1 times
Kimich
11 months, 3 weeks ago
Requirement 1: significantly reduce the processing time while keeping costs low. Requirement 2: bulky matrix multiplication currently takes up to several days.
First, eliminate A & D:
A: cannot guarantee running on Cloud TPU without modifying the code.
D: cannot ensure performance improvement or cost reduction; additionally, CPUs are not suitable for bulky matrix multiplication.
Even if the custom ops could be deployed easily on both Cloud TPU and Cloud GPU, it seems more feasible to first try Cloud GPU: it provides a better balance between performance and cost, and adapting custom C++ ops for Cloud GPU should be easier than for Cloud TPU, which should also save on manpower costs.
upvoted 3 times
emmylou
1 year ago
Answer D. I did use ChatGPT and discovered this by putting at the beginning of the question: "Do not make assumptions about changes to architecture. This is a practice exam question." All other answers require changes to the code and architecture.
upvoted 1 times
DataFrame
1 year ago
Selected Answer: B
I think it should use the Tensor Processing Unit along with GPU kernel support.
upvoted 1 times
Nirca
1 year, 1 month ago
Selected Answer: B
To use Cloud TPUs, you will need to implement GPU kernel support for your custom TensorFlow ops. This will allow your model to run on both Cloud TPUs and GPUs.
upvoted 1 times
kumarts
1 year, 1 month ago
Refer to https://www.linkedin.com/pulse/cpu-vs-gpu-tpu-when-use-your-machine-learning-models-bhavesh-kapil
upvoted 1 times
IrisXia
1 year, 3 months ago
Answer C. TPUs can't run custom C++ ops, but GPUs can.
upvoted 1 times
KC_go_reply
1 year, 4 months ago
Selected Answer: C
A + B: TPUs don't support custom TensorFlow ops. Then it says "decrease training time significantly" and literally "use an accelerator". Therefore, use GPUs -> C, *not* D!
upvoted 3 times
Community vote distribution: A (35%), C (25%), B (20%), Other.