

Exam Professional Data Engineer topic 1 question 1 discussion

Actual exam question from Google's Professional Data Engineer
Question #: 1
Topic #: 1

Your company built a TensorFlow neural-network model with a large number of neurons and layers. The model fits the training data well. However, when tested against new data, it performs poorly. What method can you employ to address this?

  • A. Threading
  • B. Serialization
  • C. Dropout Methods
  • D. Dimensionality Reduction
Suggested Answer: C
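
For a concrete illustration of option C, here is a minimal TensorFlow/Keras sketch of a network regularized with dropout; the input size, layer widths, and dropout rate are illustrative assumptions, not values taken from the exam question.

import tensorflow as tf

# Minimal sketch: a dense network regularized with dropout layers.
# Input size (20), layer widths (128/64) and dropout rate (0.5) are
# illustrative assumptions, not values from the question.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(20,)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dropout(0.5),  # zeroes a random 50% of activations each training step
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
# Dropout is active only during model.fit(); Keras disables it automatically
# in model.evaluate() and model.predict(), so no change is needed at test time.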

Comments

henriksoder24
Highly Voted 2 years, 2 months ago
Answer is C. Poor performance on new data despite a good fit on the training data means overfitting: the model has memorized the training set rather than learning patterns that generalize.
A: Threading/parallelisation can reduce training time, but if the selected features are the same, the resulting performance won't change.
B: Serialization only converts the model or data into byte streams. This won't be useful here.
C: Dropout randomly disables a fraction of neurons during each training step, which regularizes the network and keeps it from relying too heavily on specific neurons, so it directly addresses overfitting.
D: This would matter if the model did not fit the training data well, but the question says the model fits the training data well, so D is not the answer.
upvoted 26 times
...
SamuelTsch
Most Recent 1 month ago
Selected Answer: C
This is an overfitting problem. The general idea is to simplify the model, so a GENERALIZATION-related method (regularization) should be used.
upvoted 1 times
...
rtcpost
2 months ago
Selected Answer: C
C. Dropout Methods. Dropout is a regularization technique commonly used in neural networks to prevent overfitting. It improves the generalization of the model by randomly setting a fraction of the neurons to zero during each training iteration, which prevents the network from relying too heavily on specific neurons (see the sketch after this comment). This, in turn, leads to better performance on new, unseen data.
upvoted 1 times
...
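To make the behaviour described in the comment above concrete, here is a small sketch (with arbitrary input values) showing that a Keras Dropout layer only zeroes activations when called in training mode:

import tensorflow as tf

# Dropout zeroes a random fraction of activations in training mode and is a
# no-op at inference; surviving activations are scaled by 1/(1-rate).
layer = tf.keras.layers.Dropout(rate=0.5)
x = tf.ones((1, 10))

print(layer(x, training=True))   # about half the entries are 0, the rest are 2.0
print(layer(x, training=False))  # unchanged: all ones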
rocky48
2 months ago
Selected Answer: C
A: Threading/parallelisation can reduce training time, but if the selected features are the same, the resulting performance won't change.
B: Serialization only converts data into byte streams. This won't be useful.
C: Dropout randomly disables a fraction of neurons during training, which regularizes the network against overfitting.
D: This would become clear if the model did not fit the training data well, but the question says the model fits the training data well. So, C is the answer.
upvoted 1 times
...
trashbox
6 months, 3 weeks ago
Selected Answer: C
Dropout methods are useful for preventing a TensorFlow model from overfitting.
upvoted 1 times
...
azmiozgen
1 year, 4 months ago
Selected Answer: C
Answer is C. Dropout methods are used to mitigate overfitting. Hence, dropout is applied during the training phase and improves test-time performance.
upvoted 1 times
...
dgteixeira
1 year, 5 months ago
Selected Answer: C
Answer is C
upvoted 1 times
...
AmmarFasih
1 year, 6 months ago
Selected Answer: C
Dropout is a regularization technique commonly used in model training with TensorFlow and other deep learning frameworks. It is employed to prevent overfitting, a phenomenon where a model learns to perform well on the training data but fails to generalize well to new, unseen data.
upvoted 1 times
...
IgnacioBL
1 year, 8 months ago
Selected Answer: C
Answer is C
upvoted 2 times
...
Morock
1 year, 9 months ago
Selected Answer: C
Dropout is a regularization method that randomly disables a fraction of the units in a neural network layer during training. So pick C for this question.
upvoted 1 times
...
enghabeth
1 year, 9 months ago
Selected Answer: C
Because it is a regularization technique for reducing overfitting in neural networks by preventing complex co-adaptations on the training data.
upvoted 1 times
...
samdhimal
1 year, 10 months ago
C. Dropout Methods. Dropout is a regularization technique that can be used to prevent overfitting of the model to the training data. It works by randomly dropping out a certain percentage of neurons during training, which helps reduce the effective complexity of the model and prevents it from memorizing the training data. This improves the model's ability to generalize to new data and reduces the risk of poor performance when tested against new data.
upvoted 4 times
samdhimal
1 year, 10 months ago
A. Threading: not a method to address overfitting; it's a technique to speed up training by parallelizing computations across multiple threads.
B. Serialization: a technique to save the model's architecture and trained parameters to a file; useful when you want to reuse the model later, but it doesn't address the overfitting problem.
D. Dimensionality Reduction: a technique to reduce the number of features in the data; helpful when the data contains redundant or irrelevant features, but it doesn't address the overfitting problem directly.
upvoted 4 times
...
...
korntewin
1 year, 10 months ago
The answer is likely C, but D (dimensionality reduction) can also be used to mitigate overfitting. Not sure which one is the correct answer.
upvoted 2 times
...
Brillianttyagi
1 year, 11 months ago
Selected Answer: C
Correct
upvoted 1 times
...
odacir
1 year, 11 months ago
Answer is C. This is an overfitting problem with a neural-network model. Read at least the abstract of this paper: https://jmlr.org/papers/volume15/srivastava14a/srivastava14a.pdf
upvoted 1 times
...
ovokpus
2 years ago
Selected Answer: C
Correct
upvoted 3 times
...
Atnafu
2 years, 1 month ago
C is the correct answer. Dropout layers are special layers you can add to a network to randomly deactivate neurons during training, which acts as regularization. In the question, the model fits the training data but not new data, i.e., overfitting. The solution for overfitting is dropout.
upvoted 1 times
...