Your company built a TensorFlow neural-network model with a large number of neurons and layers. The model fits the training data well. However, when tested against new data, it performs poorly. What method can you employ to address this?
Answer is C.
Poor model performance comes either from a lack of relationship between the dependent and independent variables, or from overfitting caused by using too many and/or bad features.
A: Threading parallelisation can reduce training time, but if the selected features are the same, the resulting performance won't change.
B: Serialization is only changing data into byte streams. This won't be useful.
C: Dropout randomly deactivates neurons during training, so the network can't rely too heavily on any single neuron or feature. This regularizes the model and reduces overfitting to the training data.
D: This would become clear if the model did not fit the training data well. But the question says that the model fits the training data well, so D is not the answer.
C. Dropout Methods
Dropout is a regularization technique commonly used in neural networks to prevent overfitting. It helps improve the generalization of the model by randomly setting a fraction of the neurons to zero during each training iteration, which prevents the network from relying too heavily on specific neurons. This, in turn, can lead to better performance on new, unseen data.
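The mechanism described above can be sketched in plain NumPy. This is a minimal illustration of "inverted" dropout under stated assumptions, not TensorFlow's actual implementation; in a real TensorFlow model you would simply add a `tf.keras.layers.Dropout(rate)` layer between dense layers.

```python
import numpy as np

def dropout(x, rate, training=True, rng=None):
    """Inverted dropout: during training, zero each unit with probability
    `rate` and scale survivors by 1/(1-rate) so the expected activation
    is unchanged. At test time, return the activations untouched."""
    if not training or rate == 0.0:
        return x
    rng = rng if rng is not None else np.random.default_rng()
    mask = rng.random(x.shape) >= rate  # True = neuron kept this iteration
    return x * mask / (1.0 - rate)

# Toy activations: all ones, so the effect of the mask is easy to see.
activations = np.ones(10_000)
dropped = dropout(activations, rate=0.5, rng=np.random.default_rng(0))
# About half the units are zeroed, the survivors are doubled,
# so the mean activation stays close to 1.
```

Because a fresh random mask is drawn each training iteration, no neuron can become indispensable, which is what discourages the memorization behavior described in the comment above.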
So, C is the answer.
Answer is C. Dropout methods are used to mitigate overfitting: dropout is applied during the training phase, and the resulting regularization is beneficial for test-time performance.
Dropout is a regularization technique commonly used in model training with TensorFlow and other deep learning frameworks. It is employed to prevent overfitting, a phenomenon where a model learns to perform well on the training data but fails to generalize well to new, unseen data.
C. Dropout Methods
Dropout is a regularization technique that can be used to prevent overfitting of the model to the training data. It works by randomly dropping out a certain percentage of neurons during training, which helps to reduce the complexity of the model and prevent it from memorizing the training data. This can improve the model's ability to generalize to new data and reduce the risk of poor performance when tested against new data.
A. Threading: not a method for addressing overfitting; it is a technique to speed up training by parallelizing computations across multiple threads.
B. Serialization: a technique for saving the model's architecture and trained parameters to a file; useful when you want to reuse the model later, but it doesn't address the overfitting problem.
D. Dimensionality Reduction: a technique for reducing the number of features in the data; helpful when the data contains redundant or irrelevant features, but it doesn't directly address the overfitting problem.
Answer is C.
We are dealing with an overfitting problem in a neural-net model.
Read at least the abstract of the original dropout paper (Srivastava et al., 2014):
https://jmlr.org/papers/volume15/srivastava14a/srivastava14a.pdf
C is the correct answer.
Dropout: a special layer you can add that randomly deactivates a fraction of neurons during each training iteration, acting as a regularizer; at inference time dropout is disabled.
In the question, the model fits the training data but not new data, i.e., overfitting.
A standard remedy for overfitting is dropout.
Community vote distribution: A (35%), C (25%), B (20%), Other