Your company built a TensorFlow neural-network model with a large number of neurons and layers. The model fits the training data well. However, when tested against new data, it performs poorly. What method can you employ to address this?
Answer is C.
Poor performance of a model is due either to a lack of relationship between the dependent and independent variables, or to overfitting caused by using too many features and/or bad features.
A: Threading/parallelisation can reduce training time, but if the selected features stay the same, the resulting performance won't change.
B: Serialization only converts data into byte streams. This won't be useful here.
C: Dropout randomly disables a fraction of neurons during training, so the network cannot rely too heavily on any single neuron or feature. This regularizes the model and reduces overfitting (see the sketch after this list).
D: Dimensionality reduction would help if the model did not fit the training data well. But the question says the model fits the training data well, so D is not the answer.
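Here is a minimal sketch of what option C looks like in practice. The layer sizes, the input shape of 100 features, and the 0.5 dropout rate are illustrative assumptions, not details from the question:

```python
import tensorflow as tf

# Over-parameterized dense network with Dropout layers added to reduce overfitting.
# All sizes and the 0.5 rate are assumptions for illustration.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(100,)),
    tf.keras.layers.Dense(512, activation="relu"),
    tf.keras.layers.Dropout(0.5),   # randomly zero 50% of activations each training step
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["accuracy"])
```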
Correct answer is C.
A - It is not Threading, because threading is used to accelerate training and reduce training time.
B - It is not Serialization, because serialization only transforms the training data into bytes; it does not change its original nature.
D - It is not Dimensionality Reduction, because the model already fits the training data.
https://docs.google.com/document/d/1VV6vkkjShXDgPLSG6V_7-0dweLmZTUnYiTSxo6C5ERY/edit?tab=t.0
C. Dropout Methods
Dropout is a regularization technique commonly used in neural networks to prevent overfitting (source: learns-google.blogspot.com). It helps improve the generalization of the model by randomly setting a fraction of the neurons to zero during each training iteration, which prevents the network from relying too heavily on specific neurons. This, in turn, can lead to better performance on new, unseen data.
Dropout is a specific technique to prevent overfitting by randomly disabling a certain percentage of neurons during training. This helps the network avoid relying too heavily on a subset of neurons, thereby improving its ability to generalize to new data.
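A small sketch of that behaviour, assuming the standard tf.keras.layers.Dropout layer (the rate and the input values are just for illustration):

```python
import tensorflow as tf

dropout = tf.keras.layers.Dropout(rate=0.5)
x = tf.ones((1, 10))

# During training: roughly half the entries are set to 0 and the rest are
# scaled up by 1 / (1 - rate), so the expected activation stays the same.
print(dropout(x, training=True))

# At inference: dropout is disabled and the input passes through unchanged.
print(dropout(x, training=False))
```

Because the kept units are already rescaled during training (inverted dropout), nothing extra has to be done at test time.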
Answer is C. Dropout methods are used to mitigate overfitting. They are applied during the training phase and improve test-time performance.
Dropout is a regularization technique commonly used in model training with TensorFlow and other deep learning frameworks. It is employed to prevent overfitting, a phenomenon where a model learns to perform well on the training data but fails to generalize well to new, unseen data.
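A hedged sketch of how you might see that effect yourself, comparing the same small network with and without a Dropout layer on synthetic data. The data, architecture, and epoch count are made up for illustration, and the actual validation numbers will vary from run to run:

```python
import numpy as np
import tensorflow as tf

# Synthetic binary-classification data (illustrative only).
x = np.random.rand(1000, 20).astype("float32")
y = (x.sum(axis=1) > 10).astype("float32")

def build(use_dropout):
    layers = [tf.keras.Input(shape=(20,)),
              tf.keras.layers.Dense(256, activation="relu")]
    if use_dropout:
        layers.append(tf.keras.layers.Dropout(0.5))
    layers.append(tf.keras.layers.Dense(1, activation="sigmoid"))
    model = tf.keras.Sequential(layers)
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model

# Train both variants and compare validation accuracy at the end of training.
for use_dropout in (False, True):
    model = build(use_dropout)
    history = model.fit(x, y, validation_split=0.2, epochs=20, verbose=0)
    print("dropout" if use_dropout else "no dropout",
          "val_accuracy:", history.history["val_accuracy"][-1])
```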