Exam AWS Certified Machine Learning - Specialty topic 1 question 39 discussion

A Machine Learning Specialist built an image classification deep learning model. However, the Specialist ran into an overfitting problem in which the training and testing accuracies were 99% and 75%, respectively.
How should the Specialist address this issue and what is the reason behind it?

  • A. The learning rate should be increased because the optimization process was trapped at a local minimum.
  • B. The dropout rate at the flatten layer should be increased because the model is not generalized enough.
  • C. The dimensionality of the dense layer next to the flatten layer should be increased because the model is not complex enough.
  • D. The epoch number should be increased because the optimization process was terminated before it reached the global minimum.
Suggested Answer: B
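The mechanics behind answer B can be shown with a small framework-free sketch of inverted dropout (an illustrative NumPy snippet, not code from the question's model; the `rate` value and array shapes are arbitrary). During training, a random fraction of activations is zeroed and the survivors are rescaled so the expected magnitude is unchanged; at inference the layer is a no-op.

```python
import numpy as np

def dropout(x, rate, training, rng):
    """Inverted dropout: zero a fraction `rate` of activations during
    training and rescale the survivors so the expected sum is unchanged.
    At inference time the input passes through untouched."""
    if not training or rate == 0.0:
        return x
    keep_prob = 1.0 - rate
    mask = rng.random(x.shape) < keep_prob  # True = keep this unit
    return x * mask / keep_prob

rng = np.random.default_rng(0)
acts = np.ones((4, 8))  # toy activations, e.g. the output of a flatten layer
train_out = dropout(acts, rate=0.5, training=True, rng=rng)   # some units zeroed, rest scaled to 2.0
eval_out = dropout(acts, rate=0.5, training=False, rng=rng)   # identical to the input
```

Raising `rate` zeroes more units on every training step, which limits how much the network can co-adapt to (memorize) the training set: the regularization effect that closes a 99%/75% train/test gap.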

Comments

DonaldCMLIN
Highly Voted 3 years, 1 month ago
DROPOUT HELPS PREVENT OVERFITTING https://keras.io/layers/core/#dropout THE ANSWER SHOULD BE B.
upvoted 55 times
rsimham
3 years, 1 month ago
agree. it should be B
upvoted 10 times
...
...
syu31svc
Highly Voted 3 years ago
https://kharshit.github.io/blog/2018/05/04/dropout-prevent-overfitting Answer is B 100%
upvoted 5 times
...
fm99
Most Recent 6 months, 2 weeks ago
Selected Answer: B
Increasing the dropout rate will reduce the effective complexity of the model, which in turn reduces overfitting.
upvoted 1 times
...
VR10
8 months, 1 week ago
This is clearly B; I don't get why the answer is marked as D.
upvoted 1 times
...
endeesa
11 months ago
Selected Answer: B
Regularization will push the training and test accuracies closer together. Any of the other options would make the overfitting worse.
upvoted 1 times
...
elvin_ml_qayiran25091992razor
11 months, 3 weeks ago
Selected Answer: B
B is correct; D makes no sense here.
upvoted 1 times
...
loict
1 year, 1 month ago
Selected Answer: B
A. NO - accuracy on the training set is already high
B. YES - increased dropout rate => reduced model complexity => less overfitting
C. NO - we want to reduce model complexity, not increase it
D. NO - the model has already converged
upvoted 2 times
...
DavidRou
1 year, 1 month ago
Selected Answer: B
I don't understand why the highlighted "right" answer is D. Increasing the number of epochs will make the situation even worse than it is; dropout is the right action to take in this case.
upvoted 2 times
...
kaike_reis
1 year, 2 months ago
Selected Answer: B
B is correct
upvoted 1 times
...
nilmans
1 year, 4 months ago
agree, B makes more sense here
upvoted 1 times
...
soonmo
1 year, 4 months ago
Selected Answer: B
Definitely B, because overfitting comes from a model complex enough to memorize the patterns of the training data. D would only fit the training data more closely, worsening the overfitting.
upvoted 1 times
soonmo
1 year, 4 months ago
Correcting my reasoning: D worsens overfitting because it keeps training after the overfitting has already set in. Increasing the epoch count is a remedy for underfitted models.
upvoted 1 times
...
...
earthMover
1 year, 5 months ago
Selected Answer: B
Increasing the epoch count only makes things worse on an overfitted model. You should apply regularization by introducing dropout to generalize the model.
upvoted 1 times
...
user009
1 year, 7 months ago
Option B is the correct answer because increasing the dropout rate at the flatten layer helps prevent overfitting by randomly dropping out units during training, effectively creating a more robust model that can generalize better to new data. Dropout is a regularization technique that helps prevent overfitting by forcing the model to learn redundant representations of the data. By increasing the dropout rate at the flatten layer, the model becomes more generalized, which should help to improve the testing accuracy.
upvoted 1 times
...
AjoseO
1 year, 8 months ago
Selected Answer: B
Overfitting occurs when a model is too complex and memorizes the training data instead of learning the underlying pattern. As a result, the model performs well on the training data but poorly on new, unseen data. Increasing the dropout rate, a regularization technique, can help combat overfitting by randomly dropping out some neurons during training, which prevents the model from relying too heavily on any single feature.
upvoted 1 times
...
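AjoseO's point above - that a fresh random mask is drawn on every training step, so the network cannot rely on any single neuron - can be sketched as follows (an illustrative NumPy snippet; the layer width and `rate` are arbitrary choices, not values from the question):

```python
import numpy as np

rng = np.random.default_rng(42)
rate = 0.5      # dropout rate: fraction of units dropped per step
n_units = 100   # width of the layer after flatten (arbitrary for illustration)

# Each training step draws an independent keep/drop mask. Across many steps
# every unit is sometimes absent, so the network must spread the signal over
# redundant features instead of memorizing with a few co-adapted neurons.
mask_step1 = rng.random(n_units) >= rate
mask_step2 = rng.random(n_units) >= rate
kept_1 = int(mask_step1.sum())  # roughly n_units * (1 - rate) units survive
kept_2 = int(mask_step2.sum())
```

Because the surviving subnetwork differs from step to step, training effectively averages over many thinned networks, which is why a higher dropout rate improves generalization on an overfitted model.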
sqavi
1 year, 8 months ago
Selected Answer: B
The model is overfitting, so I will go with option B; increasing the epoch count would cause more overfitting.
upvoted 2 times
...
desperatestudent
1 year, 9 months ago
Selected Answer: B
The answer should be B.
upvoted 1 times
...
Shailendraa
2 years, 1 month ago
Appeared on the 12-Sep exam.
upvoted 2 times
...
Community vote distribution: A (35%), C (25%), B (20%), Other