
Exam AWS Certified Machine Learning - Specialty topic 1 question 29 discussion

An insurance company is developing a new device for vehicles that uses a camera to observe drivers' behavior and alert them when they appear distracted. The company created approximately 10,000 training images in a controlled environment that a Machine Learning Specialist will use to train and evaluate machine learning models.
During the model evaluation, the Specialist notices that the training error rate diminishes faster as the number of epochs increases and the model is not accurately inferring on the unseen test images.
Which of the following should be used to resolve this issue? (Choose two.)

  • A. Add vanishing gradient to the model.
  • B. Perform data augmentation on the training data.
  • C. Make the neural network architecture complex.
  • D. Use gradient checking in the model.
  • E. Add L2 regularization to the model.
Suggested Answer: BE

Comments

Chosen Answer:
vetal
Highly Voted 3 years, 1 month ago
The model must have been overfitted. Regularization helps to solve the overfitting problem in machine learning (as well as data augmentation). Correct answers should be BE.
upvoted 36 times
rajs
3 years ago
Agreed 100%
upvoted 5 times
...
jasonsunbao
3 years, 1 month ago
agree on BE
upvoted 3 times
...
...
benson2021
Highly Voted 2 years, 12 months ago
Answer: BE. https://www.kdnuggets.com/2019/12/5-techniques-prevent-overfitting-neural-networks.html lists 5 techniques to prevent overfitting: 1. Simplify the model 2. Early stopping 3. Use data augmentation 4. Use regularization 5. Use dropout
upvoted 15 times
...
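Early stopping, one of the techniques benson2021 lists, halts training once validation loss stops improving. A minimal sketch of the patience logic, using made-up per-epoch validation losses (the function name and numbers are illustrative, not from any framework):

```python
# Minimal early-stopping sketch: stop training once validation loss
# fails to improve for `patience` consecutive epochs.
def early_stop_epoch(val_losses, patience=3):
    """Return the 0-based epoch at which training would stop,
    or the last epoch if early stopping never triggers."""
    best = float("inf")
    bad_epochs = 0
    for epoch, loss in enumerate(val_losses):
        if loss < best:
            best = loss
            bad_epochs = 0
        else:
            bad_epochs += 1
            if bad_epochs >= patience:
                return epoch
    return len(val_losses) - 1

# Validation loss improves, then plateaus and rises (overfitting sets in).
losses = [1.0, 0.8, 0.6, 0.55, 0.56, 0.57, 0.58, 0.59]
print(early_stop_epoch(losses, patience=3))  # → 6
```

Frameworks expose the same idea as a callback (e.g., a patience parameter), but the core loop is just this comparison.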
JonSno
Most Recent 2 months, 1 week ago
Selected Answer: BE
The issue described suggests that the model is overfitting to the training data: training error decreases quickly, meaning the model is learning the training set very well, while performance on unseen test data is poor. To resolve overfitting, the Machine Learning Specialist should:
Perform data augmentation (B): artificially expands the training dataset by applying transformations (e.g., rotations, flips, brightness changes, cropping), which helps the model generalize by exposing it to more diverse variations of the same class.
Add L2 regularization (E): also known as weight decay, it penalizes large weights, preventing the model from memorizing the training data, and encourages simpler models, which reduces variance and improves generalization.
upvoted 2 times
...
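The augmentation transforms mentioned above (flips, brightness changes) can be sketched in plain NumPy on a toy image batch. This is an illustrative, framework-free sketch; the function name and probabilities are made up for the example:

```python
import numpy as np

def augment_batch(images, rng):
    """Return randomly flipped / brightness-shifted copies of a batch.

    images: array of shape (N, H, W), pixel values in [0, 1].
    """
    out = images.copy()
    # Random horizontal flip, applied to roughly half the images.
    flip = rng.random(len(out)) < 0.5
    out[flip] = out[flip, :, ::-1]
    # Random per-image brightness shift, clipped back into [0, 1].
    shift = rng.uniform(-0.1, 0.1, size=(len(out), 1, 1))
    return np.clip(out + shift, 0.0, 1.0)

rng = np.random.default_rng(0)
batch = rng.random((4, 8, 8))
aug = augment_batch(batch, rng)
print(aug.shape)  # same shape as the input batch
```

Each pass over the training set can draw fresh augmented copies, so the model effectively never sees the exact same 10,000 images twice.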
delfoxete
8 months, 3 weeks ago
Selected Answer: BE
agreed with vetal
upvoted 1 times
...
loict
1 year, 1 month ago
Selected Answer: BE
A. NO - vanishing gradients are something bad that may happen and prevent convergence; they are a byproduct of training, not something you can "add" explicitly. B. YES - we have an overfitting problem, so more (augmented) training examples will help. C. NO - we already have good accuracy on the training set; a more complex network would overfit even more. D. NO - gradient checking is for finding bugs in a model implementation. E. YES - we have an overfitting problem.
upvoted 2 times
...
John_Pongthorn
2 years, 8 months ago
B. Perform data augmentation on the training data. (It would also help to hold out validation data; the data should be split among train, validation, and test sets.)
upvoted 1 times
...
KM226
2 years, 9 months ago
Selected Answer: BE
Answer B&E looks good
upvoted 2 times
...
engomaradel
2 years, 12 months ago
B & E are the correct answers
upvoted 1 times
...
roytruong
3 years ago
BE is exact
upvoted 3 times
stamarpadar
3 years ago
BE are the correct answers
upvoted 4 times
...
...
VB
3 years ago
Looks like B and D are correct.. For D -> https://www.youtube.com/watch?v=P6EtCVrvYPU
upvoted 3 times
C10ud9
3 years ago
gradient checking doesn't resolve the issue, but adding it will confirm / deny the issue. So, it helps to validate the issue but not resolve. I would say B, E are correct
upvoted 3 times
...
...
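As C10ud9 says, gradient checking (option D) only validates that the backpropagation code is correct by comparing analytic gradients against finite differences; it does nothing about overfitting. A minimal sketch on a one-parameter quadratic loss (the function names are illustrative):

```python
# Numerical gradient check: compare an analytic gradient against a
# central finite-difference estimate. Agreement suggests the gradient
# code is bug-free; it says nothing about generalization.
def loss(w):
    return (w - 3.0) ** 2

def analytic_grad(w):
    return 2.0 * (w - 3.0)

def numeric_grad(f, w, eps=1e-5):
    return (f(w + eps) - f(w - eps)) / (2.0 * eps)

w = 1.5
diff = abs(analytic_grad(w) - numeric_grad(loss, w))
print(diff < 1e-6)  # True: the two gradients agree
```

In a real network the same check is run per parameter; a large discrepancy points to a bug in the backward pass, not to the train/test gap seen in this question.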
VB
3 years ago
L2 regularization tries to reduce the possibility of overfitting by keeping the values of the weights and biases small.
upvoted 3 times
...
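The weight-shrinking effect VB describes can be seen directly in closed-form ridge regression, the linear-model version of L2 regularization. A minimal NumPy sketch with a made-up dataset (the penalty strength 10.0 is arbitrary):

```python
import numpy as np

def fit_linear(X, y, l2=0.0):
    """Closed-form least squares with an optional L2 penalty (ridge)."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + l2 * np.eye(d), X.T @ y)

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 5))
y = X @ np.array([4.0, -3.0, 2.0, 0.0, 1.0]) + rng.normal(scale=0.5, size=20)

w_plain = fit_linear(X, y)           # no penalty
w_ridge = fit_linear(X, y, l2=10.0)  # L2-penalized

# The penalty shrinks the weight vector toward zero.
print(np.linalg.norm(w_ridge) < np.linalg.norm(w_plain))  # True
```

In a neural network the same penalty shows up as weight decay added to the loss; smaller weights mean a smoother function that memorizes the training set less.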
hughhughhugh
3 years ago
why not because of vanishing gradients?
upvoted 1 times
It626
1 year, 9 months ago
Vanishing gradients are a problem when training a NN. Answer A mentions that the solution should be to add that, which is not possible. Correct solution is BE. https://www.kdnuggets.com/2022/02/vanishing-gradient-problem.html
upvoted 1 times
...
...
PRC
3 years ago
This is L2 Regularization....Do you think this is the right answer?
upvoted 1 times
...
WWODIN
3 years, 1 month ago
agree BE
upvoted 3 times
...
