
Exam AWS Certified Machine Learning Engineer - Associate MLA-C01 topic 1 question 18 discussion

An ML engineer has trained a neural network by using stochastic gradient descent (SGD). The neural network performs poorly on the test set. The values for training loss and validation loss remain high and show an oscillating pattern. The values decrease for a few epochs and then increase for a few epochs before repeating the same cycle.
What should the ML engineer do to improve the training process?

  • A. Introduce early stopping.
  • B. Increase the size of the test set.
  • C. Increase the learning rate.
  • D. Decrease the learning rate.
Suggested Answer: D
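The behavior described in the question can be illustrated with a toy gradient-descent loop (a minimal sketch, not from the exam; the `train` helper and the quadratic loss f(w) = w² are illustrative assumptions). With f(w) = w², the gradient is 2w and the update w ← w − lr·2w multiplies w by (1 − 2·lr) each step, so a small learning rate shrinks |w| smoothly, while a large one flips the sign every step and can grow |w| instead of converging:

```python
# Illustrative sketch: gradient descent on f(w) = w^2, gradient f'(w) = 2w.
def train(lr, steps=20, w=1.0):
    """Run `steps` updates and return |w| after each one (distance from the minimum at w = 0)."""
    history = []
    for _ in range(steps):
        w = w - lr * 2 * w       # SGD-style update: w <- w * (1 - 2*lr)
        history.append(abs(w))
    return history

low = train(lr=0.1)        # factor 0.8: |w| shrinks smoothly toward 0
high = train(lr=0.95)      # factor -0.9: sign flips each step, slow decay
diverging = train(lr=1.1)  # factor -1.2: |w| oscillates in sign and grows

print(low[-1], high[-1], diverging[-1])
```

With lr = 1.1 the iterate overshoots the minimum by more than it started with on every step, which is the overshooting the comments below describe; lowering lr restores convergence.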

Comments

ninomfr64
1 month, 3 weeks ago
Selected Answer: D
A. No, early stopping prevents overfitting; it does not fix a non-converging loss. B. No, increasing the test set will not help with an oscillating loss. C. No, increasing the learning rate will make things worse. D. Yes, an oscillating training loss is a sign that training is not converging, which can happen when the learning rate is too high. Reducing the learning rate will help here.
upvoted 4 times
gulf1324
1 month, 4 weeks ago
Selected Answer: D
An oscillating pattern in the training/validation loss shows the optimizer is not converging to a minimum. A lower learning rate will make it converge.
upvoted 1 times
feelgoodfactor
2 months, 2 weeks ago
Selected Answer: D
The oscillating loss during training is a clear sign that the learning rate is too high. Reducing the learning rate will stabilize the optimization process, allowing the model to converge smoothly.
upvoted 2 times
motk123
2 months, 3 weeks ago
Selected Answer: D
The oscillating pattern of training and validation loss indicates that the learning rate is too high. A high learning rate causes the model to overshoot the optimal point in the loss landscape, leading to oscillations instead of convergence. Reducing the learning rate allows the model to make smaller, more precise updates to the weights, improving convergence.
A. Early stopping prevents overfitting by halting training when validation performance stops improving, but it does not address the root cause of the oscillating loss.
B. The size of the test set does not affect the training dynamics or the loss pattern.
C. Increasing the learning rate would worsen the oscillations and prevent the model from converging.
upvoted 4 times
GiorgioGss
3 months ago
Selected Answer: D
oscillating = decrease the learning rate
upvoted 2 times