
Exam AWS Certified Machine Learning - Specialty topic 1 question 88 discussion

A web-based company wants to improve the conversion rate on its landing page. Using a large historical dataset of customer visits, the company has repeatedly trained a multi-class deep learning network on Amazon SageMaker. However, the model is overfitting: it achieves 90% accuracy on the training data but only 70% on the test data.
The company needs to boost the generalization of its model before deploying it into production to maximize conversions of visits to purchases.
Which action is recommended to provide the HIGHEST accuracy model for the company's test and validation data?

  • A. Increase the randomization of training data in the mini-batches used in training
  • B. Allocate a higher proportion of the overall data to the training dataset
  • C. Apply L1 or L2 regularization and dropouts to the training
  • D. Reduce the number of layers and units (or neurons) from the deep learning network
Suggested Answer: C
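The dropout part of option C can be sketched concretely. Below is a minimal NumPy illustration of inverted dropout, the variant used in modern deep learning frameworks: units are randomly zeroed during training and the survivors are rescaled so the expected activation is unchanged at inference time. The function name, drop probability, and toy sizes are made up for this example; they are not part of the question or any SageMaker API.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout_forward(x, p_drop=0.5, training=True):
    """Inverted dropout: randomly zero units during training and
    rescale the survivors so the expected activation is unchanged."""
    if not training:
        return x  # dropout is disabled at inference time
    mask = rng.random(x.shape) >= p_drop
    return x * mask / (1.0 - p_drop)

# A layer of 10,000 activations, all equal to 1.0.
acts = np.ones(10_000)
dropped = dropout_forward(acts, p_drop=0.5)

print((dropped == 0).mean())  # roughly half the units are zeroed
print(dropped.mean())         # close to 1.0 thanks to the rescaling
```

Because each training pass sees a different random subnetwork, no single neuron can be relied on to memorize the training data, which is why dropout helps close a 90%-train / 70%-test gap like the one in the question.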

Comments

knightknt
Highly Voted 2 years ago
I think C will be the answer. We don't even know how many layers the network has, so applying L1/L2 regularization and dropout layers is the first resort for solving overfitting. If that still doesn't work, then reduce the layers.
upvoted 11 times
...
mamun4105
Most Recent 7 months, 2 weeks ago
D: D is the correct answer. C could be the answer only if it is a regression problem. You cannot apply L1 (Lasso regression) and L2 (Ridge regression) to classification problems. However, you can use dropout here.
upvoted 1 times
DimLam
5 months, 3 weeks ago
Why do you think it works only for regression problems? L1/L2 regularizations are just adding penalties to loss functions. I don't see any problems with applying it to DL model
upvoted 1 times
...
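DimLam's point above, that L1/L2 are just penalty terms added to the loss and therefore work for classification as well as regression, can be shown in a few lines. This is an illustrative NumPy sketch with made-up toy values; the function names and weights are not from the question.

```python
import numpy as np

def cross_entropy(probs, y):
    """Plain multi-class cross-entropy for integer labels y."""
    return -np.mean(np.log(probs[np.arange(len(y)), y]))

def penalized_loss(probs, y, weights, l1=0.0, l2=0.0):
    """Cross-entropy plus L1/L2 penalties on the weights.
    The penalty terms don't care whether the base loss is a
    regression or a classification objective."""
    penalty = l1 * np.abs(weights).sum() + l2 * (weights ** 2).sum()
    return cross_entropy(probs, y) + penalty

# Toy 3-class predictions for 2 samples, plus a small weight matrix.
probs = np.array([[0.7, 0.2, 0.1],
                  [0.1, 0.8, 0.1]])
y = np.array([0, 1])
weights = np.array([[1.0, -2.0],
                    [0.5,  0.0]])

base = cross_entropy(probs, y)
reg = penalized_loss(probs, y, weights, l1=0.01, l2=0.01)
print(base, reg)  # the regularized loss is strictly larger
```

Minimizing the penalized loss pushes the weights toward zero (L1 can zero some out entirely), which shrinks the model's effective capacity and reduces overfitting regardless of the task type.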
...
Mickey321
8 months ago
Selected Answer: C
C. Regularization.
upvoted 2 times
...
kaike_reis
8 months, 3 weeks ago
Selected Answer: C
If you see overfitting, think regularization.
upvoted 1 times
...
Khalil11
1 year ago
Selected Answer: C
C is the correct answer: The overfitting problem can be addressed by applying regularization techniques such as L1 or L2 regularization and dropouts. Regularization techniques add a penalty term to the cost function of the model, which helps to reduce the complexity of the model and prevent it from overfitting to the training data. Dropouts randomly turn off some of the neurons during training, which also helps to prevent overfitting.
upvoted 2 times
...
Valcilio
1 year, 1 month ago
Selected Answer: C
D can work, but C is a better answer!
upvoted 2 times
...
drcok87
1 year, 2 months ago
C and D both seem correct, but removing layers seems to be the first step in optimization: https://www.kaggle.com/general/175912
upvoted 2 times
...
AjoseO
1 year, 2 months ago
Selected Answer: C
"C. Apply L1 or L2 regularization and dropouts to the training," because regularization can help reduce overfitting by adding a penalty to the loss function for large weights, preventing the model from memorizing the training data. Dropout is a regularization technique that randomly drops out neurons during the training process, further reducing the risk of overfitting.
upvoted 1 times
...
albu44
1 year, 3 months ago
Selected Answer: D
"The first step when dealing with overfitting is to decrease the complexity of the model. To decrease the complexity, we can simply remove layers or reduce the number of neurons to make the network smaller." https://www.kdnuggets.com/2019/12/5-techniques-prevent-overfitting-neural-networks.html
upvoted 1 times
...
Peeking
1 year, 4 months ago
Selected Answer: D
Deep learning tuning order:
1. Number of layers
2. Number of neurons (indirectly implements dropout)
3. L1/L2 regularization
4. Dropout
upvoted 4 times
kaike_reis
8 months, 3 weeks ago
The problem is overfitting, not hyperparameter tuning.
upvoted 1 times
Shakespeare
4 months, 1 week ago
It can be used for overfitting as well, but the problem does not say which deep learning algorithm is being used, so C would be more appropriate.
upvoted 1 times
...
...
...
Parth12
1 year, 9 months ago
Selected Answer: C
Here we are looking to reduce overfitting to improve generalization. In order to do so, L1 (Lasso) regularization has always been a good aid.
upvoted 3 times
mamun4105
7 months, 2 weeks ago
This is not a regression problem at all.
upvoted 1 times
...
...
mtp1993
1 year, 10 months ago
Selected Answer: C
C, Regularization and dropouts should be the first attempt
upvoted 3 times
...
ovokpus
1 year, 10 months ago
Selected Answer: C
Yes, C is right here. Regularization and Dropouts
upvoted 3 times
...
Abdelrahman_Omran
1 year, 12 months ago
Selected Answer: C
C is the answer
upvoted 4 times
...
Community vote distribution: A (35%), C (25%), B (20%), Other