Exam AWS Certified Machine Learning - Specialty topic 1 question 211 discussion

A bank wants to use a machine learning (ML) model to predict if users will default on credit card payments. The training data consists of 30,000 labeled records and is evenly balanced between two categories. For the model, an ML specialist selects the Amazon SageMaker built-in XGBoost algorithm and configures a SageMaker automatic hyperparameter optimization job with the Bayesian method. The ML specialist uses the validation accuracy as the objective metric.

When the bank implements the solution with this model, the prediction accuracy is 75%. The bank has given the ML specialist 1 day to improve the model in production.

Which approach is the FASTEST way to improve the model's accuracy?

  • A. Run a SageMaker incremental training based on the best candidate from the current model's tuning job. Monitor the same metric that was used as the objective metric in the previous tuning, and look for improvements.
  • B. Set the Area Under the ROC Curve (AUC) as the objective metric for a new SageMaker automatic hyperparameter tuning job. Use the same maximum training jobs parameter that was used in the previous tuning job.
  • C. Run a SageMaker warm start hyperparameter tuning job based on the current model’s tuning job. Use the same objective metric that was used in the previous tuning.
  • D. Set the F1 score as the objective metric for a new SageMaker automatic hyperparameter tuning job. Double the maximum training jobs parameter that was used in the previous tuning job.
Suggested Answer: C
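For context on why C is the fastest option: a warm start tuning job reuses the parent job's evaluated hyperparameter configurations to seed the new Bayesian search instead of starting from scratch. A minimal sketch of what that request looks like, using the field names of the SageMaker `CreateHyperParameterTuningJob` API (the job names and resource limits here are hypothetical placeholders, and the full request would also need a `TrainingJobDefinition`):

```python
# Sketch of a warm start tuning request (option C). Field names follow the
# shape of boto3's SageMaker create_hyper_parameter_tuning_job API; the job
# names and ResourceLimits values are assumed placeholders.

def build_warm_start_request(parent_job_name, new_job_name):
    """Build the tuning-job request fields that matter for a warm start.

    WarmStartType "IdenticalDataAndAlgorithm" seeds the new Bayesian search
    with the parent job's results, keeping the same data and algorithm.
    """
    return {
        "HyperParameterTuningJobName": new_job_name,
        "HyperParameterTuningJobConfig": {
            "Strategy": "Bayesian",
            "HyperParameterTuningJobObjective": {
                # Same objective metric as the previous tuning job (option C)
                "Type": "Maximize",
                "MetricName": "validation:accuracy",
            },
            "ResourceLimits": {
                "MaxNumberOfTrainingJobs": 20,  # assumed limit
                "MaxParallelTrainingJobs": 2,
            },
        },
        "WarmStartConfig": {
            "ParentHyperParameterTuningJobs": [
                {"HyperParameterTuningJobName": parent_job_name}
            ],
            "WarmStartType": "IdenticalDataAndAlgorithm",
        },
    }

request = build_warm_start_request(
    "xgboost-default-tuning-v1", "xgboost-default-tuning-warm"
)
print(request["WarmStartConfig"]["WarmStartType"])  # IdenticalDataAndAlgorithm
```

In the real call you would pass this dict (plus a `TrainingJobDefinition`) to `boto3.client("sagemaker").create_hyper_parameter_tuning_job(**request)`; the key point for the exam is that the objective metric stays the same and the parent job is referenced rather than discarded.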

Comments

loict
7 months, 2 weeks ago
Selected Answer: C
A. NO - incremental training is not supported by XGBoost (https://docs.aws.amazon.com/sagemaker/latest/dg/incremental-training.html)
B. NO - we don't want to change the objective and restart from scratch
C. YES - warm start can leverage the previous tuning job's results for further tuning
D. NO - we don't want to restart training from scratch or switch the objective to the F1 score
upvoted 3 times
Mickey321
8 months, 2 weeks ago
Selected Answer: C
Answer C
upvoted 1 times
kaike_reis
8 months, 2 weeks ago
Given the time constraint, I believe that C is the correct one.
upvoted 2 times
Ahmedhadi_
1 year ago
Selected Answer: C
C is the correct answer because it uses the results from past HPO jobs and builds upon them to improve accuracy.
upvoted 3 times
mawsman
1 year ago
Selected Answer: C
I go with C (warm start). A is not supported for XGBoost, and the other options start tuning from scratch and might end up just as bad as the initial tuning job. We only have 1 day, so using the existing tuning job to inform the new one is the only option here: https://docs.aws.amazon.com/sagemaker/latest/dg/automatic-model-tuning-warm-start.html
upvoted 4 times
Mllb
1 year ago
C is the correct answer. You can't use Incremental training on Xgboost algorithm https://docs.aws.amazon.com/sagemaker/latest/dg/incremental-training.html
upvoted 1 times
Mllb
1 year ago
This question appeared on the exam on 2023-April-3.
upvoted 1 times
SANDEEP_AWS
1 year, 1 month ago
Selected Answer: B
ROC-AUC is generally considered one of the best objective metrics for binary classification, hence option B. Option A: incremental training suits cases where the training dataset is updated frequently.
upvoted 3 times
drcok87
1 year, 2 months ago
C.
https://docs.aws.amazon.com/sagemaker/latest/dg/automatic-model-tuning-warm-start.html
https://francesca-donadoni.medium.com/training-an-xgboost-model-for-pricing-analysis-using-aws-sagemaker-55d777708e52
https://docs.aws.amazon.com/sagemaker/latest/dg/incremental-training.html
upvoted 3 times
drcok87
1 year, 2 months ago
Also, it cannot be A: "Only three built-in algorithms currently support incremental training: Object Detection - MXNet, Image Classification - MXNet, and Semantic Segmentation Algorithm." From https://docs.aws.amazon.com/sagemaker/latest/dg/incremental-training.html
upvoted 1 times
Community vote distribution: A (35%), C (25%), B (20%), Other