Exam AWS Certified Machine Learning - Specialty topic 1 question 314 discussion

A data scientist is trying to improve the accuracy of a neural network classification model. The data scientist wants to run a large hyperparameter tuning job in Amazon SageMaker. However, previous smaller tuning jobs on the same model often ran for several weeks. The data scientist wants to reduce the computation time required to run the tuning job.

Which actions will MOST reduce the computation time for the hyperparameter tuning job? (Choose two.)

  • A. Use the Hyperband tuning strategy.
  • B. Increase the number of hyperparameters.
  • C. Set a lower value for the MaxNumberOfTrainingJobs parameter.
  • D. Use the grid search tuning strategy.
  • E. Set a lower value for the MaxParallelTrainingJobs parameter.
Suggested Answer: AC
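
For context, here is a minimal sketch of how the two chosen actions map onto a SageMaker tuning job, assuming the SageMaker Python SDK. The image URI, role, metric name, and hyperparameter ranges below are placeholders, not details from the question:

```python
from sagemaker.estimator import Estimator
from sagemaker.tuner import (
    ContinuousParameter,
    HyperparameterTuner,
    IntegerParameter,
)

# Placeholder estimator for the neural network classifier; the image URI,
# role, and instance settings are illustrative assumptions.
estimator = Estimator(
    image_uri="<training-image-uri>",
    role="<execution-role-arn>",
    instance_count=1,
    instance_type="ml.p3.2xlarge",
)

# Illustrative hyperparameter ranges; a custom training container would also
# need metric_definitions so the tuner can parse the objective metric from logs.
hyperparameter_ranges = {
    "learning_rate": ContinuousParameter(1e-5, 1e-1, scaling_type="Logarithmic"),
    "batch_size": IntegerParameter(32, 512),
}

tuner = HyperparameterTuner(
    estimator=estimator,
    objective_metric_name="validation:accuracy",  # assumed metric name
    objective_type="Maximize",
    hyperparameter_ranges=hyperparameter_ranges,
    strategy="Hyperband",  # option A: early-stops unpromising training jobs
    max_jobs=50,           # option C: caps MaxNumberOfTrainingJobs
    max_parallel_jobs=10,  # kept high; lowering it (option E) would lengthen the run
)

tuner.fit({"train": "s3://<bucket>/train", "validation": "s3://<bucket>/validation"})
```

In the underlying CreateHyperParameterTuningJob API, max_jobs and max_parallel_jobs correspond to the ResourceLimits fields MaxNumberOfTrainingJobs and MaxParallelTrainingJobs.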

Comments

ef12052
2 weeks, 3 days ago
Selected Answer: AC
Reducing parallelism increases sequential execution time, making the job take longer overall.
upvoted 1 times
...
wiss_90
6 months, 1 week ago
AE https://docs.aws.amazon.com/sagemaker/latest/dg/automatic-model-tuning-considerations.html#automatic-model-tuning-num-hyperparameters
upvoted 2 times
...
luccabastos
7 months, 2 weeks ago
Selected Answer: AC
(ChatGPT) Option A (Hyperband): Efficiently utilizes computational resources. Reduces computation time by early stopping unpromising training jobs. Allows for a broader search of the hyperparameter space within a shorter time.
Option C (Lower MaxNumberOfTrainingJobs): Reduces the total number of training jobs. Directly decreases computation time. Helps stay within a small compute budget.
upvoted 2 times
sfwewv
1 month, 3 weeks ago
Stop giving GPT answers; GPT may not be correct.
upvoted 1 times
...
...
Peter_Hsieh
12 months ago
Selected Answer: AE
https://docs.aws.amazon.com/sagemaker/latest/dg/automatic-model-tuning-considerations.html#automatic-model-tuning-num-hyperparameters
upvoted 4 times
...
ggrodskiy
12 months ago
Correct AE
upvoted 1 times
...
vkbajoria
1 year, 1 month ago
Selected Answer: AC
A for sure. C will reduce the tuning time because we are limiting the maximum number of training jobs. E would be fine if the objective were not reducing time.
upvoted 2 times
...
AIWave
1 year, 1 month ago
Selected Answer: AC
A: Yes. Hyperband early-stops poorly performing training jobs and reallocates resources to better-performing ones (a configuration sketch follows this thread).
B: No. Increasing the number of hyperparameters will increase the time taken.
C: Yes. Limiting the number of training jobs means the tuner does not run through the entire set of candidate jobs, reducing time.
D: No. Grid search is computationally expensive and will take longer.
E: No. Lowering parallelism will increase the time taken.
upvoted 3 times
...
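
Building on the early-stopping point above, here is a sketch of how Hyperband's resource budget could be tightened, assuming the SageMaker Python SDK exposes HyperbandStrategyConfig. The resource values are illustrative, and estimator / hyperparameter_ranges reuse the placeholders from the earlier sketch:

```python
from sagemaker.tuner import (
    HyperbandStrategyConfig,
    HyperparameterTuner,
    StrategyConfig,
)

# Hyperband allots each training job between min_resource and max_resource
# "resources" (typically epochs); jobs that lag behind are stopped early and
# the freed capacity is reallocated to more promising configurations.
strategy_config = StrategyConfig(
    hyperband_strategy_config=HyperbandStrategyConfig(
        min_resource=1,    # illustrative: fewest epochs before a job may be stopped
        max_resource=20,   # illustrative: most epochs granted to surviving jobs
    )
)

tuner = HyperparameterTuner(
    estimator=estimator,                          # placeholder from the earlier sketch
    objective_metric_name="validation:accuracy",  # assumed metric name
    hyperparameter_ranges=hyperparameter_ranges,  # placeholder from the earlier sketch
    strategy="Hyperband",
    strategy_config=strategy_config,
    max_jobs=50,
    max_parallel_jobs=10,
)
```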
gsw
1 year, 1 month ago
A and D surely. Grid search to tune the hyperparameters. The grid search algorithm is more efficient than exhaustive search and will speed up hyperparameter tuning.
upvoted 1 times
...
F1Fan
1 year, 1 month ago
Option A: Use the Hyperband tuning strategy. The Hyperband tuning strategy is a resource-efficient and time-saving approach for hyperparameter tuning. It works by running a set of hyperparameter configurations for a small number of training iterations and eliminating the poorly performing configurations early on. This strategy can significantly reduce the overall computation time compared to traditional methods like grid search or random search, especially for large hyperparameter spaces or time-consuming models like neural networks.

Option E: Set a lower value for the MaxParallelTrainingJobs parameter. The MaxParallelTrainingJobs parameter in Amazon SageMaker specifies the maximum number of concurrent training jobs to be run in parallel during the hyperparameter tuning process. By setting a lower value for this parameter, the data scientist can limit the amount of computational resources used simultaneously, potentially reducing the overall computation time and cost.
upvoted 2 times
F1Fan
1 year, 1 month ago
On second thought, A and C make more sense. C. Set a lower value for the MaxNumberOfTrainingJobs parameter.
- The MaxNumberOfTrainingJobs parameter specifies the maximum number of training jobs that can be created during the tuning job.
- Setting a lower value for this parameter will limit the number of training jobs and potentially reduce the computation time.
- However, it may also limit the exploration of the hyperparameter space and potentially lead to suboptimal results.
- This option should be considered with caution and in conjunction with other strategies to ensure adequate hyperparameter exploration.
upvoted 1 times
...
...
Community vote distribution: A (35%), C (25%), B (20%), Other