A Machine Learning Specialist is developing a custom video recommendation model for an application. The dataset used to train this model is very large, with millions of data points, and is hosted in an Amazon S3 bucket. The Specialist wants to avoid loading all of this data onto an Amazon SageMaker notebook instance because it would take hours to move and would exceed the attached 5 GB Amazon EBS volume on the notebook instance.
Which approach allows the Specialist to use all the data to train the model?
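The approach generally cited for this scenario is to launch a SageMaker training job that reads the dataset directly from S3 using Pipe input mode, which streams records to the training container instead of copying the full dataset to local disk. The sketch below assembles a `CreateTrainingJob` request with `TrainingInputMode` set to `Pipe`; all names (bucket, role, image URI, job name) are hypothetical placeholders, and the instance/volume sizes are illustrative assumptions.

```python
def build_training_job_request(job_name, image_uri, role_arn,
                               s3_train_uri, output_s3_uri):
    """Assemble a SageMaker CreateTrainingJob request that streams the
    dataset from S3 via Pipe input mode rather than downloading it."""
    return {
        "TrainingJobName": job_name,
        "AlgorithmSpecification": {
            "TrainingImage": image_uri,
            # Pipe mode streams records from S3 to the container,
            # so the full dataset never lands on the instance volume.
            "TrainingInputMode": "Pipe",
        },
        "RoleArn": role_arn,
        "InputDataConfig": [
            {
                "ChannelName": "train",
                "DataSource": {
                    "S3DataSource": {
                        "S3DataType": "S3Prefix",
                        "S3Uri": s3_train_uri,
                        "S3DataDistributionType": "FullyReplicated",
                    }
                },
            }
        ],
        "OutputDataConfig": {"S3OutputPath": output_s3_uri},
        "ResourceConfig": {
            "InstanceType": "ml.m5.xlarge",  # illustrative choice
            "InstanceCount": 1,
            "VolumeSizeInGB": 30,
        },
        "StoppingCondition": {"MaxRuntimeInSeconds": 86400},
    }


# Hypothetical values for illustration only.
request = build_training_job_request(
    "video-recs-train",
    "123456789012.dkr.ecr.us-east-1.amazonaws.com/video-recs:latest",
    "arn:aws:iam::123456789012:role/SageMakerRole",
    "s3://my-bucket/training-data/",
    "s3://my-bucket/model-output/",
)
# In a real account this dict would be passed to
# boto3.client("sagemaker").create_training_job(**request).
```

A common pattern is to first validate the training code on a small sample inside the notebook, then submit the full-scale job with a request like the one above.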