Exam AWS Certified Machine Learning - Specialty topic 1 question 134 discussion

A company is building a predictive maintenance model based on machine learning (ML). The data is stored in a fully private Amazon S3 bucket that is encrypted at rest with AWS Key Management Service (AWS KMS) CMKs. An ML specialist must run data preprocessing by using an Amazon SageMaker Processing job that is triggered from code in an Amazon SageMaker notebook. The job should read data from Amazon S3, process it, and upload it back to the same S3 bucket.
The preprocessing code is stored in a container image in Amazon Elastic Container Registry (Amazon ECR). The ML specialist needs to grant permissions to ensure a smooth data preprocessing workflow.
Which set of actions should the ML specialist take to meet these requirements?

  • A. Create an IAM role that has permissions to create Amazon SageMaker Processing jobs, S3 read and write access to the relevant S3 bucket, and appropriate KMS and ECR permissions. Attach the role to the SageMaker notebook instance. Create an Amazon SageMaker Processing job from the notebook.
  • B. Create an IAM role that has permissions to create Amazon SageMaker Processing jobs. Attach the role to the SageMaker notebook instance. Create an Amazon SageMaker Processing job with an IAM role that has read and write permissions to the relevant S3 bucket, and appropriate KMS and ECR permissions.
  • C. Create an IAM role that has permissions to create Amazon SageMaker Processing jobs and to access Amazon ECR. Attach the role to the SageMaker notebook instance. Set up both an S3 endpoint and a KMS endpoint in the default VPC. Create Amazon SageMaker Processing jobs from the notebook.
  • D. Create an IAM role that has permissions to create Amazon SageMaker Processing jobs. Attach the role to the SageMaker notebook instance. Set up an S3 endpoint in the default VPC. Create Amazon SageMaker Processing jobs with the access key and secret key of the IAM user with appropriate KMS and ECR permissions.
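For reference, here is a minimal sketch of how such a Processing job is typically launched from a notebook with the SageMaker Python SDK. The bucket name, KMS key ARN, and ECR image URI below are hypothetical placeholders; the key point is that whatever role is passed as `role` is the identity the Processing job itself uses to read and write S3, decrypt with the CMK, and pull the container image from ECR.

```python
import sagemaker
from sagemaker.processing import Processor, ProcessingInput, ProcessingOutput

# Role the Processing job will run as. Using the notebook's own execution role
# (as option A describes); option B would instead pass a separate job role ARN.
role = sagemaker.get_execution_role()

processor = Processor(
    role=role,
    image_uri="123456789012.dkr.ecr.us-east-1.amazonaws.com/preprocess:latest",  # hypothetical ECR image
    instance_count=1,
    instance_type="ml.m5.xlarge",
    output_kms_key="arn:aws:kms:us-east-1:123456789012:key/EXAMPLE-KEY-ID",  # hypothetical CMK
)

processor.run(
    inputs=[ProcessingInput(
        source="s3://example-private-bucket/raw/",        # hypothetical bucket
        destination="/opt/ml/processing/input",
    )],
    outputs=[ProcessingOutput(
        source="/opt/ml/processing/output",
        destination="s3://example-private-bucket/processed/",
    )],
)
```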
Suggested Answer: B 🗳️

Comments

spaceexplorer
Highly Voted 2 years, 5 months ago
Selected Answer: A
A. The IAM role assigned to the SageMaker notebook instance can be passed to other SageMaker jobs such as training, processing, AutoML, etc.
upvoted 11 times
kukreti18
1 year, 3 months ago
Why should the IAM role be given permission to create S3 resources when the data is already stored in S3? It only requires permission to read and write data in S3. I believe A is incorrect.
upvoted 3 times
MJSY
Most Recent 2 weeks, 1 day ago
Selected Answer: B
A is not correct; for security and the principle of least privilege, you should decouple the roles of each service.
upvoted 1 times
Chiquitabandita
4 months, 2 weeks ago
Selected Answer: B
Based on the answers here.
upvoted 1 times
F1Fan
5 months, 3 weeks ago
Selected Answer: A
Option A: The IAM role is created with the necessary permissions to create Amazon SageMaker Processing jobs, read and write data to the relevant S3 bucket, and access the KMS CMKs and ECR container image. The IAM role is attached to the SageMaker notebook instance, which allows the notebook to assume the role and create the Amazon SageMaker Processing job with the necessary permissions. The Amazon SageMaker Processing job is created from the notebook, which ensures that the job has the necessary permissions to read data from S3, process it, and upload it back to the same S3 bucket.

Option B is close, but it's not entirely correct. It mentions creating an IAM role with permissions to create Amazon SageMaker Processing jobs, but it doesn't mention attaching the role to the SageMaker notebook instance. This is a crucial step, as it allows the notebook to assume the role and create the Amazon SageMaker Processing job with the necessary permissions.
upvoted 2 times
kyuhuck
8 months, 1 week ago
Selected Answer: B
The correct solution for granting permissions for data preprocessing is to use the following steps: Create an IAM role that has permissions to create Amazon SageMaker Processing jobs, and attach the role to the SageMaker notebook instance. This role allows the ML specialist to run Processing jobs from the notebook code. Then create an Amazon SageMaker Processing job with an IAM role that has read and write permissions to the relevant S3 bucket, and appropriate KMS and ECR permissions. This role allows the Processing job to access the data in the encrypted S3 bucket, decrypt it with the KMS CMK, and pull the container image from ECR. The other options are incorrect because they either miss some permissions or use unnecessary steps. For example
upvoted 1 times
CloudHandsOn
8 months, 4 weeks ago
Selected Answer: B
Least privilege.
upvoted 1 times
CloudHandsOn
9 months, 1 week ago
Selected Answer: B
A. Create an IAM role with S3, KMS, ECR permissions and SageMaker Processing job creation permissions, and attach it to the SageMaker notebook instance: This option seems comprehensive as it includes all necessary permissions. However, attaching this role directly to the SageMaker notebook instance would not be sufficient for the Processing job itself. The Processing job needs its own role with appropriate permissions.

B. Create two IAM roles: one for the SageMaker notebook with permissions to create Processing jobs, and another for the Processing job itself with S3, KMS, and ECR permissions: This option is more aligned with best practices. The notebook instance and the Processing job have different roles tailored to their specific needs. This separation ensures that each service has only the permissions necessary for its operation, following the principle of least privilege.
upvoted 3 times
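As a rough illustration of the two-role pattern described above (option B), only the job role carries the data permissions, while the notebook just launches the job. A minimal sketch with the SageMaker Python SDK follows; the role ARN and image URI are hypothetical placeholders.

```python
from sagemaker.processing import Processor

# Separate execution role for the Processing job (option B): this role, not the
# notebook's role, needs S3 read/write on the bucket plus KMS and ECR permissions.
job_role_arn = "arn:aws:iam::123456789012:role/ProcessingJobRole"  # hypothetical

processor = Processor(
    role=job_role_arn,  # the job runs as this role on SageMaker-managed instances
    image_uri="123456789012.dkr.ecr.us-east-1.amazonaws.com/preprocess:latest",  # hypothetical
    instance_count=1,
    instance_type="ml.m5.xlarge",
)

# Note: the identity calling this from the notebook generally also needs
# sagemaker:CreateProcessingJob and iam:PassRole on job_role_arn.
```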
rav009
9 months, 2 weeks ago
The processing job may not run on the notebook instance. AWS will provide resources to execute the job. So A is wrong. B.
upvoted 2 times
endeesa
10 months, 3 weeks ago
Selected Answer: B
If we follow the principle of least privilege, B is correct. The notebook instance does not need access to S3 and KMS, given that it is only needed to trigger the Processing job.
upvoted 2 times
u_b
11 months ago
Not A, because it does not indicate the permissions given to the job via an IAM role. I went with B.
upvoted 1 times
DimLam
11 months, 3 weeks ago
Selected Answer: B
My answer is B. The notebook instance doesn't need access to S3 and ECR; that access is needed for the Processing job only. Following the best practice of least privilege, I'll choose B.
upvoted 3 times
Rejju
1 year ago
Selected Answer: B
Permissions are granted to the SageMaker Processing job itself and not to the notebook instance. This approach offers better security and control over permissions, making it the preferred choice for running SageMaker Processing jobs with the required access to S3, KMS, and ECR. It follows the principle of least privilege and gives more control over permissions.
upvoted 2 times
loict
1 year, 1 month ago
Selected Answer: A
It says "Amazon SageMaker Processing job that is triggered from code in an Amazon SageMaker notebook," so A or C. There is no need to create an S3 endpoint (C); that is only needed to route S3 traffic privately instead of over the internet. So A.
upvoted 1 times
Mickey321
1 year, 1 month ago
Selected Answer: B
I was torn between A and B, but I'm leaning toward B. The main difference between A and B is the IAM role that is attached to the SageMaker notebook instance. In A, the role has permissions to access the data, the container image, and the KMS CMK. In B, the role only has permissions to create SageMaker Processing jobs. This means that in A, the notebook instance can potentially access or modify the data or the image without using a Processing job, which is not desirable. In B, the notebook instance can only create Processing jobs, and the Processing jobs themselves have a separate IAM role that grants them access to the data, the image, and the KMS CMK. This way, the data and the image are only accessed by the Processing jobs, which are more secure and controlled than the notebook instance.
upvoted 4 times
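To make the "separate IAM role for the Processing job" concrete, here is a rough sketch of the kind of inline policy such a job role might carry. The bucket name, key ARN, and role name are hypothetical, and the exact set of actions depends on the setup.

```python
import json
import boto3

iam = boto3.client("iam")

# Hypothetical resources; substitute the real bucket, CMK, and ECR repository.
job_role_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {   # Read raw data and write processed output back to the same bucket
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject", "s3:ListBucket"],
            "Resource": [
                "arn:aws:s3:::example-private-bucket",
                "arn:aws:s3:::example-private-bucket/*",
            ],
        },
        {   # Decrypt objects encrypted with the CMK and encrypt new output
            "Effect": "Allow",
            "Action": ["kms:Decrypt", "kms:GenerateDataKey", "kms:DescribeKey"],
            "Resource": "arn:aws:kms:us-east-1:123456789012:key/EXAMPLE-KEY-ID",
        },
        {   # Pull the preprocessing container image from ECR
            "Effect": "Allow",
            "Action": [
                "ecr:GetAuthorizationToken",
                "ecr:BatchGetImage",
                "ecr:GetDownloadUrlForLayer",
                "ecr:BatchCheckLayerAvailability",
            ],
            "Resource": "*",
        },
    ],
}

iam.put_role_policy(
    RoleName="ProcessingJobRole",  # hypothetical role name
    PolicyName="processing-job-data-access",
    PolicyDocument=json.dumps(job_role_policy),
)
```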
kaike_reis
1 year, 2 months ago
Selected Answer: A
Options C and D are wrong, as they bring in VPC endpoints, something that is not mentioned in the problem. Option A is correct, since option B asks for the creation of two different IAM roles.
upvoted 1 times
DimLam
11 months, 3 weeks ago
What is the problem with creating two different IAM roles?
upvoted 1 times
ccpmad
1 year, 2 months ago
Selected Answer: A
Option A ensures that the role has the necessary permissions to access the required resources (S3, KMS, ECR) and that the notebook has the ability to create a processing job in SageMaker seamlessly. It also follows the principle of "least privilege" by granting only the necessary permissions to perform the task without exposing more access than required.
upvoted 1 times
ADVIT
1 year, 3 months ago
Probably A is simpler than B. Per https://docs.aws.amazon.com/sagemaker/latest/dg/security-iam-awsmanpol.html#security-iam-awsmanpol-AmazonSageMakerFullAccess, one IAM role can do everything.
upvoted 1 times
DimLam
11 months, 3 weeks ago
It's rarely a best practice.
upvoted 1 times
Community vote distribution: A (35%), C (25%), B (20%), Other