Exam Professional Machine Learning Engineer topic 1 question 10 discussion

Actual exam question from Google's Professional Machine Learning Engineer
Question #: 10
Topic #: 1

Your team needs to build a model that predicts whether images contain a driver's license, passport, or credit card. The data engineering team already built the pipeline and generated a dataset composed of 10,000 images with driver's licenses, 1,000 images with passports, and 1,000 images with credit cards. You now have to train a model with the following label map: ['drivers_license', 'passport', 'credit_card']. Which loss function should you use?

  • A. Categorical hinge
  • B. Binary cross-entropy
  • C. Categorical cross-entropy
  • D. Sparse categorical cross-entropy
Suggested Answer: D

Comments

ransev
Highly Voted 3 years, 6 months ago
Answer is C
upvoted 20 times
gcp2021go
3 years, 5 months ago
Use sparse categorical crossentropy when your classes are mutually exclusive (e.g. when each sample belongs exactly to one class) and categorical crossentropy when one sample can have multiple classes or labels are soft probabilities (like [0.5, 0.3, 0.2]).
upvoted 9 times
GogoG
3 years, 2 months ago
Definitely C - the target label as formulated in the question requires a categorical cross-entropy loss function, i.e. 3 columns 'drivers_license', 'passport', 'credit_card' that can each take the value 1 or 0. Meanwhile, sparse categorical cross-entropy would require the labels to be integer-encoded in a single vector, for example 'drivers_license' = 1, 'passport' = 2, 'credit_card' = 3.
upvoted 8 times
desertlotus1211
4 days, 22 hours ago
Wrong - it's [0, 1, 2].
upvoted 1 times
dorinas
1 month ago
Using 1, 2 and 3, the model might interpret one value as more important than another, which is not the case here.
upvoted 1 times
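To make the encoding debate in this thread concrete, here is a minimal sketch, assuming TensorFlow 2.x (the sample labels are invented for illustration). The index a label map assigns is a position, not a magnitude, so no ordinal meaning is implied:

import numpy as np
import tensorflow as tf

label_map = ['drivers_license', 'passport', 'credit_card']

# Integer encoding -- what sparse categorical cross-entropy expects:
# each sample's label is the index of its class in the label map.
y_sparse = np.array([0, 2, 1])  # drivers_license, credit_card, passport

# One-hot encoding -- what categorical cross-entropy expects:
# each sample's label is a vector with a 1 at that index.
y_onehot = tf.keras.utils.to_categorical(y_sparse, num_classes=len(label_map))
print(y_onehot)
# [[1. 0. 0.]
#  [0. 0. 1.]
#  [0. 1. 0.]]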
gcp2021go
Highly Voted 3 years, 6 months ago
Answer is D: https://machinelearningmastery.com/how-to-choose-loss-functions-when-training-deep-learning-neural-networks/
upvoted 10 times
ori5225
3 years, 4 months ago
Use sparse categorical crossentropy when your classes are mutually exclusive (e.g. when each sample belongs exactly to one class) and categorical crossentropy when one sample can have multiple classes or labels are soft probabilities (like [0.5, 0.3, 0.2]).
upvoted 3 times
giaZ
2 years, 9 months ago
Literally from the link you posted: "A possible cause of frustration when using cross-entropy with classification problems with a large number of labels is the one hot encoding process. [...] This can mean that the target element of each training example may require a one hot encoded vector with tens or hundreds of thousands of zero values, requiring significant memory. Sparse cross-entropy addresses this by performing the same cross-entropy calculation of error, without requiring that the target variable be one hot encoded prior to training". Here we have 3 categories...No problem doing one-hot encoding. Answer: C
upvoted 2 times
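A rough sketch of the memory argument giaZ quotes, with hypothetical array sizes: integer targets need one value per sample, while one-hot targets need one value per sample per class:

import numpy as np

n_samples = 12_000  # the dataset size from the question
n_classes = 3

y_int = np.zeros(n_samples, dtype=np.int64)                 # integer labels
y_hot = np.zeros((n_samples, n_classes), dtype=np.float32)  # one-hot labels

print(y_int.nbytes)  # 96,000 bytes, independent of n_classes
print(y_hot.nbytes)  # 144,000 bytes, grows linearly with n_classes

With only 3 classes the difference is negligible, which is giaZ's point; with tens of thousands of classes the one-hot array would dominate memory.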
desertlotus1211
Most Recent 4 days, 22 hours ago
Selected Answer: D
The label map ['drivers_license', 'passport', 'credit_card'] naturally maps to 0, 1, 2, as is standard in machine learning, and that integer format is what sparse categorical cross-entropy uses.
upvoted 1 times
rajshiv
2 weeks, 1 day ago
Selected Answer: C
It is C. D would be appropriate only if the labels were integers, which is not true in this case.
upvoted 1 times
joqu
1 month ago
Selected Answer: D
The question clearly says "You now have to train a model with the following LABEL MAP". A label map is not one-hot encoding.
upvoted 1 times
jkkim_jt
2 months ago
Selected Answer: D
• Categorical cross-entropy: for multi-class classification with one-hot-encoded labels
• Sparse categorical cross-entropy: for multi-class classification with integer index labels
upvoted 1 times
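A small check of the summary above, assuming TensorFlow 2.x (the predictions are made-up softmax outputs): the two losses compute the same cross-entropy and differ only in the label format they accept:

import numpy as np
import tensorflow as tf

y_pred = np.array([[0.8, 0.1, 0.1],
                   [0.2, 0.2, 0.6],
                   [0.1, 0.7, 0.2]])  # softmax outputs for 3 samples

y_int = np.array([0, 2, 1])                                  # index labels
y_hot = tf.keras.utils.to_categorical(y_int, num_classes=3)  # one-hot labels

scce = tf.keras.losses.SparseCategoricalCrossentropy()
cce = tf.keras.losses.CategoricalCrossentropy()

print(scce(y_int, y_pred).numpy())  # ~0.3635
print(cce(y_hot, y_pred).numpy())   # ~0.3635, identical value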
Prakzz
5 months, 3 weeks ago
Selected Answer: D
C needs the target to be one-hot encoded already. Since it is not, the answer is D.
upvoted 1 times
PhilipKoku
6 months, 2 weeks ago
Selected Answer: C
C) Multi-class classification (three or more classes): since you have three classes, you should use a multi-class loss function. The most common choice for multi-class image classification is categorical cross-entropy, which is designed for scenarios where each input belongs to exactly one class (i.e., mutually exclusive classes). Therefore, the correct answer is C. Categorical cross-entropy. It's well suited for multi-class classification tasks like this one.
References:
• How to Choose Loss Functions When Training Deep Learning Neural Networks: https://machinelearningmastery.com/how-to-choose-loss-functions-when-training-deep-learning-neural-networks/
• Stack Exchange: How to know which loss function is suitable for image classification? https://datascience.stackexchange.com/questions/58138/how-to-know-which-loss-function-is-suitable-for-image-classification
upvoted 1 times
gscharly
8 months ago
Selected Answer: C
I'd go with C. Categorical cross-entropy is used when classes are mutually exclusive. If the number of classes were very high, then we could use sparse categorical cross-entropy.
upvoted 1 times
pinimichele01
8 months, 2 weeks ago
Selected Answer: D
Use sparse categorical crossentropy when your classes are mutually exclusive (e.g. when each sample belongs exactly to one class) and categorical crossentropy when one sample can have multiple classes or labels are soft probabilities (like [0.5, 0.3, 0.2]).
upvoted 3 times
pinimichele01
8 months ago
A. Categorical hinge: mainly for SVM soft margins
B. Binary cross-entropy: for 2 classes only
C. Categorical cross-entropy: multi-class, but not necessarily mutually exclusive
D. Sparse categorical cross-entropy: multi-class and mutually exclusive only; saves memory too
upvoted 2 times
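For reference, all four answer options exist as Keras losses; a quick sketch, assuming TensorFlow 2.x, annotated with the use cases from the list above:

import tensorflow as tf

# A. SVM-style margin loss; expects one-hot targets.
hinge = tf.keras.losses.CategoricalHinge()
# B. Two classes (or independent multi-label) with sigmoid outputs.
bce = tf.keras.losses.BinaryCrossentropy()
# C. Multi-class with softmax outputs and one-hot targets.
cce = tf.keras.losses.CategoricalCrossentropy()
# D. Multi-class with softmax outputs and integer index targets.
scce = tf.keras.losses.SparseCategoricalCrossentropy()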
pinimichele01
8 months ago
https://www.tensorflow.org/api_docs/python/tf/keras/losses/categorical_crossentropy
https://www.tensorflow.org/api_docs/python/tf/keras/metrics/sparse_categorical_crossentropy
upvoted 1 times
Yan_X
8 months, 3 weeks ago
Selected Answer: C
C. D is for integer values instead of one-hot encoded vectors; in our question, the labels 'drivers_license', 'passport', 'credit_card' are one-hot encoded.
upvoted 1 times
Paulus89
9 months, 3 weeks ago
Selected Answer: C
It depends on how the labels are encoded. If one-hot, use CCE; if a single integer representing the class, use SCCE (source: same as in the official (wrong) answer). From the question it's not clear how the labels are encoded, but with just 3 classes there is no reason not to go with one-hot encoding. Memory restrictions or a huge number of classes might point to SCCE.
upvoted 1 times
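A sketch of Paulus89's point, assuming TensorFlow 2.x (the input shape and architecture are placeholders): the loss is chosen at compile time to match whichever label encoding the pipeline produced:

import tensorflow as tf

# Hypothetical 3-class image classifier; the 64x64x3 input is illustrative.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(64, 64, 3)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(3, activation='softmax'),
])

# Labels one-hot encoded (shape [n, 3]) -> categorical cross-entropy (C):
model.compile(optimizer='adam', loss='categorical_crossentropy',
              metrics=['accuracy'])

# Labels as integers 0/1/2 (shape [n]) -> sparse categorical cross-entropy (D):
model.compile(optimizer='adam', loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])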
Zwi3b3l
11 months ago
Selected Answer: D
You now HAVE TO train a model with the following label map: ['drivers_license', 'passport', 'credit_card'].
upvoted 2 times
Sum_Sum
1 year, 1 month ago
Selected Answer: C
If you are wondering between C and D, think about what "sparse" means: it is used when dealing with hundreds of categories.
upvoted 1 times
Sahana_98
1 year, 1 month ago
Selected Answer: D
mutually exclusive classes
upvoted 1 times
syedsajjad
1 year, 2 months ago
In this case, we have a multi-class classification problem with three classes: driver's license, passport, and credit card. Therefore, we should use the categorical cross-entropy loss function to train our model. Sparse categorical cross-entropy is used for multi-class classification problems where the labels are provided as integer indices (a sparse representation) rather than one-hot vectors, which is not the case in this problem.
upvoted 2 times
lalala_meow
1 year, 3 months ago
Selected Answer: C
Only 3 categories, with each value being either true or false. They don't really need to be integer encoded, which is what distinguishes sparse categorical cross-entropy from categorical cross-entropy.
upvoted 1 times