Exam Professional Machine Learning Engineer topic 1 question 10 discussion

Actual exam question from Google's Professional Machine Learning Engineer
Question #: 10
Topic #: 1

Your team needs to build a model that predicts whether images contain a driver's license, passport, or credit card. The data engineering team already built the pipeline and generated a dataset composed of 10,000 images with driver's licenses, 1,000 images with passports, and 1,000 images with credit cards. You now have to train a model with the following label map: ['drivers_license', 'passport', 'credit_card']. Which loss function should you use?

  • A. Categorical hinge
  • B. Binary cross-entropy
  • C. Categorical cross-entropy
  • D. Sparse categorical cross-entropy
Suggested Answer: D

Comments

ransev
Highly Voted 3 years, 7 months ago
Answer is C
upvoted 21 times
gcp2021go
3 years, 7 months ago
Use sparse categorical crossentropy when your classes are mutually exclusive (e.g. when each sample belongs exactly to one class) and categorical crossentropy when one sample can have multiple classes or labels are soft probabilities (like [0.5, 0.3, 0.2]).
upvoted 9 times
GogoG
3 years, 3 months ago
Definitely C. The label format in the question calls for a categorical cross-entropy loss, i.e. three columns ('drivers_license', 'passport', 'credit_card') that can each take the value 0 or 1. Sparse categorical cross-entropy would instead require the labels to be integer-encoded in a single vector, for example 'drivers_license' = 1, 'passport' = 2, 'credit_card' = 3 (see the sketch after this thread).
upvoted 8 times
desertlotus1211
1 month, 2 weeks ago
Wrong, it's [0, 1, 2].
upvoted 1 times
...
dorinas
2 months, 2 weeks ago
Using 1, 2, and 3, the model might interpret one value as more important than another, which is not the case here.
upvoted 1 times
...
...
...
...
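To make the disagreement in this thread concrete, here is a minimal sketch (assuming TensorFlow/Keras, which the question never names; the logits and labels are made-up examples): the two losses compute the same cross-entropy and differ only in the label encoding they expect.

```python
import numpy as np
import tensorflow as tf

# Model outputs (probabilities after softmax) for 2 sample images.
logits = np.array([[2.0, 0.5, 0.3],
                   [0.1, 1.9, 0.2]])
probs = tf.nn.softmax(logits)

# Integer-encoded labels: 0 = drivers_license, 1 = passport, 2 = credit_card.
int_labels = np.array([0, 1])
# The same labels, one-hot encoded.
onehot_labels = np.array([[1.0, 0.0, 0.0],
                          [0.0, 1.0, 0.0]])

# D: sparse categorical cross-entropy consumes the integer labels directly.
sparse_loss = tf.keras.losses.SparseCategoricalCrossentropy()(int_labels, probs)
# C: categorical cross-entropy needs the one-hot labels.
dense_loss = tf.keras.losses.CategoricalCrossentropy()(onehot_labels, probs)

print(float(sparse_loss), float(dense_loss))  # identical values
```

The printed values match because SparseCategoricalCrossentropy does the index lookup internally; the exam question therefore hinges on which label encoding the label map implies, not on a mathematical difference.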
gcp2021go
Highly Voted 3 years, 8 months ago
Answer is D: https://machinelearningmastery.com/how-to-choose-loss-functions-when-training-deep-learning-neural-networks/
upvoted 10 times
ori5225
3 years, 5 months ago
Use sparse categorical crossentropy when your classes are mutually exclusive (e.g. when each sample belongs exactly to one class) and categorical crossentropy when one sample can have multiple classes or labels are soft probabilities (like [0.5, 0.3, 0.2]).
upvoted 3 times
...
giaZ
2 years, 10 months ago
Literally from the link you posted: "A possible cause of frustration when using cross-entropy with classification problems with a large number of labels is the one hot encoding process. [...] This can mean that the target element of each training example may require a one hot encoded vector with tens or hundreds of thousands of zero values, requiring significant memory. Sparse cross-entropy addresses this by performing the same cross-entropy calculation of error, without requiring that the target variable be one hot encoded prior to training." Here we have 3 categories, so one-hot encoding is no problem (see the sketch after this thread). Answer: C
upvoted 2 times
...
...
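giaZ's memory argument can be checked directly. The following sketch (tf.keras assumed; the class counts come from the question, everything else is illustrative) compares the storage for integer versus one-hot labels on this question's 12,000-image dataset.

```python
import numpy as np
from tensorflow.keras.utils import to_categorical

n_samples = 12_000  # 10,000 + 1,000 + 1,000 images from the question
int_labels = np.random.randint(0, 3, size=n_samples)

onehot = to_categorical(int_labels, num_classes=3)
print(int_labels.nbytes)  # ~96 KB with int64 labels
print(onehot.nbytes)      # ~144 KB as float32 - trivial for 3 classes
# With, say, 100,000 classes the one-hot array would be ~4.8 GB,
# which is where sparse cross-entropy's memory saving actually matters.
```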
ddeveloperr
Most Recent 1 day, 18 hours ago
Selected Answer: D
Since the problem is a multi-class classification task (choosing between drivers_license, passport, and credit_card), you need a loss function designed for multi-class classification. Sparse categorical cross-entropy is the best choice because the labels are integer-encoded (not one-hot encoded), and it is computationally more efficient than categorical cross-entropy when dealing with class indices instead of one-hot vectors. Why not the others?
A (Categorical hinge): a multi-class hinge loss, typically used for SVMs, not neural networks.
B (Binary cross-entropy): for binary classification (two classes), while this problem has three.
C (Categorical cross-entropy): works for multi-class classification but requires one-hot encoded labels, whereas the dataset likely uses integer labels.
upvoted 1 times
...
vishalzade29
2 days, 3 hours ago
Selected Answer: C
Categorical cross-entropy is suitable for multi-class classification problems where each instance belongs to one and only one class. Since you have multiple classes (driver's license, passport, credit card), this loss function is appropriate.
upvoted 1 times
...
arjun2025
4 days, 15 hours ago
Selected Answer: D
For multi-class classification problems, we use sparse categorical cross-entropy.
upvoted 1 times
...
strafer
1 week, 2 days ago
Selected Answer: C
Because you have a multi-class classification problem with mutually exclusive classes and a label map, categorical cross-entropy is the most suitable and commonly used loss function.
upvoted 1 times
...
moammary
2 weeks, 2 days ago
Selected Answer: C
The answer is C. No need to overthink it, as sparse categorical cross-entropy is used for sparse matrices, which is not the case here.
upvoted 1 times
...
kongae
1 month ago
Selected Answer: C
The answer would be D if the label values were integers, but they are strings, so I'll go with C.
upvoted 1 times
...
desertlotus1211
1 month, 2 weeks ago
Selected Answer: D
The label map ['drivers_license', 'passport', 'credit_card'] naturally maps to 0, 1, 2 by machine learning convention, which is exactly what sparse categorical cross-entropy consumes (see the sketch below).
upvoted 1 times
...
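For illustration, a hypothetical sketch of how such a label map is typically turned into the integer labels that sparse categorical cross-entropy expects; the variable names are invented and nothing here is prescribed by the question.

```python
# The label map from the question: positions become integer class indices.
label_map = ['drivers_license', 'passport', 'credit_card']
class_to_index = {name: i for i, name in enumerate(label_map)}
print(class_to_index)  # {'drivers_license': 0, 'passport': 1, 'credit_card': 2}

# Converting string labels to the integer form sparse CE consumes.
string_labels = ['passport', 'drivers_license', 'credit_card']
int_labels = [class_to_index[s] for s in string_labels]
print(int_labels)      # [1, 0, 2]
```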
rajshiv
2 months ago
Selected Answer: C
It is C. D would be appropriate only if the labels were integers, which is not true in this case.
upvoted 1 times
...
joqu
2 months, 2 weeks ago
Selected Answer: D
The question clearly says "You now have to train a model with the following LABEL MAP". Label map is not one-hot encoding.
upvoted 1 times
...
jkkim_jt
3 months, 2 weeks ago
Selected Answer: D
Categorical cross-entropy: for multi-class classification with one-hot encoded labels. Sparse categorical cross-entropy: for multi-class classification with integer index labels.
upvoted 1 times
...
Prakzz
7 months, 1 week ago
Selected Answer: D
C needs the targets to be one-hot encoded already. Since they are not, the answer is D.
upvoted 1 times
...
PhilipKoku
8 months ago
Selected Answer: C
C) Multi-class classification (three or more classes): since you have three classes, you should use a multi-class loss function. The most common choice for multi-class image classification is categorical cross-entropy, which is designed for scenarios where each input belongs to exactly one class (i.e., mutually exclusive classes). Therefore, the correct answer is C, categorical cross-entropy; it is well suited for multi-class classification tasks like this one.
References:
How to Choose Loss Functions When Training Deep Learning Neural Networks (https://machinelearningmastery.com/how-to-choose-loss-functions-when-training-deep-learning-neural-networks/)
Stack Exchange: How to know which loss function is suitable for image classification? (https://datascience.stackexchange.com/questions/58138/how-to-know-which-loss-function-is-suitable-for-image-classification)
upvoted 1 times
...
gscharly
9 months, 2 weeks ago
Selected Answer: C
I'd go with C. Categorical cross-entropy is used when classes are mutually exclusive. If the number of classes were very high, then we could use sparse categorical cross-entropy.
upvoted 1 times
...
pinimichele01
9 months, 4 weeks ago
Selected Answer: D
Use sparse categorical crossentropy when your classes are mutually exclusive (e.g. when each sample belongs exactly to one class) and categorical crossentropy when one sample can have multiple classes or labels are soft probabilities (like [0.5, 0.3, 0.2]).
upvoted 3 times
pinimichele01
9 months, 2 weeks ago
A. Categorical hinge: mainly for SVM soft margins.
B. Binary cross-entropy: for 2 classes only.
C. Categorical cross-entropy: multi-class, but not necessarily mutually exclusive.
D. Sparse categorical cross-entropy: multi-class with mutually exclusive classes only; saves memory too. (See the sketch after this thread.)
upvoted 2 times
...
pinimichele01
9 months, 2 weeks ago
https://www.tensorflow.org/api_docs/python/tf/keras/losses/categorical_crossentropy https://www.tensorflow.org/api_docs/python/tf/keras/metrics/sparse_categorical_crossentropy
upvoted 1 times
...
...
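To round out pinimichele01's comparison above, a hedged sketch (TensorFlow/Keras assumed; the batch is invented) evaluating the candidate losses on one small 3-class batch:

```python
import numpy as np
import tensorflow as tf

y_true_int = np.array([0, 2])                    # integer labels, as option D expects
y_true_onehot = tf.one_hot(y_true_int, depth=3)  # one-hot labels, as options A and C expect
y_pred = np.array([[0.7, 0.2, 0.1],
                   [0.1, 0.3, 0.6]], dtype=np.float32)

losses = {
    'A categorical_hinge': tf.keras.losses.CategoricalHinge()(y_true_onehot, y_pred),
    # B (binary cross-entropy) is defined for 2 classes; applying it per column
    # would wrongly treat the problem as 3 independent yes/no labels.
    'C categorical_crossentropy':
        tf.keras.losses.CategoricalCrossentropy()(y_true_onehot, y_pred),
    'D sparse_categorical_crossentropy':
        tf.keras.losses.SparseCategoricalCrossentropy()(y_true_int, y_pred),
}
for name, value in losses.items():
    print(name, float(value))  # C and D agree exactly; A is a different (hinge) loss
```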
Yan_X
10 months, 1 week ago
Selected Answer: C
C. D is for integer labels instead of one-hot encoded vectors; in our question, the labels 'drivers_license', 'passport', 'credit_card' are one-hot encoded.
upvoted 1 times
...