Exam AI-102 topic 3 question 20 discussion

Actual exam question from Microsoft's AI-102
Question #: 19
Topic #: 3

You are training a Language Understanding model for a user support system.
You create the first intent named GetContactDetails and add 200 examples.
You need to decrease the likelihood of a false positive.
What should you do?

  • A. Enable active learning.
  • B. Add a machine learned entity.
  • C. Add additional examples to the GetContactDetails intent.
  • D. Add examples to the None intent.
Suggested Answer: D

Comments

Isidro
Highly Voted 2 years, 7 months ago
I would say it is D, as per the following: https://docs.microsoft.com/en-us/azure/cognitive-services/language-service/conversational-language-understanding/concepts/none-intent#adding-examples-to-the-none-intent
upvoted 32 times
mk1967
2 years, 4 months ago
Agreed, as stated in the link: "You should also consider adding false positive examples to the None intent."
upvoted 5 times
...
rdemontis
1 year, 2 months ago
Agree with you, and thanks for the documentation provided.
upvoted 3 times
...
...
evangelist
Highly Voted 11 months, 2 weeks ago
A false positive means the model classifies an out-of-scope utterance as "GetContactDetails." The model therefore needs examples of what it should not classify as "GetContactDetails," which is the role of the "None" intent. The most effective approach is to add a diverse range of examples to the "None" intent, covering phrases and queries that fall outside the scope of "GetContactDetails." This creates a clear boundary for the model and reduces the likelihood of it mistakenly classifying unrelated inputs as belonging to the "GetContactDetails" intent.
upvoted 9 times
...
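To make the approach in the highly voted comments concrete, here is a minimal sketch of what adding out-of-scope examples to the None intent could look like in a CLU project's training data. The project structure, intent names, and utterances below are illustrative assumptions, not the exact documented import schema.

```python
# Minimal sketch (assumptions): CLU-style project assets with the
# GetContactDetails intent plus out-of-scope utterances labeled "None".
# The payload shape is illustrative, not the exact documented schema.

project_assets = {
    "projectKind": "Conversation",
    "intents": [
        {"category": "GetContactDetails"},
        {"category": "None"},
    ],
    "utterances": [
        # In-scope examples stay mapped to the real intent.
        {"text": "What is the support phone number?", "intent": "GetContactDetails", "language": "en-us"},
        {"text": "How can I email your billing team?", "intent": "GetContactDetails", "language": "en-us"},
        # Out-of-scope / likely-false-positive examples are labeled "None"
        # so the model learns a boundary around GetContactDetails.
        {"text": "Hello there", "intent": "None", "language": "en-us"},
        {"text": "What time is it?", "intent": "None", "language": "en-us"},
        {"text": "Tell me a joke", "intent": "None", "language": "en-us"},
    ],
}

# After updating the training data, the project would be re-imported and
# retrained through Language Studio or the authoring REST API/SDK.
none_count = sum(u["intent"] == "None" for u in project_assets["utterances"])
print(f"{none_count} None-intent examples in the training set")
```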
HaraTadahisa
Most Recent 6 months, 3 weeks ago
Selected Answer: D
I say the answer is D.
upvoted 1 times
...
reigenchimpo
7 months ago
Selected Answer: D
D is the answer.
upvoted 1 times
...
anto69
7 months, 1 week ago
Selected Answer: D
For me and ChatGPT: D.
upvoted 1 times
...
omankoman
7 months, 2 weeks ago
Selected Answer: D
D. Add examples to the None intent.
upvoted 1 times
...
nanaw770
7 months, 2 weeks ago
Selected Answer: D
It must be D.
upvoted 1 times
...
evangelist
11 months, 3 weeks ago
Selected Answer: D
The correct way to decrease the likelihood of a false positive is to add examples to the None intent, so option D is correct. By adding varied, out-of-scope examples to the None intent, the model can better learn when an utterance does not apply to any valid intent and avoid falsely matching such queries to GetContactDetails. Options A, B, and C may improve the model in other ways, but they do not directly address reducing false positives.
upvoted 3 times
...
rdemontis
1 year, 2 months ago
Selected Answer: D
To me the answer is D. The None intent also serves to reduce false positives. https://docs.microsoft.com/en-us/azure/cognitive-services/language-service/conversational-language-understanding/concepts/none-intent#adding-examples-to-the-none-intent
upvoted 2 times
...
james2033
1 year, 4 months ago
Selected Answer: D
Even with 200 examples, false positives are still likely. The fix is to add training data that shows what the intent is not: add examples to the None intent, not active learning, in this context.
upvoted 1 times
...
zellck
1 year, 6 months ago
Selected Answer: D
D is the answer. https://learn.microsoft.com/en-us/azure/cognitive-services/language-service/conversational-language-understanding/concepts/none-intent#adding-examples-to-the-none-intent

"The None intent is also treated like any other intent in your project. If there are utterances that you want predicted as None, consider adding similar examples to them in your training data. For example, if you would like to categorize utterances that are not important to your project as None, such as greetings, yes and no answers, responses to questions such as providing a number, then add those utterances to your None intent. You should also consider adding false positive examples to the None intent. For example, in a flight booking project it is likely that the utterance "I want to buy a book" could be confused with a Book Flight intent. Adding "I want to buy a book" or "I love reading books" as None training utterances helps alter the predictions of those types of utterances towards the None intent instead of Book Flight."
upvoted 2 times
...
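As a follow-up to the documentation quoted above, a quick way to verify the effect is to send an out-of-scope utterance to the deployed model and check that the top intent is now None rather than GetContactDetails. The sketch below assumes the azure-ai-language-conversations Python package; the endpoint, key, project name, and deployment name are placeholders.

```python
# Sketch: query a deployed CLU project and check the top-scoring intent for an
# out-of-scope utterance. Endpoint, key, project, and deployment are placeholders.
from azure.core.credentials import AzureKeyCredential
from azure.ai.language.conversations import ConversationAnalysisClient

client = ConversationAnalysisClient(
    "https://<your-language-resource>.cognitiveservices.azure.com",
    AzureKeyCredential("<api-key>"),
)

result = client.analyze_conversation(
    task={
        "kind": "Conversation",
        "analysisInput": {
            "conversationItem": {
                "id": "1",
                "participantId": "user1",
                "text": "Tell me a joke",  # out of scope for GetContactDetails
            }
        },
        "parameters": {
            "projectName": "user-support",   # placeholder project name
            "deploymentName": "production",  # placeholder deployment name
        },
    }
)

# With enough None-intent training examples, an utterance like this should
# resolve to "None" instead of "GetContactDetails".
print("Top intent:", result["result"]["prediction"]["topIntent"])
```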
Drummer
1 year, 7 months ago
A. Enable active learning. By enabling active learning, the model can actively request feedback from users when it encounters uncertain or ambiguous queries. This feedback loop helps improve the model's understanding and reduces false positives by incorporating user input into its training process. Option A (Enable active learning) is the correct choice to decrease the likelihood of false positives.
upvoted 1 times
...
ziggy1117
1 year, 7 months ago
Selected Answer: D
You should also consider adding false positive examples to the None intent. For example, in a flight booking project it is likely that the utterance "I want to buy a book" could be confused with a Book Flight intent. Adding "I want to buy a book" or "I love reading books" as None training utterances helps alter the predictions of those types of utterances towards the None intent instead of Book Flight. https://learn.microsoft.com/en-us/azure/cognitive-services/language-service/conversational-language-understanding/concepts/none-intent#adding-examples-to-the-none-intent
upvoted 3 times
...
Marilena96
1 year, 10 months ago
To decrease the likelihood of a false positive, you can add additional examples to the GetContactDetails intent. This will help the model to better understand the intent and reduce the likelihood of false positive predictions.
upvoted 2 times
Rob77
1 year, 7 months ago
Nope, 20-30 examples per intent is recommended. See https://learn.microsoft.com/en-us/azure/cognitive-services/LUIS/concepts/application-design#create-example-utterances-for-each-intent
upvoted 1 times
...
...
ap1234pa
1 year, 11 months ago
Selected Answer: D
As explained in the MS documentation, false positives are handled by adding examples to the None intent.
upvoted 4 times
...
ap1234pa
1 year, 11 months ago
Selected Answer: D
Add examples to "None" intent
upvoted 2 times
...
Community vote distribution: A (35%), C (25%), B (20%), Other