
Exam AWS Certified AI Practitioner AIF-C01 topic 1 question 73 discussion

A company has documents that are missing some words because of a database error. The company wants to build an ML model that can suggest potential words to fill in the missing text.
Which type of model meets this requirement?

  • A. Topic modeling
  • B. Clustering models
  • C. Prescriptive ML models
  • D. BERT-based models
Suggested Answer: D

Comments

dehkon
Highly Voted 3 months, 3 weeks ago
BERT (Bidirectional Encoder Representations from Transformers) is a language model designed to understand context in text by considering both the left and right sides of a word. BERT-based models are well-suited for filling in missing words in sentences due to their ability to predict masked words in a given text. This makes them ideal for tasks that require filling in missing information within text data.
upvoted 7 times
Jessiii
Most Recent 2 weeks, 6 days ago
Selected Answer: D
D. BERT-based models: BERT (Bidirectional Encoder Representations from Transformers) is a pre-trained language model that can be fine-tuned for various natural language processing tasks, including text completion. BERT-based models are particularly effective at predicting missing words or filling gaps in text because they understand context in both directions (to the left and right of the missing word). This makes them ideal for suggesting potential words to fill in missing text.
upvoted 1 times
may2021_r
2 months ago
Selected Answer: D
Answer: D. BERT-based models. BERT (Bidirectional Encoder Representations from Transformers) uses a masked language modeling approach: it learns to predict missing or "masked" words in a sentence based on the surrounding context. This makes a BERT-based model ideal for suggesting potential words to fill in missing text.
upvoted 1 times
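To make the mechanism the commenters describe concrete, here is a minimal sketch of masked-word prediction with a BERT-based model. It assumes the Hugging Face transformers library (with a PyTorch backend) and the bert-base-uncased checkpoint; the example sentence is illustrative and not part of the original question.

from transformers import pipeline

# The "fill-mask" pipeline uses BERT's masked language modeling head to
# suggest tokens for the [MASK] position based on bidirectional context.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# A sentence from a damaged document, with the missing word replaced by [MASK].
suggestions = fill_mask("The invoice was sent to the [MASK] on Monday.")

# Each suggestion includes the candidate token and its probability score.
for s in suggestions:
    print(f"{s['token_str']}: {s['score']:.3f}")

Each candidate comes back with a confidence score, which is exactly the kind of ranked word suggestion the scenario in the question calls for.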
