A company has documents that are missing some words because of a database error. The company wants to build an ML model that can suggest potential words to fill in the missing text. Which type of model meets this requirement?
**Answer: D. BERT-based models**

BERT (Bidirectional Encoder Representations from Transformers) is a language model designed to understand context by considering both the left and right sides of a word. It is pre-trained with a **masked language modeling** objective: it learns to predict missing or "masked" words in a sentence from the surrounding context. This makes a BERT-based model ideal for suggesting candidate words to fill in the missing text.
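As a minimal sketch of how this works in practice, assuming the Hugging Face `transformers` library is installed (the `fill-mask` pipeline and the `bert-base-uncased` checkpoint are real; the sample sentence is illustrative):

```python
# Suggest candidate words for a masked token with a BERT-based model.
# Assumes: pip install transformers torch
from transformers import pipeline

# The "fill-mask" pipeline wraps BERT's masked-language-modeling head.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# A document with a missing word, marked with BERT's [MASK] token.
text = "The invoice must be paid within thirty [MASK]."

# Returns the top candidate tokens ranked by model confidence.
for suggestion in fill_mask(text, top_k=5):
    print(f"{suggestion['token_str']!r}  score={suggestion['score']:.3f}")
```

Each suggestion includes the predicted token and a confidence score, which is exactly the behavior the question asks for: proposing likely words for gaps in the text.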