Word Embeddings QUIZ (MCQ QUESTIONS AND ANSWERS)

Question: 1

What is "ELMo" in the context of word embeddings?

Question: 2

What is the "bag of words" model?

Question: 3

What is "concreteness rating" in word embeddings?

Question: 4

What is "hyperparameter tuning" in word embeddings training?

Question: 5

What is "conceptual analogy" in word embeddings evaluation?

Question: 6

What is "word vector composition" in word embeddings?

Question: 7

What is "document embedding" in word representations?

Question: 8

What is the "context window" in Word2Vec?

Question: 9

What is "polysemy" in word embeddings?

Question: 10

What is "word sense disambiguation" in the context of word embeddings?

Question: 11

What is the primary limitation of word embeddings like Word2Vec and GloVe?

Question: 12

What is the "embedding dimension" in word embeddings?

Question: 13

What is the primary goal of "fine-tuning" in contextual word embeddings like BERT?

Question: 14

What is "word vector analogy" in word embeddings evaluation?

Question: 15

What is "BERT" in word embeddings and language understanding?

Question: 16

What is "contextual word embeddings" in word representations?

Question: 17

What is "t-SNE" in word embeddings visualization?

Question: 18

What is the "Word Mover's Distance" (WMD) in word embeddings?

Question: 19

What is the primary advantage of subword embeddings in word representations?

Question: 20

In word embeddings, what is the "word similarity" task used for evaluation?
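
Word-similarity evaluation checks how well a model's similarity scores agree with human ratings, typically via Spearman correlation. A sketch with gensim and SciPy; the ratings below are invented stand-ins, not a real benchmark like WordSim-353:

```python
# Word-similarity evaluation sketch: correlate model cosine similarities
# with human ratings. The ratings below are invented stand-ins.
import gensim.downloader as api
from scipy.stats import spearmanr

wv = api.load("glove-wiki-gigaword-50")
pairs = [("car", "automobile"), ("coast", "shore"), ("noon", "string")]
human = [9.2, 8.7, 0.5]  # invented ratings on a 0-10 scale

model_scores = [wv.similarity(a, b) for a, b in pairs]
print(spearmanr(human, model_scores).correlation)  # close to 1.0 = good agreement
```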

Question: 21

What is the primary goal of "fastText" in word embeddings?
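
fastText builds word vectors from character n-grams, so it can compose a vector even for a word absent from training, which is also the key advantage of subword embeddings asked about above. A sketch with gensim's FastText on a toy corpus (an assumption):

```python
# fastText sketch with gensim: character n-grams let the model compose a
# vector for a word never seen in training. Toy corpus is an assumption.
from gensim.models import FastText

sentences = [["embedding", "vectors", "represent", "words"],
             ["subword", "units", "handle", "rare", "words"]]
model = FastText(sentences, vector_size=32, window=3, min_count=1, epochs=50)

print(model.wv["embeddings"][:5])  # out-of-vocabulary word, built from n-grams
```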

Question: 22

What does "GloVe" stand for in the context of word embeddings?

Question: 23

Which of the following word embedding techniques is based on matrix factorization?

Question: 24

What is "continuous bag of words" (CBOW) in Word2Vec?

Question: 25

In Word2Vec, what is the "window size" parameter used for?

Question: 26

What is the "skip-gram" model in Word2Vec?

Question: 27

Which word embedding technique uses a neural network to learn word representations?

Question: 28

In word embeddings, what does it mean for words with similar meanings to have similar vectors?

Question: 29

What are word embeddings in natural language processing?

Question: 30

What is the primary advantage of word embeddings compared to one-hot encoding?
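
One-hot vectors are vocabulary-sized and mutually orthogonal, so every pair of distinct words looks equally unrelated; dense embeddings are compact and can encode relatedness. A NumPy sketch with hand-picked toy vectors (the values are illustrative assumptions):

```python
# Sketch: one-hot vectors carry no similarity signal; dense embeddings
# can. Toy values are illustrative assumptions.
import numpy as np

vocab = ["cat", "dog", "car"]
one_hot = np.eye(len(vocab))      # 3 x 3 matrix; grows with vocabulary size
print(one_hot[0] @ one_hot[1])    # 0.0: "cat" and "dog" look unrelated

emb = {"cat": np.array([0.8, 0.6]),
       "dog": np.array([0.7, 0.7]),
       "car": np.array([-0.9, 0.1])}
cos = lambda a, b: a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
print(cos(emb["cat"], emb["dog"]))  # high: related words
print(cos(emb["cat"], emb["car"]))  # negative: unrelated words
```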