What is the primary purpose of the term "word embedding" in natural language processing (NLP)?
Representing words as sparse binary vectors
Encoding words into numerical vectors with continuous values
Tokenizing sentences into individual words
Reducing the dimensionality of word representations
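A word embedding maps each word to a dense vector of continuous values, in contrast to a sparse binary (one-hot) representation. A minimal sketch of this distinction, using NumPy with a hypothetical toy vocabulary and randomly initialized vectors standing in for learned embeddings:

```python
import numpy as np

# Hypothetical toy vocabulary and embedding dimension (illustrative only).
vocab = ["king", "queen", "apple"]
dim = 4

# An embedding table holds one dense, continuous-valued vector per word.
# Real embeddings are learned during training; random values stand in here.
rng = np.random.default_rng(0)
embeddings = {word: rng.normal(size=dim) for word in vocab}

# Contrast: a sparse one-hot encoding is mostly zeros and binary,
# and carries no notion of similarity between words.
one_hot = {word: np.eye(len(vocab))[i] for i, word in enumerate(vocab)}

print(embeddings["king"])   # dense vector of continuous floats
print(one_hot["king"])      # sparse binary vector
```

The lookup-table structure here mirrors how an embedding layer works in practice: the word index selects a row of a trainable matrix.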
This question is part of the quiz:
Data Science Quiz