Lecture 3 Rabbit Hole

word2vec

Word2vec is a group of related methods that use a shallow neural network to learn word associations from a large corpus of text. The model takes a corpus as input and outputs a high-dimensional vector space in which each word is mapped to a point (its embedding). Words are arranged so that proximity in the space reflects some notion of context, for example similarity of meaning.
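A minimal sketch of the "proximity reflects similarity" idea, using hand-made toy vectors (not trained embeddings) and cosine similarity, which is the usual distance measure in these spaces:

```python
import math

def cosine(u, v):
    # Cosine similarity: 1.0 means same direction, 0.0 means orthogonal
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Toy 3-dimensional vectors chosen by hand for illustration; a real
# word2vec model would learn vectors with hundreds of dimensions.
vectors = {
    "king":  [0.90, 0.80, 0.10],
    "queen": [0.85, 0.82, 0.15],
    "apple": [0.10, 0.20, 0.90],
}

# Related words end up near each other, unrelated words far apart:
cosine(vectors["king"], vectors["queen"])  # high similarity
cosine(vectors["king"], vectors["apple"])  # low similarity
```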

softmax

A function that turns a vector of raw scores into a probability distribution (non-negative values that sum to 1). Word2vec uses it in its output layer during training, converting similarity scores over the vocabulary into word probabilities.
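A short sketch of softmax in plain Python, with the standard max-subtraction trick for numerical stability:

```python
import math

def softmax(scores):
    # Subtract the max score before exponentiating so exp() cannot overflow;
    # this shift does not change the resulting probabilities.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax([2.0, 1.0, 0.1])
# probs sums to 1, and larger scores get larger probabilities
```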

CBOW and skip-gram

These are the two architecture variants of word2vec. Continuous bag of words (CBOW) predicts the current word from the words in its surrounding context window; skip-gram does the reverse, predicting the context words from the current word.
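A sketch of how the two variants turn the same sentence into training examples (the pairing logic only, under the assumption of a simple symmetric window; no model training here):

```python
def skipgram_pairs(tokens, window=2):
    # Skip-gram: one (center, context) pair per context word;
    # the model predicts each context word from the center word.
    pairs = []
    for i, center in enumerate(tokens):
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:
                pairs.append((center, tokens[j]))
    return pairs

def cbow_examples(tokens, window=2):
    # CBOW: one (context list, center) example per position;
    # the model predicts the center word from its whole context.
    examples = []
    for i, center in enumerate(tokens):
        context = [tokens[j]
                   for j in range(max(0, i - window), min(len(tokens), i + window + 1))
                   if j != i]
        examples.append((context, center))
    return examples

sentence = "the quick brown fox".split()
skipgram_pairs(sentence, window=1)
# [('the', 'quick'), ('quick', 'the'), ('quick', 'brown'),
#  ('brown', 'quick'), ('brown', 'fox'), ('fox', 'brown')]
```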