Word2Vec
Introduction: What is word2vec?
Motivation: Why word2vec?
Word2vec models: Continuous bag-of-words; Skip-gram model
Demo
What is Word2Vec?
Introduced by Google in 2013
Computes vector representations of words
Word meanings and relationships between words are encoded spatially
Learns from input text
Motivation
Images and speech are easily represented as vectors. What about text?
Word2vec learns word embeddings: it converts words into meaningful vectors.
It trains a neural network with a single hidden layer to perform an auxiliary prediction task.
It doesn't use the neural network's output; instead, the learned weights serve as the vector representations.
Contextual Representation
I eat an apple every day.
I eat an orange every day.
I like driving my car to work.
"apple" and "orange" appear in nearly identical contexts, so they receive similar vectors; "car" appears in a different context and ends up farther away.
Ref: https://docs.google.com/presentation/d/1yQWN1CDWLzxGeIAvnGgDsIJr5xmy4dB0VmHFKkLiibo/edit#slide=id.ge77999220_0_15
Word vectors
Words become points in a continuous space where distance reflects similarity of meaning.
Learning Algorithms
Continuous bag-of-words (CBOW)
Continuous skip-gram
Continuous Bag-of-Words
CBOW predicts the target (center) word from its surrounding context words.
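As a concrete illustration of the CBOW setup, here is a minimal sketch of how (context, target) training pairs can be generated; the corpus and window size are assumptions for illustration, not from the slides.

```python
# Hypothetical illustration: CBOW training pairs (context -> target).
corpus = "i eat an apple every day".split()
window = 2

pairs = []
for i, target in enumerate(corpus):
    # Context = up to `window` words on each side of the target word.
    context = [corpus[j]
               for j in range(max(0, i - window), min(len(corpus), i + window + 1))
               if j != i]
    pairs.append((context, target))

for context, target in pairs:
    print(context, "->", target)
# e.g. ['i', 'eat', 'apple', 'every'] -> 'an'   (context predicts the center word)
```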
Continuous skip-gram
Skip-gram inverts the task: it predicts the surrounding context words from the target (center) word.
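Mirroring the CBOW sketch above, this is how skip-gram training pairs can be generated; again, the corpus and window size are illustrative assumptions.

```python
# Hypothetical illustration: skip-gram training pairs (target -> context).
corpus = "i eat an apple every day".split()
window = 2

pairs = []
for i, target in enumerate(corpus):
    for j in range(max(0, i - window), min(len(corpus), i + window + 1)):
        if j != i:
            # Each (center, context) pair is one training example.
            pairs.append((target, corpus[j]))

print(pairs[:4])
# [('i', 'eat'), ('i', 'an'), ('eat', 'i'), ('eat', 'an')]
```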
Hidden Layer
The hidden layer is linear (no activation function): multiplying a one-hot input vector by the input weight matrix simply selects one row of that matrix.
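A small numpy sketch of that lookup; the vocabulary and embedding size are toy assumptions.

```python
import numpy as np

# Toy sizes (assumed for illustration): 6-word vocabulary, 3-dimensional embeddings.
vocab = ["i", "eat", "an", "apple", "every", "day"]
V, N = len(vocab), 3
rng = np.random.default_rng(0)
W_in = rng.normal(size=(V, N))   # input-to-hidden weight matrix

# One-hot input for "apple". Since the hidden layer has no activation function,
# its output is just the matching row of W_in.
x = np.zeros(V)
x[vocab.index("apple")] = 1.0
hidden = x @ W_in

assert np.allclose(hidden, W_in[vocab.index("apple")])
```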
How do we get word vectors?
After training, each row of the input-to-hidden weight matrix is the word vector for the corresponding vocabulary word.
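A sketch of using the weight matrix as a lookup table; the weights here are random stand-ins for trained ones, so only the lookup pattern is meaningful.

```python
import numpy as np

vocab = ["i", "eat", "an", "apple", "every", "day"]
rng = np.random.default_rng(0)
W_in = rng.normal(size=(len(vocab), 3))   # would be learned during training

def vec(word):
    # A word's vector is simply its row in the input weight matrix.
    return W_in[vocab.index(word)]

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# With real training, words used in similar contexts score close to 1.
print(cosine(vec("apple"), vec("day")))
```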
Output layer
The output layer applies a softmax over the entire vocabulary, producing the probability of each word being the predicted one.
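A minimal sketch of that softmax step; sizes are illustrative assumptions.

```python
import numpy as np

# Assumed toy sizes: vocabulary of 6, 3-dimensional hidden layer.
V, N = 6, 3
rng = np.random.default_rng(0)
W_out = rng.normal(size=(N, V))   # hidden-to-output weight matrix
hidden = rng.normal(size=N)       # hidden-layer activation for some input word

scores = hidden @ W_out
probs = np.exp(scores - scores.max())
probs /= probs.sum()              # softmax: one probability per vocabulary word

print(probs, probs.sum())         # probabilities sum to 1.0
```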
How is this different?
Unlike a typical classifier, the network is not kept for its predictions: after training, the output layer is discarded and the hidden-layer weights become the word vectors.
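Putting the pieces together, here is a minimal end-to-end skip-gram sketch under simplifying assumptions (tiny corpus, full softmax, plain gradient descent); real word2vec speeds this up with negative sampling or hierarchical softmax.

```python
import numpy as np

corpus = "i eat an apple every day i eat an orange every day".split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}
V, N, window, lr = len(vocab), 10, 2, 0.05

rng = np.random.default_rng(0)
W_in = rng.normal(scale=0.1, size=(V, N))    # rows become the word vectors
W_out = rng.normal(scale=0.1, size=(N, V))   # discarded after training

pairs = [(idx[corpus[i]], idx[corpus[j]])
         for i in range(len(corpus))
         for j in range(max(0, i - window), min(len(corpus), i + window + 1))
         if j != i]

for epoch in range(200):
    for center, context in pairs:
        h = W_in[center]                      # hidden layer = row lookup
        scores = h @ W_out
        p = np.exp(scores - scores.max())
        p /= p.sum()                          # softmax over the vocabulary
        err = p.copy()
        err[context] -= 1.0                   # gradient of cross-entropy loss
        grad_h = W_out @ err
        W_out -= lr * np.outer(h, err)
        W_in[center] -= lr * grad_h

print(W_in[idx["apple"]])   # the learned vector for "apple"
```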
Demo: https://ronxin.github.io/wevi/ (wevi, a browser-based word embedding visual inspector)
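Beyond the browser demo, a quick way to experiment locally is gensim's word2vec implementation; this snippet is not from the slides, and the parameters are illustrative (gensim 4.x API).

```python
from gensim.models import Word2Vec

# Tiny illustrative corpus (assumed), tokenized as lists of words.
sentences = [
    ["i", "eat", "an", "apple", "every", "day"],
    ["i", "eat", "an", "orange", "every", "day"],
]
model = Word2Vec(sentences, vector_size=20, window=2, min_count=1, sg=1)  # sg=1: skip-gram
print(model.wv.most_similar("apple", topn=3))
```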