Deep Learning Illustrated: Building Natural Language Processing Models
Domino Data Lab
AUGUST 22, 2019
When running word2vec, you can choose between two underlying model architectures: skip-gram (SG) and continuous bag of words (CBOW, pronounced "see-bo"). Either will typically produce roughly comparable results, despite maximizing probabilities from "opposite" perspectives: skip-gram predicts the surrounding context words given a target word, while CBOW predicts the target word given its surrounding context. (Note: Mikolov, T., et al., "Efficient Estimation of Word Representations in Vector Space," arXiv:1301.3781.)
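To make the "opposite perspectives" concrete, here is a minimal sketch (not word2vec itself, and with hypothetical helper names) of how the two architectures turn the same sliding window over a sentence into training pairs: skip-gram emits (target, context-word) pairs, while CBOW emits (context-words, target) pairs.

```python
def sg_pairs(tokens, window=2):
    """Skip-gram: one training pair per (target, context word)."""
    pairs = []
    for i, target in enumerate(tokens):
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:
                pairs.append((target, tokens[j]))
    return pairs

def cbow_pairs(tokens, window=2):
    """CBOW: one training pair per (all context words, target)."""
    pairs = []
    for i, target in enumerate(tokens):
        context = tuple(
            tokens[j]
            for j in range(max(0, i - window), min(len(tokens), i + window + 1))
            if j != i
        )
        pairs.append((context, target))
    return pairs

sentence = "the quick brown fox jumps".split()
print(sg_pairs(sentence)[:3])   # → [('the', 'quick'), ('the', 'brown'), ('quick', 'the')]
print(cbow_pairs(sentence)[0])  # → (('quick', 'brown'), 'the')
```

Note that both functions walk the identical windows; only the direction of prediction differs, which is why the two architectures tend to learn comparable embeddings.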