Word2vec (Skip-Gram Version) In NLP
How Is Skip-Gram Different From CBOW?

CBOW predicts the centre word from its surrounding context words, while Skip-Gram does the reverse: it takes the centre word as input and predicts each of its context words.
Steps
1. Tokenization
2. Choose window size
3. Create skip-gram training pairs
4. Convert words to vectors (one-hot encoding)
5. Passing data to a neural network
Tokenization

The sentence "I love working with AI" is split into individual tokens:

[I, love, working, with, AI]
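A minimal tokenization sketch for the sentence above, splitting on whitespace (real pipelines also handle punctuation, casing, and so on):

```python
# Simplest possible tokenizer: split the sentence on whitespace.
sentence = "I love working with AI"
tokens = sentence.split()
print(tokens)  # ['I', 'love', 'working', 'with', 'AI']
```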
Choose Window Size

Here a window size of 1 is used: each centre word is paired with the word immediately before it and the word immediately after it.
Create Skip-Gram Training Pairs
| Input (Centre) | Output (Context) |
|---|---|
| love | I |
| love | working |
| working | love |
| working | with |
| with | working |
| with | AI |
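The pairs in the table can be generated programmatically. A short sketch, assuming a window size of 1 (the boundary words `I` and `AI` also produce pairs, which the table omits):

```python
# Generate skip-gram (centre, context) pairs with a window size of 1.
tokens = ["I", "love", "working", "with", "AI"]
window = 1

pairs = []
for i, centre in enumerate(tokens):
    # Context words are the neighbours within `window` positions,
    # excluding the centre word itself.
    for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
        if j != i:
            pairs.append((centre, tokens[j]))

for centre, context in pairs:
    print(centre, "->", context)
```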
Convert Words To Vectors (One-Hot Encoding)
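Each word is represented as a one-hot vector: a vector of zeros with a single 1 at the word's index in the vocabulary. A minimal sketch:

```python
import numpy as np

# Build a word -> index mapping over the vocabulary.
vocab = ["I", "love", "working", "with", "AI"]
word_to_idx = {w: i for i, w in enumerate(vocab)}

def one_hot(word):
    # Vector of zeros with a 1 at the word's vocabulary index.
    vec = np.zeros(len(vocab))
    vec[word_to_idx[word]] = 1.0
    return vec

print(one_hot("working"))  # [0. 0. 1. 0. 0.]
```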
Passing Data To A Neural Network
Each training pair is fed to the network as a one-hot input vector and a one-hot target vector. For example, with "working" as the centre word:

- Input = working, Output = love
- Input = working, Output = with
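A toy forward pass for the skip-gram network, sketched in NumPy (assumed toy dimensions and random weights; a real word2vec implementation also includes a training loop with tricks such as negative sampling):

```python
import numpy as np

rng = np.random.default_rng(0)
V, d = 5, 3                      # vocabulary size, embedding dimension
W_in = rng.normal(size=(V, d))   # input weights: one embedding row per word
W_out = rng.normal(size=(d, V))  # output weights: one score column per word

def forward(centre_idx):
    # Hidden layer is simply the embedding of the centre word
    # (multiplying a one-hot vector by W_in selects one row).
    h = W_in[centre_idx]
    scores = h @ W_out
    # Softmax turns the scores into P(context word | centre word).
    exp = np.exp(scores - scores.max())
    return exp / exp.sum()

probs = forward(2)   # centre word "working" is at index 2
print(probs.round(3))
```

During training, the weights are nudged so that the probabilities of the observed context words (here "love" and "with") increase; the rows of `W_in` become the learned word vectors.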
| Posted By | Karan Gupta |
|---|---|
| Posted On | Wednesday, February 25, 2026 |