
Word2vec (Skip-Gram Version) In NLP






How Is Skip-Gram Different From CBOW?


CBOW predicts a centre word from its surrounding context words, while skip-gram does the reverse: it takes the centre word as input and predicts the context words around it.


Steps




  1. Tokenization
  2. Choose window size
  3. Create skip-gram training pairs
  4. Convert words to vectors (one-hot encoding)
  5. Pass data to a neural network

Tokenization




I love working with AI




[I, love, working, with, AI]
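A minimal sketch of this step, using a simple whitespace split (real pipelines usually also lowercase and strip punctuation):

```python
# Tokenize the example sentence by splitting on whitespace.
sentence = "I love working with AI"
tokens = sentence.split()
print(tokens)  # ['I', 'love', 'working', 'with', 'AI']
```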



Choose Window Size


The window size sets how many words on each side of the centre word count as context. The example below uses a window size of 1, so each centre word is paired with the word immediately to its left and the word immediately to its right.


Create Skip-Gram Training Pairs


Input (Centre)    Output (Context)
love              I
love              working
working           love
working           with
with              working
with              AI
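The pairing step can be sketched in a few lines of Python. Note that this function also produces pairs for the edge words (I and AI), which the table above leaves out:

```python
def skipgram_pairs(tokens, window=1):
    """Pair each centre word with every context word within the window."""
    pairs = []
    for i, centre in enumerate(tokens):
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:
                pairs.append((centre, tokens[j]))
    return pairs

tokens = ["I", "love", "working", "with", "AI"]
for centre, context in skipgram_pairs(tokens, window=1):
    print(centre, "->", context)
```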





Convert Words To Vectors (One-Hot Encoding)


love = [0 1 0 0 0]
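A sketch of the encoding, assuming the vocabulary is ordered as the tokens appear in the sentence, [I, love, working, with, AI]:

```python
vocab = ["I", "love", "working", "with", "AI"]
word_to_index = {word: i for i, word in enumerate(vocab)}

def one_hot(word):
    """Return a vector with 1 at the word's vocabulary index, 0 elsewhere."""
    vec = [0] * len(vocab)
    vec[word_to_index[word]] = 1
    return vec

print(one_hot("love"))  # [0, 1, 0, 0, 0]
```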



Passing Data To A Neural Network


Input  = working
Output = love
Input  = working
Output = with
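A toy NumPy sketch of this training step, under some assumed choices not stated above (2-dimensional embeddings, softmax output, plain gradient descent). The rows of the hidden-layer matrix W1 are the word vectors that skip-gram ultimately learns:

```python
import numpy as np

vocab = ["I", "love", "working", "with", "AI"]
V, D = len(vocab), 2                      # vocabulary size, embedding dimension (assumed)
idx = {w: i for i, w in enumerate(vocab)}

rng = np.random.default_rng(0)
W1 = rng.normal(scale=0.1, size=(V, D))   # input -> hidden: the word embeddings
W2 = rng.normal(scale=0.1, size=(D, V))   # hidden -> output scores

pairs = [("love", "I"), ("love", "working"), ("working", "love"),
         ("working", "with"), ("with", "working"), ("with", "AI")]

lr = 0.1
for epoch in range(200):
    for centre, context in pairs:
        h = W1[idx[centre]]               # hidden layer = centre word's embedding
        scores = h @ W2
        exp = np.exp(scores - scores.max())
        probs = exp / exp.sum()           # softmax over the vocabulary
        grad = probs.copy()
        grad[idx[context]] -= 1.0         # gradient of cross-entropy loss w.r.t. scores
        dh = W2 @ grad                    # backprop into the hidden layer
        W2 -= lr * np.outer(h, grad)
        W1[idx[centre]] -= lr * dh

# After training, each row of W1 is the learned vector for one word.
print(W1[idx["love"]])
```

After enough passes, feeding "love" into the network assigns higher probability to its observed context words ("I", "working") than to words it never appeared next to.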





Posted By  -  Karan Gupta
 
Posted On  -  Wednesday, February 25, 2026
