
Word2vec (CBOW Version) In NLP






Purpose

CBOW (Continuous Bag of Words) learns word embeddings by predicting a word from the words around it:

  1. Input → Context words (the words surrounding the target)
  2. Output → Target word (the centre word)

Steps




  1. Tokenization
  2. Choose window size
  3. Create CBOW training pairs
  4. Convert words to vectors (one-hot encoding)
  5. Pass the data to a neural network

Tokenization

Tokenization splits the raw sentence into individual words (tokens). For the sentence:

I love working with AI

the token list is:

[I, love, working, with, AI]
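The tokenization step above can be sketched with a minimal whitespace tokenizer (real pipelines usually also lowercase and strip punctuation; this helper name is illustrative, not from the article):

```python
def tokenize(sentence):
    # Split on whitespace; each word becomes one token.
    return sentence.split()

tokens = tokenize("I love working with AI")
print(tokens)  # ['I', 'love', 'working', 'with', 'AI']
```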



Choose Window Size

The window size controls how many words on each side of the target are treated as context. This example uses a window size of 1, so the context is at most one word to the left and one word to the right of the target.

Create CBOW Training Pairs

Context (Input)    Target (Output)
[love]             I
[I, working]       love
[love, with]       working
[working, AI]      with
[with]             AI
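The sliding-window pairing above can be sketched as follows (the function name and signature are assumptions for illustration):

```python
def cbow_pairs(tokens, window=1):
    # For each position i, the context is up to `window` words on each
    # side of the target; slices clip naturally at the sentence edges.
    pairs = []
    for i, target in enumerate(tokens):
        context = tokens[max(0, i - window):i] + tokens[i + 1:i + 1 + window]
        pairs.append((context, target))
    return pairs

tokens = ["I", "love", "working", "with", "AI"]
for context, target in cbow_pairs(tokens):
    print(context, "->", target)
```

Running this reproduces the table: the first pair is (['love'], 'I') and the last is (['with'], 'AI').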



Convert Words To Vectors (One-Hot Encoding)

Each word in the five-word vocabulary becomes a vector with a 1 at its own index and 0 everywhere else:

I       → [1,0,0,0,0]
love    → [0,1,0,0,0]
working → [0,0,1,0,0]
with    → [0,0,0,1,0]
AI      → [0,0,0,0,1]
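A one-hot encoder for this vocabulary can be sketched like so (helper name is an assumption; vocabularies in practice are dicts mapping word to index):

```python
def one_hot(word, vocab):
    # Vector of zeros with a single 1 at the word's vocabulary index.
    vec = [0] * len(vocab)
    vec[vocab.index(word)] = 1
    return vec

vocab = ["I", "love", "working", "with", "AI"]
print(one_hot("working", vocab))  # [0, 0, 1, 0, 0]
```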



Passing Data To A Neural Network

Take one training pair, with "working" as the target and its neighbours as the context:

Target  = "working"
Context = ["love", "with"]

Look up the one-hot vectors of the context words:

love → [0,1,0,0,0]
with → [0,0,0,1,0]

Average the context vectors to form the network input:

([0,1,0,0,0] + [0,0,0,1,0]) / 2 = [0, 0.5, 0, 0.5, 0]

The network is then trained so that its output matches the one-hot vector of the target word:

working → [0,0,1,0,0]
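The forward pass this step describes can be sketched with NumPy. The layer sizes, weight names, and random initialisation below are assumptions for illustration; after training, the rows of W_in become the learned word embeddings:

```python
import numpy as np

V, N = 5, 3  # vocabulary size, embedding dimension (N=3 is arbitrary)
rng = np.random.default_rng(0)
W_in = rng.normal(size=(V, N))   # input -> hidden weights (word embeddings)
W_out = rng.normal(size=(N, V))  # hidden -> output weights

# Averaged one-hot context for ["love", "with"], as computed above.
x = np.array([0, 0.5, 0, 0.5, 0])

h = x @ W_in        # hidden layer: the average of the context embeddings
scores = h @ W_out  # one raw score per vocabulary word
probs = np.exp(scores) / np.exp(scores).sum()  # softmax over the vocabulary

# Training adjusts W_in and W_out so probs[2] (the index of "working")
# moves toward 1, matching the one-hot target [0,0,1,0,0].
print(probs.round(3))
```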





Posted By  -  Karan Gupta
 
Posted On  -  Friday, February 20, 2026
