
Word2vec (Skip-Gram Version) In NLP






How Is Skip-Gram Different From CBOW?

CBOW (Continuous Bag of Words) takes the surrounding context words as input and predicts the centre word. Skip-gram does the reverse: it takes the centre word as input and predicts each of the context words around it.

Steps




  • Tokenization
  • Choose window size
  • Create skip-gram training pairs
  • Convert words to vectors (one-hot encoding)
  • Pass data to a neural network

    Tokenization




    I love working with AI




    [I, love, working, with, AI]
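The tokenization step above can be sketched with a simple whitespace split (real tokenizers also handle punctuation and casing, which this example sentence does not need):

```python
# Minimal tokenization sketch: split the example sentence on whitespace.
sentence = "I love working with AI"
tokens = sentence.split()
print(tokens)  # ['I', 'love', 'working', 'with', 'AI']
```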



    Choose Window Size

    The window size controls how many neighbouring words count as context. Here the window size is 1, so for each centre word the model looks one word to the left and one word to the right.

    Create Skip-Gram Training Pairs


    Input (Centre) | Output (Context)
    love           | I
    love           | working
    working        | love
    working        | with
    with           | working
    with           | AI
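The pair-generation step can be sketched as below. With window size 1 it produces the pairs in the table, plus the pairs for the two edge words (I and AI), which also have one neighbour each:

```python
# Sketch: generate (centre, context) training pairs for skip-gram.
def skip_gram_pairs(tokens, window=1):
    pairs = []
    for i, centre in enumerate(tokens):
        # Look up to `window` positions left and right of the centre word.
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:
                pairs.append((centre, tokens[j]))
    return pairs

tokens = ["I", "love", "working", "with", "AI"]
for centre, context in skip_gram_pairs(tokens):
    print(centre, "->", context)
```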





    Convert Words To Vectors (One-Hot Encoding)

    With the vocabulary [I, love, working, with, AI], each word becomes a vector of length 5 with a 1 at its own index and 0 everywhere else. Since "love" is at index 1:

    love = [0 1 0 0 0]
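A minimal sketch of the one-hot encoding, using the five-word vocabulary from the example:

```python
# One-hot encoding sketch: each word maps to a vector with a 1 at its index.
vocab = ["I", "love", "working", "with", "AI"]
word_to_index = {w: i for i, w in enumerate(vocab)}

def one_hot(word):
    vec = [0] * len(vocab)
    vec[word_to_index[word]] = 1
    return vec

print(one_hot("love"))  # [0, 1, 0, 0, 0]
```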



    Passing Data To A Neural Network

    Input = working, Output = love
    Input = working, Output = with
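A hypothetical sketch of the skip-gram network itself (the embedding dimension of 3 and the random initialisation are assumptions for illustration): the one-hot input selects one row of the input weight matrix (that row is the word's embedding), the output layer scores every vocabulary word, and a softmax turns the scores into a probability for each candidate context word.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab = ["I", "love", "working", "with", "AI"]
V, D = len(vocab), 3             # vocabulary size, embedding dimension (assumed)
W_in = rng.normal(size=(V, D))   # input -> hidden weights (the embeddings)
W_out = rng.normal(size=(D, V))  # hidden -> output weights

def forward(centre_index):
    hidden = W_in[centre_index]          # embedding lookup (one-hot @ W_in)
    scores = hidden @ W_out              # one score per vocabulary word
    exp = np.exp(scores - scores.max())  # numerically stable softmax
    return exp / exp.sum()

# Predicted probability of each context word given the centre word "working";
# training would push probability mass toward "love" and "with".
probs = forward(vocab.index("working"))
for word, p in zip(vocab, probs):
    print(f"P({word} | working) = {p:.3f}")
```

After training, the rows of `W_in` are the word vectors that word2vec is ultimately after; the output layer is only a training device.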





    Posted By  -  Karan Gupta
     
    Posted On  -  Wednesday, February 25, 2026
