
Weight Update And A Bias Update In A Single-Layer Perceptron Model






Why Update Weight?

A perceptron learns by adjusting its weights whenever its prediction disagrees with the actual label. Each update nudges the weights in the direction that reduces the error, so the next prediction on the same input moves closer to the true output. When the prediction is already correct, the error term y − ŷ is zero and the weights are left unchanged.
Formula Of Weight Update

wi = wi + η (y − ŷ) xi

where:


  1. wi → weight of the ith input
  2. η → learning rate (a small constant such as 0.01 or 0.1)
  3. y → actual/true label
  4. ŷ → predicted output of the perceptron
  5. xi → input feature value
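The rule above can be sketched in a few lines of NumPy (the function and variable names here are my own, chosen for illustration):

```python
import numpy as np

def perceptron_weight_update(w, x, y, y_hat, eta=0.1):
    """Apply w_i <- w_i + eta * (y - y_hat) * x_i to every weight at once."""
    w = np.asarray(w, dtype=float)
    x = np.asarray(x, dtype=float)
    return w + eta * (y - y_hat) * x

# A correct prediction gives (y - y_hat) = 0, so the weights do not move.
print(perceptron_weight_update([0.5, 0.2], [2, 3], y=1, y_hat=1))
```

Because the whole vector is updated with one broadcasted expression, the same function works for any number of input features.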

Example




  1. Input x = [2, 3]
  2. Weights w = [0.5, 0.2]
  3. Actual label y = 1
  4. Predicted output ŷ = 0
  5. Learning rate η = 0.1


Applying the rule with error y − ŷ = 1 − 0 = 1:

w1 = 0.5 + 0.1 × 1 × 2 = 0.7
w2 = 0.2 + 0.1 × 1 × 3 = 0.5

Updated weights: w = [0.7, 0.5]

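The example's arithmetic can be reproduced with a short, self-contained check (assuming the standard perceptron rule; `round` is used only to keep floating-point noise out of the printed result):

```python
# Numbers from the example: x = [2, 3], w = [0.5, 0.2], y = 1, y_hat = 0, eta = 0.1
eta = 0.1
x = [2, 3]
w = [0.5, 0.2]
y, y_hat = 1, 0

# Apply w_i <- w_i + eta * (y - y_hat) * x_i to each weight
w_new = [round(wi + eta * (y - y_hat) * xi, 10) for wi, xi in zip(w, x)]
print(w_new)  # [0.7, 0.5]
```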

Formula For Bias Update


b = b + η (y − ŷ)

The bias follows the same rule as the weights but has no input term (equivalently, its input is fixed at 1). For the example above, the bias would increase by 0.1 × (1 − 0) = 0.1.


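Combining the weight and bias update rules gives a complete training loop. The sketch below is my own illustration (dataset, names, and the step activation are assumptions, not taken from the article), training a single-layer perceptron on the AND function:

```python
import numpy as np

def train_perceptron(X, y, eta=0.1, epochs=20):
    """Train weights w and bias b with the perceptron update rules."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            y_hat = 1 if np.dot(w, xi) + b > 0 else 0  # step activation
            error = yi - y_hat
            w += eta * error * xi   # w_i <- w_i + eta * (y - y_hat) * x_i
            b += eta * error        # b   <- b   + eta * (y - y_hat)
    return w, b

# AND gate: output 1 only when both inputs are 1
X = [[0, 0], [0, 1], [1, 0], [1, 1]]
y = [0, 0, 0, 1]
w, b = train_perceptron(X, y)
preds = [1 if np.dot(w, xi) + b > 0 else 0 for xi in X]
print(preds)  # [0, 0, 0, 1]
```

AND is linearly separable, so the perceptron convergence theorem guarantees this loop settles on weights that classify all four inputs correctly.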



Posted By  -  Karan Gupta
 
Posted On  -  Monday, March 23, 2026
