Weight Update And A Bias Update In A Single-Layer Perceptron Model
Why Update Weight?
A perceptron learns by adjusting its weights whenever its prediction does not match the true label. Each update nudges the decision boundary so that the misclassified example is scored more correctly on the next pass.
Formula Of Weight Update
wi = wi + η · (y − ŷ) · xi
- wi → weight of the ith input
- η → learning rate (small constant like 0.01 or 0.1)
- y → actual/true label
- ŷ → predicted output of the perceptron
- xi → input feature value
Example
- Input x = [2,3]
- Weights w = [0.5,0.2]
- Actual y=1
- Predicted ŷ = 0
- Learning rate 0.1
With these values the error is y − ŷ = 1 − 0 = 1, so each weight increases in proportion to its input: w1 = 0.5 + 0.1·1·2 = 0.7 and w2 = 0.2 + 0.1·1·3 = 0.5.
Formula For Bias Update
b = b + η · (y − ŷ)
The bias has no associated input feature, so its update is the same rule with the xi term omitted.
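The weight and bias update rules above can be sketched in a few lines of Python using the example values. The article does not give an initial bias, so 0.0 is assumed here purely for illustration:

```python
# One perceptron update step on the example from the article.
w = [0.5, 0.2]   # initial weights
b = 0.0          # initial bias (assumed; not given in the article)
x = [2, 3]       # input features
y = 1            # actual/true label
y_hat = 0        # perceptron's predicted output
eta = 0.1        # learning rate

error = y - y_hat  # 1 - 0 = 1

# wi = wi + eta * (y - y_hat) * xi, applied to every weight
w = [wi + eta * error * xi for wi, xi in zip(w, x)]

# b = b + eta * (y - y_hat): same rule, no input term
b = b + eta * error

print(w)  # [0.7, 0.5]
print(b)  # 0.1
```

Because the error is positive (the perceptron predicted 0 when the label was 1), both weights and the bias move upward; had the prediction been too high, the same rule would have decreased them.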
Posted By: Karan Gupta
Posted On: Monday, March 23, 2026