neural network - Differential Hebbian Learning: why does "-wij" appear in the equation?


Hebb's law states that if the two neurons on either side of a connection are activated synchronously (or asynchronously), the weight of that connection is increased (or decreased).
I saw this common equation:

\dot{w}_{ij}(t) = -w_{ij}(t) + \dot{x}_i(t)\,\dot{x}_j(t)

I cannot understand why the w_{ij} value appears on the right-hand side of the equation.
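
In discrete time I read the rule as something like the sketch below (my own illustration; dt and the variable names are assumptions on my part, not from the source where I saw the equation):

def update_weight(w_ij, xdot_i, xdot_j, dt=0.01):
    """One Euler step of  dw_ij/dt = -w_ij + xdot_i * xdot_j."""
    # xdot_i, xdot_j: time derivatives of the pre- and post-synaptic activations
    return w_ij + dt * (-w_ij + xdot_i * xdot_j)  # the -w_ij part is what puzzles me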

Thanks in advance, and sorry for my poor English.

In standard Hebbian learning, a term proportional to the current weight can be subtracted to limit infinite weight growth (a known problem with Hebbian learning); it is usually multiplied by a forgetting rate. I believe the role of the weight-decay term is similar in the equation you provided, although I am not sure it is strictly needed here, since the product of the two derivative terms can already take either a positive or a negative sign.
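
To illustrate the bounding effect, here is a small simulation sketch (my own, using made-up signals; none of the names come from the original question). With the -w_ij term the weight stays bounded, tracking a running average of the derivative product, while without it the weight simply integrates that product and grows with time:

import math

dt = 0.001
T = 20.0
n = int(T / dt)

w_decay = 0.0   # dw/dt = -w + xdot_i * xdot_j
w_plain = 0.0   # dw/dt =      xdot_i * xdot_j  (no decay term)

prev_i = prev_j = 0.0           # x_i(0) = x_j(0) = sin(0) = 0
for k in range(1, n + 1):
    t = k * dt
    x_i = math.sin(t)           # hypothetical pre-synaptic activation
    x_j = math.sin(t)           # identical post-synaptic activation,
                                # so the derivatives are positively correlated
    xdot_i = (x_i - prev_i) / dt
    xdot_j = (x_j - prev_j) / dt
    w_decay += dt * (-w_decay + xdot_i * xdot_j)
    w_plain += dt * (xdot_i * xdot_j)
    prev_i, prev_j = x_i, x_j

print(f"with the -w term:    {w_decay:.3f}")  # stays near the mean of xdot_i*xdot_j (~0.5)
print(f"without the -w term: {w_plain:.3f}")  # grows roughly like 0.5 * T (~10 here)

In other words, the -w_ij term turns the update into a leaky integrator: w_ij tracks an exponentially weighted average of the product of the derivatives rather than its raw, unbounded integral.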

