
Hebbian learning


<artificial intelligence> The most common way to train a neural network; a kind of unsupervised learning; named after the Canadian neuropsychologist Donald O. Hebb.

The algorithm is based on Hebb's Postulate, which states that where one cell's firing repeatedly contributes to the firing of another cell, the magnitude of this contribution will tend to increase gradually with time. This means that what may start as little more than a coincidental relationship between the firing of two nearby neurons becomes strongly causal.
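For illustration, a minimal sketch of the plain Hebbian update rule in Python. The function name hebbian_update, the learning rate eta, and the toy activation vectors are assumptions made for this sketch, not part of the entry above.

    import numpy as np

    def hebbian_update(W, x, y, eta=0.1):
        # Plain Hebbian rule: dW[i, j] = eta * y[i] * x[j] -- connections
        # between cells that are active at the same time are strengthened.
        return W + eta * np.outer(y, x)

    # Two cells that repeatedly fire together: the weight linking them grows,
    # while the weight from the silent input stays at zero.
    W = np.zeros((1, 2))            # one postsynaptic cell, two presynaptic cells
    for _ in range(10):
        x = np.array([1.0, 0.0])    # first input fires, second is silent
        y = np.array([1.0])         # postsynaptic cell fires at the same time
        W = hebbian_update(W, x, y)
    print(W)                        # roughly [[1. 0.]]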

Despite the limitations of Hebbian learning (e.g., its inability to learn certain patterns), variations such as Signal Hebbian Learning and Differential Hebbian Learning are still used; a sketch of both follows.
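The sketches below are illustrative discretisations of forms commonly attributed to these two variants (weight decay toward the product of sigmoid-squashed signals for Signal Hebbian Learning, and correlation of the changes in those signals for Differential Hebbian Learning). Exact formulations vary by author, and the function and parameter names here are assumptions, not taken from the entry above.

    import numpy as np

    def sigmoid(a):
        return 1.0 / (1.0 + np.exp(-a))

    def signal_hebbian_update(w, x, y, eta=0.1):
        # Signal Hebbian learning (discretised sketch): the weight decays
        # toward the product of the sigmoid-squashed pre- and postsynaptic
        # activations.
        return w + eta * (-w + sigmoid(x) * sigmoid(y))

    def differential_hebbian_update(w, x_prev, x, y_prev, y, eta=0.1):
        # Differential Hebbian learning (discretised sketch): correlate the
        # *changes* in the signals, so only activity that varies together
        # strengthens the weight.
        dx = sigmoid(x) - sigmoid(x_prev)
        dy = sigmoid(y) - sigmoid(y_prev)
        return w + eta * (-w + dx * dy)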

(http://neuron-ai.tuke.sk/NCS/VOL1/P3_html/node14.html).

(2003-11-07)

