Rosenblatt describes multiple algorithms for training this model, all clearly influenced by Hebb’s theory of learning in real neural networks: some only strengthen the connections between activated units, some only weaken such connections, and some use a combination of both. Interestingly, Rosenblatt is able to prove that these algorithms will always (eventually) yield a solution, if such a solution exists. Later, Minsky and Papert showed in their book “Perceptrons: An Introduction to Computational Geometry” that such solutions exist only for linearly separable problems.
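A learning rule of this flavor can be sketched in a few lines. This is a minimal error-correction variant (weights are strengthened or weakened depending on whether the unit should have fired), not Rosenblatt's exact notation; the function names, learning rate, and the AND training set are my own choices for illustration:

```python
def train_perceptron(samples, epochs=100, lr=1.0):
    """Train a single threshold unit: strengthen weights when the unit
    failed to fire on a positive example, weaken them when it fired on
    a negative one. Stops early once every sample is classified correctly,
    which the convergence theorem guarantees for separable data."""
    n = len(samples[0][0])
    w = [0.0] * n  # connection weights
    b = 0.0        # bias (threshold)
    for _ in range(epochs):
        errors = 0
        for x, target in samples:
            out = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = target - out  # +1: should have fired; -1: should not have
            if err != 0:
                errors += 1
                for i in range(n):
                    w[i] += lr * err * x[i]
                b += lr * err
        if errors == 0:
            break
    return w, b

# AND is linearly separable, so training converges.
AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(AND)
```

After a handful of epochs the learned weights classify all four AND cases correctly, illustrating the guaranteed convergence on separable problems.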
Their famous counterexample was the XOR (exclusive-or) function, which can’t be solved using the perceptron as imagined by Rosenblatt.
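The failure on XOR is easy to demonstrate: because no line separates the two XOR classes in the plane, any single threshold unit must misclassify at least one of the four inputs, no matter how long it trains. A small sketch (the trainer below is my own compact illustration, not code from either book):

```python
def predict(w, b, x):
    """Linear threshold unit: fire iff the weighted sum exceeds the bias."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

def train(samples, epochs=1000):
    """Error-correction updates, run for a fixed budget (XOR never converges)."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, t in samples:
            err = t - predict(w, b, x)
            w = [wi + err * xi for wi, xi in zip(w, x)]
            b += err
    return w, b

XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
w, b = train(XOR)
mistakes = sum(predict(w, b, x) != t for x, t in XOR)
# mistakes is always at least 1: XOR is not linearly separable.
```

The weights simply cycle forever; solving XOR requires either hand-crafted features or a hidden layer, which is exactly the limitation Minsky and Papert highlighted.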