A single-layer perceptron (SLP) is a perceptron with only one layer of trainable weights feeding one layer of output neurons Ω.
Note that having several output neurons Ω₁, Ω₂, …, Ωₙ does not substantially change the idea of the perceptron: because each output neuron has its own weights and fires independently of the others, a perceptron with several output neurons can equally be regarded as several separate perceptrons sharing the same input.
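The equivalence between one SLP with n output neurons and n independent perceptrons can be sketched as follows. This is a minimal illustration, not code from the source: the class name, the learning rate, and the choice of OR/AND as target functions are all assumptions made for the example.

```python
import numpy as np

class SingleLayerPerceptron:
    """Hypothetical minimal SLP: one trainable weight layer, n output neurons."""

    def __init__(self, n_inputs, n_outputs):
        # One weight row (and one bias) per output neuron Omega_i.
        self.W = np.zeros((n_outputs, n_inputs))
        self.b = np.zeros(n_outputs)

    def predict(self, x):
        # Each output neuron thresholds its own weighted sum independently,
        # so the network behaves like n_outputs separate perceptrons.
        return (self.W @ x + self.b > 0).astype(int)

    def train_step(self, x, target, lr=0.1):
        # Classic perceptron learning rule, applied per output neuron.
        error = target - self.predict(x)
        self.W += lr * np.outer(error, x)
        self.b += lr * error

# Two output neurons learning two tasks at once: Omega_1 = OR, Omega_2 = AND.
slp = SingleLayerPerceptron(n_inputs=2, n_outputs=2)
data = [([0, 0], [0, 0]), ([0, 1], [1, 0]), ([1, 0], [1, 0]), ([1, 1], [1, 1])]
for _ in range(20):
    for x, t in data:
        slp.train_step(np.array(x), np.array(t))
print([slp.predict(np.array(x)).tolist() for x, _ in data])
# → [[0, 0], [1, 0], [1, 0], [1, 1]]
```

Because the weight rows never interact, training this network is exactly the same as training two separate single-output perceptrons on the same inputs.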