The term perceptron is, as a rule, used to describe a feedforward network with simple connections. Such a network has a layer of scanner neurons (the retina) with statically weighted connections to the following layer; the weights of the subsequent layers may be changed.
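
As a minimal sketch of this structure (the class name, parameters, and the choice of an identity retina below are illustrative assumptions, not taken from the text), the fixed-weight retina feeds a single trainable layer whose output neuron applies a threshold:

```python
import numpy as np

class Perceptron:
    """Feedforward perceptron: a fixed-weight retina feeding one trainable layer."""

    def __init__(self, n_inputs, threshold=0.0, seed=0):
        rng = np.random.default_rng(seed)
        # Retina connections are statically weighted (here simply pass-through).
        self.retina_weights = np.eye(n_inputs)
        # Only the weights of the following layer are trainable.
        self.weights = rng.normal(size=n_inputs)
        self.threshold = threshold

    def forward(self, x):
        # Fixed retina output, then the weighted sum seen by the output neuron.
        retina_out = self.retina_weights @ np.asarray(x, dtype=float)
        net = float(self.weights @ retina_out)
        # Binary threshold activation on the output neuron.
        return 1 if net >= self.threshold else 0
```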

All neurons downstream of the retina are pattern detectors. For now we use a binary perceptron, in which every output neuron has exactly two possible output values (e.g. {0, 1} or {−1, 1}). A binary threshold function can therefore serve as the activation function, controlled by the threshold value of the output neuron. In a sense, the binary activation function represents a query, which can also be negated by using negative weights. The perceptron can thus be used to perform genuine logical information processing, although this is not the easiest way to realize Boolean logic.
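
To illustrate this logical reading (the hand-picked weights and thresholds below are assumptions made for this sketch, not values from the text), a binary threshold neuron with suitable positive weights answers an AND query, while a negative weight negates the query on its input:

```python
def threshold_neuron(inputs, weights, threshold):
    """Binary threshold activation: output 1 iff the weighted sum reaches the threshold."""
    net = sum(w * x for w, x in zip(weights, inputs))
    return 1 if net >= threshold else 0

# AND of two binary inputs: both must be active to reach the threshold.
AND = lambda a, b: threshold_neuron([a, b], weights=[1.0, 1.0], threshold=1.5)

# Negation via a negative weight: the neuron fires only when the input is inactive.
NOT = lambda a: threshold_neuron([a], weights=[-1.0], threshold=-0.5)

assert [AND(a, b) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]] == [0, 0, 0, 1]
assert [NOT(a) for a in (0, 1)] == [1, 0]
```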