In an RBF network, the output neurons contain only the identity as activation function and a weighted sum as propagation function. Thus, they do little more than adding up all input values and returning the sum.

Hidden neurons are usually called RBF neurons (and the layer in which they are located is referred to as the RBF layer). As propagation function, each hidden neuron calculates a norm that represents the distance between the input to the network and the so-called position of the neuron (its center). This distance is then inserted into a radial activation function, which calculates and outputs the activation of the neuron.
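To make this data flow concrete, the following is a minimal sketch of such a forward pass in Python. The Euclidean norm and the Gaussian radial function used here are only one common choice, and the names `centers`, `widths` and `weights` are illustrative assumptions, not part of the text above.

```python
import numpy as np

def rbf_forward(x, centers, widths, weights):
    """Forward pass of a simple RBF network with one output neuron.

    x        : input vector, shape (d,)
    centers  : positions of the RBF neurons, shape (h, d)
    widths   : width parameter of each RBF neuron, shape (h,)
    weights  : weights from RBF layer to the output neuron, shape (h,)
    """
    # Propagation function of each RBF neuron: a norm (here Euclidean)
    # of the difference between the input and the neuron's center.
    distances = np.linalg.norm(x - centers, axis=1)

    # Radial activation function (here a Gaussian) applied to the distance.
    activations = np.exp(-(distances ** 2) / (2 * widths ** 2))

    # Output neuron: weighted sum as propagation function,
    # identity as activation function, so the sum is returned directly.
    return weights @ activations

# Example with arbitrary values
centers = np.array([[0.0, 0.0], [1.0, 1.0]])
widths  = np.array([0.5, 0.5])
weights = np.array([1.0, -1.0])
print(rbf_forward(np.array([0.5, 0.5]), centers, widths, weights))
```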