The number m of training patterns needs to be about the number of neurons n, or less (according to Hopfield, it really must be less than 0.15 n; according to some newer results, m ≤ 0.5 n/log n). This means the memorizing capability of a Hopfield network is severely limited. Catastrophic "forgetting" may occur if we try to memorize more patterns than the network can handle.
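This capacity limit is easy to observe empirically. The following is a minimal NumPy sketch (all function names are hypothetical, chosen for illustration): it stores m random ±1 patterns with the standard Hebbian rule W = (1/n) Σ p pᵀ (zero diagonal), then checks what fraction of the stored patterns the network recalls exactly. Well below 0.15 n, recall is essentially perfect; well above it, crosstalk between patterns corrupts recall.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_hopfield(patterns):
    """Hebbian weight matrix from an (m, n) array of ±1 patterns."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0)  # no self-connections
    return W

def recall(W, x, steps=10):
    """Synchronous updates until a fixed point or the step limit."""
    for _ in range(steps):
        x_new = np.sign(W @ x)
        x_new[x_new == 0] = 1  # break ties toward +1
        if np.array_equal(x_new, x):
            break
        x = x_new
    return x

def fraction_recalled(n, m):
    """Fraction of m stored random patterns the network recalls exactly."""
    patterns = rng.choice([-1, 1], size=(m, n))
    W = train_hopfield(patterns)
    ok = sum(np.array_equal(recall(W, p.copy()), p) for p in patterns)
    return ok / m

n = 100
for m in (3, 14, 40):  # well below, near, and well above 0.15 * n
    print(f"m = {m:2d}: fraction recalled = {fraction_recalled(n, m):.2f}")
```

With n = 100, loads far above the ~15-pattern capacity produce recall fractions well below 1: the stored patterns stop being fixed points of the dynamics, which is the "forgetting" referred to above.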