Abstract:
A pruning algorithm based on entropy theory is proposed for designing the structure of BP neural networks. Its essence is to define a pseudo-entropy of the hidden nodes' outputs based on Shannon's entropy principle. Both definitions of entropy describe uncertainty equally well, but the new definition overcomes the inherent drawbacks of Shannon's entropy. The cross-entropy between the network's actual output and the target output, together with the pseudo-entropy of the hidden nodes' outputs, is used as the cost function, and an entropy cycle strategy is used to optimize the network parameters; finally, a compact network structure is obtained by deleting and merging hidden-layer neurons. Simulation results on a typical nonlinear function approximation problem show that a compact BP network architecture can be achieved while the approximation performance is preserved.
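To make the composite cost concrete, the following is a minimal Python sketch of a cost that combines output cross-entropy with an entropy-like penalty on hidden-node outputs. The abstract does not state the exact form of the pseudo-entropy, so the log-free quadratic term 4y(1-y) and the trade-off coefficient lam below are assumptions for illustration only, not the paper's actual definition.

    import numpy as np

    def cross_entropy(y_pred, y_target, eps=1e-12):
        # Cross-entropy between the network's actual output and the target output.
        y_pred = np.clip(y_pred, eps, 1.0 - eps)
        return -np.mean(y_target * np.log(y_pred) + (1 - y_target) * np.log(1 - y_pred))

    def pseudo_entropy(hidden_out):
        # Assumed log-free entropy surrogate on sigmoid hidden outputs in (0, 1):
        # peaks for ambiguous activations near 0.5 and vanishes for saturated ones.
        return np.mean(4.0 * hidden_out * (1.0 - hidden_out))

    def total_cost(y_pred, y_target, hidden_out, lam=0.1):
        # Composite cost: output cross-entropy plus a weighted hidden-layer
        # pseudo-entropy term (lam is a hypothetical trade-off coefficient).
        return cross_entropy(y_pred, y_target) + lam * pseudo_entropy(hidden_out)

Driving the hidden-layer pseudo-entropy down pushes hidden activations toward saturation, which is what makes nearly constant or redundant hidden neurons candidates for the deletion and merging step described above.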