Abstract:
To improve the approximation efficiency of process neural networks, we propose an incremental extreme process neural network from the perspective of model structure, which realizes adaptive growth of the hidden layer by gradually adding new neurons based on the output error. First, we propose a quantum-inspired firefly algorithm to optimize the input parameters of a newly added neuron. Then, we analyze the relevance between the new neuron and the existing neurons according to the 2-norm of the orthogonal vector of the new neuron's output. Lastly, we compute the output weights of the new neuron based on extreme learning theory while keeping the weight parameters of the existing neurons fixed. Through simulation experiments on Henon time series forecasting and shale lithology identification, we compare the performance of the proposed method with that of other process neural networks, verifying its effectiveness and the clear improvements in approximation efficiency and training speed.
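As a rough illustration of the incremental growth scheme summarized above, the following Python sketch shows an error-driven hidden-layer expansion in the style of incremental extreme learning: each new neuron's output weight is fitted to the current residual while existing neurons stay fixed. The function name, the tanh activation, and the random draw standing in for the quantum-inspired firefly optimization are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def incremental_growth(X, y, max_hidden=50, tol=1e-3):
    """Minimal sketch (assumed details): grow the hidden layer one neuron
    at a time, fitting only the new neuron's output weight to the residual."""
    n_samples, n_features = X.shape
    residual = y.astype(float).copy()      # current output error
    neurons = []                           # (input_weights, bias, output_weight)

    for _ in range(max_hidden):
        # In the paper the new neuron's input parameters are tuned by a
        # quantum-inspired firefly algorithm; random draws stand in here.
        w = np.random.randn(n_features)
        b = np.random.randn()
        h = np.tanh(X @ w + b)             # new neuron's output over the samples

        # Least-squares output weight for the residual (extreme learning step);
        # weights of previously added neurons are left unchanged.
        beta = (h @ residual) / (h @ h)
        neurons.append((w, b, beta))
        residual -= beta * h               # update the output error

        if np.linalg.norm(residual) < tol: # stop once the error is small enough
            break
    return neurons
```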