Improvement Algorithm of an Incremental Extreme Learning Machine
Abstract
Because the input weights and biases of the hidden-layer neurons in an incremental extreme learning machine (I-ELM) are generated randomly, the output weights of some hidden-layer neurons obtained during training may be too small to contribute effectively to the network output, rendering those neurons invalid. This not only makes the network more complicated but also reduces its stability. To address this issue, we propose an improved method (II-ELM) that adds an offset to the hidden-layer output of I-ELM, and we then analyze and prove the existence of such an offset. Finally, the effectiveness of II-ELM is verified by simulation and by comparison with I-ELM on classification and regression problems.
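For orientation, the following is a minimal sketch of the incremental training loop that the abstract refers to, using the standard I-ELM output-weight update (the new neuron's output weight is the least-squares fit of its hidden output to the current residual). The `offset` argument is only a hypothetical illustration of the II-ELM idea of shifting the hidden-layer output; the paper's actual offset construction is derived in the body, and setting `offset=0` recovers plain I-ELM.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def i_elm_train(X, y, max_neurons=50, tol=1e-3, offset=0.0, seed=None):
    """Sketch of I-ELM training on a single-output regression task.

    `offset` is a hypothetical knob illustrating the II-ELM idea of
    adding an offset to the hidden-layer output; offset=0.0 gives I-ELM.
    """
    rng = np.random.default_rng(seed)
    n_samples, n_features = X.shape
    e = y.astype(float).copy()            # residual error, initialized to the target
    weights, biases, betas = [], [], []
    for _ in range(max_neurons):
        # Randomly generate the input weights and bias of the new hidden neuron.
        w = rng.uniform(-1.0, 1.0, n_features)
        b = rng.uniform(-1.0, 1.0)
        h = sigmoid(X @ w + b) + offset   # hidden-layer output (optionally shifted)
        # Output weight minimizing the residual error contributed by this neuron.
        beta = (e @ h) / (h @ h)
        e = e - beta * h                  # update the residual
        weights.append(w); biases.append(b); betas.append(beta)
        if np.linalg.norm(e) < tol:
            break
    return np.array(weights), np.array(biases), np.array(betas)

def i_elm_predict(X, weights, biases, betas, offset=0.0):
    H = sigmoid(X @ weights.T + biases) + offset
    return H @ betas
```

In this sketch, a neuron whose computed `beta` is close to zero adds network complexity without reducing the residual, which is the "invalid neuron" problem the abstract describes.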