GAO Tianzhu. Build and Multi-datasets Comparison Analysis of LeafSpring Activation Function[J]. INFORMATION AND CONTROL, 2020, 49(3): 306-314, 322. DOI: 10.13976/j.cnki.xk.2020.9332

Build and Multi-datasets Comparison Analysis of LeafSpring Activation Function

Abstract: To address two shortcomings of the ReLU activation function in neural networks, namely its non-negative output and the non-activation ("dying") of neurons, we propose a new ReLU-like activation function called the LeafSpring activation function. LeafSpring not only inherits the advantages of ReLU but also returns negative values, making it more versatile. The derivation and properties of the LeafSpring activation function are discussed. In the LeafSpring activation function we introduce a stiffness coefficient k, which adjusts the weight coefficients between two adjacent layers through training. To reduce the amount of computation, LeafSpring can be simplified to ReLU or Softplus in some cases. The performance of the LeafSpring activation function is demonstrated by comparing its fitting accuracy with that of ReLU, Softplus, and Sigmoid on four different types of datasets. The comparison results show that LeafSpring balances fitting accuracy and convergence speed across different datasets, and that it fits complex nonlinear functions faster and more accurately at a small network scale.
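The abstract does not state the closed form of LeafSpring, so the following is only a minimal PyTorch sketch of the general idea it describes: a ReLU-like activation with a trainable stiffness coefficient k that can return negative values and reduces to Softplus (at k = 1, up to a shift) or ReLU (as k → ∞) as limiting cases. The class name LeafSpringLike and the shifted-softplus form are illustrative assumptions, not the paper's actual formula.

```python
import math

import torch
import torch.nn as nn
import torch.nn.functional as F


class LeafSpringLike(nn.Module):
    """Illustrative ReLU-like activation with a trainable stiffness
    coefficient k. Hypothetical form: the exact LeafSpring formula is
    not given in the abstract."""

    def __init__(self, k_init: float = 1.0):
        super().__init__()
        # Trainable stiffness coefficient, updated by backpropagation
        # together with the surrounding layer weights.
        self.k = nn.Parameter(torch.tensor(k_init))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Shifted, scaled softplus: f(x) = softplus(k*x)/k - ln(2)/k.
        # f(0) = 0, f(x) < 0 for x < 0 (negative outputs are possible),
        # f equals Softplus minus ln 2 at k = 1, and f -> ReLU as k -> inf.
        k = torch.clamp(self.k, min=1e-3)  # keep k positive for stability
        return F.softplus(k * x) / k - math.log(2.0) / k


# Quick check: negative outputs for negative inputs, near-linear for x > 0.
act = LeafSpringLike(k_init=2.0)
print(act(torch.linspace(-2.0, 2.0, 5)))
```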
