Construction and Multi-Dataset Comparative Analysis of the LeafSpring Activation Function

Abstract: To address the problems of non-negative outputs and inactive neurons in the ReLU activation function used in neural networks, we propose a new ReLU-like activation function called LeafSpring. LeafSpring inherits the advantages of ReLU while also being able to return negative values, which makes it more broadly applicable. The derivation of the LeafSpring function and its properties are also discussed. The function introduces a stiffness coefficient k, which can be learned during training to actively adjust the weight coefficients between two adjacent layers. To reduce computational cost, LeafSpring can be simplified to ReLU or Softplus in certain cases. To demonstrate its performance, the fitting accuracy of LeafSpring is compared with that of ReLU, Softplus, and Sigmoid on four datasets of different types. The results show that LeafSpring balances fitting accuracy and convergence speed on all of these datasets, and that it fits complex nonlinear functions faster and more accurately at small network sizes.
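
The abstract does not give LeafSpring's closed form, so the minimal sketch below only illustrates the kind of small-network fitting comparison it describes, using the standard ReLU, Softplus, and Sigmoid baselines. The network width, training budget, and the 1-D regression target are illustrative assumptions, not the paper's actual datasets or experimental settings.

```python
# Illustrative comparison harness (not the paper's code): fit a small MLP to a
# nonlinear target with different activation functions and compare final MSE.
# LeafSpring itself is omitted because its closed form is not given in the abstract.
import torch
import torch.nn as nn

ACTIVATIONS = {
    "ReLU": nn.ReLU(),
    "Softplus": nn.Softplus(),
    "Sigmoid": nn.Sigmoid(),
}

def make_mlp(act: nn.Module, hidden: int = 16) -> nn.Sequential:
    # Deliberately small network, echoing the abstract's focus on small network sizes.
    return nn.Sequential(
        nn.Linear(1, hidden), act,
        nn.Linear(hidden, hidden), act,
        nn.Linear(hidden, 1),
    )

def fit(model: nn.Module, x: torch.Tensor, y: torch.Tensor, epochs: int = 2000) -> float:
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()
    return loss.item()

if __name__ == "__main__":
    # Hypothetical 1-D regression target standing in for one of the paper's
    # four datasets, which are not specified in this abstract.
    x = torch.linspace(-3, 3, 256).unsqueeze(1)
    y = torch.sin(2 * x) + 0.3 * x**2
    for name, act in ACTIVATIONS.items():
        torch.manual_seed(0)  # same initialization for every activation
        mse = fit(make_mlp(act), x, y)
        print(f"{name}: final MSE = {mse:.4e}")
```

If LeafSpring's definition from the full paper were available, it could be compared by adding its module to the ACTIVATIONS dictionary without changing the rest of the harness.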

     
