News

The contributions of this work span several axes. First, we introduce two novel activation functions: absolute linear units and inverse polynomial linear units. Both activation functions are ...