
What is: Shifted Softplus?

Source: SchNet: A continuous-filter convolutional neural network for modeling quantum interactions
Year: 2017
Data Source: CC BY-SA - https://paperswithcode.com

Shifted Softplus is an activation function ${\rm ssp}(x) = \ln(0.5 e^{x} + 0.5)$, which SchNet employs as the non-linearity throughout the network in order to obtain a smooth potential energy surface. The shift ensures that ${\rm ssp}(0) = 0$ and improves the convergence of the network. This activation function is similar to ELUs, while having an infinite order of continuity.