ReLU

Rectified Linear Unit

$$ ReLU(x) = \max(0, x) $$
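A minimal NumPy sketch of the formula above (the helper name `relu` is mine):

```python
import numpy as np

def relu(x):
    # Element-wise max(0, x): negatives become 0, non-negatives pass through.
    return np.maximum(0, x)
```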

LeakyReLU

Leaky Rectified Linear Unit

$$ LeakyReLU_{\alpha}(x) = \max(\alpha x, x) $$
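The `max(αx, x)` form assumes 0 < α < 1, so the smaller `αx` wins only when x is negative. A NumPy sketch (the helper name and the default α = 0.01 are my choices):

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    # For 0 < alpha < 1: returns x when x >= 0, alpha*x when x < 0.
    return np.maximum(alpha * x, x)
```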

GELU

Gaussian Error Linear Unit

$$ GELU(x) = 0.5 \cdot x \cdot (1 + \tanh(\sqrt{2/\pi} \cdot (x + 0.044715 \cdot x^3))) $$
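This is the tanh approximation of the exact GELU, x · Φ(x) with Φ the standard normal CDF. A direct NumPy transcription (helper name is mine):

```python
import numpy as np

def gelu(x):
    # tanh approximation: 0.5*x*(1 + tanh(sqrt(2/pi)*(x + 0.044715*x^3)))
    return 0.5 * x * (1 + np.tanh(np.sqrt(2 / np.pi) * (x + 0.044715 * x**3)))
```

For large positive x the tanh term saturates at 1, so gelu(x) ≈ x; gelu(0) = 0.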

PReLU

Parametric ReLU

$$ PReLU(x) = \max(0, x) + a \cdot \min(0, x) $$
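Same shape as LeakyReLU, but the slope `a` is a learned parameter (per channel in most frameworks). A NumPy sketch with `a` passed in explicitly (helper name is mine):

```python
import numpy as np

def prelu(x, a):
    # max(0, x) keeps the positive part; a * min(0, x) scales the negative part.
    return np.maximum(0, x) + a * np.minimum(0, x)
```

With a fixed `a` this reduces exactly to LeakyReLU with α = a.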

ELU

Exponential Linear Unit

$$ ELU_{\alpha}(x) = \begin{cases} \alpha (\exp(x) - 1) & \text{if } x < 0 \\ x & \text{if } x \geq 0 \end{cases} $$
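The piecewise definition maps directly onto `np.where` (helper name and the common default α = 1.0 are my choices):

```python
import numpy as np

def elu(x, alpha=1.0):
    # x for x >= 0; alpha * (exp(x) - 1) for x < 0, saturating at -alpha.
    return np.where(x >= 0, x, alpha * (np.exp(x) - 1))
```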


SELU

Scaled Exponential Linear Unit

$$ SELU(x) = \text{scale} \cdot (\max(0, x) + \min(0, \alpha \cdot (\exp(x) - 1))) $$
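SELU is ELU with fixed constants α ≈ 1.6733 and scale ≈ 1.0507, chosen in the self-normalizing networks paper so activations keep zero mean and unit variance. A NumPy sketch (helper name is mine; constants are the standard published values):

```python
import numpy as np

SELU_ALPHA = 1.6732632423543772
SELU_SCALE = 1.0507009873554805

def selu(x):
    # scale * (max(0, x) + min(0, alpha * (exp(x) - 1)))
    return SELU_SCALE * (np.maximum(0, x) + np.minimum(0, SELU_ALPHA * (np.exp(x) - 1)))
```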
