ReLU Family
ReLU (Rectified Linear Unit)

$$ ReLU(x) = \max(0, x) $$

LeakyReLU (leaky slope $\alpha$ on the negative side)

$$ LeakyReLU_{\alpha}(x) = \max(\alpha x, x) $$

GELU (Gaussian Error Linear Unit, tanh approximation)

$$ GELU(x) = 0.5 \cdot x \cdot \left(1 + \tanh\left(\sqrt{2/\pi} \cdot (x + 0.044715 \cdot x^3)\right)\right) $$

PReLU (Parametric ReLU, where the negative slope $a$ is learned during training)

$$ PReLU(x) = \max(0, x) + a \cdot \min(0, x) $$

ELU (Exponential Linear Unit)

$$ ELU_{\alpha}(x) = \begin{cases} x & \text{if } x > 0 \\ \alpha \left(\exp(x) - 1\right) & \text{if } x \le 0 \end{cases} $$
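For quick reference, here is a minimal NumPy sketch of the formulas above. The default values used here (α = 0.01 for LeakyReLU, a = 0.25 for PReLU, α = 1.0 for ELU) are illustrative choices, not values taken from this post.

```python
import numpy as np

def relu(x):
    # ReLU(x) = max(0, x)
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # LeakyReLU_alpha(x) = max(alpha * x, x), assuming 0 < alpha < 1
    return np.maximum(alpha * x, x)

def gelu(x):
    # tanh approximation of GELU
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))

def prelu(x, a):
    # PReLU(x) = max(0, x) + a * min(0, x); in practice a is a learnable parameter
    return np.maximum(0.0, x) + a * np.minimum(0.0, x)

def elu(x, alpha=1.0):
    # ELU(x) = x if x > 0, else alpha * (exp(x) - 1)
    return np.where(x > 0, x, alpha * np.expm1(x))

# quick check on a few sample points
x = np.linspace(-3.0, 3.0, 7)
print(relu(x))
print(leaky_relu(x))
print(gelu(x))
print(prelu(x, a=0.25))
print(elu(x))
```

In frameworks such as PyTorch these are available as built-in modules (e.g. nn.ReLU, nn.LeakyReLU, nn.GELU, nn.PReLU, nn.ELU), so the sketch above is only meant to make the formulas concrete.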