Original post: https://blog.csdn.net/duan_zhihua/article/details/82659308
PyTorch Learning (7): Non-linear Activations (Non-linear Layers) in PyTorch

PyTorch's non-linear activation layers include the following activation functions:
- ReLU
- ReLU6
- ELU
- SELU
- PReLU
- LeakyReLU
- Threshold
- Hardtanh
- Sigmoid
- Tanh
- LogSigmoid
- Softplus
- Softshrink
- Softsign
- Tanhshrink
- Softmin
- Softmax
- Softmax2d
- LogSoftmax
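Before diving into the source, here is a quick pure-Python sanity check of the element-wise formulas behind a few of the layers listed above (a sketch of the math only, not the PyTorch implementation):

```python
import math

def relu(x):     return max(0.0, x)             # ReLU(x) = max(0, x)
def relu6(x):    return min(max(0.0, x), 6.0)   # ReLU6(x) = min(max(0, x), 6)
def sigmoid(x):  return 1.0 / (1.0 + math.exp(-x))
def softplus(x): return math.log1p(math.exp(x))  # log(1 + e^x)
def softsign(x): return x / (1.0 + abs(x))

for x in (-2.0, 0.0, 3.0):
    print(f"{x:+}: relu={relu(x)}, relu6={relu6(x)}, softsign={softsign(x):.3f}")
```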
The Python front-end code for these activation functions lives in activation.py. Let's start with the ReLU source:
```python
class ReLU(Threshold):
    r"""Applies the rectified linear unit function element-wise
    :math:`\text{ReLU}(x)= \max(0, x)`

    .. image:: scripts/activation_images/ReLU.png

    Args:
        inplace: can optionally do the operation in-place. Default: ``False``

    Shape:
        - Input: :math:`(N, *)` where `*` means, any number of additional
          dimensions
        - Output: :math:`(N, *)`, same shape as the input

    Examples::

        >>> m = nn.ReLU()
        >>> input = torch.randn(2)
        >>> output = m(input)
    """

    def __init__(self, inplace=False):
        super(ReLU, self).__init__(0, 0, inplace)

    def extra_repr(self):
        inplace_str = 'inplace' if self.inplace else ''
        return inplace_str
```
ReLU's `__init__` simply calls the parent class Threshold's initializer, passing threshold=0 and value=0:
```python
class Threshold(Module):
    r"""Thresholds each element of the input Tensor

    Threshold is defined as:

    .. math::
        y =
        \begin{cases}
        x, &\text{ if } x > \text{threshold} \\
        \text{value}, &\text{ otherwise }
        \end{cases}

    Args:
        threshold: The value to threshold at
        value: The value to replace with
        inplace: can optionally do the operation in-place. Default: ``False``

    Shape:
        - Input: :math:`(N, *)` where `*` means, any number of additional
          dimensions
        - Output: :math:`(N, *)`, same shape as the input

    Examples::

        >>> m = nn.Threshold(0.1, 20)
        >>> input = torch.randn(2)
        >>> output = m(input)
    """

    def __init__(self, threshold, value, inplace=False):
        super(Threshold, self).__init__()
        self.threshold = threshold
        self.value = value
        self.inplace = inplace
        # TODO: check in THNN (if inplace == True, then assert value <= threshold)

    def forward(self, input):
        return F.threshold(input, self.threshold, self.value, self.inplace)

    def extra_repr(self):
        inplace_str = ', inplace' if self.inplace else ''
        return 'threshold={}, value={}{}'.format(
            self.threshold, self.value, inplace_str
        )
```
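To see how ReLU reduces to Threshold(0, 0), here is a small pure-Python sketch of the thresholding rule (toy code on plain lists, not the actual kernel):

```python
def threshold(xs, threshold, value):
    """Element-wise: keep x if x > threshold, else replace with value."""
    return [x if x > threshold else value for x in xs]

def relu(xs):
    # ReLU is Threshold with threshold=0 and value=0, exactly as
    # ReLU.__init__ passes (0, 0, inplace) to Threshold.__init__.
    return threshold(xs, 0, 0)

print(threshold([-1.0, 0.5, 2.0], 0.1, 20))  # [20, 0.5, 2.0]
print(relu([-1.0, 0.5, 2.0]))                # [0, 0.5, 2.0]
```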
When the forward pass runs, F.threshold dispatches to torch._C._nn.threshold, or to the in-place variant torch._C._nn.threshold_ when inplace=True:
```python
def threshold(input, threshold, value, inplace=False):
    r"""Thresholds each element of the input Tensor.

    See :class:`~torch.nn.Threshold` for more details.
    """
    if inplace:
        return torch._C._nn.threshold_(input, threshold, value)
    return torch._C._nn.threshold(input, threshold, value)
```
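The inplace flag selects between two back-end entry points; the behavioural difference can be illustrated with a toy Python version operating on a list (hypothetical helper names, not PyTorch API):

```python
def threshold_out_of_place(xs, t, v):
    # analogous to torch._C._nn.threshold: allocates a new result,
    # the input buffer is left untouched
    return [x if x > t else v for x in xs]

def threshold_inplace(xs, t, v):
    # analogous to torch._C._nn.threshold_: mutates the input buffer
    for i, x in enumerate(xs):
        if x <= t:
            xs[i] = v
    return xs

data = [-1.0, 2.0]
out = threshold_out_of_place(data, 0.0, 0.0)
print(data, out)          # input untouched: [-1.0, 2.0] [0.0, 2.0]
threshold_inplace(data, 0.0, 0.0)
print(data)               # input overwritten: [0.0, 2.0]
```

In-place mode saves a memory allocation but destroys the original input, which is why autograd restricts when it can be used.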
The actual implementation lives in the \pytorch-master\aten\src\THNN\generic\ directory. The mathematical formulas of the activation functions are embedded in this back-end C code; the Python front end merely calls into it through the interface above. The source of Threshold.c:
```c
#ifndef TH_GENERIC_FILE
#define TH_GENERIC_FILE "generic/Threshold.c"
#else

void THNN_(Threshold_updateOutput)(
          THNNState *state,
          THTensor *input,
          THTensor *output,
          accreal threshold_,
          accreal val_,
          bool inplace)
{
  real threshold = TH_CONVERT_ACCREAL_TO_REAL(threshold_);
  real val = TH_CONVERT_ACCREAL_TO_REAL(val_);
  if (inplace)
  {
    TH_TENSOR_APPLY(real, input,
      if (*input_data <= threshold)
        *input_data = val;
    );
    THTensor_(set)(output, input);
  }
  else
  {
    THTensor_(resizeAs)(output, input);
    TH_TENSOR_APPLY2(real, output, real, input,
      *output_data = (*input_data > threshold) ? *input_data : val;
    );
  }
}

void THNN_(Threshold_updateGradInput)(
          THNNState *state,
          THTensor *input,
          THTensor *gradOutput,
          THTensor *gradInput,
          accreal threshold_,
          accreal val_,
          bool inplace)
{
  real threshold = TH_CONVERT_ACCREAL_TO_REAL(threshold_);
  THNN_CHECK_NELEMENT(input, gradOutput);
  if (inplace)
  {
    TH_TENSOR_APPLY2(real, gradOutput, real, input,
      if ((*input_data) <= threshold)
        *gradOutput_data = 0;
    );
    THTensor_(set)(gradInput, gradOutput);
  }
  else
  {
    THTensor_(resizeAs)(gradInput, input);
    TH_TENSOR_APPLY3(real, gradInput, real, gradOutput, real, input,
      if ((*input_data) > threshold)
        *gradInput_data = *gradOutput_data;
      else
        *gradInput_data = 0;
    );
  }
}

#endif
```
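Stripped of the TH_TENSOR_APPLY macros, the two kernels above amount to the following (a readable pure-Python sketch of the same logic, non-inplace branch only, not the real C code):

```python
def threshold_update_output(inp, threshold, val):
    # Threshold_updateOutput: output = input where input > threshold, else val
    return [x if x > threshold else val for x in inp]

def threshold_update_grad_input(inp, grad_output, threshold):
    # Threshold_updateGradInput: the gradient passes through unchanged
    # where input > threshold, and is zeroed elsewhere
    return [g if x > threshold else 0.0 for x, g in zip(inp, grad_output)]

inp = [-1.0, 0.5, 2.0]
print(threshold_update_output(inp, 0.0, 0.0))                   # [0.0, 0.5, 2.0]
print(threshold_update_grad_input(inp, [1.0, 1.0, 1.0], 0.0))   # [0.0, 1.0, 1.0]
```

Note that the backward kernel only needs the original input and the thresholding condition; the replacement value plays no role in the gradient.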
However, some activation functions are implemented purely in Python, for example the Softsign activation:
```python
class Softsign(Module):
    r"""Applies element-wise, the function :math:`\text{SoftSign}(x) = \frac{x}{ 1 + |x|}`

    Shape:
        - Input: :math:`(N, *)` where `*` means, any number of additional
          dimensions
        - Output: :math:`(N, *)`, same shape as the input

    .. image:: scripts/activation_images/Softsign.png

    Examples::

        >>> m = nn.Softsign()
        >>> input = torch.randn(2)
        >>> output = m(input)
    """

    def forward(self, input):
        return F.softsign(input)
```

The corresponding functional version is written directly with tensor operations, with no C kernel of its own:

```python
def softsign(input):
    r"""softsign(input) -> Tensor

    Applies element-wise, the function :math:`\text{SoftSign}(x) = \frac{x}{1 + |x|}`

    See :class:`~torch.nn.Softsign` for more details.
    """
    return input / (input.abs() + 1)
```
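Because Softsign stays in Python, its gradient comes from autograd differentiating `input / (input.abs() + 1)` rather than from a hand-written backward kernel. The derivative, 1/(1+|x|)^2, is easy to verify against a finite difference (pure-Python sketch):

```python
def softsign(x):
    return x / (1.0 + abs(x))

def softsign_grad(x):
    # d/dx [x / (1 + |x|)] = 1 / (1 + |x|)**2
    return 1.0 / (1.0 + abs(x)) ** 2

# compare the analytic gradient with a central finite difference
x, eps = 1.5, 1e-6
numeric = (softsign(x + eps) - softsign(x - eps)) / (2 * eps)
print(abs(numeric - softsign_grad(x)) < 1e-6)  # True
```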
To want to know everything all at once means to know nothing. — Pavlov

The secret of learning many things is not to try to learn them all at once. — Locke