MXNet notes: Activation (activation functions)

Copyright notice: this is the author's original article; do not repost without permission. https://blog.csdn.net/u010255642/article/details/82082318

The following activation functions are supported:

  • relu: Rectified Linear Unit, y = max(x, 0)
  • sigmoid: y = 1 / (1 + exp(−x))
  • tanh: Hyperbolic tangent, y = (exp(x) − exp(−x)) / (exp(x) + exp(−x))
  • softrelu: Soft ReLU, or SoftPlus, y = log(1 + exp(x))
  • softsign: y = x / (1 + abs(x))
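As a quick sanity check (my addition, not from the original post; it assumes a standard MXNet 1.x install where mx.nd.exp and mx.nd.abs are available), the sketch below recomputes three of these formulas by hand and compares them against mx.nd.Activation:

# Sanity check (not from the original post): recompute three of the
# activations element-wise and compare against mx.nd.Activation.
import mxnet as mx

x = mx.nd.array([-2.0, -0.5, 0.0, 0.5, 2.0])

# sigmoid: y = 1 / (1 + exp(-x))
manual_sigmoid = 1 / (1 + mx.nd.exp(-x))
print(mx.nd.Activation(x, act_type='sigmoid') - manual_sigmoid)

# tanh: y = (exp(x) - exp(-x)) / (exp(x) + exp(-x))
manual_tanh = (mx.nd.exp(x) - mx.nd.exp(-x)) / (mx.nd.exp(x) + mx.nd.exp(-x))
print(mx.nd.Activation(x, act_type='tanh') - manual_tanh)

# softsign: y = x / (1 + |x|)
manual_softsign = x / (1 + mx.nd.abs(x))
print(mx.nd.Activation(x, act_type='softsign') - manual_softsign)

All three differences should print as (near-)zero vectors.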

mxnet.ndarray.Activation(data=None, act_type=_Null, out=None, name=None, **kwargs)

  • data (NDArray) – The input array.
  • act_type ({'relu', 'sigmoid', 'softrelu', 'softsign', 'tanh'}, required) – Activation function to be applied.
# -*- coding: utf-8 -*-
import mxnet as mx

# A 2x4 input with both positive and negative values.
x = mx.nd.array([[1.085, -2.75, -5.6, 9.9], [3.087, 5.32, 3.75, 11.865]])

# relu zeroes out the negative entries.
y = mx.nd.Activation(x, act_type='relu')
print(y)

# softrelu (SoftPlus) smoothly approximates relu.
y = mx.nd.Activation(x, act_type='softrelu')
print(y)

Output:

[[ 1.085  0.     0.     9.9  ]
 [ 3.087  5.32   3.75  11.865]]
<NDArray 2x4 @cpu(0)>

[[1.3761026e+00 6.1967585e-02 3.6910437e-03 9.9000502e+00]
 [3.1316278e+00 5.3248811e+00 3.7732456e+00 1.1865006e+01]]
<NDArray 2x4 @cpu(0)>
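To connect the second printed array back to the softrelu formula, here is a small check (my addition, not in the original post) that recomputes y = log(1 + exp(x)) with NumPy:

# Verify the softrelu output above against y = log(1 + exp(x)).
import numpy as np

x = np.array([[1.085, -2.75, -5.6, 9.9], [3.087, 5.32, 3.75, 11.865]])
expected = np.log1p(np.exp(x))  # log(1 + exp(x)), via log1p for accuracy
print(expected)

The result matches the NDArray printed above, e.g. log(1 + exp(1.085)) ≈ 1.3761.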
