【TensorFlow】: Activation Functions

import tensorflow as tf  # TensorFlow 1.x (session-based) API
sess = tf.InteractiveSession()  # installs a default session so Tensor.eval() works below

TensorFlow provides seven kinds of activation functions:

1. ReLU

$f(x) = \max(x, 0)$

print(tf.nn.relu(3.5).eval())
print(tf.nn.relu(-1).eval())
print(tf.nn.relu(7).eval())
3.5
0
7
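
ReLU is applied elementwise, so the same call works on tensors of any shape. A minimal sketch (the example values below are my own, not from the original post):

# ReLU zeroes out negative entries elementwise; positive entries pass through.
x = tf.constant([[-2.0, 0.5], [3.0, -0.1]])
print(tf.nn.relu(x).eval())
# [[0.  0.5]
#  [3.  0. ]]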

2. ReLU6

$f(x) = \min(\max(x, 0), 6)$

print(tf.nn.relu6(3.5).eval())
print(tf.nn.relu6(-1).eval())
print(tf.nn.relu6(7).eval())
3.5
0
6
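
ReLU6 is simply ReLU with its output capped at 6, which keeps activations in a bounded range. A minimal sketch on a vector (example values are my own):

# Below 6, relu6 behaves like relu; at and above 6 it saturates.
x = tf.constant([-3.0, 2.0, 6.0, 10.0])
print(tf.nn.relu6(x).eval())
# [0. 2. 6. 6.]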

3. softplus

$f(x) = \log(1 + e^x)$

print(tf.nn.softplus(0.5).eval())
0.974077
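
softplus is a smooth approximation of ReLU: for large positive x it approaches x, and for large negative x it approaches 0. A quick numerical check (the values here are my own, not from the original post):

# log(1 + e^x) is close to max(x, 0) away from the origin.
print(tf.nn.softplus(10.0).eval())   # ~10.000045, close to relu(10) = 10
print(tf.nn.softplus(-10.0).eval())  # ~4.54e-05,  close to relu(-10) = 0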

4. sigmoid

$f(x) = \frac{1}{1 + e^{-x}}$

print(tf.nn.sigmoid(2.0).eval())
0.880797
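
sigmoid squashes any real input into (0, 1), with sigmoid(0) = 0.5 and the symmetry sigmoid(-x) = 1 - sigmoid(x). A quick check (example values are my own):

# Verifying the symmetry against the 0.880797 result above.
print(tf.nn.sigmoid(0.0).eval())   # 0.5
print(tf.nn.sigmoid(-2.0).eval())  # ~0.119203 = 1 - 0.880797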

5. tanh

$f(x) = \frac{1 - e^{-2x}}{1 + e^{-2x}}$

print(tf.nn.tanh(2.0).eval())
0.9640276
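
tanh is a rescaled sigmoid: tanh(x) = 2 * sigmoid(2x) - 1, which is why both share the same S-shape while tanh ranges over (-1, 1). A quick numerical check (my own, not from the original post):

# The identity reproduces the 0.9640276 result above.
print((2.0 * tf.nn.sigmoid(2.0 * 2.0) - 1.0).eval())  # ~0.9640276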

6. bias_add

a = tf.constant([[1.0,1.0],[2.0,2.0],[3.0,3.0]])
b = tf.constant([1.0,1.0])
print(tf.nn.bias_add(a,b).eval())
[[2. 2.]
 [3. 3.]
 [4. 4.]]
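
bias_add adds a 1-D bias vector along the last dimension of the input, so b must have length 2 here to match a's last dimension. For this shape it gives the same result as ordinary broadcasting, as this sketch shows:

# Equivalent result via broadcasting; bias_add additionally enforces
# that the bias is 1-D and matches the input's last dimension.
print((a + b).eval())
# [[2. 2.]
#  [3. 3.]
#  [4. 4.]]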

7. dropout

x = tf.constant([[1.0,1.0],[2.0,2.0],[3.0,3.0]])
print(tf.nn.dropout(x,0.5).eval())
[[0. 0.]
 [0. 0.]
 [0. 6.]]
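
Note that dropout's output is random and will differ between runs. Assuming TensorFlow 1.x (as the InteractiveSession above suggests), the second argument is keep_prob: each element is kept with probability 0.5 and the survivors are scaled by 1/keep_prob = 2 (which is why the kept 3.0 became 6.0), so the expected value of each element is preserved. A sketch with a fixed seed for repeatability:

# keep_prob=0.5: roughly half the entries are zeroed, the rest doubled.
# seed makes the random mask reproducible within a given TF version.
print(tf.nn.dropout(x, keep_prob=0.5, seed=42).eval())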
