Copyright notice: the author is a third-year undergraduate with limited experience and welcomes feedback. Personal Page: holeungliu.com https://blog.csdn.net/soulmeetliang/article/details/78621040
Result
Code
import tensorflow as tf
import numpy as np
import matplotlib.pyplot as plt
# fake data
x = np.linspace(-5,5,200) # x data, shape (200,)
# the following are popular activation functions
y_relu = tf.nn.relu(x)
y_sigmoid = tf.nn.sigmoid(x)
y_tanh = tf.nn.tanh(x)
y_softplus = tf.nn.softplus(x)
y_softmax = tf.nn.softmax(x) # softmax is a special activation function: its outputs form a probability distribution
sess = tf.Session()
y_relu,y_sigmoid,y_tanh,y_softplus,y_softmax = sess.run([y_relu,y_sigmoid,y_tanh,y_softplus,y_softmax])
# plt to visualize these activation function
plt.figure(1,figsize=(8,6)) # set the figure size
plt.subplot(221)
'''
subplot(221) splits the figure into a grid: the parameter '221' means 2 (rows) x 2 (columns),
i.e. the figure is divided into four cells in two rows and two columns, and the trailing 1 selects
cell 1 for the next plot. The cell index must lie within rows x columns, so here it must be
between 1 and 4. subplot(111) uses the whole figure as a single plotting area with no split.
'''
plt.plot(x,y_relu,c='red',label='relu') # draw the curve
plt.ylim((-1,5)) # set the y-axis range
plt.legend(loc='best')
'''
'best' : 0, (only implemented for axes legends; picks the location automatically)
'upper right' : 1,
'upper left' : 2,
'lower left' : 3,
'lower right' : 4,
'right' : 5,
'center left' : 6,
'center right' : 7,
'lower center' : 8,
'upper center' : 9,
'center' : 10,
'''
plt.subplot(222)
plt.plot(x,y_sigmoid,c='red',label='sigmoid')
plt.ylim((-0.2,1.2))
plt.legend(loc='best')
plt.subplot(223)
plt.plot(x,y_softplus,c='red',label='softplus')
plt.ylim((-0.2,6))
plt.legend(loc='best')
plt.subplot(224)
plt.plot(x,y_tanh,c='red',label='tanh')
plt.ylim((-1.2,1.2))
plt.legend(loc='best')
plt.show()
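As a quick sanity check on what these TensorFlow ops actually compute, the same curves can be reproduced with plain NumPy using the standard formulas. This is a sketch (no TensorFlow required), assuming the textbook definitions of each activation:

```python
import numpy as np

x = np.linspace(-5, 5, 200)

# NumPy-only versions of the activations used above
relu = np.maximum(0.0, x)                # max(0, x)
sigmoid = 1.0 / (1.0 + np.exp(-x))       # 1 / (1 + e^-x)
tanh = np.tanh(x)
softplus = np.log1p(np.exp(x))           # log(1 + e^x), via log1p for accuracy

# softmax: shift by the max before exponentiating for numerical stability
e = np.exp(x - x.max())
softmax = e / e.sum()

# softmax outputs sum to 1, which is why it is read as a probability distribution
assert np.isclose(softmax.sum(), 1.0)
```

Note that softmax differs from the others: it normalizes over the whole input vector, so each output depends on every input, not just its own.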