TensorFlow (8): A Deep Learning Hello World Program

A hello-world program is a good entry point into any new field. Many books use MNIST handwritten-digit recognition as the hello world of deep learning, but that is still not simple enough; we can go simpler.


Let's start with the code:

# coding:utf-8

import tensorflow as tf
from numpy.random import RandomState
import os
os.environ['TF_CPP_MIN_LOG_LEVEL'] = '3'  # silence TensorFlow logging: only errors are printed


def main():
    # Define the network weights
    w1 = tf.Variable(tf.random_normal([2, 3], stddev=1, seed=1))
    w2 = tf.Variable(tf.random_normal([3, 1], stddev=1, seed=1))

    # Define the inputs and labels; y_ holds the expected output
    x = tf.placeholder(tf.float32, shape=[None, 2], name='x-input')
    y_ = tf.placeholder(tf.float32, shape=[None, 1], name='y-input')

    # Define the forward pass; y is the network's computed output
    a = tf.matmul(x, w1)
    y = tf.matmul(a, w2)

    # Define the loss (a clipped cross entropy) and the backpropagation (training) step
    cross_entropy = -tf.reduce_mean(
        y_ * tf.log(tf.clip_by_value(y, 1e-10, 1.0))
    )
    train_step = tf.train.AdamOptimizer(0.001).minimize(cross_entropy)

    # Generate the data set: A is a random [data_set_size, 2] matrix, B a [data_set_size, 1] matrix
    # Each row (a1, a2) of A maps to the single label b in the corresponding row of B
    # Label rule: b = 1 when a1 + a2 < 1, otherwise b = 0
    # Training amounts to discovering this rule, so the network can predict well on unseen data
    data_set_size = 256
    batch_size = 8
    rds = RandomState(2)
    A = rds.rand(data_set_size, 2)
    B = [[int(a1+a2 < 1)] for (a1, a2) in A]

    # Create a session to run training and evaluation
    with tf.Session() as sess:
        # Initialize the variables
        tf.global_variables_initializer().run()
        # Number of training iterations
        steps = 10000
        for i in range(0, steps):
            # Pick a batch_size slice of the data for this step
            start = (i * batch_size) % data_set_size
            end = min(start + batch_size, data_set_size)
            # Run one training step
            sess.run(train_step,
                     feed_dict={x: A[start:end], y_: B[start:end]})
            if i % 1000 == 0:
                # Periodically compute the cross entropy over all data and print it
                # (step count padded to 4 digits, loss shown to 10 decimal places)
                total_cross_entropy = sess.run(
                    cross_entropy, feed_dict={x: A, y_: B})
                print("[+] after %4d training steps, cross entropy on all data is %.10f" % (i, total_cross_entropy))

if __name__ == '__main__':
    print('[+] start')
    main()
    print('[+] end')
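To make the forward pass and the loss concrete, here is a plain-NumPy sketch of the same computation. The weights here are freshly random rather than trained, so the resulting numbers are illustrative only; the shapes and the clipped cross-entropy formula match what the TensorFlow graph computes.

```python
import numpy as np

rng = np.random.RandomState(1)
w1 = rng.normal(0, 1, size=(2, 3))   # analogous to tf.random_normal([2, 3])
w2 = rng.normal(0, 1, size=(3, 1))   # analogous to tf.random_normal([3, 1])

x = rng.rand(8, 2)                   # one batch of 8 samples
y_true = (x.sum(axis=1) < 1).astype(float).reshape(-1, 1)  # label rule: b = 1 iff a1 + a2 < 1

a = x @ w1                           # hidden layer, shape (8, 3)
y = a @ w2                           # output layer, shape (8, 1)

# the same clipped cross entropy the graph evaluates
loss = -np.mean(y_true * np.log(np.clip(y, 1e-10, 1.0)))
print(y.shape, loss)
```

Note that the clipping to `[1e-10, 1.0]` is what keeps `log` well-defined even though `y` is a raw linear output rather than a probability.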

Output:

[+] start
[+] after    0 training steps, cross entropy on all data is 0.0442740396
[+] after 1000 training steps, cross entropy on all data is 0.0137241585
[+] after 2000 training steps, cross entropy on all data is 0.0081874840
[+] after 3000 training steps, cross entropy on all data is 0.0054333927
[+] after 4000 training steps, cross entropy on all data is 0.0045696641
[+] after 5000 training steps, cross entropy on all data is 0.0041177678
[+] after 6000 training steps, cross entropy on all data is 0.0035931033
[+] after 7000 training steps, cross entropy on all data is 0.0030437880
[+] after 8000 training steps, cross entropy on all data is 0.0024990581
[+] after 9000 training steps, cross entropy on all data is 0.0019730974
[+] end
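The batch selection in the training loop wraps around the data set with modular arithmetic; the indexing pattern can be checked in isolation:

```python
data_set_size = 256
batch_size = 8

# reproduce the slice indices used by the training loop
batches = []
for i in range(40):  # enough steps to wrap past the end of the data
    start = (i * batch_size) % data_set_size
    end = min(start + batch_size, data_set_size)
    batches.append((start, end))

print(batches[0], batches[31], batches[32])  # → (0, 8) (248, 256) (0, 8)
```

Because 256 is an exact multiple of 8, every batch is full-sized here; the `min(...)` guard only matters when `data_set_size` is not divisible by `batch_size`.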



Reposted from blog.csdn.net/jiangmengya1/article/details/78289581