
Basics of Neural Networks

Author: 原上的小木屋 | Published on 2020-09-21 16:55

    Goals of This Lecture

    Learn how a neural network computes, and use native TensorFlow 2 code to build and train your first neural network model

    1. The mainstream direction of today's AI: connectionism
    2. Forward propagation
    3. Loss functions
    4. Gradient descent
    5. Learning rate
    6. Parameter updates via backpropagation
    7. Common TensorFlow 2 functions

    The Three Schools of Artificial Intelligence

    1. Behaviorism: based on cybernetics; builds perception-action control systems (adaptive control such as balancing, walking, and obstacle avoidance)
    2. Symbolism: based on arithmetic and logic expressions; to solve a problem, first describe it as an expression, then evaluate that expression (problems that can be captured by formulas; implements rational reasoning, e.g. expert systems)
    3. Connectionism: inspired by bionics; models the connections between neurons (neural networks)

    Concrete Steps

    1. Prepare data: collect a large set of feature/label pairs
    2. Build the network: design the neural network structure
    3. Optimize parameters: train the network to obtain the best parameters
    4. Apply the network: save the trained network as a model; given new input data, it outputs classification or prediction results
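
    The demo below applies gradient descent to the simplest possible case: a single parameter. With loss = (w + 1)^2 the gradient is d(loss)/dw = 2(w + 1), so each iteration performs the update w = w - lr * 2(w + 1), driving w toward the minimum at w = -1:
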
    import tensorflow as tf
    
    w = tf.Variable(tf.constant(5, dtype=tf.float32))
    lr = 0.2
    epoch = 40
    
    for epoch in range(epoch):  # top-level loop over the dataset for `epoch` iterations; here the data is just the single parameter w, initialized to 5 via constant, updated for 40 iterations
        with tf.GradientTape() as tape:  # the with block (down to grads) frames the gradient computation
            loss = tf.square(w + 1)
        grads = tape.gradient(loss, w)  # .gradient specifies what to differentiate with respect to what
    
        w.assign_sub(lr * grads)  # .assign_sub subtracts in place: w -= lr*grads, i.e. w = w - lr*grads
        print("After %s epoch,w is %f,loss is %f" % (epoch, w.numpy(), loss))
    
    # initial lr: 0.2 -- try changing the learning rate to 0.001 or 0.999 and watch how convergence changes
    # goal: find the optimal parameter w that minimizes loss, i.e. w = -1
    
    After 0 epoch,w is 2.600000,loss is 36.000000
    After 1 epoch,w is 1.160000,loss is 12.959999
    After 2 epoch,w is 0.296000,loss is 4.665599
    After 3 epoch,w is -0.222400,loss is 1.679616
    After 4 epoch,w is -0.533440,loss is 0.604662
    After 5 epoch,w is -0.720064,loss is 0.217678
    After 6 epoch,w is -0.832038,loss is 0.078364
    After 7 epoch,w is -0.899223,loss is 0.028211
    After 8 epoch,w is -0.939534,loss is 0.010156
    After 9 epoch,w is -0.963720,loss is 0.003656
    After 10 epoch,w is -0.978232,loss is 0.001316
    After 11 epoch,w is -0.986939,loss is 0.000474
    After 12 epoch,w is -0.992164,loss is 0.000171
    After 13 epoch,w is -0.995298,loss is 0.000061
    After 14 epoch,w is -0.997179,loss is 0.000022
    After 15 epoch,w is -0.998307,loss is 0.000008
    After 16 epoch,w is -0.998984,loss is 0.000003
    After 17 epoch,w is -0.999391,loss is 0.000001
    After 18 epoch,w is -0.999634,loss is 0.000000
    After 19 epoch,w is -0.999781,loss is 0.000000
    After 20 epoch,w is -0.999868,loss is 0.000000
    After 21 epoch,w is -0.999921,loss is 0.000000
    After 22 epoch,w is -0.999953,loss is 0.000000
    After 23 epoch,w is -0.999972,loss is 0.000000
    After 24 epoch,w is -0.999983,loss is 0.000000
    After 25 epoch,w is -0.999990,loss is 0.000000
    After 26 epoch,w is -0.999994,loss is 0.000000
    After 27 epoch,w is -0.999996,loss is 0.000000
    After 28 epoch,w is -0.999998,loss is 0.000000
    After 29 epoch,w is -0.999999,loss is 0.000000
    After 30 epoch,w is -0.999999,loss is 0.000000
    After 31 epoch,w is -1.000000,loss is 0.000000
    After 32 epoch,w is -1.000000,loss is 0.000000
    After 33 epoch,w is -1.000000,loss is 0.000000
    After 34 epoch,w is -1.000000,loss is 0.000000
    After 35 epoch,w is -1.000000,loss is 0.000000
    After 36 epoch,w is -1.000000,loss is 0.000000
    After 37 epoch,w is -1.000000,loss is 0.000000
    After 38 epoch,w is -1.000000,loss is 0.000000
    After 39 epoch,w is -1.000000,loss is 0.000000
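
    The comment above suggests experimenting with the learning rate. As a minimal sketch (not part of the original article), the loop can be wrapped to compare several rates; since each update scales (w + 1) by the factor (1 - 2*lr), lr = 0.2 shrinks the error quickly, while 0.001 and 0.999 both converge very slowly:

    import tensorflow as tf
    
    for lr in (0.001, 0.2, 0.999):  # the three rates the comment suggests trying
        w = tf.Variable(5.0)
        for _ in range(40):
            with tf.GradientTape() as tape:
                loss = tf.square(w + 1)
            w.assign_sub(lr * tape.gradient(loss, w))  # (w + 1) scales by (1 - 2*lr) per step
        print("lr=%.3f -> w=%f after 40 epochs" % (lr, w.numpy()))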
    

    During training, a neural network is fed input features paired with their labels. TensorFlow provides a function that pairs features with labels:

    data = tf.data.Dataset.from_tensor_slices((features, labels)). Here is an example:

    features=tf.constant([12,23,10,17])
    labels=tf.constant([0,1,1,0])
    
    dataset=tf.data.Dataset.from_tensor_slices((features,labels))
    print(dataset)
    for element in dataset:
        print(element)
    
    <TensorSliceDataset shapes: ((), ()), types: (tf.int32, tf.int32)>
    (<tf.Tensor: shape=(), dtype=int32, numpy=12>, <tf.Tensor: shape=(), dtype=int32, numpy=0>)
    (<tf.Tensor: shape=(), dtype=int32, numpy=23>, <tf.Tensor: shape=(), dtype=int32, numpy=1>)
    (<tf.Tensor: shape=(), dtype=int32, numpy=10>, <tf.Tensor: shape=(), dtype=int32, numpy=1>)
    (<tf.Tensor: shape=(), dtype=int32, numpy=17>, <tf.Tensor: shape=(), dtype=int32, numpy=0>)
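
    As a minimal sketch (an assumption extending the example above, not from the original), the same paired dataset is usually shuffled and grouped into batches, which is what the iris script later does with .batch(32):

    # reuse features/labels from above; shuffle within a buffer of 4, then group into batches of 2
    batched = tf.data.Dataset.from_tensor_slices((features, labels)).shuffle(4).batch(2)
    for x_batch, y_batch in batched:
        print(x_batch.numpy(), y_batch.numpy())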
    

    Common function: tf.nn.softmax

    y=tf.constant([1.01,2.01,-0.66])
    y_pro=tf.nn.softmax(y)
    print("After softmax,y_pro is:",y_pro)
    
    After softmax,y_pro is: tf.Tensor([0.25598174 0.69583046 0.0481878 ], shape=(3,), dtype=float32)
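
    As a quick check (a sketch, not in the original), softmax(y_i) = exp(y_i) / sum_j exp(y_j), so computing it by hand reproduces y_pro and the outputs sum to 1:

    manual = tf.exp(y) / tf.reduce_sum(tf.exp(y))  # reuses y from above
    print(manual)                  # matches y_pro
    print(tf.reduce_sum(manual))   # 1.0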
    

    Common function: assign_sub

    1. An assignment operation that updates a parameter's value and returns it
    2. Before calling assign_sub, define w with tf.Variable so it is trainable (self-updating)
    3. w.assign_sub(amount to subtract from w)
    w=tf.Variable(4)
    w.assign_sub(1)
    print(w)
    
    <tf.Variable 'Variable:0' shape=() dtype=int32, numpy=3>
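
    A small point worth noting (a sketch, an assumption beyond the article): assign_sub only exists on tf.Variable, and unlike plain subtraction it updates the variable in place:

    v = tf.Variable(4)
    t = v - 1        # plain subtraction returns a new tf.Tensor; v itself is unchanged
    v.assign_sub(1)  # in-place update, which the training loops below rely on
    print(t, v)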
    

    Common function: tf.argmax

    1. Returns the index of the maximum value of a tensor along the specified axis: tf.argmax(tensor, axis=axis)
    import numpy as np
    test=np.array([[1,2,3],[2,3,4],[5,4,3],[8,7,2]])
    print(test)
    print(tf.argmax(test,axis=0))  # returns the index of the max value in each column (along axis 0)
    print(tf.argmax(test,axis=1))  # returns the index of the max value in each row (along axis 1)
    
    [[1 2 3]
     [2 3 4]
     [5 4 3]
     [8 7 2]]
    tf.Tensor([3 3 1], shape=(3,), dtype=int64)
    tf.Tensor([2 2 0 0], shape=(4,), dtype=int64)
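
    In classification this is how predictions are read off (a minimal sketch anticipating the iris script below): each row of softmax output is reduced to the index of its largest probability:

    probs = tf.constant([[0.1, 0.7, 0.2],
                         [0.5, 0.2, 0.3]])
    pred = tf.argmax(probs, axis=1)  # predicted class per row
    print(pred)  # tf.Tensor([1 0], shape=(2,), dtype=int64)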
    

    Iris Classification

    1. Import the dataset
    from sklearn import datasets
    from pandas import DataFrame
    import pandas as pd
    
    x_data = datasets.load_iris().data  # .data returns all input features of the iris dataset
    y_data = datasets.load_iris().target  # .target returns all labels of the iris dataset
    print("x_data from datasets: \n", x_data)
    print("y_data from datasets: \n", y_data)
    
    x_data = DataFrame(x_data, columns=['花萼长度', '花萼宽度', '花瓣长度', '花瓣宽度']) # add a row index (left) and column labels (top); the labels are sepal length/width and petal length/width
    pd.set_option('display.unicode.east_asian_width', True)  # align the East Asian column names
    print("x_data add index: \n", x_data)
    
    x_data['类别'] = y_data  # append a new column labeled '类别' (class), filled with y_data
    print("x_data add a column: \n", x_data)
    
    # when a type or shape is uncertain, print the variable to check it
    
    x_data from datasets: 
     [[5.1 3.5 1.4 0.2]
     [4.9 3.  1.4 0.2]
     [4.7 3.2 1.3 0.2]
     [4.6 3.1 1.5 0.2]
     [5.  3.6 1.4 0.2]
     [5.4 3.9 1.7 0.4]
     [4.6 3.4 1.4 0.3]
     [5.  3.4 1.5 0.2]
     [4.4 2.9 1.4 0.2]
     [4.9 3.1 1.5 0.1]
     [5.4 3.7 1.5 0.2]
     [4.8 3.4 1.6 0.2]
     [4.8 3.  1.4 0.1]
     [4.3 3.  1.1 0.1]
     [5.8 4.  1.2 0.2]
     [5.7 4.4 1.5 0.4]
     [5.4 3.9 1.3 0.4]
     [5.1 3.5 1.4 0.3]
     [5.7 3.8 1.7 0.3]
     [5.1 3.8 1.5 0.3]
     [5.4 3.4 1.7 0.2]
     [5.1 3.7 1.5 0.4]
     [4.6 3.6 1.  0.2]
     [5.1 3.3 1.7 0.5]
     [4.8 3.4 1.9 0.2]
     [5.  3.  1.6 0.2]
     [5.  3.4 1.6 0.4]
     [5.2 3.5 1.5 0.2]
     [5.2 3.4 1.4 0.2]
     [4.7 3.2 1.6 0.2]
     [4.8 3.1 1.6 0.2]
     [5.4 3.4 1.5 0.4]
     [5.2 4.1 1.5 0.1]
     [5.5 4.2 1.4 0.2]
     [4.9 3.1 1.5 0.2]
     [5.  3.2 1.2 0.2]
     [5.5 3.5 1.3 0.2]
     [4.9 3.6 1.4 0.1]
     [4.4 3.  1.3 0.2]
     [5.1 3.4 1.5 0.2]
     [5.  3.5 1.3 0.3]
     [4.5 2.3 1.3 0.3]
     [4.4 3.2 1.3 0.2]
     [5.  3.5 1.6 0.6]
     [5.1 3.8 1.9 0.4]
     [4.8 3.  1.4 0.3]
     [5.1 3.8 1.6 0.2]
     [4.6 3.2 1.4 0.2]
     [5.3 3.7 1.5 0.2]
     [5.  3.3 1.4 0.2]
     [7.  3.2 4.7 1.4]
     [6.4 3.2 4.5 1.5]
     [6.9 3.1 4.9 1.5]
     [5.5 2.3 4.  1.3]
     [6.5 2.8 4.6 1.5]
     [5.7 2.8 4.5 1.3]
     [6.3 3.3 4.7 1.6]
     [4.9 2.4 3.3 1. ]
     [6.6 2.9 4.6 1.3]
     [5.2 2.7 3.9 1.4]
     [5.  2.  3.5 1. ]
     [5.9 3.  4.2 1.5]
     [6.  2.2 4.  1. ]
     [6.1 2.9 4.7 1.4]
     [5.6 2.9 3.6 1.3]
     [6.7 3.1 4.4 1.4]
     [5.6 3.  4.5 1.5]
     [5.8 2.7 4.1 1. ]
     [6.2 2.2 4.5 1.5]
     [5.6 2.5 3.9 1.1]
     [5.9 3.2 4.8 1.8]
     [6.1 2.8 4.  1.3]
     [6.3 2.5 4.9 1.5]
     [6.1 2.8 4.7 1.2]
     [6.4 2.9 4.3 1.3]
     [6.6 3.  4.4 1.4]
     [6.8 2.8 4.8 1.4]
     [6.7 3.  5.  1.7]
     [6.  2.9 4.5 1.5]
     [5.7 2.6 3.5 1. ]
     [5.5 2.4 3.8 1.1]
     [5.5 2.4 3.7 1. ]
     [5.8 2.7 3.9 1.2]
     [6.  2.7 5.1 1.6]
     [5.4 3.  4.5 1.5]
     [6.  3.4 4.5 1.6]
     [6.7 3.1 4.7 1.5]
     [6.3 2.3 4.4 1.3]
     [5.6 3.  4.1 1.3]
     [5.5 2.5 4.  1.3]
     [5.5 2.6 4.4 1.2]
     [6.1 3.  4.6 1.4]
     [5.8 2.6 4.  1.2]
     [5.  2.3 3.3 1. ]
     [5.6 2.7 4.2 1.3]
     [5.7 3.  4.2 1.2]
     [5.7 2.9 4.2 1.3]
     [6.2 2.9 4.3 1.3]
     [5.1 2.5 3.  1.1]
     [5.7 2.8 4.1 1.3]
     [6.3 3.3 6.  2.5]
     [5.8 2.7 5.1 1.9]
     [7.1 3.  5.9 2.1]
     [6.3 2.9 5.6 1.8]
     [6.5 3.  5.8 2.2]
     [7.6 3.  6.6 2.1]
     [4.9 2.5 4.5 1.7]
     [7.3 2.9 6.3 1.8]
     [6.7 2.5 5.8 1.8]
     [7.2 3.6 6.1 2.5]
     [6.5 3.2 5.1 2. ]
     [6.4 2.7 5.3 1.9]
     [6.8 3.  5.5 2.1]
     [5.7 2.5 5.  2. ]
     [5.8 2.8 5.1 2.4]
     [6.4 3.2 5.3 2.3]
     [6.5 3.  5.5 1.8]
     [7.7 3.8 6.7 2.2]
     [7.7 2.6 6.9 2.3]
     [6.  2.2 5.  1.5]
     [6.9 3.2 5.7 2.3]
     [5.6 2.8 4.9 2. ]
     [7.7 2.8 6.7 2. ]
     [6.3 2.7 4.9 1.8]
     [6.7 3.3 5.7 2.1]
     [7.2 3.2 6.  1.8]
     [6.2 2.8 4.8 1.8]
     [6.1 3.  4.9 1.8]
     [6.4 2.8 5.6 2.1]
     [7.2 3.  5.8 1.6]
     [7.4 2.8 6.1 1.9]
     [7.9 3.8 6.4 2. ]
     [6.4 2.8 5.6 2.2]
     [6.3 2.8 5.1 1.5]
     [6.1 2.6 5.6 1.4]
     [7.7 3.  6.1 2.3]
     [6.3 3.4 5.6 2.4]
     [6.4 3.1 5.5 1.8]
     [6.  3.  4.8 1.8]
     [6.9 3.1 5.4 2.1]
     [6.7 3.1 5.6 2.4]
     [6.9 3.1 5.1 2.3]
     [5.8 2.7 5.1 1.9]
     [6.8 3.2 5.9 2.3]
     [6.7 3.3 5.7 2.5]
     [6.7 3.  5.2 2.3]
     [6.3 2.5 5.  1.9]
     [6.5 3.  5.2 2. ]
     [6.2 3.4 5.4 2.3]
     [5.9 3.  5.1 1.8]]
    y_data from datasets: 
     [0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
     0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
     1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 2 2 2 2 2 2 2 2 2 2 2
     2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2
     2 2]
    x_data add index: 
          花萼长度  花萼宽度  花瓣长度  花瓣宽度
    0         5.1       3.5       1.4       0.2
    1         4.9       3.0       1.4       0.2
    2         4.7       3.2       1.3       0.2
    3         4.6       3.1       1.5       0.2
    4         5.0       3.6       1.4       0.2
    ..        ...       ...       ...       ...
    145       6.7       3.0       5.2       2.3
    146       6.3       2.5       5.0       1.9
    147       6.5       3.0       5.2       2.0
    148       6.2       3.4       5.4       2.3
    149       5.9       3.0       5.1       1.8
    
    [150 rows x 4 columns]
    x_data add a column: 
          花萼长度  花萼宽度  花瓣长度  花瓣宽度  类别
    0         5.1       3.5       1.4       0.2     0
    1         4.9       3.0       1.4       0.2     0
    2         4.7       3.2       1.3       0.2     0
    3         4.6       3.1       1.5       0.2     0
    4         5.0       3.6       1.4       0.2     0
    ..        ...       ...       ...       ...   ...
    145       6.7       3.0       5.2       2.3     2
    146       6.3       2.5       5.0       1.9     2
    147       6.5       3.0       5.2       2.0     2
    148       6.2       3.4       5.4       2.3     2
    149       5.9       3.0       5.1       1.8     2
    
    [150 rows x 5 columns]
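
    Before the full script, one helper it relies on: tf.one_hot turns integer class labels into one-hot vectors so they can be compared against softmax outputs (a minimal sketch, not from the original):

    labels = tf.constant([0, 1, 2])
    print(tf.one_hot(labels, depth=3))
    # [[1. 0. 0.]
    #  [0. 1. 0.]
    #  [0. 0. 1.]]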
    
    # -*- coding: UTF-8 -*-
    # Using the iris dataset: forward propagation, backpropagation, and a visualized loss curve
    
    # import the required modules
    import tensorflow as tf
    from sklearn import datasets
    from matplotlib import pyplot as plt
    import numpy as np
    
    # load the data: input features and labels
    x_data = datasets.load_iris().data
    y_data = datasets.load_iris().target
    
    # shuffle the data (the raw data is ordered by class, and leaving it ordered hurts accuracy)
    # seed: an integer random seed; once set, the same random numbers are generated on every run (fixed here so every student gets identical results)
    np.random.seed(116)  # using the same seed keeps input features and labels aligned after shuffling
    np.random.shuffle(x_data)
    np.random.seed(116)
    np.random.shuffle(y_data)
    tf.random.set_seed(116)
    
    # split the shuffled data: the first 120 rows become the training set, the last 30 rows the test set
    x_train = x_data[:-30]
    y_train = y_data[:-30]
    x_test = x_data[-30:]
    y_test = y_data[-30:]
    
    # cast x to float32, otherwise the matrix multiplication below fails on mismatched dtypes
    x_train = tf.cast(x_train, tf.float32)
    x_test = tf.cast(x_test, tf.float32)
    
    # from_tensor_slices pairs input features with labels one-to-one (the data is then split into batches of `batch` samples each)
    train_db = tf.data.Dataset.from_tensor_slices((x_train, y_train)).batch(32)
    test_db = tf.data.Dataset.from_tensor_slices((x_test, y_test)).batch(32)
    
    # generate the network parameters: 4 input features, so the input layer has 4 nodes; 3 classes, so the output layer has 3 neurons
    # tf.Variable() marks the parameters as trainable
    # a fixed seed makes the random initializations reproducible (for teaching, so everyone gets the same results; omit seed in real use)
    w1 = tf.Variable(tf.random.truncated_normal([4, 3], stddev=0.1, seed=1))
    b1 = tf.Variable(tf.random.truncated_normal([3], stddev=0.1, seed=1))
    
    lr = 0.1  # learning rate 0.1
    train_loss_results = []  # record each epoch's loss here to plot the loss curve later
    test_acc = []  # record each epoch's accuracy here to plot the acc curve later
    epoch = 500  # train for 500 epochs
    loss_all = 0  # each epoch runs 4 steps; loss_all accumulates the 4 per-step losses
    
    # training
    for epoch in range(epoch):  # dataset-level loop: one pass over the dataset per epoch
        for step, (x_train, y_train) in enumerate(train_db):  # batch-level loop: one batch per step
            with tf.GradientTape() as tape:  # the with block records gradient information
                y = tf.matmul(x_train, w1) + b1  # the network's multiply-accumulate
                y = tf.nn.softmax(y)  # make the output y a probability distribution (now on the same scale as the one-hot labels, so their difference gives the loss)
                y_ = tf.one_hot(y_train, depth=3)  # convert the labels to one-hot, for computing loss and accuracy
                loss = tf.reduce_mean(tf.square(y_ - y))  # mean-squared-error loss: mse = mean((y_ - y)^2)
                loss_all += loss.numpy()  # accumulate each step's loss so the per-epoch average can be computed later, giving a more accurate loss
            # compute the gradients of loss with respect to each parameter
            grads = tape.gradient(loss, [w1, b1])
    
            # gradient update: w1 = w1 - lr * w1_grad    b = b - lr * b_grad
            w1.assign_sub(lr * grads[0])  # update parameter w1
            b1.assign_sub(lr * grads[1])  # update parameter b1
    
        # print the loss once per epoch
        print("Epoch {}, loss: {}".format(epoch, loss_all/4))
        train_loss_results.append(loss_all / 4)  # record the average of the 4 step losses
        loss_all = 0  # reset loss_all for the next epoch
    
        # testing
        # total_correct counts correctly predicted samples, total_number counts all test samples; initialize both to 0
        total_correct, total_number = 0, 0
        for x_test, y_test in test_db:
            # predict using the updated parameters
            y = tf.matmul(x_test, w1) + b1
            y = tf.nn.softmax(y)
            pred = tf.argmax(y, axis=1)  # the index of the max value in y, i.e. the predicted class
            # cast pred to y_test's dtype
            pred = tf.cast(pred, dtype=y_test.dtype)
            # correct is 1 when the prediction is right, 0 otherwise; cast the bool result to int
            correct = tf.cast(tf.equal(pred, y_test), dtype=tf.int32)
            # sum the correct predictions within this batch
            correct = tf.reduce_sum(correct)
            # accumulate the correct counts across all batches
            total_correct += int(correct)
            # total_number is the total number of test samples, i.e. the number of rows of x_test (shape[0])
            total_number += x_test.shape[0]
        # overall accuracy = total_correct / total_number
        acc = total_correct / total_number
        test_acc.append(acc)
        print("Test_acc:", acc)
        print("--------------------------")
    
    # plot the loss curve
    plt.title('Loss Function Curve')  # figure title
    plt.xlabel('Epoch')  # x-axis label
    plt.ylabel('Loss')  # y-axis label
    plt.plot(train_loss_results, label="$Loss$")  # plot train_loss_results point by point, with legend label Loss
    plt.legend()  # draw the legend
    plt.show()  # display the figure
    
    # plot the accuracy curve
    plt.title('Acc Curve')  # figure title
    plt.xlabel('Epoch')  # x-axis label
    plt.ylabel('Acc')  # y-axis label
    plt.plot(test_acc, label="$Accuracy$")  # plot test_acc point by point, with legend label Accuracy
    plt.legend()
    plt.show()
    
    Epoch 0, loss: 0.2821310982108116
    Test_acc: 0.16666666666666666
    --------------------------
    Epoch 1, loss: 0.25459614396095276
    Test_acc: 0.16666666666666666
    --------------------------
    Epoch 2, loss: 0.22570250555872917
    Test_acc: 0.16666666666666666
    --------------------------
    Epoch 3, loss: 0.21028399839997292
    Test_acc: 0.16666666666666666
    --------------------------
    Epoch 4, loss: 0.19942265003919601
    Test_acc: 0.16666666666666666
    --------------------------
    Epoch 5, loss: 0.18873637542128563
    Test_acc: 0.5
    --------------------------
    Epoch 6, loss: 0.17851298674941063
    Test_acc: 0.5333333333333333
    --------------------------
    Epoch 7, loss: 0.16922875493764877
    Test_acc: 0.5333333333333333
    --------------------------
    Epoch 8, loss: 0.16107673197984695
    Test_acc: 0.5333333333333333
    --------------------------
    Epoch 9, loss: 0.15404684469103813
    Test_acc: 0.5333333333333333
    --------------------------
    Epoch 10, loss: 0.14802725985646248
    Test_acc: 0.5333333333333333
    --------------------------
    Epoch 11, loss: 0.14287303388118744
    Test_acc: 0.5333333333333333
    --------------------------
    Epoch 12, loss: 0.1384414117783308
    Test_acc: 0.5333333333333333
    --------------------------
    Epoch 13, loss: 0.13460607267916203
    Test_acc: 0.5333333333333333
    --------------------------
    Epoch 14, loss: 0.13126072101294994
    Test_acc: 0.5333333333333333
    --------------------------
    Epoch 15, loss: 0.12831821851432323
    Test_acc: 0.5333333333333333
    --------------------------
    Epoch 16, loss: 0.12570794485509396
    Test_acc: 0.5333333333333333
    --------------------------
    Epoch 17, loss: 0.12337299063801765
    Test_acc: 0.5333333333333333
    --------------------------
    Epoch 18, loss: 0.12126746773719788
    Test_acc: 0.5333333333333333
    --------------------------
    Epoch 19, loss: 0.11935433186590672
    Test_acc: 0.5333333333333333
    --------------------------
    Epoch 20, loss: 0.11760355159640312
    Test_acc: 0.5333333333333333
    --------------------------
    Epoch 21, loss: 0.11599068343639374
    Test_acc: 0.5333333333333333
    --------------------------
    Epoch 22, loss: 0.11449568346142769
    Test_acc: 0.5333333333333333
    --------------------------
    Epoch 23, loss: 0.11310207471251488
    Test_acc: 0.5333333333333333
    --------------------------
    Epoch 24, loss: 0.11179621331393719
    Test_acc: 0.5333333333333333
    --------------------------
    Epoch 25, loss: 0.11056671850383282
    Test_acc: 0.5333333333333333
    --------------------------
    Epoch 26, loss: 0.1094040758907795
    Test_acc: 0.5333333333333333
    --------------------------
    Epoch 27, loss: 0.10830028168857098
    Test_acc: 0.5333333333333333
    --------------------------
    Epoch 28, loss: 0.10724855028092861
    Test_acc: 0.5333333333333333
    --------------------------
    Epoch 29, loss: 0.10624313168227673
    Test_acc: 0.5333333333333333
    --------------------------
    Epoch 30, loss: 0.10527909733355045
    Test_acc: 0.5333333333333333
    --------------------------
    Epoch 31, loss: 0.10435222275555134
    Test_acc: 0.5333333333333333
    --------------------------
    Epoch 32, loss: 0.10345886088907719
    Test_acc: 0.5333333333333333
    --------------------------
    Epoch 33, loss: 0.1025958750396967
    Test_acc: 0.5333333333333333
    --------------------------
    Epoch 34, loss: 0.10176052898168564
    Test_acc: 0.5333333333333333
    --------------------------
    Epoch 35, loss: 0.10095042549073696
    Test_acc: 0.5333333333333333
    --------------------------
    Epoch 36, loss: 0.10016347467899323
    Test_acc: 0.5333333333333333
    --------------------------
    Epoch 37, loss: 0.09939785115420818
    Test_acc: 0.5333333333333333
    --------------------------
    Epoch 38, loss: 0.0986519306898117
    Test_acc: 0.5333333333333333
    --------------------------
    Epoch 39, loss: 0.09792428836226463
    Test_acc: 0.5333333333333333
    --------------------------
    Epoch 40, loss: 0.09721365198493004
    Test_acc: 0.5333333333333333
    --------------------------
    Epoch 41, loss: 0.09651889465749264
    Test_acc: 0.5333333333333333
    --------------------------
    Epoch 42, loss: 0.095839012414217
    Test_acc: 0.5333333333333333
    --------------------------
    Epoch 43, loss: 0.09517310559749603
    Test_acc: 0.5333333333333333
    --------------------------
    Epoch 44, loss: 0.09452036954462528
    Test_acc: 0.5333333333333333
    --------------------------
    Epoch 45, loss: 0.0938800759613514
    Test_acc: 0.5333333333333333
    --------------------------
    Epoch 46, loss: 0.09325155802071095
    Test_acc: 0.5333333333333333
    --------------------------
    Epoch 47, loss: 0.09263424575328827
    Test_acc: 0.5333333333333333
    --------------------------
    Epoch 48, loss: 0.09202759340405464
    Test_acc: 0.5333333333333333
    --------------------------
    Epoch 49, loss: 0.09143111854791641
    Test_acc: 0.5333333333333333
    --------------------------
    Epoch 50, loss: 0.09084436483681202
    Test_acc: 0.5666666666666667
    --------------------------
    Epoch 51, loss: 0.09026693925261497
    Test_acc: 0.5666666666666667
    --------------------------
    Epoch 52, loss: 0.08969846554100513
    Test_acc: 0.5666666666666667
    --------------------------
    Epoch 53, loss: 0.08913860842585564
    Test_acc: 0.6
    --------------------------
    Epoch 54, loss: 0.08858705312013626
    Test_acc: 0.6
    --------------------------
    Epoch 55, loss: 0.08804351650178432
    Test_acc: 0.6
    --------------------------
    Epoch 56, loss: 0.08750772662460804
    Test_acc: 0.6
    --------------------------
    Epoch 57, loss: 0.0869794450700283
    Test_acc: 0.6
    --------------------------
    Epoch 58, loss: 0.08645843341946602
    Test_acc: 0.6
    --------------------------
    Epoch 59, loss: 0.08594449236989021
    Test_acc: 0.6
    --------------------------
    Epoch 60, loss: 0.08543741330504417
    Test_acc: 0.6
    --------------------------
    Epoch 61, loss: 0.08493702113628387
    Test_acc: 0.6
    --------------------------
    Epoch 62, loss: 0.08444313704967499
    Test_acc: 0.6333333333333333
    --------------------------
    Epoch 63, loss: 0.08395560085773468
    Test_acc: 0.6333333333333333
    --------------------------
    Epoch 64, loss: 0.08347426541149616
    Test_acc: 0.6333333333333333
    --------------------------
    Epoch 65, loss: 0.08299898356199265
    Test_acc: 0.6333333333333333
    --------------------------
    Epoch 66, loss: 0.08252961002290249
    Test_acc: 0.6333333333333333
    --------------------------
    Epoch 67, loss: 0.08206603676080704
    Test_acc: 0.6333333333333333
    --------------------------
    Epoch 68, loss: 0.08160812966525555
    Test_acc: 0.6333333333333333
    --------------------------
    Epoch 69, loss: 0.08115577884018421
    Test_acc: 0.6333333333333333
    --------------------------
    Epoch 70, loss: 0.08070887811481953
    Test_acc: 0.6333333333333333
    --------------------------
    Epoch 71, loss: 0.08026731014251709
    Test_acc: 0.6333333333333333
    --------------------------
    Epoch 72, loss: 0.07983098737895489
    Test_acc: 0.6666666666666666
    --------------------------
    Epoch 73, loss: 0.07939981110394001
    Test_acc: 0.6666666666666666
    --------------------------
    Epoch 74, loss: 0.0789736956357956
    Test_acc: 0.6666666666666666
    --------------------------
    Epoch 75, loss: 0.07855254411697388
    Test_acc: 0.7
    --------------------------
    Epoch 76, loss: 0.078136270865798
    Test_acc: 0.7
    --------------------------
    Epoch 77, loss: 0.07772480882704258
    Test_acc: 0.7
    --------------------------
    Epoch 78, loss: 0.07731806114315987
    Test_acc: 0.7
    --------------------------
    Epoch 79, loss: 0.07691597007215023
    Test_acc: 0.7
    --------------------------
    Epoch 80, loss: 0.07651844993233681
    Test_acc: 0.7
    --------------------------
    Epoch 81, loss: 0.07612543925642967
    Test_acc: 0.7333333333333333
    --------------------------
    Epoch 82, loss: 0.07573685422539711
    Test_acc: 0.7333333333333333
    --------------------------
    Epoch 83, loss: 0.07535265013575554
    Test_acc: 0.7333333333333333
    --------------------------
    Epoch 84, loss: 0.07497274503111839
    Test_acc: 0.7333333333333333
    --------------------------
    Epoch 85, loss: 0.07459708210080862
    Test_acc: 0.7666666666666667
    --------------------------
    Epoch 86, loss: 0.07422559149563313
    Test_acc: 0.7666666666666667
    --------------------------
    Epoch 87, loss: 0.0738582294434309
    Test_acc: 0.7666666666666667
    --------------------------
    Epoch 88, loss: 0.0734949205070734
    Test_acc: 0.7666666666666667
    --------------------------
    Epoch 89, loss: 0.0731356143951416
    Test_acc: 0.7666666666666667
    --------------------------
    Epoch 90, loss: 0.07278026826679707
    Test_acc: 0.7666666666666667
    --------------------------
    Epoch 91, loss: 0.07242879550904036
    Test_acc: 0.7666666666666667
    --------------------------
    Epoch 92, loss: 0.07208117935806513
    Test_acc: 0.7666666666666667
    --------------------------
    Epoch 93, loss: 0.0717373350635171
    Test_acc: 0.8
    --------------------------
    Epoch 94, loss: 0.07139723561704159
    Test_acc: 0.8
    --------------------------
    Epoch 95, loss: 0.07106081955134869
    Test_acc: 0.8
    --------------------------
    Epoch 96, loss: 0.07072804030030966
    Test_acc: 0.8
    --------------------------
    Epoch 97, loss: 0.07039883825927973
    Test_acc: 0.8
    --------------------------
    Epoch 98, loss: 0.07007318455725908
    Test_acc: 0.8333333333333334
    --------------------------
    Epoch 99, loss: 0.06975101493299007
    Test_acc: 0.8666666666666667
    --------------------------
    Epoch 100, loss: 0.06943228747695684
    Test_acc: 0.8666666666666667
    --------------------------
    Epoch 101, loss: 0.06911696959286928
    Test_acc: 0.8666666666666667
    --------------------------
    Epoch 102, loss: 0.06880500074476004
    Test_acc: 0.8666666666666667
    --------------------------
    Epoch 103, loss: 0.06849635019898415
    Test_acc: 0.8666666666666667
    --------------------------
    Epoch 104, loss: 0.06819096114486456
    Test_acc: 0.8666666666666667
    --------------------------
    Epoch 105, loss: 0.06788879726082087
    Test_acc: 0.8666666666666667
    --------------------------
    Epoch 106, loss: 0.06758982129395008
    Test_acc: 0.8666666666666667
    --------------------------
    Epoch 107, loss: 0.0672939820215106
    Test_acc: 0.9
    --------------------------
    Epoch 108, loss: 0.06700124498456717
    Test_acc: 0.9
    --------------------------
    Epoch 109, loss: 0.06671156641095877
    Test_acc: 0.9
    --------------------------
    Epoch 110, loss: 0.0664249137043953
    Test_acc: 0.9
    --------------------------
    Epoch 111, loss: 0.06614123564213514
    Test_acc: 0.9
    --------------------------
    Epoch 112, loss: 0.06586050614714622
    Test_acc: 0.9
    --------------------------
    Epoch 113, loss: 0.06558268330991268
    Test_acc: 0.9
    --------------------------
    Epoch 114, loss: 0.06530773546546698
    Test_acc: 0.9
    --------------------------
    Epoch 115, loss: 0.06503560114651918
    Test_acc: 0.9
    --------------------------
    Epoch 116, loss: 0.06476627010852098
    Test_acc: 0.9
    --------------------------
    Epoch 117, loss: 0.0644997013732791
    Test_acc: 0.9333333333333333
    --------------------------
    Epoch 118, loss: 0.06423585209995508
    Test_acc: 0.9333333333333333
    --------------------------
    Epoch 119, loss: 0.06397469434887171
    Test_acc: 0.9333333333333333
    --------------------------
    Epoch 120, loss: 0.06371619366109371
    Test_acc: 0.9333333333333333
    --------------------------
    Epoch 121, loss: 0.06346031092107296
    Test_acc: 0.9333333333333333
    --------------------------
    Epoch 122, loss: 0.06320701260119677
    Test_acc: 0.9333333333333333
    --------------------------
    Epoch 123, loss: 0.06295627076178789
    Test_acc: 0.9333333333333333
    --------------------------
    Epoch 124, loss: 0.06270804442465305
    Test_acc: 0.9333333333333333
    --------------------------
    Epoch 125, loss: 0.062462314032018185
    Test_acc: 0.9333333333333333
    --------------------------
    Epoch 126, loss: 0.06221904046833515
    Test_acc: 0.9333333333333333
    --------------------------
    Epoch 127, loss: 0.061978189274668694
    Test_acc: 0.9333333333333333
    --------------------------
    Epoch 128, loss: 0.06173973251134157
    Test_acc: 0.9333333333333333
    --------------------------
    Epoch 129, loss: 0.06150364316999912
    Test_acc: 0.9333333333333333
    --------------------------
    Epoch 130, loss: 0.06126988120377064
    Test_acc: 0.9333333333333333
    --------------------------
    Epoch 131, loss: 0.06103844102472067
    Test_acc: 0.9333333333333333
    --------------------------
    Epoch 132, loss: 0.06080926302820444
    Test_acc: 0.9333333333333333
    --------------------------
    Epoch 133, loss: 0.06058233417570591
    Test_acc: 0.9333333333333333
    --------------------------
    Epoch 134, loss: 0.06035762373358011
    Test_acc: 0.9333333333333333
    --------------------------
    Epoch 135, loss: 0.06013510935008526
    Test_acc: 0.9333333333333333
    --------------------------
    Epoch 136, loss: 0.05991474911570549
    Test_acc: 0.9333333333333333
    --------------------------
    Epoch 137, loss: 0.05969652906060219
    Test_acc: 0.9333333333333333
    --------------------------
    Epoch 138, loss: 0.059480417519807816
    Test_acc: 0.9333333333333333
    --------------------------
    Epoch 139, loss: 0.05926638934761286
    Test_acc: 0.9333333333333333
    --------------------------
    Epoch 140, loss: 0.059054410085082054
    Test_acc: 0.9333333333333333
    --------------------------
    Epoch 141, loss: 0.058844463899731636
    Test_acc: 0.9333333333333333
    --------------------------
    Epoch 142, loss: 0.058636522851884365
    Test_acc: 0.9333333333333333
    --------------------------
    Epoch 143, loss: 0.058430569246411324
    Test_acc: 0.9333333333333333
    --------------------------
    Epoch 144, loss: 0.058226557448506355
    Test_acc: 0.9333333333333333
    --------------------------
    Epoch 145, loss: 0.05802448280155659
    Test_acc: 0.9333333333333333
    --------------------------
    Epoch 146, loss: 0.05782430898398161
    Test_acc: 0.9333333333333333
    --------------------------
    Epoch 147, loss: 0.057626026682555676
    Test_acc: 0.9333333333333333
    --------------------------
    Epoch 148, loss: 0.05742959212511778
    Test_acc: 0.9333333333333333
    --------------------------
    Epoch 149, loss: 0.0572349950671196
    Test_acc: 0.9333333333333333
    --------------------------
    Epoch 150, loss: 0.05704221874475479
    Test_acc: 0.9333333333333333
    --------------------------
    Epoch 151, loss: 0.056851218454539776
    Test_acc: 0.9333333333333333
    --------------------------
    Epoch 152, loss: 0.05666198953986168
    Test_acc: 0.9333333333333333
    --------------------------
    Epoch 153, loss: 0.05647451616823673
    Test_acc: 0.9333333333333333
    --------------------------
    Epoch 154, loss: 0.05628875829279423
    Test_acc: 0.9333333333333333
    --------------------------
    Epoch 155, loss: 0.056104714050889015
    Test_acc: 0.9333333333333333
    --------------------------
    Epoch 156, loss: 0.05592233967036009
    Test_acc: 0.9333333333333333
    --------------------------
    Epoch 157, loss: 0.055741630494594574
    Test_acc: 0.9333333333333333
    --------------------------
    Epoch 158, loss: 0.05556255951523781
    Test_acc: 0.9333333333333333
    --------------------------
    Epoch 159, loss: 0.055385113693773746
    Test_acc: 0.9333333333333333
    --------------------------
    Epoch 160, loss: 0.05520926974713802
    Test_acc: 0.9333333333333333
    --------------------------
    Epoch 161, loss: 0.05503500811755657
    Test_acc: 0.9333333333333333
    --------------------------
    Epoch 162, loss: 0.054862307384610176
    Test_acc: 0.9333333333333333
    --------------------------
    Epoch 163, loss: 0.054691143333911896
    Test_acc: 0.9333333333333333
    --------------------------
    Epoch 164, loss: 0.05452151130884886
    Test_acc: 0.9666666666666667
    --------------------------
    Epoch 165, loss: 0.05435337871313095
    Test_acc: 0.9666666666666667
    --------------------------
    Epoch 166, loss: 0.05418673623353243
    Test_acc: 0.9666666666666667
    --------------------------
    Epoch 167, loss: 0.05402155593037605
    Test_acc: 0.9666666666666667
    --------------------------
    Epoch 168, loss: 0.053857834078371525
    Test_acc: 0.9666666666666667
    --------------------------
    Epoch 169, loss: 0.05369555111974478
    Test_acc: 0.9666666666666667
    --------------------------
    Epoch 170, loss: 0.0535346744582057
    Test_acc: 0.9666666666666667
    --------------------------
    Epoch 171, loss: 0.05337520223110914
    Test_acc: 0.9666666666666667
    --------------------------
    Epoch 172, loss: 0.053217110224068165
    Test_acc: 0.9666666666666667
    --------------------------
    Epoch 173, loss: 0.053060383535921574
    Test_acc: 0.9666666666666667
    --------------------------
    Epoch 174, loss: 0.05290501844137907
    Test_acc: 0.9666666666666667
    --------------------------
    Epoch 175, loss: 0.05275098141282797
    Test_acc: 0.9666666666666667
    --------------------------
    Epoch 176, loss: 0.052598257549107075
    Test_acc: 0.9666666666666667
    --------------------------
    Epoch 177, loss: 0.05244683753699064
    Test_acc: 0.9666666666666667
    --------------------------
    Epoch 178, loss: 0.05229670647531748
    Test_acc: 0.9666666666666667
    --------------------------
    Epoch 179, loss: 0.05214785132557154
    Test_acc: 0.9666666666666667
    --------------------------
    Epoch 180, loss: 0.052000245079398155
    Test_acc: 0.9666666666666667
    --------------------------
    Epoch 181, loss: 0.05185388308018446
    Test_acc: 0.9666666666666667
    --------------------------
    Epoch 182, loss: 0.05170875042676926
    Test_acc: 0.9666666666666667
    --------------------------
    Epoch 183, loss: 0.051564828492701054
    Test_acc: 0.9666666666666667
    --------------------------
    Epoch 184, loss: 0.05142210703343153
    Test_acc: 0.9666666666666667
    --------------------------
    Epoch 185, loss: 0.051280577667057514
    Test_acc: 1.0
    --------------------------
    Epoch 186, loss: 0.05114021338522434
    Test_acc: 1.0
    --------------------------
    Epoch 187, loss: 0.05100100859999657
    Test_acc: 1.0
    --------------------------
    Epoch 188, loss: 0.05086294375360012
    Test_acc: 1.0
    --------------------------
    Epoch 189, loss: 0.05072600208222866
    Test_acc: 1.0
    --------------------------
    Epoch 190, loss: 0.05059019569307566
    Test_acc: 1.0
    --------------------------
    Epoch 191, loss: 0.050455485470592976
    Test_acc: 1.0
    --------------------------
    Epoch 192, loss: 0.05032186862081289
    Test_acc: 1.0
    --------------------------
    Epoch 193, loss: 0.05018934141844511
    Test_acc: 1.0
    --------------------------
    Epoch 194, loss: 0.050057862885296345
    Test_acc: 1.0
    --------------------------
    Epoch 195, loss: 0.04992745164781809
    Test_acc: 1.0
    --------------------------
    Epoch 196, loss: 0.04979807883501053
    Test_acc: 1.0
    --------------------------
    Epoch 197, loss: 0.04966974351555109
    Test_acc: 1.0
    --------------------------
    Epoch 198, loss: 0.04954242613166571
    Test_acc: 1.0
    --------------------------
    Epoch 199, loss: 0.049416118301451206
    Test_acc: 1.0
    --------------------------
    Epoch 200, loss: 0.04929080791771412
    Test_acc: 1.0
    --------------------------
    Epoch 201, loss: 0.04916647635400295
    Test_acc: 1.0
    --------------------------
    Epoch 202, loss: 0.04904312081634998
    Test_acc: 1.0
    --------------------------
    Epoch 203, loss: 0.048920733854174614
    Test_acc: 1.0
    --------------------------
    Epoch 204, loss: 0.04879929218441248
    Test_acc: 1.0
    --------------------------
    Epoch 205, loss: 0.04867880139499903
    Test_acc: 1.0
    --------------------------
    Epoch 206, loss: 0.048559242859482765
    Test_acc: 1.0
    --------------------------
    Epoch 207, loss: 0.04844059329479933
    Test_acc: 1.0
    --------------------------
    Epoch 208, loss: 0.04832287225872278
    Test_acc: 1.0
    --------------------------
    Epoch 209, loss: 0.04820604436099529
    Test_acc: 1.0
    --------------------------
    Epoch 210, loss: 0.04809010773897171
    Test_acc: 1.0
    --------------------------
    Epoch 211, loss: 0.04797505680471659
    Test_acc: 1.0
    --------------------------
    Epoch 212, loss: 0.04786087851971388
    Test_acc: 1.0
    --------------------------
    Epoch 213, loss: 0.047747558914124966
    Test_acc: 1.0
    --------------------------
    Epoch 214, loss: 0.0476350886747241
    Test_acc: 1.0
    --------------------------
    Epoch 215, loss: 0.04752346873283386
    Test_acc: 1.0
    --------------------------
    Epoch 216, loss: 0.0474126823246479
    Test_acc: 1.0
    --------------------------
    Epoch 217, loss: 0.047302715480327606
    Test_acc: 1.0
    --------------------------
    Epoch 218, loss: 0.047193581238389015
    Test_acc: 1.0
    --------------------------
    Epoch 219, loss: 0.047085246071219444
    Test_acc: 1.0
    --------------------------
    Epoch 220, loss: 0.04697771091014147
    Test_acc: 1.0
    --------------------------
    Epoch 221, loss: 0.04687096457928419
    Test_acc: 1.0
    --------------------------
    Epoch 222, loss: 0.04676500242203474
    Test_acc: 1.0
    --------------------------
    Epoch 223, loss: 0.046659817919135094
    Test_acc: 1.0
    --------------------------
    Epoch 224, loss: 0.046555391512811184
    Test_acc: 1.0
    --------------------------
    Epoch 225, loss: 0.04645172692835331
    Test_acc: 1.0
    --------------------------
    Epoch 226, loss: 0.046348823234438896
    Test_acc: 1.0
    --------------------------
    Epoch 227, loss: 0.04624664504081011
    Test_acc: 1.0
    --------------------------
    Epoch 228, loss: 0.04614521004259586
    Test_acc: 1.0
    --------------------------
    Epoch 229, loss: 0.04604450147598982
    Test_acc: 1.0
    --------------------------
    Epoch 230, loss: 0.0459445109590888
    Test_acc: 1.0
    --------------------------
    Epoch 231, loss: 0.045845236629247665
    Test_acc: 1.0
    --------------------------
    Epoch 232, loss: 0.045746659860014915
    Test_acc: 1.0
    --------------------------
    Epoch 233, loss: 0.0456487787887454
    Test_acc: 1.0
    --------------------------
    Epoch 234, loss: 0.04555159900337458
    Test_acc: 1.0
    --------------------------
    Epoch 235, loss: 0.0454550925642252
    Test_acc: 1.0
    --------------------------
    Epoch 236, loss: 0.04535926412791014
    Test_acc: 1.0
    --------------------------
    Epoch 237, loss: 0.0452641062438488
    Test_acc: 1.0
    --------------------------
    Epoch 238, loss: 0.04516960587352514
    Test_acc: 1.0
    --------------------------
    Epoch 239, loss: 0.04507576208561659
    Test_acc: 1.0
    --------------------------
    Epoch 240, loss: 0.044982570223510265
    Test_acc: 1.0
    --------------------------
    Epoch 241, loss: 0.04489002004265785
    Test_acc: 1.0
    --------------------------
    Epoch 242, loss: 0.04479810409247875
    Test_acc: 1.0
    --------------------------
    Epoch 243, loss: 0.044706812128424644
    Test_acc: 1.0
    --------------------------
    Epoch 244, loss: 0.044616157189011574
    Test_acc: 1.0
    --------------------------
    Epoch 245, loss: 0.04452610295265913
    Test_acc: 1.0
    --------------------------
    Epoch 246, loss: 0.04443667083978653
    Test_acc: 1.0
    --------------------------
    Epoch 247, loss: 0.044347843155264854
    Test_acc: 1.0
    --------------------------
    Epoch 248, loss: 0.04425960686057806
    Test_acc: 1.0
    --------------------------
    Epoch 249, loss: 0.04417196847498417
    Test_acc: 1.0
    --------------------------
    Epoch 250, loss: 0.044084908440709114
    Test_acc: 1.0
    --------------------------
    Epoch 251, loss: 0.04399843607097864
    Test_acc: 1.0
    --------------------------
    Epoch 252, loss: 0.043912540189921856
    Test_acc: 1.0
    --------------------------
    Epoch 253, loss: 0.043827205896377563
    Test_acc: 1.0
    --------------------------
    Epoch 254, loss: 0.043742443434894085
    Test_acc: 1.0
    --------------------------
    Epoch 255, loss: 0.043658241629600525
    Test_acc: 1.0
    --------------------------
    Epoch 256, loss: 0.043574584648013115
    Test_acc: 1.0
    --------------------------
    Epoch 257, loss: 0.0434914818033576
    Test_acc: 1.0
    --------------------------
    Epoch 258, loss: 0.04340892285108566
    Test_acc: 1.0
    --------------------------
    Epoch 259, loss: 0.04332689754664898
    Test_acc: 1.0
    --------------------------
    Epoch 260, loss: 0.0432454077526927
    Test_acc: 1.0
    --------------------------
    Epoch 261, loss: 0.04316444229334593
    Test_acc: 1.0
    --------------------------
    Epoch 262, loss: 0.04308399744331837
    Test_acc: 1.0
    --------------------------
    Epoch 263, loss: 0.04300406388938427
    Test_acc: 1.0
    --------------------------
    Epoch 264, loss: 0.04292464908212423
    Test_acc: 1.0
    --------------------------
    Epoch 265, loss: 0.04284573905169964
    Test_acc: 1.0
    --------------------------
    Epoch 266, loss: 0.04276733938604593
    Test_acc: 1.0
    --------------------------
    Epoch 267, loss: 0.042689429596066475
    Test_acc: 1.0
    --------------------------
    Epoch 268, loss: 0.042612007819116116
    Test_acc: 1.0
    --------------------------
    Epoch 269, loss: 0.042535084299743176
    Test_acc: 1.0
    --------------------------
    Epoch 270, loss: 0.042458645068109035
    Test_acc: 1.0
    --------------------------
    Epoch 271, loss: 0.042382679879665375
    Test_acc: 1.0
    --------------------------
    Epoch 272, loss: 0.042307195253670216
    Test_acc: 1.0
    --------------------------
    Epoch 273, loss: 0.04223217163234949
    Test_acc: 1.0
    --------------------------
    Epoch 274, loss: 0.04215762298554182
    Test_acc: 1.0
    --------------------------
    Epoch 275, loss: 0.04208352975547314
    Test_acc: 1.0
    --------------------------
    Epoch 276, loss: 0.04200989659875631
    Test_acc: 1.0
    --------------------------
    Epoch 277, loss: 0.041936714202165604
    Test_acc: 1.0
    --------------------------
    Epoch 278, loss: 0.04186398349702358
    Test_acc: 1.0
    --------------------------
    Epoch 279, loss: 0.04179169703274965
    Test_acc: 1.0
    --------------------------
    Epoch 280, loss: 0.041719851084053516
    Test_acc: 1.0
    --------------------------
    Epoch 281, loss: 0.04164844751358032
    Test_acc: 1.0
    --------------------------
    Epoch 282, loss: 0.04157747142016888
    Test_acc: 1.0
    --------------------------
    Epoch 283, loss: 0.041506921872496605
    Test_acc: 1.0
    --------------------------
    Epoch 284, loss: 0.04143680538982153
    Test_acc: 1.0
    --------------------------
    Epoch 285, loss: 0.04136710148304701
    Test_acc: 1.0
    --------------------------
    Epoch 286, loss: 0.04129781946539879
    Test_acc: 1.0
    --------------------------
    Epoch 287, loss: 0.04122895281761885
    Test_acc: 1.0
    --------------------------
    Epoch 288, loss: 0.04116049408912659
    Test_acc: 1.0
    --------------------------
    Epoch 289, loss: 0.041092448867857456
    Test_acc: 1.0
    --------------------------
    Epoch 290, loss: 0.04102479573339224
    Test_acc: 1.0
    --------------------------
    Epoch 291, loss: 0.040957554243505
    Test_acc: 1.0
    --------------------------
    Epoch 292, loss: 0.040890700183808804
    Test_acc: 1.0
    --------------------------
    Epoch 293, loss: 0.04082423262298107
    Test_acc: 1.0
    --------------------------
    Epoch 294, loss: 0.040758166462183
    Test_acc: 1.0
    --------------------------
    Epoch 295, loss: 0.04069248400628567
    Test_acc: 1.0
    --------------------------
    Epoch 296, loss: 0.04062717594206333
    Test_acc: 1.0
    --------------------------
    Epoch 297, loss: 0.040562245063483715
    Test_acc: 1.0
    --------------------------
    Epoch 298, loss: 0.040497698821127415
    Test_acc: 1.0
    --------------------------
    Epoch 299, loss: 0.04043352138251066
    Test_acc: 1.0
    --------------------------
    Epoch 300, loss: 0.04036971274763346
    Test_acc: 1.0
    --------------------------
    Epoch 301, loss: 0.0403062729164958
    Test_acc: 1.0
    --------------------------
    Epoch 302, loss: 0.04024319164454937
    Test_acc: 1.0
    --------------------------
    Epoch 303, loss: 0.04018046706914902
    Test_acc: 1.0
    --------------------------
    Epoch 304, loss: 0.0401181080378592
    Test_acc: 1.0
    --------------------------
    Epoch 305, loss: 0.040056094992905855
    Test_acc: 1.0
    --------------------------
    Epoch 306, loss: 0.039994434453547
    Test_acc: 1.0
    --------------------------
    Epoch 307, loss: 0.03993312316015363
    Test_acc: 1.0
    --------------------------
    Epoch 308, loss: 0.039872157853096724
    Test_acc: 1.0
    --------------------------
    Epoch 309, loss: 0.039811530616134405
    Test_acc: 1.0
    --------------------------
    Epoch 310, loss: 0.03975124517455697
    Test_acc: 1.0
    --------------------------
    Epoch 311, loss: 0.039691298734396696
    Test_acc: 1.0
    --------------------------
    Epoch 312, loss: 0.03963167453184724
    Test_acc: 1.0
    --------------------------
    Epoch 313, loss: 0.039572388399392366
    Test_acc: 1.0
    --------------------------
    Epoch 314, loss: 0.03951342450454831
    Test_acc: 1.0
    --------------------------
    Epoch 315, loss: 0.03945478983223438
    Test_acc: 1.0
    --------------------------
    Epoch 316, loss: 0.03939648298546672
    Test_acc: 1.0
    --------------------------
    Epoch 317, loss: 0.03933848673477769
    Test_acc: 1.0
    --------------------------
    Epoch 318, loss: 0.03928081365302205
    Test_acc: 1.0
    --------------------------
    Epoch 319, loss: 0.03922344697639346
    Test_acc: 1.0
    --------------------------
    Epoch 320, loss: 0.039166401606053114
    Test_acc: 1.0
    --------------------------
    Epoch 321, loss: 0.03910966124385595
    Test_acc: 1.0
    --------------------------
    Epoch 322, loss: 0.039053223095834255
    Test_acc: 1.0
    --------------------------
    Epoch 323, loss: 0.03899709461256862
    Test_acc: 1.0
    --------------------------
    Epoch 324, loss: 0.038941262755542994
    Test_acc: 1.0
    --------------------------
    Epoch 325, loss: 0.03888573916628957
    Test_acc: 1.0
    --------------------------
    Epoch 326, loss: 0.0388304959051311
    Test_acc: 1.0
    --------------------------
    Epoch 327, loss: 0.03877556184306741
    Test_acc: 1.0
    --------------------------
    Epoch 328, loss: 0.03872091509401798
    Test_acc: 1.0
    --------------------------
    Epoch 329, loss: 0.0386665565893054
    Test_acc: 1.0
    --------------------------
    Epoch 330, loss: 0.038612480740994215
    Test_acc: 1.0
    --------------------------
    Epoch 331, loss: 0.03855870105326176
    Test_acc: 1.0
    --------------------------
    Epoch 332, loss: 0.03850520076230168
    Test_acc: 1.0
    --------------------------
    Epoch 333, loss: 0.0384519724175334
    Test_acc: 1.0
    --------------------------
    Epoch 334, loss: 0.038399036042392254
    Test_acc: 1.0
    --------------------------
    Epoch 335, loss: 0.03834636928513646
    Test_acc: 1.0
    --------------------------
    Epoch 336, loss: 0.03829398099333048
    Test_acc: 1.0
    --------------------------
    Epoch 337, loss: 0.03824185347184539
    Test_acc: 1.0
    --------------------------
    Epoch 338, loss: 0.038189999759197235
    Test_acc: 1.0
    --------------------------
    Epoch 339, loss: 0.038138418924063444
    Test_acc: 1.0
    --------------------------
    Epoch 340, loss: 0.03808710351586342
    Test_acc: 1.0
    --------------------------
    Epoch 341, loss: 0.03803604608401656
    Test_acc: 1.0
    --------------------------
    Epoch 342, loss: 0.03798525454476476
    Test_acc: 1.0
    --------------------------
    Epoch 343, loss: 0.037934715393930674
    Test_acc: 1.0
    --------------------------
    Epoch 344, loss: 0.037884439807385206
    Test_acc: 1.0
    --------------------------
    Epoch 345, loss: 0.03783441847190261
    Test_acc: 1.0
    --------------------------
    Epoch 346, loss: 0.03778465138748288
    Test_acc: 1.0
    --------------------------
    Epoch 347, loss: 0.03773513715714216
    Test_acc: 1.0
    --------------------------
    Epoch 348, loss: 0.03768586413934827
    Test_acc: 1.0
    --------------------------
    Epoch 349, loss: 0.037636840250343084
    Test_acc: 1.0
    --------------------------
    Epoch 350, loss: 0.037588066421449184
    Test_acc: 1.0
    --------------------------
    Epoch 351, loss: 0.03753953380510211
    Test_acc: 1.0
    --------------------------
    Epoch 352, loss: 0.03749124752357602
    Test_acc: 1.0
    --------------------------
    Epoch 353, loss: 0.03744319686666131
    Test_acc: 1.0
    --------------------------
    Epoch 354, loss: 0.03739538788795471
    Test_acc: 1.0
    --------------------------
    Epoch 355, loss: 0.03734781965613365
    Test_acc: 1.0
    --------------------------
    Epoch 356, loss: 0.03730047354474664
    Test_acc: 1.0
    --------------------------
    Epoch 357, loss: 0.037253367248922586
    Test_acc: 1.0
    --------------------------
    Epoch 358, loss: 0.037206494715064764
    Test_acc: 1.0
    --------------------------
    Epoch 359, loss: 0.03715984430164099
    Test_acc: 1.0
    --------------------------
    Epoch 360, loss: 0.03711343323811889
    Test_acc: 1.0
    --------------------------
    Epoch 361, loss: 0.03706724150106311
    Test_acc: 1.0
    --------------------------
    Epoch 362, loss: 0.037021270021796227
    Test_acc: 1.0
    --------------------------
    Epoch 363, loss: 0.03697552578523755
    Test_acc: 1.0
    --------------------------
    Epoch 364, loss: 0.036930006463080645
    Test_acc: 1.0
    --------------------------
    Epoch 365, loss: 0.03688469948247075
    Test_acc: 1.0
    --------------------------
    Epoch 366, loss: 0.03683961136266589
    Test_acc: 1.0
    --------------------------
    Epoch 367, loss: 0.03679474908858538
    Test_acc: 1.0
    --------------------------
    Epoch 368, loss: 0.036750094033777714
    Test_acc: 1.0
    --------------------------
    Epoch 369, loss: 0.03670564666390419
    Test_acc: 1.0
    --------------------------
    Epoch 370, loss: 0.0366614181548357
    Test_acc: 1.0
    --------------------------
    Epoch 371, loss: 0.036617396865040064
    Test_acc: 1.0
    --------------------------
    Epoch 372, loss: 0.03657358791679144
    Test_acc: 1.0
    --------------------------
    Epoch 373, loss: 0.03652997827157378
    Test_acc: 1.0
    --------------------------
    Epoch 374, loss: 0.03648658888414502
    Test_acc: 1.0
    --------------------------
    Epoch 375, loss: 0.03644338669255376
    Test_acc: 1.0
    --------------------------
    Epoch 376, loss: 0.036400395911186934
    Test_acc: 1.0
    --------------------------
    Epoch 377, loss: 0.036357597913593054
    Test_acc: 1.0
    --------------------------
    Epoch 378, loss: 0.03631501505151391
    Test_acc: 1.0
    --------------------------
    Epoch 379, loss: 0.036272619385272264
    Test_acc: 1.0
    --------------------------
    Epoch 380, loss: 0.036230423487722874
    Test_acc: 1.0
    --------------------------
    Epoch 381, loss: 0.03618842409923673
    Test_acc: 1.0
    --------------------------
    Epoch 382, loss: 0.0361466147005558
    Test_acc: 1.0
    --------------------------
    Epoch 383, loss: 0.036105004604905844
    Test_acc: 1.0
    --------------------------
    Epoch 384, loss: 0.0360635737888515
    Test_acc: 1.0
    --------------------------
    Epoch 385, loss: 0.036022352520376444
    Test_acc: 1.0
    --------------------------
    Epoch 386, loss: 0.035981301218271255
    Test_acc: 1.0
    --------------------------
    Epoch 387, loss: 0.03594044363126159
    Test_acc: 1.0
    --------------------------
    Epoch 388, loss: 0.035899773240089417
    Test_acc: 1.0
    --------------------------
    Epoch 389, loss: 0.03585928678512573
    Test_acc: 1.0
    --------------------------
    Epoch 390, loss: 0.03581898845732212
    Test_acc: 1.0
    --------------------------
    Epoch 391, loss: 0.035778870806097984
    Test_acc: 1.0
    --------------------------
    Epoch 392, loss: 0.03573893290013075
    Test_acc: 1.0
    --------------------------
    Epoch 393, loss: 0.035699174739420414
    Test_acc: 1.0
    --------------------------
    Epoch 394, loss: 0.03565959073603153
    Test_acc: 1.0
    --------------------------
    Epoch 395, loss: 0.035620191134512424
    Test_acc: 1.0
    --------------------------
    Epoch 396, loss: 0.035580961499363184
    Test_acc: 1.0
    --------------------------
    Epoch 397, loss: 0.03554190881550312
    Test_acc: 1.0
    --------------------------
    Epoch 398, loss: 0.035503033082932234
    Test_acc: 1.0
    --------------------------
    Epoch 399, loss: 0.035464323591440916
    Test_acc: 1.0
    --------------------------
    Epoch 400, loss: 0.035425792913883924
    Test_acc: 1.0
    --------------------------
    Epoch 401, loss: 0.03538742754608393
    Test_acc: 1.0
    --------------------------
    Epoch 402, loss: 0.0353492246940732
    Test_acc: 1.0
    --------------------------
    Epoch 403, loss: 0.03531120577827096
    Test_acc: 1.0
    --------------------------
    Epoch 404, loss: 0.03527334099635482
    Test_acc: 1.0
    --------------------------
    Epoch 405, loss: 0.03523564524948597
    Test_acc: 1.0
    --------------------------
    Epoch 406, loss: 0.03519811574369669
    Test_acc: 1.0
    --------------------------
    Epoch 407, loss: 0.03516074130311608
    Test_acc: 1.0
    --------------------------
    Epoch 408, loss: 0.03512354148551822
    Test_acc: 1.0
    --------------------------
    Epoch 409, loss: 0.035086498130112886
    Test_acc: 1.0
    --------------------------
    Epoch 410, loss: 0.03504961263388395
    Test_acc: 1.0
    --------------------------
    Epoch 411, loss: 0.03501288965344429
    Test_acc: 1.0
    --------------------------
    Epoch 412, loss: 0.03497632220387459
    Test_acc: 1.0
    --------------------------
    Epoch 413, loss: 0.03493991028517485
    Test_acc: 1.0
    --------------------------
    Epoch 414, loss: 0.0349036636762321
    Test_acc: 1.0
    --------------------------
    Epoch 415, loss: 0.03486755769699812
    Test_acc: 1.0
    --------------------------
    Epoch 416, loss: 0.03483161563053727
    Test_acc: 1.0
    --------------------------
    Epoch 417, loss: 0.0347958211787045
    Test_acc: 1.0
    --------------------------
    Epoch 418, loss: 0.03476017527282238
    Test_acc: 1.0
    --------------------------
    Epoch 419, loss: 0.034724689088761806
    Test_acc: 1.0
    --------------------------
    Epoch 420, loss: 0.03468934912234545
    Test_acc: 1.0
    --------------------------
    Epoch 421, loss: 0.034654161892831326
    Test_acc: 1.0
    --------------------------
    Epoch 422, loss: 0.03461911762133241
    Test_acc: 1.0
    --------------------------
    Epoch 423, loss: 0.034584223292768
    Test_acc: 1.0
    --------------------------
    Epoch 424, loss: 0.03454947005957365
    Test_acc: 1.0
    --------------------------
    Epoch 425, loss: 0.0345148635096848
    Test_acc: 1.0
    --------------------------
    Epoch 426, loss: 0.03448040969669819
    Test_acc: 1.0
    --------------------------
    Epoch 427, loss: 0.03444609045982361
    Test_acc: 1.0
    --------------------------
    Epoch 428, loss: 0.034411918837577105
    Test_acc: 1.0
    --------------------------
    Epoch 429, loss: 0.03437788691371679
    Test_acc: 1.0
    --------------------------
    Epoch 430, loss: 0.0343439974822104
    Test_acc: 1.0
    --------------------------
    Epoch 431, loss: 0.03431024681776762
    Test_acc: 1.0
    --------------------------
    Epoch 432, loss: 0.034276628866791725
    Test_acc: 1.0
    --------------------------
    Epoch 433, loss: 0.03424314875155687
    Test_acc: 1.0
    --------------------------
    Epoch 434, loss: 0.03420981951057911
    Test_acc: 1.0
    --------------------------
    Epoch 435, loss: 0.03417660994455218
    Test_acc: 1.0
    --------------------------
    Epoch 436, loss: 0.0341435419395566
    Test_acc: 1.0
    --------------------------
    Epoch 437, loss: 0.034110613632947206
    Test_acc: 1.0
    --------------------------
    Epoch 438, loss: 0.03407781198620796
    Test_acc: 1.0
    --------------------------
    Epoch 439, loss: 0.03404514491558075
    Test_acc: 1.0
    --------------------------
    Epoch 440, loss: 0.03401260497048497
    Test_acc: 1.0
    --------------------------
    Epoch 441, loss: 0.033980210311710835
    Test_acc: 1.0
    --------------------------
    Epoch 442, loss: 0.03394793486222625
    Test_acc: 1.0
    --------------------------
    Epoch 443, loss: 0.03391578374430537
    Test_acc: 1.0
    --------------------------
    Epoch 444, loss: 0.03388377372175455
    Test_acc: 1.0
    --------------------------
    Epoch 445, loss: 0.033851887565106153
    Test_acc: 1.0
    --------------------------
    Epoch 446, loss: 0.033820133190602064
    Test_acc: 1.0
    --------------------------
    Epoch 447, loss: 0.033788496162742376
    Test_acc: 1.0
    --------------------------
    Epoch 448, loss: 0.033756986260414124
    Test_acc: 1.0
    --------------------------
    Epoch 449, loss: 0.03372561139985919
    Test_acc: 1.0
    --------------------------
    Epoch 450, loss: 0.03369434690102935
    Test_acc: 1.0
    --------------------------
    Epoch 451, loss: 0.03366320999339223
    Test_acc: 1.0
    --------------------------
    Epoch 452, loss: 0.03363220253959298
    Test_acc: 1.0
    --------------------------
    Epoch 453, loss: 0.03360130963847041
    Test_acc: 1.0
    --------------------------
    Epoch 454, loss: 0.03357054013758898
    Test_acc: 1.0
    --------------------------
    Epoch 455, loss: 0.03353989636525512
    Test_acc: 1.0
    --------------------------
    Epoch 456, loss: 0.03350935876369476
    Test_acc: 1.0
    --------------------------
    Epoch 457, loss: 0.03347895201295614
    Test_acc: 1.0
    --------------------------
    Epoch 458, loss: 0.03344865143299103
    Test_acc: 1.0
    --------------------------
    Epoch 459, loss: 0.03341847471892834
    Test_acc: 1.0
    --------------------------
    Epoch 460, loss: 0.03338842140510678
    Test_acc: 1.0
    --------------------------
    Epoch 461, loss: 0.03335847705602646
    Test_acc: 1.0
    --------------------------
    Epoch 462, loss: 0.03332865331321955
    Test_acc: 1.0
    --------------------------
    Epoch 463, loss: 0.033298940397799015
    Test_acc: 1.0
    --------------------------
    Epoch 464, loss: 0.03326933644711971
    Test_acc: 1.0
    --------------------------
    Epoch 465, loss: 0.03323985682800412
    Test_acc: 1.0
    --------------------------
    Epoch 466, loss: 0.03321048431098461
    Test_acc: 1.0
    --------------------------
    Epoch 467, loss: 0.03318122588098049
    Test_acc: 1.0
    --------------------------
    Epoch 468, loss: 0.0331520726904273
    Test_acc: 1.0
    --------------------------
    Epoch 469, loss: 0.03312302753329277
    Test_acc: 1.0
    --------------------------
    Epoch 470, loss: 0.03309410251677036
    Test_acc: 1.0
    --------------------------
    Epoch 471, loss: 0.03306529065594077
    Test_acc: 1.0
    --------------------------
    Epoch 472, loss: 0.0330365770496428
    Test_acc: 1.0
    --------------------------
    Epoch 473, loss: 0.03300797380506992
    Test_acc: 1.0
    --------------------------
    Epoch 474, loss: 0.03297948185354471
    Test_acc: 1.0
    --------------------------
    Epoch 475, loss: 0.03295108396559954
    Test_acc: 1.0
    --------------------------
    Epoch 476, loss: 0.03292280435562134
    Test_acc: 1.0
    --------------------------
    Epoch 477, loss: 0.032894630916416645
    Test_acc: 1.0
    --------------------------
    Epoch 478, loss: 0.03286655806005001
    Test_acc: 1.0
    --------------------------
    Epoch 479, loss: 0.03283859323710203
    Test_acc: 1.0
    --------------------------
    Epoch 480, loss: 0.0328107294626534
    Test_acc: 1.0
    --------------------------
    Epoch 481, loss: 0.03278296673670411
    Test_acc: 1.0
    --------------------------
    Epoch 482, loss: 0.03275530692189932
    Test_acc: 1.0
    --------------------------
    Epoch 483, loss: 0.03272775374352932
    Test_acc: 1.0
    --------------------------
    Epoch 484, loss: 0.032700295094400644
    Test_acc: 1.0
    --------------------------
    Epoch 485, loss: 0.032672947738319635
    Test_acc: 1.0
    --------------------------
    Epoch 486, loss: 0.03264568746089935
    Test_acc: 1.0
    --------------------------
    Epoch 487, loss: 0.03261852730065584
    Test_acc: 1.0
    --------------------------
    Epoch 488, loss: 0.032591485884040594
    Test_acc: 1.0
    --------------------------
    Epoch 489, loss: 0.032564534805715084
    Test_acc: 1.0
    --------------------------
    Epoch 490, loss: 0.032537671737372875
    Test_acc: 1.0
    --------------------------
    Epoch 491, loss: 0.03251090971753001
    Test_acc: 1.0
    --------------------------
    Epoch 492, loss: 0.03248425014317036
    Test_acc: 1.0
    --------------------------
    Epoch 493, loss: 0.03245767951011658
    Test_acc: 1.0
    --------------------------
    Epoch 494, loss: 0.032431216444820166
    Test_acc: 1.0
    --------------------------
    Epoch 495, loss: 0.032404834404587746
    Test_acc: 1.0
    --------------------------
    Epoch 496, loss: 0.03237855713814497
    Test_acc: 1.0
    --------------------------
    Epoch 497, loss: 0.03235237207263708
    Test_acc: 1.0
    --------------------------
    Epoch 498, loss: 0.03232626663520932
    Test_acc: 1.0
    --------------------------
    Epoch 499, loss: 0.032300274819135666
    Test_acc: 1.0
    --------------------------
    
    [Figures: Loss Function Curve and Acc Curve]
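
    One design note (a sketch under stated assumptions, not the article's code): the script trains with mean-squared error on one-hot labels, but classification is more commonly trained with categorical cross-entropy. Only the loss line inside GradientTape would change, e.g.:

    import tensorflow as tf
    
    y_ = tf.one_hot(tf.constant([0, 1]), depth=3)      # one-hot labels, as in the script
    y = tf.nn.softmax(tf.constant([[2.0, 0.5, 0.1],
                                   [0.3, 1.8, 0.4]]))  # softmax outputs
    loss = tf.reduce_mean(tf.keras.losses.categorical_crossentropy(y_, y))
    print(loss)  # scalar cross-entropy loss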
