Debt Default Prediction, Part 4: Prediction with an Artificial Neural Network

Author: 游遍星辰99 | Published 2017-07-24 15:10

    If you are not yet familiar with artificial neural networks, you may want to read an earlier post of mine that introduces them.

    %matplotlib inline
    import pandas as pd
    import numpy as np 
    import tensorflow as tf
    import matplotlib.pyplot as plt
    
    pd.set_option('display.float_format', lambda x: '%.5f' % x)  # show plain decimals rather than scientific notation
    pd.options.display.max_rows = 15  # display at most 15 rows
    import warnings
    warnings.filterwarnings('ignore')  # suppress warnings to keep the output tidy
    
    # Read in and clean the data
    df = pd.read_csv('cs-training.csv')
    df = df.drop(df.columns[0], axis=1)
    df = df[df.age >= 18]
    df = df.dropna()  # to keep things simple, drop every record that contains missing values
    
    
    
    # The three past-due count columns use 96/98 as anomalous codes; reset those to 0
    df.loc[(df['NumberOfTime30-59DaysPastDueNotWorse']==98) | (df['NumberOfTime30-59DaysPastDueNotWorse']==96),'NumberOfTime30-59DaysPastDueNotWorse']=0
    df.loc[(df['NumberOfTime60-89DaysPastDueNotWorse']==98) | (df['NumberOfTime60-89DaysPastDueNotWorse']==96),'NumberOfTime60-89DaysPastDueNotWorse']=0
    df.loc[(df['NumberOfTimes90DaysLate']==98) | (df['NumberOfTimes90DaysLate']==96),'NumberOfTimes90DaysLate']=0
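
    Dropping every row with a missing value is convenient but not free. A quick illustrative check of what it costs (my own addition, not part of the original run; in this dataset most missing values come from MonthlyIncome):

    # Illustrative: count missing values per column before dropna()
    print(pd.read_csv('cs-training.csv').isnull().sum())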
    
    df = df.sample(frac=1).reset_index(drop=True)  # shuffle the rows
    
    p = 0.8
    train = df.iloc[:int(df.shape[0] * p), :]  # first 80% of the rows as the training set
    test  = df.iloc[int(df.shape[0] * p):, :]  # remaining 20% as the test set
    Y = train.iloc[:, :1].values  # an n*1 array; without .values this would be a DataFrame
    X = train.iloc[:, 1:].values  # an n*10 array
    Y_t = test.iloc[:, :1].values  # labels of the test set
    X_t = test.iloc[:, 1:].values  # features of the test set
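
    Before training, it is worth checking how unbalanced the labels are; as the complementary 0.07/0.93 accuracies further down suggest, only about 7% of the records are defaults. A quick check:

    print(train['SeriousDlqin2yrs'].mean())  # fraction of defaulters in the training set
    print(test['SeriousDlqin2yrs'].mean())   # should be similar after shuffling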
    
    train.columns
    
    Index(['SeriousDlqin2yrs', 'RevolvingUtilizationOfUnsecuredLines', 'age',
           'NumberOfTime30-59DaysPastDueNotWorse', 'DebtRatio', 'MonthlyIncome',
           'NumberOfOpenCreditLinesAndLoans', 'NumberOfTimes90DaysLate',
           'NumberRealEstateLoansOrLines', 'NumberOfTime60-89DaysPastDueNotWorse',
           'NumberOfDependents'],
          dtype='object')
    
    Y_t
    
    array([[0],
           [0],
           [0],
           ..., 
           [0],
           [0],
           [0]], dtype=int64)
    
    def accuracy(preds, labels):
        return ((preds > 0.5) == labels).sum() / float(len(labels))
    # (preds > 0.5) yields a boolean array whose elements indicate whether the
    # corresponding entry of preds exceeds 0.5
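
    A quick sanity check of accuracy() on toy arrays (an illustration of mine, not part of the original run):

    preds = np.array([[0.9], [0.2], [0.6], [0.4]])
    labels = np.array([[1], [0], [0], [1]])
    print(accuracy(preds, labels))  # 2 of 4 thresholded predictions match -> 0.5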
    
    # Neural network with no hidden layer; in_size is the number of features
    def neural_network(in_size):
        tf.reset_default_graph()
        w1 = tf.Variable(tf.random_normal([in_size, 1], stddev=1, seed=1))  # w1 is an in_size*1 matrix, i.e. a column vector
        b1 = tf.Variable(tf.random_normal([1], stddev=1, seed=1))
        x = tf.placeholder(tf.float32, shape=(None, in_size), name="x-input")
        y_ = tf.placeholder(tf.float32, shape=(None, 1), name='y-input')
        
        raw_output = tf.add(tf.matmul(x, w1), b1)
        y = tf.sigmoid(raw_output)
        # numerically stable sigmoid cross-entropy computed from the logits
        cross_entropy = tf.reduce_mean(
            tf.nn.sigmoid_cross_entropy_with_logits(logits=raw_output, labels=y_))
        # Define the training step
        train_step = tf.train.AdamOptimizer(0.01).minimize(cross_entropy)
        costs=[]
        with tf.Session() as sess:
            init_op = tf.global_variables_initializer()
            sess.run(init_op)
    
        # Print the current (untrained) parameter values
            print("w1:", sess.run(w1))
            print("\n")
    
        # Train the model
            STEPS = 10000
            for i in range(STEPS):
                sess.run(train_step, feed_dict={x: X, y_: Y})
                total_cross_entropy = sess.run(cross_entropy, feed_dict={x: X, y_: Y})
                costs.append(total_cross_entropy)            
                if i % 500 == 0:
                    train_output = sess.run(y, feed_dict={x: X, y_: Y})
                    train_accuracy = accuracy(train_output, Y)
                    test_output = sess.run(y, feed_dict={x: X_t, y_: Y_t})
                    test_accuracy = accuracy(test_output, Y_t)
                    print("After %d training step(s), cross entropy on all data is "
                          "%3f, Train accuracy is %.2f, Test accuracy is %.2f" % (
                              i, total_cross_entropy, train_accuracy, test_accuracy))
    
        # Print the parameter values after training
            print("w1:", sess.run(w1))
            
            plt.plot(costs)
    
    neural_network(10)
    
    w1: [[-0.81131822]
     [ 1.48459876]
     [ 0.06532937]
     [-2.4427042 ]
     [ 0.0992484 ]
     [ 0.59122431]
     [ 0.59282297]
     [-2.12292957]
     [-0.72289723]
     [-0.05627038]]
    
    
    After 0 training step(s), cross entropy on all data is 633.568542, Train accuracy is 0.08, Test accuracy is 0.09
    After 500 training step(s), cross entropy on all data is 3.982690, Train accuracy is 0.93, Test accuracy is 0.93
    After 1000 training step(s), cross entropy on all data is 0.821444, Train accuracy is 0.93, Test accuracy is 0.92
    After 1500 training step(s), cross entropy on all data is 0.267407, Train accuracy is 0.93, Test accuracy is 0.93
    After 2000 training step(s), cross entropy on all data is 0.235337, Train accuracy is 0.93, Test accuracy is 0.93
    After 2500 training step(s), cross entropy on all data is 4.634377, Train accuracy is 0.93, Test accuracy is 0.93
    After 3000 training step(s), cross entropy on all data is 2.143549, Train accuracy is 0.29, Test accuracy is 0.29
    After 3500 training step(s), cross entropy on all data is 0.269957, Train accuracy is 0.93, Test accuracy is 0.93
    After 4000 training step(s), cross entropy on all data is 0.234392, Train accuracy is 0.93, Test accuracy is 0.93
    After 4500 training step(s), cross entropy on all data is 18.266743, Train accuracy is 0.09, Test accuracy is 0.09
    After 5000 training step(s), cross entropy on all data is 1.353009, Train accuracy is 0.93, Test accuracy is 0.93
    After 5500 training step(s), cross entropy on all data is 0.236580, Train accuracy is 0.93, Test accuracy is 0.93
    After 6000 training step(s), cross entropy on all data is 0.238814, Train accuracy is 0.93, Test accuracy is 0.93
    After 6500 training step(s), cross entropy on all data is 2.617790, Train accuracy is 0.93, Test accuracy is 0.93
    After 7000 training step(s), cross entropy on all data is 0.243092, Train accuracy is 0.93, Test accuracy is 0.93
    After 7500 training step(s), cross entropy on all data is 0.232212, Train accuracy is 0.93, Test accuracy is 0.93
    After 8000 training step(s), cross entropy on all data is 3.796988, Train accuracy is 0.93, Test accuracy is 0.93
    After 8500 training step(s), cross entropy on all data is 0.236019, Train accuracy is 0.93, Test accuracy is 0.93
    After 9000 training step(s), cross entropy on all data is 0.240095, Train accuracy is 0.93, Test accuracy is 0.93
    After 9500 training step(s), cross entropy on all data is 3.831031, Train accuracy is 0.93, Test accuracy is 0.93
    w1: [[ -4.11262177e-03]
     [ -4.57651988e-02]
     [  9.24635947e-01]
     [ -2.20867994e-04]
     [ -6.49655558e-05]
     [ -1.39010092e-02]
     [  1.16549110e+00]
     [  1.92463741e-01]
     [ -2.05861378e+00]
     [  2.05512360e-01]]
    
    [Figure: cross-entropy cost over training steps for the no-hidden-layer network]
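
    One plausible reason for the spikes in the loss curve above is that the features live on very different scales (MonthlyIncome runs into the thousands while the ratio features sit near 0-1), so a single learning rate overshoots on some weights. A minimal sketch of standardizing the inputs with training-set statistics (not applied in the runs shown here):

    # Standardize features with the training set's mean/std; reuse the same
    # statistics on the test set so no test information leaks in.
    mu = X.mean(axis=0)
    sigma = X.std(axis=0) + 1e-8  # guard against zero-variance columns
    X_scaled = (X - mu) / sigma
    X_t_scaled = (X_t - mu) / sigma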

    A single-hidden-layer neural network

    # in_size is the number of features; hidden_size is the number of hidden-layer neurons
    def hidden_neural(in_size, hidden_size):
        tf.reset_default_graph()
        w1= tf.Variable(tf.random_normal([in_size,hidden_size], stddev=1, seed=1))
        w2= tf.Variable(tf.random_normal([hidden_size, 1], stddev=1, seed=1))
        b1 = tf.Variable(tf.random_normal([1], stddev=1, seed=1))
        b2 = tf.Variable(tf.random_normal([1], stddev=1, seed=1))
        x = tf.placeholder(tf.float32, shape=(None, in_size), name="x-input")
        y_= tf.placeholder(tf.float32, shape=(None, 1), name='y-input')
        
        a = tf.nn.tanh(tf.matmul(x, w1) + b1)  # tanh activation in the hidden layer
        nn_rawoutput = tf.matmul(a, w2) + b2
        y = tf.sigmoid(nn_rawoutput)  # sigmoid activation at the output layer
    
        cross_entropy = tf.reduce_mean(
            tf.nn.sigmoid_cross_entropy_with_logits(logits=nn_rawoutput, labels=y_)
        )
        # BUG: the next line overwrites the stable loss above with an incomplete
        # cross-entropy (the (1-y_)*log(1-y) term is missing), so minimizing it
        # just pushes every prediction toward 1 -- which is why accuracy collapses
        # to the positive-class rate (~0.07) in the run below
        cross_entropy = -tf.reduce_sum(y_ * tf.log(y))
        train_step = tf.train.AdamOptimizer(0.01).minimize(cross_entropy)
        costs=[]
        with tf.Session() as sess:
            init_op = tf.global_variables_initializer()
            sess.run(init_op)
    
        # Print the current (untrained) parameter values
            print("w1:", sess.run(w1))
            print("w2:", sess.run(w2))
            print("\n")
        # Train the model
            STEPS = 10000
            for i in range(STEPS):
                sess.run(train_step, feed_dict={x: X, y_: Y})
                total_cross_entropy = sess.run(cross_entropy, feed_dict={x: X, y_: Y})
                costs.append(total_cross_entropy)    
                if i % 500 == 0:
                    train_output = sess.run(y, feed_dict={x: X, y_: Y})
                    train_accuracy = accuracy(train_output, Y)
                    test_output = sess.run(y, feed_dict={x: X_t, y_: Y_t})
                    test_accuracy = accuracy(test_output, Y_t)
                    print("After %d training step(s), cross entropy on all data is "
                          "%3f, Train accuracy is %.2f, Test accuracy is %.2f" % (
                              i, total_cross_entropy, train_accuracy, test_accuracy))
    
        # Print the parameter values after training
            print("w1:", sess.run(w1))
            print("w2:",  sess.run(w2))
            plt.plot(costs)
    
    hidden_neural(10,15)
    
    w1: [[-0.81131822  1.48459876  0.06532937 -2.4427042   0.0992484   0.59122431
       0.59282297 -2.12292957 -0.72289723 -0.05627038  0.64354479 -0.26432407
       1.85663319  0.56784171 -0.38283589]
     [-1.48534334  1.26177108 -0.02530608 -0.26462969  1.53281379 -1.74297714
      -0.43789294 -0.56601     0.32066926  1.13283098 -2.27825713  0.48281202
      -1.31270874  0.35685033 -1.73028338]
     [-0.04016773  0.8996619  -1.38058913  1.48146236 -0.2454948  -0.73264718
      -0.19589645  0.07170801  0.63298088 -1.57119071  1.32938123 -1.17336702
       0.0315446   0.47705248  0.43694198]
     [-0.31680891 -0.45075032 -1.80606568  0.12489964 -0.7706542  -0.74624157
      -0.28195325 -1.95881546 -0.33761069  1.03019834  1.51340175  0.22515805
      -0.28566208  0.26882544  1.74621105]
     [ 0.92387104 -2.05909967 -0.31438306  1.21033823  0.694803   -1.06554997
       0.01364011 -1.06771255 -0.18407504 -2.20562339  1.82905924  1.24319017
      -0.33655512 -0.04000888 -0.33585522]
     [-0.30744898 -0.76692969 -0.28710833 -0.29470286 -0.8099063  -1.31590188
       0.37532416  0.17755835 -2.05828643  0.40742677 -1.00723302  0.29265752
       0.5163359   1.48094654  0.10440207]
     [-2.41602898 -0.6054818   0.04622507 -0.66815251 -0.40330869  0.70722419
      -1.79007626  0.36240223 -2.76690722  1.98079216  0.15743099  0.52636039
      -2.2120235   0.4751119  -0.45400357]
     [-0.06919777  0.68693012 -0.12727311  1.19293666  1.14117563 -1.69935191
      -1.32542193  0.53915238 -0.01135946 -0.87633544  0.95902258 -1.37203288
       0.28185159 -0.70281965  0.28490511]
     [ 0.33679315 -0.00496036  0.83482873  0.38735345 -0.26227593 -0.29129392
       0.43719748  1.38576829  0.19425733 -0.0610277   0.93549377  1.95708442
       0.2652913  -0.86742806 -0.02628282]
     [-0.1174508  -0.69646132 -0.22447547 -0.96103233  0.10064895  0.05909608
      -0.00711272 -1.07845175  0.93755829 -1.70846844 -1.32631063 -0.5413081
       0.64900833  0.02555733 -0.31969535]]
    w2: [[-0.81131822]
     [ 1.48459876]
     [ 0.06532937]
     [-2.4427042 ]
     [ 0.0992484 ]
     [ 0.59122431]
     [ 0.59282297]
     [-2.12292957]
     [-0.72289723]
     [-0.05627038]
     [ 0.64354479]
     [-0.26432407]
     [ 1.85663319]
     [ 0.56784171]
     [-0.38283589]]
    
    
    After 0 training step(s), cross entropy on all data is 27412.820312, Train accuracy is 0.93, Test accuracy is 0.92
    After 500 training step(s), cross entropy on all data is 24.879093, Train accuracy is 0.07, Test accuracy is 0.07
    After 1000 training step(s), cross entropy on all data is 10.726508, Train accuracy is 0.07, Test accuracy is 0.07
    After 1500 training step(s), cross entropy on all data is 6.479894, Train accuracy is 0.07, Test accuracy is 0.07
    After 2000 training step(s), cross entropy on all data is 4.310365, Train accuracy is 0.07, Test accuracy is 0.07
    After 2500 training step(s), cross entropy on all data is 2.943927, Train accuracy is 0.07, Test accuracy is 0.07
    After 3000 training step(s), cross entropy on all data is 2.023192, Train accuracy is 0.07, Test accuracy is 0.07
    After 3500 training step(s), cross entropy on all data is 1.398951, Train accuracy is 0.07, Test accuracy is 0.07
    After 4000 training step(s), cross entropy on all data is 0.979343, Train accuracy is 0.07, Test accuracy is 0.07
    After 4500 training step(s), cross entropy on all data is 0.696757, Train accuracy is 0.07, Test accuracy is 0.07
    After 5000 training step(s), cross entropy on all data is 0.503118, Train accuracy is 0.07, Test accuracy is 0.07
    After 5500 training step(s), cross entropy on all data is 0.368979, Train accuracy is 0.07, Test accuracy is 0.07
    After 6000 training step(s), cross entropy on all data is 0.274314, Train accuracy is 0.07, Test accuracy is 0.07
    After 6500 training step(s), cross entropy on all data is 0.206327, Train accuracy is 0.07, Test accuracy is 0.07
    After 7000 training step(s), cross entropy on all data is 0.156172, Train accuracy is 0.07, Test accuracy is 0.07
    After 7500 training step(s), cross entropy on all data is 0.118520, Train accuracy is 0.07, Test accuracy is 0.07
    After 8000 training step(s), cross entropy on all data is 0.090877, Train accuracy is 0.07, Test accuracy is 0.07
    After 8500 training step(s), cross entropy on all data is 0.069971, Train accuracy is 0.07, Test accuracy is 0.07
    After 9000 training step(s), cross entropy on all data is 0.053840, Train accuracy is 0.07, Test accuracy is 0.07
    After 9500 training step(s), cross entropy on all data is 0.041645, Train accuracy is 0.07, Test accuracy is 0.07
    w1: [[-0.81131822  1.74178159 -0.08909524 -2.65602541  0.29985178  0.59122431
       0.7024107  -2.12292957 -1.11616302 -0.05627038  0.64354479 -0.26432407
       1.85663319  0.67520195 -0.38283589]
     [-1.48534334  1.43142033 -0.15568708 -0.45996109  1.80822337 -1.74297714
      -0.32841063 -0.56601    -0.11948925  1.13283098 -2.27825713  0.48281202
      -1.31270874  0.46525508 -1.73028338]
     [-0.04016773  0.8996619  -1.38058913  1.36484158 -0.2454948  -0.73264718
       0.18841466  0.07170801  0.63298088 -1.57119071  1.32938123 -1.17336702
       0.0315446   0.60673642  0.43694198]
     [-0.31680891 -0.25157601 -1.79870474 -0.07272269 -0.48378247 -0.74624157
      -0.1626913  -1.95881546 -0.46097663  1.03019834  1.51340175  0.22515805
      -0.28566208  0.39043701  1.74621105]
     [ 0.92387104 -1.99463725 -0.48141563  1.10448539  0.82830316 -1.06554997
       0.10411295 -1.06771255 -0.64607352 -2.20562339  1.82905924  1.24319017
      -0.33655512  0.05482186 -0.33585522]
     [-0.30744898 -0.62716478 -0.35888094 -0.50358832 -0.53620595 -1.31590188
       0.46029875  0.17755835 -2.29272509  0.40742677 -1.00723302  0.29265752
       0.5163359   1.58985686  0.10440207]
     [-2.41602898 -0.6054818  -0.01284296 -0.85056406 -0.40330869  0.70722419
      -1.40090346  0.36240223 -3.11578465  1.98079216  0.15743099  0.52636039
      -2.2120235   0.60482615 -0.45400357]
     [-0.06919777  0.68693012 -0.12727311  1.12997842  1.14117563 -1.69935191
      -1.25104678  0.53915238 -0.01135946 -0.87633544  0.95902258 -1.37203288
       0.28185159 -0.59929645  0.28490511]
     [ 0.33679315  0.60589051  0.84839994  0.25937754 -0.26227593 -0.29129392
       0.82340252  1.38576829  0.19425733 -0.0610277   0.93549377  1.95708442
       0.2652913  -0.73811334 -0.02628282]
     [-0.1174508  -0.53300107 -0.31493971 -1.07677209  0.33373576  0.05909608
       0.08408035 -1.07845175  0.64108926 -1.70846844 -1.32631063 -0.5413081
       0.64900833  0.12541851 -0.31969535]]
    w2: [[-0.31261337]
     [ 0.75431836]
     [-1.27567613]
     [-1.34072971]
     [ 0.60311234]
     [-0.74970371]
     [ 1.09538651]
     [-2.8592217 ]
     [-2.06396842]
     [-0.55497575]
     [ 1.98081636]
     [ 1.07660115]
     [ 0.51562256]
     [ 1.77408111]
     [-1.49910092]]
    
    [Figure: cost over training steps for the single-hidden-layer network]

    training_epochs=10
    batch_size=2048
    n_samples=X.shape[0]
    learning_rate = 0.005
    
    # in_size is the number of features; hidden_size is the number of hidden-layer neurons
    def hidden_neural_batch(in_size, hidden_size):
        tf.reset_default_graph()
        w1= tf.Variable(tf.random_normal([in_size,hidden_size], stddev=1, seed=1))
        w2= tf.Variable(tf.random_normal([hidden_size, 1], stddev=1, seed=1))
        b1 = tf.Variable(tf.random_normal([1], stddev=1, seed=1))
        b2 = tf.Variable(tf.random_normal([1], stddev=1, seed=1))
        x = tf.placeholder(tf.float32, shape=(None, in_size), name="x-input")
        y_= tf.placeholder(tf.float32, shape=(None, 1), name='y-input')
        
        a = tf.nn.tanh(tf.matmul(x, w1) + b1)  # tanh activation in the hidden layer
        nn_rawoutput = tf.matmul(a, w2) + b2
        y = tf.sigmoid(nn_rawoutput)  # sigmoid activation at the output layer
    
        cross_entropy = tf.reduce_mean(
            tf.nn.sigmoid_cross_entropy_with_logits(logits=nn_rawoutput, labels=y_)
        )
    
        train_step = tf.train.AdamOptimizer(0.01).minimize(cross_entropy)
        costs=[]
        with tf.Session() as sess:
            init_op = tf.global_variables_initializer()
            sess.run(init_op)
    
        # Print the current (untrained) parameter values
            print("w1:", sess.run(w1))
            print("w2:", sess.run(w2))
            print("\n")
        # Train the model (the epoch count is set by training_epochs above)
            for i in range(training_epochs):
                for batch in range(int(n_samples/batch_size)):
                    batch_x = X[batch*batch_size : (1+batch)*batch_size]
                    batch_y = Y[batch*batch_size : (1+batch)*batch_size]
                    sess.run(train_step, feed_dict={x: batch_x, y_: batch_y})
                    total_cross_entropy = sess.run(cross_entropy, feed_dict={x: batch_x, y_: batch_y})
                    costs.append(total_cross_entropy)    
            if i % 1 == 0:  # report after every epoch
                    train_output = sess.run(y, feed_dict={x: X, y_: Y})
                    train_accuracy = accuracy(train_output, Y)
                    test_output = sess.run(y, feed_dict={x: X_t, y_: Y_t})
                    test_accuracy = accuracy(test_output, Y_t)
                    print("After %d training step(s), cross entropy on all data is "
                          "%3f, Train accuracy is %.2f, Test accuracy is %.2f" % (
                              i, total_cross_entropy, train_accuracy, test_accuracy))
    
        # Print the parameter values after training
            print("w1:", sess.run(w1))
            print("w2:",  sess.run(w2))
            plt.plot(costs)
    
    hidden_neural_batch(10,15)
    
    w1: [[-0.81131822  1.48459876  0.06532937 -2.4427042   0.0992484   0.59122431
       0.59282297 -2.12292957 -0.72289723 -0.05627038  0.64354479 -0.26432407
       1.85663319  0.56784171 -0.38283589]
     [-1.48534334  1.26177108 -0.02530608 -0.26462969  1.53281379 -1.74297714
      -0.43789294 -0.56601     0.32066926  1.13283098 -2.27825713  0.48281202
      -1.31270874  0.35685033 -1.73028338]
     [-0.04016773  0.8996619  -1.38058913  1.48146236 -0.2454948  -0.73264718
      -0.19589645  0.07170801  0.63298088 -1.57119071  1.32938123 -1.17336702
       0.0315446   0.47705248  0.43694198]
     [-0.31680891 -0.45075032 -1.80606568  0.12489964 -0.7706542  -0.74624157
      -0.28195325 -1.95881546 -0.33761069  1.03019834  1.51340175  0.22515805
      -0.28566208  0.26882544  1.74621105]
     [ 0.92387104 -2.05909967 -0.31438306  1.21033823  0.694803   -1.06554997
       0.01364011 -1.06771255 -0.18407504 -2.20562339  1.82905924  1.24319017
      -0.33655512 -0.04000888 -0.33585522]
     [-0.30744898 -0.76692969 -0.28710833 -0.29470286 -0.8099063  -1.31590188
       0.37532416  0.17755835 -2.05828643  0.40742677 -1.00723302  0.29265752
       0.5163359   1.48094654  0.10440207]
     [-2.41602898 -0.6054818   0.04622507 -0.66815251 -0.40330869  0.70722419
      -1.79007626  0.36240223 -2.76690722  1.98079216  0.15743099  0.52636039
      -2.2120235   0.4751119  -0.45400357]
     [-0.06919777  0.68693012 -0.12727311  1.19293666  1.14117563 -1.69935191
      -1.32542193  0.53915238 -0.01135946 -0.87633544  0.95902258 -1.37203288
       0.28185159 -0.70281965  0.28490511]
     [ 0.33679315 -0.00496036  0.83482873  0.38735345 -0.26227593 -0.29129392
       0.43719748  1.38576829  0.19425733 -0.0610277   0.93549377  1.95708442
       0.2652913  -0.86742806 -0.02628282]
     [-0.1174508  -0.69646132 -0.22447547 -0.96103233  0.10064895  0.05909608
      -0.00711272 -1.07845175  0.93755829 -1.70846844 -1.32631063 -0.5413081
       0.64900833  0.02555733 -0.31969535]]
    w2: [[-0.81131822]
     [ 1.48459876]
     [ 0.06532937]
     [-2.4427042 ]
     [ 0.0992484 ]
     [ 0.59122431]
     [ 0.59282297]
     [-2.12292957]
     [-0.72289723]
     [-0.05627038]
     [ 0.64354479]
     [-0.26432407]
     [ 1.85663319]
     [ 0.56784171]
     [-0.38283589]]
    
    
    After 0 training step(s), cross entropy on all data is 0.266905, Train accuracy is 0.93, Test accuracy is 0.93
    After 1 training step(s), cross entropy on all data is 0.266154, Train accuracy is 0.93, Test accuracy is 0.93
    After 2 training step(s), cross entropy on all data is 0.266479, Train accuracy is 0.93, Test accuracy is 0.93
    After 3 training step(s), cross entropy on all data is 0.266038, Train accuracy is 0.93, Test accuracy is 0.93
    After 4 training step(s), cross entropy on all data is 0.265680, Train accuracy is 0.93, Test accuracy is 0.93
    After 5 training step(s), cross entropy on all data is 0.263993, Train accuracy is 0.93, Test accuracy is 0.93
    After 6 training step(s), cross entropy on all data is 0.263786, Train accuracy is 0.93, Test accuracy is 0.93
    After 7 training step(s), cross entropy on all data is 0.262848, Train accuracy is 0.93, Test accuracy is 0.93
    After 8 training step(s), cross entropy on all data is 0.264124, Train accuracy is 0.93, Test accuracy is 0.93
    After 9 training step(s), cross entropy on all data is 0.263521, Train accuracy is 0.93, Test accuracy is 0.93
    w1: [[-0.1307393   1.25472212  0.21943504 -1.8895539  -0.16593404  0.59122431
       0.83043867 -2.12292957 -0.83195138 -0.05627038  0.60097039 -0.26432407
       1.72399604  0.37524632 -0.2702803 ]
     [-0.90236533  0.88192987  0.32363042  0.12973411  1.365098   -1.74297714
      -0.37624282 -0.56601     0.62538344  1.13283098 -2.40597391  0.48281202
      -1.43043005 -0.19094455 -1.53778791]
     [ 0.20213602  0.49750674 -1.22418904  1.60667074 -0.26015031 -0.73264718
       0.30443743  0.07170801  0.59210908 -1.57119071  1.25638139 -1.17336702
       0.0315446   0.34599131  0.43694198]
     [ 0.60033566 -0.71668613 -0.87891889  0.36105621 -0.92789191 -0.74624157
      -0.50923687 -1.95881546 -0.28002039  1.03019834  1.38067317  0.22515805
      -0.40243286 -0.48839927  1.92582166]
     [ 1.33008099 -2.2839191  -0.19257383  1.39984441  0.60135603 -1.06554997
       0.1237011  -1.06771255 -0.0768716  -2.20562339  1.56527531  1.24319017
      -0.49268776 -0.259552   -0.25305635]
     [ 0.2348972  -1.15629339  0.01864882  0.07197443 -0.94793862 -1.31590188
       0.33565438  0.17755835 -1.42682576  0.40742677 -1.09856331  0.29265752
       0.39925966  0.71737742  0.29674378]
     [-2.23415732 -0.90887469  0.33973566 -1.08791244 -0.40343961  0.70722419
      -1.24081218  0.36240223 -2.72842002  1.98079216 -0.03883504  0.52636039
      -2.2120235   0.82433987 -0.45395043]
     [ 0.33973345  0.49417496 -0.1228751   1.2922473   1.08592165 -1.69935191
      -1.40754449  0.53915238  2.17422199 -0.87633544  0.79488677 -1.37203288
       0.16487117 -0.94592625  0.28490511]
     [ 0.51801199 -0.30546933  0.86197817  0.46826503 -0.27043018 -0.29129392
       0.96714443  1.38576829  0.33453467 -0.0610277   0.8624931   1.95708442
       0.2652913  -0.86487854 -0.02628282]
     [ 0.02453351 -1.02639699 -0.00644672 -1.05826247  0.01883459  0.05909608
      -0.1505565  -1.07845175  1.0150702  -1.70846844 -1.41358078 -0.5413081
       0.53199828  0.09832783 -0.18930206]]
    w2: [[-0.69936597]
     [ 1.24573088]
     [-0.0439459 ]
     [-2.09101486]
     [-0.05357318]
     [ 0.51120198]
     [ 0.53572941]
     [-2.13095355]
     [-0.8688491 ]
     [-0.04454806]
     [ 0.80097681]
     [-0.22473553]
     [ 1.39108026]
     [ 0.20757644]
     [-0.26078567]]
    
    [Figure: per-batch cost for the mini-batch run]
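
    Note that the batch loop above walks the same fixed slices every epoch and silently drops the tail that does not fill a whole batch. Reshuffling each epoch is the usual fix; a sketch of a drop-in replacement for the inner loop of hidden_neural_batch (assuming the same sess, train_step, x, y_ as in that function):

    # Reshuffle the training set at the start of every epoch so each
    # mini-batch sees a different mix of examples, and keep the tail.
    for epoch in range(training_epochs):
        perm = np.random.permutation(n_samples)
        for start in range(0, n_samples, batch_size):
            idx = perm[start:start + batch_size]
            sess.run(train_step, feed_dict={x: X[idx], y_: Y[idx]})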
    # Try a multi-layer neural network
    input_nodes = 10
    
    # each hidden layer has 1.5 times as many neurons as the previous one
    multiplier = 1.5 
    
    # Number of nodes in each hidden layer
    hidden_nodes1 = 18
    hidden_nodes2 = round(hidden_nodes1 * multiplier)
    hidden_nodes3 = round(hidden_nodes2 * multiplier)
    
    # input
    x = tf.placeholder(tf.float32, [None, input_nodes])
    
    # layer 1
    W1 = tf.Variable(tf.truncated_normal([input_nodes, hidden_nodes1], stddev = 0.15))
    b1 = tf.Variable(tf.zeros([hidden_nodes1]))
    y1 = tf.nn.sigmoid(tf.matmul(x, W1) + b1)
    
    # layer 2
    W2 = tf.Variable(tf.truncated_normal([hidden_nodes1, hidden_nodes2], stddev = 0.15))
    b2 = tf.Variable(tf.zeros([hidden_nodes2]))
    y2 = tf.nn.sigmoid(tf.matmul(y1, W2) + b2)
    
    # layer 3
    W3 = tf.Variable(tf.truncated_normal([hidden_nodes2, hidden_nodes3], stddev = 0.15)) 
    b3 = tf.Variable(tf.zeros([hidden_nodes3]))
    y3 = tf.nn.sigmoid(tf.matmul(y2, W3) + b3)
    #y3 = tf.nn.dropout(y3, pkeep)
    
    # layer 4
    W4 = tf.Variable(tf.truncated_normal([hidden_nodes3, 1], stddev = 0.15)) 
    b4 = tf.Variable(tf.zeros([1]))
    y_rawoutput = tf.matmul(y3, W4) + b4
    y4 = tf.nn.sigmoid(y_rawoutput)
    
    # output
    y = y4
    y_ = tf.placeholder(tf.float32, [None, 1])
    
    cost = tf.reduce_mean(
            tf.nn.sigmoid_cross_entropy_with_logits(logits=y_rawoutput, labels=y_)
        )
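
    The commented-out dropout line in layer 3 above was never wired up; a minimal sketch of how it could be (pkeep is a placeholder I am introducing here for illustration, not part of the original code):

    # Hypothetical dropout wiring between layer 3 and layer 4
    pkeep = tf.placeholder(tf.float32, name='pkeep')
    y3_dropped = tf.nn.dropout(y3, pkeep)  # randomly zero units, scale the rest
    # ...then build layer 4 from y3_dropped instead of y3, and add
    # pkeep: 0.75 to feed_dict during training, pkeep: 1.0 when evaluating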
    
    def test():
        
        train_step = tf.train.AdamOptimizer(learning_rate).minimize(cost)
        costs=[]
        with tf.Session() as sess:
            init_op = tf.global_variables_initializer()
            sess.run(init_op)
    
        # Print the current (untrained) parameter values
        # print("W1:", sess.run(W1))
        # print("W2:", sess.run(W2))
        print("\n")
        # Train the model
            for i in range(training_epochs):
                for batch in range(int(n_samples/batch_size)):
                    batch_x = X[batch*batch_size : (1+batch)*batch_size]
                    batch_y = Y[batch*batch_size : (1+batch)*batch_size]
                    sess.run(train_step, feed_dict={x: batch_x, y_: batch_y})
                    total_cross_entropy = sess.run(cost, feed_dict={x: batch_x, y_: batch_y})
                    costs.append(total_cross_entropy)   
                    if batch%10==0:
                        train_output = sess.run(y, feed_dict={x: X, y_: Y})
                        train_accuracy = accuracy(train_output, Y)                
                        print("After %d batch step(s), cross entropy on all data is "
                          "%3f, Train accuracy is %.2f" % (
                              batch, total_cross_entropy, train_accuracy))                
                if i % 1 == 0:
                    train_output = sess.run(y, feed_dict={x: X, y_: Y})
                    train_accuracy = accuracy(train_output, Y)
                    test_output = sess.run(y, feed_dict={x: X_t, y_: Y_t})
                    test_accuracy = accuracy(test_output, Y_t)
                    print("After %d training step(s), cross entropy on all data is "
                          "%3f, Train accuracy is %.2f, Test accuracy is %.2f" % (
                              i, total_cross_entropy, train_accuracy, test_accuracy))
    
        # Print the parameter values after training
            print("W1:", sess.run(W1))
            print("W2:",  sess.run(W2))
            plt.plot(costs)
    
    test()
    
    After 0 batch step(s), cross entropy on all data is 0.761022, Train accuracy is 0.07
    After 10 batch step(s), cross entropy on all data is 0.300253, Train accuracy is 0.93
    After 20 batch step(s), cross entropy on all data is 0.242970, Train accuracy is 0.93
    After 30 batch step(s), cross entropy on all data is 0.224430, Train accuracy is 0.93
    After 40 batch step(s), cross entropy on all data is 0.269178, Train accuracy is 0.93
    After 0 training step(s), cross entropy on all data is 0.262812, Train accuracy is 0.93, Test accuracy is 0.93
    After 0 batch step(s), cross entropy on all data is 0.230240, Train accuracy is 0.93
    After 10 batch step(s), cross entropy on all data is 0.246762, Train accuracy is 0.93
    After 20 batch step(s), cross entropy on all data is 0.243116, Train accuracy is 0.93
    After 30 batch step(s), cross entropy on all data is 0.224826, Train accuracy is 0.93
    After 40 batch step(s), cross entropy on all data is 0.263234, Train accuracy is 0.93
    After 1 training step(s), cross entropy on all data is 0.259564, Train accuracy is 0.93, Test accuracy is 0.93
    After 0 batch step(s), cross entropy on all data is 0.230239, Train accuracy is 0.93
    After 10 batch step(s), cross entropy on all data is 0.246523, Train accuracy is 0.93
    After 20 batch step(s), cross entropy on all data is 0.242870, Train accuracy is 0.93
    After 30 batch step(s), cross entropy on all data is 0.224150, Train accuracy is 0.93
    After 40 batch step(s), cross entropy on all data is 0.263701, Train accuracy is 0.93
    After 2 training step(s), cross entropy on all data is 0.259801, Train accuracy is 0.93, Test accuracy is 0.93
    After 0 batch step(s), cross entropy on all data is 0.229958, Train accuracy is 0.93
    After 10 batch step(s), cross entropy on all data is 0.246558, Train accuracy is 0.93
    After 20 batch step(s), cross entropy on all data is 0.242965, Train accuracy is 0.93
    After 30 batch step(s), cross entropy on all data is 0.224338, Train accuracy is 0.93
    After 40 batch step(s), cross entropy on all data is 0.263407, Train accuracy is 0.93
    After 3 training step(s), cross entropy on all data is 0.259671, Train accuracy is 0.93, Test accuracy is 0.93
    After 0 batch step(s), cross entropy on all data is 0.230115, Train accuracy is 0.93
    After 10 batch step(s), cross entropy on all data is 0.246524, Train accuracy is 0.93
    After 20 batch step(s), cross entropy on all data is 0.242860, Train accuracy is 0.93
    After 30 batch step(s), cross entropy on all data is 0.224380, Train accuracy is 0.93
    After 40 batch step(s), cross entropy on all data is 0.263789, Train accuracy is 0.93
    After 4 training step(s), cross entropy on all data is 0.259717, Train accuracy is 0.93, Test accuracy is 0.93
    After 0 batch step(s), cross entropy on all data is 0.229895, Train accuracy is 0.93
    After 10 batch step(s), cross entropy on all data is 0.246580, Train accuracy is 0.93
    After 20 batch step(s), cross entropy on all data is 0.242962, Train accuracy is 0.93
    After 30 batch step(s), cross entropy on all data is 0.223895, Train accuracy is 0.93
    After 40 batch step(s), cross entropy on all data is 0.263439, Train accuracy is 0.93
    After 5 training step(s), cross entropy on all data is 0.259534, Train accuracy is 0.93, Test accuracy is 0.93
    After 0 batch step(s), cross entropy on all data is 0.230208, Train accuracy is 0.93
    After 10 batch step(s), cross entropy on all data is 0.246550, Train accuracy is 0.93
    After 20 batch step(s), cross entropy on all data is 0.242967, Train accuracy is 0.93
    After 30 batch step(s), cross entropy on all data is 0.224260, Train accuracy is 0.93
    After 40 batch step(s), cross entropy on all data is 0.263530, Train accuracy is 0.93
    After 6 training step(s), cross entropy on all data is 0.259636, Train accuracy is 0.93, Test accuracy is 0.93
    After 0 batch step(s), cross entropy on all data is 0.230077, Train accuracy is 0.93
    After 10 batch step(s), cross entropy on all data is 0.246483, Train accuracy is 0.93
    After 20 batch step(s), cross entropy on all data is 0.242958, Train accuracy is 0.93
    After 30 batch step(s), cross entropy on all data is 0.224245, Train accuracy is 0.93
    After 40 batch step(s), cross entropy on all data is 0.263562, Train accuracy is 0.93
    After 7 training step(s), cross entropy on all data is 0.259633, Train accuracy is 0.93, Test accuracy is 0.93
    After 0 batch step(s), cross entropy on all data is 0.230171, Train accuracy is 0.93
    After 10 batch step(s), cross entropy on all data is 0.246480, Train accuracy is 0.93
    After 20 batch step(s), cross entropy on all data is 0.242965, Train accuracy is 0.93
    After 30 batch step(s), cross entropy on all data is 0.224253, Train accuracy is 0.93
    After 40 batch step(s), cross entropy on all data is 0.263583, Train accuracy is 0.93
    After 8 training step(s), cross entropy on all data is 0.259621, Train accuracy is 0.93, Test accuracy is 0.93
    After 0 batch step(s), cross entropy on all data is 0.230179, Train accuracy is 0.93
    After 10 batch step(s), cross entropy on all data is 0.246479, Train accuracy is 0.93
    After 20 batch step(s), cross entropy on all data is 0.242984, Train accuracy is 0.93
    After 30 batch step(s), cross entropy on all data is 0.224209, Train accuracy is 0.93
    After 40 batch step(s), cross entropy on all data is 0.263605, Train accuracy is 0.93
    After 9 training step(s), cross entropy on all data is 0.259612, Train accuracy is 0.93, Test accuracy is 0.93
    W1: [[ -6.27096146e-02   6.19635433e-02   7.36427680e-02   1.68631196e-01
       -7.60331690e-01   3.47528040e-01   4.36133966e-02  -3.37092340e-01
        2.07973793e-01   1.55349210e-01  -2.47479677e-01   1.48503095e-01
        1.20097958e-01   2.58629858e-01  -1.80520844e-02   3.30400243e-02
        2.99186558e-01   6.07701193e-04]
     [  2.33916506e-01   2.20484942e-01   8.15607235e-02   5.05064167e-02
        3.86699080e-01   3.23683053e-01   6.64170608e-02   8.91096368e-02
        2.84857035e-01   3.01495463e-01   3.25932622e-01   9.31131393e-02
        2.26717368e-01  -2.27795541e-01  -8.92875418e-02   1.51137367e-01
        3.31026882e-01  -3.26146409e-02]
     [  1.28775194e-01  -1.15595251e-01   3.53413224e-02   2.69077485e-04
       -9.38407540e-01  -6.09002650e-01  -1.59576893e-01   6.80298284e-02
       -2.98769642e-02  -1.47649273e-01  -2.69829303e-01   2.71455973e-01
       -9.22602937e-02   1.78162009e-03  -4.29572575e-02  -4.57999229e-01
       -2.94187278e-01   3.03394645e-01]
     [  4.06553894e-02   2.58741766e-01   1.11730397e-01   3.69872123e-01
       -2.47309715e-01   4.75301109e-02  -3.08942609e-02  -3.02954674e-01
        4.80195582e-02   4.47125509e-02   5.27960658e-01   1.42086819e-01
       -1.22759722e-01  -2.36116990e-01  -3.22828516e-02   8.75321254e-02
        1.30226342e-02  -5.17042577e-02]
     [  2.63865571e-02   7.74538368e-02   2.83283871e-02   8.29500630e-02
       -3.56711000e-02  -3.66764776e-02   2.32610598e-01   2.87605338e-02
       -8.82343799e-02  -9.72566381e-02  -4.70980518e-02   1.54890463e-01
        1.11744344e-01  -3.18082236e-02   3.62924278e-01   1.93099398e-02
       -4.63043712e-02  -2.36418396e-02]
     [ -1.20784484e-01   2.75714308e-01  -1.00368142e-01   3.53458196e-01
        4.01106298e-01   2.55347371e-01   2.23828822e-01   7.26789162e-02
        1.70375884e-01   1.60733044e-01   2.27448732e-01   1.52648434e-01
       -2.06443802e-01  -1.44332886e-01  -9.46074575e-02   4.46852118e-01
        4.38104957e-01   1.28217861e-01]
     [  1.09821679e-02   8.86156112e-02  -8.86320248e-02   2.49592103e-02
       -7.46144593e-01  -5.44901311e-01   3.96740437e-03  -1.89219609e-01
       -3.13865572e-01  -8.68629143e-02  -4.89704967e-01   1.48358860e-03
        1.14535345e-02   1.11116907e-02  -2.16335468e-02  -4.36338454e-01
       -5.33603311e-01   2.27682531e-01]
     [  1.84053376e-01   4.01767671e-01  -7.18951449e-02   1.32131562e-01
        5.42489588e-01   5.10929339e-02  -1.17896341e-01   1.80867054e-02
        4.20738965e-01   2.67274827e-01   6.84235338e-03  -1.57171059e-02
       -3.76166478e-02  -2.25746945e-01   1.93848863e-01   2.60983825e-01
        1.48856938e-01   6.41906112e-02]
     [ -1.21465884e-01  -1.46467388e-01   1.47176608e-01   1.48934007e-01
       -9.41196978e-01  -4.47248161e-01  -1.18821748e-01   4.25963029e-02
        9.72005352e-02   4.03921679e-02  -1.81921661e-01   3.11804935e-02
       -7.11452588e-02  -4.59574983e-02  -5.53867444e-02  -4.60661709e-01
       -4.86376792e-01   5.51459551e-01]
     [  1.31753668e-01   2.61191666e-01   2.99756005e-02  -3.97033095e-01
       -1.85625330e-01  -1.16527766e-01  -2.47673914e-02   1.93218458e-02
       -8.98092464e-02   2.03786984e-01  -3.04912806e-01  -1.04219699e-02
        4.61245999e-02  -2.80769557e-01   8.63420963e-02   3.33901271e-02
       -2.25947738e-01   1.25736788e-01]]
    W2: [[ -2.47483760e-01   2.74535149e-01  -2.00524136e-01   1.79923221e-01
        2.17588171e-01  -6.64749891e-02  -8.61104354e-02   3.33997965e-01
        1.77661210e-01   9.10267755e-02  -1.68243766e-01  -2.23183393e-01
       -2.39883274e-01   3.14111598e-02  -2.32722405e-02  -1.99615151e-01
        6.16971105e-02   3.59589942e-02   1.79169610e-01   3.57046127e-02
       -3.46526355e-02  -4.26769480e-02  -8.37428719e-02  -6.09069541e-02
       -3.21599513e-01   2.92103380e-01   2.27866322e-02]
     [  1.49556741e-01   5.40980836e-04   2.43838310e-01  -1.20390980e-02
        3.58613394e-03   3.47295664e-02   1.84276402e-01   1.84944689e-01
        2.28680730e-01   2.87274987e-01   1.77906498e-01   2.18276437e-02
        1.35881558e-01   9.42268223e-02   2.18612388e-01  -6.05812371e-02
        1.44918278e-01   1.44585148e-01   2.53670141e-02   2.34575152e-01
        1.32305101e-01   1.59189060e-01   6.10962231e-03   4.66559157e-02
       -1.11459807e-01   1.51057154e-01   1.54180720e-01]
     [ -5.04053757e-02   5.53702004e-02  -5.49984798e-02   9.81567353e-02
       -1.88498318e-01  -1.39121711e-01   3.32015641e-02  -6.94152191e-02
       -1.31983846e-01  -1.36204585e-01   2.41934359e-02  -2.98041143e-02
       -3.74129154e-02  -4.60114144e-02  -9.96935926e-03   9.71305184e-03
       -2.86987692e-01  -1.09244749e-01   7.30254725e-02  -4.69237864e-02
       -1.20570123e-01  -3.12603623e-01  -3.96302044e-01  -1.45545170e-01
       -2.72707134e-01   1.28854766e-01  -3.35495681e-01]
     [  1.04735464e-01  -2.24808484e-01   5.39765842e-02  -2.39716982e-03
        9.69926789e-02  -1.23070806e-01   8.46826434e-02   7.79065490e-02
       -7.71566555e-02  -4.94302586e-02   1.17854714e-01   7.50287697e-02
       -7.21424893e-02   9.17640477e-02   2.03861088e-01  -2.04748902e-02
       -6.35029003e-02   4.95297983e-02   5.29667288e-02   4.21655178e-02
       -8.98981839e-02  -2.22930983e-01   1.20568700e-01  -4.38534059e-02
        6.64039236e-03   7.14863092e-02   4.93475534e-02]
     [  9.71197784e-01   6.93454623e-01   8.24395180e-01   6.19480908e-01
        6.03376091e-01   7.53725052e-01   8.51884604e-01   5.71891010e-01
        8.16100538e-01   8.63069415e-01   8.12737703e-01   8.64770055e-01
        8.75601947e-01   6.86313450e-01   8.68773520e-01   1.00470674e+00
        8.13403010e-01   1.08989930e+00   6.49629533e-01   5.85276484e-01
        5.75168252e-01   6.35941446e-01   9.25097048e-01   9.46587622e-01
        1.11604416e+00   6.98117316e-01   1.07315254e+00]
     [  7.81452656e-01   4.87429708e-01   9.02347028e-01   3.35463107e-01
        5.49626887e-01   8.01850796e-01   6.71848238e-01   4.30190593e-01
        5.37678182e-01   4.62538064e-01   5.66581309e-01   5.25604963e-01
        7.59734213e-01   7.25885332e-01   7.08737910e-01   4.86123830e-01
        3.90948027e-01   8.49048913e-01   1.37412354e-01   7.54611194e-01
        6.85468554e-01   3.85228217e-01   6.47241056e-01   5.32116473e-01
        6.07052326e-01   2.75263876e-01   4.51169044e-01]
     [  2.20008776e-01   1.17634917e-02  -1.41911626e-01   2.76960105e-01
        1.45396397e-01  -1.42692536e-01   1.79078817e-01  -3.14875040e-03
        2.50405073e-03   1.56506598e-02   3.45965736e-02  -2.68472552e-01
       -4.29479256e-02  -2.13654991e-02   2.31837668e-02  -1.45248652e-01
        2.23421797e-01  -9.21307653e-02  -3.39591429e-02   2.23463520e-01
        6.88029155e-02  -2.47113425e-02   1.60195798e-01  -2.11858824e-01
        1.27470503e-02   2.15154052e-01  -3.11076883e-02]
     [ -2.37635132e-02  -1.73365474e-01  -1.12947501e-01  -1.48859143e-01
        1.85159460e-01   3.30418833e-02  -2.53871500e-01  -1.28793567e-01
       -1.69082116e-02   5.44793904e-03   1.68549478e-01  -1.32903099e-01
       -2.13813469e-01  -2.37874761e-01   6.07706644e-02  -2.83597946e-01
       -1.41368866e-01  -4.74268124e-02  -1.13941632e-01  -5.64679503e-02
        1.73464879e-01   5.85160479e-02  -1.30440399e-01  -5.84462807e-02
       -2.47649327e-01   1.89262871e-02  -4.14109856e-01]
     [  5.66611886e-01   1.00701079e-01   2.80587882e-01   5.28735995e-01
        2.48230562e-01   5.07063925e-01   4.87066805e-01   5.53084970e-01
        5.67206383e-01   4.78248298e-01   4.70222801e-01   3.53814512e-01
        6.10332549e-01   4.54438239e-01   1.64543107e-01   6.09976590e-01
        4.27011877e-01   4.14798468e-01   4.43085074e-01   5.94868064e-01
        3.40550244e-01   4.60564345e-01   4.90165085e-01   3.60756338e-01
        5.18992722e-01   1.52996466e-01   5.56604564e-01]
     [  4.86890763e-01   5.82106650e-01   3.62138718e-01   6.67764425e-01
        5.88133514e-01   4.65598375e-01   3.53168219e-01   4.77591515e-01
        4.32764977e-01   5.53816259e-01   4.66729611e-01   7.71247089e-01
        5.03579080e-01   3.20984453e-01   7.35400319e-01   6.80477321e-01
        6.95575237e-01   4.72527951e-01   6.64922059e-01   4.64856803e-01
        6.01439595e-01   4.07652050e-01   5.80660999e-01   5.59776485e-01
        6.79509342e-01   6.19544744e-01   5.73529840e-01]
     [  2.78885543e-01   3.27404231e-01   3.05404127e-01   2.70429850e-01
        4.60326552e-01   4.53902990e-01   3.11377555e-01   1.06233321e-01
        2.63834685e-01   1.97067395e-01   4.82757181e-01   3.39598417e-01
        3.00747514e-01   4.11575168e-01   2.84612805e-01   1.64087623e-01
        4.82419431e-01   2.11109161e-01   6.98129088e-02   2.82756269e-01
        4.29396808e-01   2.50127643e-01   5.15340269e-01   3.99919420e-01
        2.26336911e-01   4.58118558e-01   5.15195549e-01]
     [ -7.32911155e-02   3.38609636e-01   1.71090942e-02  -2.92833196e-03
        2.24377185e-01   6.15050318e-04   1.57868221e-01  -1.07143283e-01
       -2.72531003e-01  -1.64496273e-01   2.72950847e-02  -1.45400092e-01
        1.44029558e-01   1.03170618e-01   1.74984515e-01   5.34740612e-02
       -1.22564279e-01  -2.44503826e-01   3.12247425e-01   5.62596619e-02
        3.13789248e-01  -1.88866764e-01  -4.97231074e-02   2.82089561e-02
       -9.08887237e-02   2.05484405e-01  -5.51709868e-02]
     [ -1.17321230e-01   2.56355517e-02   5.40102385e-02  -4.91569936e-03
        1.86124146e-02   2.34684631e-01   1.22642353e-01   3.22679669e-01
        2.52484351e-01   3.12222484e-02   1.53737478e-02  -5.46348505e-02
       -8.23673680e-02   1.22413106e-01  -7.57310390e-02  -1.12661431e-02
       -2.52660394e-01  -1.68270350e-01  -6.28279373e-02   4.33253199e-02
       -3.10955103e-02   2.94571165e-02   2.26540029e-01  -1.09926358e-01
        3.31482268e-03   7.65279159e-02  -1.21152520e-01]
     [  1.17148586e-01  -6.85131773e-02  -6.10492527e-02  -1.67501822e-01
        9.57531109e-02   2.84810904e-02   3.76874320e-02   1.16155639e-01
        5.63020036e-02   1.99789315e-01   5.88115640e-02  -4.60230634e-02
        1.04566298e-01   5.73917627e-02   2.40478590e-01  -3.70545052e-02
       -5.62350499e-03   1.69614270e-01  -5.20623289e-03   4.58280519e-02
        9.01719108e-02  -3.42041373e-01   1.24716014e-01  -6.41216934e-02
        5.78678809e-02   5.11394404e-02  -2.64394909e-01]
     [ -3.18749659e-02   2.71107405e-01  -1.33590512e-02  -5.65348044e-02
        3.57385367e-01   4.36317511e-02   1.56145811e-01  -1.41672775e-01
        1.99348181e-01   6.04956690e-03  -1.08531609e-01  -1.95701718e-01
       -1.84095308e-01   2.10863888e-01  -1.26457796e-01  -2.30793521e-01
       -4.06751297e-02  -3.72523405e-02   1.53413832e-01   1.69467404e-01
        1.12846553e-01  -5.01344092e-02  -9.66672003e-02   1.83976442e-01
        7.86100551e-02  -2.66810581e-02   1.13086171e-01]
     [  2.01281205e-01   2.93104053e-01   2.65929997e-01   7.41217658e-02
        1.46010593e-01   1.83841363e-01   3.18309724e-01   3.72458547e-01
        7.30247423e-02   2.88209677e-01   1.80921063e-01   4.01581228e-01
        3.36319767e-02   1.81846097e-01   3.21458906e-01   1.69346049e-01
        2.64965415e-01   9.59763750e-02   4.52427000e-01   7.37397233e-03
        2.16388196e-01   4.53007847e-01   1.78770229e-01   4.83540893e-01
        1.49519518e-01   1.87332630e-01   2.47215509e-01]
     [  6.42805517e-01   4.75173235e-01   7.56214559e-01   5.96054435e-01
        5.75478554e-01   6.66866839e-01   3.72783214e-01   4.47785348e-01
        6.45747721e-01   6.48468435e-01   4.78494674e-01   8.81808043e-01
        5.74304402e-01   5.01282692e-01   3.37430030e-01   8.89424205e-01
        8.44838083e-01   6.46390140e-01   6.07262790e-01   2.31410503e-01
        3.64845365e-01   4.93058950e-01   6.49804950e-01   6.97894454e-01
        8.42011988e-01   4.77401018e-01   4.17692274e-01]
     [ -1.71808563e-02  -3.49861272e-02  -1.17573462e-01  -8.60988796e-02
        1.55902933e-02   2.18224749e-01  -5.14874011e-02  -1.16035029e-01
       -2.78583318e-01  -4.12806198e-02   1.46568760e-01  -2.86657363e-01
       -1.34957030e-01  -1.42399102e-01   2.17876192e-02   6.13102391e-02
       -8.56453329e-02  -2.64274299e-01  -1.49690256e-01   3.73912118e-02
        3.97815742e-02  -1.25761107e-01   1.19089847e-02  -1.35761768e-01
       -1.46209732e-01   8.12482610e-02  -2.91973650e-01]]
    
    [Figure: per-batch cost for the multi-layer network]
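
    Since about 93% of the labels are 0, accuracy saturates at the base rate no matter what the network learns at the 0.5 threshold. A threshold-free metric such as AUC says more about ranking quality; a minimal sketch (assuming scikit-learn is available, with test_output being the sigmoid outputs on X_t as computed inside the functions above):

    from sklearn.metrics import roc_auc_score

    # AUC ignores the 0.5 cutoff and measures how well the scores
    # rank defaulters above non-defaulters (0.5 = random, 1.0 = perfect)
    print(roc_auc_score(Y_t.ravel(), test_output.ravel()))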
    
    

    Whether I use no hidden layer, a single hidden layer, or a multi-layer neural network, I cannot push the accuracy past 0.93. Is it a problem with the parameters or the model, or is this simply all a plain neural network can achieve here? Note that roughly 93% of the records are non-defaults, so a model that always predicts "no default" would also score 0.93; accuracy alone may not be saying much.
    Having only just jumped into the big pit that is machine learning, I cannot answer this yet; all I can do is keep exploring.
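
    One direction to keep exploring: with only about 7% positives, up-weighting the positive class in the loss makes misclassified defaulters cost more. A sketch that swaps TensorFlow's weighted cross-entropy into the loss definition used above (pos_weight ~ negatives/positives ~ 13 is my rough estimate):

    # Weighted sigmoid cross-entropy: errors on the rare positive class
    # count pos_weight times as much as errors on the majority class.
    cross_entropy = tf.reduce_mean(
        tf.nn.weighted_cross_entropy_with_logits(
            targets=y_, logits=nn_rawoutput, pos_weight=13.0))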
