Machine Learning Study Notes -- Hello, Neural Network

Author: 松爱家的小秦 | Published 2017-12-09 17:01

    This example uses the MNIST dataset, the classic entry-level computer vision collection: 70,000 grayscale images of handwritten digits (0-9), each 28x28 pixels.

    # -*- coding: utf-8 -*-

    print(__doc__)  # prints the module docstring (None here, since this script has none)

    import matplotlib.pyplot as plt
    from sklearn.datasets import fetch_mldata
    from sklearn.neural_network import MLPClassifier

    # Download MNIST (70,000 samples of 784 pixels each) and scale pixel values to [0, 1]
    mnist = fetch_mldata("MNIST original")
    x, y = mnist.data / 255., mnist.target

    # Standard split: first 60,000 samples for training, last 10,000 for testing
    x_train, x_test = x[:60000], x[60000:]
    y_train, y_test = y[:60000], y[60000:]
    # Everything above is data preparation (feature extraction)

    # One hidden layer of 50 units, trained with SGD for at most 10 epochs
    mlp = MLPClassifier(hidden_layer_sizes=(50,), max_iter=10, alpha=1e-4,
                        solver='sgd', verbose=10, tol=1e-4, random_state=1,
                        learning_rate_init=.1)
    mlp.fit(x_train, y_train)

    print("Training set score: %f" % mlp.score(x_train, y_train))
    print("Test set score: %f" % mlp.score(x_test, y_test))

    # Visualize the input-to-hidden weights of 16 of the 50 hidden units as 28x28 images
    fig, axes = plt.subplots(4, 4)
    vmin, vmax = mlp.coefs_[0].min(), mlp.coefs_[0].max()
    for coef, ax in zip(mlp.coefs_[0].T, axes.ravel()):
        ax.matshow(coef.reshape(28, 28), cmap=plt.cm.gray, vmin=.5 * vmin,
                   vmax=.5 * vmax)
        ax.set_xticks(())
        ax.set_yticks(())
    plt.show()
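A note on the data loader: fetch_mldata pulled its files from the now-defunct mldata.org mirror and has been removed from recent scikit-learn releases. On a current install the same 70,000 samples can be loaded through fetch_openml instead. The snippet below is a sketch under that assumption (a recent scikit-learn with network access, and OpenML's mnist_784 keeping the conventional first-60,000 training split); it is not part of the original script:

    from sklearn.datasets import fetch_openml

    # 'mnist_784' is MNIST with each image flattened to 784 pixel values;
    # as_frame=False asks for plain NumPy arrays rather than a pandas DataFrame
    mnist = fetch_openml('mnist_784', version=1, as_frame=False)
    x, y = mnist.data / 255., mnist.target.astype(int)   # OpenML returns string labels

    x_train, x_test = x[:60000], x[60000:]
    y_train, y_test = y[:60000], y[60000:]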

Output:

    Iteration 1, loss = 0.32212731
    Iteration 2, loss = 0.15738787
    Iteration 3, loss = 0.11647274
    Iteration 4, loss = 0.09631113
    Iteration 5, loss = 0.08074513
    Iteration 6, loss = 0.07163224
    Iteration 7, loss = 0.06351392
    Iteration 8, loss = 0.05694146
    Iteration 9, loss = 0.05213487
    Iteration 10, loss = 0.04708320
    /usr/local/lib/python2.7/dist-packages/sklearn/neural_network/multilayer_perceptron.py:564: ConvergenceWarning: Stochastic Optimizer: Maximum iterations (10) reached and the optimization hasn't converged yet.
      % self.max_iter, ConvergenceWarning)
    Training set score: 0.985733
    Test set score: 0.971000
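The ConvergenceWarning at the end of the log only says that training stopped because the 10-epoch budget (max_iter=10) ran out before the loss improvement dropped below tol; raising max_iter makes it disappear. Once fit has returned, the trained model can also be queried directly. A minimal sketch, assuming the mlp, x_test, and y_test objects from the script above are still in scope:

    # Compare the model's predictions with the true labels for the first ten test digits
    print("predicted:", mlp.predict(x_test[:10]))
    print("actual:   ", y_test[:10])

    # predict_proba gives the per-class probabilities, here for the first test image
    print(mlp.predict_proba(x_test[:1]).round(3))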
