August 2, 2016 (Week 4: Neural Networks, Part 2)


Author: 上海王尔德 | Published 2016-08-04 11:08

    Next, let's understand it mathematically:

    (Figure: Neuron Network model1.png)

    Here is a picture of a typical neural network. There are 4 layers: Layer 1 is called the input layer; Layers 2 and 3 are called hidden layers, which compute intermediate features that lead us to a more meaningful (abstract) result; Layer 4 is called the output layer.

    (Figure: answer1.png)

    We want to compute the activations in Layer 2. How do we do that? Suppose we are already given a 3x4 weight matrix Theta(1). Then a(2) is simply g(Theta(1) * a(1)), where g is the sigmoid function, and we're done!
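    As a quick illustration of that single step, here is a minimal NumPy sketch (the course assignments themselves are in Octave; the values of a(1) and Theta(1) below are made up just for the example):

        import numpy as np

        def sigmoid(z):
            # Logistic activation g(z) = 1 / (1 + e^(-z))
            return 1.0 / (1.0 + np.exp(-z))

        # a1: activations of layer 1 (the inputs) with the bias unit 1 prepended
        a1 = np.array([1.0, 0.5, -1.2, 0.3])

        # Theta1: hypothetical 3x4 weight matrix mapping layer 1 (3 units + bias)
        # to the 3 units of layer 2
        Theta1 = np.array([[ 0.1, -0.4,  0.2,  0.8],
                           [-0.3,  0.5,  0.7, -0.1],
                           [ 0.6,  0.2, -0.5,  0.4]])

        # a2 = g(Theta1 * a1): the activations of the hidden layer
        a2 = sigmoid(Theta1 @ a1)
        print(a2)   # three values, each squashed into (0, 1)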

    Next,

    (Figure: Neuron Network example of logic &&.png)

    We can do even more complicated things with a neural network, such as logical operations.
    By taking advantage of the sigmoid function, we can approximate every basic logical unit (AND, OR, NOT, and so on); a concrete sketch follows below.
    I believe the picture above explains everything!
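    To make the logical AND (&&) unit from the figure concrete, here is a tiny Python sketch; with weights -30, 20, 20 (the classic choice from the course example) the sigmoid output is pushed toward 1 only when both inputs are 1:

        import numpy as np

        def sigmoid(z):
            return 1.0 / (1.0 + np.exp(-z))

        def logical_and(x1, x2):
            # z = -30 + 20*x1 + 20*x2 is strongly negative unless x1 = x2 = 1,
            # so g(z) is close to 0 except when both inputs are 1
            return sigmoid(-30 + 20 * x1 + 20 * x2)

        for x1 in (0, 1):
            for x2 in (0, 1):
                print(x1, x2, round(logical_and(x1, x2)))
        # 0 0 0
        # 0 1 0
        # 1 0 0
        # 1 1 1   <- the truth table of AND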
    Finally,

    (Figure: Neuron Network Multicalss Classification.png)

    Putting together all the cool things we've learned so far, it's time to realize the true power of multiclass classification. How do we do that?
    We simply encode the classes as distinct one-hot vectors like [1,0,0,0], [0,1,0,0], [0,0,1,0], etc.
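    For instance, with a hypothetical output-layer activation vector for a 4-class problem (my actual implementation is the Octave file predict.m linked below), the prediction is just the index of the largest output unit:

        import numpy as np

        # Hypothetical output-layer activations h(x) for one example with 4 classes
        h = np.array([0.05, 0.83, 0.10, 0.02])

        # The training labels are the distinct one-hot vectors above, i.e. the rows of:
        one_hot_labels = np.eye(4)   # [1,0,0,0], [0,1,0,0], [0,0,1,0], [0,0,0,1]

        # To predict, pick the class whose output unit fires the strongest
        predicted_class = int(np.argmax(h)) + 1   # +1 for 1-based class labels
        print(predicted_class)                    # -> 2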

    You can find my complete code for this section at https://github.com/yhyu13/Coursera-Machine-Learning-Andrew-Ng (file name: predict.m).
