Next, let's understand it with some math:
Neural Network model1.pngHere is a picture of a typical neural network. There are 4 layers: Layer 1 is called the input layer; Layers 2 and 3 are called hidden layers, which compute intermediate features that lead us to a more abstract, meaningful representation; Layer 4 is called the output layer.
answer1.pngWe want to compute the values in Layer 2. How do we do that? Suppose we are already given a 3x4 matrix Theta(1). Then a(2) is just g(Theta(1) * a(1)), done!
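This one-layer forward propagation step can be sketched in a few lines. Note the Theta1 values below are made-up numbers purely for illustration; only its 3x4 shape (3 units in Layer 2, 3 inputs plus 1 bias in Layer 1) comes from the text.

```python
import numpy as np

def sigmoid(z):
    """The logistic activation function g(z) = 1 / (1 + e^-z)."""
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical 3x4 weight matrix Theta(1); values invented for illustration.
Theta1 = np.array([[ 0.1,  0.2, -0.3,  0.5],
                   [-0.4,  0.6,  0.1,  0.2],
                   [ 0.3, -0.1,  0.4, -0.2]])

# Layer 1 activations, with the bias unit a1[0] = 1 prepended.
a1 = np.array([1.0, 0.5, -1.2, 2.0])

# Layer 2 activations: a(2) = g(Theta(1) * a(1)).
a2 = sigmoid(Theta1 @ a1)
print(a2.shape)  # (3,) — one activation per Layer-2 unit
```

Since g squashes its input into (0, 1), every entry of a2 is a value strictly between 0 and 1, ready to be fed (with a new bias unit) into the next layer.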
Next,
Neural Network example of logic &&.pngWe can do even more complicated things with a neural network, such as logical operations.
By taking advantage of the sigmoid(x) function, we can approximate basic logic gates (AND, OR, NOT) and compose them into more complex ones.
I believe the picture above explains everything!
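The picture's AND gate can also be checked numerically. The weights below are the classic ones from the lecture: with theta = [-30, 20, 20], the unit outputs g(-30 + 20*x1 + 20*x2), which is nearly 0 unless both inputs are 1.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Classic lecture weights for logical AND: h(x) = g(-30 + 20*x1 + 20*x2).
# -30 + 20 + 20 = 10  -> g(10)  ~ 1   (only when x1 = x2 = 1)
# otherwise the input is -30 or -10 -> g(z) ~ 0
theta = np.array([-30.0, 20.0, 20.0])

def and_gate(x1, x2):
    # Prepend the bias input 1, then apply the sigmoid unit.
    return round(sigmoid(theta @ np.array([1.0, x1, x2])))

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, and_gate(x1, x2))  # matches x1 AND x2 on all 4 rows
```

Flipping the weights to theta = [-10, 20, 20] gives OR, and [10, -20] gives NOT, so the same single-unit recipe covers all the basic gates.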
Finally,
Packing up all the cool things we've learned so far, it's time to realize the true power of multiclass classification. How do we do that?
We simply encode the classes as distinct one-hot vectors like [1,0,0,0], [0,1,0,0], [0,0,1,0], etc.
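With that encoding, predicting a class just means picking the output unit that fires strongest. A minimal sketch, where the output activations h are invented numbers for illustration:

```python
import numpy as np

# One-hot labels for 4 classes, as in the text:
# rows of the identity matrix are [1,0,0,0], [0,1,0,0], ...
labels = np.eye(4)

# Hypothetical output-layer activations for one example (made-up values).
h = np.array([0.1, 0.05, 0.8, 0.2])

# Predict the class whose output unit has the largest activation,
# i.e. the one-hot vector that h is closest to.
predicted_class = int(np.argmax(h))
print(predicted_class)            # 2 (zero-indexed: the third class)
print(labels[predicted_class])    # [0. 0. 1. 0.]
```

This argmax step is exactly what predict.m does across all training examples at once.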
You can find my complete code for this section at https://github.com/yhyu13/Coursera-Machine-Learning-Andrew-Ng (file name: predict.m).