https://www.youtube.com/results?search_query=Artificial+Neural+Network
This video explains it fairly simply.
-
input dimensions -> Hidden layer(s) (Black box) -> output dimensions
from video
Common "black box" configuration options
- How many hidden layers? (Usually 1)
- How many nodes in each hidden layer? (midway between input and output; less than 2x the input nodes; 2/3 of the input nodes + output nodes)
- Activation Function
- Learning Rate & Momentum
- Iterations & Desired Error level (training stops when either one is reached)
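The node-count rules of thumb above can be sketched as a small helper (the rule names are my own labels, not from the video):

```python
def hidden_layer_size_heuristics(n_in, n_out):
    """Common rules of thumb for picking a hidden layer's node count."""
    return {
        "midway": (n_in + n_out) // 2,         # midway between input and output
        "upper_bound": 2 * n_in,               # stay below 2x the input nodes
        "two_thirds": (2 * n_in) // 3 + n_out, # 2/3 of input nodes + output nodes
    }

print(hidden_layer_size_heuristics(10, 2))
```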
Activation function: Linear / Non-linear (Sigmoid, Tanh)
Learning Rate: How much should this step outcome affect our weights and biases?
Momentum: How much should past outcomes affect our weights and biases?
change = (learningRate * deltaValue) + (momentum * pastChange)
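A minimal sketch of that update rule (the default learning rate, momentum, and the example numbers are made up for illustration):

```python
def weight_update(delta_value, past_change, learning_rate=0.1, momentum=0.9):
    # change = (learningRate * deltaValue) + (momentum * pastChange)
    return learning_rate * delta_value + momentum * past_change

# hypothetical values, just to show the shape of the update
change = weight_update(delta_value=0.5, past_change=0.2)
print(change)  # roughly 0.23: 0.1*0.5 plus 0.9*0.2
```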
https://www.youtube.com/watch?v=aircAruvnKk
In this video the author uses digit recognition as an example. It's easy to follow; strongly recommended.
Neuron -> a thing that holds a number in [0,1] called its "Activation"
from video
Last layer: the determination of which digit it is
Hidden layers: maybe the partial shape features of the different digits
How does the previous layer determine the next layer?
- Assign weights to the connections and compute their weighted sum.
- To keep the result in [0,1] -> Sigmoid function.
Bias: an additional number added to the weighted sum; it sets how high the weighted sum needs to be before the neuron starts getting meaningfully active.
That is just one neuron; every other neuron in this layer is likewise connected to all neurons in the previous layer, each with its own weights and bias.
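Putting those pieces together, one neuron's activation is sigmoid(weighted sum + bias). A minimal sketch (the inputs, weights, and bias here are made-up example values):

```python
import math

def sigmoid(x):
    """Squash any real number into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def neuron_activation(inputs, weights, bias):
    """Weighted sum of inputs, shifted by the bias, squashed by sigmoid."""
    weighted_sum = sum(w * a for w, a in zip(weights, inputs))
    return sigmoid(weighted_sum + bias)

# made-up example values
a = neuron_activation(inputs=[0.2, 0.8], weights=[0.5, -1.0], bias=0.1)
print(a)  # a number strictly between 0 and 1
```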
So what is learning? It's referring to getting the computer to find a valid setting for all these many many numbers so that it'll actually solve the problem at hand.
We can express this with linear algebra: a(1) = sigmoid(W a(0) + b), where W is the matrix of weights, a(0) is the vector of the previous layer's activations, and b is the vector of biases.
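A numpy sketch of this matrix form, one whole layer at a time (the layer sizes and random values are made up):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def layer_forward(a_prev, W, b):
    # next activations = sigmoid(W @ previous activations + biases)
    return sigmoid(W @ a_prev + b)

rng = np.random.default_rng(0)
a0 = rng.random(4)               # 4 input activations
W = rng.standard_normal((3, 4))  # 3 neurons, each with 4 weights
b = rng.standard_normal(3)       # one bias per neuron
a1 = layer_forward(a0, W, b)
print(a1.shape)  # one activation per neuron in the new layer
```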
Comments from viewers