Deep Learning Lecture 1 - Neural Network

Author: 飞奔的红舞鞋 | Posted 2017-12-09 16:40

    Lecture 1: Neural Network

    1. Learning is looking for a function

    Three questions for the training procedure: what model architecture defines the candidate functions, what makes a function "good" (the loss), and how to find the best function (optimization).

    Model Architecture

    Classification tasks include binary classification and multi-class classification, but some cases are not easy to formulate as classification problems.

    A single layer of neurons (the perceptron)

    The bias term can be viewed as the weight on an "always on" feature (an input fixed at 1).

    A single neuron can only handle binary classification.
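A minimal NumPy sketch of such a single neuron trained as a binary classifier; the perceptron mistake-driven update, the ±1 labels, and the learning rate are illustrative assumptions rather than details taken from the lecture.

```python
import numpy as np

def perceptron_train(X, y, lr=0.1, epochs=10):
    """Train a single neuron (perceptron) for binary classification with labels +1 / -1."""
    # Append an "always on" feature of 1s so the bias is just another weight.
    X = np.hstack([X, np.ones((X.shape[0], 1))])
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for x_i, y_i in zip(X, y):
            pred = 1 if x_i @ w >= 0 else -1
            if pred != y_i:              # update only on misclassified samples
                w += lr * y_i * x_i
    return w
```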

    Limitation of the perceptron: it can only produce linear decision boundaries, so it cannot fit data that is not linearly separable (e.g., XOR).

    Neural Network Model (Multi-Layer Perceptron)

    A non-linear activation function (e.g., sigmoid, tanh, or ReLU) is usually used in a neural network; without it, stacked layers collapse into a single linear map.
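A sketch of a two-layer MLP forward pass; the choice of sigmoid for the hidden layer and the layer shapes are assumptions for illustration, not prescribed by the notes.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def mlp_forward(x, W1, b1, W2, b2):
    """Two layers: linear map -> non-linear activation -> linear map."""
    h = sigmoid(W1 @ x + b1)   # hidden layer; the non-linearity is what adds expressive power
    return W2 @ h + b2         # output layer (raw scores)
```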

    What does the "Good" Function mean?

    Loss Function design

    A function corresponds to a particular setting of the model parameters, so picking a function means picking parameter values.

    Measuring model parameters: the loss function, evaluated over the training data, measures how good a given set of parameters is.
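Two common loss designs are sketched below, squared error for regression-style outputs and cross-entropy for classification; the notes do not say which the lecture uses, so treat these as generic examples.

```python
import numpy as np

def squared_error(y_pred, y_true):
    """Mean squared error: a common loss for regression-style outputs."""
    return np.mean((y_pred - y_true) ** 2)

def cross_entropy(probs, y_onehot):
    """Cross-entropy between predicted class probabilities and one-hot labels."""
    return -np.mean(np.sum(y_onehot * np.log(probs + 1e-12), axis=1))
```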

    Optimization

    How to find the best function

    Gradient Descent for Optimization
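A generic full-batch gradient descent loop as a sketch; `grad_fn`, a helper that returns dL/dθ over the whole training set, is an assumed function, not something defined in the notes.

```python
def gradient_descent(theta, grad_fn, lr=0.01, steps=100):
    """Full-batch gradient descent: theta <- theta - lr * dL/dtheta."""
    for _ in range(steps):
        theta = theta - lr * grad_fn(theta)  # grad_fn evaluates the gradient on all training data
    return theta
```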



    Stochastic gradient descent (SGD): update the parameters after computing the gradient on a single training sample.
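A sketch of SGD under the same assumed helper: here `grad_fn(theta, x_i, y_i)` returns the gradient for one sample, and one update is made per sample.

```python
def sgd(theta, X, y, grad_fn, lr=0.01, epochs=10):
    """Stochastic gradient descent: one parameter update per training sample."""
    for _ in range(epochs):
        for x_i, y_i in zip(X, y):
            theta = theta - lr * grad_fn(theta, x_i, y_i)
    return theta
```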


    Mini-batch SGD: update after averaging the gradient over a small batch of samples.


    Mini-batch training is usually more efficient: each update is much cheaper than full-batch gradient descent, yet far less noisy than single-sample SGD, and batches parallelize well on modern hardware.
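A mini-batch SGD sketch; the batch size of 32 and the `grad_fn(theta, X_batch, y_batch)` signature (returning the gradient averaged over the batch) are illustrative assumptions.

```python
def minibatch_sgd(theta, X, y, grad_fn, lr=0.01, batch_size=32, epochs=10):
    """One parameter update per mini-batch, using the gradient averaged over that batch."""
    n = len(X)
    for _ in range(epochs):
        for start in range(0, n, batch_size):
            xb, yb = X[start:start + batch_size], y[start:start + batch_size]
            theta = theta - lr * grad_fn(theta, xb, yb)  # grad_fn averages over the batch
    return theta
```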



    Practical tips

    Give the parameters different initial values by setting them randomly; identical initial weights would make hidden units compute the same thing.
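A sketch of random initialization for one layer; the small Gaussian scale is an assumption, since the notes do not specify an initialization scheme.

```python
import numpy as np

def init_layer(n_in, n_out, scale=0.01):
    """Random initialization: small Gaussian weights, zero biases.
    If every weight started at the same value, all hidden units would stay identical."""
    W = scale * np.random.randn(n_out, n_in)
    b = np.zeros(n_out)
    return W, b
```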

    Tips for mini-batch training

    shuffle training samples before every epoch
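A small sketch of the per-epoch shuffle; the toy arrays `X` and `y` are placeholders for the real training set.

```python
import numpy as np

# Toy arrays standing in for the real training set.
X = np.arange(10, dtype=float).reshape(5, 2)
y = np.array([0, 1, 0, 1, 1])

perm = np.random.permutation(len(X))  # a fresh random order at the start of each epoch
X, y = X[perm], y[perm]               # mini-batches are then drawn from the shuffled arrays
```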

    Possible solutions (e.g., to overfitting)

    more training samples

    some tips: dropout, etc.
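An inverted-dropout sketch applied to hidden activations at training time; the keep probability of 0.5 is an illustrative assumption.

```python
import numpy as np

def dropout(h, keep_prob=0.5, training=True):
    """Inverted dropout: randomly zero activations during training and rescale by keep_prob,
    so the expected activation is unchanged and no scaling is needed at test time."""
    if not training:
        return h
    mask = (np.random.rand(*h.shape) < keep_prob) / keep_prob
    return h * mask
```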
