Pruning CNN for Resource Efficient Inference

Author: 信步闲庭v | Published 2017-10-25 11:13, 147 views

Approach

The proposed scheme for pruning consists of the following steps:

  • Fine-tune the network until convergence on the target task;
  • Alternate iterations of pruning and fine-tuning;
  • Stop pruning when the required trade-off between accuracy and pruning objective is reached.
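The alternating prune/fine-tune loop above can be sketched on a toy model. This is a minimal illustration, not the paper's implementation: `prune_iteratively`, the gate array, and the callback signatures are all hypothetical names, and the "network" is just a vector of weights with one binary pruning gate per neuron.

```python
import numpy as np

def prune_iteratively(weights, importance_fn, finetune_fn,
                      prune_per_step=1, target_neurons=4):
    """Alternate pruning and fine-tuning until the target size is reached.

    importance_fn(weights, gates) -> per-neuron importance scores
    finetune_fn(weights, gates)   -> weights after a short fine-tuning phase
    """
    gates = np.ones(len(weights), dtype=bool)   # pruning gates, True = keep
    while gates.sum() > target_neurons:
        scores = importance_fn(weights, gates)
        scores[~gates] = np.inf                 # never re-rank already-pruned neurons
        drop = np.argsort(scores)[:prune_per_step]
        gates[drop] = False                     # set the least important neuron's gate to 0
        weights = finetune_fn(weights, gates)   # fine-tune between pruning steps
    return weights, gates
```

With magnitude as the importance score and a no-op fine-tuning step, the loop removes the smallest weights one per iteration until only the target number of gates remain open.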

Let C(D|W) be the cost of the network with parameters W on dataset D. The goal of pruning is to find a sparse parameter subset W' that changes the cost as little as possible, subject to a budget B on the number of remaining parameters:

    min_{W'} |C(D|W') − C(D|W)|   subject to   ||W'||_0 ≤ B
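This objective — minimizing |C(D|W') − C(D|W)| under an ℓ0 budget — is combinatorial, which is why the paper resorts to greedy criteria. On a toy linear model it can still be brute-forced; the data, cost function, and budget below are illustrative assumptions, not from the paper.

```python
from itertools import combinations

import numpy as np

# Toy "cost" C(D|W): squared error of a linear model on a fixed dataset.
rng = np.random.default_rng(0)
X = rng.normal(size=(32, 4))
y = X @ np.array([1.5, 0.0, -2.0, 0.1])     # only two weights really matter

def cost(w):
    return float(np.mean((X @ w - y) ** 2))

W = np.array([1.5, 0.0, -2.0, 0.1])         # "trained" weights
budget = 2                                  # ||W'||_0 <= B

# Enumerate every W' with at most `budget` nonzeros and keep the one
# whose cost stays closest to the full model's cost.
base = cost(W)
best = min(combinations(range(4), budget),
           key=lambda keep: abs(cost(np.where(np.isin(np.arange(4), keep), W, 0.0)) - base))
```

Brute force picks the two influential weights; with C filters per layer the search space explodes, motivating the greedy, per-neuron criteria below.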

There are several criteria for pruning, each evaluating the importance of neurons differently:

  • ORACLE pruning
The best approximation of a neuron's importance is the change in the cost of the network once that particular neuron is pruned. This can be implemented by setting the pruning gate of each neuron to 0 in turn and re-estimating C(D|W).
  • Minimum weight
    Prune the kernels with the smallest weight magnitude (e.g. the mean ℓ2-norm of a kernel's weights), on the assumption that low-magnitude kernels produce weak activations and contribute little to the output.
  • Activation based criteria
One of the reasons for ReLU's popularity is that convolutional layers with this activation act as feature detectors. It is therefore reasonable to assume that if the activation value (the output of the neuron) is small, the corresponding feature detector is not important for the network's prediction.
  • Taylor Expansion Approximation
Intuitively, this criterion prunes neurons whose removal has an almost flat effect on the cost function. It requires accumulating the product of the activation and the gradient of the cost w.r.t. that activation — both of which are already computed during back-propagation, so the criterion is essentially free.
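The Taylor-expansion criterion for a feature map reduces to the magnitude of the activation–gradient product, averaged over the batch and spatial locations. A minimal numpy sketch, assuming the activations and their gradients have already been captured during back-propagation (the function name and array layout are illustrative):

```python
import numpy as np

def taylor_criterion(activations, gradients):
    """First-order Taylor importance of each channel of a feature map.

    activations, gradients: arrays of shape (batch, channels, H, W) holding
    a layer's output and the gradient of the cost w.r.t. that output.
    Returns one score per channel: |mean over batch and space of
    activation * gradient|. Channels whose effect on the cost is nearly
    flat score close to zero and are pruned first.
    """
    pointwise = activations * gradients                   # element-wise product
    return np.abs(pointwise.mean(axis=(0, 2, 3)))         # average, then magnitude
```

A channel whose gradient is identically zero scores exactly zero — pruning it leaves the cost unchanged to first order — while channels with correlated activations and gradients score high.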

Experiment

References:
Pruning Convolutional Neural Networks for Resource Efficient Inference, Pavlo Molchanov et al., ICLR 2017.
