TensorFlow variable learning rate

Author: lucientlau | Published 2018-05-19 11:08

    Exponential decay

    # decayed_learning_rate = learning_rate *
    #                         decay_rate ^ (global_step / decay_steps)
    # Use help(tf.train.exponential_decay) in Python 3 to see this function's manual.

    global_step = tf.Variable(0, trainable=False)
    learning_rate = tf.train.exponential_decay(0.1, global_step, 100, 0.96, staircase=True)
    # generate the decayed learning rate

    learning_step = tf.train.GradientDescentOptimizer(learning_rate).minimize(.....,
                                                                 global_step=global_step)
    # Apply the exponentially decayed learning rate. Because global_step is passed
    # to minimize(), each sess.run(learning_step) increments global_step by 1;
    # you do not need to update global_step in the training loop yourself.
    
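    The decay formula in the comment above can be sketched in plain Python. This is an illustrative re-implementation of the documented formula only, not TensorFlow's internal code; the function name is my own.

    ```python
    # Sketch of the exponential-decay formula; mirrors the documented
    # behavior of tf.train.exponential_decay, not its implementation.
    def exponential_decay(learning_rate, global_step, decay_steps, decay_rate,
                          staircase=False):
        exponent = global_step / decay_steps
        if staircase:
            # staircase=True truncates the exponent to an integer, so the
            # rate drops in discrete jumps every decay_steps steps.
            exponent = global_step // decay_steps
        return learning_rate * decay_rate ** exponent

    # With the article's settings (initial rate 0.1, decay by 0.96 every 100 steps):
    for step in (0, 50, 100, 200):
        print(step, exponential_decay(0.1, step, 100, 0.96, staircase=True))
    ```

    With staircase=True the rate stays at 0.1 for the first 100 steps, then drops to 0.1 * 0.96, and so on; with staircase=False it decays smoothly at every step.
    
    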

    Piecewise constant decay

      global_step = tf.Variable(0, trainable=False)
      boundaries = [100000, 110000]
      values = [1.0, 0.5, 0.1]
      # values[0] is used while global_step <= 100000, values[1] while
      # 100000 < global_step <= 110000, and values[2] afterwards.
      learning_rate = tf.train.piecewise_constant(global_step, boundaries, values)

      learning_step = tf.train.GradientDescentOptimizer(learning_rate).minimize(.....,
                                                                   global_step=global_step)

      # Later, whenever we perform an optimization step, we increment global_step.
    
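    The boundary lookup can likewise be sketched in plain Python. This is a hypothetical helper mirroring piecewise_constant's documented semantics, not TensorFlow's graph op:

    ```python
    import bisect

    # Sketch of tf.train.piecewise_constant's lookup rule: values[0] while
    # global_step <= boundaries[0], values[1] while
    # boundaries[0] < global_step <= boundaries[1], and so on.
    def piecewise_constant(global_step, boundaries, values):
        # bisect_left counts the boundaries strictly below global_step,
        # which is exactly the index of the matching interval's value.
        return values[bisect.bisect_left(boundaries, global_step)]

    boundaries = [100000, 110000]
    values = [1.0, 0.5, 0.1]
    for step in (0, 100000, 100001, 110000, 120000):
        print(step, piecewise_constant(step, boundaries, values))
    ```

    Note that the boundaries are inclusive on the left interval: at exactly step 100000 the rate is still 1.0, and it drops to 0.5 only from step 100001.
    
    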


        Original link: https://www.haomeiwen.com/subject/huuodftx.html