Learning rate decay in TensorFlow
```python
import tensorflow as tf

x = tf.Variable(1.0)
y = x.assign_add(1)  # an op that adds 1 to x and returns the new value

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(x))
    print(sess.run(y))
    print(sess.run(x))
```
The output is 1.0, 2.0, 2.0. Note that x itself changes: running y executes the assign_add op, which mutates the variable in place.
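Each further run of the assign op increments x again, which is exactly the mechanism the learning-rate example below relies on. A minimal sketch (same variables as above) to confirm this:

```python
import tensorflow as tf

x = tf.Variable(1.0)
y = x.assign_add(1)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(3):
        print(sess.run(y))  # prints 2.0, 3.0, 4.0 -- each run mutates x again
```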
```python
import tensorflow as tf

global_step = tf.Variable(0, trainable=False)
initial_learning_rate = 0.1  # initial learning rate

# Decay the rate by a factor of 0.9 every 10 steps of global_step.
learning_rate = tf.train.exponential_decay(initial_learning_rate,
                                           global_step=global_step,
                                           decay_steps=10,
                                           decay_rate=0.9)
opt = tf.train.GradientDescentOptimizer(learning_rate)
add_global = global_step.assign_add(1)  # manually advance global_step

with tf.Session() as sess:
    tf.global_variables_initializer().run()
    print(sess.run(learning_rate))
    for i in range(1):
        _, rate = sess.run([add_global, learning_rate])
        print(rate)
```
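Under the hood, tf.train.exponential_decay computes

decayed_learning_rate = initial_learning_rate * decay_rate ** (global_step / decay_steps)

and with staircase=True the quotient global_step / decay_steps is truncated to an integer, so the rate drops in discrete jumps instead of decaying continuously. A minimal sketch comparing the two modes (the names smooth/stepped are illustrative, not from the original post):

```python
import tensorflow as tf

global_step = tf.Variable(0, trainable=False)

# Continuous decay: the rate shrinks a little on every step.
smooth = tf.train.exponential_decay(0.1, global_step,
                                    decay_steps=10, decay_rate=0.9,
                                    staircase=False)
# Staircase decay: the rate only drops once every decay_steps steps.
stepped = tf.train.exponential_decay(0.1, global_step,
                                     decay_steps=10, decay_rate=0.9,
                                     staircase=True)
increment = global_step.assign_add(1)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(20):
        _, s, t = sess.run([increment, smooth, stepped])
        print(s, t)  # smooth changes every step; stepped drops at steps 10, 20, ...
```

In real training you would normally not call assign_add yourself; passing the variable to opt.minimize(loss, global_step=global_step) makes the optimizer increment it on every update.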
Reference:
http://blog.csdn.net/u012436149/article/details/62058318