
Learning rate decay in TensorFlow

2017-08-26 15:29
First, note how assign ops behave, since the decay example below relies on assign_add to advance the global step:

import tensorflow as tf

x = tf.Variable(1.0)
y = x.assign_add(1)  # running this op adds 1 to x and returns the new value
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(x))
    print(sess.run(y))  # this run mutates x
    print(sess.run(x))


The output is 1.0, 2.0, 2.0. Note that x itself changes: every run of the assign_add op increments x, so the third print shows the updated value.
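The decay example below uses tf.train.exponential_decay, which computes

decayed_learning_rate = initial_learning_rate * decay_rate ^ (global_step / decay_steps)

With decay_steps=10 and decay_rate=0.9 (and the default staircase=False), the learning rate shrinks smoothly by a factor of 0.9 every 10 steps. Since the decayed rate is a function of the global_step variable, advancing global_step with an assign_add op, exactly as above, is what drives the decay.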

import tensorflow as tf

global_step = tf.Variable(0, trainable=False)

initial_learning_rate = 0.1  # initial learning rate

learning_rate = tf.train.exponential_decay(initial_learning_rate,
                                           global_step=global_step,
                                           decay_steps=10, decay_rate=0.9)
opt = tf.train.GradientDescentOptimizer(learning_rate)

add_global = global_step.assign_add(1)  # op that advances global_step by 1
with tf.Session() as sess:
    tf.global_variables_initializer().run()
    print(sess.run(learning_rate))  # 0.1 while global_step is 0
    for i in range(1):
        _, rate = sess.run([add_global, learning_rate])
        print(rate)
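After one increment the printed rate is 0.1 * 0.9^(1/10) ≈ 0.0990, and running the loop longer shows the rate decaying smoothly toward zero. (Fetching add_global and learning_rate in a single sess.run leaves their evaluation order unspecified, so the printed rate can lag the increment by one step.)

In a real training loop you normally do not advance global_step by hand. Below is a minimal sketch of the more common pattern; the toy variable w and its quadratic loss are made up for illustration. Passing global_step to minimize() makes the optimizer increment it on every training step, and staircase=True switches the decay from smooth to stepwise:

import tensorflow as tf

global_step = tf.Variable(0, trainable=False)

# staircase=True uses integer division in the exponent, so the rate
# holds steady for decay_steps steps and then drops all at once.
learning_rate = tf.train.exponential_decay(0.1, global_step,
                                           decay_steps=10, decay_rate=0.9,
                                           staircase=True)

w = tf.Variable(5.0)   # toy parameter, stands in for real model weights
loss = tf.square(w)    # toy quadratic loss

opt = tf.train.GradientDescentOptimizer(learning_rate)
# minimize() increments global_step by 1 each time train_op runs.
train_op = opt.minimize(loss, global_step=global_step)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for step in range(30):
        sess.run(train_op)              # also advances global_step
        rate = sess.run(learning_rate)  # fetched separately to avoid ordering issues
        if step % 10 == 9:
            print('after step %d: learning rate %.4f' % (step + 1, rate))

This prints 0.0900, 0.0810, and 0.0729 after steps 10, 20, and 30: one drop per 10 steps, instead of the smooth decay of the staircase=False default.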


Reference:

http://blog.csdn.net/u012436149/article/details/62058318