The difference between softmax_cross_entropy_with_logits and sparse_softmax_cross_entropy_with_logits in TensorFlow
2017-05-09 14:38
http://stackoverflow.com/questions/37312421/tensorflow-whats-the-difference-between-sparse-softmax-cross-entropy-with-logi
Having two different functions is a convenience, as they produce the same result.

The difference is simple:

For sparse_softmax_cross_entropy_with_logits, labels must have the shape [batch_size] and the dtype int32 or int64. Each label is an int in the range [0, num_classes-1].

For softmax_cross_entropy_with_logits, labels must have the shape [batch_size, num_classes] and dtype float32 or float64.

Labels used in softmax_cross_entropy_with_logits are the one-hot version of labels used in sparse_softmax_cross_entropy_with_logits.

Another tiny difference is that with sparse_softmax_cross_entropy_with_logits, you can give -1 as a label to have loss 0 on this label.
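The equivalence described above can be checked directly. The sketch below (assuming TensorFlow 2 with eager execution; the logits and labels are made-up example values) feeds the same batch to both functions, converting the integer labels to one-hot rows with tf.one_hot for the dense variant:

```python
import numpy as np
import tensorflow as tf

# Logits for a batch of 3 examples over 4 classes.
logits = tf.constant([[2.0, 0.5, -1.0, 0.0],
                      [0.1, 3.0, 0.2, -0.5],
                      [1.0, 1.0, 1.0, 1.0]])

# Sparse form: one integer class index per example, shape [batch_size].
sparse_labels = tf.constant([0, 1, 3])
sparse_loss = tf.nn.sparse_softmax_cross_entropy_with_logits(
    labels=sparse_labels, logits=logits)

# Dense form: one-hot rows, shape [batch_size, num_classes].
onehot_labels = tf.one_hot(sparse_labels, depth=4)
dense_loss = tf.nn.softmax_cross_entropy_with_logits(
    labels=onehot_labels, logits=logits)

# Both return one per-example loss of shape [batch_size],
# and the values should match.
print(np.allclose(sparse_loss.numpy(), dense_loss.numpy()))
```

Note that the dense form also accepts soft labels (arbitrary probability distributions over classes, e.g. from label smoothing), which the sparse form cannot express.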