
The difference between softmax_cross_entropy_with_logits and sparse_softmax_cross_entropy_with_logits in TensorFlow

2017-05-09 14:38
http://stackoverflow.com/questions/37312421/tensorflow-whats-the-difference-between-sparse-softmax-cross-entropy-with-logi

Having two different functions is a convenience; given equivalent labels, they produce the same result.

The difference is simple:

For sparse_softmax_cross_entropy_with_logits, labels must have the shape [batch_size] and dtype int32 or int64. Each label is an int in the range [0, num_classes - 1].

For softmax_cross_entropy_with_logits, labels must have the shape [batch_size, num_classes] and dtype float32 or float64.

Labels used in softmax_cross_entropy_with_logits are the one-hot version of the labels used in sparse_softmax_cross_entropy_with_logits.
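
To make the correspondence concrete, here is a minimal sketch (hypothetical labels and shapes; TensorFlow 1.x-style API to match the era of this post, while in TF 2.x the same tf.nn functions run eagerly without a Session):

import numpy as np
import tensorflow as tf

batch_size, num_classes = 4, 3
logits = tf.constant(np.random.randn(batch_size, num_classes), dtype=tf.float32)

# Sparse form: one integer class index per example, shape [batch_size].
sparse_labels = tf.constant([0, 2, 1, 2], dtype=tf.int64)
sparse_loss = tf.nn.sparse_softmax_cross_entropy_with_logits(
    labels=sparse_labels, logits=logits)

# Dense form: one-hot rows, shape [batch_size, num_classes], float dtype.
dense_labels = tf.one_hot(sparse_labels, depth=num_classes)
dense_loss = tf.nn.softmax_cross_entropy_with_logits(
    labels=dense_labels, logits=logits)

with tf.Session() as sess:
    s, d = sess.run([sparse_loss, dense_loss])
    print(np.allclose(s, d))  # True: identical per-example losses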

Another tiny difference is that with sparse_softmax_cross_entropy_with_logits, you can give -1 as a label to get a loss of 0 for that example.
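
One caveat worth adding: how the sparse op actually handles -1 has varied across TensorFlow versions and devices (silently yielding 0 loss as described above, returning NaN, or raising an out-of-range error), so verify on your setup. Below is a minimal sketch of a portable route to the same masking effect, using the dense function and the fact that tf.one_hot maps an out-of-range index such as -1 to an all-zero row:

import tensorflow as tf

logits = tf.constant([[2.0, 1.0, 0.1],
                      [0.5, 0.5, 3.0]])
labels = tf.constant([0, -1])  # -1 is meant to mark "no label / ignore"

# tf.one_hot turns -1 into [0, 0, 0]; an all-zero label row contributes
# exactly 0 to the cross-entropy, which reproduces the masking behavior.
dense_loss = tf.nn.softmax_cross_entropy_with_logits(
    labels=tf.one_hot(labels, depth=3), logits=logits)

with tf.Session() as sess:
    print(sess.run(dense_loss))  # second entry is 0.0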