Explanation of the tf.nn.embedding_lookup function in TensorFlow
2017-05-08 17:29
http://stackoverflow.com/questions/34870614/what-does-tf-nn-embedding-lookup-function-do
========================
The embedding_lookup function retrieves rows of the params tensor. The behavior is similar to using indexing with arrays in numpy. E.g.:

matrix = np.random.random([1024, 64])  # 64-dimensional embeddings
ids = np.array([0, 5, 17, 33])
print(matrix[ids])  # prints a matrix of shape [4, 64]
The params argument can also be a list of tensors, in which case the ids will be distributed among the tensors. E.g., given a list of 3 [2, 64] tensors, the default behavior is that they will represent ids [0, 3], [1, 4], and [2, 5].
partition_strategy controls how the ids are distributed among the list. The partitioning is useful for larger-scale problems, when the matrix might be too large to keep in one piece.
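The id grouping described above can be sketched in plain Python (this is the arithmetic of the default 'mod' strategy, not the TensorFlow implementation itself): global id i lives in partition i % n at row i // n, so 6 ids over 3 partitions group exactly as [0, 3], [1, 4], [2, 5].

```python
def mod_partition(num_ids, num_partitions):
    """Group global ids by the 'mod' strategy: id i lives in
    partition i % num_partitions, at row i // num_partitions."""
    groups = [[] for _ in range(num_partitions)]
    for i in range(num_ids):
        groups[i % num_partitions].append(i)
    return groups

print(mod_partition(6, 3))  # → [[0, 3], [1, 4], [2, 5]]
```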
========================
Yes, this function is hard to understand, until you get the point.
In its simplest form, it is similar to tf.gather: it returns the elements of params according to the indexes specified by ids.
For example (assuming you are inside a tf.InteractiveSession()):

params = tf.constant([10, 20, 30, 40])
ids = tf.constant([0, 1, 2, 3])
print(tf.nn.embedding_lookup(params, ids).eval())
would return [10 20 30 40], because the first element (index 0) of params is 10, the second element of params (index 1) is 20, etc.
Similarly,
params = tf.constant([10, 20, 30, 40])
ids = tf.constant([1, 1, 3])
print(tf.nn.embedding_lookup(params, ids).eval())
would return: [20 20 40]
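For a single tensor, this lookup is nothing more than fancy indexing; a trivial pure-Python sketch of the same operation:

```python
params = [10, 20, 30, 40]
ids = [1, 1, 3]
looked_up = [params[i] for i in ids]
print(looked_up)  # → [20, 20, 40]
```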
But embedding_lookup is more than that. The params argument can be a list of tensors, rather than a single tensor.

params1 = tf.constant([1, 2])
params2 = tf.constant([10, 20])
ids = tf.constant([2, 0, 2, 1, 2, 3])
result = tf.nn.embedding_lookup([params1, params2], ids)
In such a case, the indexes specified in ids correspond to elements of the tensors according to a partition strategy, where the default partition strategy is 'mod'.
In the 'mod' strategy, index 0 corresponds to the first element of the first tensor in the list. Index 1 corresponds to the first element of the second tensor. Index 2 corresponds to the first element of the third tensor, and so on. Simply put, index i corresponds to the first element of the (i+1)-th tensor, for all indexes 0..(n-1), assuming params is a list of n tensors.
Now, index n cannot correspond to tensor n+1, because the list params contains only n tensors. So index n corresponds to the second element of the first tensor. Similarly, index n+1 corresponds to the second element of the second tensor, and so on.
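This index arithmetic can be written down directly (a pure-Python sketch, not the TensorFlow internals): with n tensors in the list, index i lands in tensor i % n at position i // n.

```python
def mod_lookup_position(index, n):
    """Map a global index to (tensor, position-within-tensor)
    under the 'mod' strategy with n tensors in the list."""
    return index % n, index // n

# With n = 2 tensors (params1, params2) as in the example:
for i in [0, 1, 2, 3]:
    print(i, mod_lookup_position(i, 2))
```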
So, in the code
params1 = tf.constant([1, 2])
params2 = tf.constant([10, 20])
ids = tf.constant([2, 0, 2, 1, 2, 3])
result = tf.nn.embedding_lookup([params1, params2], ids)
index 0 corresponds to the first element of the first tensor: 1
index 1 corresponds to the first element of the second tensor: 10
index 2 corresponds to the second element of the first tensor: 2
index 3 corresponds to the second element of the second tensor: 20
Thus, the result would be:
[ 2 1 2 10 2 20]
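The whole walkthrough above can be reproduced with a few lines of plain Python (a sketch of the 'mod' strategy for 1-D partitions, under the assumptions of the example; not the TensorFlow implementation):

```python
def embedding_lookup_mod(params, ids):
    """Pure-Python sketch of tf.nn.embedding_lookup with the
    default 'mod' partition strategy, for a list of 1-D partitions:
    id i maps to partition i % n, row i // n."""
    n = len(params)
    return [params[i % n][i // n] for i in ids]

params1 = [1, 2]
params2 = [10, 20]
print(embedding_lookup_mod([params1, params2], [2, 0, 2, 1, 2, 3]))
# → [2, 1, 2, 10, 2, 20]
```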