Problem with attributes_w_n
you-old opened this issue · comments
older boy commented
If all of a sample's attributes are 0, then `attributes_w_n` comes out as all zeros. I looked at where the loss is backpropagated, and everything there becomes 0 as well. Won't that be a problem?
```python
import numpy as np
import tensorflow as tf  # TF 1.x API

attribute_batch = np.random.randint(0, 1, [4, 6])  # all attributes are 0 (high is exclusive)
attributes_w_n = tf.to_float(attribute_batch[:, 1:6])
mat_ratio = tf.reduce_mean(attributes_w_n, axis=0)
mat_ratio = tf.map_fn(lambda x: (tf.cond(x > 0, lambda: 1 / x, lambda: float(4))), mat_ratio)
attributes_w_n = tf.convert_to_tensor(attributes_w_n * mat_ratio)
attributes_w_n = tf.reduce_sum(attributes_w_n, axis=1)
loss_sum = tf.reduce_sum(tf.to_float(np.random.rand(4, 3)))   # stand-in for some loss
_sum_k = tf.reduce_sum(tf.to_float(np.random.rand(4, 196)))
loss_sum = tf.reduce_mean(loss_sum * _sum_k * attributes_w_n)  # 0
with tf.Session() as sess1:
    print(attributes_w_n.eval())
    print(loss_sum.eval())  # the loss for this batch is 0
```
Guoqiang QI commented
Yes, it has an effect: samples whose attributes are all 0 are unable to participate in training.
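One possible mitigation (not part of the original code, just a sketch) is to put a floor under the per-sample weight, so that samples with no positive attributes keep a weight of 1 instead of 0 and still produce gradients. Written in plain NumPy for clarity; the `min_weight` and `default_ratio` parameters are assumptions:

```python
import numpy as np

def attribute_weights(attribute_batch, default_ratio=4.0, min_weight=1.0):
    """Per-sample loss weights from binary attribute columns 1..5.

    Mirrors the weighting above: each attribute is up-weighted by the
    inverse of its batch frequency (default_ratio when absent from the
    batch). min_weight is a proposed floor so that all-zero-attribute
    samples still contribute to the loss instead of being zeroed out.
    """
    attrs = attribute_batch[:, 1:6].astype(np.float32)
    ratio = attrs.mean(axis=0)  # per-attribute frequency in the batch
    ratio = np.where(ratio > 0, 1.0 / np.maximum(ratio, 1e-12), default_ratio)
    w = (attrs * ratio).sum(axis=1)  # per-sample weight, 0 for all-zero rows
    return np.maximum(w, min_weight)  # floor keeps those samples trainable

# All attributes zero: the original scheme yields weight 0, the floor keeps 1.0
batch = np.zeros([4, 6], dtype=np.int64)
print(attribute_weights(batch))  # → [1. 1. 1. 1.]
```

Samples that do carry rare attributes keep their up-weighting unchanged; only the degenerate all-zero rows are lifted to the floor.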
older boy commented
OK, understood.