
Outputting after concat is, in terms of the computed result, the same as the paper's sigmoid(y_FM + y_DNN), computed separately and then summed. #62

Open
van19 opened this issue Jan 25, 2020 · 0 comments


van19 commented Jan 25, 2020

I have the same question.
As the paper has it, y_FM = reduce_sum(first_order, 1) + reduce_sum(second_order, 1)
and y_DNN = reduce_sum(y_deep, 1). That is not equivalent to
concat([first_order, second_order, y_deep]) × weights["concat_projection"], is it? After all, weights["concat_projection"] is a (trainable) vector that is not all ones. Moreover, only the wx term and the last DNN layer should carry a weight; the second-order <v_i, v_j> x_i x_j term should not be multiplied by any weight.
Or is my understanding off?
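
To make the inequivalence concrete, here is a minimal NumPy sketch (the shapes and `w_proj` are hypothetical stand-ins for the tensors discussed above). The two formulations coincide only when the projection weights happen to be all ones:

```python
import numpy as np

rng = np.random.default_rng(0)
batch, field, k, deep = 4, 10, 8, 32  # hypothetical sizes

# Per-sample components as produced in the forward pass
# (shapes assumed from the discussion above):
first_order = rng.normal(size=(batch, field))   # w_i * x_i per field
second_order = rng.normal(size=(batch, k))      # FM pairwise term per embedding dim
y_deep = rng.normal(size=(batch, deep))         # last hidden layer of the DNN

# Paper-style output as stated above: reduce each part (implicit unit
# weights), then sum
y_paper = first_order.sum(axis=1) + second_order.sum(axis=1) + y_deep.sum(axis=1)

# Repo-style output: concat, then a learned projection
concat = np.concatenate([first_order, second_order, y_deep], axis=1)
w_proj = rng.normal(size=(field + k + deep, 1))  # stand-in for weights["concat_projection"]
y_repo = (concat @ w_proj).ravel()

print(np.allclose(y_paper, y_repo))  # False in general
ones = np.ones((field + k + deep, 1))
print(np.allclose(y_paper, (concat @ ones).ravel()))  # True: equal iff weights are all ones
```

So the learned projection strictly generalizes the paper's unweighted sum; it reproduces it only at the all-ones point in weight space.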

> Outputting after concat is, in terms of the computed result, the same as the paper's sigmoid(y_FM + y_DNN), computed separately and then summed.

> I think multiplying first_order by feature_bias is redundant. Since the embedding result is concatenated with the deep and second-order parts and then fed through a projection layer, the feat_value → projection path alone is already equivalent to the LR form <w, x> (Equation 2 in the paper); pre-multiplying by feature_bias, with no nonlinear activation in between, is completely unnecessary.

> PS: For discussions here on GitHub, wouldn't English be more appropriate?

Originally posted by @futureer in #32 (comment)
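
On the feature_bias point in the quote above: a minimal NumPy sketch, assuming the first-order block feeds straight into a linear projection with no activation in between (all names here are hypothetical stand-ins). Two stacked linear maps collapse into one, so feature_bias adds parameters but no expressiveness:

```python
import numpy as np

rng = np.random.default_rng(1)
batch, field = 4, 10  # hypothetical sizes

feat_value = rng.normal(size=(batch, field))  # raw feature values x_i
feature_bias = rng.normal(size=(field,))      # per-feature weight, as in the repo
w_proj = rng.normal(size=(field,))            # projection slice over the first-order block

# Two-step path: elementwise feature_bias, then the linear projection
y_two_step = (feature_bias * feat_value) @ w_proj

# One linear map suffices: fold feature_bias into the projection weights
y_one_step = feat_value @ (feature_bias * w_proj)

print(np.allclose(y_two_step, y_one_step))  # True: the two paths are identical
```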
