
Network structure differs between the TF and torch versions: W&D wide-input question #4

Open
GrangerZyj opened this issue Mar 25, 2022 · 1 comment

@GrangerZyj
For example, in W&D the TensorFlow version builds a 1-dim embedding for each of the sparse cols and feeds these into the linear (wide) part together with the dense cols, while in the PyTorch version only the dense cols go into the linear part.

But doesn't the paper feed the very sparse features (the sparse cols) into the linear/wide part? Why does the PyTorch version feed only the dense cols into the linear layer?

dense_input, sparse_inputs = x[:, :len(self.dense_feature_cols)], x[:, len(self.dense_feature_cols):]
sparse_inputs = sparse_inputs.long()
sparse_embeds = [self.embed_layers['embed_' + str(i)](sparse_inputs[:, i])
                 for i in range(sparse_inputs.shape[1])]
sparse_embeds = torch.cat(sparse_embeds, dim=-1)

# deep part gets the sparse embeddings concatenated with the dense features
dnn_input = torch.cat([sparse_embeds, dense_input], dim=-1)

# wide (linear) part only receives the dense features here
wide_out = self.linear(dense_input)

deep_out = self.dnn_network(dnn_input)
deep_out = self.final_linear(deep_out)

outputs = torch.sigmoid(0.5 * (wide_out + deep_out))
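
For reference, a minimal sketch (not from this repo; the `WidePart` module and the `dense_dim` / `sparse_vocab_sizes` arguments are hypothetical) of how the wide part could also consume the sparse cols through 1-dim embeddings, which is what the TensorFlow version does:

```python
import torch
import torch.nn as nn

class WidePart(nn.Module):
    """Wide component fed with dense cols plus 1-dim embeddings of sparse cols."""
    def __init__(self, dense_dim, sparse_vocab_sizes):
        super().__init__()
        # one 1-dim embedding per sparse feature, mirroring the TF version's linear input
        self.wide_embeds = nn.ModuleList(
            [nn.Embedding(vocab, 1) for vocab in sparse_vocab_sizes]
        )
        self.linear = nn.Linear(dense_dim + len(sparse_vocab_sizes), 1)

    def forward(self, dense_input, sparse_inputs):
        # sparse_inputs: LongTensor of shape (batch, num_sparse)
        sparse_1d = [emb(sparse_inputs[:, i]) for i, emb in enumerate(self.wide_embeds)]
        wide_input = torch.cat([dense_input] + sparse_1d, dim=-1)
        return self.linear(wide_input)
```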

aobangli commented Jan 27, 2023

I also think the wide input should be the cross features and the sparse features. Has the OP figured this out?
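
For context, a minimal sketch of the cross-product idea from the Wide & Deep paper, with feature pairs hashed into bucket ids for a linear wide part; the feature values, `HASH_BUCKETS`, and the `cross_feature_ids` helper are hypothetical, not from this repo:

```python
import torch
import torch.nn as nn

HASH_BUCKETS = 10000  # hypothetical bucket count for hashed cross features

def cross_feature_ids(cat_a, cat_b, num_buckets=HASH_BUCKETS):
    """Hash pairs of categorical values into bucket ids for the wide part."""
    return torch.tensor(
        [hash((a, b)) % num_buckets for a, b in zip(cat_a, cat_b)],
        dtype=torch.long,
    )

# the wide part is then just a linear model over the hashed cross ids
wide_embedding = nn.Embedding(HASH_BUCKETS, 1)   # one weight per cross bucket
ids = cross_feature_ids(["item_1", "item_2"], ["cate_a", "cate_b"])
wide_out = wide_embedding(ids)                   # shape: (batch, 1)
```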
