this code is not consistent with paper #3

Closed
saladcat opened this issue Mar 4, 2021 · 6 comments

Comments

@saladcat

saladcat commented Mar 4, 2021

        for k in range(0, self.n_layers):
            if self.layer_fun == 'gcf':
                ego_embeddings_s = one_graph_layer_gcf(A_fold_hat_s, ego_embeddings_s, weights_s, k)
                ego_embeddings_t = one_graph_layer_gcf(A_fold_hat_t, ego_embeddings_t, weights_t, k)
            if k >= self.n_layers - self.n_interaction and self.n_interaction > 0:
                if self.fuse_type_in == 'la2add':
                    ego_embeddings_s, ego_embeddings_t = self.s_t_la2add_layer(ego_embeddings_s, ego_embeddings_t,
                                                                               self.lambda_s, self.lambda_t,
                                                                               self.domain_laplace)

            norm_embeddings_s = tf.math.l2_normalize(ego_embeddings_s, axis=1)
            norm_embeddings_t = tf.math.l2_normalize(ego_embeddings_t, axis=1)

            if self.connect_way == 'concat':
                all_embeddings_s += [norm_embeddings_s]
                all_embeddings_t += [norm_embeddings_t]
            elif self.connect_way == 'mean':
                all_embeddings_s += norm_embeddings_s
                all_embeddings_t += norm_embeddings_t
        if self.connect_way == 'concat':
            all_embeddings_s = tf.concat(all_embeddings_s, 1)
            all_embeddings_t = tf.concat(all_embeddings_t, 1)
        elif self.connect_way == 'mean':
            all_embeddings_s = all_embeddings_s / (self.n_layers + 1)
            all_embeddings_t = all_embeddings_t / (self.n_layers + 1)

where

            norm_embeddings_s = tf.math.l2_normalize(ego_embeddings_s, axis=1)
            norm_embeddings_t = tf.math.l2_normalize(ego_embeddings_t, axis=1)
            if self.connect_way == 'concat':
                all_embeddings_s += [norm_embeddings_s]
                all_embeddings_t += [norm_embeddings_t]
            elif self.connect_way == 'mean':
                all_embeddings_s += norm_embeddings_s
                all_embeddings_t += norm_embeddings_t

Why do you add norm_embeddings to all_embeddings_s?
This code is not consistent with the paper!
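(Editorial note: for readers following the dispute, here is a minimal NumPy sketch of what the quoted `tf.math.l2_normalize(x, axis=1)` call computes — each row is scaled to unit L2 norm before being collected. The function name is mine, not from the repo.)

```python
import numpy as np

def l2_normalize_rows(x, eps=1e-12):
    # Row-wise L2 normalization, mirroring tf.math.l2_normalize(x, axis=1):
    # divide each row by its L2 norm (guarded against division by zero).
    norms = np.linalg.norm(x, axis=1, keepdims=True)
    return x / np.maximum(norms, eps)

x = np.array([[3.0, 4.0],
              [0.0, 2.0]])
y = l2_normalize_rows(x)
# Every row of y now has unit L2 norm.
```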

@saladcat changed the title from "wrong" to "this code is not consistent with paper" on Mar 4, 2021
@sunshinelium
Owner

First of all, thank you very much for your interest in our work. The code here runs experiments with two layer-fusion strategies; the paper considers the case where self.connect_way == 'concat'.
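(Editorial note: the two fusion strategies the owner mentions can be sketched in NumPy as follows. Function names and shapes are illustrative assumptions, not the repo's API: each propagation layer yields an embedding matrix, each is L2-normalized row-wise, and the layers are then either concatenated along the feature axis or averaged.)

```python
import numpy as np

def l2_normalize_rows(x, eps=1e-12):
    # Row-wise L2 normalization, mirroring tf.math.l2_normalize(x, axis=1).
    norms = np.linalg.norm(x, axis=1, keepdims=True)
    return x / np.maximum(norms, eps)

def aggregate(layer_embeddings, connect_way):
    # layer_embeddings: list of (n_nodes, d) arrays, one per GCN layer.
    normed = [l2_normalize_rows(e) for e in layer_embeddings]
    if connect_way == 'concat':
        # Paper's setting: stack per-layer embeddings side by side
        # -> final shape (n_nodes, d * n_layers).
        return np.concatenate(normed, axis=1)
    elif connect_way == 'mean':
        # Alternative: average the per-layer embeddings
        # -> final shape (n_nodes, d).
        return sum(normed) / len(normed)
    raise ValueError(connect_way)

layers = [np.random.rand(5, 8) for _ in range(3)]
concat_out = aggregate(layers, 'concat')  # shape (5, 24)
mean_out = aggregate(layers, 'mean')      # shape (5, 8)
```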

@saladcat
Author

saladcat commented Mar 5, 2021

norm_embeddings_s = tf.math.l2_normalize(ego_embeddings_s, axis=1)
Sorry, maybe I didn't make myself clear. Why is L2 normalization applied here?

@sunshinelium
Owner

sunshinelium commented Mar 5, 2021 via email

@saladcat
Author

saladcat commented Mar 5, 2021

But why is this L2-normalized term concatenated together afterwards? Also, this is the multi-layer GCN convolution part; a regularization term shouldn't appear here.

@sunshinelium
Owner

sunshinelium commented Mar 5, 2021 via email

@saladcat
Author

saladcat commented Mar 5, 2021

OK, understood. Thank you for your reply.
