
Is code in utils line 117-line 120 real? #1

Closed
PolarisRisingWar opened this issue Aug 12, 2021 · 9 comments

Comments

@PolarisRisingWar commented Aug 12, 2021

y = y0
for i in range(K): 
    y = torch.matmul(adj, y)
    for i in idx:

The loop variable `i` is used both in the outer loop and in the inner loop — is that a mistake?
Also, `y = y0` at the top is a direct reference (alias), so won't later modifications to `y` also affect `y0`? If the intent here was to preserve the value of `y0`, shouldn't it use something like `copy()` or `clone()`?

@PolarisRisingWar PolarisRisingWar changed the title Is code in utils line 118-line 120 real? Is code in utils line 117-line 120 real? Aug 12, 2021
@PolarisRisingWar (Author)

I experimented a bit: reusing `i` in both the outer and inner loop does not seem to affect the `i` inside the inner loop.
But the direct reference `y = y0` does seem like it would change `y0`.
Could you take a look?

@PolarisRisingWar (Author)

I also noticed that the `inference()` function in models.py is written the same way:

    def inference(self, h, adj): 
        y0 = torch.softmax(h, dim=-1) 
        y = y0
        for i in range(self.K):
            y = (1 - self.alpha) * torch.matmul(adj, y) + self.alpha * y0
        return y

It has the same issue — is the direct reference `y = y0` safe here?

@DongHande (Owner)

Thanks for your interest in our work. Replies to your questions:

  1. On the nested loops: the loop body never uses the outer `i`, so no bug results — the outer loop only controls how many iterations run. That said, the code is indeed poorly written and hurts readability; I will update it to `for _ in range(K):`.
  2. That pattern is fine and does not affect `y0`. You can step through it in a debugger and inspect the variable values to verify.
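The reason point 2 holds can be sketched in a few lines (a minimal NumPy sketch; the same Python name-binding rules apply to PyTorch tensors): `y = y0` makes `y` a second name for the same array, but `y = adj @ y` rebinds `y` to a brand-new result array, so `y0` is never touched. Only an in-place operation would mutate it through the alias.

```python
import numpy as np

adj = np.array([[0.0, 1.0],
                [1.0, 0.0]])
y0 = np.array([[1.0], [2.0]])

y = y0                # alias: y and y0 refer to the same array
y = adj @ y           # REBINDING: y now points to a newly allocated result
assert y is not y0
assert np.array_equal(y0, [[1.0], [2.0]])   # y0 unchanged

# An IN-PLACE operation, by contrast, mutates y0 through the alias —
# this is the case where copy()/clone() would be needed:
y = y0
y += 1                # in-place add modifies the shared array
assert np.array_equal(y0, [[2.0], [3.0]])   # y0 changed
```

This is why the propagation loops above are safe as written: every iteration assigns a fresh tensor to the name `y` rather than writing into the old one.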

@PolarisRisingWar (Author)

Thank you very much! Although I hadn't understood the mechanism, I debugged it myself and confirmed that it really doesn't affect `y0`. Thanks for your reply!

@PolarisRisingWar (Author)

One more question: I see that besides normalizing the adjacency matrix, the code also normalizes the features. Is there a specific theoretical justification for normalizing the features?

DongHande (Owner) commented Aug 12, 2021 via email

@PolarisRisingWar (Author)

Thanks for your reply! I went and looked at the PPNP project's code; the feature normalization they do appears to be this function:

import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def normalize_attributes(attr_matrix):
    # Row-wise L1 normalization of a (sparse or dense) attribute matrix.
    epsilon = 1e-12
    if isinstance(attr_matrix, sp.csr_matrix):
        attr_norms = spla.norm(attr_matrix, ord=1, axis=1)
        attr_invnorms = 1 / np.maximum(attr_norms, epsilon)
        attr_mat_norm = attr_matrix.multiply(attr_invnorms[:, np.newaxis])
    else:
        attr_norms = np.linalg.norm(attr_matrix, ord=1, axis=1)
        attr_invnorms = 1 / np.maximum(attr_norms, epsilon)
        attr_mat_norm = attr_matrix * attr_invnorms[:, np.newaxis]
    return attr_mat_norm

Is this the code you referenced?
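For reference, the effect of that function can be checked in a few lines (a sketch with a hypothetical toy matrix, not the original PPNP code): each row is scaled by its L1 norm, so nonzero rows sum to 1, while the `epsilon` guard keeps all-zero rows at zero instead of producing NaN.

```python
import numpy as np

# Hypothetical toy feature matrix; each row holds one node's attributes.
attr = np.array([[1.0, 3.0],
                 [0.0, 0.0],   # all-zero row: epsilon guards against 0/0
                 [2.0, 2.0]])

epsilon = 1e-12
norms = np.linalg.norm(attr, ord=1, axis=1)              # per-row L1 norms
attr_norm = attr / np.maximum(norms, epsilon)[:, None]   # scale each row

# Nonzero rows now sum to 1; the zero row stays zero rather than NaN.
assert np.allclose(attr_norm.sum(axis=1), [1.0, 0.0, 1.0])
```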

DongHande (Owner) commented Aug 12, 2021 via email

@PolarisRisingWar (Author)

Got it — thank you very much for your patient replies, and thanks to you and your team for the excellent paper!
