
Simplification #2

Closed
jmercat opened this issue Jul 2, 2020 · 14 comments

@jmercat

jmercat commented Jul 2, 2020

When reading the code in Graph.py, I had a hard time understanding the adjacency computation, so I wrote this simplified version and tested that it is equivalent:

	def get_adjacency(self, A):
		# compute hop steps
		transfer_mat = [np.linalg.matrix_power(A, d) for d in range(self.max_hop + 1)]
		transfer_mat = np.stack(transfer_mat) > 0
		# transfer_mat is True for all reachable node pairs regardless of hop distance.
		# Here we filter out pairs that are already reachable at a lower hop distance:
		cum_A = np.zeros_like(A[0], dtype='bool')
		for i in range(self.max_hop + 1):
			transfer_mat[i, cum_A] = 0
			cum_A = np.logical_or(cum_A, transfer_mat[i])
		return transfer_mat

	def normalize_adjacency(self, A):
		A = A / np.maximum(1, A.sum((0, 1)))
		return A

Unlike the original code, if the input A is not of size (n_nodes, n_nodes), the output will still have the same shape as A. Otherwise, it gives the same results.
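The filtering logic above can be sketched as a standalone function (with max_hop as an explicit argument rather than self.max_hop, which is an assumption here) and checked on a toy 3-node chain graph with self-loops:

```python
import numpy as np

def get_adjacency(A, max_hop):
    # Boolean reachability at each hop distance 0..max_hop.
    transfer_mat = [np.linalg.matrix_power(A, d) for d in range(max_hop + 1)]
    transfer_mat = np.stack(transfer_mat) > 0
    # Keep only the shortest hop distance for each pair of nodes.
    cum_A = np.zeros_like(A, dtype=bool)
    for i in range(max_hop + 1):
        transfer_mat[i, cum_A] = 0          # drop pairs already reachable at a lower hop
        cum_A = np.logical_or(cum_A, transfer_mat[i])
    return transfer_mat

# Chain graph 0-1-2 with self-loops on every node.
A = np.array([[1, 1, 0],
              [1, 1, 1],
              [0, 1, 1]], dtype=float)
hops = get_adjacency(A, max_hop=2)
print(hops.astype(int))
```

Layer 0 keeps only the diagonal (each node at hop 0 from itself), layer 1 the direct neighbors, and layer 2 only the pair (0, 2) that first becomes reachable after two hops.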

@xincoder
Owner

xincoder commented Jul 3, 2020

@jmercat Thank you for your interest in our work.
Your simplification code is easier to understand and will be helpful to others.
Thanks.

@jmercat
Author

jmercat commented Jul 3, 2020

@xincoder I realize that I haven't thanked you for your great work and for sharing the code, so thank you very much, I am grateful. I am working on an idea that might improve your model with minimal changes to it. I'll keep you informed.

@xincoder
Owner

xincoder commented Jul 3, 2020

@jmercat Thank you very much. It makes my day.😄
I really hope to see our work inspire others in both academia and industry.

@tinmodeHuang

tinmodeHuang commented Sep 5, 2020

Hi @xincoder, could you tell me what tool you used to draw the algorithm architecture in the paper? Thanks in advance!

@xincoder
Owner

xincoder commented Sep 5, 2020

Hi @tinmodeHuang, thank you for your interest in our work.
Microsoft PowerPoint was used to draw the model architecture.

@tinmodeHuang

tinmodeHuang commented Sep 10, 2020

Well, I also wonder how to visualize the predicted trajectories. After getting a general understanding of the dependencies among the scripts, I haven't found a dedicated utility to visualize them.

@xincoder
Owner

The visualized results reported in our paper were generated using Plotly. It is very easy to draw them using any library you are familiar with, e.g., Matplotlib.
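Since any plotting library works here, a minimal Matplotlib sketch might look like the following. The arrays are made-up illustrative data, not the model's actual output format, and the headless Agg backend is used so the script runs without a display:

```python
import os
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend: render to a file, no display needed
import matplotlib.pyplot as plt

# Hypothetical observed history and predicted future for one agent (x, y in meters).
observed = np.array([[0.0, 0.0], [1.0, 0.2], [2.0, 0.5]])
predicted = np.array([[2.0, 0.5], [3.0, 0.9], [4.0, 1.4]])

fig, ax = plt.subplots()
ax.plot(observed[:, 0], observed[:, 1], "o-", label="observed")
ax.plot(predicted[:, 0], predicted[:, 1], "x--", label="predicted")
ax.set_xlabel("x (m)")
ax.set_ylabel("y (m)")
ax.legend()
fig.savefig("trajectory.png")
```

Swapping in Plotly (as used for the paper figures) would mostly change the plotting calls, not the data handling.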

@tinmodeHuang

Thanks for the instant reply! I'm very new to Plotly-like libraries. If possible, would you be willing to share your visualization script with me? I could then use it as an introduction to the relevant library.

@tinmodeHuang

tinmodeHuang commented Sep 18, 2020

@xincoder I have now made an attempt at visualizing it myself, and I agree with you; I'm sorry for thinking of getting it without any effort. By the way, I'm confused about the plane rotation by a random angle in the script xin_feeder_baidu.py: is it done to add noise and improve robustness?

@xincoder
Owner

@tinmodeHuang, yes. The rotation is a kind of data augmentation applied during training. The benefit of doing so is reported in our GRIP++ paper (B12 vs. B13 in Table 3).
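The idea of this augmentation can be sketched as follows; this is a generic illustration of rotating 2-D trajectory points by one random angle, not the exact code from xin_feeder_baidu.py:

```python
import numpy as np

def random_rotate(xy, rng=None):
    """Rotate a batch of 2-D points by a single random angle.

    xy: array of shape (..., 2) holding (x, y) coordinates.
    """
    rng = np.random.default_rng() if rng is None else rng
    theta = rng.uniform(0.0, 2.0 * np.pi)
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s],
                  [s,  c]])           # standard 2-D rotation matrix
    return xy @ R.T

traj = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.0]])
rotated = random_rotate(traj)
# Rotation preserves distances between points, so the trajectory's
# shape is unchanged; only its orientation varies between epochs.
```

Applying a fresh rotation each time a sample is drawn makes the model see trajectories in all orientations, which is what improves robustness.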

@tinmodeHuang

Thanks for the reminder! Also, would you (or people in general) mind being interrupted by questioners just expressing their appreciation? I have always wondered about that.

@xincoder
Owner

@tinmodeHuang? Sorry, I did not get your point. Would you please provide more details about your question? Thanks.

@moriartyjack0520

moriartyjack0520 commented Dec 28, 2020

@xincoder @jmercat Thank you, your code is very helpful for me.
But I don't understand what the code under # compute hop steps is for. What does max_hop mean? Could you help me? Thanks!
`transfer_mat = [np.linalg.matrix_power(A, d) for d in range(self.max_hop + 1)]`

@xincoder
Owner

@moriartyjack0520 Thank you for your question. The hop is a concept from graph theory: entry (i, j) of the d-th power of the adjacency matrix counts the walks of length d from node i to node j, so a nonzero entry means node j is reachable from node i within d hops. The code you highlighted computes this for every hop distance from 0 to max_hop by repeatedly multiplying the adjacency matrix, which is a simple way to achieve this goal.
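This path-counting property is easy to verify on a tiny graph:

```python
import numpy as np

# Adjacency matrix of the path graph 0-1-2 (no self-loops).
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]])

# (A^d)[i, j] = number of walks of length d from node i to node j.
A2 = np.linalg.matrix_power(A, 2)
print(A2[0, 2])  # 1: the single length-2 walk 0 -> 1 -> 2
print(A2[1, 1])  # 2: the walks 1 -> 0 -> 1 and 1 -> 2 -> 1
```

With max_hop = 2, get_adjacency would therefore mark node 2 as a 2-hop neighbor of node 0, since A2[0, 2] > 0 while A[0, 2] == 0.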
