
for GNN #24

Closed · aozorahime opened this issue Dec 8, 2022 · 12 comments

@aozorahime

Hi, I know it's a bit off-topic, but I am curious whether I can apply multi-task learning to a graph neural network. From what I understand of HPS (hard parameter sharing), the encoder/decoder should be shared across tasks. Should I create an encoder on top of the graph layers? I'm kinda stuck in this experiment, so any suggestion would be helpful. Thanks

@Baijiong-Lin (Collaborator)

I know very little about graph neural networks, but I think you could try defining the encoder in HPS as a few graph layers that extract shared features among the tasks.
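For example, a minimal sketch of that idea: a shared GNN encoder under hard parameter sharing, with one small head per task (hypothetical layer sizes, assuming PyTorch Geometric is installed):

```python
import torch
import torch.nn as nn
from torch_geometric.nn import GCNConv, global_mean_pool

class HardSharingGNN(nn.Module):
    """Shared GNN encoder (HPS) with one task-specific head per task."""
    def __init__(self, in_dim, hidden_dim, task_out_dims):
        super().__init__()
        # Shared encoder: these graph layers are reused by every task.
        self.conv1 = GCNConv(in_dim, hidden_dim)
        self.conv2 = GCNConv(hidden_dim, hidden_dim)
        # Task-specific decoders: one linear head per task.
        self.heads = nn.ModuleList(
            [nn.Linear(hidden_dim, out_dim) for out_dim in task_out_dims]
        )

    def forward(self, x, edge_index, batch):
        h = self.conv1(x, edge_index).relu()
        h = self.conv2(h, edge_index).relu()
        h = global_mean_pool(h, batch)           # one embedding per graph
        return [head(h) for head in self.heads]  # one output per task
```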

@aozorahime (Author)

I see. What about an attention layer on top of the graph layers: should I define the encoder after that? Do you have an MTL example (either code or a reference) for text classification?

@Baijiong-Lin (Collaborator)

We have no such GNN example in LibMTL at the moment. But there are many works on multi-task text classification, and they are easy to find on Google or GitHub. You can refer to them to see how a multi-task GNN is built.

@aozorahime (Author)

Yes, thank you for your suggestion.

@Baijiong-Lin (Collaborator)

Closing this as there are no further updates.

@davodogster

@aozorahime Hi, I have been using PyTorch Geometric and added my own code to make it work for multi-task learning. Let me know if you're interested.

@Baijiong-Lin Cool repo! I am interested in using loss weighting such as RLW for my loss.
Do you know how I can put the two together? It is important because my parameters are not on the same scale, so I want to weight the losses accordingly. Thanks! Sam

Will something like this work for my data?

[screenshot: a loss-weighting snippet]

Here is my current code for how I obtain the losses:

[screenshot: the current loss computation]
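In case it helps, a minimal sketch of the RLW idea (random loss weighting, one of the weighting strategies in LibMTL): draw a fresh random weight vector every iteration and take the weighted sum of the per-task losses. The variable names here are hypothetical:

```python
import torch
import torch.nn.functional as F

task_num = 3  # e.g. the three regression tasks discussed below

def rlw_combine(losses):
    """RLW: weight per-task losses by a softmax over fresh N(0, 1) draws."""
    weights = F.softmax(torch.randn(len(losses)), dim=-1)
    return sum(w * l for w, l in zip(weights, losses))

# usage inside the training loop:
# losses = [F.mse_loss(pred[:, i], target[:, i]) for i in range(task_num)]
# loss = rlw_combine(losses)
# loss.backward()
```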

@Baijiong-Lin (Collaborator)

@davodogster Maybe you can try it; I am not sure whether it will work in such a GNN case.

@aozorahime (Author)

@davodogster Hi, cool that you made it! Have you published it in your repo?

@davodogster

@aozorahime No, sorry, I haven't published anything yet. How urgent is it?
Hopefully this can help for now:

[screenshot: model definition whose final layer has 3 output units; the "3" represents 3 different regression tasks]
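In code, the gist is presumably something like this (a hypothetical sketch, not the exact screenshot): the model's 3-unit output layer gives one MSE loss per column:

```python
import torch.nn.functional as F

# `model` and `data` come from the training loop:
# out:    [batch_size, 3] predictions from the 3-unit output layer
# data.y: [batch_size, 3] regression targets, one column per task
out = model(data.x, data.edge_index, data.batch)
losses = [F.mse_loss(out[:, i], data.y[:, i]) for i in range(3)]
loss = sum(losses)  # or combine with RLW / learned weights instead of a plain sum
loss.backward()
```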


PyG Dataset example:

[screenshot: constructing graph samples for a custom PyG Dataset]
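The gist of such an example, as a hypothetical sketch (toy numbers; each graph carries a 3-value regression target in `y`):

```python
import torch
from torch_geometric.data import Data

# Toy graph: 4 nodes with 5 features each, a small undirected edge list,
# and a single 3-value regression target for the whole graph.
x = torch.randn(4, 5)
edge_index = torch.tensor([[0, 1, 2, 3],
                           [1, 0, 3, 2]])  # shape [2, num_edges]
y = torch.tensor([[1.2, 0.7, 3.4]])       # shape [1, 3]: one row, 3 targets
graph = Data(x=x, edge_index=edge_index, y=y)
```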

This gets passed to the PyG Dataset class:

[screenshot: the custom Dataset wrapper]

The dataset then gets passed to the dataloader:

```python
test_loader = DataLoader(DBH_DS_test, batch_size=TRAIN_BS, shuffle=False, num_workers=25)
```

@davodogster

> @davodogster Maybe you can try it; I am not sure whether it will work in such a GNN case.

@Baijiong-Lin Do you know what variable "task_num" is, and what shape it has, in the example above? How do I modify it for three-task learning?

I think it might be able to work for my use case but I will need some tips :)

@aozorahime (Author)

@davodogster Cool! So the point is that you use three different losses, one per task, right? I am currently working on MTL for my thesis. Can we discuss this further later, or is it okay to discuss it here? :D

@davodogster

@aozorahime Yup, you use three losses. They can be a mix of classification and regression losses depending on the tasks/parameters. Then you either sum the losses or use a technique like loss weighting / GradNorm, etc. Alternatively, if the tasks are regression, you can scale all the targets to the same scale, e.g. mean 0, SD 1, or min-max to 0-100; then I think you don't need to weight the losses.
I'm using this one:

https://github.com/ywatanabe1989/custom_losses_pytorch/blob/master/multi_task_loss.py
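As far as I can tell, that script follows the homoscedastic-uncertainty weighting of Kendall et al. (2018); here is a minimal sketch of the same idea (hypothetical names, not the script's exact API):

```python
import torch
import torch.nn as nn

class UncertaintyWeighting(nn.Module):
    """One learnable log-variance per task: noisier tasks get down-weighted,
    and the additive log term keeps the weights from collapsing to zero."""
    def __init__(self, task_num):
        super().__init__()
        self.log_vars = nn.Parameter(torch.zeros(task_num))

    def forward(self, losses):
        total = 0.0
        for i, loss in enumerate(losses):
            precision = torch.exp(-self.log_vars[i])
            total = total + precision * loss + self.log_vars[i]
        return total
```

If you use something like this, remember to pass its parameters to the optimizer together with the model's, e.g. `torch.optim.Adam(list(model.parameters()) + list(weighter.parameters()))`.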

Yes, we can discuss it more sometime if needed. March is a really busy month for me this year, but I can try to fit it in sometime! :)
