query node #11
Excuse me, @imelnyk, sorry to disturb you. I am also interested in the code for the query node section. Could you please add it to the repository or send it to me? I really wonder how it is implemented. Thank you a lot.
Hi, here is a rough sketch on how to implement this:
Note: optionally, you can also implement a Matcher class which estimates a permutation matrix to match the generated nodes to the target nodes.
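The Matcher idea above could be sketched roughly like this (an assumption on my part, in the spirit of DETR-style set matching, not the authors' code; a real implementation would use `scipy.optimize.linear_sum_assignment` instead of brute force):

```python
# Toy Matcher sketch: find the assignment of predicted query nodes to target
# nodes that minimizes total negative log-likelihood. Brute force is fine
# only for a small number of nodes; all names/shapes here are assumptions.
import itertools
import torch

def match_nodes(node_logits, target_ids):
    """node_logits: (num_nodes, seq_len, vocab) per-node token logits.
    target_ids: (num_targets, seq_len) target node token ids.
    Returns the tuple p where p[j] is the predicted node matched to target j."""
    num_nodes, num_targets = node_logits.shape[0], target_ids.shape[0]
    log_probs = node_logits.log_softmax(-1)          # (N, L, V)
    # cost[i, j] = NLL of target j's tokens under predicted node i
    cost = torch.zeros(num_nodes, num_targets)
    for i in range(num_nodes):
        for j in range(num_targets):
            cost[i, j] = -log_probs[i].gather(1, target_ids[j].unsqueeze(1)).sum()
    return min(itertools.permutations(range(num_nodes), num_targets),
               key=lambda p: sum(cost[p[j], j] for j in range(num_targets)))
```

The resulting permutation can then be used to reorder the targets before computing the node loss, so the loss is invariant to node ordering.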
Thank you very much, I will give it a try.
@imelnyk I am so sorry to disturb you. I added query_embed in LitGrapher:

```python
self.query_embed = nn.Embedding(self.max_nodes, self.model.hidden_dim)  # query_node
```

and then passed the parameters to Grapher:

```python
logits_nodes, logits_edges = self.model(text_input_ids,
                                        text_input_attn_mask,
                                        target_nodes,
                                        target_nodes_mask,
                                        self.query_embed.weight,
                                        target_edges)
```

but inference seems to fail after training (e.g. failed_edge --> failed_node), because the dimension of query_node is …
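For reference, the dimension issue above usually comes down to the fact that `query_embed.weight` is `(max_nodes, hidden_dim)` while the decoder expects a batch dimension; a minimal sketch with assumed sizes:

```python
# Minimal shape sketch (sizes are assumptions): tile the learned query-node
# embeddings across the batch before feeding them to the decoder.
import torch
import torch.nn as nn

max_nodes, hidden_dim, batch_size = 8, 16, 4
query_embed = nn.Embedding(max_nodes, hidden_dim)   # one learned vector per query node

# query_embed.weight is (max_nodes, hidden_dim); the decoder expects
# (batch, max_nodes, hidden_dim), so expand it across the batch:
queries = query_embed.weight.unsqueeze(0).expand(batch_size, -1, -1)
print(queries.shape)   # torch.Size([4, 8, 16])
```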
Hi,
@imelnyk I am sorry to bother you again.

```python
joint_features = output.decoder_hidden_states[-1]
```

Second, are the node features that are decoded into node logits the same as the node features used to generate the edges? Third, should the sample function in grapher.py (`def sample(self, text, text_mask):`) be modified after query_embed is added?
@imelnyk

```python
joint_features = transformer(input_ids=text, attention_mask=text_mask, decoder_inputs_embeds=query_embed).last_hidden_state
```
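To see the shapes in a call like the one above, here is a self-contained analogue using plain `torch.nn.Transformer` standing in for the pretrained encoder-decoder (all sizes and names are assumptions, not the repository's code): the learned queries are fed as the decoder inputs, so the decoder returns one feature vector per query node.

```python
# Self-contained analogue: feed query embeddings as decoder inputs and get
# one joint feature vector per query node. Sizes below are arbitrary.
import torch
import torch.nn as nn

d_model, max_nodes, batch, src_len = 32, 6, 2, 10
model = nn.Transformer(d_model=d_model, nhead=4, batch_first=True)

text_embeds = torch.randn(batch, src_len, d_model)   # stands in for token embeddings
queries = torch.randn(max_nodes, d_model).unsqueeze(0).expand(batch, -1, -1)

joint_features = model(src=text_embeds, tgt=queries)  # (batch, max_nodes, d_model)
print(joint_features.shape)   # torch.Size([2, 6, 32])
```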
@imelnyk

```python
def sample(self, text, text_mask):
    output = self.transformer.generate(input_ids=text,
                                       max_length=150,
                                       attention_mask=text_mask,
                                       output_hidden_states=True,
                                       output_scores=True,
                                       return_dict_in_generate=True)
```

If yes, which parameter should the query_embed be passed to? decoder_inputs_embeds? But the generate function does not seem to have a decoder_inputs_embeds parameter.
Yes, this part is a bit tricky. |
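One possible reading of the tricky part (my assumption, not necessarily the authors' approach): since the query nodes form a fixed-length decoder input, node-feature extraction does not need autoregressive `generate()` at all — a single decoder forward pass over the queries suffices. A sketch with plain torch modules standing in for the pretrained model:

```python
# Sketch of a non-autoregressive sample(): encode the text once, then run
# the decoder a single time over the learned query embeddings. All module
# choices and sizes here are assumptions for illustration.
import torch
import torch.nn as nn

d_model, max_nodes, batch, src_len = 32, 6, 2, 10
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True), num_layers=2)
decoder = nn.TransformerDecoder(
    nn.TransformerDecoderLayer(d_model, nhead=4, batch_first=True), num_layers=2)
query_embed = nn.Embedding(max_nodes, d_model)

def sample(text_embeds):
    memory = encoder(text_embeds)                      # encode the text once
    queries = query_embed.weight.unsqueeze(0).expand(text_embeds.size(0), -1, -1)
    node_features = decoder(tgt=queries, memory=memory)  # one vector per query node
    return node_features

out = sample(torch.randn(batch, src_len, d_model))
print(out.shape)   # torch.Size([2, 6, 32])
```

The per-node features can then be decoded into node strings by a separate head, which sidesteps passing decoder_inputs_embeds to generate().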
OK. Thank you very much, I will give it a try. |
Hi @imelnyk, I apologize for reaching out again. In section 2.2 "Node Generation: Query Nodes" of the paper, it is mentioned that the node features are encoded as Fn ∈ R^{d×N}. May I kindly ask how to pass this to the GRUDecoder to generate the node logits (seq_len × vocab_size × num_nodes) in detail? I have attempted numerous ways to modify it, but the issue still persists. If you still have the code archive, I would be grateful if you could share it with me. I am very interested in your implementation of this part. Thank you for tirelessly teaching me how to make modifications.
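For reference, one possible wiring of Fn into a GRU decoder (a guess at the shapes, not the authors' code): use each node feature as the GRU's initial hidden state and fold the node axis into the batch, which yields logits equivalent to the seq_len × vocab_size × num_nodes layout up to a transpose.

```python
# Sketch: decode each node independently by making its feature vector the
# GRU's initial hidden state. All sizes are arbitrary assumptions.
import torch
import torch.nn as nn

d, num_nodes, seq_len, vocab_size, batch = 16, 4, 5, 11, 2

gru = nn.GRU(input_size=d, hidden_size=d, batch_first=True)
embed = nn.Embedding(vocab_size, d)
to_logits = nn.Linear(d, vocab_size)

node_features = torch.randn(batch, num_nodes, d)  # Fn: one feature per node
# e.g. teacher-forced input tokens, one sequence per (batch, node) pair:
tokens = torch.zeros(batch * num_nodes, seq_len, dtype=torch.long)

# Fold nodes into the batch so every node is decoded independently:
h0 = node_features.reshape(1, batch * num_nodes, d)   # (layers, B*N, d)
out, _ = gru(embed(tokens), h0)                       # (B*N, seq_len, d)
logits = to_logits(out).view(batch, num_nodes, seq_len, vocab_size)
print(logits.shape)   # torch.Size([2, 4, 5, 11])
```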
@imelnyk I am sorry to bother you again. One thing I want to figure out:
… but the accuracy is very low.
Yes, the query node training is not easy; you have to train longer and play with learning rates, gradient clipping, etc. For us the performance was not great, but it was still able to generate legible nodes and edges. In your case it might be a training problem or even an issue with the implementation.
Okay. Thank you very much. |
Hi, @imelnyk I am sorry to bother you again.
Yes, as the model trains, it runs evaluation and saves the results. You can see it here: Line 179 in 97b1f7f
Okay. Thank you very much. |
Hello, @imelnyk, I am sorry to bother you again. So far we are still a little puzzled about the function of the query nodes. Could you please explain it?
Hello author, do you still have an archive of the query node code? I am quite interested in this part and would like to study it.