Using NL for multiclass classification task #27
Replies: 1 comment 24 replies
-
Hi, I have recently written a short tutorial on rules mapping that might help you understand the NL syntax better. One layer of an RGCN could be rewritten as follows (I added comments on top of the rules to make the mapping clearer; also, the multiplication order is flipped in the comments - it is actually `W_r * h_j0`):

```python
t = Template()

metadata = [Aggregation.SUM, Activation.IDENTITY]

for relation in all_relations:
    # h_i1 = h_j0 * W_r + h_i0 * W_0, ensuring j \in N^r_i
    t += (R.h_1(V.I) <= (R.h_0(V.J)[1,], R.h_0(V.I)["W_0": 1,], R.hidden.edge(V.I, relation, V.J))) | metadata

t += R.h_1 / 1 | [Activation.SIGMOID]
```

For additional layers, you would change only the indices (e.g. `"W_1"` in the 2nd layer plays the role of `"W_0"` in the 1st layer), that is:
```python
for relation in all_relations:
    # h_i2 = h_j1 * W_r + h_i1 * W_1, ensuring j \in N^r_i
    t += (R.h_2(V.I) <= (R.h_1(V.J)[1,], R.h_1(V.I)["W_1": 1,], R.hidden.edge(V.I, relation, V.J))) | metadata

t += R.h_2 / 1 | [Activation.SIGMOID]
```

You will have one rule for each relation. The outer sum over relations is done in the background as well - rules with the same (grounded) head are summed when queried.

The only part I omitted is the normalization constant - it's a bit tricky, and I'm not sure how you want to define it. You could, for example, add it into the dataset examples as:

```python
R.c("entity_1", "relation_1")[0.5]  # Some value c_{i,r}
```

And then change the template a little bit (the new, first rule computes `C_i,r * W_r * h_j0`):

```python
for relation in all_relations:
    # C_i,r * W_r * h_j0
    # "product-" enforces multiplying the body relations instead of summing them -
    # that's why it is (C_i,r * W_r * h_j0) and not (C_i,r + W_r * h_j0);
    # the 'r' is equal to 'relation'
    t += (R.c_0(V.I, relation, V.J) <= (R.c(V.I, relation), R.h_0(V.J)[1,])) | Metadata(activation="product-identity")

    # h_i1 = (C_i,r * W_r * h_j0) + h_i0 * W_0, ensuring j \in N^r_i
    t += (R.h_1(V.I) <= (R.c_0(V.I, relation, V.J), R.h_0(V.I)["W_0": 1,], R.hidden.edge(V.I, relation, V.J))) | metadata

t += R.c_0 / 3 | [Activation.IDENTITY]
t += R.h_1 / 1 | [Activation.SIGMOID]
```

I haven't really tested it, but it should correspond to your RGCN definition. When it comes to the multiclass classification, you should either go with a one-hot encoded vector or make a query for each possible combination of entities and relations (where you will learn the query for the correct class to be 1 and the others to be 0).

// EDIT: I tested the template, visualized it, and it seems to generate the correct computation graph.
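For a quick sanity check of what the final template above computes, here is a framework-free plain-Python sketch of one normalized layer. All values here (the toy graph, the scalar weights) are made up for illustration - in NL, `W_r` and `W_0` are learnable matrices - and `c_{i,r} = 1 / |N_i^r|` is just one common choice for the normalization constant:

```python
from collections import defaultdict
from math import exp

def sigmoid(x):
    return 1.0 / (1.0 + exp(-x))

# Toy data in the spirit of the dataset facts above: (i, relation, j) edges.
# Scalar features and weights for readability; in NL these are matrices.
edges = [
    ("e1", "rel_a", "e2"),
    ("e1", "rel_a", "e3"),
    ("e1", "rel_b", "e2"),
    ("e2", "rel_a", "e3"),
]
h_0 = {"e1": 1.0, "e2": 2.0, "e3": -1.0}
W = {"rel_a": 0.5, "rel_b": -0.3}  # one W_r per relation
W_0 = 0.1                          # self-connection weight

# c_{i,r} = 1 / |N_i^r| - these are the values you would store
# as dataset facts, e.g. R.c("e1", "rel_a")[0.5]
counts = defaultdict(int)
for (i, r, _) in edges:
    counts[(i, r)] += 1
c = {key: 1.0 / n for key, n in counts.items()}

# One layer: h_i1 = sigmoid( sum_r sum_{j in N^r_i} c_{i,r} * W_r * h_j0 + W_0 * h_i0 )
h_1 = {}
for i in h_0:
    total = W_0 * h_0[i]                         # h_i0 * W_0
    for (src, r, j) in edges:                    # ensures j \in N^r_i
        if src == i:
            total += c[(i, r)] * W[r] * h_0[j]   # C_i,r * W_r * h_j0
    h_1[i] = sigmoid(total)                      # Activation.SIGMOID on the head
```

The two summed terms per node mirror the rules above: every grounded rule with the same head `R.h_1(i)` contributes one addend, so the sum over relations and neighbors happens implicitly.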
-
Hello,
Over the past few days I have been trying to figure out how to use the NeuraLogic framework to imitate the computation performed by Relational Graph Convolutional Networks (RGCN), introduced here. Unfortunately, I wasn't able to come up with a working solution - I just can't seem to understand how to translate such computations into the logic-based language.
I have my examples and queries set up in a way that is very similar to the one already discussed here. I would like to perform a multiclass classification task (with 22 classes) using the following propagation rule:

h_i^(l+1) = σ( Σ_{r ∈ R} Σ_{j ∈ N_i^r} (1 / c_{i,r}) · W_r^(l) · h_j^(l) + W_0^(l) · h_i^(l) )
In RGCNs, the transformation applied to the neighboring nodes depends on the particular type of the relation (category) between the two nodes, and I can't figure out how to encode this using the NL syntax. I managed to come up with an initialization of the embeddings:
where I initialize a 3x3 weight matrix for each of the relations, but I don't know whether this is the correct line of thinking. Alternatively, what would be the simplest way to set up a template for multiclass classification on this dataset - one that I could use as a starting point?
Any help would be much appreciated! Thank you.