Hello, I'm really interested in the pipelined approach in this paper. I noticed that you mention a second pre-trained encoder in the relation model, and I'm a little confused about how you obtain it. Is its pre-training procedure completely identical to BERT's, or is there anything special about how the encoder is produced?
Thanks,
Hi! Thanks for your interest! By a "second pre-trained encoder" we mean that we use a separate encoder for the relation model, instead of sharing the encoder with the entity model. Both the relation encoder and the entity encoder are initialized from the same pre-trained BERT; there is no extra pre-training step.
Sorry about the confusion! Let me know if you have any further questions!
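To make the "separate vs. shared" distinction concrete, here is a minimal, dependency-free sketch (the `Encoder` class and weight dictionary are hypothetical stand-ins, not the paper's actual code). The point is only that the two models hold independent copies of the same pre-trained weights, so fine-tuning one never updates the other:

```python
import copy

class Encoder:
    """Toy stand-in for a pre-trained transformer encoder (e.g. BERT)."""
    def __init__(self, weights):
        # In practice this would be the full set of transformer parameters.
        self.weights = weights

# One set of pre-trained weights, loaded once from a checkpoint.
pretrained_weights = {"layer0": [0.1, 0.2, 0.3]}

# Separate encoders: each model fine-tunes its own copy of the weights.
entity_encoder = Encoder(copy.deepcopy(pretrained_weights))
relation_encoder = Encoder(copy.deepcopy(pretrained_weights))

# Simulate a fine-tuning update on the relation encoder only.
relation_encoder.weights["layer0"][0] = 9.9

# The entity encoder's parameters are untouched, because nothing is shared.
```

With a real implementation (e.g. HuggingFace `transformers`), this would correspond to calling `AutoModel.from_pretrained(...)` twice, once per model, rather than passing one module to both.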
Hello! Thanks for your help! But I still have a question. As you mentioned, the inserted markers play an important role in obtaining the relation representation. Here is the thing: since BERT hasn't seen these markers before, I find it hard to understand how BERT can produce reasonable representations for them. I would guess there is some pre-training task, like MLM, that lets BERT see these special markers first? Please correct me if I'm missing the point.
You are right that BERT hasn't seen these markers during pre-training. We don't have any special pre-training task for the marker tokens. Their embeddings are randomly initialized when the markers are added to the vocabulary, and their representations are learned solely during fine-tuning on the downstream task (relation classification).
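A minimal sketch of what "adding markers before fine-tuning" amounts to, assuming a toy vocabulary and embedding table (the marker names `<SUBJ>`/`<OBJ>` and the dimension are illustrative, not the paper's exact typed markers): new vocabulary entries are appended and their embedding rows start random, to be trained only by the relation-classification objective.

```python
import random

random.seed(0)
DIM = 4  # toy embedding dimension

# Pre-trained vocabulary and embedding table (no marker tokens yet).
vocab = {"[CLS]": 0, "[SEP]": 1, "the": 2, "founded": 3}
embeddings = [[random.gauss(0.0, 0.02) for _ in range(DIM)] for _ in vocab]

# Add entity-marker tokens before fine-tuning; their embedding rows are
# randomly initialized and receive gradients only during fine-tuning.
markers = ["<SUBJ>", "</SUBJ>", "<OBJ>", "</OBJ>"]
for tok in markers:
    vocab[tok] = len(vocab)
    embeddings.append([random.gauss(0.0, 0.02) for _ in range(DIM)])
```

With HuggingFace `transformers`, the same step is `tokenizer.add_special_tokens({"additional_special_tokens": markers})` followed by `model.resize_token_embeddings(len(tokenizer))`, which appends freshly initialized rows to the embedding matrix.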