Chinese entity relation extraction, PyTorch, BiLSTM + attention
Updated Nov 13, 2021 · Python
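The BiLSTM + attention pattern named by this topic can be sketched as follows — a minimal, hypothetical PyTorch encoder (hyperparameters and names are illustrative, not taken from any listed repo): embed the tokens, run a bidirectional LSTM, score each time step with a learned attention weight, and classify the attention-weighted sum.

```python
import torch
import torch.nn as nn

class BiLSTMAttention(nn.Module):
    """Minimal BiLSTM + attention encoder for sentence-level
    classification (e.g. relation extraction). Illustrative sketch only."""
    def __init__(self, vocab_size, embed_dim=100, hidden_dim=128, num_classes=10):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                            bidirectional=True)
        # one attention score per BiLSTM output position
        self.att = nn.Linear(2 * hidden_dim, 1, bias=False)
        self.fc = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, x):                      # x: (batch, seq_len) token ids
        h, _ = self.lstm(self.embed(x))        # (batch, seq_len, 2*hidden)
        w = torch.softmax(self.att(h), dim=1)  # normalize over time steps
        ctx = (w * h).sum(dim=1)               # attention-weighted sum
        return self.fc(ctx)                    # (batch, num_classes)

model = BiLSTMAttention(vocab_size=5000)
logits = model(torch.randint(0, 5000, (2, 12)))
print(logits.shape)  # torch.Size([2, 10])
```

The softmax over the sequence dimension lets the classifier focus on the tokens most relevant to the relation, instead of relying only on the final LSTM state.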
Implementations of common NLP tasks, including new-word discovery, plus PyTorch-based word embeddings, Chinese text classification, entity recognition, abstractive summarization, sentence-similarity scoring, triple extraction, pretrained models, and more.
1. Use BERT, ALBERT, and GPT-2 as TensorFlow 2.0 layers. 2. Implement GCN, GAN, GIN, and GraphSAGE based on message passing.
PyTorch implementation of some text classification models (HAN, fastText, BiLSTM-Attention, TextCNN, Transformer) | 文本分类
Implementation of papers for text classification task on SST-1/SST-2
Joint text classification on multiple levels with multiple labels, using a multi-head attention mechanism to wire two prediction tasks together.
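One way such multi-level joint classification can be wired is sketched below — a hypothetical PyTorch model (all names and sizes are assumptions, not from the repo above) in which a shared multi-head self-attention layer feeds two label heads, e.g. a coarse and a fine label set:

```python
import torch
import torch.nn as nn

class JointClassifier(nn.Module):
    """Hypothetical sketch: shared multi-head self-attention feeding
    two prediction heads (coarse and fine label sets)."""
    def __init__(self, embed_dim=64, num_heads=4, n_coarse=3, n_fine=12):
        super().__init__()
        self.attn = nn.MultiheadAttention(embed_dim, num_heads,
                                          batch_first=True)
        self.coarse_head = nn.Linear(embed_dim, n_coarse)
        self.fine_head = nn.Linear(embed_dim, n_fine)

    def forward(self, x):                 # x: (batch, seq_len, embed_dim)
        h, _ = self.attn(x, x, x)         # self-attention over tokens
        pooled = h.mean(dim=1)            # mean-pool over the sequence
        return self.coarse_head(pooled), self.fine_head(pooled)

m = JointClassifier()
coarse, fine = m(torch.randn(2, 8, 64))
print(coarse.shape, fine.shape)  # torch.Size([2, 3]) torch.Size([2, 12])
```

Sharing the attention layer lets gradients from both label levels shape one representation, which is the usual motivation for joint training.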
Chinese sentiment classification | three-class text sentiment analysis
Explainable Sentence-Level Sentiment Analysis – Final project for "Deep Natural Language Processing" course @ PoliTO
The post-modifier generation task is to automatically produce a post-modifier phrase describing a target entity (an entity is essentially a noun, but here restricted to people) so that it contextually fits the input sentence.