Docker image for a remote isanlp processor.
- Download the weights into `data/models_saved/model.pth` and `data/models_saved/xl_model.pth`. Link 1 (Baidu), link 2 (Dropbox).
- Configure the environment: `source setup_environment.sh`. If necessary, install the environment as a jupyter kernel: `pip install ipykernel && python -m ipykernel install --name "rst_parser_zhang"`
- Use in the code:

```python
from isanlp import PipelineCommon
from isanlp.processor_udpipe import ProcessorUDPipe

from isanlp_processor import ProcessorRST

ppl = PipelineCommon([
    # Alternative tokenizer/tagger:
    # (ProcessorSpaCy(model_name='en_core_web_sm', morphology=True, parser=False, ner=False, delay_init=False),
    (ProcessorUDPipe(model_path='data/english-ewt-ud-2.5-191206.udpipe', parser=False),
     ['text'],
     {'tokens': 'tokens',
      'sentences': 'sentences'}),
    (ProcessorRST(),
     ['text', 'tokens', 'sentences'],
     {0: 'rst'})
])

some_text = "Brown fox jumped over the tree because it had to."
result = ppl(some_text)
```
```
>>> print(result['rst'][0])
id: 1
text: Brown fox jumped over the tree because it had to.
proba: 1.0
relation: Explanation
nuclearity: NS
left: Brown fox jumped over the tree
right: because it had to.
start: 0
end: 49
```
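`result['rst']` holds a list of discourse trees whose nodes carry the fields printed above (`relation`, `nuclearity`, `text`, `left`, `right`, `start`, `end`). A short sketch of walking such a tree, assuming `left` and `right` are child nodes of the same type and `None` for elementary discourse units (this is an illustration, not part of the shipped API):

```python
def print_rst_tree(node, depth=0):
    """Recursively print the relation structure of a parsed discourse tree."""
    if node is None:  # elementary discourse units are assumed to have no children
        return
    print('  ' * depth + f'{node.relation} ({node.nuclearity}): {node.text}')
    print_rst_tree(getattr(node, 'left', None), depth + 1)
    print_rst_tree(getattr(node, 'right', None), depth + 1)

print_rst_tree(result['rst'][0])
```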
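Since the image is meant to serve as a remote isanlp processor, the RST step can also be delegated to a running container instead of the in-process `ProcessorRST`. A minimal sketch, assuming the container exposes the standard isanlp remote interface; the host, port, pipeline key, and output key below are placeholders, not values taken from this image:

```python
from isanlp import PipelineCommon
from isanlp.processor_remote import ProcessorRemote
from isanlp.processor_udpipe import ProcessorUDPipe

# Tokenization and sentence splitting stay local; only RST parsing goes to the container.
ppl_remote = PipelineCommon([
    (ProcessorUDPipe(model_path='data/english-ewt-ud-2.5-191206.udpipe', parser=False),
     ['text'],
     {'tokens': 'tokens',
      'sentences': 'sentences'}),
    (ProcessorRemote('localhost', 3333, 'default'),  # placeholder host, port, and pipeline key
     ['text', 'tokens', 'sentences'],
     {'rst': 'rst'})  # the remote pipeline's output key is assumed to be 'rst'
])

result = ppl_remote("Brown fox jumped over the tree because it had to.")
```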