Update README.md
ramon-astudillo committed May 9, 2023
1 parent 3ca8d19 commit 849b1ee
Showing 1 changed file with 6 additions and 8 deletions.
README.md: 14 changes (6 additions & 8 deletions)
@@ -4,7 +4,7 @@ Transition-based Neural Parser
State-of-the-Art Abstract Meaning Representation (AMR) parsing, see [papers
with code](https://paperswithcode.com/task/amr-parsing). Models both
distribution over graphs and alignments with a transition-based approach. Parser
-supports any other graph formalism as long as it is expressed in [Penman
+supports generic text-to-graph as long as it is expressed in [Penman
notation](https://penman.readthedocs.io/en/latest/notation.html).
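
For example, the standard AMR for "The boy wants to go" is written in Penman notation as:

```
(w / want-01
   :ARG0 (b / boy)
   :ARG1 (g / go-02
            :ARG0 b))
```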

Some of the main features
@@ -29,7 +29,7 @@ all scripts source a `set_environment.sh` script that you can use to activate
your virtual environment as above and set environment variables. If not used,
just create an empty version

-```
+```bash
# or e.g. put inside conda activate ./cenv_x86
touch set_environment.sh
```
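
A minimal `set_environment.sh` just activates your environment, e.g. (a sketch, assuming the conda environment named in the comment above):

```bash
# example contents of set_environment.sh
conda activate ./cenv_x86
```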
@@ -44,7 +44,7 @@ installation instructions.

(Please install the CPU version of torch-scatter; note that model training is not fully supported here.)

-```
+```bash
pip install transition-neural-parser
# for linux users
pip install torch-scatter -f https://data.pyg.org/whl/torch-1.13.1+cu117.html
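# optional sanity check (a sketch, assuming the package's `amr-parse` entry
# point and a downloaded checkpoint; the path below is a placeholder)
amr-parse -c <path/to/checkpoint_best.pt> -i sentences.txt -o sentences.amr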
@@ -54,7 +54,7 @@ pip install torch-scatter -f https://data.pyg.org/whl/torch-1.13.1+cu117.html

If you plan to edit the code, clone and install instead

-```
+```bash
# clone this repo (see link above), then
cd transition-neural-parser
pip install --editable .
@@ -63,7 +63,7 @@ pip install torch-scatter -f https://data.pyg.org/whl/torch-1.13.1+cu117.html

If you want to train a document-level AMR parser you will also need

-```
+```bash
git clone https://github.com/IBM/docAMR.git
cd docAMR
pip install .
@@ -185,8 +185,7 @@ This table shows you available pretrained model names to download;

<sup>2 Smatch on AMR3.0 Multi-Sentence dataset </sup>

-we also provide the trained `ibm-neural-aligner` under names
-`AMR2.0_ibm_neural_aligner.zip` and `AMR3.0_ibm_neural_aligner.zip`. For the
+contact authors to obtain the trained `ibm-neural-aligner`. For the
ensemble we provide the three seeds. Following fairseq conventions, to run the
ensemble just give the three checkpoint paths joined by `:` to the normal
checkpoint argument `-c`. Note that the checkpoints were trained with the
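
For instance, from the standalone command line this looks like the following (a sketch; the `amr-parse` command and the seed checkpoint paths are assumptions):

```bash
# ensemble decoding: the three seed checkpoints joined by ':' as one -c argument
amr-parse \
    -c seed1/checkpoint_best.pt:seed2/checkpoint_best.pt:seed3/checkpoint_best.pt \
    -i sentences.txt -o sentences.amr
```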
@@ -198,7 +197,6 @@ individual models. A fast way to test models standalone is

bash tests/standalone.sh configs/<config>.sh

-
## Training a model

You first need to pre-process and align the data. For AMR2.0 do