Merge pull request #15 from christopher-delphai/master
Fixed urls in readme
ines committed Aug 19, 2020 (2 parents: 63057df + ec6cb89; commit 0037b32)
Showing 1 changed file with 2 additions and 2 deletions.
README.md (2 additions, 2 deletions)
@@ -57,8 +57,8 @@ You can edit the code in the recipe script to customize how Prodigy behaves.
  | [`ner.teach`](ner/ner_teach.py) | Collect the best possible training data for a named entity recognition model with the model in the loop. Based on your annotations, Prodigy will decide which questions to ask next. |
  | [`ner.match`](ner/ner_match.py) | Suggest phrases that match a given patterns file, and mark whether they are examples of the entity you're interested in. The patterns file can include exact strings or token patterns for use with spaCy's `Matcher` (see the sketch after this table). |
  | [`ner.manual`](ner/ner_manual.py) | Mark spans manually by token. Requires only a tokenizer and no entity recognizer, and doesn't do any active learning. |
- | [`ner.manual.bert`](other/transformers_tokenizerspy) | Use BERT word piece tokenizer for efficient manual NER annotation for transformer models. |
- | [`ner.make-gold`](ner/ner_make-gold.py) | Create gold-standard data by correcting a model's predictions manually. |
+ | [`ner.manual.bert`](other/transformers_tokenizers.py) | Use BERT word piece tokenizer for efficient manual NER annotation for transformer models. |
+ | [`ner.make-gold`](ner/ner_make_gold.py) | Create gold-standard data by correcting a model's predictions manually. |
  | [`ner.silver-to-gold`](ner/ner_silver_to_gold.py) | Take an existing "silver" dataset with binary accept/reject annotations, merge the annotations to find the best possible analysis given the constraints defined in the annotations, and manually edit it to create a perfect and complete "gold" dataset. |
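
For reference, the `ner.match` patterns file is plain JSONL: each line pairs a `label` with a `pattern` that is either an exact string or a list of token-attribute dicts. Below is a minimal sketch of one such pattern and the equivalent `Matcher` call, assuming spaCy v3's `Matcher.add` signature; the `FASHION_BRAND` label and the example text are hypothetical, not from this repo.

```python
# Sketch: a token pattern as it would appear in a ner.match patterns file,
# replayed through spaCy's Matcher directly. Hypothetical label and text.
import spacy
from spacy.matcher import Matcher

nlp = spacy.blank("en")
matcher = Matcher(nlp.vocab)

# As one JSONL line in the patterns file, this would be:
# {"label": "FASHION_BRAND", "pattern": [{"lower": "ann"}, {"lower": "taylor"}]}
matcher.add("FASHION_BRAND", [[{"LOWER": "ann"}, {"LOWER": "taylor"}]])

doc = nlp("I bought a dress from Ann Taylor last week.")
for match_id, start, end in matcher(doc):
    # Prints the label and the matched span, e.g. "FASHION_BRAND Ann Taylor"
    print(nlp.vocab.strings[match_id], doc[start:end].text)
```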

### Text Classification
