🔎 Prodigy-evaluate

This repository contains a Prodigy plugin with recipes for evaluating spaCy pipelines. It features multiple recipes:

  1. evaluate.evaluate: Evaluate a spaCy pipeline on one or more datasets for different components. Passing flags like --label-stats or --confusion-matrix will compute a variety of evaluation metrics, including precision, recall, F1, accuracy, and more. See the example invocation after this list.

  2. evaluate.evaluate-example: Evaluate a spaCy pipeline on one or more datasets for different components on a per-example basis. This is helpful for debugging and for understanding the hardest examples for your model.

  3. evaluate.nervaluate: Evaluate a spaCy NER component on one or more datasets. This recipe uses the nervaluate library to calculate various metrics for NER. You can learn more about the metrics in the nervaluate documentation. This is helpful because the approach takes partial matches into account, which may be more relevant for your NER use case.

You can install this plugin via pip:

pip install "prodigy-evaluate @ git+https://github.com/explosion/prodigy-evaluate"

To learn more about this plugin and additional functionality, you can check the Prodigy docs.

Issues?

Are you having trouble with this plugin? Let us know on our support forum and we'll get back to you!