Automated performance testing framework #39

Closed
benknoll-umn opened this issue May 6, 2019 · 4 comments · Fixed by #71
Labels: enhancement (New feature or request)

@benknoll-umn (Member)

Add functionality for comparing a gold standard of labels to the outputs of an EventProcessor.
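
For concreteness, a comparison like this could boil down to span-level precision, recall, and F1. A minimal sketch, assuming each label reduces to a (start, end) character span; `span_metrics` is a hypothetical helper, not an existing API in this project:

```python
def span_metrics(gold_spans, predicted_spans):
    """Exact-match precision, recall, and F1 over (start, end) label spans."""
    gold = set(gold_spans)
    predicted = set(predicted_spans)
    true_positives = len(gold & predicted)
    precision = true_positives / len(predicted) if predicted else 0.0
    recall = true_positives / len(gold) if gold else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1


# Example: two of three predicted spans exactly match the gold standard.
gold = [(0, 45), (46, 90), (91, 130)]
predicted = [(0, 45), (46, 88), (91, 130)]
print(span_metrics(gold, predicted))  # (0.666..., 0.666..., 0.666...)
```

Exact span matching is the strictest variant; a real framework might also want partial-overlap or boundary-tolerance matching.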

@GregSilverman (Member) commented May 29, 2019

@benknoll-umn, do you have a specific annotation task in mind to do this with? Should I start with sentence detection?

@benknoll-umn (Member, Author)

Specifically, I want to do sentence detection. We have a gold standard of BRAT annotations and want to validate against it.
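
A minimal sketch of pulling sentence spans out of a BRAT standoff (.ann) file, assuming the gold standard marks sentences as text-bound annotations with a type like `Sentence` (the type name is an assumption; adjust to the actual annotation schema):

```python
def read_brat_sentence_spans(ann_path, label_type='Sentence'):
    """Return (start, end) character offsets for matching text-bound annotations."""
    spans = []
    with open(ann_path, encoding='utf-8') as f:
        for line in f:
            fields = line.rstrip('\n').split('\t')
            # Text-bound annotations have IDs like T1, T2, ...
            if len(fields) < 2 or not fields[0].startswith('T'):
                continue
            ann_type, offsets_str = fields[1].split(' ', 1)
            if ann_type != label_type:
                continue
            # Discontinuous spans use ';' separators; take the outer bounds.
            offsets = offsets_str.replace(';', ' ').split()
            spans.append((int(offsets[0]), int(offsets[-1])))
    return spans
```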

@GregSilverman (Member) commented May 30, 2019

@benknoll-umn Let me know where the source, machine-annotated, and reference files are, and I'll start digging into modifying the evaluation code I have for this task.

I am rewriting my co-occurrence code, so this would be a good test for it. I can treat the label just like another UIMA annotation type.
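
Tying the two sketches above together, a per-document evaluation could look roughly like this (the file paths and directory layout are hypothetical placeholders):

```python
# Compare one machine-annotated document against its BRAT reference.
gold = read_brat_sentence_spans('reference/doc1.ann')
predicted = read_brat_sentence_spans('machine/doc1.ann')
precision, recall, f1 = span_metrics(gold, predicted)
print(f'doc1: precision={precision:.2f} recall={recall:.2f} f1={f1:.2f}')
```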

@GregSilverman (Member) commented Jun 7, 2019

Hey @benknoll-umn, what's up? I keep seeing a flurry of activity on this repo, but I haven't had time to sift through the details to see if anything deals with this particular issue.
