[FEATURE] Make acceptance test(s) failures easier to read #46

Open
neomatrix369 opened this issue Oct 18, 2020 · 0 comments
Labels
2. medium-priority (good if it can be attended to soon, but not urgent) · enhancement (new feature or request) · tests

Comments

@neomatrix369
Owner

Missing functionality
At the moment, when acceptance tests fail it is hard to tell which columns caused the failures; see the error messages in past failures.

Proposed feature
For each column with a mismatch, report the column name and the degree of inaccuracy between the expected and actual values, and display all failing columns. This makes the output easier to read and makes it clearer what led to the failure (see the sketch below).
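
A minimal sketch of how such a report could be produced, assuming the acceptance tests compare an expected and an actual pandas DataFrame of profiled results (the helper name and the exact "degree of inaccuracy" metric are hypothetical, not part of the current test suite):

```python
import pandas as pd


def report_column_mismatches(expected: pd.DataFrame, actual: pd.DataFrame) -> str:
    """Return one line per failing column with its degree of inaccuracy.

    Hypothetical helper: compares the two frames column by column so that
    a single assertion failure lists every mismatching column at once,
    rather than stopping at the first difference.
    """
    failures = []
    for column in expected.columns:
        if column not in actual.columns:
            failures.append(f"{column}: missing from actual results")
            continue
        # Boolean mask of rows where the expected and actual values differ
        mismatched = expected[column] != actual[column]
        if mismatched.any():
            inaccuracy = mismatched.mean() * 100  # percentage of rows that differ
            failures.append(
                f"{column}: {int(mismatched.sum())} of {len(expected)} rows differ "
                f"({inaccuracy:.1f}% inaccuracy)"
            )
    return "\n".join(failures)
```

In an acceptance test this could be attached to the assertion message, e.g. `assert expected.equals(actual), "Profiled results do not match:\n" + report_column_mismatches(expected, actual)`, so a single failure shows all offending columns and how far off each one is.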

@neomatrix369 added the enhancement, 2. medium-priority, and tests labels on Oct 18, 2020
@neomatrix369 added this to To do in NLP Profiler via automation on Oct 18, 2020