
Generate warnings in report #725

Merged · 35 commits merged into master from report_warnings on Jan 28, 2020

Conversation

@adelavega (Collaborator) commented Jan 16, 2020

Closes #441 #704

Adds a warnings field to the Report model. This field is updated if any warnings are generated, even if the Report ends up crashing.
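For context, a minimal sketch of what such a field could look like, assuming a SQLAlchemy model as used elsewhere in the Neuroscout backend; the column name, JSONB type, and surrounding fields here are illustrative, not the actual diff:

```python
from sqlalchemy import Column, Integer, Text
from sqlalchemy.dialects.postgresql import JSONB
from sqlalchemy.orm import declarative_base

Base = declarative_base()


class Report(Base):
    __tablename__ = 'report'

    id = Column(Integer, primary_key=True)
    status = Column(Text, default='PENDING')
    traceback = Column(Text)
    # Human-readable warning strings collected while generating the report;
    # written back even if the report task later fails.
    warnings = Column(JSONB, default=list)
```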

We should display these above the traceback using Ant Design's alerts, such as this one:
[Screenshot from 2020-01-16 17-02-31]

If each warning is pretty short, we should be able to stack them above the traceback (if any).

Sometimes warnings will be emitted even when there are no errors and the design matrix compiles successfully.

Maybe they can go after the Correlation Matrix? These warnings are not subject-specific, so they should not be included within a Run tab.

I will next add a warning for #704 to test this out.

@codecov-io commented Jan 16, 2020

Codecov Report

❗ No coverage uploaded for pull request base (master@64d8c8c).
The diff coverage is 60%.

@@            Coverage Diff            @@
##             master     #725   +/-   ##
=========================================
  Coverage          ?   85.87%           
=========================================
  Files             ?       67           
  Lines             ?     3022           
  Branches          ?        0           
=========================================
  Hits              ?     2595           
  Misses            ?      427           
  Partials          ?        0
| Impacted File | Coverage | Δ |
| --- | --- | --- |
| neuroscout/tasks/upload.py | 54.92% <ø> | (ø) |
| neuroscout/tasks/utils/io.py | 96.22% <100%> | (ø) |
| neuroscout/tasks/report.py | 75.67% <50%> | (ø) |
| neuroscout/tasks/utils/warnings.py | 88.46% <66.66%> | (ø) |

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data. Last update 64d8c8c...b5b07bf.

@adelavega (Collaborator, Author):

Cool, it's working on the backend.

@adelavega (Collaborator, Author):
Alright, everything is looking good. @rwblair please take a look next week, test it, and see if there's a cleaner way to do things.

On the backend, I still need to catch errors in the warnings module. Uncaught errors in Reports will result in the Report hanging indefinitely.
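As a sketch of that error handling (function and variable names here are illustrative, not the actual Neuroscout code), each check can be wrapped so that a bug in the warnings module degrades to a generic warning instead of an unhandled exception:

```python
import logging

logger = logging.getLogger(__name__)


def collect_warnings_safely(checks, *args, **kwargs):
    """Run each warning check; never let a broken check crash the report task."""
    warnings = []
    for check in checks:
        try:
            warnings.extend(check(*args, **kwargs) or [])
        except Exception:
            # Record the failure as a warning and log it, so the Report
            # still completes rather than hanging on an uncaught error.
            logger.exception("Warning check %s failed", check.__name__)
            warnings.append(
                f"Internal error while running check '{check.__name__}'.")
    return warnings
```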

Otherwise, we just need to add more warnings. For now, the focus is on checking whether n/a values are handled correctly.

@adelavega (Collaborator, Author) commented Jan 27, 2020

  • Try to generalize as many warnings as possible into a Validator class in fitlins that runs all of its internal check methods whose names start with _check (see the sketch after this list).

  • Add a wrapper that points to Neurostars / GitHub issues. Possibly hide the traceback by default.

  • Could also catch specific pybids exceptions by their message text and replace them with friendlier text when possible.
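A rough illustration of the first and third points above; the class, method names, and message table are hypothetical, not fitlins' or pybids' actual API:

```python
class Validator:
    """Runs every method whose name starts with `_check` and collects
    any warning strings they return."""

    def __init__(self, design_matrix):
        # Assumed to be a pandas DataFrame in this sketch.
        self.design_matrix = design_matrix

    def run(self):
        warnings = []
        for name in dir(self):
            if name.startswith('_check'):
                result = getattr(self, name)()
                if result:
                    warnings.append(result)
        return warnings

    def _check_na_values(self):
        # Hypothetical check: flag n/a values remaining in the design matrix.
        if self.design_matrix.isna().any().any():
            return ("Design matrix contains n/a values; check your "
                    "transformations and variable coverage.")


# Hypothetical lookup for rewriting known pybids error text into
# friendlier messages that point users at Neurostars / GitHub issues.
FRIENDLY_MESSAGES = {
    'No variables found': (
        "No variables matched your selection. If this is unexpected, "
        "please ask on https://neurostars.org or open a GitHub issue."),
}


def friendly_error(exc):
    text = str(exc)
    for fragment, message in FRIENDLY_MESSAGES.items():
        if fragment in text:
            return message
    return text
```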

@rwblair mentioned this pull request Jan 27, 2020
@adelavega merged commit f062f0c into master Jan 28, 2020
@adelavega deleted the report_warnings branch January 28, 2020 22:15
Merging this pull request may close: Warning messages in reports