
Control traceback length via assignment config #536

Draft · wants to merge 2 commits into base: master
Conversation

@joelostblom (Contributor) commented Oct 3, 2022

Resolves #419

@chrispyles This sits on top of #533, so the diff looks bigger than it is. I can't quite figure out how to pass options from the assignment config to the doctest runner. What I have currently only works for the default value; it does not change when the key is included in the first cell of the course notebook, so I would appreciate any pointers you might have on how to achieve this. I'm also curious to hear your thoughts on the approach I have taken here in general.

These are the results of the three different options:

assertion_msg: This is helpful because the assertion message can change dynamically (such as in pd.assert_frame_equal) and is therefore more informative than a static, manually written message:

[screenshot: assertion_msg output]

none: This is the least verbose output and gives away the fewest possible cues:

[screenshot: none output]

full: This is the current default; in my opinion it is the hardest for students to read, and it gives away the specific test that is being used:

[screenshot: full output]

@joelostblom (Contributor, Author) commented

Note to self: this should be changed to allow any type of error to be propagated, not just assertion errors, to account for situations where the failure message is not helpful for understanding why the test actually fails, e.g. an undefined variable name:

[screenshot: failure output for an undefined variable]

@chrispyles (Member) left a comment


I think some thought should go into how this gets specified; specifically, the granularity of this configuration. Is it something that is set in the assignment config and applies to every test in the assignment, or can it be specified on a question-by-question or test-case-by-test-case basis? If the latter, then it belongs in the question or test case config, which changes how you access this information when generating each test case.

Regardless, generally the way to pass configurations to the TestFiles is to make it part of a key in the OK test format or a keyword argument in otter.test_files.test_case (for exception-based tests). Take a look at OKTestFile.from_spec to see how the OK format is parsed. You would then pass the value from the Assignment instance into one of these in this function (or, more likely, in one of the functions/templates it uses).
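To make the suggested data flow concrete, here is a rough sketch. Apart from the OK-format concept and OKTestFile.from_spec named above, every identifier here is hypothetical rather than Otter's actual API:

```python
# Hypothetical OK-format test spec with the new key injected by Otter Assign.
spec = {
    "name": "q1",
    "points": 1,
    "traceback_length": "assertion_msg",  # hypothetical placement of the new key
    "suites": [{"cases": [{"code": ">>> assert x == 1"}]}],
}

# Inside something like OKTestFile.from_spec, the key would be read off the
# spec once and stored on the test file, so the runner can consult it when
# formatting failure output (again, names here are illustrative):
traceback_length = spec.get("traceback_length", "full")
```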

```diff
@@ -16,6 +16,13 @@ class Assignment(fica.Config, Loggable):
     Configurations for the assignment.
     """

+    traceback_length: Optional[str] = fica.Key(
```
@chrispyles (Member)

nit: please add new configurations at the bottom of the Keys (i.e. after runs_on)

Comment on lines +54 to +64:

```python
from ..assign.assignment import Assignment
if Assignment().traceback_length == 'full':
    return False, runresults.getvalue()
elif Assignment().traceback_length == 'assertion_msg':
    err_msg = runresults.getvalue()
    if 'AssertionError: ' in err_msg:
        return False, err_msg[err_msg.index('AssertionError: '):]
    else:
        return False, ''
elif Assignment().traceback_length == 'none':
    return False, ''
```
@chrispyles (Member)

Calling Assignment() just creates an Assignment instance with the default values, so traceback_length will always be the default.
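A minimal illustration of that point, assuming the "full" default from the option descriptions above (the import path follows the draft diff):

```python
# Constructing a fresh Assignment ignores whatever the user's config cell set;
# fica.Key defaults are used, so this check can never observe the user's value.
from otter.assign.assignment import Assignment  # import path per the PR diff

assignment = Assignment()
print(assignment.traceback_length)  # always the default ("full"), never the
                                    # value parsed from the notebook's config
```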

@joelostblom (Contributor, Author) commented Oct 8, 2022

> I think some thought should go into how this gets specified; specifically, the granularity of this configuration. Is it something that is set in the assignment config and applies to every test in the assignment, or can it be specified on a question-by-question or test-case-by-test-case basis? If the latter, then it belongs in the question or test case config, which changes how you access this information when generating each test case.

My vote would be to make it an assignment-wide option first, since I imagine the most common use case would be to change how all tests are displayed while keeping uniformity between tests within the same assignment. It would always be possible to add an option later to control this on a test-by-test level if there is interest in that.

And thanks for the pointers on how to pass the options around, I will have a look!

@joelostblom (Contributor, Author) commented Oct 8, 2022

One distinction I could see as useful: show the full traceback (or at least the line that makes the assertion) on Gradescope when show_hidden=true, so that students can see exactly what was tested once their submission is graded.

@chrispyles (Member)

> My vote would be to make it an assignment-wide option first, since I imagine the most common use case would be to change how all tests are displayed while keeping uniformity between tests within the same assignment. It would always be possible to add an option later to control this on a test-by-test level if there is interest in that.

SGTM.

Also, for adding this to the Assignment class, can you add it under the tests key (see the nested TestsValue class) instead of as a top-level configuration?
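A rough sketch of what that nesting might look like: TestsValue and the tests key come from the comment above, while the fica.Key arguments and the nested-config wiring are assumptions, since the PR diff truncates the actual fica.Key call:

```python
import fica
from typing import Optional


class TestsValue(fica.Config):
    """Nested container for test-related assignment options."""

    # Assumed fica.Key arguments; the actual call in the PR diff is truncated.
    traceback_length: Optional[str] = fica.Key(
        description="one of 'full', 'assertion_msg', or 'none'",
        default="full",
    )


class Assignment(fica.Config):
    """Configurations for the assignment (abridged)."""

    # Assumed wiring for a nested config; the exact fica syntax may differ.
    tests: TestsValue = fica.Key(subkey_container=TestsValue)
```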

Successfully merging this pull request may close these issues:

- Hide traceback and verbose otter output to only show assertion message instead (#419)