
Add manually graded questions within the same Gradescope assignment #800

Open
nadia-eecs opened this issue May 1, 2024 · 0 comments

nadia-eecs commented May 1, 2024

Description

Add manually graded questions within the same Gradescope assignment. For example, when configuring Gradescope programming assignments (without otter-grader), rich formatting can be added to the autograder output, which has an "html" option. This means that manual test cases can be treated as "autograder" tests (in the otter-grader sense) to generate the associated .py test files. Those files can then be run to plot student functions and write the plots to HTML via a temporary file or BytesIO object. By creating a function that runs the student plots, the images can be rendered in the output of the same assignment. Below is an example of what the feature could look like (the source does not use otter-grader or notebook homeworks, only standalone functions).

(Screenshot from 2024-04-30: example of a plot rendered as HTML in a Gradescope autograder output)
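
As a rough sketch of the plotting-to-HTML idea above (the helper name `plot_to_html` is illustrative and not part of otter-grader), a student plotting function could be rendered to an inline `<img>` tag via a `BytesIO` buffer:

```python
# Minimal sketch, not otter-grader API: render a student's plot to inline HTML
# so it can appear in a Gradescope autograder output entry with output_format "html".
import base64
import io

import matplotlib
matplotlib.use("Agg")  # render off-screen; no display is available in the autograder container
import matplotlib.pyplot as plt


def plot_to_html(plot_fn, *args, **kwargs):
    """Call a student plotting function and return an <img> tag with the figure embedded."""
    fig = plt.figure()
    plot_fn(*args, **kwargs)  # the student function draws onto the current figure
    buf = io.BytesIO()
    fig.savefig(buf, format="png", bbox_inches="tight")
    plt.close(fig)
    encoded = base64.b64encode(buf.getvalue()).decode("ascii")
    return f'<img src="data:image/png;base64,{encoded}" alt="student plot"/>'
```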

Implementation Strategies

Strategy 1:

Create a custom plugin that configures the Gradescope autograder with the HTML output format.
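
For context, Gradescope's results.json accepts an `output_format` field (including `"html"`) on individual test entries, so a plugin could attach the rendered plot to a manually graded test case. A rough illustration, reusing the hypothetical `plot_to_html` helper from above (the question name, scores, and student function are placeholders):

```python
# Illustrative only: one entry in the "tests" list of Gradescope's results.json.
def student_plot_function(xs):          # stand-in for a function defined in the student's notebook
    import matplotlib.pyplot as plt
    plt.plot(xs, [x ** 2 for x in xs])

manual_plot_entry = {
    "name": "q3 (manually graded plot)",   # placeholder question name
    "score": 0,                            # left at 0 for a human grader to adjust
    "max_score": 2,
    "output": plot_to_html(student_plot_function, range(10)),  # HTML with the embedded image
    "output_format": "html",               # tells Gradescope to render this output as HTML
}
```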

Strategy 2 (compliments of @chrispyles):

  1. add a new key to the AutograderConfig to enable this behavior (something like export_manual_responses?).
  2. update otter assign to somehow link manually graded portions of notebooks to their question names.
  3. add a method to the GradingResults object that accepts a notebook, extracts all of the manual response cells, and stores them keyed by question name (roughly sketched after this list).
  4. update GradingResults.to_gradescope_dict to convert the manual cells to Gradescope’s format and include them in the results dictionary it generates.
  5. update both the Python and R runners' run methods to call the new GradingResults method with the submission notebook (note that it’s possible to submit non-notebook files, which are also run by these runners, so this step should only be performed if the submission file’s extension is .ipynb).
  6. add unit tests for all new behavior and update the changelog.
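
A very rough sketch of steps 3 and 5, assuming a hypothetical `otter:manual` cell-metadata tag written by otter assign in step 2 (neither the tag nor the helper below exists in otter-grader today):

```python
# Illustrative sketch only; the "otter:manual" metadata key, this helper, and
# set_manual_responses are hypothetical. GradingResults and to_gradescope_dict
# are the real objects the steps above refer to.
import nbformat


def extract_manual_responses(nb_path):
    """Collect manually graded response cells from a notebook, keyed by question name."""
    nb = nbformat.read(nb_path, as_version=4)
    responses = {}
    for cell in nb.cells:
        question = cell.get("metadata", {}).get("otter:manual")  # hypothetical link added by otter assign
        if question:
            responses.setdefault(question, []).append(cell.source)
    return responses


# Step 5: the runners would only attempt this for notebook submissions.
# if submission_path.endswith(".ipynb"):
#     results.set_manual_responses(extract_manual_responses(submission_path))  # hypothetical method
```

to_gradescope_dict could then turn each stored response into a results entry with `output_format: "html"`, along the lines of the Strategy 1 sketch above.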

Feature advantages

The feature would enable students to submit a single notebook file containing all test cases, with both manual and automatic outputs written directly to the Gradescope autograder, streamlining the grading process for larger undergraduate courses that rely on Python notebook submissions.

nadia-eecs added the enhancement (New feature or request) label on May 1, 2024