24 changes: 16 additions & 8 deletions otter/gradescope-debug.md
@@ -13,11 +13,11 @@ It is possible that the grading container may have a different environment than …

It is also possible that a student's submission has changed since the public tests were run on datahub. Data 100 has seen a number of cases where an indent was accidentally added to a code block, preventing a cell with a student's answer from executing in the grading container. Consequently, those students failed tests in the grading container that they previously passed.

## Open JupyterLab in the grading container

1. On Gradescope, select a specific submission to debug. Once you can view the submission, on the bottom toolbar click "Debug via SSH." It may take a few minutes for the debugging session to initialize.

2. SSH into the grading container using the provided command from the blue box at the top of the submission page. The command will look something like `ssh username@remote-host -p port`. What is printed to your terminal after SSH-ing into the grading container is useful information. Read it!

3. Activate the virtual environment from `requirements.txt` in the grading container using `conda activate otter-env`.

@@ -29,10 +29,13 @@

7. Access JupyterLab locally in your browser via the given link from the output from Step 4. Choose the link beginning with `http://localhost:8888/lab?token=...`
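
(Steps 4–6 are collapsed in this diff and are not reproduced here.) As general background, reaching JupyterLab inside a remote container typically pairs starting the server there with a local SSH port forward; in the sketch below, every host, port, and flag is a placeholder rather than the Gradescope-specific command:

```
# In the grading container: start JupyterLab without opening a browser
jupyter lab --no-browser --port=8888

# On your local machine, in a second terminal: forward local port 8888
ssh -N -L 8888:localhost:8888 username@remote-host -p <ssh-port>
```

With a forward like this in place, the `http://localhost:8888/lab?token=...` link opens in your local browser.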

Note: If you open a terminal from JupyterLab and see `(base)` on the left-hand side of the prompt, rerun `conda activate otter-env`.

## Run the autograder on the student's submission

1. Run otter directly with `python3 source/run_otter.py` to obtain `results/results.pkl`.

2. Convert the `.pkl` file into a notebook by running the following [Python snippet](https://otter-grader.readthedocs.io/en/latest/debugging.html#viewing-the-executed-notebook) in the `results` directory:

```python
import dill
import nbformat

with open("results.pkl", "rb") as f:
    res = dill.load(f)

nbformat.write(res.notebook, "executed.ipynb")
```

3. You can now inspect the output of running the autograder directly on the student's submission by viewing `executed.ipynb`.
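
If you would rather spot the failing cell from the terminal than open the notebook, you can scan `executed.ipynb` for error outputs with `nbformat` (a sketch, not an otter command; the path is an assumption matching the snippet above):

```python
import nbformat

def failed_cells(path="executed.ipynb"):
    """Return (cell index, error name, error message) for each code cell
    in the executed notebook that raised an error."""
    nb = nbformat.read(path, as_version=4)
    failures = []
    for i, cell in enumerate(nb.cells):
        if cell.cell_type != "code":
            continue
        for out in cell.get("outputs", []):
            if out.get("output_type") == "error":
                failures.append((i, out["ename"], out["evalue"]))
    return failures
```

Run it from the `results` directory; an empty list means every cell executed without raising.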

## Interact with the student's submission

1. Move `executed.ipynb` from the `results` directory into the `submission` directory. This ensures you have access to any associated files and allows otter commands to run without issue.

2. You may need to adjust the kernel to use the virtual environment `otter-env`.

38 changes: 38 additions & 0 deletions otter/releasing-assignment.md
@@ -0,0 +1,38 @@
---
layout: page
title: Releasing an Assignment
parent: Otter Grader
nav_order: 1
---

<p class="warning">
This page is a work in progress. This is only the first draft.
</p>

# Releasing an Assignment using Otter Grader

This is one of many ways to release an assignment that students will complete on [DataHub](https://datahub.berkeley.edu/) and be graded with [Otter Grader](https://otter-grader.readthedocs.io/en/latest/index.html#). Data Science Undergraduate Education recommends this method.

Note: If you are working on an assignment that will be released to students on a course-specific hub, substitute that hub wherever this guide references "DataHub" or "the hub." For example, Data 100 uses [data100.datahub](https://data100.datahub.berkeley.edu/) instead of [DataHub](https://datahub.berkeley.edu/).

## Before the start of the semester

1. Review what Python packages your class is using.
* If you are not developing or altering assignments, you can use files like `requirements.txt` and look at your `import` statements.
<!--
* TODO: For a systematic way to gather this information, place this notebook at the root of your directory and run through it.
-->

2. Check that all Python packages are available on the DataHub offering that your course is using.

3. If you need to request a package for your hub, fill out a new [GitHub Issue](https://github.com/berkeley-dsep-infra/datahub/issues). Choose the "Package Addition/Change Request" Template.
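
Step 1's package review can also be scripted for plain `.py` files by walking the AST for import statements; a minimal sketch (notebooks would first need converting to scripts, e.g. with `jupyter nbconvert --to script`):

```python
import ast
from pathlib import Path

def imported_packages(root="."):
    """Collect top-level package names imported by .py files under root."""
    pkgs = set()
    for path in Path(root).rglob("*.py"):
        try:
            tree = ast.parse(path.read_text(encoding="utf-8"))
        except SyntaxError:
            continue  # skip files that don't parse (e.g. templates)
        for node in ast.walk(tree):
            if isinstance(node, ast.Import):
                pkgs.update(alias.name.split(".")[0] for alias in node.names)
            elif isinstance(node, ast.ImportFrom) and node.module and node.level == 0:
                pkgs.add(node.module.split(".")[0])
    return sorted(pkgs)
```

Compare the resulting list against `requirements.txt` and the hub's installed packages.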

## Before an assignment is released

We recommend running `otter assign` on DataHub. We do NOT recommend running this command locally. You can clone the development repo in DataHub via the command line/terminal. It is safe to do so; you do not need to worry about students accidentally getting access to the solutions.
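
In a DataHub terminal, that workflow looks roughly like this (the repo URL, notebook name, and output directory are placeholders, not this guide's actual repo):

```
# Clone the development repo into your home directory on the hub
git clone https://github.com/<org>/<dev-repo>.git
cd <dev-repo>

# Generate the student-facing version, solution notebook, and autograder config
otter assign <master-notebook>.ipynb dist/
```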

Ensure that the solution runs on DataHub without issue; i.e., open the solution notebook generated by `otter assign` and "Run All Cells."

Only push to the student-facing repository once all changes have been made; you don't want to accidentally break an nbgitpuller link.

Create your nbgitpuller link using [this plug-in](https://chromewebstore.google.com/detail/datahub-link-generator/ijbgangngghdanhcnaliiobbiffocahf). Ensure that the JupyterHub URL is for the intended DataHub.
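
For reference, links generated this way follow the standard nbgitpuller shape (placeholders shown, not a real assignment):

```
https://<your-hub>/hub/user-redirect/git-pull?repo=<STUDENT_REPO_URL>&branch=<BRANCH>&urlpath=tree%2F<REPO_NAME>%2F<PATH_TO_NOTEBOOK>
```

Spot-check that `repo` points at the student-facing repository and that the hub domain matches your course's hub before sharing the link.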