Print assessments for offline work and grading #2104

Open
trombonekenny opened this issue Feb 29, 2020 · 6 comments
Labels
discussion (This issue is for general discussion on a topic or tracking a large project), enhancement (A desired new feature or change, not a bug)

Comments

@trombonekenny
Contributor

While PrairieLearn is an online question platform, there have been recurring requests to print questions or assessments for offline work. Consider this a placeholder issue for discussing and tracking that feature.

Some thoughts:

  • Would the printed assessment belong to the instructor, or be assigned to a particular student?
  • Would an answer key be printed too? Or would there be some way to easily load the digital version of that exam to do data entry?
  • What needs to be changed to restyle the webpages so they print correctly?
  • Should the output be a PDF?
@trombonekenny added the enhancement and discussion labels Feb 29, 2020
@tdy
Contributor

tdy commented Apr 18, 2020

Apart from the integration and styling issues, can we just run the rendered HTML files through pandoc to get PDFs as a crude starting point?
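
A minimal sketch of that idea, assuming the rendered question HTML has already been saved to disk and that pandoc (plus a PDF engine such as a LaTeX install) is available on the server; the file paths and function name here are placeholders, not anything that exists in PrairieLearn today:

```typescript
// Sketch only: shell out to pandoc to turn saved question HTML into a PDF.
// Assumes pandoc and a PDF engine are on the PATH; the paths are hypothetical.
import { execFile } from 'node:child_process';
import { promisify } from 'node:util';

const execFileAsync = promisify(execFile);

async function htmlToPdf(htmlPath: string, pdfPath: string): Promise<void> {
  // Equivalent to running: pandoc question.html -o question.pdf
  await execFileAsync('pandoc', [htmlPath, '-o', pdfPath]);
}

htmlToPdf('question.html', 'question.pdf').catch((err) => {
  console.error('pandoc conversion failed:', err);
});
```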

@andrewstec
Contributor

@trombonekenny, just roughly responding to your points:

  • I second PDF output; UBC staff have requested this on two separate occasions to support a flow of exporting work for manual grading and uploading it to GS for marking.
  • Printing questions that are interactive (e.g., content that does not display until an option is selected from a drop-down) might be a limitation or edge case that needs to be considered.
  • Just one idea of many: we may be able to use a headless browser driven by a Node library (e.g. https://github.com/puppeteer/puppeteer) to load and print the rendered webpage; see the sketch after this list. Downside: it may be slow and use a lot of memory. Advantage: the printout should look exactly as rendered. We could do this for each question, submission, and answer panel to get a separate PDF for each.
  • I think we would want the answer and submission panels printed as well, since that lets faculty download the panels separately and upload them to GS, or a different platform, for manual grading. That would provide a manual grading option in PL without needing to support manual grading logic itself.
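
A rough sketch of the headless-browser idea, assuming Puppeteer is installed and an already-rendered question page is reachable at some URL; the URL, output path, and function name are placeholders, and the authenticated session a real question page would require is omitted:

```typescript
// Sketch only: render a question page in headless Chromium and save it as a PDF.
// The URL and output path are hypothetical; a real PrairieLearn page would also
// require an authenticated session, which is not handled here.
import puppeteer from 'puppeteer';

async function printPageToPdf(url: string, pdfPath: string): Promise<void> {
  const browser = await puppeteer.launch();
  try {
    const page = await browser.newPage();
    // Wait until network activity settles so MathJax, images, etc. have loaded.
    await page.goto(url, { waitUntil: 'networkidle0' });
    await page.pdf({ path: pdfPath, format: 'A4', printBackground: true });
  } finally {
    await browser.close();
  }
}

printPageToPdf('https://example.com/rendered-question', 'question.pdf').catch((err) => {
  console.error('PDF generation failed:', err);
});
```

The same function could be called once per question, submission, or answer panel to produce a separate PDF for each, as suggested above.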

@kristenvaccaro

kristenvaccaro commented Dec 12, 2023

I would also find this useful. We're using PL for a new class and writing a lot of new questions, and being able to proofread exams on hard copy (rather than on a screen) helps catch more typos/issues!

I'll also add that our OSD runs testing facilities that only handle paper exams, so it would be nice to print one or several variants to use in that context as well.

@alonor

alonor commented May 3, 2024

Thanks for the #9814 cross-reference.

In addition to PDF, it would also be helpful to see all the questions together in HTML format. This overview gives a sense of all the questions, how they relate to one another, etc.

@ffund
Contributor

ffund commented May 30, 2024

+1 to the suggestion of HTML view. It would be very useful to be able to click on a link in an assessment log and get a one-page view of the entire assessment, including submissions.

Example use case: you have an assessment where there is a relationship between questions. For example,

  • Q1: do X in workspace
  • Q2: do Y in workspace
  • Q3: (manually graded) discuss your work on X and Y in the context of Z.

Currently, to grade Q3, I open Q3 in the manual grading interface, click on the assessment log link, then click on Q1 and Q2 to see their submissions. Now I have three tabs open. I check whether the answer to Q3 makes sense given the submissions to Q1 and Q2, then assign a grade to Q3 and close the other tabs.

@ffund
Contributor

ffund commented Aug 19, 2024

This was raised on the SIGCSE mailing list as a way to potentially deter students from using AI to cheat on exams (for those of us without a CBTF):

Distribute the questions on paper, but administer the exams online. Then if I give a program snippet and ask students how to finish it, they can't just paste it into an LLM; they have to type it in, and they wouldn't have enough time to do that (assuming that enough questions are of that form). While this allows us to continue administering the exam online, it gives up the authentic environment and raises the risk that students will share answers (we wouldn't have question pools to inhibit this anymore).

I would be very interested in doing this, personally. I would want to:

  • have a panel that shows in the "print" view but not in other views, similar to the panel that only shows in the manual grading view; I would put the main contents of the question in this panel.
  • print an exam per student (each student still gets their own randomized version).
  • have students take the exam on computers, while the main contents of each question are on paper, not on the computer.
