
Tests should not load outside of the LTI parent #26

Closed

chrismgraham opened this issue Feb 22, 2018 · 5 comments


chrismgraham commented Feb 22, 2018

It’s possible to open the Numbas runtime frame in a new window, e.g. via “open frame in a new tab” or similar, which opens a stand-alone test. This is a problem because it gives a student in the know an opportunity to see solutions (albeit with a new randomisation), but it can also result in a student mistakenly completing the test in the second window, where no data is sent back to the server and no warning is given on completion.

One fix could be to offer a checkbox in the Editor exam settings, “Require Numbas LTI Tool?” or similar, which toggles a function in the runtime that checks for a parent frame (or the specific parent) and stops the page from loading if it is absent.
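
A minimal sketch of what that runtime check might look like, assuming a hypothetical REQUIRE_LTI_PARENT flag baked in by the proposed editor checkbox (none of these names are part of the actual Numbas runtime):

```javascript
// Hypothetical sketch: REQUIRE_LTI_PARENT stands in for the proposed
// editor checkbox; it is not a real Numbas setting.
(function requireParentFrame() {
    var REQUIRE_LTI_PARENT = true;

    // window.self === window.top means this page is the top-level window,
    // i.e. the exam has been opened outside any parent frame such as the
    // LTI tool's iframe.
    if (REQUIRE_LTI_PARENT && window.self === window.top) {
        document.body.innerHTML =
            '<p>This exam must be launched through the Numbas LTI tool.</p>';
        throw new Error('Refusing to load outside the LTI parent frame');
    }
})();
```

Checking for one specific parent, rather than any parent, would need something like a postMessage handshake, since a cross-origin parent can't be inspected directly.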

@christianp (Member)

I don't think an editor setting is the right choice, from a UX perspective - there's no sign that you've forgotten to set it.
I think this is a SCORM problem, not an LTI-specific one.
Maybe the SCORM package version of an exam should refuse to run if there's no SCORM API present. The downside of this is that teachers have to make sure to get a standalone version of an exam if they just want to put it on the web.

I think this is in the same category as cheating with developer tools.

We could do something to tell students their results won't be saved anywhere - maybe the intro screen could have a big red "Your progress on this exam is not being saved" message above the Start button. That would give a warning to students who are expecting their results to be saved, and is also of some use to students taking a test on the web.
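
A rough sketch of both ideas in this comment, using a simplified version of the standard SCORM API discovery walk (looking for API_1484_11 or API in ancestor windows); the start-exam element id and the warning markup are assumptions for illustration:

```javascript
// Simplified sketch of the standard SCORM API discovery algorithm.
function findSCORMAPI(win) {
    var tries = 0;
    while (win && tries < 10) {
        try {
            if (win.API_1484_11) { return win.API_1484_11; } // SCORM 2004
            if (win.API)         { return win.API; }         // SCORM 1.2
        } catch (e) {
            // A cross-origin frame can't be inspected; treat it as no API.
        }
        // Move up the frame hierarchy, then across to the opener.
        win = (win.parent && win.parent !== win) ? win.parent : win.opener;
        tries += 1;
    }
    return null;
}

if (!findSCORMAPI(window)) {
    // No LMS connection: show the warning above the Start button.
    var warning = document.createElement('div');
    warning.style.color = 'red';
    warning.textContent = 'Your progress on this exam is not being saved.';
    var startButton = document.getElementById('start-exam'); // hypothetical id
    if (startButton) {
        startButton.parentNode.insertBefore(warning, startButton);
    }
}
```

The same findSCORMAPI check could instead abort loading entirely, which is the "refuse to run" option above.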

@chrismgraham (Author)

Off topic, but the same could be said for other editor settings, such as "Allow user to regenerate questions?". Here at Newcastle we would simply add this to our practice/exam toggle bookmarklet, and perhaps there's an argument that this could be formalised into (customisable) presets for different modes of exam.

Anyhow... I appreciate your point about this being at the SCORM level, and I agree with showing a message rather than refusing to run.

christianp (Member) commented Mar 1, 2018

This is a lot more subtle than that - you'd have to run your SCORM package outside the VLE to see any difference in behaviour. You can at least spot that you've left "try another question" on by running the exam after you've uploaded it.

What could the LTI provider do to act as a sanity-check? I'm reluctant to add anything to the Numbas runtime that's specific to the LTI tool - it would make normal SCORM even worse, and drive us down the path towards requiring the LTI tool.

The LTI provider could give a quick rundown of options after you upload an exam package, like this:

  • ✓ Students can regenerate questions
  • ✓ Students get immediate feedback on their answers
  • ✗ The front page is not shown

(maybe with open/closed eye icons instead of ticks and crosses - the idea is to quickly show which things give the students more info)
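
Purely as an illustration of the rundown idea (the setting names and flags below are assumptions, not the LTI provider's actual data model):

```javascript
// Illustrative only: these settings and flags are assumptions, not the
// Numbas LTI provider's real data model.
var feedbackSettings = [
    { label: 'Students can regenerate questions',                enabled: true  },
    { label: 'Students get immediate feedback on their answers', enabled: true  },
    { label: 'The front page is not shown',                      enabled: false }
];

// One line per setting; a tick or cross (or an open/closed eye icon)
// quickly shows which things give the students more information.
function renderRundown(settings) {
    return settings.map(function (s) {
        return (s.enabled ? '\u2713 ' : '\u2717 ') + s.label;
    }).join('\n');
}

console.log(renderRundown(feedbackSettings));
```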

@chrismgraham (Author)

Yes, very true that it can't be eye-balled in the test run. Anyway, your suggestion in numbas/Numbas#529 should at least help to protect against the cases which end up with lost marks.

A rundown of the options, after upload to the LTI provider, would be a nice addition. At present, after upload, the user is presented with the blank dashboard, so a success notification (which might include the test options) would be good.

@christianp (Member)

I've opened #31 for the summary of feedback settings idea.
