Tests should not load outside of the LTI parent #26
I don't think an editor setting is the right choice, from a UX perspective - there's no sign that you've forgotten to set it. I think this is in the same category as cheating with developer tools. We could do something to tell students their results won't be saved anywhere - maybe the intro screen could have a big red "Your progress on this exam is not being saved" message above the Start button. That would give a warning to students who are expecting their results to be saved, and is also of some use to students taking a test on the web.
Off topic, but the same could be said for other editor settings, such as "Allow user to regenerate questions?". Here at Newcastle we would simply add this to our practice/exam toggle bookmarklet, and perhaps there's an argument that this could be formalised into (customisable) presets for different modes of exam. Anyhow... I appreciate your point about this being at the SCORM level, and I agree with a message, rather than refusing to run.
This is a lot more subtle than that - you'd have to run your SCORM package outside the VLE to see any difference in behaviour. You can at least spot that you've left "try another question" on by running the exam after you've uploaded it. What could the LTI provider do to act as a sanity-check? I'm reluctant to add anything to the Numbas runtime that's specific to the LTI tool - it would make normal SCORM even worse, and drive us down the path towards requiring the LTI tool. The LTI provider could give a quick rundown of options after you upload an exam package, like this:
(maybe with open/closed eye icons instead of ticks and crosses - the idea is to quickly show which things give the students more info)
Yes, very true that it can't be eye-balled in the test run. Anyway, your suggestion in numbas/Numbas#529 should at least help to protect against the cases which end up with lost marks. A rundown of the options, after upload to the LTI provider, would be a nice addition. At present, after upload, the user is presented with the blank dashboard, so a success notification (which might include the test options) would be good.
I've opened #31 for the summary of feedback settings idea.
It’s possible to open the Numbas runtime frame in a new window e.g. “open frame in a new tab” or similar, which opens a stand-alone test. This is a problem from the point of view of giving a student in-the-know an opportunity to see solutions (albeit with a new randomisation), but it can also result in a student mistakenly completing the test in the second window, where no data is being sent back to the server and with no warning when they complete the test.
One fix could be to offer a checkbox in the editor's exam settings, "Require Numbas LTI Tool?" or similar, which enables a check in the runtime for a parent frame (or for the specific LTI parent) and stops the page from loading if none exists.
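A minimal sketch of what such a check might look like, assuming nothing about the actual Numbas API: the function names (`isTopLevel`, `guardAgainstStandalone`) and the `requireParent` flag are hypothetical stand-ins for the proposed setting. When a frame is opened in a new tab, the new window is its own parent, so comparing `window.parent` with `window` catches exactly the scenario described above. (Inspecting *which* parent is present would be subject to cross-origin restrictions, so this sketch only checks that some parent exists.)

```javascript
// Hypothetical sketch, not the Numbas runtime's actual code.

// A page is top-level (stand-alone) when its window is its own parent.
// A frame "opened in a new tab" becomes top-level, so this catches that case.
function isTopLevel(win) {
    return win.parent === win;
}

// If the hypothetical "Require Numbas LTI Tool?" flag is set and the page
// is top-level, replace the content with a warning instead of starting the
// exam, so the student cannot mistakenly complete an unsaved attempt.
function guardAgainstStandalone(win, doc, requireParent) {
    if (requireParent && isTopLevel(win)) {
        doc.body.innerHTML =
            '<p>This exam must be launched from your course page; ' +
            'attempts made in this window will not be saved.</p>';
        return false;   // do not start the exam
    }
    return true;        // embedded (or flag unset): safe to continue
}
```

A softer variant of the same check could show the "your progress is not being saved" banner suggested earlier instead of refusing to load.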