Fixes for assessments dashboard #44887
Conversation
level_result[:student_result] = student_answer
level_result[:status] = ""
when Multi
  answer_indexes = Script.cache_find_level(level.id).correct_answer_indexes_array
Would it make sense to have a method in the Multi.rb model that takes a student answer and returns the result (unsubmitted, correct, or incorrect)?
That's a good idea, but I think if we did that, we'd want to also move the logic for the other level types into their respective classes and that feels out of scope for this change. Ok if we save that for another day?
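For illustration, the reviewer's suggestion could be sketched roughly as below. This is a framework-free sketch, not the actual Multi model: the method name, the constructor, and the answer format (assumed here to be a comma-separated index string like "0,2") are all hypothetical.

```ruby
# Hypothetical sketch of a result-classifying method on the Multi level model.
# The real model's answer storage and API may differ.
class Multi
  def initialize(correct_answer_indexes)
    @correct = correct_answer_indexes.sort
  end

  # Returns :unsubmitted, :correct, or :incorrect for a raw student answer,
  # assumed to be a comma-separated string of selected answer indexes.
  def result_for(student_answer)
    return :unsubmitted if student_answer.nil? || student_answer.empty?
    student_answer.split(",").map(&:to_i).sort == @correct ? :correct : :incorrect
  end
end
```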
# A given user with no UserLevel matching the given criteria is omitted from
# the returned hash. The associated LevelSource data for each UserLevel is also
# prefetched to prevent n+1 query issues.
def self.user_levels_by_user(user_ids, script_id, level_ids)
I'm curious why you chose to put this in the User model.
Good question! It seems like several of our progress-related methods are here, including User#user_levels_by_level right above this method and User#user_levels_by_user_by_level right below, which seemed very similar. I'm open to moving it if there's a better place to put it?
My initial instinct is that a query returning UserLevels for multiple users would make more sense in the UserLevel model, but if this is the pattern, I'm fine with leaving it as-is. I'd love it if we had clearer patterns on what should go in our models and where queries should live, because it's often unclear to me and feels really disorganized.
100% agree that this is confusing and disorganized. 😞
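For reference, the shape of the user_levels_by_user method discussed above can be sketched without the framework. In this sketch UserLevel rows are plain hashes and the data is passed in as an argument; the real method is an ActiveRecord query on the User model that also prefetches the associated LevelSource data.

```ruby
# Illustrative, framework-free sketch: given user_ids, a script_id, and
# level_ids, return a hash from user_id to that user's matching UserLevel
# rows. Users with no matching row are simply absent from the hash,
# mirroring the documented behavior.
def user_levels_by_user(user_levels, user_ids, script_id, level_ids)
  user_levels
    .select do |ul|
      user_ids.include?(ul[:user_id]) &&
        ul[:script_id] == script_id &&
        level_ids.include?(ul[:level_id])
    end
    .group_by { |ul| ul[:user_id] }
end
```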
This PR contains several fixes for the assessments dashboard.
The biggest change is a medium-term workaround for a bug where the assessments dashboard displays incorrect data. This bug arises because student answers for assessments are stored in two places: (1) the answer for each individual question is stored on the UserLevel corresponding to the individual question (a.k.a. sublevel), and (2) a JSON blob with the answers to all questions in the assessment is stored on the UserLevel corresponding to the assessment (a.k.a. LevelGroup). These two places are updated separately by the client so they can and do get out of sync.
This PR changes the assessments dashboard to read the student answers from the individual questions like other parts of the website. In order to not regress performance, we now process students in batches of 20, retrieving all UserLevel rows needed to construct the response for the 20 students in a single query.
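The batching described above can be sketched as follows. This is a simplified illustration: the fetch and build callables stand in for the single batched UserLevel query and the per-student response construction, and are hypothetical names, not the PR's actual code.

```ruby
BATCH_SIZE = 20

# Processes students in slices of BATCH_SIZE so that each slice needs only
# one UserLevel lookup. `fetch` represents the single batched query for a
# slice of students; `build` represents constructing their response data.
def process_in_batches(student_ids, fetch:, build:)
  student_ids.each_slice(BATCH_SIZE).flat_map do |batch|
    rows = fetch.call(batch) # one query covers the whole batch of students
    build.call(rows)
  end
end
```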
The long-term direction here is still TBD. After this change, the JSON blob on the UserLevel for the assessment is no longer used. I've left the related code and data alone for now and will revisit and clean up as appropriate when we take a deeper look at assessments next year. This is tracked as LP-2220.
While I was in this area, I also fixed some cosmetic issues that made the page look really sloppy.
Before:
After:
Links
Testing story
Deployment strategy
Follow-up work
Privacy
Security
Caching
PR Checklist: