Add difficulty filter on mentor dashboard #4105

Open
nicolechalmers opened this Issue Aug 1, 2018 · 8 comments

nicolechalmers commented Aug 1, 2018

Mentoring can get repetitive when working through the same easy core solutions. See the screenshot below, taken from an actual mentor's dashboard.

Suggestion: Add a difficulty filter to "Solutions you're mentoring" so mentors can choose which exercises they give feedback on, making the process more varied and fun.

Other related ideas for discussion:

  • Have a random-solution button that gives mentors a varied set of solutions to work through
  • When mentors sign up, we can ask whether they would like to give feedback on "Easy" solutions or "All" solutions. I know some mentors aren't sure if they're ready to mentor, so choosing Easy (at least to start with) would make sense, and it would make things more interesting for more experienced mentors.

[Screenshot: an actual mentor's dashboard, 2018-08-01]
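The suggested filter could be sketched roughly like this. This is only an illustration: the `Solution` type, its `difficulty` field, and the 1-10 scale are assumptions for the sake of the example, not Exercism's actual data model.

```python
from dataclasses import dataclass

@dataclass
class Solution:
    exercise: str
    difficulty: int  # hypothetical 1-10 rating; not Exercism's actual data model

def filter_by_difficulty(solutions, minimum=1, maximum=10):
    """Return only the solutions whose difficulty falls within the range."""
    return [s for s in solutions if minimum <= s.difficulty <= maximum]

# Example: a mentor tired of easy exercises filters the queue for harder ones.
queue = [
    Solution("leap", 1),
    Solution("bob", 2),
    Solution("forth", 8),
]
harder = filter_by_difficulty(queue, minimum=5)
```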

frerich commented Aug 1, 2018

I think somehow bringing more difficult exercises closer to the top of the list is not only for the mentors' benefit -- as it is, I wonder whether people submitting solutions to harder or less popular problems ever get any mentor feedback at all.

nicolechalmers commented Aug 1, 2018

I think solutions for harder or less popular exercises are treated the same as all other solutions. @iHiD - can you confirm?

Contributor

iHiD commented Aug 1, 2018

Exercises marked as CORE are ordered oldest first, so there's nothing in the software that specifically prevents hard exercises from showing up.

frerich commented Aug 1, 2018

Are CORE exercises always preferred over the optional ones? If so, I wonder whether some users who submit solutions to optional exercises never actually get any feedback, because the mentoring queue is saturated by CORE submissions (which are almost certainly more numerous, since everyone has to go through them and they are easier, so they are submitted at a higher rate).

Contributor

iHiD commented Aug 1, 2018

Yes. Because core exercises block students from progressing, whereas the extra ones don't.

I see the core as teaching opportunities and the extras as practice opportunities. And while mentoring is great for both, it's more essential on the former.
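The ordering described here (core submissions preferred, oldest first within each group) could be sketched roughly as follows; the `Submission` type and its field names are hypothetical, not the actual schema:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Submission:
    exercise: str
    is_core: bool
    submitted_at: datetime

def mentoring_queue(submissions):
    # Core submissions sort before optional ones (False < True),
    # and within each group the oldest submission comes first.
    return sorted(submissions, key=lambda s: (not s.is_core, s.submitted_at))

queue = mentoring_queue([
    Submission("forth", False, datetime(2018, 7, 1)),
    Submission("leap", True, datetime(2018, 7, 15)),
    Submission("bob", True, datetime(2018, 6, 1)),
])
```

Under this scheme an optional submission only surfaces once every core submission ahead of it has been handled, which is exactly the saturation concern raised above.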

frerich commented Aug 1, 2018

One crazy idea:

Instead of (or in addition to) the one-dimensional chronological sort, maybe it would be interesting to group submissions by exercise and display the exercises as tiles in a two-dimensional grid. You could then use a 'heat map' to indicate where mentoring is needed: each exercise is a rectangular tile whose colour ranges from e.g. white through green, yellow and red to purple.
The colour indicates the average response time for user submissions -- a higher temperature means a longer response time, so more mentoring attention is needed.

The idea being that even if some exercises generate a lot of traffic, you still have an overview and -- if you like -- can choose to mentor an entirely different exercise.

It's just a crazy idea I wanted to throw into the room. :-)
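A minimal sketch of that colour mapping, assuming made-up response-time thresholds measured in days (a real dashboard would tune these against actual response-time data):

```python
def heat_colour(avg_response_days, thresholds=(1, 3, 7, 14)):
    """Map an exercise's average mentor response time to a tile colour.

    The threshold values (in days) are illustrative assumptions only.
    """
    colours = ("white", "green", "yellow", "red")
    for colour, limit in zip(colours, thresholds):
        if avg_response_days <= limit:
            return colour
    return "purple"  # hotter than every threshold: mentoring urgently needed
```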

frerich commented Aug 1, 2018

I think it makes perfect sense to consider the core exercises as 'teaching opportunities'. However, since there are only a handful of core exercises (they are vastly outnumbered by the optional ones), I wonder whether those teaching opportunities eventually become a bit repetitive as long as core exercise submissions are preferred in the display over optional ones.

frerich commented Aug 1, 2018

This may also just be an unfortunate consequence of the Haskell language track, in which 3,802 students are distributed over 7 mentors. Assuming everyone is equally active, that would mean I have to look at about 543 solutions testing whether a year is a leap year; I wouldn't mind seeing some other code every now and then. :-)
