Global idea: Improve detection of reviews that really need more comments #20
Context: there are (and always will be) more people posting reviews than there are reviewers, so anything we can do to help reviewers focus their efforts on the reviews that need it most will benefit everyone: players, because they'll have comparatively more reviewer attention available, and reviewers, because their comments will be more likely to be useful and appreciated.
More concretely, when going through the list of reviews I haven't looked at yet, there are two categories of reviews where I'm never sure whether I should add my input:
Update: I wanted to start spelling out in more detail what the feature would look like. The idea would be to give each review a "score" that indicates how desirable it is to comment on that review (what "desirable" means exactly still needs to be defined).
Related: hide the entries that don't need more comments (or, conversely, showcase the ones that do). Criteria for deciding whether or not a review needs more comments could be:
Indeed. Been pondering this one - without conclusion yet - for a while now myself. I continue to ponder...
Basically, there needs to be a way to focus the volunteer attention on the most (1) worthy (e.g. because the OP has put the effort in too) and (2) valuable (to the OP, to the commentator, to the community in general, to Z2H, etc.) games posted for review. Guaranteeing attention is the function of paid coaching, so I think it is fine to have an approach of being selective and generating focus for the rest. Quality over quantity is critical, as the quality of review is the site's current USP, I think.
(This underlying thinking is what will be guiding my thoughts on what to do and form the success criteria to test against)
Yup - totally agree. Plus, an attempt to define "need it most" in more detail is above.
I agree this is a tricky one (and one I've wondered about frequently myself). We're balancing two conflicting goals: (1) ensuring as many reviews as possible get a helpful response; (2) making the most of the tendency for reviews with multiple participants to produce the best and most informative debate.
Is the answer simply to divide the time in a structured way? Thus:
I think it's fine to be hard-nosed about this. I myself only review games where the OP went through the game first before going looking for any other opinions. Particularly with the automated uploader being a thing now, there's a lot more of this, so we need to focus on the reviews where there's evidence the OP themselves is also focussing on them.
A useful (new) distinction to make might be between "upload" and "post". With the HDT plugin, there now exist (many) games that could be argued to be uploaded, but not actually posted. "Posting" a game, to me, means filling in all the fields, being explicit about any specific questions I have, adding the decklist, etc. (i.e. all the things that were true of what constituted a "good post" with the old comments system still apply), then finally making it public. Thus the "posting date" is the point in time the game was set to "public".
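One way to model the upload/post distinction described above (a minimal sketch; the field and property names here are my own assumptions, not the site's actual data model): a game exists as soon as it is uploaded, but only counts as "posted" once the OP makes it public, and the posting date is the moment that visibility flipped.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class Game:
    uploaded_at: datetime                       # set by the HDT plugin upload
    made_public_at: Optional[datetime] = None   # set only when the OP publishes

    @property
    def is_posted(self) -> bool:
        # "Uploaded but not posted" games have no public timestamp yet.
        return self.made_public_at is not None

    @property
    def posting_date(self) -> Optional[datetime]:
        # The "posting date" is the point in time the game was set to public.
        return self.made_public_at
```

Keeping the two timestamps separate means the priority list can simply filter on `is_posted` and sort on `posting_date`, without ever confusing plugin uploads with deliberate posts.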
So... given all this... I think we want to try to find games that are:
Some (or perhaps all) of these are probably better implemented as prioritising factors rather than strict inclusion/exclusion criteria... indeed, it's probably better to develop a "need" algorithm than to exclude anything outright (anything with a low "need" score would be implicitly left off, simply by virtue of ending up at the bottom of a long list).
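The "need" algorithm idea above could be sketched roughly like this. Everything here is hypothetical: the factors, field names, and weights are placeholders for whatever criteria are eventually chosen; the point is only that factors weight a score rather than exclude reviews, so low-need reviews sink to the bottom of the list instead of disappearing.

```python
from dataclasses import dataclass

@dataclass
class Review:
    title: str
    op_effort: float    # illustrative factor: fields filled in, questions asked (0..1)
    comment_count: int  # illustrative factor: comments already received
    days_old: int       # illustrative factor: days since posting

def need_score(r: Review) -> float:
    """Higher score = more in need of reviewer attention (weights are invented)."""
    score = 0.0
    score += 2.0 * r.op_effort              # reward OPs who put the effort in
    score -= 1.0 * min(r.comment_count, 5)  # well-discussed reviews need less attention
    score -= 0.1 * r.days_old               # stale reviews slowly "expire" downward
    return score

def prioritised(reviews: list[Review]) -> list[Review]:
    # Nothing is excluded; low-need reviews just end up at the bottom.
    return sorted(reviews, key=need_score, reverse=True)
```

Tuning would then be a matter of adjusting weights rather than redrawing a hard inclusion/exclusion boundary.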
Regarding what the OP can give to help this process:
I agree this would be... essential. I think the aforementioned prioritised list should only include those that - in theory anyway - need attention.
Not sure about this one... or at least it should take a while before reviews "expire" and drop off the list.
I think this is a good idea and a useful addition to the review tools (regardless of what else is ultimately decided on to implement).
Or... both: "resolved" defaults to true once the OP marks comments as "helpful" (but he/she has the option to set it back to "looking for further input"), and the OP can set "resolved" to true even if they don't make any other changes.
(Use case: I tend to mark comments as "helped me" if I learnt something actually new, whereas I tend to use the upvote feature for comments that are otherwise good (i.e. confirmed my thinking, generally positive, expanded on a point, etc.). So it is possible for a review to not actually teach me anything new, but be good/quality/satisfying nonetheless - and it would be nice to be able to indicate so.)
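The resolved-flag defaulting proposed above boils down to a small rule, sketched here under assumed names (`has_helpful_mark`, `op_override` are illustrative, not the site's actual fields): the OP's explicit choice always wins, and only when they haven't said anything does the "helpful" mark provide the default.

```python
from typing import Optional

def is_resolved(has_helpful_mark: bool, op_override: Optional[bool]) -> bool:
    """Compute the review's "resolved" state.

    op_override: None  -> OP hasn't touched the flag; fall back to the default
                 True  -> OP explicitly marked resolved (even with no other changes)
                 False -> OP explicitly set it back to "looking for further input"
    """
    if op_override is not None:
        return op_override
    # Default: marking a comment "helpful" implies the review is resolved.
    return has_helpful_mark
```

Using a three-valued override (unset/true/false) rather than a plain boolean is what lets the OP undo the automatic default in either direction.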
Definitely. Further reputation points?
btw, is the reputation point system still fit for purpose after all the ways-of-working changes on the site?
Need to avoid nagging, of course, but perhaps one reminder per game might make sense. I suggest it follows the same conventions (and respects the same settings) as the notification system in general, however (i.e. it just becomes another type of notification).