This repository has been archived by the owner on Aug 1, 2023. It is now read-only.

Proposal: Community Audit of Monthly App Mining Rankings #50

Closed
GinaAbrams opened this issue Feb 26, 2019 · 8 comments
@GinaAbrams
Contributor

Currently, the app mining rankings and payouts are opaque until after the payouts are complete. I would like to add a community QA step before payouts go out.

Open question is methodology:

  • Should we include the whole community, or rotate individuals?
  • Should there be a reviewer who specifically works on this every month?
  • What are the necessary criteria to result in changing the draft of rankings?

cc @hstove @cuevasm @jeffdomke

@friedger
Contributor

Should the quality of the reviewers be audited as well?

@GinaAbrams
Contributor Author

@friedger are you thinking on a reviewer-by-reviewer basis, or about the selection of the auditor?

@friedger
Contributor

friedger commented Feb 28, 2019

@GinaAbrams I was thinking about reviewing the quality of the reviewers' scores as part of the audit, not about the selection of the auditor.

@hstove
Collaborator

hstove commented Mar 1, 2019

The game theorist's paper includes some possible methods for reviewing the reviewers.

However, this issue is about making sure we've done the calculations correctly based on the raw data. Determining whether a reviewer is doing a good job is part of our normal feedback process, which can happen in this GitHub repo and in our app mining community meetings.

@hstove
Collaborator

hstove commented Mar 8, 2019

OK, to recap: this is about reducing errors in the actual calculation of scores. It's not about 'meta-comments' on what the scores 'should' be; it's about making sure we convert the raw data to rankings correctly.

We are leaning towards inviting a single community member to review these calculations (in the form of a spreadsheet), with a day or two of audit time before we publish the full rankings and do payouts.
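For context, the kind of check an auditor would run can be sketched as follows. This is a hypothetical example: the app names, scores, and the mean-then-rank formula are illustrative placeholders, not the actual App Mining data or scoring methodology.

```python
# Hypothetical audit check: recompute rankings from raw reviewer scores
# and compare against the published draft. All data and the averaging
# formula below are illustrative, not the real App Mining methodology.

raw_scores = {
    "app-a": [80, 90, 85],
    "app-b": [70, 95, 87],
    "app-c": [60, 65, 70],
}

published_ranking = ["app-a", "app-b", "app-c"]

def recompute_ranking(scores):
    """Rank apps by their mean reviewer score, highest first."""
    averages = {app: sum(s) / len(s) for app, s in scores.items()}
    return sorted(averages, key=averages.get, reverse=True)

# Flag any position where the recomputed order disagrees with the draft.
mismatches = [
    (i, pub, rec)
    for i, (pub, rec) in enumerate(zip(published_ranking,
                                       recompute_ranking(raw_scores)))
    if pub != rec
]
print("OK" if not mismatches else f"Mismatches: {mismatches}")
```

The point is that the audit is mechanical: given the raw data and a documented formula, anyone can rerun the conversion and diff it against the draft rankings.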

Do community members have any feeling about who this person should be?

@fiatexodus

Alternatively @hstove, you could just announce preliminary results and release the data for the app community to review for a day before they become official? (That removes the burden and extra process of selecting a person each month.)

@dantrevino

I may be stating the obvious, but preliminary reviews will need to include any methodology and calculations used. The formulas are in the spreadsheet cells, but there should be language explaining how and why the calculations are the way they are. I had a discussion today with an app miner about the difference between the scores displayed on the TryMyUI site and those in the spreadsheet.

@GinaAbrams
Contributor Author

Thanks @dantrevino, agree with you there 🙏.

For April and moving forward, we are going to release the sheet of results two business days ahead of payments so that the community can review it. This could change down the line, but it's a start. I'm going to close this soon unless there are any objections.
