Improve internal submission data reporting #2820

Closed
NateWr opened this issue Sep 27, 2017 · 21 comments

Comments

@NateWr
Contributor

commented Sep 27, 2017

The 2017 sprint produced a document proposing two reports (submissions and reviewer reports), which detailed the fields required, set priorities and noted errors in the existing data set.

https://docs.google.com/spreadsheets/d/e/2PACX-1vS9mypS4nXwcQcsdVUv0kmjjHFhLKJY21jWUUIPs0CT20dla825Z7ACxfGMMOC2FuKYHqVxLnRV6N_i/pubhtml#

Further information can be found in the sprint report post:

https://pkp.sfu.ca/2017/09/26/pkp-2017-sprint-report-internal-statistics/

There are three broad tasks:

  1. Add missing data to existing reports, or create reports where missing.
  2. Correct faulty data in existing reports.
  3. Add some kind of visual display of key internal stats.

Related issues: #1265, #2061

@NateWr NateWr added this to the OJS/OMP 3.2 milestone Sep 27, 2017

@NateWr NateWr added this to Current Issues in Statistics Framework Nov 28, 2017

@jmacgreg jmacgreg added the Hosting label Jan 11, 2018

@jmacgreg jmacgreg removed this from Current Issues in Statistics Framework Jan 17, 2018

@jmacgreg jmacgreg added this to Current Issues in Statistics Framework Jan 17, 2018

@jmacgreg jmacgreg moved this from To Do to Needs Specification in Statistics Framework Feb 12, 2018

@jmacgreg jmacgreg moved this from Needs Specification to Notes in Statistics Framework Feb 12, 2018

@jmacgreg jmacgreg changed the title Improve internal statistics reporting Improve internal submission data reporting Feb 12, 2018

@jmacgreg jmacgreg moved this from Notes to Needs Specification in Statistics Framework Feb 12, 2018

@jmacgreg jmacgreg moved this from Needs Specification to In Progress in Statistics Framework May 7, 2018

@bozana bozana self-assigned this Oct 9, 2018

@Baytan

commented Nov 6, 2018

I learned that the submission and last decision dates will be added to the article reports. That is very good, thank you. However, I think the first decision date is also important, as some journals report it to authors to show how quickly they process submissions.

In short, in addition to the submission and final decision dates, the first decision date should be added to the reports. Thank you all.

@bozana

Collaborator

commented Dec 3, 2018

Review Report now contains the following columns:

Stage
Round
Submission Title
Submission ID
Reviewer
Given Name
Family Name
ORCID iD
Country
Affiliation
Email
Reviewing interests
Date Assigned
Date Notified
Date Confirmed
Date Completed
Date Acknowledged
Unconsidered
Date Reminded
Response Due Date
Response Overdue Days
Review Due Date
Review Overdue Days
Declined
Recommendation
Comments On Submission

As before, each report row represents one reviewer assignment per submission and review round, which I think is correct. Because of that, the assigned editors and the decision do not make much sense in this report: they would just be repeated for every row of a given submission and round. Furthermore, this data is covered by the article report, so I believe there is no need to include it here?
The following additional reviewer data have been added, as requested: e-mail, ORCID, country, and affiliation. We no longer store gender, so it cannot be included.
I hope all requested dates are now covered? It is possible that I am misunderstanding some terms from the Google doc...
What exactly do the "# Times ..." fields for a reviewer mean?
@jmacgreg (and possibly @NateWr), could you take a look at this new column list and the one from the Google doc, i.e. the requirements, and tell me whether anything is missing or should be done differently?

@alexxxmendonca

Contributor

commented Dec 3, 2018

Because of that, the assigned editors and the decision do not make much sense in this report: they would just be repeated for every row of a submission. Furthermore, this data is covered by the article report, so I believe there is no need to include it here?

I'm not so sure about this. One would have to cross-reference data between reports to find out which editors were assigned and what the final decision was.
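
(Purely as an illustration of what that cross-referencing would involve outside OJS: a minimal sketch that joins the two exported CSV reports on a shared Submission ID column. The file names, and the assumption that both reports carry an identical "Submission ID" header, are hypothetical.)

```python
# Hypothetical sketch, not part of OJS: cross-reference the two exported reports
# so each review-assignment row also carries the submission-level data.
import pandas as pd

reviews = pd.read_csv("reviewReport.csv")    # one row per reviewer assignment and round
articles = pd.read_csv("articleReport.csv")  # one row per submission

# Left-join the article report (editors, decisions, status, ...) onto every
# review row via the shared "Submission ID" column.
merged = reviews.merge(
    articles,
    on="Submission ID",
    how="left",
    suffixes=(" (review)", " (article)"),
)

merged.to_csv("reviews_with_submission_data.csv", index=False)
```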

What exactly do the "# Times ..." fields for a reviewer mean?

It stands for "Number of times". Just replace the # with "Number of" ;)

@bozana

Collaborator

commented Dec 3, 2018

It stands for "Number of times". Just replace the # with "Number of" ;)

But if a report row covers only one submission, one reviewer, and one round, a reviewer will only be invited/acknowledged etc. once.
If this is meant as general information about the reviewer, it would again be duplicated for every row involving the same reviewer.
So maybe @jmacgreg, @NateWr, and/or others could also say what they think about this duplication of information in the reports, i.e. whether there might be a better way, depending on the main requirements... Especially for assigned editors and decisions, because several editors could be assigned (each with given + family name, ORCID, and e-mail) and several decisions could be made (each with a decision date and the decision itself)...

@alexxxmendonca

Contributor

commented Dec 3, 2018

Yes, that makes sense.

@NateWr

Contributor Author

commented Dec 3, 2018

Looks good, @bozana. 👍 The only thing I didn't see mentioned from the spreadsheet is # days between invitation and completion. This can be easily calculated from the invitation and completion columns, so we're probably ok there.
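
(A minimal sketch of that calculation on the exported review report CSV, treating "Date Notified" as the invitation date; the file names are assumptions:)

```python
# Rough sketch (not part of OJS): derive "# days between invitation and completion"
# from the exported review report, using "Date Notified" as the invitation date.
import pandas as pd

report = pd.read_csv("reviewReport.csv", parse_dates=["Date Notified", "Date Completed"])

# Incomplete reviews parse to NaT, so their duration simply stays empty (NaN).
report["Days From Invitation To Completion"] = (
    report["Date Completed"] - report["Date Notified"]
).dt.days

report.to_csv("reviewReport_with_durations.csv", index=False)
```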

One would have to cross-reference data between reports to find out which editors were assigned and what the final decision was.

To determine whether we should put the assigned editors and final decision information into the reviewer report, I think we need to decide what use-cases we want to support. Are the reports meant to be a raw data dump that will be processed in some way? Or do we treat them as something for non-technical end users to look through line by line?

If it's the latter, it might be worth adding this information in, even if it's duplicated for each review assignment. But I'd want to hear more details about the specific use-cases here (@alexxxmendonca can you outline some?). It may be that those use-cases are better served by a filter for assigned editors, or adding to the submission report, or maybe by creating a new report altogether.

@jmacgreg

Member

commented Dec 3, 2018

Hi @bozana, this looks great! I'll review it myself, and will be having Minnesota review it as well. They can probably get any comments back within a week.

In terms of use cases, I think there's a case to be made, over time, for a high-level Production report (i.e. the current Articles report) that provides information on the submission's activity per stage, including dates, and a more detailed Reviews report, which is what we have here. In both cases, some duplication of content will be inevitable. In the case of the Reviews report, I think it's important for the managing editor/editorial team to see the assigned reviewer and the decision from that review stage, if available, but that is less important than the content already included.

@alexxxmendonca

Contributor

commented Dec 3, 2018

To determine whether we should put the assigned editors and final decision information into the reviewer report, I think we need to decide what use-cases we want to support. Are the reports meant to be a raw data dump that will be processed in some way? Or do we treat them as something for non-technical end users to look through line by line?

I think it's best if we go for a data-dump approach (it is also easier to deal with). OJS itself shouldn't necessarily provide highly elaborate reports, since that is not within its core purpose. However, it should provide enough data to be exported and imported into a tool designed specifically for data manipulation (for example, Excel).

I am in favor of providing as much data as possible; the journal can simply discard the columns/data they are not interested in.

@bozana

Collaborator

commented Dec 3, 2018

The article report will now contain the following columns:

Submission ID
Title
Abstract
Given Name (Author X)
Family Name (Author X)
ORCID iD (Author X)
Country (Author X)
Affiliation (Author X)
Email (Author X)
URL (Author X)
Bio Statement (e.g., department and rank) (Author X)
Given Name (Editor X)
Family Name (Editor X)
ORCID iD (Editor X)
Email (Editor X)
Editor Decision Y (Editor X)
Date decided Y (Editor X)
Date submitted
Last modified
Section title
Language
Status

Gender is again not included, because we do not have this information.
Listing all decisions plus the corresponding decision date for each editor (editor = editor + sub-editor) gives a complete record of the decisions made, who made them, and whether each one is just a recommendation or an actual decision.
Status corresponds to "Current task" from the Google document, I think.
Here again, the review information is not included, since the review report covers that.
Regarding the dates: the last decision date is not explicitly displayed (because all decisions and their dates are listed), and neither are the "Days since..." values, because I thought the user could calculate them in Excel.

Note: we export the data in CSV format, which can be loaded into Excel or Calc and processed further as desired...
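
(As a rough illustration of the kind of follow-up calculation left to the user, done here in Python instead of Excel or Calc; the column name matches the list above, the file names are assumptions:)

```python
# Rough sketch (not part of OJS): compute a "Days since submission" value
# from the exported article report CSV, as suggested above.
import pandas as pd

articles = pd.read_csv("articleReport.csv", parse_dates=["Date submitted"])

today = pd.Timestamp.today().normalize()
articles["Days since submission"] = (today - articles["Date submitted"]).dt.days

articles.to_csv("articleReport_with_days_since_submission.csv", index=False)
```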

To all (@jmacgreg, @NateWr, @alexxxmendonca): is this OK?

@bozana

Collaborator

commented Dec 3, 2018

I will provide the PRs here soon, so that you (maybe @jmacgreg) can take a look and test them with real data.
I would also like to hear from you whether the duplicate data mentioned above, or anything else, should be included as well...

@alexxxmendonca

Contributor

commented Dec 3, 2018

I understand not adding all of the information about peer-review rounds, but could we perhaps add just a "# of reviews" field?

@bozana

Collaborator

commented Dec 3, 2018

I.e., the changes are in my repositories, on branch 2820, for testing...

@asmecher

Member

commented Dec 17, 2018

@bozana, could you back-port this to the ojs-stable-3_1_1 branch/repos as well? Thanks!

@asmecher

Member

commented Dec 17, 2018

(Rescheduled tentatively so I don't forget -- not confirmed!)

@bozana

Collaborator

commented Jan 10, 2019

@asmecher, it would now be enough to rebase/cherry-pick/port this onto stable-3_1_2 (and no longer onto ojs-stable-3_1_1), right?

@bozana

Collaborator

commented Jan 13, 2019

@jmacgreg and @asmecher, I've rebased the changes onto the current stable-3_1_2 branch; see the PRs above.
@jmacgreg, is the stable-3_1_2 branch enough for Minnesota to be able to review it?

Once everything is OK, I will rebase onto the current master as well...

@jmacgreg

Member

commented Jan 14, 2019

Hi @bozana, yep, that will be perfect! I'll be setting up a new review branch for them this week. We'll get any comments back ASAP.

@jmacgreg

Member

commented Feb 11, 2019

FYI I've reviewed these, with Minnesota, and all looks good! This can be closed AFAIK.

@bozana

Collaborator

commented Feb 11, 2019

Thanks a lot @jmacgreg! I will then rebase and merge with stable-3_1_2 and cherry-pick to master...

bozana added a commit to bozana/pkp-lib that referenced this issue Feb 14, 2019

bozana added a commit to bozana/reviewReport that referenced this issue Feb 14, 2019

bozana added a commit to bozana/ojs that referenced this issue Feb 14, 2019

bozana added a commit to bozana/ojs that referenced this issue Feb 14, 2019

bozana added a commit to pkp/reviewReport that referenced this issue Feb 14, 2019

bozana added a commit to bozana/pkp-lib that referenced this issue Feb 14, 2019

bozana added a commit to bozana/reviewReport that referenced this issue Feb 14, 2019

bozana added a commit to bozana/ojs that referenced this issue Feb 14, 2019

bozana added a commit to bozana/ojs that referenced this issue Feb 14, 2019

bozana added a commit to bozana/ojs that referenced this issue Feb 14, 2019

@bozana

Collaborator

commented Feb 14, 2019

Merged, thus closing...
