
Project health data collection #24

Open
jimthematrix opened this issue Feb 3, 2022 · 14 comments

Comments

@jimthematrix
Contributor

jimthematrix commented Feb 3, 2022

Overview

This issue tracks the discussions and the work around project health indicator data.

At the TSC meeting on 1/6/2022, the policy around project quarterly reports was brought up. During that discussion, the need to collect data that accurately reflects a project's health came up.

A proposal was made to pre-populate the project quarterly report template with project health data. This would relieve project teams of some of the stress of filling out the reports, because each report would already come with useful data. Equally important, it gives the TSC members a standard set of data dimensions to review in order to properly evaluate the health of each project.

Sources of Health Data

Currently, for Hyperledger projects, the following sources of data are available:

  • Linux Foundation Insights: a custom-developed application maintained by the LF. The app currently covers GitHub-based activities (commits, PRs, issues) and social engagement (Rocket.Chat).
  • Project GitHub contributor reports: data pulled using GitHub APIs and assembled into PDF reports. Currently focused on contributor counts and status: total, new, core/regular/casual, active/inactive, etc. (a minimal sketch of this kind of pull appears below).
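
As a rough illustration of what the contributor-report pull looks like, here is a minimal sketch using the public GitHub REST API. The repo name, the `requests` dependency, and the core/regular/casual thresholds are all assumptions for illustration; the actual report tooling is not described in this issue.

```python
# Minimal sketch: contributor counts for one repo via the GitHub REST API.
# The core/regular/casual thresholds below are hypothetical, chosen only
# to illustrate the kind of bucketing the contributor reports describe.
import requests

def contributor_summary(owner: str, repo: str, token: str = "") -> dict:
    headers = {"Accept": "application/vnd.github+json"}
    if token:
        headers["Authorization"] = f"Bearer {token}"
    url = f"https://api.github.com/repos/{owner}/{repo}/contributors?per_page=100"
    contributors = []
    while url:
        resp = requests.get(url, headers=headers)
        resp.raise_for_status()
        contributors.extend(resp.json())
        url = resp.links.get("next", {}).get("url")  # follow pagination

    counts = [c["contributions"] for c in contributors]
    return {
        "total": len(counts),
        "core": sum(n >= 50 for n in counts),          # hypothetical threshold
        "regular": sum(10 <= n < 50 for n in counts),  # hypothetical threshold
        "casual": sum(n < 10 for n in counts),
    }

print(contributor_summary("hyperledger", "fabric"))
```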

Future Requirements

What can be done to allow project health data to be accurately identified and properly captured?

Stable APIs

Insights currently doesn't offer stable APIs that can be used outside of Insights' own dashboard UI. Having stable APIs would allow the information to be embedded in other places, such as the TSC wiki during review meetings.

It's important to capture point-in-time snapshots for the quarterly reports and for other places where such information is used, such as the Learning Materials Development Working Group.

This requirement for snapshots can be satisfied in one of the following ways (a minimal sketch of the first option appears after this list):

  • APIs that allow a date range to be specified and reliably produce the same results for the same range
  • APIs that render the data into a report or chart images that can be downloaded as a PDF or image file
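
To make the first option concrete, here is a sketch of what a stable, snapshot-friendly API call could look like. The endpoint, URL, and parameters are entirely hypothetical (the point above is that no such stable API exists today); the sketch only illustrates the requirement that the same date range must reliably return the same results.

```python
# Hypothetical shape of a stable Insights API with an explicit date range.
# Nothing here is a real endpoint; it only models the reproducibility
# requirement: one fixed [start, end] window -> one fixed result set.
import requests

def fetch_snapshot(project: str, start: str, end: str) -> dict:
    resp = requests.get(
        "https://insights.example.org/api/v1/metrics",  # placeholder URL
        params={"project": project, "from": start, "to": end},
    )
    resp.raise_for_status()
    return resp.json()

# Called in April or in June, the same Q1 range must return identical
# data, so a quarterly report never changes after it is generated.
q1 = fetch_snapshot("fabric", "2022-01-01", "2022-03-31")
```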

More data sources for Insights

Are there other data sources that can be useful to load into Insights?

Updated List (ver. 03/02/2022)

The following bullets capture the summary from the ongoing discussions.

  • community
    • growth: both in terms of new interested individuals and conversion to contributor. data that reflects this dimension:
      • the number of contributors to the code base (github PRs)
      • the number of contributors to design discussions (discord)
      • the number of contributors to requirements (github issues)
    • diversity: no single organization keeps the project alive. data that reflects this dimension:
      • the number of organizations contributing to the code base (github PRs)
    • retention: interesting/useful projects attract contributors, healthy projects retain them. data that reflects this dimension:
      • active contributor longevity (github PRs, discord)
    • maturity: I'm not able to properly articulate this one, maybe someone can help here?
    • responsiveness: how long until proposed changes (code, design, bug reports, etc.) are given attention? data that reflects this dimension:
      • time to resolve PRs and issues (github; see the sketch after this list)
      • time to respond to questions (discord)
  • code
    • usefulness: is the project being adopted by customers and tire kickers? data that reflects this dimension:
      • usage information provided by customers and developers
      • number of questions from clients trying to use the code
      • docker pulls
      • release binary downloads
      • tagged online resources: case studies, presentations, mentorship programs
    • production-readiness: is the current code base coherent enough to be usable in a real-world scenario? data that reflects this dimension:
      • release number (latest is 1.0.0 or later?)
      • test coverage
      • performance and reliability testing data
      • user documentation
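
As one example of turning the list above into concrete numbers, here is a minimal sketch of the "time to resolve PRs and issues" data point, using the public GitHub issues endpoint (which also returns PRs). The repo name and the single-page fetch are simplifications for illustration.

```python
# Minimal sketch: median time-to-close for recently closed issues/PRs.
# Only the first page of results is fetched, for brevity.
from datetime import datetime
from statistics import median
import requests

def median_days_to_close(owner: str, repo: str) -> float:
    url = f"https://api.github.com/repos/{owner}/{repo}/issues"
    resp = requests.get(url, params={"state": "closed", "per_page": 100})
    resp.raise_for_status()
    days = []
    for item in resp.json():
        opened = datetime.fromisoformat(item["created_at"].rstrip("Z"))
        closed = datetime.fromisoformat(item["closed_at"].rstrip("Z"))
        days.append((closed - opened).total_seconds() / 86400)
    return median(days)

print(f"median days to close: {median_days_to_close('hyperledger', 'fabric'):.1f}")
```
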
@ryjones
Member

ryjones commented Feb 3, 2022

@tkuhrt
Contributor

tkuhrt commented Feb 4, 2022

A proposal was made to pre-populate the project quarterly report template with project health data. This would relieve project teams of some of the stress of filling out the reports, because each report would already come with useful data. Equally important, it gives the TSC members a standard set of data dimensions to review in order to properly evaluate the health of each project.

I would suggest that we do not want to replace project reports with a completely pre-populated report. There is still value in having the maintainers complete the rest of a project report themselves, as can be seen in The Importance of TSC Quarterly Project Updates.

These are the things that I believe we are trying to do:

  1. Provide a form for gathering metrics that would save the maintainer from having to find the right link on Insights for the report.
  2. Provide a static view of the metrics at the time of the report generation that does not change when viewed at a later date.
  3. Simplify the project report while still providing enough information for the TSC to understand the health of the project, as well as what is happening within the project that is interesting to others in the community.

@davidwboswell

For me the big question is what we should pull from these data sources. We could generate dozens of graphs and charts from Insights and GitHub, but that's not going to be helpful. Some of the available data doesn't seem all that relevant (like the top emojis used in chat), and some could even be counter-productive to include in reports (for instance, tracking lines of code added seems like it could be problematic for a variety of reasons).

This could be a chance to take another look at the Project Badging document that was being worked on last year. It lists a set of things TSC members thought were important to look at. For instance, what is the contributor diversity for a project, do projects have sufficient test coverage, are projects using their official channels, how responsive are projects to new contributions, etc. Not everything there could be generated automatically, but it could give us a starting list of things we'd like to see in an automated report.

https://wiki.hyperledger.org/display/~shemnon/Project+Badging+Proposal

And for another link to check out, I wrote an article about community metrics several years ago that summarized my experience of trying to pull together reports of community health at Mozilla. There are some thoughts and recommendations there worth considering.

https://cmxhub.com/community-health-metrics-retention-diversity-maturity/

@tkuhrt
Contributor

tkuhrt commented Feb 4, 2022

David's link to his article reminded me that I had put together the following requirements document when I originally started working on the problem of determining community health, in case that helps in any way.

@arsulegai
Member

I see the issue as a question of "who is interested in what in the system". We have multiple ways to get the data/metrics information, but there is no flexibility to apply intelligence to it. At present, TSC and community members analyze the available metrics information to the best of their ability.

Looking at the concern here, it would help if the TSC could arrive at a set of intelligent metrics to measure a project's health and figure out outliers in the available dataset. Better still if we could use BI dashboards that would allow us to customize graphs to each individual's interests.

LFX Insights does provide such features to some extent. For instance, it calls out the percentage increase in contributor count, the average PR wait time in the last quarter, increases/decreases in regular contributors, etc. However, having more flexibility through intelligent tooling would be great.

In terms of quarterly reports at Hyperledger, maintainers can continue to provide the quarterly dataset available to them through one of the sources (LFX Insights, the contributor reports, or any others for that matter).

@jimthematrix
Contributor Author

jimthematrix commented Mar 1, 2022

So the first order of business is defining the dimensions the TSC would like to use to properly judge the health of a project.

Here is an attempt to summarize the inputs I've received so far (thanks David, Tracy, and Ry for providing the sources of information written on this topic in the past).

It looks like we believe a healthy project, regardless of the lifecycle stage it is in (incubating, active, graduated), should have the following attributes (note that these are different from measurements, which are the concrete data to be presented in order to properly demonstrate the attributes):

  • community
    • growth: both in terms of new interested individuals and conversion to contributor
    • diversity: no single organization keeps the project alive
    • retention: interesting/useful projects attract contributors, healthy projects retain them
    • maturity: I'm not able to properly articulate this one, maybe someone can help here?
  • code
    • usefulness: is the project being adopted by customers and tire kickers?
    • completeness: is the current code base coherent enough to be usable in a real-world scenario?

Would love for folks to comment on the list above so we can all establish a common baseline. Then we can proceed to decide what types of data are needed, recognizing that not all dimensions above can necessarily be ascertained.

@tkuhrt
Contributor

tkuhrt commented Mar 1, 2022

I like the first three bullet points under community. I believe that trend data will help with visualizing these. For maturity, this might imply the ability to do releases at a regular frequency. I also wonder if there is something around answering questions, resolving issues, and reviewing PRs that falls into this category as well.

For the code section, I am not sure how you would measure either of these items. usefulness is a subjective term and would mean different things to different people. completeness could imply that there is no roadmap because the functionality is "complete". Both are hard to measure because of their subjectiveness.

@hartm

hartm commented Mar 1, 2022

To follow up on @tkuhrt 's point on usefulness: one thing that I think we don't necessarily do well is track where codebases are used. A lot of code in Hyperledger gets used by people who aren't connected to the project itself and don't contribute. If we had a nice way of recording where things are used (including places many people may not know about), it might be a good way to measure "usefulness."

@hartm

hartm commented Mar 1, 2022

I also think it might make sense to redefine "completeness" as something like "production-ready." Production readiness would seemingly capture the current definition of completeness and could also be used to incorporate things that we'd like projects to do that aren't captured by the earlier criteria (e.g. security audits). What does everyone think about this?

@petermetz
Member

I also think it might make sense to redefine "completeness" as something like "production-ready." Production readiness would seemingly capture the current definition of completeness and could also be used to incorporate things that we'd like projects to do that aren't captured by the earlier criteria (e.g. security audits). What does everyone think about this?

+1

@petermetz
Member

petermetz commented Mar 2, 2022

One more way to measure maturity is by the number of users running the project in production. I know of course that this is impossible to measure completely accurately, but we as the TSC could at least advise projects to create issues like my favorite k8s project does (metallb/metallb#5), where they ask people to disclose the fact that they are using the project.

One more thing I've noticed projects doing is including the names/logos of users (companies/businesses) in their readme (if the users agree to be featured, of course). It makes the project readme more convincing, and it could be useful information for the TSC in terms of maturity.

@davidwboswell

For maturity, I think it is helpful to state where in the project lifecycle a project is, since that puts the other metrics in perspective. For instance, a brand new project that has recently started incubation will very likely not have a diverse set of maintainers yet, and that is understandable. A graduated project without a diverse set of maintainers, though, would be more cause for concern. The article I referenced called this out as something important to measure since not all open source projects have defined a project lifecycle, although Hyperledger's project lifecycle is well defined.

One other community attribute to consider calling out is 'responsiveness'. For instance, how long does a new contributor wait for feedback on a potential contribution or question?

And I like Hart's comment about usefulness being measured by where we see a project being used. We'll never know everywhere an open source project is being used, but we can make better use of the information we do have. For instance, we have a case study library that documents how certain Hyperledger projects are being used and people who are building things with projects regularly present about them to SIGs and at meetups. Including recent case studies or presentations about a project could be useful to add in reports. That could be automated too if we are consistent in tagging this content.

@davidwboswell

I was recently looking at how the Eclipse Foundation presents information about their projects, and I thought this would be helpful input for this conversation. In their project listings it looks like they present automated metrics for each project. You can see an example at:

https://projects.eclipse.org/projects/tools.tracecompass/who

This shows some diversity metrics, which seem useful, and it also makes it very easy to see who is leading the project and who else is involved, which seems very useful too. This data about who is involved isn't always that discoverable for our projects.

I don't necessarily agree with the usefulness of having a commit activity graph, though -- seeing how many commits have happened doesn't seem all that informative and doesn't say much about project health. Commit counts could go down for legitimate reasons -- for instance, maybe someone who made frequent small commits was replaced by someone contributing the same amount who prefers less frequent, larger commits.

The full list of Eclipse projects is at:

https://projects.eclipse.org/

@jimthematrix
Contributor Author

jimthematrix commented Mar 2, 2022

Agree with the comments above that "usefulness" is an important dimension and that we need to be creative about how to capture the data. I like the suggestions offered so far:

  • a dedicated project-specific issue where developers can report where the project is used
  • featured user logos in the main readme
  • tagging Hyperledger-owned online resources (case studies, presentations, etc.)

Thanks for suggesting "production-readiness" Hart. I agree it captures a wider range of concerns than completeness.

I've updated the list in the main issue description.

@tkuhrt added and then removed the task-force-proposal (Task Force Proposal) label on Jan 26, 2023