Some other open-source projects are adding sections to their contributing guides that place restrictions on the use of generative AI in comments and PRs. For example, here is what the Restrictions on Generative AI usage section of Matplotlib's contributing guide states:
We expect authentic engagement in our community.
- Do not post output from Large Language Models or similar generative AI as comments on GitHub or our discourse server, as such comments tend to be formulaic and low content.
- If you use generative AI tools as an aid in developing code or documentation changes, ensure that you fully understand the proposed changes and can explain why they are the correct approach.
Make sure you have added value based on your personal competency to your contributions. Just taking some input, feeding it to an AI and posting the result is not of value to the project. To preserve precious core developer capacity, we reserve the right to rigorously reject seemingly AI generated low-value contributions.
So far we haven't had any case of contributors posting comments or submitting PRs that are clearly AI-generated (update 2026-04-21: we now have), but the chances of receiving one in the future are not slim.
I think it would be good to have a similar statement in our Contributing Guide / Code of Conduct, for two reasons:
- It raises awareness that AI-generated text or code might save the contributor typing time, but it shifts the load to the maintainers, who will take the content seriously and spend time and effort reviewing it and replying.
- It could allow us to enforce the CoC in more extreme cases.
I would like to hear what the rest of you think about this.