Providing Guidance to Working Groups Around Privacy #294
Comments
It was also discussed that the new Community Operations Team should have a hand in moving forward with this, since it touches on all metrics across Working Groups. cc @germonprez
A starting point for the text of the guidance document:

# Privacy Guidance for Working Groups

Metrics compilation and publication may lead to privacy violations. Each metric must be examined for potential data ethics problems. Data relevant to a metric may have heightened sensitivity, and working groups should consider sensitivity levels. Organizations may be exposed to risks. These risks may flow from compliance with the GDPR in the EU, with state law in the US, or with other regulations. These risks may also flow from terms of service for data providers such as GitHub and GitLab.
We need to provide guidance to consumers of metrics (either implementors or consumers of implementations) as well as to working groups creating metrics.
Signed-off-by: Lucas Gonze <lucas@gonze.com>
Thanks for the draft @lucasgonze. I think this is a good start. I can also imagine that at some point we provide a list of ethical concerns, and WGs can check which ones might apply to any given metric.
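Purely as a hypothetical sketch of what such a checklist could look like in tooling, the snippet below encodes a few concerns drawn from the draft above (PII exposure, regulatory risk, data-provider terms of service). The concern names and the `flag_concerns` helper are invented for illustration and are not an adopted CHAOSS format.

```python
# Hypothetical sketch of a per-metric ethics checklist; the concern names and
# helper are illustrative only, not an adopted CHAOSS format.

ETHICAL_CONCERNS = {
    "pii_exposure": "Output can contain personally identifiable information",
    "reidentification": "Aggregates could be combined to re-identify individuals",
    "regulatory_risk": "Collection or publication may conflict with the GDPR, US state law, or other regulations",
    "provider_tos": "Data use may violate terms of service of providers such as GitHub or GitLab",
}

def flag_concerns(metric_name: str, applicable: set) -> None:
    """Print the checklist for one metric, marking which concerns apply."""
    unknown = applicable - ETHICAL_CONCERNS.keys()
    if unknown:
        raise ValueError(f"unknown concern(s): {unknown}")
    print(f"Ethics checklist for metric: {metric_name}")
    for key, description in ETHICAL_CONCERNS.items():
        mark = "x" if key in applicable else " "
        print(f"  [{mark}] {key}: {description}")

# Example: a hypothetical metric that publishes contributor names and emails.
flag_concerns("new-contributors", {"pii_exposure", "provider_tos"})
```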
With regard to Silona's comments today on specific regulations in the EU, I have added a "# Relevant Regulations" section for gathering data on concrete restrictions. |
I'm closing this, as the privacy/ethics statement has been added to all metrics -- even the Chinese/Mandarin release.
During multiple discussions on our weekly Community Call and at CHAOSScon, the idea surfaced that we should provide guidance to our Working Groups on considering privacy and ethics while developing metrics. For instance, our metrics sometimes include PII (Personally Identifiable Information), and we feel it's important to flag this for anyone using such a metric in practice (a sketch of what that might look like follows at the end of this note).
This "guidance" will consist of 3 things:
We will also include a data privacy review of previously released metrics when doing our Metrics Audit for next release.
(For context, we have a similar framework for thinking about how a metric might impact DEI).
Thanks to @lucasgonze for championing this!
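To make the PII point above concrete, here is a minimal sketch (assuming Python, with invented field names) of one way an implementer might flag and pseudonymize PII-bearing fields before publishing metric output. Salted SHA-256 hashing is just one illustrative choice, not a CHAOSS-prescribed method.

```python
# Minimal sketch: flag and pseudonymize PII-bearing fields before publishing
# metric output. Field names and the salting scheme are illustrative only.
import hashlib

PII_FIELDS = {"author_name", "author_email"}  # assumed PII-bearing fields

def pseudonymize(record: dict, salt: str) -> dict:
    """Replace PII values with short, salted SHA-256 digests so counts and
    joins still work, while raw identities are not published."""
    out = dict(record)
    for field in PII_FIELDS & record.keys():
        digest = hashlib.sha256((salt + str(record[field])).encode()).hexdigest()
        out[field] = digest[:12]  # short pseudonym for readability
    return out

commit = {"author_name": "Jane Doe", "author_email": "jane@example.com", "lines_added": 42}
print(pseudonymize(commit, salt="per-release-secret"))
```

Note that publishing even pseudonymized records still carries re-identification risk, which is one reason each metric needs the kind of review described above.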