
Case Study: Gaining Diverse Community Insights (Mozilla) #1

Closed
emmairwin opened this issue Aug 12, 2017 · 11 comments

Comments

@emmairwin

Best practices, standards, recommendations, quotes evolved from 3 months of Diversity & Inclusion research in Mozilla's communities.

A summary from existing sources (blog posts below, with two more still on their way):
https://medium.com/mozilla-open-innovation

I also have a workshop, 'Imposter Syndrome in Open Source' (but be warned: it challenges the idea of meritocracy, which I see another chapter seems to recommend).

@semioticrobotic
Member

This is lovely, @emmairwin. We'd be delighted to see a draft of this chapter/case study for the book. In general and wherever possible, case studies should cover:

  • The problem/issue the organization experienced
  • The solution, based on open principle(s), that the organization implemented
  • The outcomes or results of this implementation

You and I can riff on that structure, too, to help your specific vision take shape.

As for the workshop: Yes, we would be interested in seeing that, too! "Meritocracy" is a pervasive notion in open source, but it's far from being a universally accepted one.

@emmairwin
Author

emmairwin commented Aug 13, 2017 via email

@emmairwin
Author

emmairwin commented Aug 14, 2017 via email

@semioticrobotic
Member

Thanks, @emmairwin. It may still be. Even though we are indeed looking for case studies from organizations that have completed certain initiatives, perhaps there's a story here about designing the assessment tool or interview protocol itself. How did you produce it and determine the object of your analysis? Was that initiative community-focused, etc.?

@emmairwin
Author

emmairwin commented Aug 14, 2017

Perhaps the framing might be "diverse methods for conducting research in open communities".

Roughly:

Problem: To generate a strategy for diversity & inclusion in Mozilla's communities, we need to understand what we mean by diversity & inclusion in a global context: across languages, gender identities, cultural identities, legal contexts, and many dimensions we are simply not aware of.

The solution, based on open principle(s), that the organization implemented

  • Qualitative methods: interviews and focus groups, conducted in person, via text chat, via video, and in person or on video in participants' native languages
  • Quantitative methods: creating a baseline by evaluating data sources for D&I insights

The outcomes or results of this implementation

  • Things we learned (e.g., text chat isn't only for low-bandwidth participants; introverts and non-English speakers also chose this option)
  • A playbook for generating diverse insights into your open community
  • Metrics that matter: standards & best practices for building effective baselines and evaluating D&I in your community (work in progress)

The strategy resulting from this research comes out toward the end of September. If this feels valuable, I can propose a draft; if it doesn't feel complete enough, that's fine as well, and I'll keep it in mind for the future.

@semioticrobotic
Member

This is great, @emmairwin! Spectacular. Having this in the book would be lovely. Assuming you are the author, I will make sure our working table of contents is up to date. Please send a draft whenever you have one.

Thanks!

@emmairwin emmairwin changed the title Unit 2 - Inclusivity Inclusivity - Gaining Diverse Community Insights Aug 15, 2017
@yevster

yevster commented Aug 16, 2017

An idea for an exercise under this chapter: a Bias Defense Checklist when evaluating contributions.

The idea would be to enumerate a number of known biases that an evaluator might have prior to considering a contribution (e.g., a project administrator evaluating a PR or even assigning a priority to an issue). For each item, the reader would write +1 if the item is likely to bias them in favor of the contributor, -1 if it is likely to bias them against the contributor, and 0 if no bias is likely.

For instance:

  • Has this contributor ever helped or done even a small favor for you? (+1 if yes, 0 if no) [Bias: reciprocity]
  • Have you ever helped out or done even a small favor for this contributor? (+1 if yes, 0 if no) [Bias: "foot in the door"]
  • Is this person good-looking? (+1 if yes, -1 if no, 0 if unknown) [Bias: halo effect]
  • Is the person of the same or similar gender, age, and/or race as you? (+1 if yes, 0 if no) [Bias: similarity heuristic]
  • Is the person male? (+1 if yes, 0 if no) [Bias: cultural image of the successful contributor]
  • Have you previously expressed an opinion on the value of the contribution or the skill of the contributor? (+1 if the prior opinion was positive, -1 if it was negative, 0 if no opinion was expressed) [Bias: commitment/cognitive dissonance avoidance]

And so on. Obviously, there's not a lot of rigor in knowing the relative strength of one bias compared to another. The goal is not so much to compute a net result as to be reminded of biasing factors before evaluating a contribution.
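The checklist could even be captured as a small script a reviewer runs through before touching a contribution. This is only a hedged sketch of the idea from this thread; the item wording, the `CHECKLIST` structure, and the `run_checklist` helper are my own illustrative naming, not an existing tool:

```python
# A sketch of the Bias Defense Checklist as data plus a scoring helper.
# Each item pairs a self-check question with the bias it guards against.
CHECKLIST = [
    ("Has this contributor ever helped you, even in a small way?",
     "reciprocity"),
    ("Have you ever helped this contributor, even in a small way?",
     "foot in the door"),
    ("Do you find this person good-looking?",
     "halo effect"),
    ("Is this person of a similar gender, age, and/or race to you?",
     "similarity heuristic"),
    ("Is this person male?",
     "cultural image of the successful contributor"),
    ("Have you already expressed an opinion on this contribution or contributor?",
     "commitment / cognitive dissonance avoidance"),
]

def run_checklist(answers):
    """answers: one of +1, 0, or -1 per checklist item.

    Returns (net_score, flagged_biases). As noted above, the flagged
    list matters more than the net number, since the relative strengths
    of different biases are not comparable."""
    if len(answers) != len(CHECKLIST):
        raise ValueError("one answer per checklist item, please")
    flagged = [bias for (_, bias), a in zip(CHECKLIST, answers) if a != 0]
    return sum(answers), flagged
```

A reviewer answering `[+1, 0, 0, +1, 0, -1]` would get back a small net score plus the names of the three biases they just admitted to, which is the real output of the exercise.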

@semioticrobotic
Member

Love the idea of a Bias Defense Checklist, @yevster, and would like to see more. One question:

The idea would be to enumerate a number of known biases that an evaluator might have prior to considering a contribution (e.g., a project administrator evaluating a PR or even assigning a priority to an issue). For each item, the reader would write +1 if the item is likely to bias them in favor of the contributor, -1 if it is likely to bias them against the contributor, and 0 if no bias is likely.

Do you think there's a way to "abstract" the exercise outside the context of software development/code contribution? I ask because ideally the handbook should be useful to all kinds of organizations.

@yevster

yevster commented Aug 17, 2017

Yes, of course! Sorry, I seem to have a "developer bias" where I associate "open" with "open source". But when I say "evaluating a contribution", I do mean a contribution in any domain from any member of an organization or community.

@semioticrobotic
Member

Fantastic, @yevster. Thanks for considering that feedback and direction. I will add you to the book's working table of contents! Expect to see a code push soon.

@semioticrobotic semioticrobotic changed the title Inclusivity - Gaining Diverse Community Insights Case Study: Gaining Diverse Community Insights (Mozilla) Aug 25, 2017
@semioticrobotic
Member

We've crossed our first milestone, @emmairwin and @yevster! That means we're now moving on to the writing stage of the project. Please feel free to drop me a line (here, by email, whatever works) with your questions, comments, and concerns as you go along. You can also send a note to bbehrens@redhat.com if you'd like me to add you to the author email list. Thanks, and happy writing!
