background-check

A GitHub App built with probot that performs a "background check" to identify users who have been toxic in the past, and shares their toxic activity in the maintainer's discussion repo.


Demo

How to Use

  • Go to the GitHub App page.
  • Install the GitHub App on your repos.
  • You'll get an invitation to a private repo; accept it and add the other maintainers to the repo as well.

FAQ

1. How does the bot find the background?

The bot listens to comments on repos where it is installed. When a new user comments, the bot fetches that user's public comments and runs a sentiment analyser on them. If 5 or more comments stand out as toxic, the bot concludes that the user has a hostile background, and an issue is opened for this user in the probot-background-check/{your-name}-discussions private repo, so that the maintainers can review the toxic comments and discuss whether or not they would like to allow this hostile user to participate in their community.
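The thresholding step described above can be sketched as a small pure function. This is an illustrative sketch, not the app's actual code: the function name, default threshold, and score shape are assumptions; only the "5 or more toxic comments" rule comes from the FAQ.

```javascript
// Decide whether a user's comment history looks hostile.
// `scores` is an array of per-comment toxicity scores in [0, 1],
// such as the values a sentiment/toxicity analyser might return.
// A comment counts as toxic when its score reaches `toxicThreshold`
// (assumed default); the user is flagged when at least
// `minToxicComments` comments qualify, per the FAQ's rule of 5.
function isHostile (scores, toxicThreshold = 0.8, minToxicComments = 5) {
  const toxicCount = scores.filter(score => score >= toxicThreshold).length
  return toxicCount >= minToxicComments
}
```

If the function returns `true`, the bot would go on to open the discussion issue with the flagged comments in its description.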

2. What happens if the sentiment analysis is incorrect?

In the case of false positives, where the sentiment analysis flags comments as toxic when they are not, the discussion issue is still created. Since the bot posts the flagged comments in the issue description, the maintainers can verify their toxicity themselves and close the issue if they find the sentiment analysis incorrect.

3. Why does the app maintain a separate org for discussions?

The discussion about a user who has been hostile in the past must be kept private, so that only maintainers can see it. Because not every account (individual or org) has access to private repos, the app instead uses its own org. Whenever the app is installed, a private repo for the maintainer's account gets created in the org and the installer is added as a collaborator. This way discussions can be held privately.
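The naming scheme that ties the two answers together could look roughly like this. The helper below is a hypothetical sketch; the org name and the `-discussions` suffix follow the `probot-background-check/{your-name}-discussions` pattern mentioned in FAQ 1.

```javascript
// Build the full name of the private discussion repo created in the
// app's own org when a maintainer installs the app. Per the FAQ,
// repos live under the probot-background-check org and the repo name
// is the installer's login plus a "-discussions" suffix.
function discussionRepoFullName (installerLogin, org = 'probot-background-check') {
  return `${org}/${installerLogin}-discussions`
}
```

For example, an installer named `octocat` would get `probot-background-check/octocat-discussions`.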

How To Contribute

1. Set up the project on your development machine

  • Fork this repo.
  • Clone the forked repo to your development machine.
  • cd into the repo directory (probably cd background-check).
  • Run npm i to install the dependencies.

2. Set up the environment

  • Run cp .env.example .env.
  • Open the .env file.
  • Generate an API key for the Perspective API.
  • Paste this API key against PERSPECTIVE_API_KEY in the .env file.
  • Create an org for the app.
  • Create a personal access token and paste it against GITHUB_ACCESS_TOKEN in the .env file.
  • Create a GitHub App and follow these instructions.
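After the steps above, your .env file would look roughly like this. The two variable names come from the steps above; the values shown are placeholders, not real credentials, and .env.example may define additional variables not shown here.

```shell
# .env — placeholder values, replace with your own credentials
PERSPECTIVE_API_KEY=your-perspective-api-key
GITHUB_ACCESS_TOKEN=your-personal-access-token
```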

Run npm start to check that the GitHub App runs correctly on your dev machine. After this, create a branch, make your changes, run the tests, commit the changes, and open a PR.

Common CLI commands

To make development of the project faster, the following CLI commands are provided.

# Install dependencies
npm install

# Run the bot
npm start

# Run bot in dev mode which watches files for changes
npm run dev

# Run Unit Tests
npm test

# Run Unit Tests in watch mode
npm run test:watch

# Run linter and fix the issues
npm run lint

# Serve documentation locally
npm run docs:serve

# Run sandbox
npm run sandbox -- --sandboxName

E.g. npm run sandbox -- --getCommentsOnIssue