Text toxicity classification on VS Code

A simple project that puts together a VS Code extension, Express, TensorFlow.js, and a text toxicity classifier.

Just for fun. :P

Modules

  1. client: Activates the VS Code extension by initializing the express server and establishing a connection with the language server.

  2. server: Language server responsible for validating the text document and reporting diagnostics back to the client. It also includes a simple express server that classifies text toxicity through POST requests.
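The bridge between the two modules is the step where classifier output becomes editor diagnostics. A minimal sketch of that mapping, assuming the prediction shape of the `@tensorflow-models/toxicity` package (one entry per label, with a `match` flag per input; the `predictionsToDiagnostics` helper name is illustrative, not from this repository):

```javascript
// Convert toxicity-model predictions into VS Code-style diagnostics.
// Each prediction: { label, results: [{ probabilities, match }] },
// where `match` is true/false/null (null = below the confidence threshold).
function predictionsToDiagnostics(predictions) {
  return predictions
    .filter((p) => p.results.some((r) => r.match === true))
    .map((p) => ({
      severity: "Warning", // toxic findings surface as warnings in the editor
      message: `Toxic content detected: ${p.label}`,
    }));
}

// Stubbed predictions, standing in for a real model.classify() result:
const sample = [
  { label: "insult", results: [{ probabilities: [0.1, 0.9], match: true }] },
  { label: "threat", results: [{ probabilities: [0.8, 0.2], match: false }] },
];
console.log(predictionsToDiagnostics(sample));
// → one warning diagnostic, for the "insult" label
```

In the actual extension, the language server would attach ranges to each diagnostic and push them to the client over the language server protocol.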

Running the example

  • Open this example in VS Code 1.43+
  • In the terminal, execute yarn run init && yarn run build:fast
  • Press F5 to start debugging

Open a text file with some content, or type something.

VS Code will detect whether text contains toxic content such as threatening language, insults, obscenities, identity-based hate, or sexually explicit language.

Note: The model is loaded the first time a text document is opened. This operation takes a few seconds.
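A load-once pattern like the sketch below avoids paying that cost again on subsequent documents. This is an assumption about how such a server would cache the model, not code from this repository; `loadModel` stands in for something like `toxicity.load(threshold)`, and memoizing the promise (rather than the resolved model) means concurrent requests share a single in-flight load:

```javascript
// Cache the model-loading promise so the model is loaded exactly once.
let modelPromise = null;

function getModel(loadModel) {
  if (!modelPromise) {
    modelPromise = loadModel(); // first call kicks off the (slow) load
  }
  return modelPromise; // later calls reuse the same promise
}

// Stubbed usage: the loader runs only once across repeated calls.
let loads = 0;
const fakeLoad = () => {
  loads += 1;
  return Promise.resolve({ classify: async () => [] });
};
getModel(fakeLoad);
getModel(fakeLoad);
console.log(loads); // 1
```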

Troubleshooting

Execute the following command if you run into issues when loading the model:

$ npm rebuild @tensorflow/tfjs-node --build-from-source

Then, build the packages with yarn again.
