
Type4Py's Local Model


Requirements

To use Type4Py's local model, the minimum system requirements are as follows:

  • Docker
  • At least 4 CPU threads and 4 GB of free memory (or more).

Note: If you use Docker Desktop, allocate the above resources in its preferences. See here for more info.

Installation

Type4Py's pre-trained model is provided as a Docker image, which can be queried locally without sending requests to our central server, type4py.com. To use the local model, simply follow the two steps below:

1- Pull Type4Py's Docker image on your system:

docker pull ghcr.io/saltudelft/type4py:latest

2- Run the Docker image:

docker run -d -p 5001:5010 -it ghcr.io/saltudelft/type4py:latest
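Before querying the model, you can confirm that the container is up and accepting connections. The snippet below is a minimal sketch in Python, assuming the default host port mapping of 5001 shown in the command above; it only verifies that something is listening on that port.

```python
# Minimal sketch: verify that the local Type4Py server accepts connections
# on the mapped host port (5001 in the docker run command above).
import socket

HOST, PORT = "localhost", 5001

try:
    # Open a TCP connection and close it immediately; raises OSError
    # (e.g. ConnectionRefusedError) if nothing is listening yet.
    with socket.create_connection((HOST, PORT), timeout=3):
        print(f"Type4Py local model is listening on {HOST}:{PORT}")
except OSError as err:
    print(f"Server not reachable yet ({err}); give the container a few seconds to start.")
```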

The local model can now be queried via its REST API at localhost:5001. Check out the guide on getting type information for a Python file using Type4Py's REST API here.
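As an illustration, a Python file's contents can be POSTed to the local server from Python. This is a minimal sketch, assuming the local image exposes the same /api/predict endpoint and tc query parameter as the public REST API described in the guide above; example.py is a placeholder for any Python source file.

```python
# Minimal sketch: send a Python source file to the local Type4Py model and
# print the returned type predictions as JSON.
# Assumes the local server exposes the same /api/predict endpoint (with the
# `tc` query parameter) as the public REST API; adjust per the REST API guide.
import json

import requests  # pip install requests

API_URL = "http://localhost:5001/api/predict"

with open("example.py") as f:  # placeholder: any Python source file
    source = f.read()

# tc=0 skips type checking of the predictions, mirroring the public API's usage
response = requests.post(API_URL, params={"tc": 0}, data=source)
response.raise_for_status()

print(json.dumps(response.json(), indent=2))
```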

Note: Concerning privacy, the local model runs fully on your machine and no data is sent to any external server.

Usage in VS Code

After installation, the dockerized local model can also be used with the Type4Py VS Code extension. Check out this tutorial video.
