A template to help you run Retool locally via Docker Desktop
🐳 Docker Desktop: version 4.25+
In general settings:
- Choose `VirtioFS` so that the virtualization framework is enabled
- Enable "Use Rosetta for x86/amd64 emulation on Apple Silicon"
In resources:
- Allocate at least 4 CPUs and 8 GB of memory
🐙 Git installed, along with an SSH key configured for your GitHub account in the tryretool org.
💻 A Text Editor/IDE set up, and some comfort using a Terminal for CLI commands
- Use this template to create a new repo owned by your GitHub user, e.g. `my-local-docker`
- Clone that repo to your machine. If you don't already have one, create a folder in your home directory to help keep things organized, e.g. `~/local-retool`
- Open up that repo in your text editor of choice
This template is made up of some Dockerfiles, some Compose files, and a dynamic configuration file for Temporal (don't worry too much about this one).
There are two separate Compose files: one for a deployment with Workflows, and one without. I recommend working mostly with a non-Workflows deployment and spinning up the one with Workflows when needed.
- Set an image tag in the `Dockerfile`, and optionally in `CodeExecutor.Dockerfile` if you're spinning up Workflows with Python support
- Rename the template env file to `docker.env`, and set the `JWT_SECRET` and `ENCRYPTION_KEY`. Optionally, you can replace the `LICENSE_KEY` with your own (which is useful for testing feature flags)
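The two secrets can be any sufficiently long random strings. One way to generate them, assuming `openssl` is available on your machine:

```shell
# Generate two random 64-character hex strings for docker.env.
# Any other source of strong randomness works just as well.
echo "JWT_SECRET=$(openssl rand -hex 32)"
echo "ENCRYPTION_KEY=$(openssl rand -hex 32)"
```

Paste the output into `docker.env`. Note that changing `ENCRYPTION_KEY` later will make previously encrypted data (e.g. stored resource credentials) unreadable.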
Run the default compose.yaml: `docker compose up -d --build`
Pass a specific compose file to use: `docker compose -f compose-workflows.yaml up -d --build`
It typically takes around one minute for the deployment to be ready to handle requests, after which you can access it via localhost:3000.
To spin things down, run `docker compose down`
Spend some time getting familiar with the Docker Desktop UI, which will show the status of your services.
Primarily you'll use Docker Desktop to view container logs, see metrics/charts of your containers' CPU and memory usage, and exec into the containers themselves.
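If you prefer the terminal, the same tasks map to standard Docker CLI commands. The service name `api` below is an assumption; list the real service names with `docker compose ps`:

```shell
# Tail logs for a single service (service name "api" is an assumption)
docker compose logs -f api

# Live CPU/memory stats for all running containers
docker stats

# Open a shell inside a running service's container
docker compose exec api /bin/bash
```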
Third-party tools/apps to extend Docker Desktop functionality. I primarily use two:
ContainerWatch: Provides nice graphs for CPU and Memory usage for containers, and some nice logging.
Docker Debug Tools: Makes execing into containers and running commands more pleasant
One of the benefits of running locally is being able to easily inspect database state, and doing so is much easier with a GUI application (rather than running psql commands).
Install an app like Postico, TablePlus, or something from this list to better view your deployment data, see table structure, and run ad-hoc queries.
This template starts with a postgres container that keeps your data in a local volume via Docker Desktop. If you ever lose this volume (e.g. if you run `docker system prune --volumes`), then your data is gone.
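Before doing any risky cleanup, you can dump the database to a local file. The service name `postgres` and the database/user names below are assumptions; match them to the values in your `docker.env`:

```shell
# Dump the Retool database to a local SQL file before pruning anything.
# Service name "postgres" and DB/user names are assumptions -- check docker.env.
docker compose exec -T postgres \
  pg_dump -U postgres hammerhead_production > retool-backup.sql
```

Restoring is the reverse: feed the file back in with `psql` against a fresh database.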
Consider setting up a database on Neon, Render, Supabase, or another cloud service to manage your database for free or very cheap.
I personally pay for Neon (it ends up being $2 or so a month), primarily for their Branching feature, which comes in very handy for testing Retool things across versions where database state can potentially get messed up.
New versions of Retool may include changes to the schema of the postgres database, and these are implemented through DB Migrations.
It's important to keep these in mind when doing large jumps between Retool versions locally using the same postgres volume/instance, as Retool does not run down-migrations automatically when you downgrade.
- DB migration info is stored in the single-column `SequelizeMeta` table (the beginning digits of each migration name roughly translate to the date it was first created)
- You can copy a migration's name and search for it in Retool's codebase to find out what the migration is actually doing, along with a link to the PR, which should give more info on why it was created and what version it should be run in.
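To see which migrations have been applied to your instance, you can query the table directly from the postgres container. Service, database, and user names here are assumptions; match them to your `docker.env`:

```shell
# List the most recently applied migrations.
# Names begin with a date-like prefix, so they sort roughly chronologically.
docker compose exec -T postgres \
  psql -U postgres -d hammerhead_production \
  -c 'SELECT name FROM "SequelizeMeta" ORDER BY name DESC LIMIT 10;'
```

Comparing this list against a fresh deployment of your target version is a quick way to spot schema drift before a large version jump.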
Power up your local deployment by:
- Creating a GitHub or GitLab repo to set up Source Control
- Creating an Okta or Auth0 account to set up Custom SSO
- Adding NGINX to your configuration to set up SSL