This repository provides a streamlined process for setting up and managing a self-hosted Turborepo remote cache on Google Cloud, using the `ducktors/turborepo-remote-cache` project.

- Clone this repository.
- Ensure Terraform is set up and the Resource Manager API is enabled on Google Cloud.
- Create a service account with the following roles:
  - Editor (or roles with sufficient permissions to create the necessary resources)
  - Secret Manager Admin
  - Security Admin
- Download the service account's private key (in JSON format) and place it in the root of this repository as `credentials.json`.
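The steps above can be scripted with gcloud; the project ID and service-account name below are placeholders, so substitute your own:

```shell
# Placeholder project ID and service-account name -- replace with your own.
PROJECT_ID="my-gcp-project"
SA_NAME="turborepo-cache-admin"
SA_EMAIL="${SA_NAME}@${PROJECT_ID}.iam.gserviceaccount.com"

# Enable the Resource Manager API
gcloud services enable cloudresourcemanager.googleapis.com --project "$PROJECT_ID"

# Create the service account
gcloud iam service-accounts create "$SA_NAME" --project "$PROJECT_ID"

# Grant the roles listed above
for ROLE in roles/editor roles/secretmanager.admin roles/iam.securityAdmin; do
  gcloud projects add-iam-policy-binding "$PROJECT_ID" \
    --member "serviceAccount:${SA_EMAIL}" \
    --role "$ROLE"
done

# Download the key into the repository root as credentials.json
gcloud iam service-accounts keys create ./credentials.json \
  --iam-account "$SA_EMAIL"
```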
- Authenticate the gcloud CLI:

  ```shell
  gcloud auth login --cred-file=./credentials.json
  ```
- Prepare your Terraform variables:

  ```shell
  cp terraform.tfvars.example terraform.tfvars
  # Edit the copied file (terraform.tfvars) with your details
  ```
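A filled-in `terraform.tfvars` might look like the sketch below; apart from `turbo_token`, which this README references in the final step, the variable names are illustrative and should match whatever the repository's `variables.tf` actually declares:

```hcl
# Illustrative values -- align the names with the variables declared in variables.tf.
project_id  = "my-gcp-project"
region      = "us-central1"

# Token that turbo clients must present when reading or writing the cache
turbo_token = "a-long-random-secret"
```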
- Initialize and apply the Terraform configuration:

  ```shell
  cd terraform
  terraform init
  terraform plan
  terraform apply
  ```
- Pull the `fox1t/turborepo-remote-cache` image from Docker Hub and push it to your Artifact Registry:

  ```shell
  # Only the amd64 image works
  docker pull fox1t/turborepo-remote-cache:(tag)@(digest-of-amd64-image)
  docker tag fox1t/turborepo-remote-cache:(tag)@(digest-of-amd64-image) (artifact-registry-repository-location)/turborepo-remote-cache:(tag)
  docker push (artifact-registry-repository-location)/turborepo-remote-cache:(tag)
  ```
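To find the digest of the amd64 variant, you can inspect the image's manifest list; the `latest` tag here is only an example:

```shell
# Prints the manifest list for the tag; the entry whose platform is
# "amd64" carries the digest to use in the pull/tag commands above.
docker manifest inspect fox1t/turborepo-remote-cache:latest
```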
- Now your remote cache server is set up and ready to use!

  ```shell
  turbo run ${command} \
    --api=${cloud run url} \
    --team=${your team name} \
    --token=${turbo_token in terraform.tfvars}
  ```
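Instead of passing these flags on every invocation, the same values can be supplied through Turborepo's environment variables; the values below are placeholders for your own:

```shell
# Placeholders -- substitute your Cloud Run URL, team name, and token.
export TURBO_API="https://your-service-abc123.a.run.app"
export TURBO_TEAM="your-team-name"
export TURBO_TOKEN="the-turbo_token-from-terraform.tfvars"

# With these set, the flags can be omitted, e.g.:
#   turbo run build
```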
Note: If you push another image into the Artifact Registry, a new revision will be created automatically.
Enjoy your self-hosted Turborepo remote cache!
Any feature requests or pull requests are welcome!