Supercharge your Turborepo builds with our dedicated GitHub Actions caching service, designed to make your CI workflows faster and more efficient.
This GitHub Action provides an alternative to Turborepo's Vercel remote caching in CI/CD pipelines. While Vercel's official solution works well, this action offers some key advantages:
- No need for a Vercel account or tokens
- Works entirely within GitHub's ecosystem
- Reduces external service dependencies
- Free forever
The main technical difference lies in how caching is handled:

**Vercel's Approach**

- Uses a remote caching server hosted by Vercel
- Can become expensive for large monorepos with multiple apps/packages
- May have limitations based on your Vercel plan

**This Action's Approach**

- Simulates a remote caching server locally on `localhost:41230`
- Uses GitHub Actions' built-in caching system
- GitHub automatically removes old cache entries
This solution might be better when:
- You have a large monorepo with multiple apps/packages
- You want to avoid external service dependencies
- You need more control over your caching strategy
- You want to leverage GitHub's existing infrastructure
However, if you're already using Vercel and their remote caching works well for your needs, there's no pressing need to switch. Both solutions are valid approaches to the same problem.
Easily integrate our caching action into your GitHub Actions workflow by adding the following step before you run `turbo build`:

```yaml
- name: Cache for Turbo
  uses: rharkor/caching-for-turbo@v2.1.3
```
This GitHub Action facilitates:

- **Server Initialization**: Automatically spins up a server on `localhost:41230`.
- **Environment Setup**: Sets up the `TURBO_API`, `TURBO_TOKEN`, and `TURBO_TEAM` environment variables required by `turbo build`.
- **Efficient Caching**: Leverages GitHub's cache service to significantly accelerate build times.
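Putting it together, a minimal workflow might look like this (a sketch; the job and step names are illustrative, and your install/build commands may differ):

```yaml
name: CI

on: [push]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - uses: actions/setup-node@v4
        with:
          node-version: 20

      # Must run before any turbo command so the cache server is available
      - name: Cache for Turbo
        uses: rharkor/caching-for-turbo@v2.1.3

      - name: Install dependencies
        run: npm ci

      - name: Build
        run: npx turbo build
```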
You can also use this package as a global dependency to run the cache server locally during development. This allows you to use the same caching infrastructure (including S3) that you use in CI.
Add this package as a global dependency:
```bash
npm install -g @rharkor/caching-for-turbo
```
The package provides a `turbogha` binary that you can use to start the cache server:

```bash
# Start the server in background mode (recommended for development)
turbogha start

# Or run the server in foreground mode
turbogha start --foreground
```
To stop the server, you can use the following command:

```bash
turbogha kill
```

To ping the server, you can use the following command:

```bash
turbogha ping
```
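In scripts, `turbogha ping` can be used to check whether the server is already up before starting it. For example (a sketch; `server_status` is a hypothetical helper name, not part of the package):

```bash
# Hypothetical helper: report whether the local turbogha cache server
# is reachable (assumes the package was installed globally with npm).
server_status() {
  if command -v turbogha >/dev/null 2>&1 && turbogha ping >/dev/null 2>&1; then
    echo "server is up"
  else
    echo "server is not running"
  fi
}

server_status
```

A dev script could call `server_status` and run `turbogha start` only when the server is not already running.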
Create a `.env` file in your project root to configure the cache server:

```env
# Cache provider (github or s3)
PROVIDER=s3

# S3 Configuration (required when using s3 provider)
AWS_ACCESS_KEY_ID=secret # Or S3_ACCESS_KEY_ID
AWS_SECRET_ACCESS_KEY=secret # Or S3_SECRET_ACCESS_KEY
AWS_REGION=us-east-1 # Or AWS_DEFAULT_REGION or S3_REGION
AWS_ENDPOINT_URL_S3=https://s3.amazonaws.com # Or AWS_ENDPOINT_URL or S3_ENDPOINT
S3_BUCKET=my-bucket
S3_PREFIX=turbogha/

# Optional: Custom cache prefix
CACHE_PREFIX=turbogha_
```
Once the server is running, you can use Turbo with remote caching:

```bash
export TURBOGHA_PORT=41230
export TURBO_API=http://localhost:41230
export TURBO_TOKEN=turbogha
export TURBO_TEAM=turbogha

# Now run your turbo commands
turbo build
```
See: https://turborepo.com/docs/reference/system-environment-variables
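For local development, the exports above can be bundled into a small wrapper (a sketch; `with_turbogha` is a hypothetical helper name, not part of the package):

```bash
# Hypothetical wrapper: run any command with the environment variables
# turbo needs in order to reach the local turbogha cache server.
with_turbogha() {
  TURBOGHA_PORT=41230 \
  TURBO_API="http://localhost:41230" \
  TURBO_TOKEN=turbogha \
  TURBO_TEAM=turbogha \
  "$@"
}

# Example: show the variables a wrapped command would see
with_turbogha sh -c 'echo "$TURBO_API $TURBO_TEAM"'
```

You could then run `with_turbogha turbo build` without polluting your shell environment.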
To stop the cache server:

```bash
turbogha kill
# or
curl -X DELETE http://localhost:41230/shutdown
```
Customize the caching behavior with the following optional settings (defaults provided):

```yaml
with:
  cache-prefix: turbogha_ # Custom prefix for cache keys
  provider: github # Storage provider: 'github' (default) or 's3'

  # S3 Provider Configuration (read from environment variables if not provided)
  s3-access-key-id: ${{ secrets.S3_ACCESS_KEY_ID }} # S3 access key
  s3-secret-access-key: ${{ secrets.S3_SECRET_ACCESS_KEY }} # S3 secret key
  s3-bucket: your-bucket-name # S3 bucket name
  s3-region: us-east-1 # S3 bucket region
  s3-endpoint: https://s3.amazonaws.com # S3 endpoint
  s3-prefix: turbogha/ # Optional prefix for S3 objects (default: 'turbogha/')
```
By default, this action uses GitHub's built-in cache service, which offers:
- Seamless integration with GitHub Actions
- No additional setup required
- Automatic cache pruning by GitHub
For teams requiring more control over caching infrastructure, the action supports Amazon S3 or compatible storage:
- Store cache artifacts in your own S3 bucket
- Works with any S3-compatible storage (AWS S3, MinIO, DigitalOcean Spaces, etc.)
- Greater control over retention policies and storage costs
- Useful for larger organizations with existing S3 infrastructure
Note that cached files are stored forever by default. Setting `max-size` (or another cleanup option) is strongly recommended to avoid unexpected storage costs.
Example S3 configuration:

```yaml
- name: Cache for Turbo
  uses: rharkor/caching-for-turbo@v2.1.3
  with:
    provider: s3
    s3-bucket: my-turbo-cache-bucket
```
To prevent unbounded growth of your cache (especially important when using S3 storage), you can configure automatic cleanup using one or more of these options:

```yaml
with:
  # Cleanup by age - remove cache entries older than the specified duration
  max-age: 1mo # e.g., 1d (1 day), 1w (1 week), 1mo (1 month)

  # Cleanup by count - keep only the specified number of most recent cache entries
  max-files: 300 # e.g., limit to 300 files

  # Cleanup by size - remove oldest entries when total size exceeds the limit
  max-size: 10gb # e.g., 100mb, 5gb, 10gb
```
When using the GitHub provider, the built-in cache has its own pruning mechanism, but these options can still be useful for more precise control.
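For example, a sketch keeping the default GitHub provider while pruning entries older than a week:

```yaml
- name: Cache for Turbo
  uses: rharkor/caching-for-turbo@v2.1.3
  with:
    max-age: 1w
```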
For S3 storage, implementing these cleanup options is highly recommended to control storage costs, as S3 does not automatically remove old cache entries.
Example with cleanup configuration:

```yaml
- name: Cache for Turbo
  uses: rharkor/caching-for-turbo@v2.1.3
  with:
    provider: s3
    s3-bucket: my-turbo-cache-bucket
    # Cleanup configuration
    max-age: 2w
    max-size: 5gb
```
1. Start the development server:

   ```bash
   npm run dev-run
   ```

2. In a separate terminal, execute the tests:

   ```bash
   npm test -- --cache=remote:rw --no-daemon
   ```

3. Clean up after testing:

   ```bash
   npm run cleanup
   ```
Licensed under the MIT License. For more details, see the LICENSE file.
This project is inspired by dtinth and has been comprehensively rewritten for enhanced robustness and reliability.