ftl by fastertools

Faster tools for AI agents


Docs | Contributing | Releases


FTL is an open-source framework for building and running polyglot Model Context Protocol (MCP) servers. It's designed from the ground up to be fast, secure, and portable, using a modern stack of open standards.

We believe the future of AI tooling shouldn't be locked into proprietary ecosystems. FTL is our commitment to that vision, built entirely on:

  • WebAssembly (WASM): For secure, sandboxed execution with sub-millisecond cold starts.
  • The Component Model: To compose tools written in different languages (Rust, Python, Go, TS) into a single application.
  • Spin: The CNCF-hosted developer tool and runtime for building and running WASM applications.

This foundation ensures that what you build with FTL today will be compatible with the open, interoperable ecosystem of tomorrow.

Project Components

This monorepo contains everything you need to build and deploy AI tools:

  • ftl: CLI for managing FTL applications and deployments (Go)
  • MCP Components: Pre-built gateway and authorizer for secure MCP servers (Rust/WASM)
  • SDKs: Multi-language support for building AI tools (Python, Rust, TypeScript, Go)
  • Templates: Quick-start patterns for common use cases
  • Examples: Real-world applications demonstrating best practices

FTL vs. FTL Engine

  • ftl (This Repo): The open-source framework and CLI for building MCP servers that can run anywhere Spin apps are supported.
  • FTL Engine: Our optional, managed platform for deploying ftl applications to a globally distributed edge network, offering the simplest path to production.

Features

  • Polyglot by Design: SDKs for Python, Rust, TypeScript, and Go let you write tools in the best language for the job.
  • Seamless Composition: Mix and match tools written in different languages within a single MCP server.
  • Secure & Sandboxed: Each tool runs in an isolated WASM sandbox, with no access to the host system unless explicitly granted.
  • Run Anywhere: Deploy to any host compatible with Spin/Wasmtime.
  • MCP Compliant: Out-of-the-box support for Streamable HTTP and spec-compliant authorization.
  • Blazing Fast: Sub-millisecond cold starts and near-native performance, powered by Wasmtime.

Quick Start

Prerequisites

To build tools in different languages, you'll need their corresponding toolchains:

  • Rust: cargo (via rustup)
  • TypeScript/JavaScript: node and npm (via Node.js)
  • Python: python3 and componentize-py (install with pip install componentize-py)
  • Go: go and tinygo (via Go and TinyGo)

Installing and Updating

To install or update ftl, run the install script. Download and run it manually, or pipe it directly with curl or wget:

curl -o- https://raw.githubusercontent.com/fastertools/ftl/main/install.sh | bash
wget -qO- https://raw.githubusercontent.com/fastertools/ftl/main/install.sh | bash

Create a new project and scaffold a tool

ftl init fast-project

cd fast-project

ftl add fast-tool --language rust

Run the Local Development Server

ftl up --watch

→ Starting development server with auto-rebuild...

👀 Watching for file changes

Serving http://127.0.0.1:3000
Available Routes:
  mcp: http://127.0.0.1:3000 (wildcard)

Connect your MCP Client

Example mcp.json config

{
  "mcpServers": {
    "fasttools": {
      "url": "http://127.0.0.1:3000",
      "transport": "http"
    }
  }
}
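
Under the Streamable HTTP transport, MCP messages are plain JSON-RPC 2.0. As a rough sketch of what a client sends when it invokes a tool, here is the shape of a `tools/call` request an MCP client would POST to the server above (the tool name and arguments are illustrative, not part of the scaffolded project):

```python
import json

def make_tool_call(name, arguments, request_id=1):
    """Build a JSON-RPC 2.0 request body for MCP's tools/call method."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    }

# Illustrative tool name and arguments:
req = make_tool_call("fast-tool", {"text": "hello"})
print(json.dumps(req, indent=2))
```

The server replies with a JSON-RPC response (or an SSE stream, per the Streamable HTTP spec) whose `result` carries the tool's output content.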

Add To Claude Code

claude mcp add -t http fasttools http://127.0.0.1:3000

Ready to Deploy?

Deploying to FTL Engine

For the simplest path to a production-grade, globally-distributed deployment, you can use FTL Engine. It handles scaling, security, and distribution for you on Akamai's edge network.

First, join Discord to request early access.

Log in to FTL Engine

ftl eng login

Deploy

ftl eng deploy

▶ Deploying project to FTL Engine
→ Configuring MCP authorization settings...
✓ MCP authorization set to: public
✓ Deployed!

  MCP URL: https://8e264fc0-xxxx-aaaa-9999-9f5ab760092a.fwf.app

Architecture

FTL composes your individual tool components with our gateway and authorizer components into a single Spin application. All calls between components happen securely in-memory, eliminating network latency between your tools.

graph TB
    subgraph "MCP Clients"
        Desktops["Cursor, Claude, ChatGPT"]
        Agents["LangGraph, Mastra, ADK, OpenAI Agents SDK"]
        Realtime["11.ai, LiveKit, Pipecat"]
    end
    
    MCP["Model Context Protocol<br/>(Streamable HTTP)"]
    

      subgraph "Host"
          subgraph "Spin/Wasmtime Runtime"
              subgraph "FTL Application"
                  subgraph "FTL Components"
                      MCPAuth["MCP Authorizer"]
                      MCPGateway["MCP Gateway<br/>(Protocol, Routing, Validation)"]
                  end
                  
                  subgraph "User Tool Components"
                      Weather["Weather Tools<br/>(TS/JS)"]
                      Physics["Physics Tools<br/>(Rust)"]
                      Data["Data Tools<br/>(Python)"]
                      Custom["Fun Tools<br/>(Go)"]
                  end
              end
          end
      end
    
    Desktops -.->| | MCP
    Agents -.->| | MCP
    Realtime -.->| | MCP
    MCP -.->| | MCPAuth
    MCPAuth -.->|"Authorized requests (in-memory call)"| MCPGateway
    MCPGateway -.->|"In-memory call"| Weather
    MCPGateway -.->|"In-memory call"| Physics
    MCPGateway -.->|"In-memory call"| Data
    MCPGateway -.->|"In-memory call"| Custom
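
Concretely, this composition is expressed through a Spin application manifest. A hedged sketch of what such a manifest could look like (component IDs, source paths, and the route are illustrative, not FTL's actual generated output):

```toml
spin_manifest_version = 2

[application]
name = "fast-project"
version = "0.1.0"

# All MCP traffic enters through the authorizer on a wildcard route.
[[trigger.http]]
route = "/..."
component = "mcp-authorizer"

[component.mcp-authorizer]
source = "mcp_authorizer.wasm"
# Spin's local service chaining lets components call each other
# in-memory via *.spin.internal hosts, with no network hop.
allowed_outbound_hosts = ["http://*.spin.internal"]

[component.mcp-gateway]
source = "mcp_gateway.wasm"
allowed_outbound_hosts = ["http://*.spin.internal"]

# A user tool component (path is hypothetical).
[component.fast-tool]
source = "fast_tool/app.wasm"
```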

Fast Anywhere in the World

Example: A Python text processing tool called from 5 continents


Secure by design

Internal isolation and MCP-compliant authorization.

Each WebAssembly module executes within a sandboxed environment separated from the host runtime using fault isolation techniques.

A component is a WebAssembly binary (typically wrapping one or more core modules) that can interact with its environment only through explicitly imported and exported functions.

Allowed outbound hosts and accessible variables can be configured per individual tool component within a server.
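
In Spin-manifest terms, that per-tool configuration might look like the following sketch (the component name, host, and variable are hypothetical):

```toml
# Hypothetical tool component in a Spin 2.x manifest.
[component.weather-tool]
source = "weather_tool.wasm"
# This tool's sandbox may only open outbound connections to this host;
# all other network access is denied by default.
allowed_outbound_hosts = ["https://api.weather.example.com"]

# Only explicitly mapped variables are visible to the component.
[component.weather-tool.variables]
api_key = "{{ weather_api_key }}"
```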

Out-of-the-box support for configurable MCP-compliant authorization, including:

  • Spec-compliant OAuth 2.1 implementation
  • OAuth 2.0 Dynamic Client Registration Protocol (RFC 7591)
  • OAuth 2.0 Protected Resource Metadata (RFC 9728)
  • OAuth 2.0 Authorization Server Metadata (RFC 8414)
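
As an example of what RFC 9728 compliance means in practice: clients discover a protected resource's authorization metadata at a well-known URL derived from the resource URL by inserting the well-known segment between the host and the resource path. A small sketch of that derivation (the resource URL is illustrative):

```python
from urllib.parse import urlparse

def protected_resource_metadata_url(resource_url: str) -> str:
    """Derive the RFC 9728 well-known metadata URL for a protected resource."""
    p = urlparse(resource_url)
    # Per RFC 9728, the well-known path component is inserted between
    # the host and any resource path, not appended to the path.
    path = p.path.rstrip("/")
    return f"{p.scheme}://{p.netloc}/.well-known/oauth-protected-resource{path}"

print(protected_resource_metadata_url("https://example.fwf.app"))
# → https://example.fwf.app/.well-known/oauth-protected-resource
```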

Plug in your own JWT issuer with simple configuration.

Edge deployments on FTL Engine

FTL Engine is an end-to-end platform for running remote tools called by AI agents.

Tools cold start in under half a millisecond, instantly scale up to meet demand, and scale down to zero.

Engines run on Fermyon Wasm Functions and Akamai, the most globally distributed edge compute network.

Cost scales predictably with usage. There are no idle costs and no price variables like execution duration, region, memory, provisioned concurrency, reserved concurrency, etc. Cold starts and init phases are architected out. Engine specs are fixed and scaling is completely horizontal and automatic.

Tools are automatically deployed across the global network edge. Tool calls are routed to an Engine running on the most optimal Akamai edge PoP, enabling consistently low latency across geographic regions.

The FTL components handle MCP implementation, auth, tool call routing, and tool call argument validation.

Bring your own JWT issuer or OAuth provider via simple configuration. Or use FTL's by default.

Contributing

We welcome contributions and discussion. Please see the Contributing Guide for details.

License

Apache-2.0 - see LICENSE for details.

Acknowledgments

FTL is built on top of these excellent projects: