
Farfalle

Open-source AI-powered search engine. (Perplexity Clone)

Run local LLMs (llama3, gemma, mistral, phi3), custom LLMs through LiteLLM, or use cloud models (Groq/Llama3, OpenAI/gpt-4o)

Demo answering questions with phi3 on my M1 MacBook Pro:

local-demo.mp4

Please feel free to contact me on Twitter or create an issue if you have any questions.

💻 Live Demo

farfalle.dev (Cloud models only)

📖 Overview

🛣️ Roadmap

  • Add support for local LLMs through Ollama
  • Docker deployment setup
  • Add support for Searxng, eliminating the need for external search API dependencies
  • Create a pre-built Docker Image
  • Add support for custom LLMs through LiteLLM
  • Chat History
  • Chat with local files

🛠️ Tech Stack

Features

  • Search with multiple search providers (Tavily, Searxng, Serper, Bing)
  • Answer questions with cloud models (OpenAI/gpt-4o, OpenAI/gpt-3.5-turbo, Groq/Llama3)
  • Answer questions with local models (llama3, mistral, gemma, phi3)
  • Answer questions with any custom LLMs through LiteLLM

πŸƒπŸΏβ€β™‚οΈ Getting Started Locally

Prerequisites

  • Docker
  • Ollama (if running local models)
    • Download any of the supported models: llama3, mistral, gemma, phi3
    • Start the Ollama server: ollama serve
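The Ollama steps above can be sketched as follows (phi3 shown, but any of the supported models works; the port is Ollama's standard default):

```shell
# Pull one of the supported models (phi3 is the smallest of the four)
ollama pull phi3

# Start the Ollama server; by default it listens on http://localhost:11434
ollama serve
```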

Get API Keys

Quick Start:

docker run \
    -p 8000:8000 -p 3000:3000 -p 8080:8080 \
    --add-host=host.docker.internal:host-gateway \
    ghcr.io/rashadphz/farfalle:main

Optional

  • OPENAI_API_KEY: Your OpenAI API key. Not required if you are using Ollama.
  • SEARCH_PROVIDER: The search provider to use. Can be tavily, serper, bing, or searxng.
  • TAVILY_API_KEY: Your Tavily API key.
  • SERPER_API_KEY: Your Serper API key.
  • BING_API_KEY: Your Bing API key.
  • GROQ_API_KEY: Your Groq API key.
  • SEARXNG_BASE_URL: The base URL for the SearXNG instance.

Add any environment variable to the docker run command like so:

docker run \
    -e ENV_VAR_NAME1='YOUR_ENV_VAR_VALUE1' \
    -e ENV_VAR_NAME2='YOUR_ENV_VAR_VALUE2' \
    -p 8000:8000 -p 3000:3000 -p 8080:8080 \
    --add-host=host.docker.internal:host-gateway \
    ghcr.io/rashadphz/farfalle:main

Wait for the app to start, then visit http://localhost:3000.

Or, follow the instructions below to clone the repo and run the app locally.

1. Clone the Repo

git clone git@github.com:rashadphz/farfalle.git
cd farfalle

2. Add Environment Variables

touch .env

Add the following variables to the .env file:

Search Provider

You can use Tavily, Searxng, Serper, or Bing as the search provider.

Searxng (No API Key Required)

SEARCH_PROVIDER=searxng

Tavily (Requires API Key)

TAVILY_API_KEY=...
SEARCH_PROVIDER=tavily

Serper (Requires API Key)

SERPER_API_KEY=...
SEARCH_PROVIDER=serper

Bing (Requires API Key)

BING_API_KEY=...
SEARCH_PROVIDER=bing

Optional

# Cloud Models
OPENAI_API_KEY=...
GROQ_API_KEY=...

# See https://litellm.vercel.app/docs/providers for the full list of supported models
CUSTOM_MODEL=...
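As a concrete illustration of this optional block, here is a hypothetical .env that selects Searxng and points CUSTOM_MODEL at a LiteLLM-style provider/model name (ollama/llama3 is illustrative; check LiteLLM's provider list for the exact name your provider uses):

```
SEARCH_PROVIDER=searxng
CUSTOM_MODEL=ollama/llama3
```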

3. Run Containers

This requires Docker Compose version 2.22.0 or later.

docker-compose -f docker-compose.dev.yaml up -d

Visit http://localhost:3000 to view the app.
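If the page doesn't come up, a quick way to inspect the containers (using the same Compose file as above) is:

```shell
# Show the status of the services defined in the dev Compose file
docker-compose -f docker-compose.dev.yaml ps

# Follow the logs to see why a service might be failing
docker-compose -f docker-compose.dev.yaml logs -f
```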

For custom setup instructions, see custom-setup-instructions.md

🚀 Deploy

Backend

Deploy to Render

After the backend is deployed, copy the web service URL to your clipboard. It should look something like: https://some-service-name.onrender.com.

Frontend

Use the copied backend URL in the NEXT_PUBLIC_API_URL environment variable when deploying with Vercel.
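Using the example backend URL from the previous step, the Vercel environment variable would look like this (substitute your actual Render URL):

```
NEXT_PUBLIC_API_URL=https://some-service-name.onrender.com
```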

Deploy with Vercel

And you're done! 🥳

Use Farfalle as a Search Engine

To use Farfalle as your default search engine, follow these steps:

  1. Visit the settings of your browser
  2. Go to 'Search Engines'
  3. Create a new search engine entry using this URL: http://localhost:3000/?q=%s
  4. Add the search engine.
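The %s placeholder in the URL is replaced by your URL-encoded search query. A small sketch of the substitution the browser performs (spaces become +):

```shell
# Browser search-engine template from step 3
template="http://localhost:3000/?q=%s"

# A query, already URL-encoded ("open source search")
query="open+source+search"

# Substitute the query for %s
echo "$template" | sed "s|%s|$query|"
# http://localhost:3000/?q=open+source+search
```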