
NotebookLlaMa🦙

A fluffy and open-source alternative to NotebookLM!

(Demo video)

A fully open-source alternative to NotebookLM, backed by LlamaCloud.


Prerequisites

This project uses uv to manage dependencies. Before you begin, make sure you have uv installed.

On macOS and Linux:

curl -LsSf https://astral.sh/uv/install.sh | sh

On Windows:

powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"

For more install options, see uv's official documentation.


Get it up and running!

1. Clone the Repository

git clone https://github.com/run-llama/notebookllama
cd notebookllama/

2. Install Dependencies

uv sync

3. Configure API Keys

First, create your .env file by renaming the example file:

mv .env.example .env

Next, open the .env file and fill in your API keys; the renamed example file already lists the required variable names.
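
Optionally, you can sanity-check the configuration before moving on. The sketch below is not part of the repository: the variable names are assumptions (your .env file is the authoritative list), and it relies on the python-dotenv package.

# check_env.py -- optional sanity check; the variable names below are assumed, see your .env file.
import os

from dotenv import load_dotenv  # provided by the python-dotenv package

load_dotenv()  # read key=value pairs from .env in the current directory

REQUIRED_KEYS = [
    "OPENAI_API_KEY",       # assumed: LLM and embedding calls
    "LLAMA_CLOUD_API_KEY",  # assumed: LlamaCloud parsing, extraction and indexing
    "ELEVENLABS_API_KEY",   # assumed: podcast audio generation
]

missing = [key for key in REQUIRED_KEYS if not os.getenv(key)]
if missing:
    raise SystemExit(f"Missing environment variables: {', '.join(missing)}")
print("All required API keys are set.")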

4. Activate the Virtual Environment

On macOS and Linux:

source .venv/bin/activate

On Windows:

.\.venv\Scripts\activate

5. Create LlamaCloud Agent & Pipeline

You will now run two scripts: one creates the data extraction agent, the other configures your index pipeline.

First, create the data extraction agent:

uv run tools/create_llama_extract_agent.py
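
The script takes care of this step for you. For orientation only, creating an extraction agent with the LlamaExtract client looks roughly like the sketch below; the agent name and schema are placeholders, not the ones tools/create_llama_extract_agent.py actually defines.

# Illustrative sketch only -- the repository's script defines its own agent and schema.
from pydantic import BaseModel
from llama_cloud_services import LlamaExtract  # picks up the LlamaCloud API key from the environment


class DocumentNotes(BaseModel):
    # Placeholder schema: these fields are invented for illustration.
    title: str
    summary: str


extractor = LlamaExtract()
agent = extractor.create_agent(name="example-extract-agent", data_schema=DocumentNotes)
print("Created extraction agent:", agent)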

Next, run the interactive setup wizard to configure your index pipeline.

⚡ Quick Start (Default OpenAI): For the fastest setup, select "With Default Settings" when prompted. This will automatically create a pipeline using OpenAI's text-embedding-3-small embedding model.

🧠 Advanced (Custom Embedding Models): To use a different embedding model, select "With Custom Settings" and follow the on-screen instructions.

Run the wizard with the following command:

uv run tools/create_llama_cloud_index.py
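
After the wizard completes, the index lives in LlamaCloud and the app will use it automatically. If you want to query it directly from Python, a minimal sketch looks like the following, assuming the llama-index LlamaCloud integration is installed; the index and project names are placeholders for whatever the wizard created.

# Illustrative sketch only -- the app connects to the index for you.
from llama_index.indices.managed.llama_cloud import LlamaCloudIndex

index = LlamaCloudIndex(
    name="my-notebookllama-index",  # placeholder: the name chosen in the wizard
    project_name="Default",         # placeholder: your LlamaCloud project name
    # the API key is read from the environment if not passed explicitly
)

query_engine = index.as_query_engine()
response = query_engine.query("What are the key points of the uploaded documents?")
print(response)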

6. Launch Backend Services

This command will start the required Postgres and Jaeger containers.

docker compose up -d
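
If you want to confirm that both containers are up before continuing, the optional sketch below pings them. The ports are common defaults (Postgres on 5432, the Jaeger web UI on 16686) and are assumptions; check the project's compose file for the actual mappings.

# Optional sanity check -- the ports below are assumed defaults, not read from the compose file.
import socket
import urllib.request


def port_open(host: str, port: int) -> bool:
    # Try a plain TCP connection to see whether something is listening.
    try:
        with socket.create_connection((host, port), timeout=3):
            return True
    except OSError:
        return False


print("Postgres reachable:", port_open("localhost", 5432))  # assumed default Postgres port

try:
    # Jaeger's web UI usually listens on port 16686.
    with urllib.request.urlopen("http://localhost:16686", timeout=3) as resp:
        print("Jaeger UI reachable:", resp.status == 200)
except OSError:
    print("Jaeger UI reachable: False")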

7. Run the Application

First, run the MCP server:

uv run src/notebookllama/server.py

Then, in a new terminal window, launch the Streamlit app:

streamlit run src/notebookllama/Home.py

Important

You may need to install ffmpeg if it is not already available on your system.

Then start exploring the app at http://localhost:8501/.


Contributing

Contribute to this project by following the contribution guidelines.

License

This project is provided under the MIT License.
