Local AI search for the files you can't find.
Search by memory, meaning, and visual content across documents, photos, videos, spreadsheets, and more - without uploading your files.
Search for:
- "the photo of my dog on the beach"
- "the video where I saw a manta ray"
- "the spreadsheet that explains why my trip went over budget"
Unfoldly finds the relevant local file, shows the source, and lets you ask follow-up questions in context.
Website | Download | Demos | Privacy | Build | Uninstall | 中文
Early beta: The current public release is macOS-only. A signed and notarized build is planned.
Traditional file search works when you remember the filename, folder, or exact keyword.
But most of the time, you remember something else:
- what was inside a photo
- what happened in a video
- what a document was about
- what a spreadsheet explained
- where you saw an idea, chart, quote, number, or screenshot
Unfoldly is built for that gap between what you remember and where the file actually lives.
Think of it as an AI-native Everything for personal files: local-first like desktop search, but searchable by meaning, context, and visual memory.
Unfoldly turns selected files and folders into a private, searchable memory layer on your computer.
The workflow is simple:
- Choose your sources: add the folders or files you want Unfoldly to index.
- Build local memory: Unfoldly processes your files locally, extracting text, visual signals, metadata, transcripts, and searchable embeddings depending on the file type.
- Search and ask: use natural language to find relevant files, inspect the source, narrow the search scope, or ask follow-up questions in context.
The whole experience is designed to run locally. Your files, indexes, downloaded models, preferences, logs, and chat history stay on your machine.
Search local images by what is inside them, not by filename.
Describe the moment you remember. Unfoldly finds the local video and the relevant moment.
Find the answer hidden in your files
Find the right local file first, then ask a follow-up question. Unfoldly keeps the source in context so you can inspect where the answer came from.
Unfoldly is designed for everyday personal files.
| Type | Formats |
|---|---|
| Documents | PDF, DOC, DOCX, TXT, Markdown, RTF, EPUB, MOBI |
| Spreadsheets | XLSX, XLS, CSV, TSV, Numbers, ODS |
| Slides | PPTX, PPT, Keynote, ODP |
| Images | JPG, PNG, HEIC, HEIF, WEBP, GIF, TIFF, BMP, SVG |
| Audio | MP3, WAV, M4A, FLAC, AAC, OGG, WMA, AIFF |
| Video | MP4, MOV, MKV, AVI, WEBM, M4V, WMV |
| Other structured files | JSONL, XML, SQL, YAML |
Support quality can vary by file structure, size, parser availability, and selected model.
- Search by memory: find files by what you remember, not just filenames or exact keywords.
- Search visual content: make photos, screenshots, scanned files, and video moments searchable.
- Ask source-aware questions: retrieve the right file, then ask follow-up questions in context.
- Choose what gets indexed: Unfoldly only indexes the files and folders you select.
- Run locally: the whole search workflow is designed to run on your machine.
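"Choose what gets indexed" implies a scope check before any file is touched. A minimal sketch of that check, assuming the selected sources are plain folder paths (`is_in_scope` is a hypothetical helper name, not Unfoldly's API):

```python
from pathlib import Path

def is_in_scope(path: Path, selected_sources: list[Path]) -> bool:
    """True only if `path` lives under one of the user-selected sources.
    Files outside every selected source are never indexed or searched."""
    resolved = path.resolve()
    return any(resolved.is_relative_to(src.resolve()) for src in selected_sources)
```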
Unfoldly is built around a simple principle:
Your local files should stay local.
Unfoldly keeps the whole search and chat workflow on your device.
| Data | Where it lives |
|---|---|
| Source files | Your selected local folders |
| Search indexes | Local data directory |
| Downloaded models | Local data directory |
| Preferences | Local data directory |
| Chat history | Local data directory |
| Logs | Local data directory |
Unfoldly may connect to the internet to download model files or application releases, but your personal file contents are not uploaded.
The current public build is focused on macOS.
- Download and open Unfoldly.
- Add the files or folders you want to search.
- Download or select a supported local model.
- Wait for indexing to complete.
- Search by memory, meaning, or visual content.
- Open the source file or ask a follow-up question.
The current beta build is not notarized.
If macOS says it cannot verify Unfoldly:
- Try opening Unfoldly.app once.
- Open System Settings > Privacy & Security.
- Scroll down to the Security section.
- Click Open Anyway next to the Unfoldly warning.
- Launch Unfoldly again.
Unfoldly includes a macOS uninstall helper for users who want to inspect or remove local app data.
Show the app, database, model, preference, log, cache, and crash-report paths that would be removed:
```bash
bash scripts/uninstall-mac.sh --show
```

Fully remove Unfoldly and its local runtime data:

```bash
bash scripts/uninstall-mac.sh
```

The uninstall script is intentionally destructive and requires typing `yes` before deletion. It removes local indexes, downloaded models, preferences, selected-source records, chat history, logs, caches, crash reports, and installed app bundles. It does not delete the original files you selected for indexing.
Unfoldly is built as a local desktop application.
| Layer | Technology |
|---|---|
| Desktop shell | Tauri |
| Frontend | React, TypeScript, Vite |
| Native bridge | Rust + PyO3 |
| Backend | Python |
| Vector store | ChromaDB |
| Retrieval | Embeddings, lexical search, reranking, source-scoped filtering |
| Model runtime | Local GGUF models through llama.cpp / llama-cpp-python |
| Storage | Local app data directory |
The packaged desktop app embeds the Python backend into the Tauri application, so normal packaged use does not require a separate local HTTP server.
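The retrieval layer combines lexical search with embedding search before reranking. One common way to fuse two such rankings is reciprocal rank fusion; whether Unfoldly uses RRF specifically is an assumption here, and the sketch only illustrates the fusion step:

```python
def reciprocal_rank_fusion(rankings: list[list[str]], k: int = 60) -> list[str]:
    """Fuse several ranked lists (e.g. lexical hits and embedding hits)
    into one. Each document scores 1 / (k + rank) for every list it
    appears in; documents found by both retrievers rise to the top.
    RRF is one common fusion method; Unfoldly's exact scheme may differ."""
    scores: dict[str, float] = {}
    for ranking in rankings:
        for rank, doc in enumerate(ranking, start=1):
            scores[doc] = scores.get(doc, 0.0) + 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)
```

A reranking model would then rescore only the top of the fused list, which keeps the expensive step cheap.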
For development and release packaging, see the full macOS build guide. For publishing GitHub Releases and enabling in-app update checks, see Release Publishing.
Minimum local toolchain:
- macOS
- Python 3.12
- Rust
- Node.js and npm
- Git and CMake
Start the local development app:
```bash
python3 -m venv .venv
.venv/bin/pip install -r requirements.txt
./scripts/start-dev.sh
```

Create a clean macOS release build:

```bash
bash scripts/build-release.sh arm64
```

Release artifacts are written to:

```
macos_bundle/release/Unfoldly.app
macos_bundle/release/Unfoldly.dmg
```
Runtime data such as indexes, downloaded models, preferences, logs, and chat history is stored in the local app data directory.
Unfoldly is designed to run supported local AI models through a local runtime.
The current model registry includes GGUF-based text and vision-language models from families such as:
- Gemma
- Qwen / Qwen-VL
- GLM
- Llama
- Ministral
- DeepSeek-R1 Distill
- gpt-oss
The retrieval system currently uses ChromaDB as the local vector store, with BGE-family embedding and reranking models.
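Under the hood, a vector store's core operation is ranking stored embeddings by similarity to a query embedding. ChromaDB and the BGE models handle this for real; the pure-Python sketch below only shows the idea, with made-up two-dimensional vectors standing in for real embeddings:

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def nearest(query_vec: list[float], docs: dict[str, list[float]],
            top_k: int = 3) -> list[str]:
    """Return the top_k document ids ranked by similarity to the query."""
    return sorted(docs, key=lambda d: cosine(query_vec, docs[d]),
                  reverse=True)[:top_k]
```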
Contributions are welcome. Please read CONTRIBUTING.md before opening a pull request.
All pull requests are reviewed by Unary Works maintainers before merge.
Unfoldly is licensed under the Apache License 2.0.
Copyright (c) 2026 Unary Works LLC.
Bundled third-party dependencies retain their own licenses. See Third-Party License Notes, Third-Party Notices, and NOTICE for dependency license and notice details.