An AI-powered research assistant for your terminal.
Deep Search is a command-line tool that uses local large language models (LLMs) to provide in-depth answers to complex questions. It breaks down your query, scours the web for relevant information, and synthesizes a comprehensive response, all within your terminal.
- AI-Powered Research: Leverages local LLMs (via Ollama) to understand and research your questions.
- Step-by-Step Process: Decomposes questions, searches multiple sources (Wikipedia, DuckDuckGo), filters for relevance, and summarizes findings.
- Local First: Works with your own Ollama-hosted models, keeping your data private.
- Minimalist CLI: A clean, focused interface for your research tasks.
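Because the tool talks to a locally running Ollama instance, every request stays on your machine. As an illustration, here is a sketch of the JSON body such a request could carry; the endpoint shape (`/api/generate` with `model`, `prompt`, and `stream` fields) follows Ollama's documented REST API, while the prompt text and helper name are purely illustrative, not the crate's actual code.

```rust
// Builds the JSON body for a request to a local Ollama instance
// (by default Ollama listens on http://localhost:11434/api/generate).
// Setting "stream": false asks for a single, non-streamed JSON reply.
fn ollama_request_body(model: &str, prompt: &str) -> String {
    format!(
        r#"{{"model":"{}","prompt":"{}","stream":false}}"#,
        model, prompt
    )
}

fn main() {
    println!("{}", ollama_request_body("llama3", "Summarize this page."));
}
```

Nothing leaves localhost in this flow, which is what makes the "local first" privacy claim possible.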
The tool follows a structured research workflow:
- Decompose: The initial question is broken down into smaller, specific sub-questions.
- Search: Each sub-question is researched using Wikipedia or DuckDuckGo.
- Filter: The search results are filtered to identify the most relevant sources.
- Summarize: The content of each relevant page is summarized.
- Evaluate: The summaries are used to construct a final answer. If the answer is incomplete, the process can be iterated with new sub-questions.
- Answer: A final, synthesized answer is presented to the user.
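The workflow above can be sketched as a simple loop. This is a minimal illustration only: the function names (`decompose`, `search`, `filter`, `summarize`, `evaluate`) mirror the step names in this README, not the crate's actual API, and each step is stubbed where the real tool would call an LLM or a search backend.

```rust
// Stub: the real tool asks an LLM to generate sub-questions.
fn decompose(question: &str) -> Vec<String> {
    vec![format!("What is {}?", question)]
}

// Stub: the real tool queries Wikipedia or DuckDuckGo.
fn search(sub_question: &str) -> Vec<String> {
    vec![format!("result for '{}'", sub_question)]
}

// Stub: the real tool drops irrelevant sources; here we keep everything.
fn filter(results: Vec<String>) -> Vec<String> {
    results
}

// Stub: the real tool summarizes each relevant page with the LLM.
fn summarize(source: &str) -> String {
    format!("summary of {}", source)
}

// Stub: returns Some(answer) once the summaries suffice, None to iterate.
fn evaluate(summaries: &[String]) -> Option<String> {
    Some(summaries.join("; "))
}

fn research(question: &str, max_iterations: usize) -> String {
    let mut summaries = Vec::new();
    for _ in 0..max_iterations {
        for sub in decompose(question) {
            let relevant = filter(search(&sub));
            summaries.extend(relevant.iter().map(|s| summarize(s)));
        }
        // Answer step: stop as soon as the evidence is judged sufficient;
        // otherwise loop again with new sub-questions.
        if let Some(answer) = evaluate(&summaries) {
            return answer;
        }
    }
    summaries.join("; ")
}

fn main() {
    println!("{}", research("photosynthesis", 3));
}
```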
Once the package is published to crates.io, you can install it directly using cargo:
```sh
cargo install deepsearch
```

This will install the `deepsearch` binary in your cargo bin directory, allowing you to run it from anywhere in your terminal.
- Install Rust: If you don't have Rust, install it from rust-lang.org.

- Install Ollama: You need a running Ollama instance. See the Ollama website for installation instructions.

- Clone the repository:

  ```sh
  git clone https://github.com/LightInn/deepsearch.git
  cd deepsearch
  ```

- Build the project:

  For a development build, run:

  ```sh
  cargo build
  ```

  The executable will be at `./target/debug/deepsearch`. For a release (production) build, run:

  ```sh
  cargo build --release
  ```

  The executable will be at `./target/release/deepsearch`.
Once built, you can run the tool from the command line.
```sh
./target/release/deepsearch "Your research question"
```

For development, you can run the tool directly with cargo:

```sh
cargo run -- "Your research question"
```

You can customize the behavior of the tool with the following parameters:

- `--max-iterations` or `-i`: Set the maximum number of research iterations.
- `--model` or `-m`: Specify the Ollama model to use.
- `--verbose` or `-v`: Enable verbose output for debugging purposes.

Example:

```sh
./target/release/deepsearch "How does photosynthesis work?" -i 5 -m "llama3"
```

This will start a research task on "How does photosynthesis work?", with a maximum of 5 iterations, using the llama3 model.
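For a feel of how such flags can be handled, here is a minimal, hand-rolled parser for the options above. It is a sketch only: the real tool may well use a crate such as `clap`, and the defaults shown (3 iterations, model `llama3`) are assumptions for illustration, not the project's actual defaults.

```rust
use std::env;

// Illustrative config struct; field names are assumptions, not the crate's API.
#[derive(Debug)]
struct Config {
    question: String,
    max_iterations: u32, // assumed default: 3
    model: String,       // assumed default: "llama3"
    verbose: bool,
}

fn parse_args<I: Iterator<Item = String>>(mut args: I) -> Config {
    let mut cfg = Config {
        question: String::new(),
        max_iterations: 3,
        model: "llama3".into(),
        verbose: false,
    };
    while let Some(arg) = args.next() {
        match arg.as_str() {
            // Flags that take a value consume the next argument.
            "--max-iterations" | "-i" => {
                cfg.max_iterations = args
                    .next()
                    .expect("missing value for --max-iterations")
                    .parse()
                    .expect("iterations must be a number");
            }
            "--model" | "-m" => {
                cfg.model = args.next().expect("missing value for --model");
            }
            "--verbose" | "-v" => cfg.verbose = true,
            // Any bare argument is treated as the research question.
            other => cfg.question = other.to_string(),
        }
    }
    cfg
}

fn main() {
    let cfg = parse_args(env::args().skip(1));
    println!(
        "question={:?} iterations={} model={}",
        cfg.question, cfg.max_iterations, cfg.model
    );
}
```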
Contributions are welcome! If you'd like to contribute, please feel free to submit a pull request or open an issue.
A core part of this tool is the quality of the prompts used to interact with the LLM. If you have ideas for improving the prompts, you are encouraged to modify the src/prompts.rs file and submit a pull request. Better prompts lead to better research outcomes!
This project is licensed under the MIT License. See the LICENSE file for details.
