notest typos
pchalasani committed Jun 6, 2024
1 parent 434484f commit 50fde04
Showing 4 changed files with 14 additions and 11 deletions.
13 changes: 6 additions & 7 deletions README.md
@@ -411,9 +411,10 @@ with a postgres db, you will need to:
### Set up environment variables (API keys, etc)

To get started, all you need is an OpenAI API Key.
-If you don't have one, see [this OpenAI Page](https://help.openai.com/en/collections/3675940-getting-started-with-openai-api).
-Currently only OpenAI models are supported. Others will be added later
-(Pull Requests welcome!).
+If you don't have one, see [this OpenAI Page](https://platform.openai.com/docs/quickstart).
+(Note that while this is the simplest way to get started, Langroid works with practically any LLM, not just those from OpenAI.
+See the guides to using [Open/Local LLMs](https://langroid.github.io/langroid/tutorials/local-llm-setup/),
+and other [non-OpenAI](https://langroid.github.io/langroid/tutorials/non-openai-llms/) proprietary LLMs.)
In the root of the repo, copy the `.env-template` file to a new file `.env`:
```bash
@@ -441,9 +441,7 @@ All of the following environment variable settings are optional, and some are on
to use specific features (as noted below).
- **Qdrant** Vector Store API Key, URL. This is only required if you want to use Qdrant cloud.
-The default vector store in our RAG agent (`DocChatAgent`) is LanceDB which uses file storage,
-and you do not need to set up any environment variables for that.
-Alternatively [Chroma](https://docs.trychroma.com/) is also currently supported.
+Alternatively [Chroma](https://docs.trychroma.com/) or [LanceDB](https://lancedb.com/) are also currently supported.
We use the local-storage version of Chroma, so there is no need for an API key.
- **Redis** Password, host, port: This is optional, and only needed to cache LLM API responses
using Redis Cloud. Redis [offers](https://redis.com/try-free/) a free 30MB Redis account
@@ -817,7 +816,7 @@ config = DocChatAgentConfig(
"https://en.wikipedia.org/wiki/N-gram_language_model",
"/path/to/my/notes-on-language-models.txt",
],
-vecdb=lr.vector_store.LanceDBConfig(),
+vecdb=lr.vector_store.QdrantDBConfig(),
)
```

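(For context, the `DocChatAgentConfig` snippet changed above is typically used roughly as follows. This is a minimal sketch based on the surrounding README, not part of the commit; the document URLs and the query are placeholders, and import paths may differ slightly across Langroid versions.)

```python
import langroid as lr
from langroid.agent.special import DocChatAgent, DocChatAgentConfig

# RAG agent over a couple of documents, using the Qdrant vector store as in the
# updated README example. Qdrant cloud needs an API key/URL set in .env; Chroma
# or LanceDB can be swapped in via their respective *Config classes.
config = DocChatAgentConfig(
    doc_paths=[
        "https://en.wikipedia.org/wiki/Language_model",
        "https://en.wikipedia.org/wiki/N-gram_language_model",
    ],
    vecdb=lr.vector_store.QdrantDBConfig(),
)

agent = DocChatAgent(config)
# The agent retrieves relevant chunks and answers based on them.
answer = agent.llm_response("What is a language model?")
print(answer.content)
```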
7 changes: 4 additions & 3 deletions docs/quick-start/setup.md
@@ -74,9 +74,10 @@ For many practical scenarios, you may need additional optional dependencies:
## Set up tokens/keys

To get started, all you need is an OpenAI API Key.
-If you don't have one, see [this OpenAI Page](https://help.openai.com/en/collections/3675940-getting-started-with-openai-api).
-Currently only OpenAI models are supported. Others will be added later
-(Pull Requests welcome!).
+If you don't have one, see [this OpenAI Page](https://platform.openai.com/docs/quickstart).
+(Note that while this is the simplest way to get started, Langroid works with practically any LLM, not just those from OpenAI.
+See the guides to using [Open/Local LLMs](https://langroid.github.io/langroid/tutorials/local-llm-setup/),
+and other [non-OpenAI](https://langroid.github.io/langroid/tutorials/non-openai-llms/) proprietary LLMs.)
In the root of the repo, copy the `.env-template` file to a new file `.env`:
```bash
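(The lines added above point readers to guides for local and other non-OpenAI LLMs. As a rough illustration of what that configuration looks like, here is a sketch loosely based on those guides; it is not part of this commit, and the model name and context length are placeholders.)

```python
import langroid as lr
import langroid.language_models as lm

# Use a locally served model (e.g. via Ollama) instead of OpenAI: the same
# OpenAIGPTConfig class is reused, with chat_model naming the local model.
llm_config = lm.OpenAIGPTConfig(
    chat_model="ollama/mistral",   # placeholder model name
    chat_context_length=4096,      # set to the model's actual context size
)

agent = lr.ChatAgent(lr.ChatAgentConfig(llm=llm_config))
print(agent.llm_response("Say hello in one short sentence.").content)
```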
3 changes: 3 additions & 0 deletions langroid/agent/task.py
@@ -1059,6 +1059,9 @@ def result(self) -> ChatDocument:
"""
Get result of task. This is the default behavior.
Derived classes can override this.
+Note the result of a task is returned as if it is from the User entity.
Returns:
ChatDocument: result of task
"""
Expand Down
2 changes: 1 addition & 1 deletion langroid/agent/tool_message.py
@@ -73,7 +73,7 @@ def examples(cls) -> List["ToolMessage" | Tuple[str, "ToolMessage"]]:
- a tuple (description, ToolMessage instance), where the description is
a natural language "thought" that leads to the tool usage,
e.g. ("I want to find the square of 5", SquareTool(num=5))
-In some scenarios, ncluding such a description can significantly
+In some scenarios, including such a description can significantly
enhance reliability of tool use.
Returns:
"""
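(For reference, the docstring fixed above belongs to the `examples()` classmethod of `ToolMessage` subclasses. Below is a minimal sketch of the two return formats it describes, using the SquareTool example from the docstring itself; the field names are illustrative.)

```python
from typing import List, Tuple

from langroid.agent.tool_message import ToolMessage


class SquareTool(ToolMessage):
    request: str = "square"
    purpose: str = "To compute the square of a given <num>."
    num: int

    @classmethod
    def examples(cls) -> List["ToolMessage" | Tuple[str, "ToolMessage"]]:
        return [
            # a plain instance
            cls(num=3),
            # a (description, instance) tuple: the natural-language "thought"
            # that, per the fixed docstring, can make tool use more reliable
            ("I want to find the square of 5", cls(num=5)),
        ]
```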
