Agent tool sources (#6854)

Co-authored-by: Jerry Liu <jerryjliu98@gmail.com>
logan-markewich and jerryjliu committed Jul 12, 2023
1 parent e82d6af commit aa56f21
Showing 22 changed files with 395 additions and 305 deletions.
1 change: 1 addition & 0 deletions CHANGELOG.md
@@ -3,6 +3,7 @@
## Unreleased

### New Features
- Added sources to agent/chat engine responses (#6854)
- Added basic chat buffer memory to agents / chat engines (#6857)

## [v0.7.5] - 2023-07-11
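For illustration, a minimal sketch of how the sources added in this release might be consumed, assuming a `VectorStoreIndex` over local documents and that chat responses expose a `sources` list as the entry above describes (the data path and question are placeholders, not part of this diff):

```python
from llama_index import SimpleDirectoryReader, VectorStoreIndex

# Placeholder data: any local directory of documents.
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)

chat_engine = index.as_chat_engine()
response = chat_engine.chat("What does the document say about pricing?")
print(response)

# Assumed new behavior per the changelog entry: the tool/query calls behind
# the answer are surfaced on the response object.
for source in response.sources:
    print(source)
```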
3 changes: 2 additions & 1 deletion docs/core_modules/query_modules/chat_engines/root.md
@@ -24,7 +24,8 @@ To stream response:
```python
chat_engine = index.as_chat_engine()
streaming_response = chat_engine.stream_chat("Tell me a joke.")
-streaming_response.print_response_stream()
+for token in streaming_response.response_gen:
+    print(token, end="")
```
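For context, a self-contained sketch of the streaming pattern shown above, assuming an index built from a local `./data` directory (the path and prompt are placeholders):

```python
from llama_index import SimpleDirectoryReader, VectorStoreIndex

# Build a simple index over local files (placeholder path).
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)

chat_engine = index.as_chat_engine()
streaming_response = chat_engine.stream_chat("Tell me a joke.")

# Print tokens as they arrive instead of waiting for the full response.
for token in streaming_response.response_gen:
    print(token, end="")
```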


@@ -99,7 +99,8 @@ This somewhat inconsistent with query engine (where you pass in a `streaming=Tru
```python
chat_engine = index.as_chat_engine()
streaming_response = chat_engine.stream_chat("Tell me a joke.")
-streaming_response.print_response_stream()
+for token in streaming_response.response_gen:
+    print(token, end="")
```

See an [end-to-end tutorial](/examples/customization/streaming/chat_engine_condense_question_stream_response.ipynb)
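For comparison, the query-engine style that the hunk header above alludes to, where streaming is requested up front via a flag rather than through a separate method (a sketch under the same placeholder index assumption):

```python
from llama_index import SimpleDirectoryReader, VectorStoreIndex

index = VectorStoreIndex.from_documents(SimpleDirectoryReader("./data").load_data())

# Query engines take streaming=True at construction time, unlike chat engines,
# which expose a dedicated stream_chat() method.
query_engine = index.as_query_engine(streaming=True)
streaming_response = query_engine.query("Tell me a joke.")
streaming_response.print_response_stream()
```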
149 changes: 73 additions & 76 deletions docs/examples/agent/openai_agent.ipynb

Large diffs are not rendered by default.
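Since the notebook diff is not rendered, here is a rough sketch of the kind of usage it exercises: an `OpenAIAgent` built from a simple function tool, with the newly surfaced sources inspected on the chat response. The tool, model name, and the `sources` attribute are assumptions for illustration, not taken from the notebook.

```python
from llama_index.agent import OpenAIAgent
from llama_index.llms import OpenAI
from llama_index.tools import FunctionTool


def multiply(a: int, b: int) -> int:
    """Multiply two integers and return the result."""
    return a * b


multiply_tool = FunctionTool.from_defaults(fn=multiply)
agent = OpenAIAgent.from_tools(
    [multiply_tool], llm=OpenAI(model="gpt-3.5-turbo"), verbose=True
)

response = agent.chat("What is 121 * 2?")
print(response.response)

# Assumed new behavior: tool calls made while answering are attached to the response.
for source in response.sources:
    print(source)
```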

4 changes: 2 additions & 2 deletions llama_index/agent/context_retriever_agent.py
@@ -9,13 +9,13 @@
    BaseOpenAIAgent,
)
from llama_index.bridge.langchain import print_text
+from llama_index.chat_engine.types import AgentChatResponse
from llama_index.callbacks.base import CallbackManager
from llama_index.indices.base_retriever import BaseRetriever
from llama_index.llms.base import ChatMessage
from llama_index.llms.openai import OpenAI
from llama_index.memory import BaseMemory, ChatMemoryBuffer
from llama_index.prompts.prompts import QuestionAnswerPrompt
-from llama_index.response.schema import RESPONSE_TYPE
from llama_index.schema import NodeWithScore
from llama_index.tools import BaseTool

@@ -149,7 +149,7 @@ def _get_tools(self, message: str) -> List[BaseTool]:

    def chat(
        self, message: str, chat_history: Optional[List[ChatMessage]] = None
-    ) -> RESPONSE_TYPE:
+    ) -> AgentChatResponse:
        """Chat."""
        # augment user message
        retrieved_nodes_w_scores: List[NodeWithScore] = self._retriever.retrieve(
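To show what the annotation change means for callers, a hedged sketch that types a helper against `AgentChatResponse` rather than the query-engine `RESPONSE_TYPE`; the agent class import and the `response`/`sources` attributes are assumptions for illustration:

```python
from llama_index.agent import ContextRetrieverOpenAIAgent
from llama_index.chat_engine.types import AgentChatResponse


def ask(agent: ContextRetrieverOpenAIAgent, question: str) -> AgentChatResponse:
    """Run one chat turn and return the agent-style response object."""
    response = agent.chat(question)
    # Callers now get an agent/chat-engine response (plain text plus sources),
    # not a query-engine RESPONSE_TYPE.
    print(response.response)
    for source in response.sources:
        print(source)
    return response
```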
