
v0.1.0
Co-authored-by: Iryna Kondrashchenko <iryna-kondr@users.noreply.github.com>
OKUA1 and iryna-kondr committed Aug 22, 2023
1 parent 9b4ba62 commit 718e28d
Showing 14 changed files with 497 additions and 24 deletions.
31 changes: 31 additions & 0 deletions .github/workflows/pypi_deploy.yaml
@@ -0,0 +1,31 @@
name: PyPi Deploy

on:
  release:
    types: [published]
  workflow_dispatch:

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v3

      - name: Setup Python
        uses: actions/setup-python@v4
        with:
          python-version: '3.10'

      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install build twine
      - name: Build and publish
        env:
          TWINE_USERNAME: __token__
          TWINE_PASSWORD: ${{ secrets.PYPI_TOKEN }}
        run: |
          python -m build
          twine upload dist/*
1 change: 1 addition & 0 deletions .gitignore
@@ -160,3 +160,4 @@ cython_debug/
#.idea/
test.py
tmp.py
tmp_client.py
193 changes: 187 additions & 6 deletions README.md
@@ -27,40 +27,61 @@ _Dingo_ allows you to easily integrate any function into ChatGPT by adding a sin

## Quick Start ⚡️

Step 1: Install `agent-dingo`
**Step 1:** Install `agent-dingo`

```bash
pip install agent-dingo
```

Step 2: Configure your OpenAI API key
**Step 2:** Configure your OpenAI API key

```bash
export OPENAI_API_KEY=<YOUR_KEY>
```
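
If you prefer to configure the key from Python (e.g. in a notebook), you can also assign it to the `openai` module directly, since Dingo uses the OpenAI client under the hood. This is an illustrative alternative rather than the documented setup:

```python
import openai

# Equivalent to exporting OPENAI_API_KEY before starting the process.
openai.api_key = "<YOUR_KEY>"
```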

Step 3: Instantiate the agent
**Step 3:** Instantiate the agent

```python
from agent_dingo import AgentDingo

agent = AgentDingo()
```

Step 4: Add `agent.function` decorator to the function you wish to integrate
**Step 4:** Add `agent.function` decorator to the function you wish to integrate

```python
@agent.function
def get_current_weather(city: str):
    ...
```
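
The body of the function is up to you. As a minimal sketch (with a hard-coded answer standing in for a real weather API call), it could look like this:

```python
@agent.function
def get_current_weather(city: str):
    """Get the current weather in a given city."""
    # Hypothetical implementation; a real one would query a weather API.
    return f"The weather in {city} is sunny, 25°C."
```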

Step 5: Run the conversation
**Step 5:** Run the conversation

```python
agent.chat("What is the current weather in Linz?")
```

**Optional:** Run an OpenAI-compatible server

```python
from agent_dingo.wrapper import DingoWrapper
DingoWrapper(agent).serve()
```

The server can be accessed using the `openai` Python package:

```python
import openai

openai.api_base = "http://localhost:8080"

r = openai.ChatCompletion.create(
model = "gpt-3.5-turbo",
messages = [{"role": "user", "content": "What is the current weather in Linz?"}],
temperature=0.0,
)
```

## Support us 🤝

You can support the project in the following ways:
@@ -71,7 +92,7 @@ You can support the project in the following ways:

📰 Post about _Dingo_ on LinkedIn or other platforms

## Our Related Projects 🔗
🔗 Check out our other projects (cards below are clickable):

<a href="https://github.com/OKUA1/falcon"><img src="https://raw.githubusercontent.com/gist/OKUA1/6264a95a8abd225c74411a2b707b0242/raw/3cedb53538cb04656cd9d7d07e697e726896ce9f/falcon_light.svg"/></a> <br>
<a href="https://github.com/iryna-kondr/scikit-llm"><img src="https://gist.githubusercontent.com/OKUA1/6264a95a8abd225c74411a2b707b0242/raw/029694673765a3af36d541925a67214e677155e5/skllm_light.svg"/></a>
@@ -174,6 +195,21 @@ from my_module import get_temperature
agent.register_function(get_temperature)
```

Alternatively, you can define a function descriptor manually and register it using the `register_descriptor` method. In this case, a `json_repr` (a JSON representation compatible with the [OpenAI function calling API](https://platform.openai.com/docs/api-reference/chat/create#chat/create-functions)) should be provided.

```python
from agent_dingo.function_descriptor import FunctionDescriptor

d = FunctionDescriptor(
    name="<function_name>",
    json_repr={
        "name": "<function_name>",
        "description": "<function_description>",
        "parameters": ...,
    },
    func=function_callable,
    requires_context=True,  # or False
)

agent.register_descriptor(d)
```
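
For instance, a descriptor for the weather function from the quick start might look roughly as follows; the JSON schema below is a hypothetical illustration following the OpenAI function calling format, not something shipped with the library:

```python
from agent_dingo.function_descriptor import FunctionDescriptor

weather_descriptor = FunctionDescriptor(
    name="get_current_weather",
    json_repr={
        "name": "get_current_weather",
        "description": "Get the current weather in a given city.",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "The city to query."}
            },
            "required": ["city"],
        },
    },
    func=get_current_weather,
    requires_context=False,
)

agent.register_descriptor(weather_descriptor)
```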

### Running the conversation

Once the functions are registered, you can run the conversation using the `chat` method of the agent.
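
A minimal call, assuming the agent from the quick start, might look like this; judging by the signature of `chat` in `agent_dingo/agent.py`, it returns the final response together with the accumulated message history:

```python
response, messages = agent.chat(
    "What is the current weather in Linz?",
    model="gpt-3.5-turbo",
    temperature=0.0,
)
print(response)
```
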
@@ -246,3 +282,148 @@ agent.chat(
before_function_call=before_function_call,
)
```

### DingoWrapper + Web Server

In addition to using the agent directly, it is possible to wrap it into a `DingoWrapper`, which provides an OpenAI-like API.

```python
from agent_dingo.wrapper import DingoWrapper
wrapped_agent = DingoWrapper(agent, before_function_call = None, max_function_calls=10)
```

Once the agent is wrapped, it can be used to create chat completions using the `chat_completion` method.

```python
r = wrapped_agent.chat_completion(
messages = [{"role": "user", "content": "What is the current weather in Linz?"}],
model = "gpt-3.5-turbo",
temperature=0.0, #optional
chat_context=None, #optional
)
```

In principle, this method can be used as a drop-in replacement for `openai.ChatCompletion.create`. However, there are a few differences:

- DingoWrapper does not support most of the optional hyperparameters of the `openai.ChatCompletion.create` method (except `temperature`);
- DingoWrapper has an additional (optional) `chat_context` parameter that can be used to pass the global context of the conversation;

Example:

```python
# openai.ChatCompletion
r = openai.ChatCompletion.create(
    messages = [{"role": "user", "content": "What is the current weather in Linz?"}],
    model = "gpt-3.5-turbo",
    temperature=0.0
)

# DingoWrapper
r = wrapped_agent.chat_completion(
    messages = [{"role": "user", "content": "What is the current weather in Linz?"}],
    model = "gpt-3.5-turbo",
    temperature=0.0
)
```

The `DingoWrapper` can also be used to run a web server (also compatible with the OpenAI API). The server can be started using the `serve` method.

The `serve` method requires additional dependencies:

```bash
pip install agent_dingo[server]
```

```python
wrapped_agent.serve(port=8080, host="0.0.0.0", threads=4)
```

Once the server has started, it can be accessed using, for example, the `openai` Python package.

```python
# client.py
import openai

openai.api_base = "http://localhost:8080"

r = openai.ChatCompletion.create(
model = "gpt-3.5-turbo",
messages = [{"role": "user", "content": "What is the temperature in Linz ?"}],
temperature=0.0,
)

print(r)
```

Response:

```json
{
  "choices": [
    {
      "finish_reason": "stop",
      "index": 0,
      "message": {
        "content": "The current temperature in Linz is 25\u00b0C and it is sunny.",
        "role": "assistant"
      }
    }
  ],
  "created": 1692537919,
  "id": "chatcmpl-d6a9d6cc-7a26-41d5-a4a6-2c737b652f4b",
  "model": "dingo-gpt-3.5-turbo-0613",
  "object": "chat.completion",
  "usage": {
    "completion_tokens": 32,
    "prompt_tokens": 318,
    "total_tokens": 350
  }
}
```

_Note: the "usage" metric accumulates the number of tokens used for all intermediate function calls during the conversation._

### LangChain Tools 🦜️🔗

It is possible to convert [Langchain Tools](https://python.langchain.com/docs/modules/agents/tools/) into function descriptors in order to register them with Dingo. The converter can be used as follows:

1. Install the `langchain` extra:

```bash
pip install agent_dingo[langchain]
```

2. Define the tool; we will use the Wikipedia tool as an example:

```python
from langchain.tools.wikipedia.tool import WikipediaQueryRun
from langchain.utilities.wikipedia import WikipediaAPIWrapper
tool = WikipediaQueryRun(api_wrapper = WikipediaAPIWrapper())
```

Please refer to the [LangChain documentation](https://api.python.langchain.com/en/latest/api_reference.html#module-langchain.tools) for more details on how to define the tools.

3. Convert the tool into a function descriptor and register:

```python
from agent_dingo.langchain import convert_langchain_tool
descriptor = convert_langchain_tool(tool)
agent.register_descriptor(descriptor)
```

4. Run the conversation:

```python
# The agent will query Wikipedia to obtain the answer.
agent.chat("What is LangChain according to Wikipedia? Explain in one sentence.")

# > According to Wikipedia, LangChain is a framework designed to simplify the creation of applications using large language models (LLMs), with use-cases including document analysis and summarization, chatbots, and code analysis.
```

In comparison, when we try to query ChatGPT directly with the same question, we get the following hallucinated response (since it does not have access to the relevant up-to-date information):

```python
# > LangChain is a blockchain-based platform that aims to provide language learning services and connect language learners with native speakers for real-time practice and feedback.
```

_Note: some of the tools might be incompatible with (or simply unsuitable for) Dingo. We do not guarantee that all of the tools will work out of the box._
4 changes: 2 additions & 2 deletions agent_dingo/__init__.py
@@ -1,4 +1,4 @@
from agent_dingo.agent import AgentDingo

__version__ = '0.1.0rc1'
__author__ = 'Oleh Kostromin, Iryna Kondrashchenko'
__version__ = "0.1.0"
__author__ = "Oleh Kostromin, Iryna Kondrashchenko"
22 changes: 22 additions & 0 deletions agent_dingo/agent.py
@@ -4,6 +4,9 @@
from agent_dingo.context import ChatContext
from agent_dingo.chat import send_message
from agent_dingo.docgen import generate_docstring
from agent_dingo.usage import UsageMeter
from agent_dingo.function_descriptor import FunctionDescriptor
from dataclasses import asdict
import json
import os

@@ -99,6 +102,23 @@ def _is_codegen_allowed(self) -> bool:
            return bool(os.getenv("DINGO_ALLOW_CODEGEN", True))
        return self._allow_codegen

    def register_descriptor(self, descriptor: FunctionDescriptor) -> None:
        """Registers a function descriptor with the agent.

        Parameters
        ----------
        descriptor : FunctionDescriptor
            The function descriptor to register.
        """
        if not isinstance(descriptor, FunctionDescriptor):
            raise ValueError("descriptor must be a FunctionDescriptor")
        self._registry.add(
            name=descriptor.name,
            func=descriptor.func,
            json_repr=descriptor.json_repr,
            requires_context=descriptor.requires_context,
        )

    def register_function(self, func: Callable) -> None:
        """Registers a function with the agent.
@@ -152,6 +172,7 @@ def chat(
        temperature: float = 1.0,
        max_function_calls: int = 10,
        before_function_call: Callable = None,
        usage_meter: Optional[UsageMeter] = None,
    ) -> Tuple[str, List[dict]]:
        """Sends a message to the LLM and returns the response. Calls functions if the LLM requests it.
@@ -190,6 +211,7 @@ def chat(
                model=model,
                functions=available_functions_i,
                temperature=temperature,
                usage_meter=usage_meter,
            )
            if response.get("function_call"):
                function_name = response["function_call"]["name"]
8 changes: 7 additions & 1 deletion agent_dingo/chat.py
@@ -1,4 +1,5 @@
from typing import Optional, List
from typing import Optional, List, Callable
from agent_dingo.usage import UsageMeter
import openai

from tenacity import retry, stop_after_attempt, wait_fixed
@@ -10,6 +11,7 @@ def send_message(
model: str = "gpt-3.5-turbo-0613",
functions: Optional[List] = None,
temperature: float = 1.0,
usage_meter: Optional[UsageMeter] = None,
) -> dict:
"""Sends messages to the LLM and returns the response.
@@ -23,6 +25,8 @@ def send_message(
        List of functions to use, by default None
    temperature : float, optional
        Temperature to use, by default 1.
    usage_meter : UsageMeter, optional
        Usage meter used to accumulate the token usage, by default None

    Returns
    -------
@@ -36,4 +40,6 @@
    response = openai.ChatCompletion.create(
        model=model, messages=messages, temperature=temperature, **f
    )
    if usage_meter:
        usage_meter.log_raw(response)
    return response["choices"][0]["message"]
10 changes: 10 additions & 0 deletions agent_dingo/function_descriptor.py
@@ -0,0 +1,10 @@
from dataclasses import dataclass
from typing import Callable


@dataclass
class FunctionDescriptor:
    name: str
    func: Callable
    json_repr: dict
    requires_context: bool