Fix typo in websockets notebook #2287

Merged · 3 commits · Apr 5, 2024
19 changes: 10 additions & 9 deletions notebook/agentchat_websockets.ipynb
ekzhu marked this conversation as resolved.
@@ -120,20 +120,22 @@
"\n",
" print(\" - on_connect(): Receiving message from client.\", flush=True)\n",
"\n",
" # 1. Receive Initial Message\n",
" initial_msg = iostream.input()\n",
"\n",
" # 2. Configure the LLM\n",
" llm_config = {\n",
" \"config_list\": config_list,\n",
" \"stream\": True,\n",
" }\n",
"\n",
" # 3. Instantiate ConversableAgent and UserProxyAgent\n",
" agent = autogen.ConversableAgent(\n",
" name=\"chatbot\",\n",
" system_message=\"Complete a task given to you and reply TERMINATE when the task is done. If asked about the weather, use tool weather_forecast(city) to get the weather forecast for a city.\",\n",
" llm_config=llm_config,\n",
" )\n",
"\n",
" # create a UserProxyAgent instance named \"user_proxy\"\n",
" user_proxy = autogen.UserProxyAgent(\n",
" name=\"user_proxy\",\n",
" system_message=\"A proxy for the user.\",\n",
@@ -143,12 +145,13 @@
" code_execution_config=False,\n",
" )\n",
"\n",
" # 4. Define Agent-specific Functions\n",
" @user_proxy.register_for_execution()\n",
" @agent.register_for_llm(description=\"Weather forecats for a city\")\n",
" def weather_forecast(city: str) -> str:\n",
" return f\"The weather forecast for {city} at {datetime.now()} is sunny.\"\n",
"\n",
" # we will use a temporary directory as the cache path root to ensure fresh completion each time\n",
" # 5. Initiate conversation\n",
" print(\n",
" f\" - on_connect(): Initiating chat with agent {agent} using message '{initial_msg}'\",\n",
" flush=True,\n",
@@ -166,15 +169,13 @@
"source": [
"Here's an explanation on how a typical `on_connect` function such as the one in the example above is defined:\n",
"\n",
"1. **Receiving Initial Message**: Immediately after establishing a connection, receive an initial message from the client. This step is crucial for understanding the client's request or initiating the conversation flow.\n",
"1. **Receive Initial Message**: Immediately after establishing a connection, receive an initial message from the client. This step is crucial for understanding the client's request or initiating the conversation flow.\n",
"\n",
"2. **Receiving Initial Message**: Immediately after establishing a connection, receive an initial message from the client. This step is crucial for understanding the client's request or initiating the conversation flow.\n",
"2. **Configure the LLM**: Define the configuration for your large language model (LLM), specifying the list of configurations and the streaming capability. This configuration will be used to tailor the behavior of your conversational agent.\n",
"\n",
"3. **Configure the LLM**: Define the configuration for your large language model (LLM), specifying the list of configurations and the streaming capability. This configuration will be used to tailor the behavior of your conversational agent.\n",
"3. **Instantiate ConversableAgent and UserProxyAgent**: Create an instance of ConversableAgent with a specific system message and the LLM configuration. Similarly, create a UserProxyAgent instance, defining its termination condition, human input mode, and other relevant parameters.\n",
"\n",
"4. **Instantiate ConversableAgent and UserProxyAgent**: Create an instance of ConversableAgent with a specific system message and the LLM configuration. Similarly, create a UserProxyAgent instance, defining its termination condition, human input mode, and other relevant parameters.\n",
"\n",
"5. **Define Agent-specific Functions**: If your conversable agent requires executing specific tasks, such as fetching a weather forecast in the example below, define these functions within the on_connect scope. Decorate these functions accordingly to link them with your agents.\n",
"4. **Define Agent-specific Functions**: If your conversable agent requires executing specific tasks, such as fetching a weather forecast in the example below, define these functions within the on_connect scope. Decorate these functions accordingly to link them with your agents.\n",
"\n",
"5. **Initiate Conversation**: Finally, use the `initiate_chat` method of your `UserProxyAgent` to start the interaction with the conversable agent, passing the initial message and a cache mechanism for efficiency."
]
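The five-step `on_connect` pattern described above can be sketched in plain Python. Note this is a minimal illustration of the step sequence only: `StubAgent`, `register_tool`, and `receive` are hypothetical stand-ins, not the autogen API used in the notebook.

```python
from datetime import datetime


class StubAgent:
    """Illustrative stand-in for a conversable agent (not the autogen class)."""

    def __init__(self, name, llm_config):
        self.name = name
        self.llm_config = llm_config
        self.tools = {}

    def register_tool(self, description):
        # Decorator that links a function to the agent, analogous to the
        # register_for_llm / register_for_execution decorators in the notebook.
        def decorator(fn):
            self.tools[fn.__name__] = (description, fn)
            return fn
        return decorator


def on_connect(receive, config_list):
    # 1. Receive Initial Message
    initial_msg = receive()

    # 2. Configure the LLM
    llm_config = {"config_list": config_list, "stream": True}

    # 3. Instantiate the agent (the notebook also creates a user proxy)
    agent = StubAgent("chatbot", llm_config)

    # 4. Define Agent-specific Functions within the on_connect scope
    @agent.register_tool("Weather forecast for a city")
    def weather_forecast(city: str) -> str:
        return f"The weather forecast for {city} at {datetime.now()} is sunny."

    # 5. Initiate Conversation (here we simply return what would be sent)
    return agent, initial_msg


agent, msg = on_connect(lambda: "What is the weather in Paris?", config_list=[])
```

In the real notebook, step 5 calls `user_proxy.initiate_chat(...)` with the initial message; the stub simply returns the assembled state so the flow is easy to trace.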
@@ -621,7 +622,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.14"
}
},
"nbformat": 4,