Merged
3 changes: 3 additions & 0 deletions agents/web_search_agent/.env.example
@@ -0,0 +1,3 @@
TAVILY_API_KEY=
OPENAI_API_KEY=
AGENTOPS_API_KEY=
10 changes: 10 additions & 0 deletions agents/web_search_agent/.gitignore
@@ -0,0 +1,10 @@
.env
.venv
.idea
.vscode
.DS_Store
**/__pycache__
swarmzero-data
*.db
*.log
*.log.*
71 changes: 71 additions & 0 deletions agents/web_search_agent/README.md
@@ -0,0 +1,71 @@
# Web Search Agent

A web search agent built with the SwarmZero framework that provides intelligent web search capabilities.

## Description

This agent uses the Tavily API to perform web searches and extract page content, and is built on top of the SwarmZero framework, which adds AI-powered processing of the results.

## Prerequisites

- Python 3.11 or higher
- Poetry package manager
- Tavily API key
- OpenAI API key
- AgentOps API key

## Installation

1. Clone the repository:
```bash
git clone https://github.com/swarmzero/examples.git
cd examples/agents/web_search_agent
```

2. Install dependencies using Poetry:
```bash
poetry install --no-root
```

3. Set up environment variables:
Create a `.env` file in the project root (see `.env.example`) and add your API keys:
```
TAVILY_API_KEY=your_tavily_api_key_here
AGENTOPS_API_KEY=your_agentops_api_key_here
OPENAI_API_KEY=your_openai_api_key_here
```

## Usage

1. Activate the Poetry shell:
```bash
poetry shell
```

2. Run the agent:
```bash
poetry run python main.py
```

3. Send a message to the agent:
```bash
curl -X 'POST' \
'http://localhost:8000/api/v1/chat' \
-H 'accept: application/json' \
-H 'Content-Type: multipart/form-data' \
-F 'user_id=test_user' \
-F 'session_id=test_web_search_agent' \
-F 'chat_data={"messages":[{"role":"user","content":"what is swarmzero.ai about?"}]}'
```
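The same request can be sent from Python using only the standard library. A minimal sketch, assuming the agent started by `poetry run python main.py` is listening on `localhost:8000`; the `build_chat_data` and `chat` helpers are illustrative, not part of this PR. It sends URL-encoded form fields, which FastAPI's form parsing accepts alongside the multipart encoding shown in the curl example:

```python
import json
import urllib.parse
import urllib.request


def build_chat_data(message: str) -> str:
    # JSON payload for the chat_data form field, matching the curl example.
    return json.dumps({"messages": [{"role": "user", "content": message}]})


def chat(message: str, base_url: str = "http://localhost:8000") -> str:
    form = urllib.parse.urlencode({
        "user_id": "test_user",
        "session_id": "test_web_search_agent",
        "chat_data": build_chat_data(message),
    }).encode()
    # urllib sets Content-Type: application/x-www-form-urlencoded by default.
    request = urllib.request.Request(f"{base_url}/api/v1/chat", data=form)
    with urllib.request.urlopen(request) as response:
        return response.read().decode()


# Example: print(chat("what is swarmzero.ai about?"))
```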

4. AgentOps will automatically capture the session:
- View the [agentops.log](agentops.log) file
- See the [AgentOps Dashboard](https://app.agentops.ai/drilldown)

## Dependencies

- `swarmzero`: Main framework for agent development
- `agentops`: Agent operations and monitoring
- `tavily-python`: Web search API client

## Learn more
Visit [SwarmZero](https://swarmzero.ai) to learn more about the SwarmZero framework.
45 changes: 45 additions & 0 deletions agents/web_search_agent/main.py
@@ -0,0 +1,45 @@
import os

import agentops
from dotenv import load_dotenv
from swarmzero import Agent
from tavily import TavilyClient

load_dotenv()
agentops.init(os.getenv("AGENTOPS_API_KEY"))
tavily_client = TavilyClient(api_key=os.getenv("TAVILY_API_KEY"))


async def web_search(query: str) -> list[dict]:
    """Search the web with Tavily and return the top three results."""
    response = tavily_client.search(query)
    results = []
    for result in response["results"][:3]:
        results.append({"title": result["title"], "url": result["url"], "content": result["content"]})
    return results


async def extract_from_urls(urls: list[str]) -> list[dict]:
    """Extract raw page content from the given URLs with Tavily."""
    response = tavily_client.extract(urls=urls)

    if response["failed_results"]:
        print(f"Failed to extract from {response['failed_results']}")

    results = []
    for result in response["results"]:
        results.append({"url": result["url"], "raw_content": result["raw_content"]})

    return results


if __name__ == "__main__":
my_agent = Agent(
name="workflow-assistant",
functions=[
web_search,
extract_from_urls,
],
config_path="./swarmzero_config.toml",
instruction="You are a helpful assistant that can search the web and extract information from a given URL.",
)

my_agent.run() # see agent API at localhost:8000/docs
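One caveat in `main.py`: the tools are declared `async` but call the synchronous `TavilyClient`, so a slow search blocks the event loop. A sketch of one way around this, using `asyncio.to_thread` and taking the client as a parameter (the `web_search_nonblocking` name is illustrative, not part of this PR):

```python
import asyncio


async def web_search_nonblocking(client, query: str) -> list[dict]:
    # client.search is blocking; run it in a worker thread so the
    # agent's event loop stays responsive while Tavily responds.
    response = await asyncio.to_thread(client.search, query)
    return [
        {"title": r["title"], "url": r["url"], "content": r["content"]}
        for r in response["results"][:3]
    ]
```

Taking the client as a parameter also makes the tool easy to unit-test with a stub in place of a live Tavily connection.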