This project supports multiple LLM backends. Select the appropriate client based on your desired provider.
Ensure you have uv installed before proceeding.
Each command launches a client paired with the MCP server (server.py).
**Anthropic (Claude)**

```shell
uv run python ./client.py ./server.py
```

**OpenAI**

```shell
uv run python ./client_openai.py ./server.py
```

**DeepSeek**

```shell
uv run python ./client_deepseek.py ./server.py
```

Before running, ensure the appropriate API key is set as an environment variable:
| Provider | Environment Variable |
|---|---|
| Anthropic | ANTHROPIC_API_KEY |
| OpenAI | OPENAI_API_KEY |
| DeepSeek | DEEPSEEK_API_KEY |
You can set these in a `.env` file at the project root.
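For reference, a minimal `.env` sketch with placeholder values (key names match the table above; only the key for the provider you intend to run is required):

```
# .env — placeholders only; replace with your real keys
ANTHROPIC_API_KEY=your-anthropic-key
OPENAI_API_KEY=your-openai-key
DEEPSEEK_API_KEY=your-deepseek-key
```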
For files related to Lumentum ROADM, Lumentum 400 GbE CFP2-DCO, Calient switch, or DiCon switch, please email us to request access.
If you have any questions or suggestions, feel free to open an issue on GitHub.