Welcome to Hello MCP, a Node.js project that provides tools to interact with the Star Wars API (SWAPI). 🚀 Whether you're a Star Wars fan or a developer looking to explore the galaxy far, far away, this project has you covered! 🌠
This is a server and client implementation of the Model Context Protocol (MCP). It wraps the swapi.dev API.
- 🔍 **Character Height Comparison**: Find out if a Star Wars character is shorter or taller than a stormtrooper.
- 🌍 **Planet Information**: Retrieve details about a Star Wars planet, including gravity and population.
- 🌍🌍🌍 **Planets Information**: Retrieve details about planets in Star Wars.
- ⚡ **Fast and Easy to Use**: Built with Node.js for quick and efficient API interactions.
Follow these steps to get up and running:
- **Clone the Repository**

  ```bash
  git clone https://github.com/your-username/hello-mcp.git
  cd hello-mcp
  ```

- **Install Dependencies**

  Make sure you have Node.js installed, then run:

  ```bash
  npm install
  ```

- **Run the Project**

  Start the server with:

  ```bash
  npm run start:server
  ```

  Start the client with:

  ```bash
  npm run start:client
  ```
The client runs code like the following:

```ts
const query = "Is Luke shorter than a stormtrooper?";
const result = await client.processQuery(query);
console.log("Final result: \n ", result);
```

The preceding code triggers the tool defined in `/src/tools/shorter.ts`.
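The comparison logic behind that tool can be sketched as a small pure function. This is a simplified illustration only: the function name, the 183 cm stormtrooper reference height, and the hard-coded character height are assumptions, not the actual code in `shorter.ts` (which fetches the height from SWAPI).

```typescript
// Sketch: compare a character's height (in cm, as SWAPI reports it)
// against a stormtrooper. 183 cm is an assumed reference value.
const STORMTROOPER_HEIGHT_CM = 183;

function compareToStormtrooper(name: string, heightCm: number): string {
  if (heightCm < STORMTROOPER_HEIGHT_CM) {
    return `${name} is shorter than a stormtrooper.`;
  }
  if (heightCm > STORMTROOPER_HEIGHT_CM) {
    return `${name} is taller than a stormtrooper.`;
  }
  return `${name} is exactly as tall as a stormtrooper.`;
}

// SWAPI reports Luke Skywalker's height as "172" (cm).
console.log(compareToStormtrooper("Luke Skywalker", 172));
// "Luke Skywalker is shorter than a stormtrooper."
```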
If starting the server with `npm run start:server`, the client will contain the following tools:

- **Character Tool**
  - Endpoint: `/tools/call_swapi_character`
  - Input: `{ "name": "Luke Skywalker" }`
  - Output: `"Luke Skywalker is shorter than a stormtrooper."`
- **Planet Tool**
  - Endpoint: `/tools/call_swapi_planet`
  - Input: `{ "name": "Tatooine" }`
  - Output: `"Tatooine has gravity 1 standard with a population of 200000."`
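The planet tool's output string can be built from the fields that swapi.dev returns for a planet. The sketch below is a simplified illustration; the `Planet` type and `describePlanet` function are assumed names, not the project's actual code.

```typescript
// Sketch: format a planet description from a subset of the fields
// returned by the swapi.dev /planets endpoint (all values are strings).
interface Planet {
  name: string;
  gravity: string;     // e.g. "1 standard"
  population: string;  // e.g. "200000"
}

function describePlanet(planet: Planet): string {
  return `${planet.name} has gravity ${planet.gravity} with a population of ${planet.population}.`;
}

console.log(describePlanet({ name: "Tatooine", gravity: "1 standard", population: "200000" }));
// "Tatooine has gravity 1 standard with a population of 200000."
```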
If starting the server with `npm run start:chuck`, the client will contain the following tools:

- **Chuck Norris Random Joke Tool**
  - Endpoint: `/tools/random-joke`
  - Input: `{}`
  - Output: `"Chuck Norris joke: Chuck Norris can divide by zero."`
- **Chuck Norris Random Joke by Category Tool**
  - Endpoint: `/tools/random-joke-by-category`
  - Input: `{ "category": "Animal" }`
  - Output: `"Chuck Norris joke: Chuck Norris can divide by zero."`
The client negotiates with the server to discover its available tools. It then decides whether to call a tool, fall back to a generic LLM response, or combine both when available.
A tool is created by calling `server.tool(<toolname>, <description>, { /* implementation */ })`.
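To illustrate this registration pattern, here is a toy stand-in for the server object. This is not the real MCP SDK API; the class, method shapes, and the example "greet" tool are assumptions made purely for illustration.

```typescript
// Toy stand-in for the server's tool registry, illustrating the
// server.tool(name, description, implementation) registration pattern.
// NOT the real MCP SDK API.
type ToolHandler = (input: Record<string, unknown>) => Promise<string>;

class ToyServer {
  private tools = new Map<string, { description: string; handler: ToolHandler }>();

  // Register a tool under a name with a human-readable description.
  tool(name: string, description: string, handler: ToolHandler): void {
    this.tools.set(name, { description, handler });
  }

  // Invoke a previously registered tool by name.
  async call(name: string, input: Record<string, unknown>): Promise<string> {
    const entry = this.tools.get(name);
    if (!entry) throw new Error(`Unknown tool: ${name}`);
    return entry.handler(input);
  }
}

const server = new ToyServer();
server.tool("greet", "greet a person by name", async (input) => {
  return `Hello, ${input.name}!`;
});

server.call("greet", { name: "Luke" }).then((result) => console.log(result));
// "Hello, Luke!"
```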
- Locate one of the tools in `src/tools` and create a new one like so:
  - Create `src/tools/exampleTool.ts`.
  - Add the tool to the import section in `src/tools/server.ts` to register it.
  - Write a query in `src/client.ts` that semantically matches the description you've created for the tool, for example:
    - query: `"add 1 to 2"`
    - tool description: `"adding two numbers"`
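A minimal version of such an example tool's logic could look like the sketch below. The file contents and function name are hypothetical, chosen only to match the "adding two numbers" description above.

```typescript
// Hypothetical src/tools/exampleTool.ts: the logic behind a tool whose
// description is "adding two numbers", matching the query "add 1 to 2".
export function addTwoNumbers(a: number, b: number): string {
  return `The result of adding ${a} and ${b} is ${a + b}.`;
}

console.log(addTwoNumbers(1, 2));
// "The result of adding 1 and 2 is 3."
```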
Contributions are welcome! Feel free to open issues or submit pull requests to improve the project.
This project is licensed under the MIT License.
May the Force be with you! 🌟
For Codespaces, use `.vscode/mcp.json`:

```json
{
  "inputs": [],
  "servers": {
    "hello-mcp": {
      "command": "sh",
      "args": [
        "-c",
        "cd /workspaces/hello-mcp && ./node_modules/.bin/tsc && node ./build/server.js"
      ]
    }
  }
}
```

Thank you, Wassim Chegham, for the original code. Also please check out his repo at https://github.com/manekinekko/openai-mcp-example