arun-esh/mcp-server

MCP Server for Gemini CLI

This repository contains an MCP (Model Context Protocol) server built with fastmcp that extends the capabilities of a Gemini CLI agent. It provides specialized tools and prompts that allow the Gemini agent to interact with external services, such as YouTube for transcript retrieval, and to process information into structured formats.

Features

The server exposes the following functionalities to a connected Gemini CLI agent:

1. get_youtube_transcript Tool

  • Purpose: Fetches the full transcript of a given YouTube video.
  • Parameters:
    • url_or_id (string): The URL or unique ID of the YouTube video.
    • include_timestamps (boolean, optional, default: False): If True, timestamps are included with each line of the transcript.
  • Output: Returns a JSON string containing the video_id, language, number of snippets, and the transcript content.
  • Error Handling: Gracefully handles cases where transcripts are disabled, not found, or the video is unavailable.
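A rough sketch of how the `url_or_id` parameter might be normalized and the result serialized. The `extract_video_id` and `build_result` helpers and the exact JSON field names are illustrative assumptions, not the server's actual implementation:

```python
import json
from urllib.parse import urlparse, parse_qs

def extract_video_id(url_or_id: str) -> str:
    """Accept a bare video ID, a youtu.be short link, or a full watch URL."""
    parsed = urlparse(url_or_id)
    if parsed.hostname == "youtu.be":
        return parsed.path.lstrip("/")
    if parsed.hostname and "youtube.com" in parsed.hostname:
        return parse_qs(parsed.query).get("v", [""])[0]
    return url_or_id  # assume it is already a bare video ID

def build_result(video_id: str, snippets: list[dict], include_timestamps: bool = False) -> str:
    """Serialize transcript snippets into a JSON string like the one described above."""
    if include_timestamps:
        lines = [f"[{s['start']:.1f}s] {s['text']}" for s in snippets]
    else:
        lines = [s["text"] for s in snippets]
    return json.dumps({
        "video_id": video_id,
        "language": "en",
        "snippet_count": len(snippets),
        "transcript": "\n".join(lines),
    })
```

In the real tool, the snippet list would come from a transcript-fetching library, wrapped in error handling for disabled or missing transcripts.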

2. youtube_transcript_to_notes Prompt

  • Purpose: A detailed prompt designed to guide an AI agent in converting YouTube video transcripts into comprehensive, Markdown-formatted study notes.
  • Instructions for AI: The prompt provides a structured approach for the AI to:
    • Infer a course name from the video title/description.
    • Save notes to a specific file path (./[course_name]/index.md), ensuring existing notes are merged and enriched, not overwritten.
    • Focus on explaining concepts thoroughly, including definitions, key ideas, examples, and real-world applications.
    • Utilize Markdown for clear formatting (lists, tables, diagrams).
    • Conclude with a "Key Takeaways" section.
    • Return the entire updated Markdown file.
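On the agent's side, the merge-and-enrich behavior the prompt asks for could look roughly like this. The path convention follows the prompt; the `merge_notes` helper and its naive append strategy are illustrative assumptions (a real agent would merge content semantically):

```python
from pathlib import Path

def merge_notes(course_name: str, new_section: str, root: Path = Path(".")) -> str:
    """Append a new Markdown section to ./[course_name]/index.md, creating it if absent.
    Appending is a deliberately simple stand-in for the prompt's semantic merge."""
    notes_file = root / course_name / "index.md"
    notes_file.parent.mkdir(parents=True, exist_ok=True)
    existing = notes_file.read_text() if notes_file.exists() else f"# {course_name}\n"
    updated = existing.rstrip() + "\n\n" + new_section.strip() + "\n"
    notes_file.write_text(updated)
    return updated  # the prompt asks the agent to return the entire updated file
```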

3. search_google_prompt Prompt

  • Purpose: A prompt that instructs an AI agent to use an external search_google_tool (assumed to be available to the AI) to perform web searches and format the results.
  • Instructions for AI: The prompt guides the AI to:
    • Execute a search using the search_google_tool with a given query.
    • Process the structured JSON results from the search API.
    • Present the results in a well-formatted Markdown list.
    • Save the formatted output to a Markdown file named /opt/custom/arun-esh.github.io/docs/gemini/query_{query}_time_stamp.md.
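The formatting and file-naming steps above could be sketched as follows. The `title`/`link`/`snippet` field names are assumptions about the search API's JSON, and since the prompt leaves the timestamp format unspecified, the `strftime` pattern here is a guess:

```python
from datetime import datetime
from pathlib import Path

def format_results(results: list[dict]) -> str:
    """Render structured search results as a Markdown list.
    The title/link/snippet keys are assumed, not confirmed by the search API."""
    return "\n".join(f"- [{r['title']}]({r['link']}): {r['snippet']}" for r in results)

def output_path(query: str, base: str = "/opt/custom/arun-esh.github.io/docs/gemini") -> Path:
    # Timestamp format is unspecified in the prompt; %Y%m%d_%H%M%S is an assumption.
    stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    safe_query = query.replace(" ", "_")
    return Path(base) / f"query_{safe_query}_{stamp}.md"

md = format_results([{"title": "Example", "link": "https://example.com", "snippet": "A demo."}])
```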

Prerequisites

  • Python 3.12+
  • pip (Python package installer)
  • (Optional) Docker for containerized deployment

Setup

You can set up and run the MCP server either locally or using Docker.

Local Setup

  1. Clone the repository:

    git clone https://github.com/test-user/mcp-server.git # Replace with actual repo URL if different
    cd mcp-server
  2. Create and activate a virtual environment:

    python3 -m venv .venv
    source .venv/bin/activate
  3. Install dependencies:

    pip install -r requirements.txt
  4. Run the server:

    python main.py

    The server will start and be accessible at http://0.0.0.0:8000/mcp.

Docker Setup

  1. Build the Docker image:

    docker build -t mcp-server-image .
  2. Run the Docker container:

    docker run -d -p 8081:8000 --name mcp-server-container mcp-server-image

    This command runs the server in detached mode, mapping the container's port 8000 to port 8081 on your host machine. The server will be accessible at http://localhost:8081/mcp.

Connecting to Gemini CLI

To enable your Gemini CLI agent to use this MCP server, you need to configure its GEMINI_MCP_SERVER_URL environment variable.

  1. Install Gemini CLI: If you haven't already, install the Gemini CLI (refer to its official documentation for the most up-to-date installation instructions).

  2. Configure the server URL: Set the GEMINI_MCP_SERVER_URL environment variable to point to your running MCP server.

    • For local setup:
      export GEMINI_MCP_SERVER_URL="http://localhost:8000/mcp"
    • For Docker setup:
      export GEMINI_MCP_SERVER_URL="http://localhost:8081/mcp"

    You can also add this line to your shell's profile file (e.g., .bashrc, .zshrc) to make it persistent.

    Alternatively, you can configure your Gemini CLI using a settings.json file (typically located in ~/.gemini/settings.json or a similar configuration directory). This provides a more persistent way to manage your MCP server connections.

    Example settings.json:

    {
      "security": {
        "auth": {
          "selectedType": "oauth-personal"
        }
      },
      "mcpServers": {
        "gemini-mcp-server": {
          "httpUrl": "http://127.0.0.1:8081/mcp/"
        }
      }
    }

Usage Examples (Conceptual)

Once connected, your Gemini CLI agent can leverage the server's functionalities. For instance:

  • Fetching a YouTube transcript: An AI might call the get_youtube_transcript tool with a video URL.
  • Generating study notes: An AI could be prompted with youtube_transcript_to_notes("https://www.youtube.com/watch?v=example") to generate notes from a video.
  • Performing a Google search: An AI could use search_google_prompt("latest AI research") to get formatted search results.

Refer to the Gemini CLI documentation for specific commands and interaction patterns with MCP servers.
