A Model Context Protocol (MCP) implementation that enables AI assistants to save conversation memories to Google Docs.
This server implements the Model Context Protocol (MCP) to provide AI assistants with memory capabilities. When integrated with compatible language models, it allows them to save important information from conversations, creating a persistent memory system stored in Google Docs.
- `remember_this` tool: Allows the AI to store conversation summaries in a Google Doc
- Automatic timestamp generation for each memory entry
- Secure authentication with Google Docs API
- Cross-platform memory sharing between different AI assistants
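The timestamped entries can be sketched as follows. Both helpers are illustrative, not part of the server's actual code, and the exact entry layout is an assumption; the request body uses the Google Docs API's `batchUpdate` format for appending text.

```python
from datetime import datetime, timezone

def format_memory_entry(summary, now=None):
    # Hypothetical helper: prefix each memory with a UTC timestamp so entries
    # in the Google Doc stay chronologically ordered and easy to scan.
    now = now or datetime.now(timezone.utc)
    return f"[{now.strftime('%Y-%m-%d %H:%M UTC')}] {summary}\n"

def build_append_request(entry):
    # Docs API batchUpdate body that appends text at the end of the document
    # body (endOfSegmentLocation with an empty segmentId targets the body).
    return {
        "requests": [
            {
                "insertText": {
                    "endOfSegmentLocation": {"segmentId": ""},
                    "text": entry,
                }
            }
        ]
    }
```

The resulting dict would be passed to `documents().batchUpdate(documentId=..., body=...)` on an authenticated Docs client.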
- Python 3.8+
- Google Cloud Service Account with Google Docs API access
- Required Python packages:
- httpx
- python-dotenv
- google-auth
- google-api-python-client
- mcp (Model Context Protocol)
- Environment variables:
  - `GOOGLE_CREDENTIALS_PATH`: Path to the service account credentials file
  - `DOCUMENT_ID`: The ID of the Google Doc to use as memory storage
- Clone the repository
- Install dependencies:
  `pip install -r requirements.txt` (create this file with the dependencies listed above)
- Set up environment variables (see below)
Create a .env file with the following:
GOOGLE_CREDENTIALS_PATH=path/to/your/credentials.json
DOCUMENT_ID=your_google_doc_id
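A defensive way to load these settings is to validate both up front. This is an illustrative helper, not the server's actual startup code:

```python
import os

REQUIRED_VARS = ("GOOGLE_CREDENTIALS_PATH", "DOCUMENT_ID")

def load_config(env=None):
    # Fail fast with a clear message if either setting is missing, instead of
    # surfacing an opaque error on the first Google Docs API call.
    env = os.environ if env is None else env
    missing = [name for name in REQUIRED_VARS if not env.get(name)]
    if missing:
        raise RuntimeError("Missing environment variables: " + ", ".join(missing))
    return {name: env[name] for name in REQUIRED_VARS}
```

With python-dotenv, call `load_dotenv()` before `load_config()` so values from the `.env` file reach `os.environ`.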
python server.py
from memory import main

if __name__ == "__main__":
    main()

The server communicates via stdin/stdout using the MCP protocol.
This server is designed to be used with AI models that support the Model Context Protocol. When properly integrated, the AI can use the remember_this tool to save important information from conversations.
You can use the same Google Doc as a memory repository across different AI assistants to maintain context continuity. Here's how to attach the same Google Doc to different platforms:
- Create a new chat or open an existing one on chat.openai.com
- Access GPT settings and enable the "Web Browsing" plugin
- Share your Google Doc with the appropriate permissions (public or specific access)
- Provide the Google Doc URL to ChatGPT in your conversation
- Ask ChatGPT to reference the document for context in future interactions
- Start a new conversation in Claude or continue an existing thread
- Upload your Google Doc directly to the conversation using the attachment feature
- Alternatively, share the Google Doc link with Claude
- Ask Claude to review the document for context
- For Claude projects (Anthropic's persistent workspace feature), attach the document to the project for continuous access
- Open Gemini chat at gemini.google.com
- Use the Google account that has access to your memory Google Doc
- Reference the document directly since Gemini has native integration with Google Docs
- You can ask Gemini to "use the document [document name] for context"
- For new conversations, explicitly ask Gemini to reference the same document
- Use a consistent format in your Google Doc for easy parsing by different AI models
- Include clear section headers and timestamps
- When switching platforms, explicitly instruct the AI to reference the shared memory document
- Consider creating a table of contents or index at the top of the document
- Update the `DOCUMENT_ID` environment variable to point to the same Google Doc across all instances
The server implements the following MCP endpoints:
- `list_tools`: Returns available tools (`remember_this`)
- `call_tool`: Executes the requested tool function with the provided arguments
The Model Context Protocol enables:
- Standardized communication between language models and external tools
- Tool discovery through the `list_tools` endpoint
- Tool execution through the `call_tool` endpoint
- Structured input/output through JSON schema validation
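For the `remember_this` tool, the input schema advertised during tool discovery might look like the following sketch; the property name and description are assumptions, not copied from the server.

```python
# Hypothetical JSON schema for remember_this; MCP clients validate the
# arguments of each tool call against the schema the server advertises.
REMEMBER_THIS_SCHEMA = {
    "type": "object",
    "properties": {
        "summary": {
            "type": "string",
            "description": "Conversation summary to append to the memory document",
        }
    },
    "required": ["summary"],
}
```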
The server uses the MCP stdio communication method for transmitting requests and responses.
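Over stdio, each MCP message is a JSON-RPC 2.0 object. A tool invocation of `remember_this` would look roughly like this; the `id` and the argument value are illustrative, and exact message framing depends on the SDK and transport version.

```python
import json

# An MCP tools/call request as the server might receive it on stdin.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "remember_this",
        "arguments": {"summary": "User prefers concise answers."},
    },
}

wire = json.dumps(request)   # serialized for the stdio transport
decoded = json.loads(wire)   # what the server parses back
```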