
Conversation

gpeal (Contributor) commented on Oct 22, 2025

Motivation and Context

Most MCP servers return a 202 in response to notifications/initialized. However, it seems that some, such as the allflow server from openai/codex#5208, return a 204. Other clients such as Claude Code and npx @modelcontextprotocol/inspector accept 204, so it seems reasonable to accept it here as well.

As far as I can tell, the spec is ambiguous here: I don't see it explicitly state that the notification response must be a 202.
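For illustration, here is a minimal sketch of the relaxed acceptance check this change describes. This is not the actual rust-sdk code path; the helper name and the use of the `http` crate's `StatusCode` constants are assumptions made for the example.

```rust
use http::StatusCode;

/// Hypothetical helper: treat both 202 Accepted and 204 No Content as the
/// server successfully accepting a JSON-RPC notification such as
/// notifications/initialized. (The real transport code may differ.)
fn notification_accepted(status: StatusCode) -> bool {
    matches!(status, StatusCode::ACCEPTED | StatusCode::NO_CONTENT)
}

fn main() {
    assert!(notification_accepted(StatusCode::ACCEPTED));    // 202
    assert!(notification_accepted(StatusCode::NO_CONTENT));  // 204
    assert!(!notification_accepted(StatusCode::BAD_REQUEST)); // anything else is still an error
    println!("202 and 204 are both treated as accepted");
}
```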

How Has This Been Tested?

Tested against the failing server from openai/codex#5208. It now works.
[Screenshot: CleanShot 2025-10-22 at 11 42 03]

Breaking Changes

None

Types of changes

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to change)
  • Documentation update

Checklist

  • I have read the MCP Documentation
  • My code follows the repository's style guidelines
  • New and existing tests pass locally
  • I have added appropriate error handling
  • I have added or updated documentation as needed

github-actions bot added the T-core (Core library changes) and T-transport (Transport layer changes) labels on Oct 22, 2025
alexhancock merged commit 3822d9d into modelcontextprotocol:main on Oct 22, 2025
11 checks passed
github-actions bot mentioned this pull request on Oct 22, 2025
gpeal added a commit to openai/codex that referenced this pull request on Oct 24, 2025:
Picks up modelcontextprotocol/rust-sdk#497, which fixes #5208 by allowing a 204 response to MCP initialize notifications instead of only 202.