x-timeline-mcp is a remote, read-first MCP server for structured X (Twitter) timeline/search data retrieval in ChatGPT/OpenAI Apps analysis workflows.
V1 scope is intentionally narrow:
- Tool-first MCP surface
- Read-only operations only
- Mixed-auth capable (public/app-token for public reads, OAuth for user-scoped reads)
- Normalized outputs for downstream timeline analysis
This is a production-honest V1 scaffold, not a "fully production-ready" claim.
Implemented:
- Remote MCP HTTP endpoint with typed tools and schema validation
- Public + OAuth auth split
- OAuth Authorization Code + PKCE start/callback
- Fail-closed OAuth token verification path
- Rate-limit header capture and normalized mapping
- Durable session store abstraction with encrypted token persistence
- Postgres-backed session store as the default runtime mode
- In-memory session store for local/dev only
- Pending OAuth state persistence
- Session lifecycle states: `active`, `expired`, `revoked`, `invalid`
- Session cleanup on startup and a periodic cleanup loop
- Basic tests covering normalization, auth challenge behavior, token verification, and session lifecycle semantics
Not fully production-complete yet:
- No HA/distributed session orchestration
- No production audit sink or observability pipeline
- OAuth end-to-end integration tests against live X are not included
- Opaque access-token handling is mode-based and still depends on the chosen trust boundary
- `in_memory` mode is not durable and should be treated as dev-only
- The OAuth bridge pattern still uses `oauth_session_id` as a caller-supplied session handle; this is not a full native OAuth UX integration
All tools are read-only and annotated with `readOnlyHint: true` and `openWorldHint: true`.
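As a minimal sketch, a tool descriptor carrying these annotations might look like the following. The tool name comes from this README; the schema shape and field values are illustrative, not the server's actual definitions.

```typescript
// Hypothetical descriptor shape; only the annotation hints are taken from the README.
interface ToolAnnotations {
  readOnlyHint: boolean;
  openWorldHint: boolean;
}

interface ToolDescriptor {
  name: string;
  description: string;
  inputSchema: object;
  annotations: ToolAnnotations;
}

const lookupUsersTool: ToolDescriptor = {
  name: "x.lookup_users",
  description: "Resolve canonical user profiles for usernames and/or user IDs.",
  inputSchema: {
    type: "object",
    properties: {
      usernames: { type: "array", items: { type: "string" } },
      ids: { type: "array", items: { type: "string" } },
      oauth_session_id: { type: "string" },
    },
  },
  annotations: { readOnlyHint: true, openWorldHint: true },
};

console.log(lookupUsersTool.annotations.readOnlyHint); // true
```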
`x.lookup_users`: Use this when you need canonical user profiles for one or more X usernames and/or user IDs before timeline or search analysis.
- Auth: public app bearer token OR OAuth session
- Input: `usernames[]` and/or `ids[]`, optional `oauth_session_id`
- Output: normalized user list + source/meta/rate-limit
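As an illustrative sketch, a JSON-RPC `tools/call` body for this tool might look like the following; the argument values are examples, and the exact wire format follows the MCP transport in use.

```typescript
// Example MCP tools/call payload for x.lookup_users (values are illustrative).
const callBody = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "x.lookup_users",
    arguments: {
      usernames: ["openai"],
      // oauth_session_id is optional for this tool; omit it for public reads
    },
  },
};

console.log(callBody.params.name); // x.lookup_users
```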
`x.search_recent_posts`: Use this when you need last-7-days X search results for keywords, handles, hashtags, cashtags, or narrative phrases.
- Auth: public app bearer token OR OAuth session
- Input: `query`, `max_results`, optional cursor/time window/session
- Output: normalized timeline bundle
- Constraint: recent-search semantics only (no full archive in V1)
`x.get_user_timeline`: Use this when you need recent posts for a specific X user ID with optional reply/retweet exclusion and pagination.
- Auth: public app bearer token OR OAuth session
- Input: `user_id`, optional exclusion flags + pagination + session
- Output: normalized timeline bundle
`x.get_post_batch`: Use this when you need to hydrate a known batch of X post IDs after a search or timeline retrieval step.
- Auth: public app bearer token OR OAuth session
- Input: `post_ids[]`, optional session
- Output: normalized timeline bundle
`x.get_authenticated_user`: Use this when you need to verify which X account is linked to the current OAuth session before user-scoped timeline analysis.
- Auth: OAuth required
- Input: `oauth_session_id`
- Output: normalized user lookup result (single user expected)
`x.get_home_timeline`: Use this when you need the reverse-chronological home timeline for the OAuth-linked X account.
- Auth: OAuth required
- Input: `oauth_session_id`, optional exclusion flags + pagination
- Output: normalized timeline bundle
Use this when you need one normalized analysis bundle generated from a search, timeline, home-timeline, or post-batch retrieval flow.
- Auth: mixed depending on mode (`home_timeline` requires OAuth)
- Input: `mode` + mode-specific params
- Output: normalized timeline bundle
- Note: no AI inference in this tool; pure retrieval normalization only
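The mode split above can be sketched as a discriminated union. These type names and parameter sets are hypothetical simplifications; the real contracts live in src/contracts/.

```typescript
// Hypothetical union of mode-specific params for the bundle tool.
type BundleParams =
  | { mode: "search"; query: string; max_results?: number }
  | { mode: "user_timeline"; user_id: string }
  | { mode: "home_timeline"; oauth_session_id: string } // OAuth required
  | { mode: "post_batch"; post_ids: string[] };

// Only the home-timeline mode needs a linked OAuth session.
function requiresOAuth(params: BundleParams): boolean {
  return params.mode === "home_timeline";
}

console.log(requiresOAuth({ mode: "home_timeline", oauth_session_id: "s1" })); // true
```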
- User lookup: `/2/users`, `/2/users/by`
- Tweet lookup batch: `/2/tweets`
- Recent search: `/2/tweets/search/recent`
- User tweets timeline: `/2/users/:id/tweets`
- Authenticated user: `/2/users/me`
- Home timeline: `/2/users/me` then `/2/users/:id/timelines/reverse_chronological`
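Two of the path templates above can be sketched as small builders. The helper names and the `https://api.x.com` base-URL constant are assumptions for illustration.

```typescript
// Illustrative path builders for the read-only endpoint mapping.
const X_API = "https://api.x.com"; // assumed base URL

function userTimelinePath(userId: string): string {
  return `${X_API}/2/users/${encodeURIComponent(userId)}/tweets`;
}

function homeTimelinePath(userId: string): string {
  return `${X_API}/2/users/${encodeURIComponent(userId)}/timelines/reverse_chronological`;
}

console.log(userTimelinePath("12")); // https://api.x.com/2/users/12/tweets
```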
Out of scope:
- Full archive search
- Stream APIs
- Post/create/delete actions
- Follow/like/bookmark actions
- DM/messaging flows
- Any write-side actions of any kind
Public tools: `x.lookup_users`, `x.search_recent_posts`, `x.get_user_timeline`, `x.get_post_batch`.
These can run without linked-user OAuth if X_APP_BEARER_TOKEN is configured.
OAuth-required tools: `x.get_authenticated_user`, `x.get_home_timeline`.
OAuth flow:
- Start at `GET /oauth/x/start` (redirects to X authorize URL with PKCE).
- X redirects to `X_REDIRECT_URI` (default `GET /oauth/x/callback`).
- Callback exchanges code for token, stores an encrypted session durably when the store is configured for Postgres, and returns `oauth_session_id`.
- MCP callers provide `oauth_session_id` on OAuth-required tools.
- In this V1 scaffold, `oauth_session_id` is still a bridge between the OAuth callback and MCP tool calls, not a full native OAuth identity layer.
- If a tool finds a missing or expired OAuth session, it returns an auth-challenge-ready error with top-level `mcp/www_authenticate` metadata and an `oauth_start_url` hint.
- Session records are persisted with explicit lifecycle state and are cleaned up explicitly when expired or revoked.
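The PKCE half of the start step can be sketched with Node's crypto module, assuming the S256 challenge method; variable names are illustrative.

```typescript
// PKCE sketch: the challenge goes on the authorize URL, the verifier is held
// back and sent in the token exchange at the callback.
import { createHash, randomBytes } from "node:crypto";

const codeVerifier = randomBytes(32).toString("base64url"); // 43-char URL-safe string
const codeChallenge = createHash("sha256").update(codeVerifier).digest("base64url");

console.log(codeChallenge.length); // 43
```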
Per OAuth tool call, the server verifies:
- Token expiration
- Required scopes
- Issuer/audience/signature in `strict_jwt` mode when JWKS is configured
- Session-bound opaque-token trust in `opaque_trust_session` mode
- Verification skip in `dev_skip_verify` mode for local-only testing
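A hedged sketch of the mode dispatch follows. The boolean-flag inputs and the `verify` function are hypothetical simplifications of the real verifier, but the fail-closed ordering mirrors the checks listed above.

```typescript
// Hypothetical mode-based verification dispatch; fail-closed by default.
type VerificationMode = "strict_jwt" | "opaque_trust_session" | "dev_skip_verify";

interface VerifyInput {
  mode: VerificationMode;
  tokenExpired: boolean;
  hasRequiredScopes: boolean;
  jwtSignatureValid?: boolean; // only meaningful in strict_jwt mode
}

function verify(input: VerifyInput): boolean {
  if (input.mode === "dev_skip_verify") return true; // local-only testing
  if (input.tokenExpired || !input.hasRequiredScopes) return false;
  if (input.mode === "strict_jwt") return input.jwtSignatureValid === true;
  return true; // opaque_trust_session: trust is bound to the stored session
}

console.log(
  verify({ mode: "strict_jwt", tokenExpired: false, hasRequiredScopes: true, jwtSignatureValid: true }),
); // true
```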
See .env.example.
Required in practice:
`PORT`, `PUBLIC_BASE_URL`, `MCP_BASE_PATH`, `X_CLIENT_ID`, `X_REDIRECT_URI`, `X_SCOPES`, `SESSION_STORE_MODE`, `SESSION_ENCRYPTION_KEY`, `LOG_LEVEL`
Conditionally required:
- `X_CLIENT_SECRET` (if confidential-client exchange is required)
- `X_APP_BEARER_TOKEN` (for noauth public-tool execution)
- `X_JWKS_URL` (for JWT signature verification)
- `X_TOKEN_VERIFICATION_MODE` (`strict_jwt`, `opaque_trust_session`, or `dev_skip_verify`)
- `DATABASE_URL` when `SESSION_STORE_MODE=postgres`
- `OAUTH_PENDING_AUTH_TTL_SECONDS`
- Install dependencies: `npm install`
- Configure env: `cp .env.example .env`, then fill required values.
- Apply the session schema when using Postgres: `psql "$DATABASE_URL" -f sql/migrations/001_oauth_sessions.sql`
- Run dev server: `npm run dev`
- MCP endpoint: `POST {PUBLIC_BASE_URL}{MCP_BASE_PATH}` (default `POST http://127.0.0.1:3000/mcp`)
- Health: `GET http://127.0.0.1:3000/healthz`
Default runtime posture.
- Stores OAuth sessions and pending OAuth state in Postgres
- Encrypts access tokens, refresh tokens, and PKCE verifiers before persistence
- Requires `DATABASE_URL`, `SESSION_STORE_MODE=postgres`, and `SESSION_ENCRYPTION_KEY`
- Fails closed if the schema is missing
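As a sketch of encrypt-before-persist, the following assumes AES-256-GCM with 32 bytes of key material standing in for `SESSION_ENCRYPTION_KEY`; the server's actual cipher choice and key handling may differ.

```typescript
// Illustrative token encryption for durable session rows (AES-256-GCM assumed).
import { createCipheriv, createDecipheriv, randomBytes } from "node:crypto";

const key = randomBytes(32); // stand-in for SESSION_ENCRYPTION_KEY

function encryptToken(plaintext: string): string {
  const iv = randomBytes(12);
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const ct = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]);
  // Store iv + auth tag + ciphertext together in one opaque column value
  return Buffer.concat([iv, cipher.getAuthTag(), ct]).toString("base64");
}

function decryptToken(encoded: string): string {
  const buf = Buffer.from(encoded, "base64");
  const iv = buf.subarray(0, 12);
  const tag = buf.subarray(12, 28);
  const ct = buf.subarray(28);
  const decipher = createDecipheriv("aes-256-gcm", key, iv);
  decipher.setAuthTag(tag);
  return Buffer.concat([decipher.update(ct), decipher.final()]).toString("utf8");
}

console.log(decryptToken(encryptToken("access-token")) === "access-token"); // true
```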
Development-only bridge mode.
- Uses the same encrypted row format in memory
- Loses sessions on restart
- Not durable and not production-complete
- Useful for local testing when Postgres is unavailable
- `active`: session can be used after scope, expiry, and token verification checks pass
- `expired`: session exceeded its lifetime and is no longer returned as active
- `revoked`: session was explicitly invalidated and is never returned as active
- `invalid`: session failed a refresh or was otherwise marked unusable
Cleanup behavior:
- Expired and non-active sessions are removed by the cleanup job
- Pending OAuth state is also cleaned up after expiry
- Cleanup runs once at startup and then on a periodic loop
Session access rules:
- Lookup is fail-closed
- Expired sessions are not returned
- Revoked sessions are not returned
- Linked-account hydration persists back to the session store
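The fail-closed lookup rules above can be sketched as follows; the types and function names are illustrative rather than the store's actual API.

```typescript
// Hypothetical fail-closed session lookup over an in-memory map.
type SessionState = "active" | "expired" | "revoked" | "invalid";

interface SessionRecord {
  id: string;
  state: SessionState;
  expiresAt: number; // epoch millis
}

// Returns the session only when it is provably usable; everything else is null.
function lookupActiveSession(
  sessions: Map<string, SessionRecord>,
  id: string,
  now: number,
): SessionRecord | null {
  const record = sessions.get(id);
  if (!record) return null;                   // unknown id: fail closed
  if (record.state !== "active") return null; // expired/revoked/invalid never returned
  if (record.expiresAt <= now) return null;   // past expiry even if still marked active
  return record;
}
```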
Timeline-oriented tools normalize into this shape:
- `source` (platform/endpoint/fetch timestamp/auth/query/cursor)
- `scope` (account and retrieval scope info)
- `posts` (normalized post objects)
- `includes.users`, `includes.media`
- `meta.partial`, `meta.limitations`, `meta.rate_limit`
- `pagination.next_cursor`, `pagination.previous_cursor`
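The same shape can be sketched as a TypeScript interface; the field types and value literals below are assumptions, and the authoritative contracts live in src/contracts/.

```typescript
// Hedged sketch of the normalized timeline bundle shape.
interface NormalizedTimelineBundle {
  source: {
    platform: string;   // e.g. "x"
    endpoint: string;   // upstream path used for the fetch
    fetched_at: string; // ISO timestamp
    auth: string;       // e.g. "app_bearer" or "oauth_session" (illustrative)
    query?: string;
    cursor?: string;
  };
  scope: { retrieval_scope: string; account_id?: string };
  posts: Array<{ id: string; text: string; author_id: string; created_at: string }>;
  includes: { users: unknown[]; media: unknown[] };
  meta: { partial: boolean; limitations: string[]; rate_limit?: { remaining?: number; reset_at?: string } };
  pagination: { next_cursor?: string; previous_cursor?: string };
}

// Minimal well-typed example value
const emptyBundle: NormalizedTimelineBundle = {
  source: { platform: "x", endpoint: "/2/tweets/search/recent", fetched_at: "2024-01-01T00:00:00Z", auth: "app_bearer" },
  scope: { retrieval_scope: "recent_search" },
  posts: [],
  includes: { users: [], media: [] },
  meta: { partial: false, limitations: [] },
  pagination: {},
};
```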
User lookup normalizes into:
- `source`
- `users[]`
- `meta` with normalized rate-limit + limitation fields
The server does not fabricate unavailable metrics or fields. Required string fields are hard-validated during normalization; if upstream omits a required field, the tool fails with an upstream-data error instead of inventing empty strings.
- `429`: mapped to `UPSTREAM_RATE_LIMITED` with reset metadata when available.
- `401`/`403`: mapped to explicit auth/scope/access errors.
- `400`: mapped to a validation-oriented upstream error with payload details.
- Network failures: mapped to `NETWORK_ERROR` with a retryability signal.
- No silent failures; structured error payloads are returned.
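A sketch of the status mapping follows. `UPSTREAM_RATE_LIMITED` comes from this README; the other code strings and the function name are hypothetical placeholders.

```typescript
// Illustrative upstream-status-to-error mapping with a retryability signal.
interface MappedError {
  code: string;
  retryable: boolean;
}

function mapUpstreamStatus(status: number): MappedError {
  if (status === 429) return { code: "UPSTREAM_RATE_LIMITED", retryable: true };
  if (status === 401 || status === 403) return { code: "UPSTREAM_AUTH_ERROR", retryable: false };
  if (status === 400) return { code: "UPSTREAM_VALIDATION_ERROR", retryable: false };
  return { code: "UPSTREAM_ERROR", retryable: status >= 500 }; // 5xx: safe to retry
}

console.log(mapUpstreamStatus(429).code); // UPSTREAM_RATE_LIMITED
```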
Current tests:
- tests/rateLimit.test.ts
- tests/normalize.test.ts
- tests/toolCatalog.test.ts
- tests/authChallenge.test.ts
- tests/homeTimelinePath.test.ts
- tests/normalizeFailure.test.ts
- tests/tokenVerifierMode.test.ts
- tests/sessionStore.test.ts
Run: `npm test` and `npm run check`.

The repository includes a small opt-in live harness for local operator checks.
What it covers:
- `GET /healthz`
- MCP transport reachability and basic `tools/list`
- Safe public/noauth tool calls when explicitly enabled
- One intentional auth-required negative check
- Manual OAuth-assisted verification with a supplied `oauth_session_id`
What it does not cover:
- Full production readiness
- Browser automation
- Continuous monitoring
- Write-side X actions
- Load, resilience, or HA validation
Quick run:

    LIVE_TEST_BASE_URL=http://127.0.0.1:3000 \
    LIVE_TEST_ENABLE_PUBLIC_X=true \
    LIVE_TEST_QUERY=openai \
    npm run live:check

Manual OAuth helper:

    LIVE_TEST_BASE_URL=http://127.0.0.1:3000 \
    LIVE_TEST_OAUTH_SESSION_ID=<session-id> \
    npm run live:oauth:check

If you need the operator workflow, environment list, and debugging notes, see docs/live_integration_harness.md.
src/
auth/
clients/
config/
contracts/
lib/
routes/
tools/
mcp.ts
server.ts
tests/
.env.example
README.md
package.json
tsconfig.json
- Add OAuth integration tests against a controlled X app environment.
- Add request-level auth context bridging from MCP client identity into token resolution.
- Add optional cache for safe read endpoints with explicit TTL policy.
- Add structured audit sink (request provenance + tool invocation logs).
- Add optional widget/UI layer without changing the core tool contracts.