feat: add @objectstack/service-ai — unified AI capability service plugin (#1000)
…ing, and conversation management

- Add AIToolDefinition, AIToolCall, AIToolResult types for tool calling protocol
- Add AIMessageWithTools extending AIMessage with 'tool' role and tool call fields
- Add AIRequestOptionsWithTools extending AIRequestOptions with tool definitions
- Add AIStreamEvent type for streaming chat completions
- Add streamChat optional method to IAIService interface
- Add AIConversation type and IAIConversationService interface for conversation CRUD
- Add comprehensive tests for all new types and interfaces

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
Co-authored-by: hotlong <50353452+hotlong@users.noreply.github.com>
…IToolDefinition

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
Co-authored-by: hotlong <50353452+hotlong@users.noreply.github.com>
- Expand IAIService contract with streamChat, tool calling, conversation management
- Create packages/services/service-ai with LLM adapter layer, ToolRegistry, InMemoryConversationService, REST/SSE routes, kernel plugin
- Add 52 comprehensive tests covering all components
- Update changeset config, CHANGELOG.md, ROADMAP.md

Agent-Logs-Url: https://github.com/objectstack-ai/spec/sessions/c48a190a-19bb-45fd-8caa-0b6b9e1f7c49
Co-authored-by: hotlong <50353452+hotlong@users.noreply.github.com>
Pull request overview
Adds a new `@objectstack/service-ai` package and expands the `IAIService` contract to support streaming and tool calling, wiring the service into the kernel as an `'ai'` service with framework-agnostic REST/SSE route definitions.
Changes:
- Expanded `packages/spec` AI contracts with tool-calling + streaming event types and added a conversation service contract.
- Introduced `packages/services/service-ai` implementing an adapter-based AI service, tool registry, in-memory conversation store, kernel plugin, and REST/SSE route builders.
- Updated project metadata/docs (ROADMAP, root CHANGELOG, changesets fixed group, lockfile) to include the new service.
Reviewed changes
Copilot reviewed 21 out of 22 changed files in this pull request and generated 8 comments.
| File | Description |
|---|---|
| ROADMAP.md | Marks service-ai as implemented and updates AI roadmap items/status. |
| pnpm-lock.yaml | Adds lock entries for the new packages/services/service-ai workspace package. |
| packages/spec/src/contracts/ai-service.ts | Extends AI contracts with tool calling, streaming events, and conversation management interfaces. |
| packages/spec/src/contracts/ai-service.test.ts | Adds contract-level type tests for tool calling, streaming, and conversations. |
| packages/services/service-ai/tsconfig.json | TypeScript config for the new service package. |
| packages/services/service-ai/src/tools/tool-registry.ts | Implements a basic tool registry with execution helpers. |
| packages/services/service-ai/src/tools/index.ts | Exports tool registry public API. |
| packages/services/service-ai/src/routes/index.ts | Exports route builder public API. |
| packages/services/service-ai/src/routes/ai-routes.ts | Defines framework-agnostic REST/SSE route handlers for AI endpoints. |
| packages/services/service-ai/src/plugin.ts | Registers AIService as the kernel 'ai' service and triggers hooks for tools/routes. |
| packages/services/service-ai/src/index.ts | Package entrypoint exporting service, plugin, adapters, tools, conversations, and routes. |
| packages/services/service-ai/src/conversation/index.ts | Exports in-memory conversation service. |
| packages/services/service-ai/src/conversation/in-memory-conversation-service.ts | Implements IAIConversationService with an in-memory Map store. |
| packages/services/service-ai/src/ai-service.ts | Implements IAIService orchestrator delegating to an adapter and exposing tool/conversation subcomponents. |
| packages/services/service-ai/src/adapters/types.ts | Defines the LLMAdapter interface. |
| packages/services/service-ai/src/adapters/memory-adapter.ts | Provides a deterministic in-memory adapter for dev/test. |
| packages/services/service-ai/src/adapters/index.ts | Exports adapter types and memory adapter. |
| packages/services/service-ai/src/tests/ai-service.test.ts | Adds unit/integration tests for adapters, registry, conversations, routes, and plugin behavior. |
| packages/services/service-ai/package.json | Declares the new package, exports, scripts, and dependencies. |
| packages/services/service-ai/CHANGELOG.md | Adds initial package changelog entry. |
| CHANGELOG.md | Adds unreleased notes for the AI service plugin and contract expansions. |
| .changeset/config.json | Adds @objectstack/service-ai to the fixed versioning group. |
Files not reviewed (1)
- pnpm-lock.yaml: Language not supported
Comments suppressed due to low confidence (1)
.changeset/config.json:45
- The repo uses Changesets for versioning (`package.json` scripts run `changeset version`/`publish`), but this PR doesn’t add a `.changeset/*.md` entry for the new `@objectstack/service-ai` package / contract changes. Without a changeset, automated versioning/changelog generation may miss this release. Consider adding an appropriate changeset file describing the addition.
```json
  "@objectstack/service-job",
  "@objectstack/service-queue",
  "@objectstack/service-realtime",
  "@objectstack/service-ai",
  "@objectstack/service-storage",
  "@objectstack/docs",
  "create-objectstack",
  "objectstack-vscode"
]
```
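For reference, a minimal changeset entry for this PR could look like the following sketch (the bump levels are an assumption, not part of the PR):

```md
---
'@objectstack/spec': minor
'@objectstack/service-ai': minor
---

Add @objectstack/service-ai service plugin; expand IAIService with streaming, tool calling, and conversation management.
```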
```diff
@@ -83,4 +177,90 @@ export interface IAIService {
    * @returns Array of model identifiers
    */
   listModels?(): Promise<string[]>;
+
+  /**
+   * Stream a chat completion as an async iterable of events
+   * @param messages - Array of conversation messages
+   * @param options - Optional request configuration (supports tool definitions)
+   * @returns Async iterable of stream events
+   */
+  streamChat?(messages: AIMessage[], options?: AIRequestOptionsWithTools): AsyncIterable<AIStreamEvent>;
 }
```
Tool definitions/choice are only plumbed through `streamChat` via `AIRequestOptionsWithTools`, while `chat`/`complete` still accept `AIRequestOptions`. This makes non-streaming tool calling impossible through the contract even though the PR advertises “tool calling support”. Consider widening chat options (and potentially messages) to the tool-aware types as well, or clarifying that tool calling is streaming-only.
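One way to widen the non-streaming path is to put the tool fields on the shared options type, so `chat()` can request tool calls too. A minimal sketch with simplified local stand-in types (these are not the actual `@objectstack/spec` definitions):

```typescript
// Simplified stand-ins for the contract types (not the real @objectstack/spec exports).
interface AIToolDefinition { name: string; description?: string; parameters?: Record<string, unknown> }
interface AIToolCall { id: string; name: string; arguments: string }
interface AIMessage { role: 'system' | 'user' | 'assistant'; content: string }

// Tool fields live on the base options, so both chat() and streamChat() can use them.
interface AIRequestOptions {
  model?: string;
  tools?: AIToolDefinition[];
  toolChoice?: 'auto' | 'none' | { name: string };
}

interface AIChatResult { message: { role: 'assistant'; content: string; toolCalls?: AIToolCall[] } }

interface IAIService {
  chat(messages: AIMessage[], options?: AIRequestOptions): Promise<AIChatResult>;
}

// Stub service: requests a call to the first registered tool when tools are provided.
const stub: IAIService = {
  async chat(_messages, options) {
    const tools = options?.tools ?? [];
    const toolCalls = tools.length > 0
      ? [{ id: 'call_1', name: tools[0].name, arguments: '{}' }]
      : undefined;
    return { message: { role: 'assistant', content: '', toolCalls } };
  },
};
```

With this shape, a non-streaming round trip (chat, tool call, tool result, chat again) needs no casts.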
```ts
export interface AIConversation {
  /** Conversation ID */
  id: string;
  /** Title / summary */
  title?: string;
  /** Associated agent ID */
  agentId?: string;
  /** User who owns the conversation */
  userId?: string;
  /** Messages in the conversation */
  messages: AIMessage[];
  /** Creation timestamp (ISO 8601) */
  createdAt: string;
  /** Last update timestamp (ISO 8601) */
  updatedAt: string;
  /** Conversation metadata */
  metadata?: Record<string, unknown>;
}
```
`AIConversation.messages` is typed as `AIMessage[]`, which cannot carry tool-call metadata (`toolCalls`, `toolCallId`) or the `'tool'` role. If tool calls/results are part of persisted conversations, this should likely use the tool-aware message type (or a union) to avoid losing data / forcing `any` casts downstream.
```ts
/**
 * A chat message that may carry tool-related metadata.
 * Widens the `role` union to include `tool` for tool result messages.
 */
export interface AIMessageWithTools {
  /** Message role – adds `tool` for tool result messages */
  role: 'system' | 'user' | 'assistant' | 'tool';
  /** Message content */
  content: string;
  /** Tool calls requested by the assistant */
  toolCalls?: AIToolCall[];
  /** ID of the tool call this message responds to (for role='tool') */
  toolCallId?: string;
}
```
`AIMessageWithTools` introduces role `'tool'` and tool-call metadata, but the core APIs (`IAIService.chat`, `IAIService.streamChat`) and `AIConversation.messages` still use `AIMessage[]` (which cannot represent tool-result messages). Consider unifying the message type used across the contract (e.g., extend `AIMessage` with tool fields / `'tool'` role, or switch the service + conversation APIs to the tool-aware message type/union) so tool calls/results don’t force `any` casts or data loss.
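A unified base message type along those lines might look like the following sketch (field names follow the snippet above; this is illustrative, not the published contract):

```typescript
// One message type shared by chat, streaming, and conversation storage.
interface AIToolCall { id: string; name: string; arguments: string }

interface AIMessage {
  role: 'system' | 'user' | 'assistant' | 'tool';
  content: string;
  /** Set on assistant messages that request tool execution. */
  toolCalls?: AIToolCall[];
  /** Set on role='tool' messages that answer a specific call. */
  toolCallId?: string;
}

// A persisted transcript can then carry tool traffic without casts or data loss.
const transcript: AIMessage[] = [
  { role: 'user', content: 'What is 2 + 2?' },
  { role: 'assistant', content: '', toolCalls: [{ id: 'c1', name: 'calc', arguments: '{"expr":"2+2"}' }] },
  { role: 'tool', content: '4', toolCallId: 'c1' },
  { role: 'assistant', content: 'The answer is 4.' },
];

// Last non-empty assistant reply in a transcript.
function lastAnswer(msgs: AIMessage[]): string {
  const answers = msgs.filter((m) => m.role === 'assistant' && m.content.length > 0);
  return answers.length > 0 ? answers[answers.length - 1].content : '';
}
```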
```ts
export interface AIServiceConfig {
  /** LLM adapter to delegate calls to (defaults to MemoryLLMAdapter). */
  adapter?: LLMAdapter;
  /** Logger instance. */
  logger?: Logger;
  /** Pre-registered tools. */
  toolRegistry?: ToolRegistry;
  /** Conversation service (defaults to InMemoryConversationService). */
  conversationService?: InMemoryConversationService;
}
```
`AIServiceConfig.conversationService` is typed as `InMemoryConversationService` (and the class property is the same), which prevents injecting any other `IAIConversationService` implementation (e.g., persistent storage) without type casts. Consider typing this as `IAIConversationService` in the config and on the `AIService` instance, while still defaulting to `new InMemoryConversationService()`.
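Sketched against a trimmed subset of the contract (the interfaces below are stand-ins, not the actual package API), the config can accept any `IAIConversationService` while still defaulting to the in-memory one:

```typescript
// Trimmed stand-ins for the contract types.
interface AIConversation {
  id: string;
  title?: string;
  messages: unknown[];
  createdAt: string;
  updatedAt: string;
}

interface IAIConversationService {
  create(title?: string): Promise<AIConversation>;
  get(id: string): Promise<AIConversation | undefined>;
}

class InMemoryConversationService implements IAIConversationService {
  private store = new Map<string, AIConversation>();

  async create(title?: string): Promise<AIConversation> {
    const now = new Date().toISOString();
    const conv: AIConversation = {
      id: `conv_${this.store.size + 1}`,
      title,
      messages: [],
      createdAt: now,
      updatedAt: now,
    };
    this.store.set(conv.id, conv);
    return conv;
  }

  async get(id: string): Promise<AIConversation | undefined> {
    return this.store.get(id);
  }
}

interface AIServiceConfig {
  /** Typed by contract, so a persistent implementation injects cleanly. */
  conversationService?: IAIConversationService;
}

class AIService {
  readonly conversationService: IAIConversationService;

  constructor(config: AIServiceConfig = {}) {
    this.conversationService = config.conversationService ?? new InMemoryConversationService();
  }
}
```

Callers with a database-backed store would pass it via `conversationService` with no casts.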
```ts
export function buildAIRoutes(service: AIService, logger: Logger): RouteDefinition[] {
  return [
    // ── Chat ──────────────────────────────────────────────────
    {
```
`buildAIRoutes` takes a concrete `AIService` and reaches into `service.conversationService`, which couples the route layer to this specific implementation and makes it hard to expose the same routes for any other `IAIService` implementation. Consider depending on contracts instead (e.g., accept `IAIService` + `IAIConversationService` as parameters, or define a minimal interface for route wiring).
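A contract-only route builder could be sketched like this (the request/response shapes and interface subsets here are assumptions for illustration):

```typescript
// Framework-agnostic shapes, loosely following the PR's route layer.
interface RouteRequest { body?: unknown; params?: Record<string, string>; query?: Record<string, string> }
interface RouteResponse { status: number; body: unknown }
interface RouteDefinition {
  method: 'GET' | 'POST';
  path: string;
  handler: (req: RouteRequest) => Promise<RouteResponse>;
}

// Minimal contract subsets; real code would import these from @objectstack/spec.
interface IAIService { chat(messages: unknown[], options?: unknown): Promise<unknown> }
interface IAIConversationService { list(opts?: { limit?: number }): Promise<unknown[]> }

// Both collaborators are injected by contract, so any implementation works.
function buildAIRoutes(ai: IAIService, conversations: IAIConversationService): RouteDefinition[] {
  return [
    {
      method: 'POST',
      path: '/api/v1/ai/chat',
      handler: async (req) => {
        const { messages } = (req.body ?? {}) as { messages?: unknown[] };
        if (!Array.isArray(messages) || messages.length === 0) {
          return { status: 400, body: { error: 'messages array is required' } };
        }
        return { status: 200, body: await ai.chat(messages) };
      },
    },
    {
      method: 'GET',
      path: '/api/v1/ai/conversations',
      handler: async () => ({ status: 200, body: await conversations.list() }),
    },
  ];
}
```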
```ts
const { messages, options } = (req.body ?? {}) as {
  messages?: unknown[];
  options?: Record<string, unknown>;
};

if (!Array.isArray(messages) || messages.length === 0) {
  return { status: 400, body: { error: 'messages array is required' } };
}

try {
  const result = await service.chat(messages as any, options as any);
  return { status: 200, body: result };
} catch (err) {
  logger.error('[AI Route] /chat error', err instanceof Error ? err : undefined);
  return { status: 500, body: { error: 'Internal AI service error' } };
}
```
Route input is only shallowly validated and then cast to `any` (e.g., `/api/v1/ai/chat` accepts any array and forwards it to the adapter). This can lead to runtime errors and 500s for malformed payloads (missing role, non-string content, etc.). Consider validating each message shape (role enum + string content) before calling `service.chat`/`streamChat` and returning a 400 with a clear error when invalid.
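A per-message validator along the lines suggested might look like this sketch (the role set and error texts are illustrative):

```typescript
// Roles accepted by the contract; 'tool' included for tool-result messages.
const VALID_ROLES = new Set(['system', 'user', 'assistant', 'tool']);

interface ChatMessage { role: string; content: string }

// Returns typed messages on success, or an error string suitable for a 400 response.
function validateMessages(input: unknown): { messages?: ChatMessage[]; error?: string } {
  if (!Array.isArray(input) || input.length === 0) {
    return { error: 'messages array is required' };
  }
  for (let i = 0; i < input.length; i++) {
    const msg = input[i] as { role?: unknown; content?: unknown } | null;
    if (msg === null || typeof msg !== 'object') {
      return { error: `messages[${i}] must be an object` };
    }
    if (typeof msg.role !== 'string' || !VALID_ROLES.has(msg.role)) {
      return { error: `messages[${i}].role must be one of system|user|assistant|tool` };
    }
    if (typeof msg.content !== 'string') {
      return { error: `messages[${i}].content must be a string` };
    }
  }
  return { messages: input as ChatMessage[] };
}
```

The handler would run this before `service.chat` and return `{ status: 400, body: { error } }` on failure, removing the `as any` casts.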
```ts
description: 'List conversations',
handler: async (req) => {
  try {
    const conversations = await service.conversationService.list(req.query as any);
```
GET `/api/v1/ai/conversations` forwards `req.query` directly into `conversationService.list()`, but query values are strings; `limit` (and any future numeric fields) should be parsed/coerced to numbers (and validated) to match the `IAIConversationService.list` contract. Consider normalizing `limit` (e.g., `Number.parseInt`) and rejecting invalid values with a 400.
```diff
-const conversations = await service.conversationService.list(req.query as any);
+const rawQuery = req.query ?? {};
+const options: Record<string, unknown> = { ...rawQuery };
+if (typeof rawQuery.limit === 'string') {
+  const parsedLimit = Number.parseInt(rawQuery.limit, 10);
+  if (!Number.isFinite(parsedLimit) || parsedLimit <= 0) {
+    return { status: 400, body: { error: 'Invalid limit parameter' } };
+  }
+  options.limit = parsedLimit;
+}
+const conversations = await service.conversationService.list(options as any);
```
```ts
if (!message || typeof message.content !== 'string') {
  return { status: 400, body: { error: 'message with content string is required' } };
}
```
POST `/api/v1/ai/conversations/:id/messages` only validates `content` is a string, but does not validate `role` (required by the `AIMessage` contract). This can create stored messages with invalid roles and later break adapters expecting `'system' | 'user' | 'assistant'`. Consider validating `role` and rejecting invalid roles with a 400.
```ts
const role = (message as any).role;
if (role !== 'system' && role !== 'user' && role !== 'assistant') {
  return {
    status: 400,
    body: { error: 'message.role must be one of "system", "user", or "assistant"' },
  };
}
```
@copilot Carefully evaluate whether all development required by the task has been completed, and fix the issues surfaced by testing.
…t, use contracts, add input validation

- Unify AIMessage type: 'tool' role + toolCalls/toolCallId now on base type
- AIRequestOptions now includes tools/toolChoice for non-streaming tool calling
- AIMessageWithTools/AIRequestOptionsWithTools kept as deprecated aliases
- AIServiceConfig.conversationService typed as IAIConversationService
- buildAIRoutes depends on IAIService + IAIConversationService contracts
- Routes validate message role (system/user/assistant/tool) and content
- GET /conversations parses and validates limit from query string
- POST /conversations/:id/messages validates role before persisting
- 60 service tests + 22 contract tests = 82 total passing

Agent-Logs-Url: https://github.com/objectstack-ai/spec/sessions/be512058-7717-4f5d-b4ba-9861ca48a4e9
Co-authored-by: hotlong <50353452+hotlong@users.noreply.github.com>
All 8 review issues have been addressed in the latest commit.

82 tests passing (60 service + 22 contract), both packages build successfully.
@copilot Merge the latest code and resolve the conflicts.
…HANGELOG conflict

Co-authored-by: hotlong <50353452+hotlong@users.noreply.github.com>
Merged latest main (including plugin-setup and Vercel serverless fix) into this branch and resolved the CHANGELOG.md conflict.
AI capabilities lack a standardized kernel service entry point. Contracts exist (`IAIService`), data schemas exist (`agent.zod.ts`, `conversation.zod.ts`, etc.), but no service plugin wires them together — leading to ad-hoc LLM integrations scattered across projects.

Contract expansion (`packages/spec/src/contracts/ai-service.ts`)

- `AIMessage`: the base `AIMessage` type now supports the `tool` role and optional `toolCalls`/`toolCallId` fields directly, eliminating the need for a separate tool-aware message type. `AIMessageWithTools` is kept as a deprecated alias for backward compatibility.
- `AIRequestOptions`: tool-related fields (`tools`, `toolChoice`) are now on `AIRequestOptions` directly, enabling tool calling in both `chat()` and `streamChat()`. `AIRequestOptionsWithTools` is kept as a deprecated alias.
- `AIStreamEvent` + `streamChat?()` on `IAIService` returning `AsyncIterable<AIStreamEvent>`.
- `AIConversation` (with unified `AIMessage[]` that carries tool metadata), `IAIConversationService` (CRUD + message append).

New package: `@objectstack/service-ai`

Follows `service-analytics`/`service-automation` patterns:

- `adapters/` — `LLMAdapter` interface + `MemoryLLMAdapter` (deterministic echo adapter for tests/dev)
- `tools/` — `ToolRegistry` with registration, parallel execution, error handling
- `conversation/` — `InMemoryConversationService` implementing `IAIConversationService`
- `routes/` — 8 REST/SSE route definitions (`/api/v1/ai/{chat,chat/stream,complete,models,conversations}`), depending on `IAIService` + `IAIConversationService` contracts (not concrete classes), with full input validation (message role/content, query param parsing)
- `ai-service.ts` — core `AIService` implementing `IAIService`, delegates to a pluggable adapter; `conversationService` typed as `IAIConversationService` for easy injection of custom implementations
- `plugin.ts` — `AIServicePlugin` registers as the kernel `'ai'` service, emits `ai:ready` and `ai:routes` hooks

Housekeeping

- Added `@objectstack/service-ai` to the changeset fixed group
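As a closing illustration of the streaming surface described above, here is how a `streamChat`-style `AsyncIterable` is consumed with `for await`. The event shape is an assumption for the sketch, and the generator is a deterministic stand-in in the spirit of `MemoryLLMAdapter`, not the real adapter:

```typescript
// Assumed event shape for the sketch; the real AIStreamEvent may differ.
type AIStreamEvent =
  | { type: 'delta'; content: string }
  | { type: 'done' };

// Deterministic stand-in adapter that streams a fixed reply in chunks.
async function* streamChat(
  _messages: { role: string; content: string }[],
): AsyncIterable<AIStreamEvent> {
  for (const chunk of ['Hello', ', ', 'world']) {
    yield { type: 'delta', content: chunk };
  }
  yield { type: 'done' };
}

// Typical consumer: accumulate deltas until the 'done' event.
async function collect(): Promise<string> {
  let text = '';
  for await (const event of streamChat([{ role: 'user', content: 'greet me' }])) {
    if (event.type === 'delta') text += event.content;
  }
  return text;
}
```

An SSE route would forward each event as one `data:` frame instead of accumulating.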