feat(appkit): shared agent types and LLM adapter implementations #301
Merged
MarioCadenas merged 11 commits into main — May 5, 2026
Conversation
This was referenced Apr 21, 2026
Force-pushed: bb4efff → 5d060a6
Force-pushed: 021cf0a → a4d0130
calvarjorge reviewed Apr 28, 2026
ditadi reviewed Apr 29, 2026
Force-pushed: 18b8bed → ee3a677
pkosiec reviewed May 5, 2026
Foundation layer for the agents feature. Adds the portable type surface that every downstream layer builds on, plus the Databricks Model Serving adapter so the agents plugin (later PR) can target workspace-hosted models.

`packages/shared/src/agent.ts` — no behavior, just the type vocabulary: `AgentAdapter`, `AgentEvent`, `AgentInput`, `AgentRunContext`, `AgentToolDefinition`, `Message`, `Thread`, `ThreadStore`, `ToolAnnotations`, `ToolCall`, `ToolProvider`, `ResponseStreamEvent`. Exported from the shared barrel.

`packages/appkit/src/agents/databricks.ts` — `DatabricksAdapter`: streams OpenAI-compatible completions against a Databricks Model Serving endpoint (raw fetch + SSE, no vendor SDKs). Also ships `createDatabricksModel`, a Vercel-AI-SDK helper that returns a model object you can pass to `streamText`/`useChat`/etc. — handles URL rewriting (`/chat/completions` -> `/invocations`), per-request auth refresh, and tool-name sanitization.

`@ai-sdk/openai` is a devDependency consumed by `createDatabricksModel` via dynamic `import()`; consumers who use that helper install it alongside `@databricks/appkit`.

Signed-off-by: MarioCadenas <MarioCadenas@users.noreply.github.com>
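The URL rewriting mentioned above (`/chat/completions` -> `/invocations`) could look like the following sketch. This is illustrative only — `rewriteToInvocations` is a hypothetical name, not an actual export of `@databricks/appkit`:

```typescript
// Databricks Model Serving exposes /invocations rather than the
// OpenAI-style /chat/completions path, so a wrapper that presents an
// OpenAI-compatible model must rewrite the URL before fetching.
// Hypothetical helper name; not part of the real package surface.
function rewriteToInvocations(url: string): string {
  // Replace only a trailing /chat/completions; leave other paths untouched.
  return url.replace(/\/chat\/completions\/?$/, "/invocations");
}
```

A non-matching URL passes through unchanged, so the rewrite is safe to apply unconditionally on every request.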
…E reader
- Bridge AbortSignal to SDK CancellationToken via Context on apiClient.request
- Pass signal from fromServingEndpoint streamBody into serving stream()
- Cancel SSE reader in streamCompletion finally for clean teardown

Tests: expect second request arg and Context when signal provided.
Signed-off-by: MarioCadenas <MarioCadenas@users.noreply.github.com>
- Forward Message.toolCalls/toolCallId to OpenAI tool_calls/tool_call_id
- Encode tool names with same wire map as run() (dots -> __)
- Use null assistant content when only tool_calls are present

Closes gap for resumed threads and hydrated conversation history.
Signed-off-by: MarioCadenas <MarioCadenas@users.noreply.github.com>
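The history-replay mapping this commit describes could be sketched as below. The `Message`/`ToolCall` shapes and the `toOpenAiMessage` helper are illustrative assumptions, not the adapter's real internals:

```typescript
// Hypothetical shapes for illustration; the real types live in
// packages/shared/src/agent.ts.
interface ToolCall { id: string; name: string; arguments: string }
interface Message {
  role: "user" | "assistant" | "tool";
  content?: string;
  toolCalls?: ToolCall[];
  toolCallId?: string;
}

// Same wire encoding as the run() path: dots become double underscores.
const toWireName = (name: string): string => name.replace(/\./g, "__");

function toOpenAiMessage(msg: Message): Record<string, unknown> {
  const wire: Record<string, unknown> = { role: msg.role };
  if (msg.toolCalls?.length) {
    wire.tool_calls = msg.toolCalls.map((tc) => ({
      id: tc.id,
      type: "function",
      function: { name: toWireName(tc.name), arguments: tc.arguments },
    }));
    // OpenAI expects null content when the assistant turn is tool calls only.
    wire.content = msg.content ?? null;
  } else {
    wire.content = msg.content ?? "";
  }
  // Tool-result messages carry the id of the call they answer.
  if (msg.toolCallId) wire.tool_call_id = msg.toolCallId;
  return wire;
}
```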
- Cap incomplete SSE tail, each complete line, assistant text, and per-index streamed tool arguments (UTF-16 code unit counts)
- Defaults: 1Mi line, 4Mi text, 2Mi tool args; overridable via adapter options
- Thread limits through fromServingEndpoint and fromModelServing

Signed-off-by: MarioCadenas <MarioCadenas@users.noreply.github.com>
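A minimal sketch of this size-cap guard, assuming illustrative option and helper names (the real option names may differ); limits count UTF-16 code units via `String.length`, as the commit notes:

```typescript
const MiB = 1024 * 1024;

// Hypothetical option shape mirroring the defaults listed above.
interface StreamLimits {
  maxLineLength?: number;     // per SSE line (default 1Mi)
  maxTextLength?: number;     // accumulated assistant text (default 4Mi)
  maxToolArgsLength?: number; // per-index streamed tool args (default 2Mi)
}

function withDefaults(limits: StreamLimits = {}): Required<StreamLimits> {
  return {
    maxLineLength: limits.maxLineLength ?? 1 * MiB,
    maxTextLength: limits.maxTextLength ?? 4 * MiB,
    maxToolArgsLength: limits.maxToolArgsLength ?? 2 * MiB,
  };
}

// Fail fast instead of buffering unbounded input from the stream.
function assertUnderCap(kind: string, value: string, cap: number): void {
  if (value.length > cap) {
    throw new Error(`${kind} exceeded cap of ${cap} UTF-16 code units`);
  }
}
```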
executeToolCalls now maps canonical tool names through nameToWire for messages[].tool_calls while keeping dotted names for tool_call events and executeTool. Test: Llama-json text path asserts second POST uses analytics__query. Signed-off-by: MarioCadenas <MarioCadenas@users.noreply.github.com>
Matches the serving plugin env var name so deployments configure one variable. Signed-off-by: MarioCadenas <MarioCadenas@users.noreply.github.com>
…it/beta
Remove ./agents/databricks package export; rely on beta entry and tsdown beta chunk. Update JSDoc example import accordingly.
Signed-off-by: MarioCadenas <MarioCadenas@users.noreply.github.com>
Dots map to double underscores for serving; distinct names can share the same wire string (e.g. foo.bar vs foo__bar). Throw instead of overwriting maps. Add regression test before any HTTP call. Signed-off-by: MarioCadenas <MarioCadenas@users.noreply.github.com>
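The collision check this commit adds could be sketched as follows; `buildWireMaps` is an illustrative name for the internal map construction, not a public API:

```typescript
// Dots become double underscores on the wire, so distinct canonical
// names (e.g. "foo.bar" and "foo__bar") can collide after encoding.
// Throw at map-build time, before any HTTP call, rather than silently
// overwriting an earlier mapping.
function buildWireMaps(names: string[]): {
  nameToWire: Map<string, string>;
  wireToName: Map<string, string>;
} {
  const nameToWire = new Map<string, string>();
  const wireToName = new Map<string, string>();
  for (const name of names) {
    const wire = name.replace(/\./g, "__");
    const existing = wireToName.get(wire);
    if (existing !== undefined && existing !== name) {
      throw new Error(
        `tool name collision: "${name}" and "${existing}" both map to "${wire}"`,
      );
    }
    nameToWire.set(name, wire);
    wireToName.set(wire, name);
  }
  return { nameToWire, wireToName };
}
```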
…n paths
Shared parse / tool_call / execute / tool_result / message-append logic between structured tool_calls handling and text-parse executeToolCalls().
Signed-off-by: MarioCadenas <MarioCadenas@users.noreply.github.com>
- Yield status:error then rethrow when streamBody rejects
- AbortSignal.timeout(120s) for raw fetch when no runner signal
- Replace harmful Llama-array regex with indexOf/lastIndexOf slice
- Cap Python-style text parser input (64KiB); narrow SSE JSON with unknown guards
- console.debug malformed SSE JSON and reader teardown failures
- JSDoc: executeTool errors may reach the LLM

Includes regression tests.
Signed-off-by: MarioCadenas <MarioCadenas@users.noreply.github.com>
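The indexOf/lastIndexOf replacement for the Llama-array regex could look like this sketch (`extractJsonArray` is an illustrative name); the point is that a plain slice cannot exhibit the catastrophic backtracking a greedy regex can on adversarial input:

```typescript
// Pull the outermost [...] span out of model-emitted text without a
// regex. Returns null when no plausible array is present; the caller
// still has to JSON.parse (and may reject) the extracted slice.
function extractJsonArray(text: string): string | null {
  const start = text.indexOf("[");
  const end = text.lastIndexOf("]");
  if (start === -1 || end === -1 || end <= start) return null;
  return text.slice(start, end + 1);
}
```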
Force-pushed: 379c941 → 6dac357
pkosiec approved these changes May 5, 2026
Foundation layer for the agents feature. Adds the portable type surface
that every downstream layer builds on, plus three LLM adapter
implementations so the agents plugin (later PR) can target whatever the
user has.
Shared agent types
`packages/shared/src/agent.ts` — no behavior, just the type vocabulary: `AgentAdapter`, `AgentEvent`, `AgentInput`, `AgentRunContext`, `AgentToolDefinition`, `Message`, `Thread`, `ThreadStore`, `ToolAnnotations`, `ToolCall`, `ToolProvider`, `ResponseStreamEvent`. Exported from the shared barrel.
Adapters
`packages/appkit/src/agents/databricks.ts` — `DatabricksAdapter`: streams OpenAI-compatible completions against a Databricks Model Serving endpoint. The `fromServingEndpoint`/`fromModelServing` factories route through the shared `connectors/serving/stream` helper, which delegates to the SDK's `apiClient.request({ raw: true })`. That gives the adapter centralised URL encoding + authentication with the rest of the serving surface — no bespoke `fetch()` + `authenticate()` plumbing. The raw `new DatabricksAdapter({ endpointUrl, authenticate })` constructor is preserved as an escape hatch for tests and for pointing the adapter at non-workspace endpoints.
Each adapter is self-contained and independently testable.
Test plan
serving-connector routing, URL encoding of endpoint names with special characters, streaming + tool-call dispatch.
PR Stack
- `agents()` plugin + `createAgent(def)` + markdown-driven agents — feat(appkit): agents() plugin, createAgent(def), and markdown-driven agents #304
- `fromPlugin()` DX + `runAgent` plugins arg + toolkit-resolver — feat(appkit): fromPlugin() DX, runAgent plugins arg, shared toolkit-resolver #305

Demo
agent-demo.mp4