
Comparing changes

base repository: copilot-extensions/preview-sdk.js (base: main)
head repository: Nils-Schiwek/preview-sdk.js (compare: main)
Able to merge. These branches can be automatically merged.
  • 2 commits
  • 1 file changed
  • 2 contributors

Commits on Jan 15, 2025

  1. 166ae96 (Verified: this commit was created on GitHub.com and signed with GitHub's verified signature)
  2. bbfecd7 Update dreamcode.md: add transformer function
     Nils-Schiwek authored Jan 15, 2025 (Verified: this commit was created on GitHub.com and signed with GitHub's verified signature)

Showing with 63 additions and 0 deletions.
  1. +63 −0 dreamcode.md
@@ -70,6 +70,69 @@ createServer(createNodeMiddleware(agent)).listen(3000);
agent.log.info("Listening on http://localhost:3000");
```

### Transform stream

Since this SDK will primarily be used to call LLMs and forward their responses to Copilot Chat, it would be advantageous to have a function that transforms such an LLM stream into Copilot events.

```ts
export async function chat(
  request: HttpRequest,
  context: InvocationContext,
): Promise<HttpResponseInit> {
  // verify GitHub request
  // get payload
  const message = getContentFromPayload();

  return {
    body: Readable.from(callEnterpriseLLM(message, context)),
  };
}

async function* callEnterpriseLLM(message: string, context: InvocationContext) {
  yield createAckEvent();
  const baseUrl = process.env.LLM_BASE_URL;
  // Call the enterprise endpoint, which can stream its response via SSE
  const enterpriseResponse = await fetch(baseUrl + "/api/v1/chat", {
    method: "POST",
    body: JSON.stringify({
      question: message,
      stream: true,
    }),
  });
  // While reading the stream, transform each SSE event / data item into Copilot events.
  // parseChunk() - function to parse a Server-Sent Event (SSE) into its event and data components
  for await (const chunk of enterpriseResponse.body) {
    const [event, data] = parseChunk(chunk);
    switch (event) {
      case "event-type-1":
        yield createTextEvent(JSON.parse(data).text);
        break;
      case "event-type-2":
        yield createReferencesEvent(transformLLMDataToReference(data));
        break;
      default:
        console.log(`Found unidentified llm event-type: ${event}`);
    }
  }
  // Even cooler would be if it were possible to pipe this transformation
  // into the readable stream directly, instead of the loop above:
  await enterpriseResponse.body.getReader().transformToCopilotEvents(function* (event, data) {
    switch (event) {
      case "event-type-1":
        yield createTextEvent(JSON.parse(data).text);
        break;
      case "event-type-2":
        yield createReferencesEvent(transformLLMDataToReference(data));
        break;
      default:
        console.log(`Found unidentified llm event-type: ${event}`);
    }
  });

  yield createDoneEvent();
}
```
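The `parseChunk()` helper above is left undefined. A minimal sketch of what it could look like, assuming each chunk carries one complete SSE event (a production version would need to buffer partial events that are split across chunk boundaries):

```typescript
// Hypothetical helper assumed by the dreamcode above: splits one SSE chunk
// into its event name and data payload. Assumes the chunk holds a complete
// event; real streams may split events across chunks and require buffering.
export function parseChunk(chunk: Uint8Array | string): [string, string] {
  const text =
    typeof chunk === "string" ? chunk : new TextDecoder().decode(chunk);
  let event = "message"; // SSE default event name when no "event:" field is sent
  const dataLines: string[] = [];
  for (const line of text.split("\n")) {
    if (line.startsWith("event:")) {
      event = line.slice("event:".length).trim();
    } else if (line.startsWith("data:")) {
      dataLines.push(line.slice("data:".length).trim());
    }
  }
  // Multi-line data fields are joined with newlines, as in the SSE format
  return [event, dataLines.join("\n")];
}
```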

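The wished-for `transformToCopilotEvents` pipe does not exist today, but it could be approximated with a small async-generator combinator. A sketch under that assumption, with `parse` standing in for the SSE parsing step (all names here are hypothetical, not part of the SDK):

```typescript
// Hypothetical combinator: drives an async iterable of SSE chunks through a
// parse step and a per-event handler generator, yielding whatever Copilot
// events the handler produces for each parsed (event, data) pair.
async function* transformToCopilotEvents<T>(
  source: AsyncIterable<string>,
  parse: (chunk: string) => [string, string],
  handler: (event: string, data: string) => Iterable<T>,
): AsyncGenerator<T> {
  for await (const chunk of source) {
    const [event, data] = parse(chunk);
    // Delegate to the handler so one chunk can emit zero or more events
    yield* handler(event, data);
  }
}
```

The resulting generator could then be wrapped with `Readable.from()` and returned as the response body, just like `callEnterpriseLLM` above.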
### Book a flight

I'm using [@daveebbelaar](https://github.com/daveebbelaar)'s example of a flight-booking agent, which they demonstrate at https://www.youtube.com/watch?v=aqdWSYWC_LI