This repository was archived by the owner on Jan 6, 2025. It is now read-only.
Merged
9 changes: 3 additions & 6 deletions .github/workflows/build.yml
@@ -6,13 +6,14 @@ on:
- 'test/**'
- 'package.json'
- 'tsconfig.json'
- .github/workflows/build.yml
pull_request:
paths:
- 'src/**'
- 'test/**'
- 'package.json'
- 'tsconfig.json'

- .github/workflows/build.yml
jobs:
Ubuntu:
runs-on: ubuntu-latest
@@ -21,12 +22,8 @@ jobs:
- uses: actions/setup-node@v4
with:
node-version: 20.x
- uses: pnpm/action-setup@v2
with:
version: 8

- name: Install dependencies
run: pnpm install
run: npm install
- name: Build
run: npm run build
- name: Test
233 changes: 232 additions & 1 deletion README.md
@@ -65,7 +65,7 @@ main().catch(console.error);
## Features
For supported features, please read the description comments of each component.

I'm preparing documentation and playground website of `@wrtnio/openai-function-schema` features. Until that, please read below components' description comments. Even though you have to read source code of each component, but description comments would satisfy you.
I'm preparing a documentation and playground website for `@wrtnio/openai-function-schema` features. Until then, please read the description comments of the components below. Even though you may have to read the source code of each component, the description comments should satisfy you.

- Schema Definitions
- [`IOpenAiDocument`](https://github.com/wrtnio/openai-function-schema/blob/master/src/structures/IOpenAiDocument.ts): OpenAI function metadata collection with options
@@ -76,3 +76,234 @@ I'm preparing documentation and playground website of `@wrtnio/openai-function-s
- [`OpenAiFetcher`](https://github.com/wrtnio/openai-function-schema/blob/master/src/OpenAiFetcher.ts): Function call executor with `IOpenAiFunction`
- [`OpenAiDataCombiner`](https://github.com/wrtnio/openai-function-schema/blob/master/src/OpenAiDataCombiner.ts): Data combiner for LLM function call with human composed data
- [`OpenAiTypeChecker`](https://github.com/wrtnio/openai-function-schema/blob/master/src/OpenAiTypeChecker.ts): Type checker for `IOpenAiSchema`

### Command Line Interface
```bash
########
# LAUNCH CLI
########
# PRIOR TO NODE V20
npm install -g @wrtnio/openai-function-schema
npx wofs

# SINCE NODE V20
npx @wrtnio/openai-function-schema

########
# PROMPT
########
--------------------------------------------------------
Swagger to OpenAI Function Call Schema Converter
--------------------------------------------------------
? Swagger file path: test/swagger.json
? OpenAI Function Call Schema file path: test/plain.json
? Whether to wrap parameters into an object with keyword or not: No
```

Convert a Swagger document into an OpenAI function call schema file with a single CLI command.

If you run `npx @wrtnio/openai-function-schema` (or `npx wofs` after the global setup), the CLI (Command Line Interface) will prompt for those arguments. After you fill in all of them, the OpenAI function call schema file of [`IOpenAiDocument`](https://github.com/wrtnio/openai-function-schema/blob/master/src/structures/IOpenAiDocument.ts) type will be created at the target location.
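For orientation, the created file is a JSON serialization of [`IOpenAiDocument`](https://github.com/wrtnio/openai-function-schema/blob/master/src/structures/IOpenAiDocument.ts). The sketch below only illustrates the rough shape implied by this README (a `functions` array whose elements carry `method`/`path`, plus `options.keyword`); the remaining field details are hypothetical, so consult the actual interface before relying on them.

```typescript
// Hypothetical sketch of the generated document's shape. Field names
// beyond `functions`, `method`, `path`, and `options.keyword` are
// illustrative; see IOpenAiDocument.ts for the real interface.
interface FunctionSketch {
  method: string; // HTTP method, e.g. "put"
  path: string; // route path, e.g. "/bbs/articles"
  parameters: object[]; // JSON schemas of the parameters
}
interface DocumentSketch {
  options: { keyword: boolean };
  functions: FunctionSketch[];
}

const sketch: DocumentSketch = {
  options: { keyword: false },
  functions: [
    {
      method: "put",
      path: "/bbs/articles",
      parameters: [{ type: "string" }, { type: "object" }],
    },
  ],
};
console.log(sketch.functions.length); // 1
```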

If you want to specify the arguments without prompting, you can pass them as options like below:

```bash
# PRIOR TO NODE V20
npm install -g @wrtnio/openai-function-schema
npx wofs --input swagger.json --output openai.json --keyword false

# SINCE NODE V20
npx @wrtnio/openai-function-schema \
  --input swagger.json \
  --output openai.json \
  --keyword false
```




### Library API
If you want to utilize `@wrtnio/openai-function-schema` at the API level, start by composing an [`IOpenAiDocument`](https://github.com/wrtnio/openai-function-schema/blob/master/src/structures/IOpenAiDocument.ts) through the `OpenAiComposer.document()` method.

After composing the [`IOpenAiDocument`](https://github.com/wrtnio/openai-function-schema/blob/master/src/structures/IOpenAiDocument.ts) data, you may provide the nested [`IOpenAiFunction`](https://github.com/wrtnio/openai-function-schema/blob/master/src/structures/IOpenAiFunction.ts) instances to OpenAI, and OpenAI composes the arguments through its function calling feature. With those automatically composed arguments, you can execute the function call through the `OpenAiFetcher.execute()` method.
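The hand-off to OpenAI itself is outside this library's examples, but as a sketch, each function's metadata could be mapped to a chat-completion tool definition as below. The `FunctionLike` shape (`name`, `description`, `parameters`) is an assumption modeled on [`IOpenAiFunction`](https://github.com/wrtnio/openai-function-schema/blob/master/src/structures/IOpenAiFunction.ts), not the library's exact type.

```typescript
// Sketch: turning function metadata into an OpenAI "tools" entry.
// The FunctionLike shape is an assumption modeled on IOpenAiFunction,
// not the library's exact type.
interface FunctionLike {
  name: string;
  description?: string;
  parameters: object[]; // JSON schemas, one per parameter
}

const toTool = (func: FunctionLike) => ({
  type: "function" as const,
  function: {
    name: func.name,
    description: func.description ?? "",
    // with a keyword-parameterized schema, there is a single object schema
    parameters: func.parameters[0],
  },
});

const tool = toTool({
  name: "bbs_article_update",
  description: "Update a BBS article",
  parameters: [{ type: "object", properties: {} }],
});
console.log(tool.function.name); // "bbs_article_update"
```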

Here is the example code that composes and executes an [`IOpenAiFunction`](https://github.com/wrtnio/openai-function-schema/blob/master/src/structures/IOpenAiFunction.ts):

- Test Function: [test_fetcher_positional_bbs_article_update.ts](https://github.com/wrtnio/openai-function-schema/blob/main/test/features/fetcher/positional/test_fetcher_positional_bbs_article_update.ts)
- Backend Server Code: [BbsArticlesController.ts](https://github.com/wrtnio/openai-function-schema/blob/main/test/controllers/BbsArticlesController.ts)

```typescript
import {
  IOpenAiDocument,
  IOpenAiFunction,
  OpenAiComposer,
  OpenAiFetcher,
} from "@wrtnio/openai-function-schema";
import fs from "fs";
import typia from "typia";
import { v4 } from "uuid";

import { IBbsArticle } from "../../../api/structures/IBbsArticle";

const main = async (): Promise<void> => {
  // COMPOSE OPENAI FUNCTION CALL SCHEMAS
  const swagger = JSON.parse(
    await fs.promises.readFile("swagger.json", "utf8"),
  );
  const document: IOpenAiDocument = OpenAiComposer.document({ swagger });

  // EXECUTE OPENAI FUNCTION CALL
  const func: IOpenAiFunction = document.functions.find(
    (f) => f.method === "put" && f.path === "/bbs/articles",
  )!;
  const article: IBbsArticle = await OpenAiFetcher.execute({
    document,
    function: func,
    connection: { host: "http://localhost:3000" },
    arguments: [
      // imagine that arguments are composed by OpenAI
      v4(),
      typia.random<IBbsArticle.ICreate>(),
    ],
  });
  typia.assert(article);
};
main().catch(console.error);
```

By the way, the target operation of the example code above has multiple parameters. If you configure a function to take only one parameter by wrapping them all into a single object type, the OpenAI function calling feature composes arguments a bit more efficiently than in the multiple-parameters case.

Such a single object-typed parameter is called a `keyword parameter`, and `@wrtnio/openai-function-schema` supports keyword-parameterized function schemas. When composing an [`IOpenAiDocument`](https://github.com/wrtnio/openai-function-schema/blob/master/src/structures/IOpenAiDocument.ts) through the `OpenAiComposer.document()` method, set `options.keyword` to `true`, and every [`IOpenAiFunction`](https://github.com/wrtnio/openai-function-schema/blob/master/src/structures/IOpenAiFunction.ts) instance will be keyword parameterized. Also, `OpenAiFetcher` understands the keyword-parameterized function specification, so it performs proper execution by automatically decomposing the arguments.
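Conceptually, keyword parameterization just wraps the positional arguments into one named object. The helper below is a minimal illustration of that transformation, not a library API:

```typescript
// Illustrative helper (not a library API): wrap positional arguments
// into one keyword object, given parameter names in declaration order.
const toKeywordArguments = (
  names: string[],
  positional: unknown[],
): [Record<string, unknown>] => {
  const wrapped: Record<string, unknown> = {};
  names.forEach((name, i) => (wrapped[name] = positional[i]));
  return [wrapped]; // still an array, but with a single object element
};

// positional style: ["some-uuid", { title: "hello" }]
// keyword style:    [{ id: "some-uuid", body: { title: "hello" } }]
const args = toKeywordArguments(
  ["id", "body"],
  ["some-uuid", { title: "hello" }],
);
```

Because every field is named, the LLM has one self-describing object schema to fill instead of an ordered tuple.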

Here is the example code for keyword parameterization:

- Test Function: [test_fetcher_keyword_bbs_article_update.ts](https://github.com/wrtnio/openai-function-schema/blob/main/test/features/fetcher/keyword/test_fetcher_keyword_bbs_article_update.ts)
- Backend Server Code: [BbsArticlesController.ts](https://github.com/wrtnio/openai-function-schema/blob/main/test/controllers/BbsArticlesController.ts)

```typescript
import {
  IOpenAiDocument,
  IOpenAiFunction,
  OpenAiComposer,
  OpenAiFetcher,
} from "@wrtnio/openai-function-schema";
import fs from "fs";
import typia from "typia";
import { v4 } from "uuid";

import { IBbsArticle } from "../../../api/structures/IBbsArticle";

const main = async (): Promise<void> => {
  // COMPOSE OPENAI FUNCTION CALL SCHEMAS
  const swagger = JSON.parse(
    await fs.promises.readFile("swagger.json", "utf8"),
  );
  const document: IOpenAiDocument = OpenAiComposer.document({
    swagger,
    options: {
      keyword: true, // keyword parameterizing
    },
  });

  // EXECUTE OPENAI FUNCTION CALL
  const func: IOpenAiFunction = document.functions.find(
    (f) => f.method === "put" && f.path === "/bbs/articles",
  )!;
  const article: IBbsArticle = await OpenAiFetcher.execute({
    document,
    function: func,
    connection: { host: "http://localhost:3000" },
    arguments: [
      // imagine that argument is composed by OpenAI
      {
        id: v4(),
        body: typia.random<IBbsArticle.ICreate>(),
      },
    ],
  });
  typia.assert(article);
};
main().catch(console.error);
```

Finally, there can be special API operations where some arguments must be composed by the user, not by the LLM (Large Language Model). For example, if an API operation requires a file upload or a secret key identifier, that value must be composed manually by the user on the frontend application side.

For such cases, `@wrtnio/openai-function-schema` supports the special option [`IOpenAiDocument.IOptions.separate`](https://github.com/wrtnio/openai-function-schema/blob/master/src/structures/IOpenAiDocument.ts). If you configure this callback function, it is used to determine whether each value must be composed by the user or not. When the arguments are composed by both the user and LLM sides, you can combine them into one through the `OpenAiDataCombiner.parameters()` method, so that you can still execute the function call with the `OpenAiFetcher.execute()` method.
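Conceptually, the combination step merges the LLM-composed and human-composed keyword arguments field by field, with the human-side values filling the separated slots. The merge below is a simplified illustration only; the real logic lives in `OpenAiDataCombiner.parameters()`:

```typescript
// Illustrative deep merge of keyword-style arguments: values present
// on the human side fill the slots the LLM could not compose.
const isPlainObject = (v: unknown): v is Record<string, unknown> =>
  typeof v === "object" && v !== null && !Array.isArray(v);

const merge = (
  llm: Record<string, unknown>,
  human: Record<string, unknown>,
): Record<string, unknown> => {
  const output: Record<string, unknown> = { ...llm };
  for (const [key, value] of Object.entries(human))
    output[key] =
      isPlainObject(value) && isPlainObject(output[key])
        ? merge(output[key] as Record<string, unknown>, value)
        : value;
  return output;
};

const combined = merge(
  { body: { name: "Wrtn Technologies", email: "master@wrtn.io" } },
  { query: { secret: "something" }, body: { secretKey: "something" } },
);
// combined.body now carries name, email, and secretKey together,
// and combined.query comes entirely from the human side.
```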

Here is the example code for such a special case:

- Test Function: [test_combiner_keyword_parameters_query.ts](https://github.com/wrtnio/openai-function-schema/blob/main/test/features/combiner/test_combiner_keyword_parameters_query.ts)
- Backend Server Code: [MembershipController.ts](https://github.com/wrtnio/openai-function-schema/blob/main/test/controllers/MembershipController.ts)

```typescript
import {
  IOpenAiDocument,
  IOpenAiFunction,
  IOpenAiSchema,
  OpenAiComposer,
  OpenAiDataCombiner,
  OpenAiFetcher,
  OpenAiTypeChecker,
} from "@wrtnio/openai-function-schema";
import fs from "fs";
import typia from "typia";

import { IMembership } from "../../api/structures/IMembership";

const main = async (): Promise<void> => {
  // COMPOSE OPENAI FUNCTION CALL SCHEMAS
  const swagger = JSON.parse(
    await fs.promises.readFile("swagger.json", "utf8"),
  );
  const document: IOpenAiDocument = OpenAiComposer.document({
    swagger,
    options: {
      keyword: true,
      separate: (schema: IOpenAiSchema) =>
        OpenAiTypeChecker.isString(schema) &&
        (schema["x-wrtn-secret-key"] !== undefined ||
          schema["contentMediaType"] !== undefined),
    },
  });

  // EXECUTE OPENAI FUNCTION CALL
  const func: IOpenAiFunction = document.functions.find(
    (f) => f.method === "patch" && f.path === "/membership/change",
  )!;
  const membership: IMembership = await OpenAiFetcher.execute({
    document,
    function: func,
    connection: { host: "http://localhost:3000" },
    arguments: OpenAiDataCombiner.parameters({
      function: func,
      llm: [
        // imagine that below argument is composed by OpenAI
        {
          body: {
            name: "Wrtn Technologies",
            email: "master@wrtn.io",
            password: "1234",
            age: 20,
            gender: 1,
          },
        },
      ],
      human: [
        // imagine that below argument is composed by human
        {
          query: {
            secret: "something",
          },
          body: {
            secretKey: "something",
            picture: "https://wrtn.io/logo.png",
          },
        },
      ],
    }),
  });
  typia.assert(membership);
};
main().catch(console.error);
```
16 changes: 12 additions & 4 deletions package.json
@@ -1,17 +1,22 @@
{
"name": "@wrtnio/openai-function-schema",
"version": "0.1.0",
"version": "0.1.1",
"description": "OpenAI LLM function schema from OpenAPI (Swagger) document",
"main": "lib/index.js",
"typings": "lib/index.d.ts",
"module": "lib/index.mjs",
"bin": {
"wofs": "lib/executable/wofs.js"
},
"scripts": {
"prepare": "ts-patch install",
"build": "npm run build:main && npm run build:test",
"build:main": "rimraf lib && tsc && rollup -c",
"build:test": "rimraf bin && tsc -p test/tsconfig.json",
"dev": "npm run build:test -- --watch",
"test": "node bin/test"
"test": "npm run test:api && npm run test:cli",
"test:api": "node bin/test",
"test:cli": "node bin/src/executable/wofs.js --input test/swagger.json --output test/plain.json --keyword false"
},
"keywords": [
"openai",
@@ -30,7 +35,10 @@
"license": "ISC",
"dependencies": {
"@nestia/fetcher": "^3.4.1",
"@samchon/openapi": "^0.3.0"
"@samchon/openapi": "^0.3.0",
"commander": "^10.0.0",
"inquirer": "^8.2.5",
"typia": "^6.4.0"
},
"devDependencies": {
"@nestia/core": "^3.4.1",
@@ -42,6 +50,7 @@
"@rollup/plugin-terser": "^0.4.4",
"@rollup/plugin-typescript": "^11.1.6",
"@trivago/prettier-plugin-sort-imports": "^4.3.0",
"@types/inquirer": "^8.2.5",
"@types/node": "^20.14.9",
"@types/uuid": "^10.0.0",
"nestia": "^5.3.1",
@@ -51,7 +60,6 @@
"ts-patch": "^3.2.1",
"typescript": "5.5.2",
"typescript-transform-paths": "^3.4.7",
"typia": "^6.4.0",
"uuid": "^10.0.0"
},
"files": [
4 changes: 3 additions & 1 deletion src/OpenAiComposer.ts
@@ -8,6 +8,7 @@ import {
} from "@samchon/openapi";
import { OpenApiTypeChecker } from "@samchon/openapi/lib/internal/OpenApiTypeChecker";
import { OpenApiV3Downgrader } from "@samchon/openapi/lib/internal/OpenApiV3Downgrader";
import typia from "typia";

import { OpenAiSchemaSeparator } from "./internal/OpenAiSchemaSeparator";
import { IOpenAiSchema, ISwaggerOperation } from "./module";
@@ -92,7 +93,8 @@ export namespace OpenAiComposer {
*/
export const document = (props: IProps): IOpenAiDocument => {
// LIST UP ARGUMENTS
const swagger: ISwagger = OpenApi.convert<any, any>(props.swagger) as any;
typia.assert(props);
const swagger: ISwagger = OpenApi.convert(props.swagger);
const options: IOpenAiDocument.IOptions = {
keyword: props.options?.keyword ?? false,
separate: props.options?.separate ?? null,
Expand Down