feat: AI / ML Api integration #4572
base: main
Conversation
D1m7asis commented on Jun 3, 2025
Pull Request Overview
This PR introduces integration for the AI / ML API into the Chat Models component, wrapping the external API endpoints with a new node interface and corresponding credential management.
- Added a new ChatAIMLAPI node to interact with the AI / ML API.
- Implemented a loadModels method to dynamically fetch available chat-completion models.
- Introduced a new credential class to securely manage the AI / ML API key.
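A credential class along the lines described above might look roughly like the sketch below. The class name, field values, and the simplified `INodeParams` stand-in are assumptions for illustration, not the PR's actual code:

```typescript
// Simplified stand-in for Flowise's credential input type (assumption).
interface INodeParams {
    label: string
    name: string
    type: string
}

// Hypothetical sketch of a credential class managing the AI / ML API key.
// All field values here are illustrative.
class AIMLApiCredential {
    label = 'AI/ML API'
    name = 'AIMLApiKey'
    version = 1.0
    description = 'Sign up at aimlapi.com to obtain an API key'
    inputs: INodeParams[] = [
        { label: 'AI/ML API Key', name: 'aimlApiKey', type: 'password' }
    ]
}
```

Declaring the key input as `type: 'password'` keeps it masked in the UI, and populating `description` gives end users the context the review comment below the fold asks about.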
Reviewed Changes
Copilot reviewed 2 out of 3 changed files in this pull request and generated 2 comments.
File | Description
---|---
packages/components/nodes/chatmodels/ChatAIMLApi/ChatAIMLAPI.ts | Adds the ChatAIMLAPI node implementation and dynamic model loading method.
packages/components/credentials/AIMLApi.credential.ts | Introduces the credential class for managing AI / ML API keys.
Comments suppressed due to low confidence (1)
packages/components/credentials/AIMLApi.credential.ts:7

- [nitpick] Consider providing a value for the `description` property in the credential class to enhance clarity and documentation for end users.

```ts
description: string
```
```ts
async listModels(): Promise<INodeOptionsValue[]> {
    const returnData: INodeOptionsValue[] = []
    try {
        const response = await fetch('https://api.aimlapi.com/v1/models')
```
Consider checking response.ok before processing the JSON response. For example, add logic to throw an error if the response status is not OK to ensure proper error handling.
```suggestion
const response = await fetch('https://api.aimlapi.com/v1/models')
if (!response.ok) {
    throw new Error(`Failed to fetch models: ${response.status} ${response.statusText}`)
}
```
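Putting the suggestion together, the full `listModels` flow could be sketched as follows. The `{ data: [{ id }] }` response shape is an assumption (it mirrors OpenAI-compatible `/v1/models` endpoints), and `toNodeOptions` is a hypothetical helper introduced here to keep the parsing testable:

```typescript
// Minimal stand-in for Flowise's INodeOptionsValue interface (assumption).
interface INodeOptionsValue {
    label: string
    name: string
}

// Map a /v1/models payload into node dropdown options.
// The { data: [{ id }] } shape is assumed from OpenAI-compatible APIs.
function toNodeOptions(payload: { data: { id: string }[] }): INodeOptionsValue[] {
    return payload.data.map((m) => ({ label: m.id, name: m.id }))
}

async function listModels(): Promise<INodeOptionsValue[]> {
    const response = await fetch('https://api.aimlapi.com/v1/models')
    if (!response.ok) {
        // Fail fast instead of trying to parse an error body as JSON.
        throw new Error(`Failed to fetch models: ${response.status} ${response.statusText}`)
    }
    return toNodeOptions(await response.json())
}
```

Checking `response.ok` before calling `response.json()` matters because `fetch` does not reject on HTTP 4xx/5xx; without the guard, an error payload would be parsed as if it were a model list.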
```ts
//@ts-ignore
loadMethods = {
```
[nitpick] Review the necessity of the ts-ignore directive for loadMethods; updating the types may improve maintainability and help catch potential issues at compile time.
```suggestion
loadMethods: {
    listModels: () => Promise<INodeOptionsValue[]>
} = {
```
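A typed version of this property, sketched under the assumption that `loadMethods` only exposes `listModels` (the class name and the fixed return value below are illustrative, not the PR's code):

```typescript
interface INodeOptionsValue {
    label: string
    name: string
}

// Hypothetical sketch: an explicit property type annotation removes
// the need for //@ts-ignore and lets the compiler check callers.
class ChatAIMLAPISketch {
    loadMethods: { listModels: () => Promise<INodeOptionsValue[]> } = {
        // Returns a fixed list here purely for illustration.
        listModels: async () => [{ label: 'gpt-4o', name: 'gpt-4o' }]
    }
}
```

With the annotation in place, a typo such as `loadMethods.listModel()` becomes a compile-time error rather than a runtime failure hidden by the suppression directive.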