T-System's LLMHUB is added as model provider backend. #139
Conversation
Adding LLMHub model provider
…-Systems's LLM Hub Models
🦋 Changeset detected. Latest commit: dd79094. The changes in this PR will be included in the next version bump. This PR includes changesets to release 1 package.
Walkthrough: This update introduces support for T-Systems as a new model provider. It expands environment configurations and API interactions related to T-Systems' LLM Hub API, includes necessary dependencies, and updates the model configuration options and types.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant User
    participant App
    participant LLMHubAPI
    User->>App: Choose T-Systems provider
    App->>User: Prompt for API Key
    User->>App: Enter API Key
    App->>LLMHubAPI: Retrieve Model Choices with API Key
    LLMHubAPI->>App: Return Model Choices
    App->>User: Display Model Choices
    User->>App: Select Desired Model
    App->>LLMHubAPI: Configure Selected Model
    LLMHubAPI->>App: Confirm Configuration
    App->>User: Display Configuration Success
```
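The "Retrieve Model Choices" step in the diagram could be sketched as a small helper. Note this is a hypothetical sketch, not code from this PR: the `/models` endpoint path and the `{ data: [{ id }] }` response shape are assumptions based on LLM Hub exposing an OpenAI-compatible API.

```typescript
// Hypothetical sketch of the "Retrieve Model Choices" step. The endpoint and
// response shape are assumptions (OpenAI-compatible API), not the PR's code.
type ModelListResponse = { data: { id: string }[] };

// Pure helper: pull the model ids out of a model-list response body.
function extractModelIds(body: ModelListResponse): string[] {
  return body.data.map((m) => m.id);
}

// Network wrapper: fetch the model list with the user's API key.
async function fetchLLMHubModels(
  apiKey: string,
  baseUrl: string,
): Promise<string[]> {
  const res = await fetch(`${baseUrl}/models`, {
    headers: { Authorization: `Bearer ${apiKey}` },
  });
  if (!res.ok) {
    throw new Error(`LLM Hub returned ${res.status}`);
  }
  return extractModelIds((await res.json()) as ModelListResponse);
}
```

Keeping the parsing in a pure function (`extractModelIds`) separate from the network call makes the model-choice step easy to unit-test without hitting the API.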
Actionable comments posted: 5
Outside diff range and nitpick comments (1)
helpers/python.ts (1)
Line range hint 137-150: Ensure proper scoping of local declarations in switch statements. The declaration of `dependencies` in the switch statement is potentially accessible by other cases. This can lead to unintended side effects.

```diff
 switch (modelConfig.provider) {
   case "ollama":
+  {
     dependencies.push({
       name: "llama-index-llms-ollama",
       version: "0.1.2",
     });
     dependencies.push({
       name: "llama-index-embeddings-ollama",
       version: "0.1.2",
     });
+  }
     break;
   // other cases...
 }
```
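As a self-contained illustration of why the braces matter (this is a hypothetical helper, not the PR's code): without a block, a `const` declared in one clause is scoped to the whole switch, so other clauses can see it; the braces confine it to its own clause.

```typescript
// Minimal illustration of the noSwitchDeclarations fix. `collectDeps` is a
// hypothetical helper; it only demonstrates the block-scoping idiom.
type Dependency = { name: string; version: string };

function collectDeps(provider: string): Dependency[] {
  const dependencies: Dependency[] = [];
  switch (provider) {
    case "ollama": {
      // The braces confine `ollamaDeps` to this clause; without them the
      // declaration would be visible (but uninitialized) in other clauses.
      const ollamaDeps: Dependency[] = [
        { name: "llama-index-llms-ollama", version: "0.1.2" },
        { name: "llama-index-embeddings-ollama", version: "0.1.2" },
      ];
      dependencies.push(...ollamaDeps);
      break;
    }
    default:
      break;
  }
  return dependencies;
}
```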
Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Files selected for processing (7)
- helpers/env-variables.ts (1 hunks)
- helpers/providers/index.ts (3 hunks)
- helpers/providers/llmhub.ts (1 hunks)
- helpers/python.ts (1 hunks)
- helpers/types.ts (1 hunks)
- templates/types/streaming/fastapi/app/llmhub.py (1 hunks)
- templates/types/streaming/fastapi/app/settings.py (2 hunks)
Additional context used
Ruff
templates/types/streaming/fastapi/app/llmhub.py
1-1: `llama_index.llms.openai.OpenAI` imported but unused (F401). Remove unused import: `llama_index.llms.openai.OpenAI`

2-2: `llama_index.llms.openai_like.OpenAILike` imported but unused (F401). Remove unused import: `llama_index.llms.openai_like.OpenAILike`

7-7: Statement ends with an unnecessary semicolon (E703). Remove unnecessary semicolon.

8-8: Statement ends with an unnecessary semicolon (E703). Remove unnecessary semicolon.

templates/types/streaming/fastapi/app/settings.py

156-156: Undefined name `model_map` (F821)
Biome
helpers/env-variables.ts
[error] 137-150: Other switch clauses can erroneously access this declaration. Wrap the declaration in a block to restrict its access to the switch clause. (lint/correctness/noSwitchDeclarations) The declaration is defined in this switch clause.
Unsafe fix: Wrap the declaration in a block.
Additional comments not posted (6)

templates/types/streaming/fastapi/app/llmhub.py (1)

10-13: Class definition is correctly implemented. The `TSIEmbedding` class properly inherits from `OpenAIEmbedding` and initializes its internal state correctly.

helpers/types.ts (1)

9-10: Type definitions are correctly updated to support the new model provider. The `ModelProvider` type now includes "t-systems", and the `ModelConfig` type has been updated with an optional `apiBase` field to accommodate the new provider's requirements. Also applies to: 13-13
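A minimal sketch of what the reviewed type change might look like. Only the "t-systems" member and the optional `apiBase` field come from the review; the other members, fields, and the example values are illustrative assumptions.

```typescript
// Sketch of the updated types. Only "t-systems" and `apiBase` are taken from
// the review; the other members and fields are illustrative assumptions.
type ModelProvider = "openai" | "ollama" | "t-systems";

type ModelConfig = {
  provider: ModelProvider;
  model: string;
  apiKey?: string;
  apiBase?: string; // optional base URL, needed by the T-Systems provider
};

// Example config for the new provider (values are placeholders).
const example: ModelConfig = {
  provider: "t-systems",
  model: "some-llmhub-model",
  apiBase: "https://example.invalid/llmhub",
};
```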
helpers/providers/index.ts (1)

8-8: Correctly integrated new model provider in configuration logic. The import of `askLLMHubQuestions` and the addition of "T-Systems" to the choices in the `askModelConfig` function are correctly implemented. The handling of the "t-systems" case in the switch statement ensures that the new provider is integrated smoothly. Also applies to: 33-38, 61-63

templates/types/streaming/fastapi/app/settings.py (1)

22-23: Ensure consistent error handling for new model provider initialization. The addition of `init_llmhub` to handle T-Systems settings is consistent with the structure used for other providers. However, ensure that `init_llmhub` is robust against potential configuration errors, similar to other provider initializations.

helpers/python.ts (1)

170-179: Review added dependencies for T-Systems. The dependencies for T-Systems (`llama-index-agent-openai` and `llama-index-llms-openai-like`) have been added correctly according to the provider specification. Ensure these dependencies are available and the versions specified are correct.

helpers/env-variables.ts (1)

245-259: Review added environment variables for T-Systems. The environment variables for T-Systems are added correctly with appropriate descriptions and values. Make sure the base URL (`T_SYSTEMS_LLMHUB_BASE_URL`) and API key (`T_SYSTEMS_LLMHUB_API_KEY`) are securely handled and not exposed inappropriately.
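A hypothetical sketch of what the reviewed env-variable entries might look like. The variable names come from the review; the `getTSystemsEnvs` helper name, the `EnvVar` shape, and the descriptions are illustrative assumptions.

```typescript
// Sketch of the T-Systems env-variable entries discussed above. Only the
// variable names are from the review; everything else is an assumption.
type EnvVar = { name: string; description?: string; value?: string };

function getTSystemsEnvs(apiUrl: string): EnvVar[] {
  return [
    {
      name: "T_SYSTEMS_LLMHUB_BASE_URL",
      description: "The base URL for the T-Systems LLM Hub API.",
      value: apiUrl,
    },
    {
      // No default value: the API key must be supplied by the user.
      name: "T_SYSTEMS_LLMHUB_API_KEY",
      description: "The API key for T-Systems LLM Hub.",
    },
  ];
}
```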
Actionable comments posted: 0
Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Files selected for processing (1)
- templates/types/streaming/fastapi/app/llmhub.py (1 hunks)
Files skipped from review as they are similar to previous changes (1)
- templates/types/streaming/fastapi/app/llmhub.py
Actionable comments posted: 1
Outside diff range and nitpick comments (2)
helpers/env-variables.ts (1)
Line range hint 139-152: Encapsulate switch case declarations to avoid scope leakage. Variables declared within a switch case are accessible in the entire switch block, which can lead to unintended behavior or bugs.

```diff
   case "pinecone":
+  {
     return [
       {
         name: "PINECONE_API_KEY",
         description:
           "Configuration for Pinecone vector store\nThe Pinecone API key.",
       },
       {
         name: "PINECONE_ENVIRONMENT",
       },
       {
         name: "PINECONE_INDEX_NAME",
       },
     ];
+  }
```

questions.ts (1)

Line range hint 571-578: Remove unnecessary else clause for cleaner code. The `else` clause is redundant as all previous branches of the conditional structure end with a `break` statement, making the `else` unnecessary and potentially confusing.

```diff
 switch (selectedSource) {
   case "exampleFile": {
     program.dataSources.push(EXAMPLE_FILE);
     break;
   }
   case "file":
   case "folder": {
     const selectedPaths = await selectLocalContextData(selectedSource);
     for (const p of selectedPaths) {
       program.dataSources.push({
         type: "file",
         config: {
           path: p,
         },
       });
     }
     break;
   }
   case "web": {
     const { baseUrl } = await prompts(
       {
         type: "text",
         name: "baseUrl",
         message: "Please provide base URL of the website: ",
         initial: "https://www.llamaindex.ai",
         validate: (value: string) => {
           if (!value.includes("://")) {
             value = `https://${value}`;
           }
           const urlObj = new URL(value);
           if (urlObj.protocol !== "https:" && urlObj.protocol !== "http:") {
             return `URL=${value} has invalid protocol, only allow http or https`;
           }
           return true;
         },
       },
       questionHandlers,
     );
     program.dataSources.push({
       type: "web",
       config: {
         baseUrl,
         prefix: baseUrl,
         depth: 1,
       },
     });
     break;
   }
   case "db": {
     const dbPrompts: prompts.PromptObject<string>[] = [
       {
         type: "text",
         name: "uri",
         message:
           "Please enter the connection string (URI) for the database.",
         initial: "mysql+pymysql://user:pass@localhost:3306/mydb",
         validate: (value: string) => {
           if (!value) {
             return "Please provide a valid connection string";
           } else if (
             !(
               value.startsWith("mysql+pymysql://") ||
               value.startsWith("postgresql+psycopg://")
             )
           ) {
             return "The connection string must start with 'mysql+pymysql://' for MySQL or 'postgresql+psycopg://' for PostgreSQL";
           }
           return true;
         },
       },
       // Only ask for a query, user can provide more complex queries in the config file later
       {
         type: (prev) => (prev ? "text" : null),
         name: "queries",
         message: "Please enter the SQL query to fetch data:",
         initial: "SELECT * FROM mytable",
       },
     ];
     program.dataSources.push({
       type: "db",
       config: await prompts(dbPrompts, questionHandlers),
     });
   }
 }
-else {
-  // user doesn't want another data source or any data source
-  break;
-}
```
Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Files selected for processing (7)
- helpers/env-variables.ts (2 hunks)
- helpers/providers/index.ts (3 hunks)
- helpers/providers/llmhub.ts (1 hunks)
- helpers/types.ts (1 hunks)
- questions.ts (1 hunks)
- templates/types/streaming/fastapi/app/llmhub.py (1 hunks)
- templates/types/streaming/fastapi/app/settings.py (3 hunks)
Files skipped from review as they are similar to previous changes (5)
- helpers/providers/index.ts
- helpers/providers/llmhub.ts
- helpers/types.ts
- templates/types/streaming/fastapi/app/llmhub.py
- templates/types/streaming/fastapi/app/settings.py
Additional context used
Biome
helpers/env-variables.ts
[error] 139-152: Other switch clauses can erroneously access this declaration. Wrap the declaration in a block to restrict its access to the switch clause. The declaration is defined in this switch clause.
Unsafe fix: Wrap the declaration in a block.
(lint/correctness/noSwitchDeclarations)
questions.ts
[error] 571-578: This else clause can be omitted because previous branches break early.
Unsafe fix: Omit the else clause.
(lint/style/noUselessElse)
Additional comments not posted (2)
helpers/env-variables.ts (1)
11-11: Ensure proper handling of external imports. The import of `TSYSTEMS_LLMHUB_API_URL` from `./providers/llmhub` is crucial for the integration of T-Systems. Ensure that this module exports the constant correctly to avoid runtime errors.

Verification successful

The export of `TSYSTEMS_LLMHUB_API_URL` is correctly defined in `llmhub.ts`.

- The constant `TSYSTEMS_LLMHUB_API_URL` is properly exported from `./helpers/providers/llmhub.ts`.

Scripts executed

The following scripts were executed for the analysis:

Script:

```shell
#!/bin/bash
# Description: Verify the export of TSYSTEMS_LLMHUB_API_URL from llmhub module.
# Test: Search for the export statement in llmhub.ts. Expect: Export of TSYSTEMS_LLMHUB_API_URL.
rg --type typescript "export const TSYSTEMS_LLMHUB_API_URL =" ./helpers/providers/llmhub.ts
```

Length of output: 128

Script:

```shell
#!/bin/bash
# Description: Verify the export of TSYSTEMS_LLMHUB_API_URL from llmhub module.
# Test: Search for the export statement in llmhub.ts. Expect: Export of TSYSTEMS_LLMHUB_API_URL.
rg "export const TSYSTEMS_LLMHUB_API_URL =" ./helpers/providers/llmhub.ts
```

Length of output: 114
questions.ts (1)
474-474: Verify integration of new framework parameter in function calls. The addition of the `framework` parameter to the `askModelConfig` function call is crucial for framework-specific configurations. Ensure that this change is propagated correctly throughout the codebase.

Verification successful

The integration of the `framework` parameter in the `askModelConfig` function is correct. The `framework` parameter is properly included in both the function call and the function definition.

Function Call: `questions.ts`

```typescript
const modelConfig = await askModelConfig({
  openAiKey,
  askModels: program.askModels ?? false,
  framework: program.framework,
});
```

Function Definition: `helpers/providers/index.ts`

```typescript
export async function askModelConfig({
  askModels,
  openAiKey,
  framework,
}: ModelConfigQuestionsParams): Promise<ModelConfig> {
  // Function implementation
```

Scripts executed

The following scripts were executed for the analysis:

Script:

```shell
#!/bin/bash
# Description: Verify the correct usage of the `framework` parameter in `askModelConfig` function calls.
# Test: Search for the function usage. Expect: Correct usage of the new `framework` parameter.
rg --type typescript "askModelConfig\("
```

Length of output: 76

Script:

```shell
#!/bin/bash
# Description: Verify the correct usage of the `framework` parameter in `askModelConfig` function calls.
# Test: Search for the function usage. Expect: Correct usage of the new `framework` parameter.
rg --type ts "askModelConfig\("
```

Length of output: 159

Script:

```shell
#!/bin/bash
# Description: Verify the correct usage of the `framework` parameter in `askModelConfig` function call and definition.

# Check the `askModelConfig` function call in `questions.ts`
rg --type ts "askModelConfig\(" questions.ts

# Check the `askModelConfig` function definition in `helpers/providers/index.ts`
rg --type ts "export async function askModelConfig" helpers/providers/index.ts
```

Length of output: 213

Script:

```shell
#!/bin/bash
# Description: Extract the relevant lines to verify the `framework` parameter in `askModelConfig` function call and definition.

# Extract the `askModelConfig` function call in `questions.ts`
rg --type ts -A 5 "askModelConfig\(" questions.ts

# Extract the `askModelConfig` function definition in `helpers/providers/index.ts`
rg --type ts -A 5 "export async function askModelConfig" helpers/providers/index.ts
```

Length of output: 523
Actionable comments posted: 0
Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Files selected for processing (1)
- .changeset/silent-buses-dance.md (1 hunks)
Files skipped from review due to trivial changes (1)
- .changeset/silent-buses-dance.md
Actionable comments posted: 0
Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Files selected for processing (1)
- helpers/providers/index.ts (3 hunks)
Files skipped from review as they are similar to previous changes (1)
- helpers/providers/index.ts
Actionable comments posted: 0
Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Files selected for processing (4)
- helpers/env-variables.ts (2 hunks)
- helpers/python.ts (1 hunks)
- helpers/types.ts (1 hunks)
- questions.ts (1 hunks)
Files skipped from review as they are similar to previous changes (4)
- helpers/env-variables.ts
- helpers/python.ts
- helpers/types.ts
- questions.ts
Actionable comments posted: 0
Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Files selected for processing (1)
- helpers/providers/llmhub.ts (1 hunks)
Files skipped from review as they are similar to previous changes (1)
- helpers/providers/llmhub.ts
marcusschiesser left a comment
Thanks @mohdamir
We have added T-Systems as a model provider, letting users choose from various state-of-the-art LLMs provided by T-Systems.

Users will need an API key to use the models, which can be requested from the T-Systems LLM Hub team. To learn more about the models available in LLM Hub, please refer to the following URL:
https://docs.llmhub.t-systems.net/
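At runtime, the generated app reads the two settings added in this PR from the environment. The sketch below shows how a client might pick them up; only the variable names come from the PR, while the `llmhubClientOptions` helper and its error message are illustrative assumptions.

```typescript
// Hypothetical helper showing how the two env vars added in this PR might be
// consumed. Only the variable names are from the PR; the function is a sketch.
function llmhubClientOptions(
  env: Record<string, string | undefined>,
): { baseURL: string; apiKey: string } {
  const baseURL = env.T_SYSTEMS_LLMHUB_BASE_URL;
  const apiKey = env.T_SYSTEMS_LLMHUB_API_KEY;
  if (!baseURL || !apiKey) {
    throw new Error(
      "T_SYSTEMS_LLMHUB_BASE_URL and T_SYSTEMS_LLMHUB_API_KEY must be set",
    );
  }
  return { baseURL, apiKey };
}
```

Failing fast when either variable is missing surfaces misconfiguration at startup rather than on the first model call.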
Summary by CodeRabbit
New Features
Enhancements
- Updated `ModelConfig` to include an optional `apiBase` field.
- Updated the `askModelConfig` function to handle T-Systems provider specifics.

Dependencies