Ospp/new llm embedding #19727
Conversation
As part of our document LLM support, we are introducing the `LLM_EXTRACT_TEXT` function. This function extracts text from PDF files and writes the extracted text to a specified text file; the extractor type can be specified by the third argument.
Closed due to no activity.
User description
What type of PR is this?
Which issue(s) this PR fixes:
issue #18664
What this PR does / why we need it:
As part of our document LLM support, we are introducing the `LLM_EMBEDDING` function. This function can embed a string, or embed the content of a specified txt file, using LLM platforms like Ollama and LLM models like llama3. Three global variables are introduced so users can customize their own LLM platform, proxy, and model.
Three global variables:
- `llm_embedding_platform`: default is `ollama`
- `llm_server_proxy`: default is `http://localhost:11434/api/embed`
- `llm_model`: default is `llama3`

Usage: `llm_embedding(<input txt datalink>);` or `llm_embedding(<input string>);`
Return value: a vector of 4096 32-bit floating-point numbers.
Note:
Run `ollama run llama3` in the shell before using the embedding function.
Example SQL:
Example return:
PR Type
Enhancement, Tests
Description
Introduces three new built-in functions: `LLM_CHUNK`, `LLM_EXTRACT_TEXT`, and `LLM_EMBEDDING`.
Changes walkthrough 📝
6 files
- `func_llm.go`: Implement LLM functions for chunking, text extraction, and embedding (`pkg/sql/plan/function/func_llm.go`)
- `list_builtIn.go`: Register new LLM functions and define overloads (`pkg/sql/plan/function/list_builtIn.go`)
- `ollama_service.go`: Implement Ollama service interaction for embeddings (`pkg/sql/plan/function/ollama_service.go`)
- `embedding_service.go`: Define EmbeddingService interface and implement Ollama service (`pkg/sql/plan/function/embedding_service.go`)
- `variables.go`: Add system variables for LLM embedding configuration (`pkg/frontend/variables.go`)
- `function_id.go`: Add function IDs for new LLM functions (`pkg/sql/plan/function/function_id.go`)

7 files
- `func_llm_test.go`: Add unit tests for LLM chunking and extraction functions (`pkg/sql/plan/function/func_llm_test.go`)
- `func_llm_chunk.result`: Add expected results for LLM chunking function tests (`test/distributed/cases/function/func_llm_chunk.result`)
- `func_llm_chunk.sql`: Add SQL test cases for LLM chunking function (`test/distributed/cases/function/func_llm_chunk.sql`)
- `func_llm_extract_file.result`: Add expected results for LLM text extraction function tests (`test/distributed/cases/function/func_llm_extract_file.result`)
- `func_llm_extract_file.sql`: Add SQL test cases for LLM text extraction function (`test/distributed/cases/function/func_llm_extract_file.sql`)
- `4.txt`: Add text file for LLM chunking test resources (`test/distributed/resources/llm_test/chunk/4.txt`)
- `func_llm_embedding.sql`: Add SQL test case for LLM embedding function (`test/distributed/cases/function/func_llm_embedding.sql`)

2 files
- `go.sum`: Update dependencies for PDF processing (`go.sum`)
- `go.mod`: Add PDF processing library to module dependencies (`go.mod`)

1 file
- `func_llm_embedding.result` (`test/distributed/cases/function/func_llm_embedding.result`)
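The splitting strategy behind the new `LLM_CHUNK` function is not spelled out on this page. Purely as an illustration of the general idea, a fixed-size chunker over runes could look like the sketch below; the `chunkText` name and the size parameter are invented, and the real function may well use a different scheme (sentence boundaries, overlap, etc.):

```go
package main

import "fmt"

// chunkText splits s into pieces of at most size runes.
// This is only an illustrative fixed-size strategy, not the
// PR's actual LLM_CHUNK implementation.
func chunkText(s string, size int) []string {
	if size <= 0 {
		return nil // guard against a non-positive chunk size
	}
	runes := []rune(s)
	var chunks []string
	for start := 0; start < len(runes); start += size {
		end := start + size
		if end > len(runes) {
			end = len(runes)
		}
		chunks = append(chunks, string(runes[start:end]))
	}
	return chunks
}

func main() {
	for _, c := range chunkText("abcdefghij", 4) {
		fmt.Println(c)
	}
}
```

Slicing over runes rather than bytes keeps multi-byte UTF-8 characters intact, which matters once chunked text is fed to an embedding model.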