
Conversation

@zees-dev (Collaborator) commented on Jan 6, 2025

Summary of Changes

This PR introduces Blockless LLM integration for the SDK, allowing SDK consumers to integrate with the host runtime's LLM module and interact with it.
Currently, only setting the model (by model name) and prompting the model are supported.

FFI interface used to call into the host LLM module:

#[link(wasm_import_module = "blockless_llm")]
extern "C" {
    // Select a model by name; the host returns a handle via `fd`.
    fn llm_set_model_request(model_ptr: *const u8, model_len: u32, fd: *mut u32) -> i32;
    // Read back the model currently set on the given handle.
    fn llm_get_model_response(buf: *mut u8, size: u32, num: *mut u32, fd: u32) -> i32;
    // Set and read model options for the given handle.
    fn llm_set_model_options_request(options_ptr: *const u8, options_len: u32, fd: u32) -> i32;
    fn llm_get_model_options(buf: *mut u8, size: u32, num: *mut u32, fd: u32) -> i32;
    // Send a prompt, then read the response into a caller-provided buffer.
    fn llm_prompt_request(prompt_ptr: *const u8, prompt_len: u32, fd: u32) -> i32;
    fn llm_read_prompt_response(buf: *mut u8, size: u32, num: *mut u32, fd: u32) -> i32;
    // Release the handle and any host-side resources.
    fn llm_close(fd: u32) -> i32;
}

An example has been included which demonstrates how to interact with two different LLM models.
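For a rough idea of the call pattern, here is a minimal sketch of how a consumer might drive these imports from Rust. The `set_model`/`prompt` helpers, the buffer size, the assumption that a return code of 0 means success, and the model names are illustrative assumptions, not the SDK's actual wrapper API; only the imports needed for the sketch are repeated so it stands alone.

// Minimal sketch (not the SDK's actual wrapper API) of calling the host LLM module.
#[link(wasm_import_module = "blockless_llm")]
extern "C" {
    fn llm_set_model_request(model_ptr: *const u8, model_len: u32, fd: *mut u32) -> i32;
    fn llm_prompt_request(prompt_ptr: *const u8, prompt_len: u32, fd: u32) -> i32;
    fn llm_read_prompt_response(buf: *mut u8, size: u32, num: *mut u32, fd: u32) -> i32;
    fn llm_close(fd: u32) -> i32;
}

/// Hypothetical helper: select a model by name and return the host handle (fd).
fn set_model(model: &str) -> Result<u32, i32> {
    let mut fd: u32 = 0;
    let code = unsafe { llm_set_model_request(model.as_ptr(), model.len() as u32, &mut fd) };
    if code == 0 { Ok(fd) } else { Err(code) }
}

/// Hypothetical helper: send a prompt and read the response back as a string.
fn prompt(fd: u32, text: &str) -> Result<String, i32> {
    let code = unsafe { llm_prompt_request(text.as_ptr(), text.len() as u32, fd) };
    if code != 0 {
        return Err(code);
    }
    let mut buf = vec![0u8; 16 * 1024]; // assumed maximum response size
    let mut written: u32 = 0;
    let code = unsafe {
        llm_read_prompt_response(buf.as_mut_ptr(), buf.len() as u32, &mut written, fd)
    };
    if code != 0 {
        return Err(code);
    }
    buf.truncate(written as usize);
    Ok(String::from_utf8_lossy(&buf).into_owned())
}

fn main() {
    // Model names are illustrative; availability depends on the host runtime.
    for model in ["model-a", "model-b"] {
        let fd = set_model(model).expect("failed to set model");
        let answer = prompt(fd, "Hello from a Blockless WASM module!").expect("prompt failed");
        println!("{model}: {answer}");
        unsafe { llm_close(fd) };
    }
}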

Note: The LLM module only works in the browser runtime.

@zees-dev requested review from Joinhack, dmikey and uditdc on January 6, 2025 23:00
@Joinhack merged commit 575bb49 into main on Jan 6, 2025
@zees-dev deleted the llm-module branch on January 6, 2025 23:38