epic: Refactor Inference logic from Server Logic #516

Closed
dan-homebrew opened this issue Apr 17, 2024 · 5 comments

dan-homebrew commented Apr 17, 2024

• Dynamic linking of "Inference Engines"

tikikun commented Apr 17, 2024

@vansangpfiev to research and estimate feasibility

@tikikun tikikun changed the title Refactor Inference logic from Server Logic epic: Refactor Inference logic from Server Logic Apr 17, 2024

vansangpfiev commented Apr 25, 2024

I have created a new repository for the llamacpp backend: https://github.com/janhq/cortex.llamacpp. It still needs some CI and examples to run cortex.llamacpp standalone.
Basically, cortex_cpp will pull the shared library from the cortex.llamacpp repository, then load it at runtime.
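As a rough illustration of the approach described above (cortex_cpp loading an engine shared library at runtime), here is a minimal sketch using POSIX `dlopen`/`dlsym`. The library path, the `InferenceEngine` interface, and the `create_engine` symbol are hypothetical placeholders for this example, not the actual cortex.llamacpp API.

```cpp
// Minimal sketch: load an inference engine from a shared library at runtime.
// Library path and exported symbol names are illustrative assumptions only.
#include <dlfcn.h>
#include <iostream>

// Hypothetical interface the engine library is assumed to implement.
struct InferenceEngine {
    virtual ~InferenceEngine() = default;
    virtual void LoadModel(const char* path) = 0;
};

using CreateEngineFn = InferenceEngine* (*)();

int main() {
    // Open the shared library previously pulled from the engine repository.
    void* handle = dlopen("./libcortex_llamacpp.so", RTLD_LAZY | RTLD_LOCAL);
    if (!handle) {
        std::cerr << "dlopen failed: " << dlerror() << "\n";
        return 1;
    }

    // Resolve a factory function exported by the library (symbol name assumed).
    auto create_engine =
        reinterpret_cast<CreateEngineFn>(dlsym(handle, "create_engine"));
    if (!create_engine) {
        std::cerr << "dlsym failed: " << dlerror() << "\n";
        dlclose(handle);
        return 1;
    }

    // Instantiate the engine and use it through the shared interface.
    InferenceEngine* engine = create_engine();
    engine->LoadModel("model.gguf");

    delete engine;
    dlclose(handle);
    return 0;
}
```

The benefit of this split is that the server logic only depends on a small, stable interface, while each engine can be built, versioned, and updated independently as its own shared library.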


tikikun commented Apr 25, 2024

Posted here for more visibility:

[Image: diagram showing the details of the refactoring structure]


tikikun commented Apr 25, 2024

@louis-jan

It's done already, @vansangpfiev
