
[EPIC] v2 UX refactoring #1373

Open
mudler opened this issue Dec 1, 2023 · 4 comments
@mudler
Owner

mudler commented Dec 1, 2023

This is a tracker for a small overhaul of LocalAI's UX. It covers logging, debugging, error messages, and user interaction, including model galleries. There are enough issues here to treat this as an epic where we can collect feedback and work out a strategy for improving the user experience with LocalAI, ranging from logging and detailed error messages up to how a user configures LocalAI models.

Help is welcome: this is a call for contributors to help make LocalAI better. At least a few issues fall into scope:

@mudler mudler added the enhancement New feature or request label Dec 1, 2023
@mudler mudler self-assigned this Dec 1, 2023
@mudler mudler pinned this issue Dec 1, 2023
@mudler mudler changed the title [Epic] v2 UX refactoring [EPIC] v2 UX refactoring Dec 1, 2023
@localai-bot
Contributor

⚠️⚠️⚠️⚠️⚠️

Hi! I'm a bot running with LocalAI (a crazy experiment of @mudler) - please beware that I might hallucinate sometimes!

But... I can also be funny or helpful 😸, and I can generally provide good tips or point to places in the documentation or the code based on what you wrote in the issue.

Don't engage in conversation with me;
I don't support replying (yet)!

⚠️⚠️⚠️⚠️⚠️

Thank you for bringing this up! I can see that there are several issues related to improving the user experience with LocalAI.
To start, we can focus on making the logging and error messages clearer and more concise for users. We can also work on making the configuration process easier and more intuitive.
We can explore different options, such as adding more detailed error messages, improving the user interface, and providing more documentation and examples to help users get started with LocalAI.
If you have any specific ideas or suggestions, please feel free to share them on the issues mentioned above or open a new issue if needed. We are always looking for contributions to improve LocalAI and make it more accessible for everyone!

Sources:

@LunarEngineer

Hello! If you've got some time I'd love to chat; I'm about to open source a package that I think might be interesting and / or useful. Would this be an appropriate venue?

@LunarEngineer

For context I think it might be useful to integrate here.

@mudler mudler unpinned this issue Feb 15, 2024
mudler added a commit that referenced this issue Mar 12, 2024
This changeset aims to provide better defaults and to properly detect when
no inference settings are provided with the model.

If not specified, we default to mirostat sampling and offload all the
GPU layers (if a GPU is detected).

Related to #1373 and #1723
mudler added a commit that referenced this issue Mar 13, 2024
* fix(defaults): set better defaults for inferencing

This changeset aims to provide better defaults and to properly detect when
no inference settings are provided with the model.

If not specified, we default to mirostat sampling and offload all the
GPU layers (if a GPU is detected).

Related to #1373 and #1723

* Adapt tests

* Also pre-initialize default seed
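The defaults described in this commit roughly correspond to a model config like the following sketch. This is illustrative only: the field names follow the LocalAI model-config YAML format as commonly documented, but exact keys and values may differ between versions, and the file/model names are placeholders.

```yaml
# Hypothetical model config; with the new defaults, the sampling and
# GPU settings below no longer need to be spelled out explicitly.
name: my-model
parameters:
  model: my-model.Q4_K_M.gguf   # placeholder model file
# These were previously easy to forget; the changeset applies
# equivalents automatically when nothing is specified:
mirostat: 2        # mirostat sampling
gpu_layers: 99     # offload all layers when a GPU is detected
```

In other words, a bare config with just `name` and `parameters.model` should now pick sensible inference settings on its own.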
mudler added a commit that referenced this issue Mar 15, 2024
Certain engines need to know at model-load time whether the
embedding feature has to be enabled; however, it is impractical
to have to set this for ALL the backends that support embeddings.

The transformers and sentence-transformers backends seamlessly handle
both cases without this setting being explicitly enabled.

The need exists only for ggml-based models, which have to enable
feature sets during model loading (and thus setting `embedding` is
required); most of the other engines do not require this.

This change disables the check done on the code side, making it easier to use
embeddings without having to specify `embeddings: true` explicitly.

Part of: #1373
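As a sketch of what this simplifies, an embedding-model config like the following should now work without the explicit flag. The backend and model names here are illustrative assumptions, not taken from the commit; check the LocalAI docs for the exact backend identifiers in your version.

```yaml
# Hypothetical embedding model config.
name: my-embedder
backend: sentencetransformers     # assumed backend name; verify against docs
parameters:
  model: all-MiniLM-L6-v2         # placeholder model
# embeddings: true   # no longer required here; still needed for
#                    # ggml-based models that toggle feature sets at load time
```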
@mudler mudler mentioned this issue Mar 18, 2024
@mudler
Owner Author

mudler commented Apr 4, 2024

#1956
