This repository has been archived by the owner on Apr 18, 2024. It is now read-only.

Offline-first features #8

Open
2 tasks
0x77dev opened this issue Mar 20, 2024 · 2 comments
Assignees
0x77dev
Labels
enhancement New feature or request

Comments

0x77dev (Owner) commented Mar 20, 2024

0x77dev added the enhancement label Mar 20, 2024
0x77dev self-assigned this Mar 20, 2024
0x77dev changed the title from "Move from OpenAI library to LiteLLM" to "Move from OpenAI to LiteLLM" Mar 20, 2024
0x77dev changed the title from "Move from OpenAI to LiteLLM" to "Offline processing support" Mar 25, 2024
0x77dev changed the title from "Offline processing support" to "Offline-first features" Mar 25, 2024
0x77dev (Owner, Author) commented Mar 26, 2024

whisper.cpp does not provide SRT output, but there is a project that can convert its output to .srt: WhisperSub.
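For reference, the conversion itself is small. A minimal sketch, assuming whisper.cpp's default timestamped console output (`[HH:MM:SS.mmm --> HH:MM:SS.mmm]  text`); how WhisperSub implements it may differ:

```python
import re

# Matches whisper.cpp's default segment lines:
# "[00:00:00.000 --> 00:00:02.500]  Hello there."
LINE_RE = re.compile(
    r"\[(\d{2}:\d{2}:\d{2})\.(\d{3}) --> (\d{2}:\d{2}:\d{2})\.(\d{3})\]\s*(.*)"
)

def whisper_to_srt(text: str) -> str:
    cues = []
    for line in filter(None, map(str.strip, text.splitlines())):
        m = LINE_RE.match(line)
        if not m:
            continue  # skip log lines that are not timestamped segments
        # SRT uses a comma (not a dot) before the milliseconds,
        # and numbers each cue starting from 1.
        start = f"{m.group(1)},{m.group(2)}"
        end = f"{m.group(3)},{m.group(4)}"
        cues.append(f"{len(cues) + 1}\n{start} --> {end}\n{m.group(5)}")
    return "\n\n".join(cues) + "\n"
```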

Manamama commented
I have probably the same request: offer the use of offline models (T5, Mistral, Gemma), or integrate with https://github.com/Mozilla-Ocho/llamafile, llm, etc. to also cover Claude 2 and 3 and the rest.
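One low-friction route to the llamafile integration suggested above: llamafile's built-in server exposes an OpenAI-compatible chat-completions endpoint (default port 8080), so an OpenAI-style client only needs its base URL swapped. A minimal sketch; the model name "local" is a placeholder, since llamafile serves whatever weights it was started with:

```python
import json
import urllib.request

def build_chat_request(prompt: str,
                       base_url: str = "http://localhost:8080/v1") -> urllib.request.Request:
    # Payload follows the OpenAI chat-completions convention that
    # llamafile's server accepts.
    payload = {
        "model": "local",
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

# With a llamafile server running locally:
# reply = json.load(urllib.request.urlopen(build_chat_request("Hello")))
```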

Projects
None yet
Development

No branches or pull requests

2 participants