
large-v1 model support in CoreML #925

Closed
abCods opened this issue May 15, 2023 · 8 comments
Labels
enhancement New feature or request

Comments

@abCods
Contributor

abCods commented May 15, 2023

Hi @ggerganov

First of all, thank you for your amazing work on Whisper.cpp. I have especially been taking advantage of the CoreML support on Apple Silicon. It is truly a cost-effective solution.

In my case, I have been using the multilingual models, including large and large-v1. Over time, my data suggests that large-v1 is the better choice for multilingual audio: it offers better performance and is less prone to the odd issue of repetitions in the transcriptions, which multilingual recordings are more susceptible to.

When it comes to CoreML, models/download-coreml-model.sh includes the large-v1 model as an option:
models=( "tiny.en" "tiny" "base.en" "base" "small.en" "small" "medium.en" "medium" "large-v1" "large" )

But models/convert-whisper-to-coreml.py doesn't offer the corresponding support:
parser.add_argument("--model", type=str, help="model to convert (e.g. tiny, tiny.en, base, base.en, small, small.en, medium, medium.en, large)", required=True)

Am I missing something here, or is large-v1 not supported yet? If so, I would be happy to contribute.
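The gap described above can be sketched as a small change to the script's argument handling. This is a hypothetical illustration only (the real convert-whisper-to-coreml.py has more surrounding logic): it extends the --model help text to mention large-v1 and validates the value against the same model list that download-coreml-model.sh uses.

```python
import argparse

# Hypothetical sketch: accept "large-v1" in the --model argument of
# models/convert-whisper-to-coreml.py. Not the actual script.
parser = argparse.ArgumentParser()
parser.add_argument(
    "--model", type=str, required=True,
    help="model to convert (e.g. tiny, tiny.en, base, base.en, "
         "small, small.en, medium, medium.en, large-v1, large)",
)
args = parser.parse_args(["--model", "large-v1"])

# Same model set as models/download-coreml-model.sh.
valid = {"tiny.en", "tiny", "base.en", "base", "small.en", "small",
         "medium.en", "medium", "large-v1", "large"}
if args.model not in valid:
    parser.error(f"unknown model: {args.model}")
print(args.model)
```

With a change along these lines, passing --model large-v1 would be accepted instead of being undocumented/unsupported.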

Thanks

@ggerganov ggerganov added the enhancement New feature or request label May 15, 2023
@ggerganov
Owner

Yes, please add large-v1 support - I just missed that

@abCods
Contributor Author

abCods commented May 15, 2023

#926

@abCods abCods closed this as completed May 15, 2023
@eugenepyvovarov

@abCods how fast does the large-v1 model compile into CoreML format for you? I have had it running for hours. When I force-stop it, as suggested in other threads, I just get an error that it is not completed, so I can't compile it yet.

@abCods
Contributor Author

abCods commented May 31, 2023

@eugenepyvovarov

On an M2 mini machine, it takes about a couple of seconds for me.

It shouldn't even take minutes, let alone hours. I suspect a dependency is either missing or incompatible.

@eugenepyvovarov

@abCods it really does take seconds on the base/small models, but as I go to bigger ones it takes more time. The medium model has already been running for a while; I'm waiting until it is done.

@abCods
Contributor Author

abCods commented May 31, 2023

> @abCods it really does take seconds on the base/small models, but as I go to bigger ones it takes more time. The medium model has already been running for a while; I'm waiting until it is done.

What are your machine's specs?

@eugenepyvovarov

> > @abCods it really does take seconds on the base/small models, but as I go to bigger ones it takes more time. The medium model has already been running for a while; I'm waiting until it is done.
>
> What are your machine's specs?

M1 Ultra / 128GB

@abCods
Contributor Author

abCods commented May 31, 2023

> > > @abCods it really does take seconds on the base/small models, but as I go to bigger ones it takes more time. The medium model has already been running for a while; I'm waiting until it is done.
> >
> > What are your machine's specs?
>
> M1 Ultra / 128GB

I'll get my hands on one of the M1s and share my experience.
