
Evaluate open source models with PEFT adapters #2

Open
gvijqb opened this issue Aug 24, 2023 · 5 comments
gvijqb commented Aug 24, 2023

I need a way to evaluate a model like this:
https://huggingface.co/qblocks/falcon-7b-python-code-instructions-18k-alpaca

This is a model fine-tuned from the open-source base model falcon-7b for code generation. The output is an adapter file produced with LoRA. How can I evaluate it with your tool?
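For context, one common way to evaluate such a checkpoint with any harness that expects a plain `transformers` causal LM is to merge the LoRA adapter into its base model first. Below is a minimal sketch using the `peft` library; the base-model ID `tiiuae/falcon-7b` and the helper-function name are assumptions, not part of this repository's tooling.

```python
def load_adapter_model(
    base_id="tiiuae/falcon-7b",
    adapter_id="qblocks/falcon-7b-python-code-instructions-18k-alpaca",
):
    """Load the base model, attach the PEFT adapter, and merge the
    LoRA weights so the result behaves like an ordinary causal LM."""
    from transformers import AutoModelForCausalLM, AutoTokenizer
    from peft import PeftModel

    # Falcon models required trust_remote_code at the time of this issue.
    base = AutoModelForCausalLM.from_pretrained(
        base_id, trust_remote_code=True, device_map="auto"
    )
    model = PeftModel.from_pretrained(base, adapter_id)
    # merge_and_unload() folds the adapter weights into the base model,
    # so downstream evaluation code needs no PEFT-specific handling.
    model = model.merge_and_unload()
    tokenizer = AutoTokenizer.from_pretrained(base_id)
    return model, tokenizer
```

After merging, the returned model can be passed to any generation or evaluation loop exactly like the stock falcon-7b checkpoint.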


terryyz commented Aug 27, 2023

Hi,

I'm working (slowly) on a new version on the dev branch, which supports open-model inference. I've previously tested it with LLaMA only and haven't had time to compute the results thoroughly.

Cheers,
Terry


gvijqb commented Aug 28, 2023

Hi @terryyz

Thanks for the update. I am looking forward to it. Please keep me posted once you have an update.

At MonsterAPI we have developed a no-code LLM finetuner and are exploring different ways to do quick evaluation of fine-tuned adapters.

Thanks,
Gaurav


terryyz commented Aug 28, 2023

Hi @gvijqb,

No problem!

Please let me know if you'd like to collaborate on this project and beyond :)

Cheers,
Terry


gvijqb commented Aug 28, 2023

Sure, would love to explore this.

Could you share how we can collaborate?


terryyz commented Aug 30, 2023

I'm not sure whether MonsterAPI could support us with some computational resources? I'm a bit short of good GPUs these days 😞
