About applying the headline classification task to the finLLaMA2 #121
Comments
I have no idea what finllama2 refers to. If you want to perform headline classification using the llama2 base model, I don't think there's a specific format required. You can either do zero-shot/few-shot in-context learning in your preferred way, or refer to FinGPT_Benchmark's task-specific training phase and use our format for training (in that case you're effectively using it as a BERT-like model). But if you want to use the llama2-chat model, you can refer to this: https://gpus.llm-utils.org/llama-2-prompt-template/
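For illustration, here is a minimal sketch of a zero-shot headline-classification prompt in the Llama-2-chat format described at the link above. The headline, label set, and system message are made up for this example, not taken from FinGPT:

```python
# Build a zero-shot classification prompt in the Llama-2-chat format:
# <s>[INST] <<SYS>> {system} <</SYS>> {user} [/INST]
# The model's continuation after [/INST] is the answer.

def build_llama2_chat_prompt(system: str, user: str) -> str:
    """Wrap a system message and a user message in the Llama-2-chat template."""
    return f"<s>[INST] <<SYS>>\n{system}\n<</SYS>>\n\n{user} [/INST]"

# Illustrative system/user messages (not FinGPT's official wording).
system_msg = (
    "You are a financial-news assistant. "
    "Answer with exactly one word: Yes or No."
)
user_msg = (
    "Does the following headline talk about gold prices going up?\n"
    "Headline: Gold futures edge higher as the dollar weakens"
)

prompt = build_llama2_chat_prompt(system_msg, user_msg)
print(prompt)
```

For few-shot prompting you could prepend a couple of (headline, answer) pairs to the user message in the same format; the template itself stays unchanged.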
Thank you for your tips!
Following your advice, I have attempted various approaches. However, perhaps due to my own limitations, I have not yet found a satisfactory solution, so I am reaching out again with a cautious request.
I am interested in presenting headlines to an LLM such as llama2 or Qwen using zero-shot or few-shot learning methods. Could I get some examples of the inference input format that should be provided for this purpose?
I look forward to hearing from you! Thank you :)
Sincerely,
Jiyoung Jeon
I would like to perform headline classification using finllama2. Can anyone tell me what input format I should use for this model?
Where can I find explanations or examples of the input format for headline classification?