Adding message parameter support for OpenAI models #382
Comments
OK, it could be an idea, but please provide me a use case for the system prompt. |
Even if the output format is in JSON? |
Yes, according to the Groq playground, the system message should contain the word JSON.
On Sat, Jun 15, 2024 at 12:38 a.m., Marco Vinciguerra <***@***.***> wrote:
> Even if the output format is in JSON?
|
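A minimal sketch of what that comment describes, assuming the standard OpenAI-style chat-completions request shape (the model name is illustrative): when JSON mode is requested via `response_format={"type": "json_object"}`, the word "JSON" must appear in the messages, which is the natural job of the system message.

```python
# Sketch: building an OpenAI-style chat request with JSON mode enabled.
# The API rejects json_object requests unless the word "JSON" appears
# somewhere in the messages; the model name below is illustrative.

def build_json_mode_request(user_prompt: str) -> dict:
    """Return a chat-completions request payload with JSON mode enabled."""
    return {
        "model": "gpt-4o-mini",  # illustrative model name
        "response_format": {"type": "json_object"},
        "messages": [
            # The system message mentions JSON explicitly, satisfying
            # the json_object-mode requirement.
            {"role": "system",
             "content": "You are a scraping assistant. Reply only in JSON."},
            {"role": "user", "content": user_prompt},
        ],
    }

request = build_json_mode_request("Extract the product names from this page.")
```

This payload shape is what both the OpenAI client and Groq's OpenAI-compatible endpoint accept; only the model name would differ between the two.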
Give me an example, please. |
OpenAI models use the `messages` parameter for the prompt. ScrapegraphAI also uses this parameter to pass the prompt argument on scraper invocation.
However, when using OpenAI models, we sometimes need to supply multiple prompts to better guide the response (as in this article and this documentation).
Is it possible to replace the standard ScrapegraphAI prompt by providing a `messages` argument in the `graph_config`?
Example :
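To illustrate the request, here is a hypothetical sketch of what such an override could look like. The `messages` key inside `graph_config` is the proposed addition, not part of ScrapegraphAI's current API, and the merge helper below only shows how the library might combine it with the user prompt.

```python
# Hypothetical sketch of the proposed feature: a "messages" key in graph_config
# that replaces ScrapegraphAI's built-in system prompt. The key name and the
# merge logic are assumptions illustrating the request, not the current API.

DEFAULT_SYSTEM_PROMPT = "You are a website scraper."  # stand-in for the built-in prompt

graph_config = {
    "llm": {
        "model": "gpt-4o-mini",       # illustrative model name
        "api_key": "OPENAI_API_KEY",  # placeholder
    },
    # Proposed: user-supplied messages that override the default system prompt.
    "messages": [
        {"role": "system",
         "content": "Always answer in JSON with keys 'title' and 'price'."},
    ],
}

def resolve_messages(config: dict, user_prompt: str) -> list:
    """Use the user-supplied messages if present, else the library default."""
    system_msgs = config.get("messages") or [
        {"role": "system", "content": DEFAULT_SYSTEM_PROMPT}
    ]
    return system_msgs + [{"role": "user", "content": user_prompt}]

msgs = resolve_messages(graph_config, "Extract all article titles.")
```

Falling back to the default prompt when `messages` is absent would keep the change backward compatible for existing users.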