How can I add examples of questions + expected SQL over my own schema (zero-shot style)? #48
Comments
Maybe special instructions can now help with this, by providing the examples in the special instructions tab? (I'm talking about the browser demo application.)
Yes, I'd like to know this too. In general, the open-source release of their SQL-generating LLM doesn't seem well supported: there isn't much documentation for it, and issues submitted here on GitHub rarely get answered. I did find a "Schema Cookbook" for their API (presumably a paid product): https://defog.notion.site/Cookbook-for-schema-definitions-1650a6855ea447fdb0be75d39975571b. It covers this information, but it isn't clear how those API calls translate into a format you would send to SQLCoder. I suppose they're more focused on monetizing this, which makes sense and is their right, but they did release this as open source, and it would be nice if they supported its use as an open-source product too.
This issue is stale because it has been open for 30 days with no activity.
This issue was closed because it has been inactive for 14 days since being marked as stale.
Hi there, apologies – I had completely missed this earlier. If still relevant (or for future visitors coming in from Google): you can prompt the SQLCoder-7b-2 model with reference queries, like this:
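A minimal sketch of the idea: place the reference question/SQL pairs in their own section of the prompt, between the schema and the final answer cue. The section headers below follow the general style of Defog's published prompt templates, but the exact header names, the schema, and the geospatial example pair are illustrative assumptions, not the canonical template:

```python
# Illustrative sketch: build a SQLCoder-style prompt that includes
# reference question/SQL pairs before the actual user question.
# Header names, schema, and the example pair below are made up.

def build_prompt(schema: str, examples: list[tuple[str, str]], question: str) -> str:
    # Render each reference pair as a "Question:" / "SQL:" block.
    example_block = "\n\n".join(
        f"Question: {q}\nSQL: {sql}" for q, sql in examples
    )
    return f"""### Task
Generate a SQL query to answer the following question: {question}

### Database Schema
{schema}

### Reference Examples
{example_block}

### Answer
Given the database schema above, here is the SQL query that answers the question:
[SQL]
"""

schema = "CREATE TABLE places (id INT, name TEXT, geom GEOMETRY);"
examples = [
    (
        "Which places are within 5km of a given point?",
        "SELECT name FROM places "
        "WHERE ST_DWithin(geom::geography, ST_MakePoint(1.0, 2.0)::geography, 5000);",
    ),
]
prompt = build_prompt(schema, examples, "How many places are there?")
print(prompt)
```

The resulting string is then passed to the model as-is; keeping the reference pairs in a clearly delimited section, separate from the user's actual question, should reduce the chance of the model simply echoing the first example's SQL.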
Some of our user questions are a bit tricky – for example, geospatial queries.
I would like to provide the model with zero-shot examples of user questions and the expected SQL.
I've tried adding this to the prompt (an examples section in the prompt.md). For example, this is my prompt markdown:
But when I do this, the model always returns the SQL for the first example, regardless of the user question.
Any hints on how to approach this?
Thanks!