
Feature: Expand the LLM to learn context from MIT or Creative Commons licensed documentation #14

Open
1 of 2 tasks
bdougie opened this issue Aug 2, 2023 · 3 comments

Comments

@bdougie
Member

bdougie commented Aug 2, 2023

Type of feature

🍕 Feature

Current behavior

While testing open-sauced/pizza-cli#15, I was asked the following.

Want to ask a question about open-sauced/insights?
> What is the hex code for orange in the tailwind config?
Looking for tailwind.config.js in the codebase...🔍
Searching tailwind.config.js for your query...🔍
Generating a response...🤖

The hex code for orange in the tailwind.config.js file of the open-sauced/insights repository is "hsl(30, 70.0%, 7.2%)".

It would be nice to get an explanation that the project uses HSL and not hex codes, and perhaps even convert it for the user.
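For context, the conversion itself is cheap to do locally. This is a minimal sketch of an HSL-to-hex helper (not the repo's actual `hslToHex`; the function name and signature here are assumptions for illustration):

```typescript
// Convert HSL (h in degrees, s and l as percentages) to a hex color string.
// Sketch only — not the implementation used in open-sauced/insights.
function hslToHex(h: number, s: number, l: number): string {
  s /= 100;
  l /= 100;
  const k = (n: number) => (n + h / 30) % 12;
  const a = s * Math.min(l, 1 - l);
  const f = (n: number) =>
    l - a * Math.max(-1, Math.min(k(n) - 3, Math.min(9 - k(n), 1)));
  const toHex = (x: number) =>
    Math.round(255 * x).toString(16).padStart(2, "0");
  return `#${toHex(f(0))}${toHex(f(8))}${toHex(f(4))}`;
}

// The color from the response above:
console.log(hslToHex(30, 70.0, 7.2)); // → "#1f1206"
```

So the assistant could both explain that the config stores HSL strings and hand the user the equivalent hex value.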

Suggested solution

It would be nice for the model to have a basic understanding of HSL and hex codes. I am not sure what the optimal way to proceed is, but perhaps we could include OpenAI plugins.

I don't think we have access yet, but something we can explore.

Additional context

https://platform.openai.com/docs/plugins/introduction


Code of Conduct

  • I agree to follow this project's Code of Conduct

Contributing Docs

  • I agree to follow this project's Contribution Docs
@bdougie
Member Author

bdougie commented Aug 2, 2023

When asking it to convert the HSL value, I got this response:

> can you convert hsl(30, 70.0%, 7.2%) to hex code?
Searching the codebase for your query...🔍
Searching components/molecules/PaginationGotoPage/pagination-goto-page.tsx for your query...🔍
Generating a response...🤖

Yes, there is a function called `hslToHex` in the codebase. It is used in the `components/molecules/PaginationGotoPage/pagination-goto-page.tsx` file. Here is an example of how it is used:

```tsx
const PaginationGotoPage = ({ page, name, setPage, totalPage }: PaginationGotoPageProps): JSX.Element => {
  const [pageNumber, setPageNumber] = useState<number | string>(page);
  const [value, setValue] = useState<number | string>("");

  const handleGotoPage = (e: React.FormEvent<HTMLFormElement>) => {
    e.preventDefault();
    // Convert HSL to Hex using the hslToHex function
    const hexColor = hslToHex(30, 70.0, 7.2);
    // Rest of the code...
  }

  // Rest of the code...
}
```

Please note that the exact implementation of the `hslToHex` function is not provided in the code snippet. You may need to search for its implementation in other files or external dependencies.

@jpmcb
Member

jpmcb commented Aug 2, 2023

The likely best option (right now) would be to utilize OpenAI's "fine-tuning" capabilities:

> Fine-tuning lets you get more out of the models available through the API by providing: Higher quality results than prompt design ... Ability to train on more examples than can fit in a prompt ...

https://platform.openai.com/docs/guides/fine-tuning

I.e., we could train a fine-tuned model on a subset of documentation we think would be critical for the repo-query engine to know about. Note that this could be a large undertaking, since we'd need to set up an ML pipeline with training data to feed to the OpenAI models and keep it up to date.
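As a rough sketch of what that pipeline's output might look like: OpenAI's fine-tuning flow takes a JSONL file of training examples. The example content, file name, and the docs it draws from are all hypothetical here; a real pipeline would extract these from MIT/CC-licensed documentation:

```typescript
import { writeFileSync } from "node:fs";

// Hypothetical training examples teaching the model about HSL vs hex.
// Real data would be generated from permissively licensed docs.
const examples = [
  {
    messages: [
      { role: "system", content: "You are a documentation assistant for open source repos." },
      { role: "user", content: "What is hsl(30, 70.0%, 7.2%)?" },
      { role: "assistant", content: "That is an HSL color (hue 30°, saturation 70%, lightness 7.2%), a dark brown. Tailwind configs often store colors as HSL strings rather than hex codes." },
    ],
  },
];

// JSONL: one JSON object per line, the format fine-tuning expects.
const jsonl = examples.map((ex) => JSON.stringify(ex)).join("\n") + "\n";
writeFileSync("training.jsonl", jsonl);
```

Keeping that file fresh as the source docs change is the part that makes this a real pipeline rather than a one-off script.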

> I am not sure what is the optimal way to proceed, but perhaps we could include plugins from openai.

It doesn't look like OpenAI has an API for the code interpreter or plugins yet:

https://community.openai.com/t/feedback-please-make-a-code-interpreter-api/292165

@bdougie
Member Author

bdougie commented Aug 2, 2023

> I.e., we could train a fine tuned model on a subset of documentation we think would be critical to the repo-query engine knowing about. Note that this could be a large undertaking since we'd need to setup a ML pipeline with training data to feed to the openai models and keep it up to date.

These things are in a private preview. I spoke with an OpenAI team member and they mentioned they may be able to get us access. The pitch there is that we need to have a use case for it, which we do now.

@bdougie bdougie changed the title Feature: Expand the LLM to learn context from MIT documentation Feature: Expand the LLM to learn context from MIT or Creative Commons licensed documentation Aug 2, 2023