feat: add LLM suggestions to CLI #94
Conversation
Overall looking good! A couple comments:
- You mentioned moving over to use promptui -- I would definitely recommend it! It's super easy to use and works great.
- I think it would be good if this was also accessible through a higher-level CLI command, e.g. speakeasy suggest -s openapi.yaml. It will read better in an article IMO and be more discoverable in the CLI instead of buried as an optional flag. Can leave it accessible via --fix as well, though.
I think a higher-level CLI command is reasonable. In your mind, would this CLI command automatically run validate, then suggest fixes based on that validation? Same behavior, just set at a higher level.
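As a rough sketch of what that could look like -- validate first, then suggest -- here is a minimal cobra wiring for the proposed top-level command. The -s/--schema flag name, the inlined validate-then-suggest placeholder, and the root command setup are assumptions for illustration, not the PR's actual implementation:

```go
package main

import (
	"fmt"
	"os"

	"github.com/spf13/cobra"
)

func main() {
	rootCmd := &cobra.Command{Use: "speakeasy"}

	suggestCmd := &cobra.Command{
		Use:   "suggest",
		Short: "Validate an OpenAPI document and get fixes suggested by an LLM",
		RunE: func(cmd *cobra.Command, args []string) error {
			schema, err := cmd.Flags().GetString("schema")
			if err != nil {
				return err
			}
			// Same flow as validate --fix, just exposed at the top level:
			// run validation first, then request a suggestion per error found.
			fmt.Println("validating and suggesting fixes for", schema)
			return nil
		},
	}
	suggestCmd.Flags().StringP("schema", "s", "", "path to the OpenAPI document")
	_ = suggestCmd.MarkFlagRequired("schema")

	rootCmd.AddCommand(suggestCmd)
	if err := rootCmd.Execute(); err != nil {
		os.Exit(1)
	}
}
```

Invocation would then read as speakeasy suggest -s openapi.yaml, with the existing --fix path on validate left in place.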
Done with promptui.
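For context on what the promptui switch enables, a minimal selection-prompt sketch follows; the label and options are illustrative assumptions, not the actual prompts in this PR:

```go
package main

import (
	"fmt"

	"github.com/manifoldco/promptui"
)

func main() {
	// Ask the user what to do with a suggested fix (illustrative only).
	prompt := promptui.Select{
		Label: "Apply suggested fix?",
		Items: []string{"Yes", "No", "Skip remaining"},
	}
	_, choice, err := prompt.Run()
	if err != nil {
		fmt.Println("prompt cancelled:", err)
		return
	}
	fmt.Println("selected:", choice)
}
```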
var suggestCmd = &cobra.Command{
	Use:   "suggest",
	Short: "Validate an OpenAPI document and get fixes suggested by ChatGPT",
nit/question: Are these suggestions technically being generated by ChatGPT? Or is it simply GPT3?
GPT-3.5, which I believe is a chat model.
Some very minor things then gtg!
A sample CLI that communicates with the downstream Speakeasy API and LLM server.
Normal Testing (CLI)
Testing with Local API
Local Testing
cd speakeasy-registry (API)
1. Run the local registry API
2. Run make docker-llm (wait until Chroma is up before continuing; you should see "Uvicorn running on http://0.0.0.0:8000")
3. export SPEAKEASY_SERVER_URL=http://localhost:35290
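A rough sketch of what step 3 affects, assuming the CLI resolves its API endpoint from SPEAKEASY_SERVER_URL when the variable is set; the default URL below is a placeholder, not the real endpoint or the actual client code:

```go
package main

import (
	"fmt"
	"os"
)

func main() {
	// Placeholder default endpoint, not the real production URL.
	serverURL := "https://api.speakeasy.example"

	// Step 3 above exports SPEAKEASY_SERVER_URL; when set, point the CLI
	// at the local registry API instead of the default endpoint.
	if v := os.Getenv("SPEAKEASY_SERVER_URL"); v != "" {
		serverURL = v
	}
	fmt.Println("using Speakeasy server:", serverURL)
}
```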