Is your feature request related to a problem? Please describe.
There is currently no way to pass metadata around when using the project via Unix pipes.
Describe the solution you'd like
I think this really comes down to needing some metadata back in addition to the response from the LLM, and the ability to feed that metadata back for things like continuing an existing conversation.
I've considered two formats for passing this metadata around:
JSON: Kind of a no-brainer given most languages have an easy way to work with it.
So, a --json flag would cause the project to accept JSON on STDIN, and output JSON on STDOUT:
chatgpt --json '{"user_id": 1, "conversation_id": 13, "llm_input": "How many moons are there in the solar system?"}'
{
"user_id": 1,
"conversation_id: 13,"message_id": 35,
"llm_output": "There are blah moons in the solar system..."
}
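To make the round trip concrete, here's a minimal sketch of how a script on the other end of the pipe might build a request and consume a response, assuming the `--json` flag existed with exactly the field names above (the helper names here are hypothetical, not part of any existing API):

```python
import json


def build_request(llm_input, user_id=None, conversation_id=None):
    """Build the JSON request envelope from the proposal (field names as proposed)."""
    request = {"llm_input": llm_input}
    if user_id is not None:
        request["user_id"] = user_id
    if conversation_id is not None:
        request["conversation_id"] = conversation_id
    return json.dumps(request)


def read_response(raw):
    """Parse a response envelope; return (metadata, llm_output) separately."""
    data = json.loads(raw)
    output = data.pop("llm_output")
    return data, output


# Round trip with the metadata fields from the example above:
req = build_request("How many moons are there in the solar system?",
                    user_id=1, conversation_id=13)
meta, text = read_response('{"user_id": 1, "conversation_id": 13, '
                           '"message_id": 35, "llm_output": "There are ..."}')
# meta now carries conversation_id/message_id for continuing the conversation.
```

The nice part is that `meta` comes back as a plain dict, so a caller can feed `conversation_id` straight into the next request without knowing which metadata keys exist.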
Email: headers, then two newlines, then the body that's the LLM input/output:
chatgpt --email "X-ChatGPT-user_id: 1
X-ChatGPT-conversation_id: 13

How many moons are there in the solar system?"
X-ChatGPT-user_id: 1
X-ChatGPT-conversation_id: 13
X-ChatGPT-message_id: 35
There are blah moons in the solar system...
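The email-style format is also trivial to parse without a JSON library: split on the first blank line, then read `X-ChatGPT-` headers. A sketch, again assuming the proposed `--email` flag and header prefix (note that header values come back as strings, unlike the JSON variant):

```python
PREFIX = "X-ChatGPT-"


def build_email_format(meta, body):
    """Serialize: prefixed headers, a blank line, then the body."""
    headers = "\n".join(f"{PREFIX}{key}: {value}" for key, value in meta.items())
    return f"{headers}\n\n{body}"


def parse_email_format(text):
    """Split headers from body and strip the X-ChatGPT- prefix off each key."""
    head, _, body = text.partition("\n\n")
    meta = {}
    for line in head.splitlines():
        name, _, value = line.partition(": ")
        if name.startswith(PREFIX):
            meta[name[len(PREFIX):]] = value  # values stay strings here
    return meta, body


msg = build_email_format({"user_id": 1, "conversation_id": 13},
                         "How many moons are there in the solar system?")
meta, body = parse_email_format(msg)
```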
There would be some other considerations: you'd have to force-disable streaming, make sure any errors come back on STDERR instead of STDOUT (so they don't corrupt the structured output), and probably some other stuff I haven't considered. But both of these approaches seem doable.