
I get the following error when trying to use model gpt-4 #128

Closed
TheBuilderJR opened this issue Nov 1, 2023 · 3 comments

Comments

@TheBuilderJR

Works with gpt-3.5-turbo but not gpt-4. The error message doesn't help much.

error: stream failed: Invalid status code: 404 Not Found

Code

// Imports (not in the original post; approximate, assuming Rocket 0.5 + async-openai as of late 2023):
use async_openai::{
    types::{ChatCompletionRequestMessageArgs, CreateChatCompletionRequestArgs, Role},
    Client,
};
use futures::StreamExt;
use rocket::response::stream::TextStream;
use rocket::serde::{json::Json, Deserialize};

#[derive(Deserialize)]
struct ChatInput {
    content: String,
    model: Option<String>,
}

#[post("/stream", format = "json", data = "<chat_input>")]
async fn stream(chat_input: Json<ChatInput>) -> TextStream![String] {
    let client = Client::new();
    let model = chat_input.model.clone().unwrap_or_else(|| "gpt-3.5-turbo".to_string());

    println!("{}", model);
    // NOTE: this hardcoded value shadows the model parsed from the request above.
    let model = "gpt-4-32k".to_string();
    let request = CreateChatCompletionRequestArgs::default()
        .model(model)
        // .max_tokens(512u16)
        .messages([
            ChatCompletionRequestMessageArgs::default()
                .content(chat_input.content.clone())
                .role(Role::User)
                .build()
                .unwrap(),
        ])
        .build()
        .unwrap();

    dbg!(&request);

    let mut stream = client.chat().create_stream(request).await.unwrap();

    TextStream! {
        while let Some(result) = stream.next().await {
            match result {
                Ok(response) => {
                    for chat_choice in &response.choices {
                        if let Some(ref content) = chat_choice.delta.content {
                            yield content.clone();
                        }
                    }
                }
                Err(err) => {
                    yield format!("error: {}", err);
                }
            }
        }
    }
}
@64bit
Owner

64bit commented Nov 2, 2023

The OpenAI docs might provide more info on why it returns "not found".

Does this apply to you?

GPT-4 is currently accessible to those who have made at least one successful payment through our developer platform.

@TheBuilderJR
Author

Yes it did. Paying for $5 worth of credits fixed it!

I think ideally we'd have better error messages, but I guess that's more on OpenAI.

One thing that would be nice is if model were an enum instead of a string!

@64bit
Owner

64bit commented Nov 3, 2023

Yeah, there's an opportunity to have better errors from the streaming endpoint - like OpenAIError::ApiError from its non-streaming counterpart. PRs welcome :)
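To illustrate the kind of error surface being discussed - a hedged sketch only, with hypothetical names rather than the actual async-openai types - a streaming event could carry the HTTP status plus a human-readable hint, so the 404 from this issue would explain itself:

```rust
// Hypothetical sketch of a richer streaming error; names and shapes are
// illustrative, not the real async-openai API.
#[allow(dead_code)]
#[derive(Debug, PartialEq)]
enum StreamEvent {
    Delta(String),                             // a content chunk from the stream
    ApiError { status: u16, message: String }, // structured error instead of a bare status line
}

// Map a raw HTTP status to a hint the caller can act on.
fn explain_status(status: u16) -> &'static str {
    match status {
        401 => "invalid or missing API key",
        404 => "model not found, or this account has no access to it yet (e.g. GPT-4 before a first payment)",
        429 => "rate limited or quota exhausted",
        _ => "unexpected error",
    }
}

fn main() {
    let event = StreamEvent::ApiError {
        status: 404,
        message: explain_status(404).to_string(),
    };
    // The bare "404 Not Found" from the issue now carries an actionable message.
    if let StreamEvent::ApiError { status, message } = &event {
        assert_eq!(*status, 404);
        assert!(message.contains("GPT-4"));
    }
}
```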

The model is not an enum for a couple of reasons:

  • The spec never had it as an enum.
  • Even if model were an enum, given that OpenAI releases new models and deprecates old ones, it would create a maintenance burden to keep the library in sync. String is a good enough choice.
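As a sketch of the trade-off described above: the usual compromise for a closed enum is a catch-all variant for models the library doesn't know yet, which brings strings back anyway. Everything below is hypothetical, not async-openai's API.

```rust
// Illustrative only: a forward-compatible model enum with an escape hatch.
#[derive(Debug, Clone, PartialEq)]
enum Model {
    Gpt35Turbo,
    Gpt4,
    Other(String), // newly released models the enum doesn't name yet
}

impl Model {
    fn as_str(&self) -> &str {
        match self {
            Model::Gpt35Turbo => "gpt-3.5-turbo",
            Model::Gpt4 => "gpt-4",
            Model::Other(s) => s,
        }
    }
}

impl From<&str> for Model {
    fn from(s: &str) -> Self {
        match s {
            "gpt-3.5-turbo" => Model::Gpt35Turbo,
            "gpt-4" => Model::Gpt4,
            other => Model::Other(other.to_string()),
        }
    }
}

fn main() {
    assert_eq!(Model::from("gpt-4").as_str(), "gpt-4");
    // Unknown models still round-trip via Other, but naming each new official
    // model explicitly would require a library release every time - the
    // maintenance burden mentioned above.
    assert_eq!(Model::from("gpt-4-32k").as_str(), "gpt-4-32k");
}
```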
