What version of Codex is running?
v0.63.0
What subscription do you have?
■ Error running remote compact task: unexpected status 400 Bad Request: { "error": { "message": "Invalid prompt: your prompt was flagged as potentially violating our usage policy. Please try again with a different prompt: https://platform.openai.com/docs/guides/reasoning#advice-on-prompting", "type": "invalid_request_error", "param": null, "code": "invalid_prompt" } }
Which model were you using?
gpt-5.1-codex-max
What platform is your computer?
WSL2
What issue are you seeing?
Got this error:

"■ Error running remote compact task: unexpected status 400 Bad Request: {
  "error": {
    "message": "Invalid prompt: your prompt was flagged as potentially violating our usage policy. Please try again with a different prompt: https://platform.openai.com/docs/guides/reasoning#advice-on-prompting",
    "type": "invalid_request_error",
    "param": null,
    "code": "invalid_prompt"
  }
}"
btw, my prompt was just:
"Let's fix the bugs now, then we'll run the test. See how they work."
:/
What steps can reproduce the bug?
Uploaded thread: 019ab510-eee9-7f13-b25d-d88b3b162344
What is the expected behavior?
No response
Additional information
No response