Inference API always returns error: Invalid token #38
Comments
Hi @greyblue9, are you sure you're passing the token correctly? `f"Bearer {API_TOKEN}"` works, whereas `"Bearer {API_TOKEN}"` does not: without the `f` prefix there is no string formatting, so the literal text `{API_TOKEN}` is sent, which is an invalid token. Both old and new tokens should work the same way, by the way. Sorry about this issue.
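A minimal sketch of the pitfall the maintainer is describing (the token value here is a placeholder, not a real credential):

```python
API_TOKEN = "hf_YYY"  # placeholder token, not a real credential

# With the f prefix, {API_TOKEN} is replaced by the variable's value:
good = f"Bearer {API_TOKEN}"   # -> "Bearer hf_YYY"

# Without it, the braces are sent literally, so the API sees an invalid token:
bad = "Bearer {API_TOKEN}"     # -> "Bearer {API_TOKEN}"
```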
Thank you so much for your response. I actually created a new account with a new (write) token, and that worked 🤔 The token I was using previously was much longer (about 3× longer) and had no […] Maybe the […]
Probably, we really need to get […] Glad you could work it out.
I will close this, don't hesitate to open again if you think it's incorrect.
Sorry if this is not the best place to post this issue. I am having a problem with the Inference API suddenly, after it has worked perfectly for months.

After seeing `{ "error": "invalid token" }` coming back in response to queries, I created a new API token (the old one was not showing up) and changed the header from

Authorization: Bearer api_XXX

to

Authorization: Bearer hf_YYY

as outlined in the docs, but I am still facing the same error. Any idea what could be the issue? I am hoping this can be fixed soon.
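For reference, a request with the header described above can be sketched as follows. This is an illustrative example, not the questioner's actual code: the model id (`gpt2`) and the token value are placeholders, and the endpoint URL is the one the Inference API docs describe.

```python
import json
import urllib.request

API_URL = "https://api-inference.huggingface.co/models/gpt2"  # illustrative model id
API_TOKEN = "hf_YYY"  # placeholder; paste your own token from the settings page

# The header must be the literal word "Bearer", a space, then the token.
HEADERS = {"Authorization": f"Bearer {API_TOKEN}"}

def query(payload: dict) -> dict:
    # POST the JSON payload; a malformed header or revoked token typically
    # comes back as HTTP 401 with {"error": "invalid token"}.
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={**HEADERS, "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

Note that nothing here is sent until `query()` is called, so you can inspect `HEADERS` first to confirm the token was interpolated correctly.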