
Anthropic error: "anthropic-version header is required" #323

Closed
cdolek opened this issue Oct 10, 2023 · 10 comments · Fixed by #325
Labels
bug Something isn't working feat/model Feature: models

Comments

cdolek commented Oct 10, 2023

Whatever I do, I can't use Anthropic's Claude with spacy-llm. I've tried both spacy.Claude-instant-1.v1 and spacy.Claude-2.v1.

I am getting the following error during nlp = assemble(config_file_path):

...
File ~/miniconda3/envs/_my_env_/lib/python3.9/site-packages/requests/models.py:1021, in Response.raise_for_status(self)
   1020 if http_error_msg:
-> 1021     raise HTTPError(http_error_msg, response=self)

HTTPError: 400 Client Error: Bad Request for url: https://api.anthropic.com/v1/complete
...
...

File ~/miniconda3/envs/_my_env_/lib/python3.9/site-packages/spacy_llm/models/rest/anthropic/model.py:80, in Anthropic.__call__.<locals>._request(json_data)
     78     if error["type"] == "not_found_error":
     79         error_msg += f". Ensure that the selected model ({self._name}) is supported by the API."
---> 80     raise ValueError(error_msg) from ex
     81 response = r.json()
     83 # c.f. https://console.anthropic.com/docs/api/errors

ValueError: Request to Anthropic API failed: {'type': 'invalid_request_error', 'message': 'anthropic-version header is required'}
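(Editor's note: the error message itself points at the cause; Anthropic's /v1/complete endpoint rejects requests that omit the anthropic-version header. A minimal stdlib-only sketch of building a well-formed request outside spacy-llm follows; the model name and default version string are assumptions based on Anthropic's public API docs.)

```python
import json

API_URL = "https://api.anthropic.com/v1/complete"


def build_headers(api_key: str, version: str = "2023-06-01") -> dict:
    """Headers for the legacy completions endpoint.

    Note the hyphen: the API expects "anthropic-version"; a request
    without it fails with the 400 shown above.
    """
    return {
        "x-api-key": api_key,
        "anthropic-version": version,
        "content-type": "application/json",
    }


def build_payload(prompt: str, model: str = "claude-instant-1") -> str:
    # Legacy completions payload; the model name here is illustrative.
    return json.dumps({
        "model": model,
        "prompt": f"\n\nHuman: {prompt}\n\nAssistant:",
        "max_tokens_to_sample": 256,
    })
```

Sending build_payload(...) with build_headers(...) via e.g. requests.post should succeed, while dropping the "anthropic-version" key reproduces the invalid_request_error above.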

Here is my config file:

[nlp]
lang = "en"
pipeline = ["llm"]

[components]

[components.llm]
factory = "llm"

[components.llm.model]
@llm_models = "spacy.Claude-instant-1.v1"
name = "claude-instant-1"
max_request_time = 5000
max_tries = 10
interval = 3
# config = {"temperature": 0.0}

[components.llm.task]
@llm_tasks = "spacy.TextCat.v3"
labels = ["MY_LABEL"]
exclusive_classes = True
@rmitsch rmitsch added bug Something isn't working feat/model Feature: models labels Oct 11, 2023
rmitsch (Collaborator) commented Oct 11, 2023

Hi @cdolek, thanks for reporting this! This is strange, we test for this in our CI. We'll look into this and report back here.

rmitsch (Collaborator) commented Oct 11, 2023

Hm, I tried this locally and it works for me. Could you tell us which spacy-llm version you're using and provide a minimal, reproducible Python script?

For reference, my setup:

from pathlib import Path
from spacy_llm.util import assemble

if __name__ == '__main__':
    nlp = assemble(Path(__file__).parent.parent / "config.cfg")
    doc = nlp("This is a test.")

And your config (max_tokens_to_sample has to be set; we should probably have a default value for that) in config.cfg:

[nlp]
lang = "en"
pipeline = ["llm"]

[components]

[components.llm]
factory = "llm"

[components.llm.model]
@llm_models = "spacy.Claude-instant-1.v1"
name = "claude-instant-1"
max_request_time = 5000
max_tries = 10
interval = 3
config = {"temperature": 0.0, "max_tokens_to_sample": 1024}

[components.llm.task]
@llm_tasks = "spacy.TextCat.v3"
labels = ["MY_LABEL"]
exclusive_classes = True

cdolek (Author) commented Oct 11, 2023

The same example and config as yours gives me the error.

from spacy_llm.util import assemble

nlp = assemble("./config.cfg")
doc = nlp("This is a test.")

I'm in a Jupyter notebook; here is the output of pip list | grep spacy:

spacy                         3.7.1
spacy-alignments              0.9.0
spacy-legacy                  3.0.12
spacy-llm                     0.6.0
spacy-loggers                 1.0.4
spacy-transformers            1.2.5

I'm sure my Anthropic setup is fine, since the code below works:

from anthropic import Anthropic, HUMAN_PROMPT, AI_PROMPT

anthropic = Anthropic()
completion = anthropic.completions.create(
    model="claude-instant-1",
    max_tokens_to_sample=300,
    prompt=f"{HUMAN_PROMPT} How many toes do dogs have?{AI_PROMPT}",
)
print(completion.completion)

One more note: I'm using the same code and config structure for OpenAI and PaLM, and they work as expected.

cdolek (Author) commented Oct 12, 2023

Tried your code (not in a notebook) and got the following error again with Python 3.9.16:

Traceback (most recent call last):
  File "/Users/_my_username_/miniconda3/envs/_my_env_name_/lib/python3.9/site-packages/spacy_llm/models/rest/anthropic/model.py", line 72, in _request
    r.raise_for_status()
  File "/Users/_my_username_/miniconda3/envs/_my_env_name_/lib/python3.9/site-packages/requests/models.py", line 1021, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 400 Client Error: Bad Request for url: https://api.anthropic.com/v1/complete

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/Users/_my_username_/_src/spacy-llm-test/main.py", line 5, in <module>
    nlp = assemble(Path(__file__).parent / "config.cfg")
  File "/Users/_my_username_/miniconda3/envs/_my_env_name_/lib/python3.9/site-packages/spacy_llm/util.py", line 48, in assemble
    return assemble_from_config(config)
  File "/Users/_my_username_/miniconda3/envs/_my_env_name_/lib/python3.9/site-packages/spacy_llm/util.py", line 28, in assemble_from_config
    nlp = load_model_from_config(config, auto_fill=True)
  File "/Users/_my_username_/miniconda3/envs/_my_env_name_/lib/python3.9/site-packages/spacy/util.py", line 587, in load_model_from_config
    nlp = lang_cls.from_config(
  File "/Users/_my_username_/miniconda3/envs/_my_env_name_/lib/python3.9/site-packages/spacy/language.py", line 1864, in from_config
    nlp.add_pipe(
  File "/Users/_my_username_/miniconda3/envs/_my_env_name_/lib/python3.9/site-packages/spacy/language.py", line 821, in add_pipe
    pipe_component = self.create_pipe(
  File "/Users/_my_username_/miniconda3/envs/_my_env_name_/lib/python3.9/site-packages/spacy/language.py", line 709, in create_pipe
    resolved = registry.resolve(cfg, validate=validate)
  File "/Users/_my_username_/miniconda3/envs/_my_env_name_/lib/python3.9/site-packages/confection/__init__.py", line 756, in resolve
    resolved, _ = cls._make(
  File "/Users/_my_username_/miniconda3/envs/_my_env_name_/lib/python3.9/site-packages/confection/__init__.py", line 805, in _make
    filled, _, resolved = cls._fill(
  File "/Users/_my_username_/miniconda3/envs/_my_env_name_/lib/python3.9/site-packages/confection/__init__.py", line 860, in _fill
    filled[key], validation[v_key], final[key] = cls._fill(
  File "/Users/_my_username_/miniconda3/envs/_my_env_name_/lib/python3.9/site-packages/confection/__init__.py", line 877, in _fill
    getter_result = getter(*args, **kwargs)
  File "/Users/_my_username_/miniconda3/envs/_my_env_name_/lib/python3.9/site-packages/spacy_llm/models/rest/anthropic/registry.py", line 103, in anthropic_claude_instant_1
    return Anthropic(
  File "/Users/_my_username_/miniconda3/envs/_my_env_name_/lib/python3.9/site-packages/spacy_llm/models/rest/base.py", line 65, in __init__
    self._verify_auth()
  File "/Users/_my_username_/miniconda3/envs/_my_env_name_/lib/python3.9/site-packages/spacy_llm/models/rest/anthropic/model.py", line 51, in _verify_auth
    raise err
  File "/Users/_my_username_/miniconda3/envs/_my_env_name_/lib/python3.9/site-packages/spacy_llm/models/rest/anthropic/model.py", line 43, in _verify_auth
    self(["test"])
  File "/Users/_my_username_/miniconda3/envs/_my_env_name_/lib/python3.9/site-packages/spacy_llm/models/rest/anthropic/model.py", line 96, in __call__
    responses = [
  File "/Users/_my_username_/miniconda3/envs/_my_env_name_/lib/python3.9/site-packages/spacy_llm/models/rest/anthropic/model.py", line 97, in <listcomp>
    _request({"prompt": f"{SystemPrompt.HUMAN} {prompt}{SystemPrompt.ASST}"})
  File "/Users/_my_username_/miniconda3/envs/_my_env_name_/lib/python3.9/site-packages/spacy_llm/models/rest/anthropic/model.py", line 80, in _request
    raise ValueError(error_msg) from ex
ValueError: Request to Anthropic API failed: {'type': 'invalid_request_error', 'message': 'anthropic-version header is required'}
[1]    58865 bus error  python main.py

Code:

from pathlib import Path
from spacy_llm.util import assemble

if __name__ == "__main__":
    nlp = assemble(Path(__file__).parent / "config.cfg")
    doc = nlp("This is a test.")

Config:

[nlp]
lang = "en"
pipeline = ["llm"]

[components]

[components.llm]
factory = "llm"

[components.llm.model]
@llm_models = "spacy.Claude-instant-1.v1"
name = "claude-instant-1"
max_request_time = 5000
max_tries = 10
interval = 3
config = {"temperature": 0.0, "max_tokens_to_sample": 1024}

[components.llm.task]
@llm_tasks = "spacy.TextCat.v3"
labels = ["INSULT", "PRAISE"]
exclusive_classes = True

rmitsch (Collaborator) commented Oct 13, 2023

Thanks for the debug info!

It's strange that I can't reproduce it. Anyway, the issue seems to be that the version header isn't included in our requests - see here. That should be an easy fix (although since I can't replicate the error, I can't guarantee the fix works before releasing). I hope to get around to this later today; otherwise a patch release 0.6.1 should be available on Monday.

cdolek (Author) commented Oct 13, 2023

Great news, thank you!

@rmitsch rmitsch linked a pull request Oct 13, 2023 that will close this issue
@rmitsch rmitsch reopened this Oct 13, 2023
rmitsch (Collaborator) commented Oct 13, 2023

@cdolek v0.6.1 is out. Give it a try and let me know if that fixes things?

cdolek (Author) commented Oct 13, 2023

@rmitsch Just tried it, same error unfortunately...

Would using anthropic-version instead of anthropic_version in spacy_llm/models/rest/anthropic/model.py work?

...
"anthropic_version": self._config.get("anthropic_version", "2023-06-01"),
...

https://github.com/search?q=repo%3Aanthropics%2Fanthropic-sdk-python+anthropic-version&type=code
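(Editor's note: the suggestion above is right in spirit; the config key can stay underscored, since hyphens aren't valid in Python identifiers or many config keys, but the HTTP header name sent to the API must be hyphenated. A minimal sketch of that mapping; the default version string mirrors the snippet above and is otherwise an assumption:)

```python
def version_header(config: dict) -> dict:
    # The config key uses an underscore ("anthropic_version"), but the
    # HTTP header name the API expects uses a hyphen ("anthropic-version").
    return {"anthropic-version": config.get("anthropic_version", "2023-06-01")}
```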

rmitsch (Collaborator) commented Oct 16, 2023

spacy-llm 0.6.2 is out and uses anthropic-version instead of anthropic_version. Give it a try?

cdolek (Author) commented Oct 16, 2023

@rmitsch thank you so much, it worked!

@cdolek cdolek closed this as completed Oct 16, 2023