Anthropic error: "anthropic-version header is required" #323
Hi @cdolek, thanks for reporting this! This is strange, we test for this in our CI. We'll look into this and report back here.
Hm, I tried this locally and it works for me. Could you share more details about your setup? For reference, mine:

```python
from pathlib import Path
from spacy_llm.util import assemble

if __name__ == '__main__':
    nlp = assemble(Path(__file__).parent.parent / "config.cfg")
    doc = nlp("This is a test.")
```

And the config:

```ini
[nlp]
lang = "en"
pipeline = ["llm"]

[components]

[components.llm]
factory = "llm"

[components.llm.model]
@llm_models = "spacy.Claude-instant-1.v1"
name = "claude-instant-1"
max_request_time = 5000
max_tries = 10
interval = 3
config = {"temperature": 0.0, "max_tokens_to_sample": 1024}

[components.llm.task]
@llm_tasks = "spacy.TextCat.v3"
labels = ["MY_LABEL"]
exclusive_classes = True
```
Same example and config as yours gives the error for me.

```python
from spacy_llm.util import assemble

nlp = assemble("./config.cfg")
doc = nlp("This is a test.")
```

I'm in a Jupyter notebook, and here is the output for …

I'm sure that my Anthropic setup works, since the code below works:

```python
from anthropic import Anthropic, HUMAN_PROMPT, AI_PROMPT

anthropic = Anthropic()
completion = anthropic.completions.create(
    model="claude-instant-1",
    max_tokens_to_sample=300,
    prompt=f"{HUMAN_PROMPT} How many toes do dogs have?{AI_PROMPT}",
)
print(completion.completion)
```

Another note: I'm using the same code and config structure for …
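As an aside, the request that spacy-llm sends can be sketched outside the library. This is a minimal sketch assuming Anthropic's legacy `/v1/complete` endpoint (the same URL that appears in the error); the API key is a placeholder, and nothing is actually sent:

```python
# Sketch of the failing request shape, based on Anthropic's legacy
# /v1/complete API. The key is a placeholder; the request is left
# commented out so nothing is sent.
url = "https://api.anthropic.com/v1/complete"

headers = {
    "x-api-key": "<ANTHROPIC_API_KEY>",  # placeholder
    "content-type": "application/json",
    # Omitting this header is what produces the 400
    # "anthropic-version header is required" response:
    "anthropic-version": "2023-06-01",
}

payload = {
    "model": "claude-instant-1",
    "prompt": "\n\nHuman: This is a test.\n\nAssistant:",
    "max_tokens_to_sample": 1024,
}

# import requests
# r = requests.post(url, headers=headers, json=payload)
print(sorted(headers))
```

The `anthropic` SDK adds this header for you, which is why the SDK snippet above succeeds while spacy-llm's raw REST call fails.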
Tried your code (not in a notebook) and got the following error again, with Python 3.9.16:

```
Traceback (most recent call last):
  File "/Users/_my_username_/miniconda3/envs/_my_env_name_/lib/python3.9/site-packages/spacy_llm/models/rest/anthropic/model.py", line 72, in _request
    r.raise_for_status()
  File "/Users/_my_username_/miniconda3/envs/_my_env_name_/lib/python3.9/site-packages/requests/models.py", line 1021, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 400 Client Error: Bad Request for url: https://api.anthropic.com/v1/complete

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/Users/_my_username_/_src/spacy-llm-test/main.py", line 5, in <module>
    nlp = assemble(Path(__file__).parent / "config.cfg")
  File "/Users/_my_username_/miniconda3/envs/_my_env_name_/lib/python3.9/site-packages/spacy_llm/util.py", line 48, in assemble
    return assemble_from_config(config)
  File "/Users/_my_username_/miniconda3/envs/_my_env_name_/lib/python3.9/site-packages/spacy_llm/util.py", line 28, in assemble_from_config
    nlp = load_model_from_config(config, auto_fill=True)
  File "/Users/_my_username_/miniconda3/envs/_my_env_name_/lib/python3.9/site-packages/spacy/util.py", line 587, in load_model_from_config
    nlp = lang_cls.from_config(
  File "/Users/_my_username_/miniconda3/envs/_my_env_name_/lib/python3.9/site-packages/spacy/language.py", line 1864, in from_config
    nlp.add_pipe(
  File "/Users/_my_username_/miniconda3/envs/_my_env_name_/lib/python3.9/site-packages/spacy/language.py", line 821, in add_pipe
    pipe_component = self.create_pipe(
  File "/Users/_my_username_/miniconda3/envs/_my_env_name_/lib/python3.9/site-packages/spacy/language.py", line 709, in create_pipe
    resolved = registry.resolve(cfg, validate=validate)
  File "/Users/_my_username_/miniconda3/envs/_my_env_name_/lib/python3.9/site-packages/confection/__init__.py", line 756, in resolve
    resolved, _ = cls._make(
  File "/Users/_my_username_/miniconda3/envs/_my_env_name_/lib/python3.9/site-packages/confection/__init__.py", line 805, in _make
    filled, _, resolved = cls._fill(
  File "/Users/_my_username_/miniconda3/envs/_my_env_name_/lib/python3.9/site-packages/confection/__init__.py", line 860, in _fill
    filled[key], validation[v_key], final[key] = cls._fill(
  File "/Users/_my_username_/miniconda3/envs/_my_env_name_/lib/python3.9/site-packages/confection/__init__.py", line 877, in _fill
    getter_result = getter(*args, **kwargs)
  File "/Users/_my_username_/miniconda3/envs/_my_env_name_/lib/python3.9/site-packages/spacy_llm/models/rest/anthropic/registry.py", line 103, in anthropic_claude_instant_1
    return Anthropic(
  File "/Users/_my_username_/miniconda3/envs/_my_env_name_/lib/python3.9/site-packages/spacy_llm/models/rest/base.py", line 65, in __init__
    self._verify_auth()
  File "/Users/_my_username_/miniconda3/envs/_my_env_name_/lib/python3.9/site-packages/spacy_llm/models/rest/anthropic/model.py", line 51, in _verify_auth
    raise err
  File "/Users/_my_username_/miniconda3/envs/_my_env_name_/lib/python3.9/site-packages/spacy_llm/models/rest/anthropic/model.py", line 43, in _verify_auth
    self(["test"])
  File "/Users/_my_username_/miniconda3/envs/_my_env_name_/lib/python3.9/site-packages/spacy_llm/models/rest/anthropic/model.py", line 96, in __call__
    responses = [
  File "/Users/_my_username_/miniconda3/envs/_my_env_name_/lib/python3.9/site-packages/spacy_llm/models/rest/anthropic/model.py", line 97, in <listcomp>
    _request({"prompt": f"{SystemPrompt.HUMAN} {prompt}{SystemPrompt.ASST}"})
  File "/Users/_my_username_/miniconda3/envs/_my_env_name_/lib/python3.9/site-packages/spacy_llm/models/rest/anthropic/model.py", line 80, in _request
    raise ValueError(error_msg) from ex
ValueError: Request to Anthropic API failed: {'type': 'invalid_request_error', 'message': 'anthropic-version header is required'}
[1] 58865 bus error  python main.py
```

Code:

```python
from pathlib import Path
from spacy_llm.util import assemble

if __name__ == "__main__":
    nlp = assemble(Path(__file__).parent / "config.cfg")
    doc = nlp("This is a test.")
```

Config:

```ini
[nlp]
lang = "en"
pipeline = ["llm"]

[components]

[components.llm]
factory = "llm"

[components.llm.model]
@llm_models = "spacy.Claude-instant-1.v1"
name = "claude-instant-1"
max_request_time = 5000
max_tries = 10
interval = 3
config = {"temperature": 0.0, "max_tokens_to_sample": 1024}

[components.llm.task]
@llm_tasks = "spacy.TextCat.v3"
labels = ["INSULT", "PRAISE"]
exclusive_classes = True
```
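For readers unfamiliar with spaCy's config format: the dotted section names encode nesting (`[components.llm.model]` lives inside `[components.llm]`). spacy-llm parses these with confection, which resolves the `@`-prefixed registry references; as a rough structural check, the layout can also be inspected with nothing but the standard library:

```python
# Rough structural check of the config above using only the stdlib.
# confection (what spacy-llm actually uses) additionally parses typed
# values and resolves @-registry references; configparser only shows
# the section layout and raw string values.
from configparser import ConfigParser

CONFIG = """
[nlp]
lang = "en"
pipeline = ["llm"]

[components]

[components.llm]
factory = "llm"

[components.llm.model]
@llm_models = "spacy.Claude-instant-1.v1"
name = "claude-instant-1"

[components.llm.task]
@llm_tasks = "spacy.TextCat.v3"
labels = ["INSULT", "PRAISE"]
exclusive_classes = True
"""

parser = ConfigParser()
parser.read_string(CONFIG)
print(parser.sections())
```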
Thanks for the debug info! It's strange that I can't reproduce it. Anyway, the issue seems to be that the version header isn't included in our requests - see here. That should be an easy fix (although, since I can't replicate the error, I can't guarantee the fix works before releasing it). I hope I get around to this later today; otherwise a patch release 0.6.1 should be available on Monday.
Great news, thank you!
@rmitsch Just tried it, same error unfortunately... Would using the following work?

```python
...
"anthropic_version": self._config.get("anthropic_version", "2023-06-01"),
...
```

https://github.com/search?q=repo%3Aanthropics%2Fanthropic-sdk-python+anthropic-version&type=code
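The suggestion above can be sketched end to end. The function and variable names here (`build_headers`, `config`) are illustrative, not spacy-llm's actual internals; the idea is just that the header defaults to a known API version but can be overridden via the model's config dict:

```python
# Sketch of the suggested fix: merge a (possibly user-overridden)
# anthropic-version value from the model config into the request headers.
# build_headers and its arguments are hypothetical, for illustration only.
def build_headers(api_key: str, config: dict) -> dict:
    return {
        "x-api-key": api_key,
        "content-type": "application/json",
        # Fall back to a pinned default when the user doesn't override it:
        "anthropic-version": config.get("anthropic_version", "2023-06-01"),
    }

headers = build_headers("<key>", {"temperature": 0.0})
print(headers["anthropic-version"])  # → 2023-06-01
```

With an explicit override, e.g. `{"anthropic_version": "2023-01-01"}`, that value would be sent instead of the default.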
@rmitsch thank you so much, it worked!
Whatever I do, I can't use Anthropic's Claude with spacy-llm. I tried both `spacy.Claude-instant-1.v1` and `spacy.Claude-2.v1`. I am getting the following error during `nlp = assemble(config_file_path)`:

Here is my config file: