-
I am trying to use the HuggingFace version of Phi-3-mini, but I get the error below:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from outlines.models.transformers import Transformers, TransformerTokenizer

hf_tkz = AutoTokenizer.from_pretrained("microsoft/Phi-3-mini-4k-instruct")
tokenizer = TransformerTokenizer(hf_tkz)
llm = AutoModelForCausalLM.from_pretrained(
    "microsoft/Phi-3-mini-4k-instruct",
    output_attentions=True,
    trust_remote_code=True,
)
model = Transformers(llm, tokenizer)
```

Traceback:

```
File ~/.../site-packages/outlines/models/transformers.py:74, in TransformerTokenizer.__init__(self, tokenizer, **kwargs)
     71 self.pad_token_id = self.tokenizer.pad_token_id
     72 self.pad_token = self.tokenizer.pad_token
---> 74 self.special_tokens = set(self.tokenizer.all_special_tokens)
     76 self.vocabulary = self.tokenizer.get_vocab()
     77 self.is_llama = isinstance(self.tokenizer, get_llama_tokenizer_types())
```

Package versions:

```
python        3.11.8   hdf0ec26_0_cpython   conda-forge
outlines      0.0.43   pyhd8ed1ab_0         conda-forge
transformers  4.41.2   pyhd8ed1ab_0         conda-forge
```
-
I think this is caused by the messy namespace collisions possible with the overlapping modules (…).
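The reply is cut off here, but presumably the overlap in question is between the HuggingFace `transformers` package and outlines' own `outlines/models/transformers.py` module, whose helper function of the same name shadows it. A minimal sketch under that assumption, against outlines 0.0.43:

```python
import outlines
import transformers  # the HuggingFace package

# Three different things answer to the name "transformers" here:
#   1. the HuggingFace package imported above;
#   2. the module outlines/models/transformers.py, which defines the
#      Transformers and TransformerTokenizer classes (see the traceback);
#   3. the helper function outlines.models.transformers(), re-exported
#      from outlines.models, which shadows the module of the same name.
print(transformers.__name__)                   # -> transformers
print(callable(outlines.models.transformers))  # -> True (likely): resolves to
                                               #    the helper function, not
                                               #    the module
```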
-
You shouldn't be passing a `TransformerTokenizer`; it is created during initialization. Could you please try passing the `AutoTokenizer` instead?

Also, I'm noticing people using the internals rather than creating via `model = outlines.models.transformers(<model_uri>)`. Could you help me understand why you're using the classes directly rather than the helper function / what your use case is?
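For anyone landing here, a minimal sketch of both suggested fixes. The `model_kwargs` pass-through reflects the helper's signature as of outlines 0.0.43 and may differ in other versions:

```python
import outlines
from transformers import AutoModelForCausalLM, AutoTokenizer
from outlines.models.transformers import Transformers

# Option 1 (preferred per the reply above): let the helper build the model
# and tokenizer together. model_kwargs is forwarded to
# AutoModelForCausalLM.from_pretrained.
model = outlines.models.transformers(
    "microsoft/Phi-3-mini-4k-instruct",
    model_kwargs={"trust_remote_code": True, "output_attentions": True},
)

# Option 2: if you do need the classes directly, pass the raw AutoTokenizer;
# Transformers wraps it in a TransformerTokenizer during initialization.
hf_tkz = AutoTokenizer.from_pretrained("microsoft/Phi-3-mini-4k-instruct")
llm = AutoModelForCausalLM.from_pretrained(
    "microsoft/Phi-3-mini-4k-instruct",
    output_attentions=True,
    trust_remote_code=True,
)
model = Transformers(llm, hf_tkz)

# Either way, the model then plugs into the generate API:
generator = outlines.generate.text(model)
```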